Black Ops for Wii - $35. Has that zombie mode and multiplayer (free on the Wii, no Xbox Live-type fees). I was thinking, why not: spend the money, play it a few weeks, sell it again. Either that, or sell the Wii; I never use it.
http://www.rollingstone.com/music/news/kurt-cobain-rolling-stones-1994-cover-story-20110127
Friday, January 28, 2011
Thursday, January 27, 2011
Re: [Madness Writers] sports and other news
" Smart students will always tell you that most of what they learned
in college they learned on their own, which is true but opposite to
the purpose of college. "
he copied my statement.
the only fallacy with that is the sciences. if people would just say "go to college in the sciences" there wouldn't be a debate on the merits of an education.
the rest of gladwell's articles
Blowup
January 22, 1996
DEPT. OF DISPUTATION
Who can be blamed for a disaster like the Challenger explosion, a decade ago? No one, according to the new risk theorists, and we'd better get used to it.
1.
In the technological age, there is a ritual to disaster. When planes crash or chemical plants explode, each piece of physical evidence--of twisted metal or fractured concrete--becomes a kind of fetish object, painstakingly located, mapped, tagged, and analyzed, with findings submitted to boards of inquiry that then probe and interview and soberly draw conclusions. It is a ritual of reassurance, based on the principle that what we learn from one accident can help us prevent another, and a measure of its effectiveness is that Americans did not shut down the nuclear industry after Three Mile Island and do not abandon the skies after each new plane crash. But the rituals of disaster have rarely been played out so dramatically as they were in the case of the Challenger space shuttle, which blew up over southern Florida on January 28th ten years ago.
Fifty-five minutes after the explosion, when the last of the debris had fallen into the ocean, recovery ships were on the scene. They remained there for the next three months, as part of what turned into the largest maritime salvage operation in history, combing a hundred and fifty thousand square nautical miles for floating debris, while the ocean floor surrounding the crash site was inspected by submarines. In mid-April of 1986, the salvage team found several chunks of charred metal that confirmed what had previously been only suspected: the explosion was caused by a faulty seal in one of the shuttle's rocket boosters, which had allowed a stream of flame to escape and ignite an external fuel tank.
Armed with this confirmation, a special Presidential investigative commission concluded the following June that the deficient seal reflected shoddy engineering and lax management at NASA and its prime contractor, Morton Thiokol. Properly chastised, NASA returned to the drawing board, to emerge thirty-two months later with a new shuttle--Discovery--redesigned according to the lessons learned from the disaster. During that first post-Challenger flight, as America watched breathlessly, the crew of the Discovery held a short commemorative service. "Dear friends," the mission commander, Captain Frederick H. Hauck, said, addressing the seven dead Challenger astronauts, "your loss has meant that we could confidently begin anew." The ritual was complete. NASA was back.
But what if the assumptions that underlie our disaster rituals aren't true? What if these public post mortems don't help us avoid future accidents? Over the past few years, a group of scholars has begun making the unsettling argument that the rituals that follow things like plane crashes or the Three Mile Island crisis are as much exercises in self-deception as they are genuine opportunities for reassurance. For these revisionists, high-technology accidents may not have clear causes at all. They may be inherent in the complexity of the technological systems we have created.
This month, on the tenth anniversary of the Challenger disaster, such revisionism has been extended to the space shuttle with the publication, by the Boston College sociologist Diane Vaughan, of "The Challenger Launch Decision" (Chicago), which is the first truly definitive analysis of the events leading up to January 28, 1986. The conventional view is that the Challenger accident was an anomaly, that it happened because people at NASA had not done their job. But the study's conclusion is the opposite: it says that the accident happened because people at NASA had done exactly what they were supposed to do. "No fundamental decision was made at NASA to do evil," Vaughan writes. "Rather, a series of seemingly harmless decisions were made that incrementally moved the space agency toward a catastrophic outcome."
No doubt Vaughan's analysis will be hotly disputed in the coming months, but even if she is only partly right the implications of this kind of argument are enormous. We have surrounded ourselves in the modern age with things like power plants and nuclear-weapons systems and airports that handle hundreds of planes an hour, on the understanding that the risks they represent are, at the very least, manageable. But if the potential for catastrophe is actually found in the normal functioning of complex systems, this assumption is false. Risks are not easily manageable, accidents are not easily preventable, and the rituals of disaster have no meaning. The first time around, the story of the Challenger was tragic. In its retelling, a decade later, it is merely banal.
2.
Perhaps the best way to understand the argument over the Challenger explosion is to start with an accident that preceded it--the near-disaster at the Three Mile Island (T.M.I.) nuclear-power plant in March of 1979. The conclusion of the President's commission that investigated the T.M.I. accident was that it was the result of human error, particularly on the part of the plant's operators. But the truth of what happened there, the revisionists maintain, is a good deal more complicated than that, and their arguments are worth examining in detail.
The trouble at T.M.I. started with a blockage in what is called the plant's polisher--a kind of giant water filter. Polisher problems were not unusual at T.M.I., or particularly serious. But in this case the blockage caused moisture to leak into the plant's air system, inadvertently tripping two valves and shutting down the flow of cold water into the plant's steam generator.
As it happens, T.M.I. had a backup cooling system for precisely this situation. But on that particular day, for reasons that no one really knows, the valves for the backup system weren't open. They had been closed, and an indicator in the control room showing they were closed was blocked by a repair tag hanging from a switch above it. That left the reactor dependent on another backup system, a special sort of relief valve. But, as luck would have it, the relief valve wasn't working properly that day, either. It stuck open when it was supposed to close, and, to make matters even worse, a gauge in the control room which should have told the operators that the relief valve wasn't working was itself not working. By the time T.M.I.'s engineers realized what was happening, the reactor had come dangerously close to a meltdown.
Here, in other words, was a major accident caused by five discrete events. There is no way the engineers in the control room could have known about any of them. No glaring errors or spectacularly bad decisions were made that exacerbated those events. And all the malfunctions--the blocked polisher, the shut valves, the obscured indicator, the faulty relief valve, and the broken gauge--were in themselves so trivial that individually they would have created no more than a nuisance. What caused the accident was the way minor events unexpectedly interacted to create a major problem.
This kind of disaster is what the Yale University sociologist Charles Perrow has famously called a "normal accident." By "normal" Perrow does not mean that it is frequent; he means that it is the kind of accident one can expect in the normal functioning of a technologically complex operation. Modern systems, Perrow argues, are made up of thousands of parts, all of which interrelate in ways that are impossible to anticipate. Given that complexity, he says, it is almost inevitable that some combinations of minor failures will eventually amount to something catastrophic. In a classic 1984 treatise on accidents, Perrow takes examples of well-known plane crashes, oil spills, chemical-plant explosions, and nuclear-weapons mishaps and shows how many of them are best understood as "normal." If you saw last year's hit movie "Apollo 13," in fact, you have seen a perfect illustration of one of the most famous of all normal accidents: the Apollo flight went awry because of the interaction of failures of the spacecraft's oxygen and hydrogen tanks, and an indicator light that diverted the astronauts' attention from the real problem.
Had this been a "real" accident-if the mission had run into trouble because of one massive or venal error-the story would have made for a much inferior movie. In real accidents, people rant and rave and hunt down the culprit. They do, in short, what people in Hollywood thrillers always do. But what made Apollo 13 unusual was that the dominant emotion was not anger but bafflement--bafflement that so much could go wrong for so little apparent reason. There was no one to blame, no dark secret to un-earth, no recourse but to re-create an entire system in place of one that had inexplicably failed. In the end, the normal accident was the more terrifying one.
3.
Was the Challenger explosion a "normal accident"? In a narrow sense, the answer is no. Unlike what happened at T.M.I., its explosion was caused by a single, catastrophic malfunction: the so-called O-rings that were supposed to prevent hot gases from leaking out of the rocket boosters didn't do their job. But Vaughan argues that the O-ring problem was really just a symptom. The cause of the accident was the culture of NASA, she says, and that culture led to a series of decisions about the Challenger which very much followed the contours of a normal accident.
The heart of the question is how NASA chose to evaluate the problems it had been having with the rocket boosters' O-rings. These are the thin rubber bands that run around the lips of each of the rocket's four segments, and each O-ring was meant to work like the rubber seal on the top of a bottle of preserves, making the fit between each part of the rocket snug and airtight. But from as far back as 1981, on one shuttle flight after another, the O-rings had shown increasing problems. In a number of instances, the rubber seal had been dangerously eroded--a condition suggesting that hot gases had almost escaped. What's more, O-rings were strongly suspected to be less effective in cold weather, when the rubber would harden and not give as tight a seal. On the morning of January 28, 1986, the shuttle launchpad was encased in ice, and the temperature at liftoff was just above freezing. Anticipating these low temperatures, engineers at Morton Thiokol, the manufacturer of the shuttle's rockets, had recommended that the launch be delayed. Morton Thiokol brass and NASA, however, overruled the recommendation, and that decision led both the President's commission and numerous critics since to accuse NASA of egregious--if not criminal--misjudgment.
Vaughan doesn't dispute that the decision was fatally flawed. But, after reviewing thousands of pages of transcripts and internal NASA documents, she can't find any evidence of people acting negligently, or nakedly sacrificing safety in the name of politics or expediency. The mistakes that NASA made, she says, were made in the normal course of operation. For example, in retrospect it may seem obvious that cold weather impaired O-ring performance. But it wasn't obvious at the time. A previous shuttle flight that had suffered worse O-ring damage had been launched in seventy-five-degree heat. And on a series of previous occasions when NASA had proposed--but eventually scrubbed for other reasons--shuttle launches in weather as cold as forty-one degrees, Morton Thiokol had not said a word about the potential threat posed by the cold, so its pre-Challenger objection had seemed to NASA not reasonable but arbitrary. Vaughan confirms that there was a dispute between managers and engineers on the eve of the launch but points out that in the shuttle program disputes of this sort were commonplace. And, while the President's commission was astonished by NASA's repeated use of the phrases "acceptable risk" and "acceptable erosion" in internal discussion of the rocket-booster joints, Vaughan shows that flying with acceptable risks was a standard part of NASA culture. The lists of "acceptable risks" on the space shuttle, in fact, filled six volumes. "Although [O-ring] erosion itself had not been predicted, its occurrence conformed to engineering expectations about large-scale technical systems," she writes. "At NASA, problems were the norm. The word 'anomaly' was part of everyday talk. . . . The whole shuttle system operated on the assumption that deviation could be controlled but not eliminated."
What NASA had created was a closed culture that, in her words, "normalized deviance" so that to the outside world decisions that were obviously questionable were seen by NASA's management as prudent and reasonable. It is her depiction of this internal world that makes her book so disquieting: when she lays out the sequence of decisions which led to the launch--each decision as trivial as the string of failures that led to T.M.I.--it is difficult to find any precise point where things went wrong or where things might be improved next time. "It can truly be said that the Challenger launch decision was a rule-based decision," she concludes. "But the cultural understandings, rules, procedures, and norms that always had worked in the past did not work this time. It was not amorally calculating managers violating rules that were responsible for the tragedy. It was conformity."
4.
There is another way to look at this problem, and that is from the standpoint of how human beings handle risk. One of the assumptions behind the modern disaster ritual is that when a risk can be identified and eliminated a system can be made safer. The new booster joints on the shuttle, for example, are so much better than the old ones that the over-all chances of a Challenger-style accident's ever happening again must be lower--right? This is such a straightforward idea that questioning it seems almost impossible. But that is just what another group of scholars has done, under what is called the theory of "risk homeostasis." It should be said that within the academic community there are huge debates over how widely the theory of risk homeostasis can and should be applied. But the basic idea, which has been laid out brilliantly by the Canadian psychologist Gerald Wilde in his book "Target Risk," is quite simple: under certain circumstances, changes that appear to make a system or an organization safer in fact don't. Why? Because human beings have a seemingly fundamental tendency to compensate for lower risks in one area by taking greater risks in another.
Consider, for example, the results of a famous experiment conducted several years ago in Germany. Part of a fleet of taxicabs in Munich was equipped with antilock brake systems (A.B.S.), the recent technological innovation that vastly improves braking, particularly on slippery surfaces. The rest of the fleet was left alone, and the two groups--which were otherwise perfectly matched--were placed under careful and secret observation for three years. You would expect the better brakes to make for safer driving. But that is exactly the opposite of what happened. Giving some drivers A.B.S. made no difference at all in their accident rate; in fact, it turned them into markedly inferior drivers. They drove faster. They made sharper turns. They showed poorer lane discipline. They braked harder. They were more likely to tailgate. They didn't merge as well, and they were involved in more near-misses. In other words, the A.B.S. systems were not used to reduce accidents; instead, the drivers used the additional element of safety to enable them to drive faster and more recklessly without increasing their risk of getting into an accident. As economists would say, they "consumed" the risk reduction, they didn't save it.
Risk homeostasis doesn't happen all the time. Often--as in the case of seat belts, say--compensatory behavior only partly offsets the risk-reduction of a safety measure. But it happens often enough that it must be given serious consideration. Why are more pedestrians killed crossing the street at marked crosswalks than at unmarked crosswalks? Because they compensate for the "safe" environment of a marked crossing by being less vigilant about oncoming traffic. Why did the introduction of childproof lids on medicine bottles lead, according to one study, to a substantial increase in fatal child poisonings? Because adults became less careful in keeping pill bottles out of the reach of children.
Risk homeostasis also works in the opposite direction. In the late nineteen-sixties, Sweden changed over from driving on the left-hand side of the road to driving on the right, a switch that one would think would create an epidemic of accidents. But, in fact, the opposite was true. People compensated for their unfamiliarity with the new traffic patterns by driving more carefully. During the next twelve months, traffic fatalities dropped seventeen per cent--before returning slowly to their previous levels. As Wilde only half-facetiously argues, countries truly interested in making their streets and highways safer should think about switching over from one side of the road to the other on a regular basis.
It doesn't take much imagination to see how risk homeostasis applies to NASA and the space shuttle. In one frequently quoted phrase, Richard Feynman, the Nobel Prize-winning physicist who served on the Challenger commission, said that at NASA decision-making was "a kind of Russian roulette." When the O-rings began to have problems and nothing happened, the agency began to believe that "the risk is no longer so high for the next flights," Feynman said, and that "we can lower our standards a little bit because we got away with it last time." But fixing the O-rings doesn't mean that this kind of risk-taking stops. There are six whole volumes of shuttle components that are deemed by NASA to be as risky as O-rings. It is entirely possible that better O-rings just give NASA the confidence to play Russian roulette with something else.
This is a depressing conclusion, but it shouldn't come as a surprise. The truth is that our stated commitment to safety, our faithful enactment of the rituals of disaster, has always masked a certain hypocrisy. We don't really want the safest of all possible worlds. The national fifty-five-mile-per-hour speed limit probably saved more lives than any other single government intervention of the past twenty-five years. But the fact that Congress lifted it last month with a minimum of argument proves that we would rather consume the recent safety advances of things like seat belts and air bags than save them. The same is true of the dramatic improvements that have been made in recent years in the design of aircraft and flight-navigation systems. Presumably, these innovations could be used to bring down the airline-accident rate as low as possible. But that is not what consumers want. They want air travel to be cheaper, more reliable, or more convenient, and so those safety advances have been at least partly consumed by flying and landing planes in worse weather and heavier traffic conditions.
What accidents like the Challenger should teach us is that we have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life. At some point in the future-for the most mundane of reasons, and with the very best of intentions-a NASA spacecraft will again go down in flames. We should at least admit this to ourselves now. And if we cannot-if the possibility is too much to bear-then our only option is to start thinking about getting rid of things like space shuttles altogether.
Loopholes for Living
April 15, 1996
BOOKS
When the means justify the ends.
1.
Leo Katz begins "Ill-Gotten Gains: Evasion, Blackmail, Fraud, and Kindred Puzzles of the Law" (Chicago; $29.95), his elegant defense of circumvention and subterfuge, with a fable for tax day. There was once, he writes, a wealthy shoemaker who was looking for a way to lessen the burden of supporting his son, to whom he was paying, year in and year out, an annual allowance of a thousand dollars. Cutting him off wasn't an option, because the shoemaker loved his son dearly. Nor was writing the thousand dollars off his taxes, because the I.R.S., understandably, doesn't allow family gifts to serve as tax deductions. But the shoemaker had a brainstorm. He gave his son ten thousand dollars, and then he asked for that same amount back in the form of a business loan, promising in return to pay interest on the loan at the rate of ten per cent a year, which amounts, of course, to a thousand dollars. Voilà! With a minor sleight of hand, the shoemaker turns his family obligation into a seemingly legitimate business deduction.
This is what Katz, who teaches law at the University of Pennsylvania, calls "avoision"--behavior a little too fishy to seem like simple avoidance of illegality but not so obviously illegal as to constitute clear-cut evasion. Avoision covers those acts which lie in the awkward middle, and Katz sees the potential for avoision everywhere in the modern world. Imagine, for example, a tourist from a third-world country who comes to America and decides, at the last minute, that she wants to stay here. She then makes a series of provocative statements about her country which render her unwelcome at home and thereby qualify her for political asylum. Or what about a pornographer who, worried about running afoul of decency laws with his collection of highly explicit photographs, decides to put them in a book entitled "Sex in Marriage," together with long, windy essays on the future of marriage. The shoemaker, the tourist, and the pornographer all adhere to the form of the law, but they violate its spirit: they have exploited a loophole. Is what they are doing right? Should they be allowed to get away with it?
I think it's fair to say that most of us, intuitively, have a problem with avoision. Few would raise much of a fuss if the opportunistic tourist was deported, and even fewer would be fooled by the pornographer's cynical repackaging. And if the wealthy shoemaker managed to slip his ruse past the I.R.S. we would expect him at the very least to have the decency to be ashamed of what he had done. Even the authors of the self-help tax books that proliferate at this time of year rarely present their various tax-dodging schemes without some kind of moral justification. ("What is most important is not what a tax law says, but how the I.R.S. interprets and acts on it," Martin Kaplan and Naomi Weiss write in the best-selling "What the I.R.S. Doesn't Want You to Know," after reeling off a handful of anecdotes of capricious and vindictive government audits.) In fact, if the brief rise of Steve Forbes teaches us anything, it is that Americans have come to associate the paperwork, the complexity, and the game-playing surrounding the tax code with its corruption. What is the flat tax, after all, but a secular version of the tithe, an attempt to imbue what has become essentially a commercial transaction between citizen and state with the purity and simplicity of religious obligation?
This is the attitude that "Ill-Gotten Gains" sets out to confront. Katz likes loopholes. He thinks that the wealthy shoemaker has a point. And if, in the end, Katz is not entirely convincing it does not really matter. This is a heroically counterintuitive book that will make it difficult to think about tax day in quite the same way again.
2.
The problem with the way we feel about loopholes, according to Katz, is that we don't give them enough credit. We think of them in narrow, legal terms, as the unintended result of badly drafted laws. If the wealthy shoemaker can get away with masking his son's allowance as a business deduction, it's assumed that there is something amiss with the law, or with the vigilance of the I.R.S. But avoision is something that runs much deeper than that.
Katz produces one example after another from history and literature--from the confrontation between Neil Klugman and Brenda Patimkin over her diaphragm in Philip Roth's "Goodbye, Columbus" to the way Freud phrased his exit statement to the Gestapo upon leaving Vienna--to prove that avoision is a kind of basic human strategy. Consider this, for example, from Bob Woodward and Carl Bernstein's Watergate memoir, "All the President's Men." Katz quotes the passage where the two Washington Post reporters are trying to get a senior Justice Department official to confirm off the record a rumor that Nixon's chief of staff, H. R. Haldeman, was about to be indicted:
"I'd like to help you, I really would," said the lawyer. "But I just can't say anything."
Bernstein thought for a moment and told the man they understood why he couldn't say anything. So they would do it another way: Bernstein would count to 10. If there was any reason for the reporters to hold back on the story, the lawyer should hang up before 10. If he was on the line after 10, it would mean the story was okay.
"Hang up, right?" the lawyer asked.
That was right, Bernstein instructed, and he started counting. Okay, Bernstein said, and thanked him effusively.
"You've got it straight now?" the lawyer asked.
This is classic avoision, a perfectly transparent piece of self-justification. Failing to deny the story has exactly the same consequence as confirming it. Nonetheless, in the eyes of the lawyer the difference between those alternatives was quite real. Using the loophole allowed him to live with his own conscience, to convince himself that he had not actively violated the confidentiality requirements of his position.
It is Katz's argument that we play these avoision games all the time, and that, far from being trivial or contemptible ruses, they embody real moral distinctions. Here is another of his many examples, involving a trolley driver whose brakes are shot. As the driver hurtles along, he comes to a fork in the track. Ahead are five people who cannot get out of the way in time. To his right is one person stranded on the track. We would all agree, I think, that the trolley driver should steer right, choosing to kill one person instead of five. But now consider an analogous situation: A physician has in his hospital five people who will die unless they receive immediate organ transplants. Two need kidneys. Two need lungs. One needs a heart. At that moment, a perfectly healthy person walks into the doctor's office. The doctor realizes that if he sacrifices that patient he can save five lives for the price of one. But this time, it's safe to say, no one would maintain that the physician should act as the trolley driver did. It's not good enough to want to save lives. You have to save lives in the right way.
This, at least, is what Katz believes. He describes himself as a "deontologist," which is to say that he thinks the morality of any outcome depends very much on how that outcome is achieved. It is in the illustration of this point that "Ill-Gotten Gains" truly takes flight. In one brilliant riff in the middle of the book's first section, for example, Katz gleefully plunges into Jesuitical theology, since he believes that the Jesuits were the ones who raised hairsplitting and loopholes to an art. Let's say that one wants to guiltlessly communicate an untruth. All one need do is, in the words of a Jesuit theologian quoted by Katz, "swear . . . that one has not done something, though one really has done it, by inwardly understanding that one did not do it on a certain day, or before one was born, or by implying some other similar circumstance."
Ridiculous? Not really, says Katz. For a man to disguise himself as a woman's boyfriend, creep into her bedroom in the middle of the night, and have sex with her is rape. But if another man met the same woman at a bar and by pretending to be a famous C.E.O. successfully seduced her his falsehood in that instance would not invalidate her consent. In other words, here are two lies, identical in their intent and in their result. Yet one is a crime and the other, however deplorable, is not. The Jesuits had a point. The circumstances under which a lie is told can make a big difference. Or consider the case of a woman standing in line for a movie who sees a man pointing a pistol right at her. If she grabs the person behind her and uses that person as a shield, we would say she was guilty, at least, of manslaughter. If she simply ducks, and the bullet hits and kills the person behind her, we would call her lucky--even if she was fully aware that if she ducked the person behind her would die.
This is how Katz resolves the question of whether the wealthy shoemaker is in the right. Here we have two identical actions--the gift of a thousand dollars from father to son. But in the first case the gift is direct, and in the second case it is not. The father gives the son an asset, and that asset, in turn, generates the income. How important is this distinction? Well, imagine that the son took his father's ten thousand dollars, put it in the bank, and lived off the interest. And suppose the shoemaker borrows ten thousand dollars not from his son but from the same bank at an identical interest rate. This is essentially the same transaction as before, just a bit more roundabout. But now no one would deny the shoemaker his tax deduction.
According to Katz, there is an important ethical principle involved here. Suppose I had designed the world's most powerful telescope, the only machine capable of glimpsing far-off planets. If I discovered a new galaxy and published my results under my son's name, we would all agree that my son would not deserve the ensuing fame. It would be like John F. Kennedy's accepting the Pulitzer Prize for "Profiles in Courage," a book that he is often said not to have written. You can't assign your fame to someone else. But suppose I gave the telescope to my son, and, armed with this unique instrument, he stumbled upon the same discovery. Now we would all concede that at least some of the fame due to this discovery should accrue to my son. Putting a little distance between the father and the son changes everything.
3.
How far should we go in accepting Katz's deontological fixation? Does he go overboard in his adherence to form? This is the question raised, indirectly, by a Yale University law professor, Stephen L. Carter, in his new book, "Integrity," an essay-length exploration of the consequences of the decline of public morality. Carter argues that integrity requires three things: "(1) discerning what is right and what is wrong; (2) acting on what you have discerned, even at personal cost; and (3) saying openly that you are acting on your understanding of right from wrong." Like Katz, Carter believes that an action should be judged by how it came about, by its adherence to rights and rules, by its form. But Carter's idea of form is far more restrictive than Katz's. Carter's precepts don't seem to make much of an ethical distinction, for example, between the man who posed as a woman's boyfriend in order to seduce her and the man who posed as a C.E.O. Neither had discerned right from wrong. Neither was acting on what he had discerned and certainly neither was "saying openly" that he was doing what he thought was right. Carter locates the morality of an act in its intention: Did the man deliberately mislead in the aid of the seduction? Katz is much more sensitive to the particulars of the act's execution.
A good example of this difference is found in an anecdote Carter tells at the beginning of his book about an incident he once saw while watching a football game on television. A player who had failed to catch a pass thrown his way rolled on the field, scooped up the ball, and jumped up, exultantly, as if he had caught the ball after all. The referee, shielded partially from the play, was so misled by the player's acting that he ruled the pass complete. The player, Carter concludes, lied, and he presents this incident as a telling example of the lack of integrity in American public life.
For the sake of argument, however, let's add two new wrinkles to the story. Suppose that the player, after scooping up the ball, didn't go through the pantomime of exultation. He simply ran over to the referee and loudly and hotly began insisting that he had caught the ball, even though he knew that he hadn't. Or suppose that the player, after attempting the catch, made no attempt to convince the referee that he had caught the ball at all. He was tired, and sick of playing football, and no longer interested in winning, so he shrugged and walked away, indifferent to the outcome of the game. Carter's rules, I think, end up lumping the faker, the arguer, and the quitter together: in one way or another, they all fail his integrity test.
Now, let's imagine how Katz would think about this incident. In the first instance, I think he might make the case that the faker was practicing avoision. Football, after all, deliberately does not use instant replay to review close calls. It relies on the judgment of referees, even though that judgment will occasionally be flawed, or there will be plays (like this one) that the referees cannot see. That's the loophole the player was exploiting--the inherent subjectivity of the way the rules are enforced. Notice as well how he chose to exploit this loophole. Carter says that the faker lied. But that's not quite right. It was the arguer who lied. He purposefully and directly misrepresented what happened on the play to the referee, putting himself clearly outside the realm of good sportsmanship. By contrast, the faker didn't say anything at all. What he did was bluff, and if Carter doesn't see a difference between lying and bluffing then I hereby extend to him a permanent invitation to my poker game.
That leaves us with the quitter, who is the only player who does not attempt to mislead. But isn't he really the worst of the three? Sports--organized games--can continue to function if players attempt to mislead one another, because there are referees who (most of the time) will catch and punish that conduct. But sports can't survive if players no longer try. The quitter, whose actions make him appear to be the most honest of the players, actually threatens the integrity of the entire game.
The point of all of this is that Carter's rules, for all their superficial appeal, turn out to be somewhat unsatisfying. Because he won't go as far as Katz in scrutinizing the form of actions, he ends up papering over some fairly important distinctions. Yes, in some broad moral sense all three of the players lack a certain integrity. But there isn't a football player in the world who wouldn't rather play with fakers than with arguers, or with arguers than with quitters.
This is not to say that Katz prefers those who play avoision games to those who act with perfect integrity, although it is sometimes tempting to read his book this way, since he spends so much time and enthusiasm talking about the people searching for loopholes and not a great deal of time talking about people who play fair. What Katz is trying to do is show that the loophole is not an arbitrary creation, that the ambiguities of our law reflect deep ethical conundrums that cannot be wished away. There is, in other words, a certain deontological dignity to our tortuous circumventions of the I.R.S. If the Jesuit theologians of the seventeenth century were here today, Katz believes, they would probably all be accountants, which is, when you think about it, probably the nicest thing anyone has ever said about the tax system.
Black Like Them
April 29, 1996
PERSONAL HISTORY
Through the lens of his own family's experience, the author explores why West Indians and American blacks are perceived differently.
1.
My cousin Rosie and her husband, Noel, live in a two-bedroom bungalow on Argyle Avenue, in Uniondale, on the west end of Long Island. When they came to America, twelve years ago, they lived in a basement apartment a dozen or so blocks away, next to their church. At the time, they were both taking classes at the New York Institute of Technology, which was right nearby. But after they graduated, and Rosie got a job managing a fast-food place and Noel got a job in asbestos removal, they managed to save a little money and bought the house on Argyle Avenue.
From the outside, their home looks fairly plain. It's in a part of Uniondale that has a lot of tract housing from just after the war, and most of the houses are alike--squat and square, with aluminum siding, maybe a dormer window in the attic, and a small patch of lawn out front. But there is a beautiful park down the street, the public schools are supposed to be good, and Rosie and Noel have built a new garage and renovated the basement. Now that Noel has started his own business, as an environmental engineer, he has his office down there--Suite 2B, it says on his stationery--and every morning he puts on his tie and goes down the stairs to make calls and work on the computer. If Noel's business takes off, Rosie says, she would like to move to a bigger house, in Garden City, which is one town over. She says this even though Garden City is mostly white. In fact, when she told one of her girlfriends, a black American, about this idea, her friend said that she was crazy--that Garden City was no place for a black person. But that is just the point. Rosie and Noel are from Jamaica. They don't consider themselves black at all.
This doesn't mean that my cousins haven't sometimes been lumped together with American blacks. Noel had a job once removing asbestos at Kennedy Airport, and his boss there called him "nigger" and cut his hours. But Noel didn't take it personally. That boss, he says, didn't like women or Jews, either, or people with college degrees--or even himself, for that matter. Another time, Noel found out that a white guy working next to him in the same job and with the same qualifications was making ten thousand dollars a year more than he was. He quit the next day. Noel knows that racism is out there. It's just that he doesn't quite understand--or accept--the categories on which it depends.
To a West Indian, black is a literal description: you are black if your skin is black. Noel's father, for example, is black. But his mother had a white father, and she herself was fair-skinned and could pass. As for Rosie, her mother and my mother, who are twins, thought of themselves while they were growing up as "middle-class brown," which is to say that they are about the same shade as Colin Powell. That's because our maternal grandfather was part Jewish, in addition to all kinds of other things, and Grandma, though she was a good deal darker than he was, had enough Scottish blood in her to have been born with straight hair. Rosie's mother married another brown Jamaican, and that makes Rosie a light chocolate. As for my mother, she married an Englishman, making everything that much more complicated, since by the racial categories of my own heritage I am one thing and by the racial categories of America I am another. Once, when Rosie and Noel came to visit me while I was living in Washington, D.C., Noel asked me to show him "where the black people lived," and I was confused for a moment until I realized that he was using "black" in the American sense, and so was asking in the same way that someone visiting Manhattan might ask where Chinatown was. That the people he wanted to see were in many cases racially indistinguishable from him didn't matter. The facts of his genealogy, of his nationality, of his status as an immigrant made him, in his own eyes, different.
This question of who West Indians are and how they define themselves may seem trivial, like racial hairsplitting. But it is not trivial. In the past twenty years, the number of West Indians in America has exploded. There are now half a million in the New York area alone and, despite their recent arrival, they make substantially more money than American blacks. They live in better neighborhoods. Their families are stronger. In the New York area, in fact, West Indians fare about as well as Chinese and Korean immigrants. That is why the Caribbean invasion and the issue of West Indian identity have become such controversial issues. What does it say about the nature of racism that another group of blacks, who have the same legacy of slavery as their American counterparts and are physically indistinguishable from them, can come here and succeed as well as the Chinese and the Koreans do? Is overcoming racism as simple as doing what Noel does, which is to dismiss it, to hold himself above it, to brave it and move on?
These are difficult questions, not merely for what they imply about American blacks but for the ways in which they appear to contradict conventional views of what prejudice is. Racism, after all, is supposed to be indiscriminate. For example, sociologists have observed that the more blacks there are in a community the more negative the whites' attitudes will be. Blacks in Denver have a far easier time than blacks in, say, Cleveland. Lynchings in the South at the turn of this century, to give another example, were far more common in counties where there was a large black population than in areas where whites were in the majority. Prejudice is the crudest of weapons, a reaction against blacks in the aggregate that grows as the perception of black threat grows. If that is the case, however, the addition of hundreds of thousands of new black immigrants to the New York area should have made things worse for people like Rosie and Noel, not better. And, if racism is so indiscriminate in its application, why is one group of blacks flourishing and the other not?
The implication of West Indian success is that racism does not really exist at all--at least, not in the form that we have assumed it does. The implication is that the key factor in understanding racial prejudice is not the behavior and attitudes of whites but the behavior and attitudes of blacks--not white discrimination but black culture. It implies that when the conservatives in Congress say the responsibility for ending urban poverty lies not with collective action but with the poor themselves they are right.
I think of this sometimes when I go with Rosie and Noel to their church, which is in Hempstead, just a mile away. It was once a white church, but in the past decade or so it has been taken over by immigrants from the Caribbean. They have so swelled its membership that the church has bought much of the surrounding property and is about to add a hundred seats to its sanctuary. The pastor, though, is white, and when the band up front is playing and the congregation is in full West Indian form the pastor sometimes seems out of place, as if he cannot move in time with the music. I always wonder how long the white minister at Rosie and Noel's church will last--whether there won't be some kind of groundswell among the congregation to replace him with one of their own. But Noel tells me the issue has never really come up. Noel says, in fact, that he's happier with a white minister, for the same reasons that he's happy with his neighborhood, where the people across the way are Polish and another neighbor is Hispanic and still another is a black American. He doesn't want to be shut off from everyone else, isolated within the narrow confines of his race. He wants to be part of the world, and when he says these things it is awfully tempting to credit that attitude with what he and Rosie have accomplished.
Is this confidence, this optimism, this equanimity all that separates the poorest of American blacks from a house on Argyle Avenue?
2.
In 1994, Philip Kasinitz, a sociologist at Manhattan's Hunter College, and Jan Rosenberg, who teaches at Long Island University, conducted a study of the Red Hook area of Brooklyn, a neighborhood of around thirteen or fourteen thousand which lies between the waterfront and the Gowanus Expressway. Red Hook has a large public-housing project at its center, and around the project, in the streets that line the waterfront, are several hundred thriving blue-collar businesses--warehouses, shipping companies, small manufacturers, and contractors. The object of the study was to resolve what Kasinitz and Rosenberg saw as the paradox of Red Hook: despite Red Hook's seemingly fortuitous conjunction of unskilled labor and blue-collar jobs, very few of the Puerto Ricans and African-Americans from the neighborhood ever found work in the bustling economy of their own back yard.
After dozens of interviews with local employers, the two researchers uncovered a persistent pattern of what they call positive discrimination. It was not that the employers did not like blacks and Hispanics. It was that they had developed an elaborate mechanism for distinguishing between those they felt were "good" blacks and those they felt were "bad" blacks, between those they judged to be "good" Hispanics and those they considered "bad" Hispanics. "Good" meant that you came from outside the neighborhood, because employers identified locals with the crime and dissipation they saw on the streets around them. "Good" also meant that you were an immigrant, because employers felt that being an immigrant implied a loyalty and a willingness to work and learn not found among the native-born. In Red Hook, the good Hispanics are Mexican and South American, not Puerto Rican. And the good blacks are West Indian.
The Harvard sociologist Mary C. Waters conducted a similar study, in 1993, which looked at a food-service company in Manhattan where West Indian workers have steadily displaced African-Americans in the past few years. The transcripts of her interviews with the company managers make fascinating reading, providing an intimate view of the perceptions that govern the urban workplace. Listen to one forty-year-old white male manager on the subject of West Indians:
They tend more to shy away from doing all of the illegal things because they have such strict rules down in their countries and jails. And they're nothing like here. So like, they're like really paranoid to do something wrong. They seem to be very, very self-conscious of it. No matter what they have to do, if they have to try and work three jobs, they do. They won't go into drugs or anything like that.
Or listen to this, from a fifty-three-year-old white female manager:
I work closely with this one girl who's from Trinidad. And she told me when she first came here to live with her sister and cousin, she had two children. And she said I'm here four years and we've reached our goals. And what was your goal? For her two children to each have their own bedroom. Now she has a three bedroom apartment and she said that's one of the goals she was shooting for. . . . If that was an American, they would say, I reached my goal. I bought a Cadillac.
This idea of the West Indian as a kind of superior black is not a new one. When the first wave of Caribbean immigrants came to New York and Boston, in the early nineteen-hundreds, other blacks dubbed them Jewmaicans, in derisive reference to the emphasis they placed on hard work and education. In the nineteen-eighties, the economist Thomas Sowell gave the idea a serious intellectual imprimatur by arguing that the West Indian advantage was a historical legacy of Caribbean slave culture. According to Sowell, in the American South slaveowners tended to hire managers who were married, in order to limit the problems created by sexual relations between overseers and slave women. But the West Indies were a hardship post, without a large and settled white population. There the overseers tended to be bachelors, and, with white women scarce, there was far more commingling of the races. The resulting large group of coloreds soon formed a kind of proto-middle class, performing various kinds of skilled and sophisticated tasks that there were not enough whites around to do, as there were in the American South. They were carpenters, masons, plumbers, and small businessmen, many years in advance of their American counterparts, developing skills that required education and initiative.
My mother and Rosie's mother came from this colored class. Their parents were schoolteachers in a tiny village buried in the hills of central Jamaica. My grandmother's and grandfather's salaries combined put them, at best, on the lower rungs of the middle class. But their expectations went well beyond that. In my grandfather's library were Dickens and Maupassant. My mother and her sister were pushed to win scholarships to a proper English-style boarding school at the other end of the island; and later, when my mother graduated, it was taken for granted that she would attend university in England, even though the cost of tuition and passage meant that my grandmother had to borrow a small fortune from the Chinese grocer down the road.
My grandparents had ambitions for their children, but it was a special kind of ambition, born of a certainty that American blacks did not have--that their values were the same as those of society as a whole, and that hard work and talent could actually be rewarded. In my mother's first year at boarding school, she looked up "Negro" in the eleventh edition of the Encyclopædia Britannica. "In certain . . . characteristics . . . the negro would appear to stand on a lower evolutionary plane than the white man," she read. And the entry continued:
The mental constitution of the negro is very similar to that of a child, normally good-natured and cheerful, but subject to sudden fits of emotion and passion during which he is capable of performing acts of singular atrocity, impressionable, vain, but often exhibiting in the capacity of servant a dog-like fidelity which has stood the supreme test.
All black people of my mother's generation--and of generations before and since--have necessarily faced a moment like this, when they are confronted for the first time with the allegation of their inferiority. But, at least in my mother's case, her school was integrated, and that meant she knew black girls who were more intelligent than white girls, and she knew how she measured against the world around her. At least she lived in a country that had blacks and browns in every position of authority, so her personal experience gave the lie to what she read in the encyclopedia. This, I think, is what Noel means when he says that he cannot quite appreciate what it is that weighs black Americans down, because he encountered the debilitating effects of racism late, when he was much stronger. He came of age in a country where he belonged to the majority.
When I was growing up, my mother sometimes read to my brothers and me from the work of Louise Bennett, the great Jamaican poet of my mother's generation. The poem I remember best is about two women--one black and one white--in a hair salon, the black woman getting her hair straightened and, next to her, the white woman getting her hair curled:
same time me mind start 'tink
'bout me and de white woman
how me tek out me natural perm
and she put in false one
There is no anger or resentment here, only irony and playfulness--the two races captured in a shared moment of absurdity. Then comes the twist. The black woman is paying less to look white than the white woman is to look black:
de two a we da tek a risk
what rain or shine will bring
but fe har risk is t're poun'
fi me onle five shillin'
In the nineteen-twenties, the garment trade in New York was first integrated by West Indian women, because, the legend goes, they would see the sign on the door saying "No blacks need apply" and simply walk on in. When I look back on Bennett's poem, I think I understand how they found the courage to do that.
3.
It is tempting to use the West Indian story as evidence that discrimination doesn't really exist--as proof that the only thing inner-city African-Americans have to do to be welcomed as warmly as West Indians in places like Red Hook is to make the necessary cultural adjustments. If West Indians are different, as they clearly are, then it is easy to imagine that those differences are the reason for their success--that their refusal to be bowed is what lets them walk on by the signs that prohibit them or move to neighborhoods that black Americans would shy away from. It also seems hard to see how the West Indian story is in any way consistent with the idea of racism as an indiscriminate, pernicious threat aimed at all black people.
But here is where things become more difficult, and where what seems obvious about West Indian achievement turns out not to be obvious at all. One of the striking things in the Red Hook study, for example, is the emphasis that the employers appeared to place on hiring outsiders--Irish or Russian or Mexican or West Indian immigrants from places far from Red Hook. The reason for this was not, the researchers argue, that the employers had any great familiarity with the cultures of those immigrants. They had none, and that was the point. They were drawn to the unfamiliar because what was familiar to them--the projects of Red Hook--was anathema. The Columbia University anthropologist Katherine Newman makes the same observation in a recent study of two fast-food restaurants in Harlem. She compared the hundreds of people who applied for jobs at those restaurants with the few people who were actually hired, and found, among other things, that how far an applicant lived from the job site made a huge difference. Of those applicants who lived less than two miles from the restaurant, ten per cent were hired. Of those who lived more than two miles from the restaurant, nearly forty per cent were hired. As Newman puts it, employers preferred the ghetto they didn't know to the ghetto they did.
Neither study describes a workplace where individual attitudes make a big difference, or where the clunky and impersonal prejudices that characterize traditional racism have been discarded. They sound like places where old-style racism and appreciation of immigrant values are somehow bound up together. Listen to another white manager who was interviewed by Mary Waters:
Island blacks who come over, they're immigrant. They may not have such a good life where they are so they gonna try to strive to better themselves and I think there's a lot of American blacks out there who feel we owe them. And enough is enough already. You know, this is something that happened to their ancestors, not now. I mean, we've done so much for the black people in America now that it's time that they got off their butts.
Here, then, are the two competing ideas about racism side by side: the manager issues a blanket condemnation of American blacks even as he holds West Indians up as a cultural ideal. The example of West Indians as "good" blacks makes the old blanket prejudice against American blacks all the easier to express. The manager can tell black Americans to get off their butts without fear of sounding, in his own ears, like a racist, because he has simultaneously celebrated island blacks for their work ethic. The success of West Indians is not proof that discrimination against American blacks does not exist. Rather, it is the means by which discrimination against American blacks is given one last, vicious twist: I am not so shallow as to despise you for the color of your skin, because I have found people your color that I like. Now I can despise you for who you are.
This is racism's newest mutation--multicultural racism, where one ethnic group can be played off against another. But it is wrong to call West Indians the victors in this competition, in anything but the narrowest sense. In American history, immigrants have always profited from assimilation: as they have adopted the language and customs of this country, they have sped their passage into the mainstream. The new racism means that West Indians are the first group of people for whom that has not been true. Their advantage depends on their remaining outsiders, on remaining unfamiliar, on being distinct by custom, culture, and language from the American blacks they would otherwise resemble. There is already some evidence that the considerable economic and social advantages that West Indians hold over American blacks begin to dissipate by the second generation, when the island accent has faded, and those in positions of power who draw distinctions between good blacks and bad blacks begin to lump West Indians with everyone else. For West Indians, assimilation is tantamount to suicide. This is a cruel fate for any immigrant group, but it is especially so for West Indians, whose history and literature are already redolent with the themes of dispossession and loss, with the long search for identity and belonging. In the nineteen-twenties, Marcus Garvey sought community in the idea of Africa. Bob Marley, the Jamaican reggae singer, yearned for Zion. In "Rites of Passage" the Barbadian poet Edward Kamau Brathwaite writes:
Where, then, is the nigger's
home?
In Paris Brixton Kingston
Rome?
Here?
Or in Heaven?
America might have been home. But it is not: not Red Hook, anyway; not Harlem; not even Argyle Avenue.
There is also no small measure of guilt here, for West Indians cannot escape the fact that their success has come, to some extent, at the expense of American blacks, and that as they have noisily differentiated themselves from African-Americans--promoting the stereotype of themselves as the good blacks--they have made it easier for whites to join in. It does not help matters that the same kinds of distinctions between good and bad blacks which govern the immigrant experience here have always lurked just below the surface of life in the West Indies as well. It was the infusion of white blood that gave the colored class its status in the Caribbean, and the members of this class have never forgotten that, nor have they failed, in a thousand subtle ways, to distance themselves from those around them who experienced a darker and less privileged past.
In my mother's house, in Harewood, the family often passed around a pencilled drawing of two of my great-grandparents; she was part Jewish, and he was part Scottish. The other side--the African side--was never mentioned. My grandmother was the ringleader in this. She prized my grandfather's light skin, but she also suffered as a result of this standard. "She's nice, you know, but she's too dark," her mother-in-law would say of her. The most telling story of all, though, is the story of one of my mother's relatives, whom I'll call Aunt Joan, who was as fair as my great-grandmother was. Aunt Joan married what in Jamaica is called an Injun--a man with a dark complexion that is redeemed from pure Africanness by straight, fine black hair. She had two daughters by him--handsome girls with dark complexions. But he died young, and one day, while she was travelling on a train to visit her daughter, she met and took an interest in a light-skinned man in the same railway car. What happened next is something that Aunt Joan told only my mother, years later, with the greatest of shame. When she got off the train, she walked right by her daughter, disowning her own flesh and blood, because she did not want a man so light-skinned and desirable to know that she had borne a daughter so dark.
My mother, in the nineteen-sixties, wrote a book about her experiences. It was entitled "Brown Face, Big Master," the brown face referring to her and the big master, in the Jamaican dialect, referring to God. Sons, of course, are hardly objective on the achievements of their mothers, but there is one passage in the book that I find unforgettable, because it is such an eloquent testimony to the moral precariousness of the Jamaican colored class--to the mixture of confusion and guilt that attends its position as beneficiary of racism's distinctions. The passage describes a time just after my mother and father were married, when they were living in London and my eldest brother was still a baby. They were looking for an apartment, and after a long search my father found one in a London suburb. On the day after they moved in, however, the landlady ordered them out. "You didn't tell me your wife was colored," she told my father, in a rage.
In her book my mother describes her long struggle to make sense of this humiliation, to reconcile her experience with her faith. In the end, she was forced to acknowledge that anger was not an option--that as a Jamaican "middle-class brown," and a descendant of Aunt Joan, she could hardly reproach another for the impulse to divide good black from bad black:
I complained to God in so many words: "Here I was, the wounded representative of the negro race in our struggle to be accounted free and equal with the dominating whites!" And God was amused; my prayer did not ring true with Him. I would try again. And then God said, "Have you not done the same thing? Remember this one and that one, people whom you have slighted or avoided or treated less considerately than others because they were different superficially, and you were ashamed to be identified with them. Have you not been glad that you are not more colored than you are? Grateful that you are not black?" My anger and hate against the landlady melted. I was no better than she was, nor worse for that matter. . . . We were both guilty of the sin of self-regard, the pride and the exclusiveness by which we cut some people off from ourselves.
4.
I grew up in Canada, in a little farming town an hour and a half outside of Toronto. My father teaches mathematics at a nearby university, and my mother is a therapist. For many years, she was the only black person in town, but I cannot remember wondering or worrying, or even thinking, about this fact. Back then, color meant only good things. It meant my cousins in Jamaica. It meant the graduate students from Africa and India my father would bring home from the university. My own color was not something I ever thought much about, either, because it seemed such a stray fact. Blacks knew what I was. They could discern the hint of Africa beneath my fair skin. But it was a kind of secret--something that they would ask me about quietly when no one else was around. ("Where you from?" an older black man once asked me. "Ontario," I said, not thinking. "No," he replied. "Where you from?" And then I understood and told him, and he nodded as if he had already known. "We was speculatin' about your heritage," he said.) But whites never guessed, and even after I informed them it never seemed to make a difference. Why would it? In a town that is ninety-nine per cent white, one modest alleged splash of color hardly amounts to a threat.
But things changed when I left for Toronto to attend college. This was during the early nineteen-eighties, when West Indians were immigrating to Canada in droves, and Toronto had become second only to New York as the Jamaican expatriates' capital in North America. At school, in the dining hall, I was served by Jamaicans. The infamous Jane-Finch projects, in northern Toronto, were considered the Jamaican projects. The drug trade then taking off was said to be the Jamaican drug trade. In the popular imagination, Jamaicans were--and are--welfare queens and gun-toting gangsters and dissolute youths. In Ontario, blacks accused of crimes are released by the police eighteen per cent of the time; whites are released twenty-nine per cent of the time. In drug-trafficking and importing cases, blacks are twenty-seven times as likely as whites to be jailed before their trial takes place, and twenty times as likely to be imprisoned on drug-possession charges.
After I had moved to the United States, I puzzled over this seeming contradiction--how West Indians celebrated in New York for their industry and drive could represent, just five hundred miles northwest, crime and dissipation. Didn't Torontonians see what was special and different in West Indian culture? But that was a naïve question. The West Indians were the first significant brush with blackness that white, smug, comfortable Torontonians had ever had. They had no bad blacks to contrast with the newcomers, no African-Americans to serve as a safety valve for their prejudices, no way to perform America's crude racial triage.
Not long ago, I sat in a coffee shop with someone I knew vaguely from college, who, like me, had moved to New York from Toronto. He began to speak of the threat that he felt Toronto now faced. It was the Jamaicans, he said. They were a bad seed. He was, of course, oblivious of my background. I said nothing, though, and he launched into a long explanation of how, in slave times, Jamaica was the island where all the most troublesome and obstreperous slaves were sent, and how that accounted for their particularly nasty disposition today.
I have told that story many times since, usually as a joke, because it was funny in an appalling way--particularly when I informed him much, much later that my mother was Jamaican. I tell the story that way because otherwise it is too painful. There must be people in Toronto just like Rosie and Noel, with the same attitudes and aspirations, who want to live in a neighborhood as nice as Argyle Avenue, who want to build a new garage and renovate their basement and set up their own business downstairs. But it is not completely up to them, is it? What has happened to Jamaicans in Toronto is proof that what has happened to Jamaicans here is not the end of racism, or even the beginning of the end of racism, but an accident of history and geography. In America, there is someone else to despise. In Canada, there is not. In the new racism, as in the old, somebody always has to be the nigger.
The Tipping Point
June 3, 1996
DEPT. OF DISPUTATION
Why is the city suddenly so much safer-
could it be that crime really is an epidemic?
1.
As you drive east on Atlantic Avenue, through the part of New York City that the Police Department refers to as Brooklyn North, the neighborhoods slowly start to empty out: the genteel brownstones of the western part of Brooklyn give way to sprawling housing projects and vacant lots. Bedford-Stuyvesant is followed by Bushwick, then by Brownsville, and, finally, by East New York, home of the Seventy-fifth Precinct, a 5.6-square-mile tract where some of the poorest people in the city live. East New York is not a place of office buildings or parks and banks, just graffiti-covered bodegas and hair salons and auto shops. It is an economically desperate community destined, by most accounts, to get more desperate in the years ahead-which makes what has happened there over the past two and a half years all the more miraculous. In 1993, there were a hundred and twenty-six homicides in the Seven-Five, as the police call it. Last year, there were forty-four. There is probably no other place in the country where violent crime has declined so far, so fast.
Once the symbol of urban violence, New York City is in the midst of a strange and unprecedented transformation. According to the preliminary crime statistics released by the F.B.I. earlier this month, New York has a citywide violent-crime rate that now ranks it a hundred and thirty-sixth among major American cities, on a par with Boise, Idaho. Car thefts have fallen to seventy-one thousand, down from a hundred and fifty thousand as recently as six years ago. Burglaries have fallen from more than two hundred thousand in the early nineteen-eighties to just under seventy-five thousand in 1995. Homicides are now at the level of the early seventies, nearly half of what they were in 1990. Over the past two and a half years, every precinct in the city has recorded double-digit decreases in violent crime. Nowhere, however, have the decreases been sharper than in Brooklyn North, in neighborhoods that not long ago were all but written off to drugs and violence. On the streets of the Seven-Five today, it is possible to see signs of everyday life that would have been unthinkable in the early nineties. There are now ordinary people on the streets at dusk-small children riding their bicycles, old people on benches and stoops, people coming out of the subways alone. "There was a time when it wasn't uncommon to hear rapid fire, like you would hear somewhere in the jungle in Vietnam," Inspector Edward A. Mezzadri, who commands the Seventy-fifth Precinct, told me. "You would hear that in Bed-Stuy and Brownsville and, particularly, East New York all the time. I don't hear the gunfire anymore. I've been at this job one year and twelve days. The other night when I was going to the garage to get my car, I heard my first volley. That was my first time."
But what accounts for the drop in crime rates? William J. Bratton-who as the New York City Police Commissioner presided over much of the decline from the fall of 1994 until his resignation, this spring-argues that his new policing strategies made the difference: he cites more coördination between divisions of the N.Y.P.D., more accountability from precinct commanders, more arrests for gun possession, more sophisticated computer-aided analysis of crime patterns, more aggressive crime prevention. In the Seven-Five, Mezzadri has a team of officers who go around and break up the groups of young men who congregate on street corners, drinking, getting high, and playing dice-and so remove what was once a frequent source of violent confrontations. He says that he has stepped up random "safety checks" on the streets, looking for drunk drivers or stolen cars. And he says that streamlined internal procedures mean that he can now move against drug-selling sites in a matter of days, where it used to take weeks. "It's aggressive policing," he says. "It's a no-nonsense attitude. Persistence is not just a word, it's a way of life."
All these changes make good sense. But how does breaking up dice games and streamlining bureaucracy cut murder rates by two-thirds? Many criminologists have taken a broader view, arguing that changes in crime reflect fundamental demographic and social trends-for example, the decline and stabilization of the crack trade, the aging of the population, and longer prison sentences, which have kept hard-core offenders off the streets. Yet these trends are neither particularly new nor unique to New York City; they don't account for why the crime rate has dropped so suddenly here and now. Furthermore, whatever good they have done is surely offset, at least in part, by the economic devastation visited on places like Brownsville and East New York in recent years by successive rounds of federal, state, and city social-spending cuts.
It's not that there is any shortage of explanations, then, for what has happened in New York City. It's that there is a puzzling gap between the scale of the demographic and policing changes that are supposed to have affected places like the Seven-Five and, on the other hand, the scale of the decrease in crime there. The size of that gap suggests that violent crime doesn't behave the way we expect it to behave. It suggests that we need a new way of thinking about crime, which is why it may be time to turn to an idea that has begun to attract serious attention in the social sciences: the idea that social problems behave like infectious agents. It may sound odd to talk about the things people do as analogous to the diseases they catch. And yet the idea has all kinds of fascinating implications. What if homicide, which we often casually refer to as an epidemic, actually is an epidemic, and moves through populations the way the flu bug does? Would that explain the rise and sudden decline of homicide in Brooklyn North?
2.
When social scientists talk about epidemics, they mean something very specific. Epidemics have their own set of rules. Suppose, for example, that one summer a thousand tourists come to Manhattan from Canada carrying an untreatable strain of twenty-four-hour flu. The virus has a two-per-cent infection rate, which is to say that one out of every fifty people who come into close contact with someone carrying it catches the bug himself. Let's say that fifty is also exactly the number of people the average Manhattanite-in the course of riding the subways and mingling with colleagues at work-comes into contact with every day. What we have, then, given the recovery rate, is a disease in equilibrium. Every day, each carrier passes on the virus to a new person. And the next day those thousand newly infected people pass on the virus to another thousand people, so that throughout the rest of the summer and the fall the flu chugs along at a steady but unspectacular clip.
But then comes the Christmas season. The subways and buses get more crowded with tourists and shoppers, and instead of running into an even fifty people a day, the average Manhattanite now has close contact with, say, fifty-five people a day. That may not sound like much of a difference, but for our flu bug it is critical. All of a sudden, one out of every ten people with the virus will pass it on not just to one new person but to two. The thousand carriers run into fifty-five thousand people now, and at a two-per-cent infection rate that translates into eleven hundred new cases the following day. Some of those eleven hundred will also pass on the virus to more than one person, so that by Day Three there are twelve hundred and ten Manhattanites with the flu and by Day Four thirteen hundred and thirty-one, and by the end of the week there are nearly two thousand, and so on up, the figure getting higher every day, until Manhattan has a full-blown flu epidemic on its hands by Christmas Day.
In the language of epidemiologists, fifty is the "tipping point" in this epidemic, the point at which an ordinary and stable phenomenon-a low-level flu outbreak-can turn into a public-health crisis. Every epidemic has its tipping point, and to fight an epidemic you need to understand what that point is. Take AIDS, for example. Since the late eighties, the number of people in the United States who die of AIDS every year has been steady at forty thousand, which is exactly the same as the number of people who are estimated to become infected with H.I.V. every year. In other words, AIDS is in the same self-perpetuating phase that our Canadian flu was in, early on; on the average, each person who dies of AIDS infects, in the course of his or her lifetime, one new person.
That puts us at a critical juncture. If the number of new infections increases just a bit-if the average H.I.V. carrier passes on the virus to slightly more than one person-then the epidemic can tip upward just as dramatically as our flu did when the number of exposed people went from fifty to fifty-five. On the other hand, even a small decrease in new infections can cause the epidemic to nosedive. It would be as if the number of people exposed to our flu were cut from fifty to forty-five a day-a change that within a week would push the number of flu victims down to four hundred and seventy-eight.
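The arithmetic of the flu example is simple enough to check by hand, but a few lines of code make the tipping behavior easier to see. The sketch below is illustrative only, not anything from the text: it assumes, as the example above does, a twenty-four-hour flu (each day's carriers recover the next day), a two-per-cent infection rate, and a fixed number of daily contacts, so each day's carriers are just the previous day's multiplied by contacts times 0.02. The function name is invented for illustration.

# Minimal sketch (Python) of the flu example above: 1,000 initial carriers,
# a 2% infection rate, a 24-hour illness. The daily growth factor is
# simply contacts_per_day * 0.02.
def flu_cases_by_day(contacts_per_day, days, initial_carriers=1000, infection_rate=0.02):
    """Return the number of carriers on each of `days` successive days."""
    cases = [initial_carriers]
    for _ in range(days - 1):
        # Each carrier exposes `contacts_per_day` people; 2% of those exposures
        # become the next day's carriers, and today's carriers recover.
        cases.append(cases[-1] * contacts_per_day * infection_rate)
    return [round(c) for c in cases]

print(flu_cases_by_day(50, 8))  # [1000, 1000, ...]: equilibrium
print(flu_cases_by_day(55, 8))  # [1000, 1100, 1210, 1331, ..., 1949]: tips upward
print(flu_cases_by_day(45, 8))  # [1000, 900, 810, ..., 478]: tips downward

Run with fifty contacts a day, the outbreak simply replaces itself; with fifty-five it compounds toward two thousand cases within a week; with forty-five it dwindles to four hundred and seventy-eight, exactly the figures given above.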
Nobody really knows what the tipping point for reducing AIDS may be. Donald Des Jarlais, an epidemiologist at Beth Israel Hospital, in Manhattan, estimates that halving new infections to twenty thousand a year would be ideal. Even cutting it to thirty thousand, he says, would probably be enough. The point is that it's not some completely unattainable number. "I think people think that to beat AIDS everybody has to either be sexually abstinent or use a clean needle or a condom all the time," Des Jarlais said. "But you don't really need to completely eliminate risk. If over time you can just cut the number of people capable of transmitting the virus, then our present behavior-change programs could potentially eradicate the disease in this country."
That's the surprising thing about epidemics. They don't behave the way we think they will behave. Suppose, for example, that the number of new H.I.V. infections each year was a hundred thousand, and by some heroic AIDS-education effort you managed to cut that in half. You would expect the size of the epidemic to also be cut in half, right? This is what scientists call a linear assumption-the expectation that every extra increment of effort will produce a corresponding improvement in result. But epidemics aren't linear. Improvement does not correspond directly to effort. All that matters is the tipping point, and because fifty thousand is still above that point, all these heroics will come to naught. The epidemic would still rise. This is the fundamental lesson of nonlinearity. When it comes to fighting epidemics, small changes-like bringing new infections down to thirty thousand from forty thousand-can have huge effects. And large changes-like reducing new infections to fifty thousand from a hundred thousand-can have small effects. It all depends on when and how the changes are made.
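The same nonlinearity can be condensed into an even smaller sketch. The forty-thousand figure below comes from the discussion above (the level at which new H.I.V. infections exactly replace annual deaths); the three-way rule that the epidemic grows above that threshold, shrinks below it, and holds steady at it is a simplifying assumption that compresses the argument, not an epidemiological model, and the names are invented for illustration.

# Hypothetical illustration of the threshold logic described above.
TIPPING_POINT = 40_000  # annual new infections that exactly replace annual AIDS deaths

def epidemic_direction(new_infections_per_year):
    if new_infections_per_year > TIPPING_POINT:
        return "still grows"
    if new_infections_per_year < TIPPING_POINT:
        return "shrinks"
    return "holds steady"

# A heroic halving that stays above the threshold accomplishes little...
print(epidemic_direction(50_000))  # still grows
# ...while a modest cut that crosses the threshold turns the epidemic around.
print(epidemic_direction(30_000))  # shrinks

The payoff, in other words, depends entirely on which side of the threshold the effort lands you on, not on how much effort was spent.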
The reason this seems surprising is that human beings prefer to think in linear terms. Many expectant mothers, for example, stop drinking entirely, because they've heard that heavy alcohol use carries a high risk of damaging the fetus. They make the perfectly understandable linear assumption that if high doses of alcohol carry a high risk, then low doses must carry a low-but still unacceptable-risk. The problem is that fetal-alcohol syndrome isn't linear. According to one study, none of the sixteen problems associated with fetal-alcohol syndrome show up until a pregnant woman starts regularly consuming more than three drinks a day. But try telling that to a neurotic nineties couple.
I can remember struggling with these same theoretical questions as a child, when I tried to pour ketchup on my dinner. Like all children encountering this problem for the first time, I assumed that the solution was linear: that steadily increasing hits on the base of the bottle would yield steadily increasing amounts of ketchup out the other end. Not so, my father said, and he recited a ditty that, for me, remains the most concise statement of the fundamental nonlinearity of everyday life:
Tomato ketchup in a bottle-
None will come and then the lot'll
3.
What does this have to do with the murder rate in Brooklyn? Quite a bit, as it turns out, because in recent years social scientists have started to apply the theory of epidemics to human behavior. The foundational work in this field was done in the early seventies by the economist Thomas Schelling, then at Harvard University, who argued that "white flight" was a tipping-point phenomenon. Since that time, sociologists have actually gone to specific neighborhoods and figured out what the local tipping point is. A racist white neighborhood, for example, might empty out when blacks reach five per cent of the population. A liberal white neighborhood, on the other hand, might not tip until blacks make up forty or fifty per cent. George Galster, of the Urban Institute, in Washington, argues that the same patterns hold for attempts by governments or developers to turn a bad neighborhood around. "You get nothing until you reach the threshold," he says, "then you get boom."
Another researcher, David Rowe, a psychologist at the University of Arizona, uses epidemic theory to explain things like rates of sexual intercourse among teen-agers. If you take a group of thirteen-year-old virgins and follow them throughout their teen-age years, Rowe says, the pattern in which they first have sex will look like an epidemic curve. Non-virginity starts out at a low level, and then, at a certain point, it spreads from the precocious to the others as if it were a virus.
Some of the most fascinating work, however, comes from Jonathan Crane, a sociologist at the University of Illinois. In a 1991 study in the American Journal of Sociology, Crane looked at the effect the number of role models in a community-the professionals, managers, teachers whom the Census Bureau has defined as "high status"-has on the lives of teen-agers in the same neighborhood. His answer was surprising. He found little difference in teen-pregnancy rates or school-dropout rates in neighborhoods with between forty and five per cent of high-status workers. But when the number of professionals dropped below five per cent, the problems exploded. For black school kids, for example, as the percentage of high-status workers falls just 2.2 percentage points-from 5.6 per cent to 3.4 per cent-dropout rates more than double. At the same tipping point, the rates of childbearing for teen-age girls-which barely move at all up to that point-nearly double as well.
The point made by both Crane and Rowe is not simply that social problems are contagious-that non-virgins spread sex to virgins and that when neighborhoods decline good kids become infected by the attitudes of dropouts and teen-age mothers. Their point is that teen-age sex and dropping out of school are contagious in the same way that an infectious disease is contagious. Crane's study essentially means that at the five-per-cent tipping point neighborhoods go from relatively functional to wildly dysfunctional virtually overnight. There is no steady decline: a little change has a huge effect. The neighborhoods below the tipping point look like they've been hit by the Ebola virus.
It is possible to read in these case studies a lesson about the fate of modern liberalism. Liberals have been powerless in recent years to counter the argument that their policy prescriptions don't work. A program that spends, say, an extra thousand dollars to educate inner-city kids gets cut by Congress because it doesn't raise reading scores. But if reading problems are nonlinear the failure of the program doesn't mean-as conservatives might argue-that spending extra money on inner-city kids is wasted. It may mean that we need to spend even more money on these kids so that we can hit their tipping point. Hence liberalism's crisis. Can you imagine explaining the link between tipping points and big government to Newt Gingrich? Epidemic theory, George Galster says, "greatly complicates the execution of public policy. . . . You work, and you work, and you work, and if you haven't quite reached the threshold you don't seem to get any payoff. That's a very tough situation to sustain politically."
At the same time, tipping points give the lie to conservative policies of benign neglect. In New York City, for example, one round of cuts in, say, subway maintenance is justified with the observation that the previous round of cuts didn't seem to have any adverse consequences. But that's small comfort. With epidemic problems, as with ketchup, nothing comes and then the lot'll.
4.
Epidemic theory, in other words, should change the way we think about whether and why social programs work. Now for the critical question: Should it change the way we think about violent crime as well? This is what a few epidemiologists at the Centers for Disease Control, in Atlanta, suggested thirteen years ago, and at the time no one took them particularly seriously. "There was just a small group of us in an old converted bathroom in the sub-subbasement of Building Three at C.D.C.," Mark L. Rosenberg, who heads the Centers' violence group today, says. "Even within C.D.C., we were viewed as a fringe group. We had seven people and our budget was two hundred thousand dollars. People were very skeptical." But that was before Rosenberg's group began looking at things like suicide and gunshot wounds in ways that had never quite occurred to anyone else. Today, bringing epidemiological techniques to bear on violence is one of the hottest ideas in criminal research. "We've got a hundred and ten people and a budget of twenty-two million dollars," Rosenberg says. "There is interest in this all around the world now."
The public-health approach to crime doesn't hold that all crime acts like infectious disease. Clearly, there are neighborhoods where crime is simply endemic-where the appropriate medical analogy for homicide is not something as volatile as AIDS but cancer, a disease that singles out its victims steadily and implacably. There are, however, times and places where the epidemic model seems to make perfect sense. In the United States between the early sixties and the early seventies, the homicide rate doubled. In Stockholm between 1950 and 1970, rape went up three hundred per cent, murder and attempted murder went up six hundred per cent, and robberies a thousand per cent. That's not cancer; that's AIDS.
An even better example is the way that gangs spread guns and violence. "Once crime reaches a certain level, a lot of the gang violence we see is reciprocal," Robert Sampson, a sociologist at the University of Chicago, says. "Acts of violence lead to further acts of violence. You get defensive gun ownership. You get retaliation. There is a nonlinear phenomenon. With a gang shooting, you have a particular act, then a counter-response. It's sort of like an arms race. It can blow up very quickly."
How quickly? Between 1982 and 1992, the number of gang-related homicides in Los Angeles County handled by the L.A.P.D. and the County Sheriff's Department went from a hundred and fifty-eight to six hundred and eighteen. A more interesting number, however, is the proportion of those murders which resulted from drive-by shootings. Between 1979 and 1986, that number fluctuated, according to no particular pattern, between twenty-two and fifty-one: the phenomenon, an epidemiologist would say, was in equilibrium. Then, in 1987, the death toll from drive-bys climbed to fifty-seven, the next year to seventy-one, and the year after that to a hundred and ten; by 1992, it had reached two hundred and eleven. At somewhere between fifty and seventy homicides, the idea of drive-by shootings in L.A. had become epidemic. It tipped. When these results were published last fall in the Journal of the American Medical Association, the paper was entitled "The Epidemic of Gang-Related Homicides in Los Angeles County from 1979 Through 1994." The choice of the word "epidemic" was not metaphorical. "If this were a disease," H. Range Hutson, the physician who was the leading author on the study, says, "you would see the government rushing down here to assess what infectious organism is causing all these injuries and deaths."
Some of the best new ideas in preventing violence borrow heavily from the principles of epidemic theory. Take, for example, the so-called "broken window" hypothesis that has been used around the country as the justification for cracking down on "quality of life" crimes like public urination and drinking. In a famous experiment conducted twenty-seven years ago by the Stanford University psychologist Philip Zimbardo, a car was parked on a street in Palo Alto, where it sat untouched for a week. At the same time, Zimbardo had an identical car parked in a roughly comparable neighborhood in the Bronx, only in this case the license plates were removed and the hood was propped open. Within a day, it was stripped. Then, in a final twist, Zimbardo smashed one of the Palo Alto car's windows with a sledgehammer. Within a few hours, that car, too, was destroyed. Zimbardo's point was that disorder invites even more disorder-that a small deviation from the norm can set into motion a cascade of vandalism and criminality. The broken window was the tipping point.
The broken-window hypothesis was the inspiration for the cleanup of the subway system conducted by the New York City Transit Authority in the late eighties and early nineties. Why was the Transit Authority so intent on removing graffiti from every car and cracking down on the people who leaped over turnstiles without paying? Because those two "trivial" problems were thought to be tipping points-broken windows-that invited far more serious crimes. It is worth noting that not only did this strategy seem to work-since 1990, felonies have fallen more than fifty per cent-but one of its architects was the then chief of the Transit Police, William Bratton, who was later to take his ideas about preventing crime to the city as a whole when he became head of the New York Police Department.
Which brings us to North Brooklyn and the Seventy-fifth Precinct. In the Seven-Five, there are now slightly more officers than before. They stop more cars. They confiscate more guns. They chase away more street-corner loiterers. They shut down more drug markets. They have made a series of what seem, when measured against the extraordinary decline in murders, to be small changes. But it is the nature of nonlinear phenomena that sometimes the most modest of changes can bring about enormous effects. What happened to the murder rate may not be such a mystery in the end. Perhaps what William Bratton and Inspector Mezzadri have done is the equivalent of repairing the broken window or preventing that critical ten or fifteen thousand new H.I.V. infections. Perhaps Brooklyn-and with it New York City-has tipped.
Conquering the Coma
July 8, 1996
ANNALS OF MEDICINE
What does it take to save the life of a coma
patient like the Central Park victim? Not a
miracle, as her family and doctor explain.
1.
On the afternoon of Tuesday, June 4th, a young woman was taken by ambulance from Central Park to New York Hospital, on the Upper East Side. When she arrived in the emergency room, around four o'clock, she was in a coma, and she had no identification. Her head was, in the words of one physician, "the size of a pumpkin." She was bleeding from her nose and her left ear. Her right eye was swollen shut, and the bones above the eye were broken and covered by a black-and-blue bruise.
Within minutes, she was put on a ventilator and then given X-rays and a CAT scan. A small hole was drilled in her skull and a slender silicone catheter inserted, to drain cerebral spinal fluid and relieve the pressure steadily building in her brain. At midnight, after that pressure had risen precipitously, a neurosurgeon removed a blood clot from her right frontal cortex. A few hours later, Urgent Four-as the trauma-unit staff named her, because she was the fourth unidentified trauma patient in the hospital at that time-was wheeled from the operating room to an intensive-care bed overlooking the East River. She had staples in her scalp from the operation, and her chest and arms and fingers were hooked up, via a maze of intravenous lines and cables, to monitors registering heart rate, arterial blood pressure, and blood-oxygen saturation. She had special inflatable cuffs on her legs to prevent the formation of blood clots, and splints on her ankles, since coma patients tend to point their toes. Four days later, after she was identified and her parents and her two sisters arrived at her bedside, one of the first things the family did was to put two large pictures of her on the wall above her bed-one of her holding her niece, and the other of her laughing and leaning through a doorway-just so people would know what she really looked like.
Urgent Four-or the Central Park victim, as she became known during the spate of media attention that surrounded her case-came close to dying on two occasions. Each time, she fought back. On Wednesday, June 12th, eight days after entering the hospital, she opened one blue eye. The mayor of New York, Rudolph Giuliani, was paying one of his daily visits to her room at the time, and she looked directly at him. Several days later, she began tracking people with her one good eye as they came in and out of her room. She began to frown and smile. On June 19th, the neurosurgeon supervising her care leaned over her bed, pinched her to get her attention, and asked, "Can you open your mouth?" She opened her mouth. He said, "Is your name --?" She nodded and mouthed her name.
"I'm at the foot of her bed, her cheering squad," her mother recalls. She is a striking woman, with thick black hair and luminous eyes, and her voice grows animated with the memory. " 'Go! Go! You're doing great! Go!' And the doctor says, 'Do you want me to pinch her again?' And I'm yelling at her. I'm telling her, 'Say no, get out of here, go!' "
And her daughter mouthed, "Go."
2.
There is something compelling about such stories of medical recovery, and something undeniably moving about a young woman fighting back from the most devastating of injuries. In the days following the Central Park beating, the case assumed national proportions as the police frantically worked to locate Urgent Four's family and identify her attacker. The victim turned out to be a talented musician, a piano teacher beloved by her students. Her alleged assailant turned out to be a strange and deeply disturbed unemployed salesclerk, who veered off into Eastern mysticism during his interrogation by the police. The story also had a hero, in Jam Ghajar, the man who saved her life: a young and handsome neurosurgeon with an M.D. and a Ph.D., a descendant of Iranian royalty who has an athlete's walk, strong, beautiful hands, and ten medical-device patents to his name. If this were the movies, Ghajar would be played by Andy Garcia.
But the story of Urgent Four is not the standard tale of the triumph of medicine and the human spirit. To think of this as an episode of "E.R." is to diminish it. The typical narratives of recovery are about exceptional people in exceptional circumstances, and that is why the narratives are both irresistible and, finally, less than consoling. Brilliant doctors and new technology can work miracles. But what if your doctor isn't brilliant and your hospital doesn't have the newest technology? If the principal failing of the American medical system is that it provides one standard of care for the fortunate and another for everyone else, the typical story of medical triumph ends up as a kind of indictment, a reminder that miracles are apportioned by privilege and position.
The case of Urgent Four is different. The profession that saved her life is in the midst of an ambitious transformation, an attempt to insure that you do not have to be ten minutes away from one of the best hospitals in the country in order to survive a vicious beating. One of the leaders of the movement, in fact, is the doctor who saved Urgent Four's life, and he has held up the care she received as an example of what ought to be routine in the treatment of brain injury. That makes the lesson of the Central Park victim and her remarkable recovery exactly the opposite of the lesson of the heroic medical dramas on television. Recovery need not be remarkable. The real medical miracle is the kind that can be repeated over and over again.
3.
When Urgent Four was attacked in Central Park, her assailant smashed her forehead on the smooth, hard surface of the sidewalk with such force that the bones above her right eye were shattered. Then, as if he didn't consider his task completed, he turned her over and began again, pounding the back of her head against the ground hard enough to fracture her skull behind her left ear. The ferocity-and the thoroughness-of the attack bruised the muscles between her scalp and her skull, causing her scalp to swell. What was more serious was that in response to the trauma her brain also began to swell, pressing up against the inside of her skull. In any trauma patient, this swelling, which increases what is known as intracranial pressure (ICP), is the neurosurgeon's chief concern, because the continuing rise in ICP makes it harder and harder for the body to supply the brain with an adequate amount of blood. Upon autopsy, ninety per cent of coma patients show clear signs of stroke: their brains quite literally starved to death.
This is why when Urgent Four was brought to the New York Hospital-Cornell Medical Center complex the neurosurgical resident on duty inserted a catheter through her skull to siphon off excess cerebral spinal fluid. In cases of trauma, this clear fluid, in which the brain floats, flows into a cavity that is called the ventricle, in the center of the brain, and the hope was to empty the ventricle, reducing the pressure inside the skull. This is also why the trauma staff kept a very close eye on the pressure gauge attached to that catheter during the first few hours after Urgent Four was admitted. According to the index used by neurologists, a healthy person's ICP is between zero and ten. Urgent Four's was at twenty, which is high but not disastrous. A further rise, however, would put her in the danger zone. At nine o'clock Tuesday night, that is exactly what happened: Urgent Four's ICP abruptly surged into the fifties.
The physician in charge of her case, Jam Ghajar, is, at forty-four, one of the country's leading neurotrauma specialists. On his father's side, he is descended from the family that ruled Persia from the late seventeen-hundreds until 1925, and his grandfather on his mother's side was the Shah of Iran's personal physician. Neurosurgeons, Ghajar says, with a smile, are "overachievers," and the description fits him perfectly. As a seventeen-year-old, he was a volunteer at U.C.L.A.'s Brain Research Institute. As a first-year resident at New York Hospital, he invented a device--a tiny tripod to guide the insertion of ventricular catheters--that made the cover of the Journal of Neurosurgery. Today, Ghajar is the chief of neurosurgery at Jamaica Hospital, in Queens. He is also the president of the Aitken Neuroscience Institute, in Manhattan, a research group that grew out of the double tragedy experienced by the children of Sunny von Bülow, who lost not only their mother to coma but also their father, Prince Alfred von Auersperg, after a car accident, thirteen years ago. Most days, Ghajar drives back and forth between the hospital and the institute, juggling his research at Aitken with a clinical schedule that keeps him on call two weeks out of every four. "Jam is completely committed--he's got a razor-sharp focus," Sunny von Bülow's daughter, Ala Isham, told me. "He's godfather to my son. I always joke that we should carry little cards in our wallets saying that if anything happens to us call Jam Ghajar."
Ghajar spent all day Tuesday, June 4th, at Jamaica Hospital. In the evening, he returned to the Aitken Neuroscience Institute, where a colleague, Michael Lavyne, told him of the young woman hovering near death across the street at New York Hospital. At seven o'clock, Ghajar left his office for the hospital. Two hours later, with Urgent Four's ICP at dangerous levels, he ordered a second CAT scan, which immediately identified the culprit: the bruise on her right frontal cortex had given rise to a massive clot. At midnight, Ghajar drilled a small hole in her skull, cut out a chunk three inches in diameter with a zip saw, and, he said, "this big brain hemorrhage just came out-plop-like a big piece of black jelly."
But the task was only half finished. The rule of thumb for a trauma patient is that the blood pressure has to be kept at least seventy points higher than the ICP or the flow of oxygen and nutrients to the brain will be impaired. Even after Urgent Four's clot was removed, her differential was only fifty points. At the same time, however, her heart was racing at a hundred and eighty beats per minute. This made raising her blood pressure tricky. "We're standing around her bed," Ghajar recalls. "It's four in the morning. There's Dr. Fischer"-Eva Fischer, the group-care physician-"there's three surgical residents, there's myself, there's a chief resident from neurosurgery, and then two nurses, and we're all standing around her trying to figure out what the best drug would be to reduce her pulse and increase her blood pressure at the same time." It took three hours-and two different blood-pressure medicines-to get Urgent Four out of the danger zone. It was 7 a.m. when Ghajar left her bedside and began neurosurgery rounds.
4.
The identity of Urgent Four did not become known until the next day, Thursday. By a series of flukes, no one in her family had even suspected that she was missing. Her older sister, whom I will call Jane, had been with her the previous Saturday night, when she played in a concert. The two sisters, who share a birthday, spoke on the phone on Monday afternoon, and it wasn't unusual for several days to pass between conversations. Nor did the news of the attack, when it became public, make much of an impression on Jane: her car radio was broken, and, because she was busy with work, she had no time to read the newspaper. Her parents, meanwhile, were travelling in Utah, and were equally oblivious. "It was the first vacation we'd ever taken where we hadn't read a newspaper," her mother told me. "Or watched the news."
On Thursday, however, one of Urgent Four's piano students showed up for her weekly lesson, and when her teacher didn't arrive the student remembered seeing drawings of the Central Park victim that had been posted on buildings and mailboxes throughout Manhattan, and she began to wonder. She called the police. They searched the woman's apartment, on Fifty-seventh Street, and learned her parents' address, in New Jersey. Upon finding that they were away, the police telephoned Jane, at her home, also in New Jersey, using as a guide the return address on a letter Jane had written to her sister. It was one-thirty Friday morning.
"I got a call from the police, which I didn't believe, of course," Jane said. She is a graceful woman, with shoulder- length black hair and a hint of a Jersey accent. "I thought it was a prank call, and I thought I was being stalked. They asked me my name and if I had a sister with that name, and I was almost rude to them on the phone, because I thought it was someone playing a joke on me. Then they referred to this incident, and I had no idea what they were talking about. At that point, my husband ran and got the newspaper, because he had been following the story and had seen the sketch. I got off the phone and had to fight collapsing. The captain probably sensed that. He said, 'Can you come? I'll send you an escort.' And then he called back a little while later and said, 'Would you be willing to ride in a helicopter to get here?' "
At 3 a.m., she and her husband landed in Manhattan. They were taken immediately by police car to the hospital, and there they were greeted by Mayor Giuliani and Howard Safir, the police commissioner. "They probably spent twenty minutes trying to let me understand what had happened and prepare me, and I ended up saying, 'Don't bother trying to prepare me. It's not going to work.' The anticipation was awful. And when I saw her, of course, the effect was indescribable."
The next to arrive was the family's youngest daughter, who came by car with her husband later on Friday morning. At midnight Friday, the parents arrived. The police had tracked them down by tracing their rental-car registration and then sending the Utah police cruising through motel parking lots in and around Zion National Park to spot the corresponding license plate. "At one point after they found us," her father told me, "we drove through a town in Utah which had my mother's name. Both of us burst out crying. My mother was pretty close to her. So we took that as an omen that she would be looking over her." Jane said that when she first saw the patient at the hospital she knew immediately she was her sister. But her father said that if he had not been told who she was he would never have known her. "To me she was almost unrecognizable," he said.
I met with Urgent Four's family-her mother, father, and older sister-in Jam Ghajar's office, on East Seventy-second Street, two days after she first began to speak. Her parents have been together for thirty-eight years, and have the easy affection of the well-married. The father, trim and gray-haired, is an engineer by training, with the discipline and plainspokenness characteristic of that profession. His wife is a schoolteacher, intelligent and articulate. They spoke with me on the condition that the personal details of their lives be kept private, and they confined their conversation to details of the case which they considered germane: their religious faith, their admiration for Dr. Ghajar's medical team, their hopes for their daughter's recovery. It was an intense and moving conversation. Over the past three weeks, the family has fashioned a protective cocoon for themselves, refusing to read any of the press accounts of Urgent Four's assailant, and barely leaving her hospital room except to rest and eat. This was the first time they had talked to the outside world, and long-pent-up feelings and thoughts came out in a rush.
"We went for days on two hours' sleep," her mother said. "You don't feel as tired, because you're so wound up. You want it all to be over. You want to wake up and know it's over-and it's not." The mother seemed the most shaken and most exhausted of the three. At one point as we talked, she accidentally referred to her daughter in the past tense, saying, "She was-"
"Is," Jane interrupted. "Is."
5.
Had Urgent Four been taken to a smaller hospital, or to any of the thousands of trauma centers in America which do not specialize in brain injuries, the chances are that she would have been dead by the time any of her family arrived. This is what trauma experts who are familiar with the case believe, and, of the many lessons of the Central Park beating, it is the one that is hardest to understand. It's not, after all, as if Urgent Four were suffering from a rare and difficult brain tumor. Brain trauma is the leading cause of death due to injury for Americans under forty-five, and results in the death of some sixty thousand people every year. Nor is it as if Urgent Four had been given some kind of daring experimental therapy, available only at the most exclusive research hospitals. The insertion of the ventricular catheter is something that all neurosurgeons are taught to do in their first year of residency. CAT scanners are in every hospital. The removal of Urgent Four's blood clots was straightforward neurosurgery. The raising and monitoring of blood pressure are taught in Nursing 101. Urgent Four was treated according to standards and protocols that have been discussed in the medical literature, outlined at conferences, and backed by every expert in the field.
Yet the fact is that if she had been taken to a smaller hospital or to any one of the thousands of trauma centers in America which do not specialize in brain injuries she would have been treated very differently. When Ghajar and five other researchers surveyed the country's trauma centers five years ago, they found that seventy-nine per cent of the coma patients were routinely given steroids, despite the fact that steroids have been shown repeatedly to be of no use-and possibly of some harm-in reducing intracranial pressure. Ninety-five per cent of the centers surveyed were relying as well on hyperventilation, in which a patient is made to breathe more rapidly to reduce swelling-a technique that specialists like Ghajar will use only as a last resort. Prolonged hyperventilation does reduce ICP, but it can also end up reducing the flow of blood to the brain, causing irreversible brain damage. The most troubling finding, however, was that only a third of the trauma centers surveyed said that they routinely monitored ICP at all. In another hospital, the surge in Urgent Four's ICP on Tuesday night which signalled the formation of a blood clot might not have been caught.
Such dramatic variations in medical practice are hardly confined to neurosurgery. It is not unusual for doctors in one community to perform hysterectomies, say, at two or three times the rate of doctors in another town. Rates for some cardiac procedures differ around the country by as much as fifty per cent. Obstetrical specialists are almost twice as likely to deliver children by cesarean section as family physicians are. In one classic study published seven years ago, a team of researchers found that children in Boston were 3.8 times as likely to be hospitalized for asthma as children in Rochester, New York; 6.1 times as likely to be hospitalized for accidental poisoning; and 2.6 times as likely to be hospitalized for head injury.
In most cases, however, the concern about practice variation has focussed on the issue of cost. The point of the Boston-Rochester study was not that the children of Boston were receiving considerably better care than their counterparts in upstate New York but, rather, that health care for children in Boston might well be needlessly expensive. When it comes to brain injury, the stakes are a little higher. At the handful of centers around the country specializing in brain trauma, it is now not unusual for the mortality rates of coma patients to run in the range of twenty per cent or less. At trauma centers where brain injury is not a specialty, mortality rates for coma patients are often twice that. "If I break my leg, I don't care where I go," Randall Chesnut, a trauma specialist at San Francisco General Hospital, told me. "But, if I hit my head, I want to choose my hospital."
Part of the problem is that in the field of neurosurgery it has been difficult to reach hard, scientific conclusions about procedures and treatments. Physicians in the field have long assumed, for example, that blood clots in the brain should be removed as soon as possible. But how could that assumption ever be scientifically verified? Who would ever agree to let a comatose family member lie still with a mass of congealed blood in the brain while a team of curious researchers watched to see what happened? The complexity and mystery of the brain have, moreover, led to a culture that rewards intuition, and have thus convinced each neurosurgeon that his own experience is as valid as anyone else's. Worse, brain injury is an area that is of no more than passing interest to many neurosurgeons. Most neurosurgeons make their living doing disk surgery and removing brain tumors. Trauma is an afterthought. It doesn't pay particularly well, because many car-accident and shooting victims don't have insurance. (Urgent Four herself was without insurance, and a public collection has been made to help defray her medical expenses.) Nor does it pose the kind of surgical challenge that, say, an aneurysm or a tumor does. "It's something like-well, you've got mashed-up brains, and someone got hit by a car, and it's not really very interesting," Ghajar says. "But brain tumors are kind of interesting. What's happening with the DNA? Why does a tumor develop?"
Then, there are the hours, long and unpredictable, tied to the rhythms of street thugs and drunk drivers. Ghajar, for example, routinely works through the night. He practices primarily out of Jamaica Hospital, not the far more prestigious New York Hospital, because Jamaica gets serious brain-trauma cases every second day and New York might get one only every second week. "If I were operating and doing disks and brain tumors, I'd be making ten times as much," he says. In the entire country, there are probably no more than two dozen neurosurgeons who, like Ghajar, exclusively focus on researching and treating brain trauma.
Ghajar says that in talking to other neurosurgeons he sensed a certain resignation in treating brain injury-a feeling that the prognosis facing coma patients was so poor that the neurosurgeon's role was limited. "It wasn't that the neurosurgeons were lazy," Ghajar said. "It was just that there was so much information out there that it was confusing. When they got young people in comas, half of the patients would die. And the half that lived would be severely disabled, so the neurosurgeon is saying, 'What am I doing for these people? Am I saving vegetables?' And that was honestly the feeling that neurosurgeons had, because the methods they were trained in and were using would produce that kind of result."
Three years ago, after a neurosurgery meeting in Vancouver, Ghajar-along with Randall Chesnut and Donald W. Marion, a brain-trauma specialist at the University of Pittsburgh-decided to act. For help they turned to the Brain Trauma Foundation, which is the education arm of the brain-trauma institute started by Sunny von Bülow's children. The foundation gathered some of the world's top brain-injury specialists together for eleven meetings between the winter of 1994 and last summer. Four thousand scientific papers covering fourteen aspects of brain-injury management were reviewed. Peter C. Quinn, the executive director of the Brain Trauma Foundation, who coördinated the effort, says, "Sometimes I felt I was in a courtroom drama, because what they did was argue the evidence of the scientific documents, and as soon as someone said, 'It's been my experience,' everyone would say, 'Oh, no, that won't cut it. We want to know what the evidence is.' They would come in on a Friday and work all day Saturday and Sunday. They'd work a twenty-hour weekend. It was gruelling."
In March of this year, the group produced a book-a blue three-ring binder with fifteen bright-colored chapter tabs-laying out the scientific evidence and state-of-the-art treatment in every phase of brain-trauma care. The guidelines represent the first successful attempt by the neurosurgery community to come up with a standard treatment protocol, and if they are adopted by anything close to a majority of the country's trauma centers they could save more than ten thousand lives a year. A copy has now been sent to every neurosurgeon in the country. The Brain Trauma Foundation has mailed the guidelines to scientific journals, hospitals, managed-care groups, and insurance companies, and the neurosurgeons involved with the project have been promoting their work at medical meetings around the country. This is why the story of the Central Park victim does not end the way most medical dramas end, in empty celebration of heroics and exceptionalism, but instead has become a powerful symbol of the campaign to reform neurosurgery. For everything Jam Ghajar used to save Urgent Four's life is in that binder.
"What we are hoping is that if a woman gets hurt in the middle of rural Wyoming, and there is a neurosurgeon there and a hospital with an I.C.U., then she will have as good a chance to survive as she would in the middle of New York City," I was told by Jack Wilberger, Jr., who is an associate professor of neurosurgery at the University of Pittsburgh Medical Center and a member of the guidelines team. "That's what we're hoping for. To give everyone the same chance, to give a everyone a level playing field."
6.
Urgent Four had one more scare before she began her climb toward recovery. Late Sunday night, her ICP began to rise again, back up into the thirties. Ghajar, who was in Paris meeting with the World Health Organization about the brain-trauma guidelines and was calling in to the hospital residents for updates, began to get worried. He booked a flight home. While he was in the air, Urgent Four's condition worsened. A third CAT scan was ordered, and it showed that she had developed a second clot-this time on her left temporal lobe, in the place behind her ear where her attacker had banged the back of her head. This clot was far more serious than the first, because the temporal lobe is the seat of comprehension, and to remove the clot might well risk damaging Urgent Four's ability both to speak and to understand. "At about twelve-thirty, quarter to one on Monday, there was a pounding on the door of our room," the patient's father said. "We were wanted back on the floor, and we had to make a decision within a very few minutes on whether they should operate. What we were given was: If you don't operate, she might die. The other side of it was that if they did operate it could save her life but with a decent likelihood that she might be very badly impaired. So we and our two daughters went back and thrashed it out and we unanimously decided to go forward."
It was by then one-thirty in the morning. For four hours, the family waited, sleepless and exhausted, terrified that they had made the wrong decision. At dawn, the surgeon filling in for Ghajar, Michael Lavyne, emerged from the operating room. A miracle had happened, he reported: as soon as an incision was made, the clot had just popped out, all on its own. "They got lucky," Ghajar says.
From that point, Urgent Four's progress was steady. Her eye opened. Then she began to talk. The swelling around her face receded. Her ICP became normal. Soon she was sitting up. By last week, she was working with a speech therapist, and Ghajar and her father had begun driving around the New York area looking for a good rehabilitation center.
"Yesterday, she was looking at me, and I said, 'You know, you had a bad accident, and your brain was bruised'-I'd told everyone not to tell her she was assaulted. 'Your brain was bruised, and you are recovering.' She looked at me and she frowned. Her eye went up with this 'Oh, really?' look. And I said, 'Do you remember your accident?' She shook her head. But it's too early. Sometimes they do." Ghajar went on, "We are very good at predicting outcome, in the sense of mortality, but we're not good at predicting functional outcome, which is the constant question for this patient. 'Is she going to be able to play the piano?' We still can't answer that question."
In his first week back on call after the Urgent Four case, Ghajar saw three new coma patients. The latest was a thirty-year-old man who had barely survived a serious car accident. He was in worse shape than Urgent Four had been, with a hemorrhage on top of his brain. He was admitted to Jamaica Hospital on Monday at 11 p.m., and Ghajar operated from midnight to 6 a.m. He inserted a catheter in the patient's skull to drain the spinal fluid and monitored his blood pressure, to make sure it was seventy points higher than his ICP. Then, that evening-fourteen hours later-the patient's condition worsened. "I had to go back in and take out the hemorrhages," Ghajar said, and there was a note of exhaustion in his voice. He left the hospital at one o'clock Wednesday morning.
"People want to personalize this," Ghajar said. He was on Seventy-second Street, outside his office, walking back to New York Hospital to visit Urgent Four. "I guess that's human nature. They want to say, 'It's Dr. Ghajar's protocol. He's a wonderful doctor.' But that's not it. These are standards developed according to the best available science. These are standards that everyone can use."
The Science of Shopping
November 4, 1996
A REPORTER AT LARGE
The American shopper has never been so fickle.
What are stores, including the new flagship designer
boutiques, doing about it? Applying science.
1.
Human beings walk the way they drive, which is to say that Americans tend to keep to the right when they stroll down shopping-mall concourses or city sidewalks. This is why in a well-designed airport travellers drifting toward their gate will always find the fast-food restaurants on their left and the gift shops on their right: people will readily cross a lane of pedestrian traffic to satisfy their hunger but rarely to make an impulse buy of a T-shirt or a magazine. This is also why Paco Underhill tells his retail clients to make sure that their window displays are canted, preferably to both sides but especially to the left, so that a potential shopper approaching the store on the inside of the sidewalk-the shopper, that is, with the least impeded view of the store window-can see the display from at least twenty-five feet away.
Of course, a lot depends on how fast the potential shopper is walking. Paco, in his previous life, as an urban geographer in Manhattan, spent a great deal of time thinking about walking speeds as he listened in on the great debates of the nineteen-seventies over whether the traffic lights in midtown should be timed to facilitate the movement of cars or to facilitate the movement of pedestrians and so break up the big platoons that move down Manhattan sidewalks. He knows that the faster you walk the more your peripheral vision narrows, so you become unable to pick up visual cues as quickly as someone who is just ambling along. He knows, too, that people who walk fast take a surprising amount of time to slow down-just as it takes a good stretch of road to change gears with a stick-shift automobile. On the basis of his research, Paco estimates the human downshift period to be anywhere from twelve to twenty-five feet, so if you own a store, he says, you never want to be next door to a bank: potential shoppers speed up when they walk past a bank (since there's nothing to look at), and by the time they slow down they've walked right past your business. The downshift factor also means that when potential shoppers enter a store it's going to take them from five to fifteen paces to adjust to the light and refocus and gear down from walking speed to shopping speed-particularly if they've just had to navigate a treacherous parking lot or hurry to make the light at Fifty-seventh and Fifth. Paco calls that area inside the door the Decompression Zone, and something he tells clients over and over again is never, ever put anything of value in that zone-not shopping baskets or tie racks or big promotional displays-because no one is going to see it. Paco believes that, as a rule of thumb, customer interaction with any product or promotional display in the Decompression Zone will increase at least thirty per cent once it's moved to the back edge of the zone, and even more if it's placed to the right, because another of the fundamental rules of how human beings shop is that upon entering a store-whether it's Nordstrom or K mart, Tiffany or the Gap-the shopper invariably and reflexively turns to the right. Paco believes in the existence of the Invariant Right because he has actually verified it. He has put cameras in stores trained directly on the doorway, and if you go to his office, just above Union Square, where videocassettes and boxes of Super-eight film from all his work over the years are stacked in plastic Tupperware containers practically up to the ceiling, he can show you reel upon reel of grainy entryway video-customers striding in the door, downshifting, refocussing, and then, again and again, making that little half turn.
Paco Underhill is a tall man in his mid-forties, partly bald, with a neatly trimmed beard and an engaging, almost goofy manner. He wears baggy khakis and shirts open at the collar, and generally looks like the academic he might have been if he hadn't been captivated, twenty years ago, by the ideas of the urban anthropologist William Whyte. It was Whyte who pioneered the use of time-lapse photography as a tool of urban planning, putting cameras in parks and the plazas in front of office buildings in midtown Manhattan, in order to determine what distinguished a public space that worked from one that didn't. As a Columbia undergraduate, in 1974, Paco heard a lecture on Whyte's work and, he recalls, left the room "walking on air." He immediately read everything Whyte had written. He emptied his bank account to buy cameras and film and make his own home movie, about a pedestrian mall in Poughkeepsie. He took his "little exercise" to Whyte's advocacy group, the Project for Public Spaces, and was offered a job. Soon, however, it dawned on Paco that Whyte's ideas could be taken a step further-that the same techniques he used to establish why a plaza worked or didn't work could also be used to determine why a store worked or didn't work. Thus was born the field of retail anthropology, and, not long afterward, Paco founded Envirosell, which in just over fifteen years has counselled some of the most familiar names in American retailing, from Levi Strauss to Kinney, Starbucks, McDonald's, Blockbuster, Apple Computer, A.T. & T., and a number of upscale retailers that Paco would rather not name.
When Paco gets an assignment, he and his staff set up a series of video cameras throughout the test store and then back the cameras up with Envirosell staffers-trackers, as they're known-armed with clipboards. Where the cameras go and how many trackers Paco deploys depends on exactly what the store wants to know about its shoppers. Typically, though, he might use six cameras and two or three trackers, and let the study run for two or three days, so that at the end he would have pages and pages of carefully annotated tracking sheets and anywhere from a hundred to five hundred hours of film. These days, given the expansion of his business, he might tape fifteen thousand hours in a year, and, given that he has been in operation since the late seventies, he now has well over a hundred thousand hours of tape in his library.
Even in the best of times, this would be a valuable archive. But today, with the retail business in crisis, it is a gold mine. The time per visit that the average American spends in a shopping mall was sixty-six minutes last year-down from seventy-two minutes in 1992-and is the lowest number ever recorded. The amount of selling space per American shopper is now more than double what it was in the mid-seventies, meaning that profit margins have never been narrower, and the costs of starting a retail business-and of failing-have never been higher. In the past few years, countless dazzling new retailing temples have been built along Fifth and Madison Avenues-Barneys, Calvin Klein, Armani, Valentino, Banana Republic, Prada, Chanel, Nike Town, and on and on-but it is an explosion of growth based on no more than a hunch, a hopeful multimillion-dollar gamble that the way to break through is to provide the shopper with spectacle and more spectacle. "The arrogance is gone," Millard Drexler, the president and C.E.O. of the Gap, told me. "Arrogance makes failure. Once you think you know the answer, it's almost always over." In such a competitive environment, retailers don't just want to know how shoppers behave in their stores. They have to know. And who better to ask than Paco Underhill, who in the past decade and a half has analyzed tens of thousands of hours of shopping videotape and, as a result, probably knows more about the strange habits and quirks of the species Emptor americanus than anyone else alive?
2.
Paco is considered the originator, for example, of what is known in the trade as the butt-brush theory-or, as Paco calls it, more delicately, le facteur bousculade-which holds that the likelihood of a woman's being converted from a browser to a buyer is inversely proportional to the likelihood of her being brushed on her behind while she's examining merchandise. Touch-or brush or bump or jostle-a woman on the behind when she has stopped to look at an item, and she will bolt. Actually, calling this a theory is something of a misnomer, because Paco doesn't offer any explanation for why women react that way, aside from venturing that they are "more sensitive back there." It's really an observation, based on repeated and close analysis of his videotape library, that Paco has transformed into a retailing commandment: a women's product that requires extensive examination should never be placed in a narrow aisle.
Paco approaches the problem of the Invariant Right the same way. Some retail thinkers see this as a subject crying out for interpretation and speculation. The design guru Joseph Weishar, for example, argues, in his magisterial "Design for Effective Selling Space," that the Invariant Right is a function of the fact that we "absorb and digest information in the left part of the brain" and "assimilate and logically use this information in the right half," the result being that we scan the store from left to right and then fix on an object to the right "essentially at a 45 degree angle from the point that we enter." When I asked Paco about this interpretation, he shrugged, and said he thought the reason was simply that most people are right-handed. Uncovering the fundamentals of "why" is clearly not a pursuit that engages him much. He is not a theoretician but an empiricist, and for him the important thing is that in amassing his huge library of in-store time-lapse photography he has gained enough hard evidence to know how often and under what circumstances the Invariant Right is expressed and how to take advantage of it.
What Paco likes are facts. They come tumbling out when he talks, and, because he speaks with a slight hesitation-lingering over the first syllable in, for example, "re-tail" or "de-sign"-he draws you in, and you find yourself truly hanging on his words. "We have reached a historic point in American history," he told me in our very first conversation. "Men, for the first time, have begun to buy their own underwear." He then paused to let the comment sink in, so that I could absorb its implications, before he elaborated: "Which means that we have to totally rethink the way we sell that product." In the parlance of Hollywood scriptwriters, the best endings must be surprising and yet inevitable; and the best of Paco's pronouncements take the same shape. It would never have occurred to me to wonder about the increasingly critical role played by touching-or, as Paco calls it, petting-clothes in the course of making the decision to buy them. But then I went to the Gap and to Banana Republic and saw people touching and fondling and, one after another, buying shirts and sweaters laid out on big wooden tables, and what Paco told me-which was no doubt based on what he had seen on his videotapes-made perfect sense: that the reason the Gap and Banana Republic have tables is not merely that sweaters and shirts look better there, or that tables fit into the warm and relaxing residential feeling that the Gap and Banana Republic are trying to create in their stores, but that tables invite-indeed, symbolize-touching. "Where do we eat?" Paco asks. "We eat, we pick up food, on tables."
Paco produces for his clients a series of carefully detailed studies, totalling forty to a hundred and fifty pages, filled with product-by-product breakdowns and bright-colored charts and graphs. In one recent case, he was asked by a major clothing retailer to analyze the first of a new chain of stores that the firm planned to open. One of the things the client wanted to know was how successful the store was in drawing people into its depths, since the chances that shoppers will buy something are directly related to how long they spend shopping, and how long they spend shopping is directly related to how deep they get pulled into the store. For this reason, a supermarket will often put dairy products on one side, meat at the back, and fresh produce on the other side, so that the typical shopper can't just do a drive-by but has to make an entire circuit of the store, and be tempted by everything the supermarket has to offer. In the case of the new clothing store, Paco found that ninety-one per cent of all shoppers penetrated as deep as what he called Zone 4, meaning more than three-quarters of the way in, well past the accessories and shirt racks and belts in the front, and little short of the far wall, with the changing rooms and the pants stacked on shelves. Paco regarded this as an extraordinary figure, particularly for a long, narrow store like this one, where it is not unusual for the rate of penetration past, say, Zone 3 to be under fifty per cent. But that didn't mean the store was perfect-far from it. For Paco, all kinds of questions remained.
Purchasers, for example, spent an average of eleven minutes and twenty-seven seconds in the store, nonpurchasers two minutes and thirty-six seconds. It wasn't that the nonpurchasers just cruised in and out: in those two minutes and thirty-six seconds, they went deep into the store and examined an average of 3.42 items. So why didn't they buy? What, exactly, happened to cause some browsers to buy and other browsers to walk out the door?
Then, there was the issue of the number of products examined. The purchasers were looking at an average of 4.81 items but buying only 1.33 items. Paco found this statistic deeply disturbing. As the retail market grows more cutthroat, store owners have come to realize that it's all but impossible to increase the number of customers coming in, and have concentrated instead on getting the customers they do have to buy more. Paco thinks that if you can sell someone a pair of pants you must also be able to sell that person a belt, or a pair of socks, or a pair of underpants, or even do what the Gap does so well: sell a person a complete outfit. To Paco, the figure 1.33 suggested that the store was doing something very wrong, and one day when I visited him in his office he sat me down in front of one of his many VCRs to see how he looked for the 1.33 culprit.
It should be said that sitting next to Paco is a rather strange experience. "My mother says that I'm the best-paid spy in America," he told me. He laughed, but he wasn't entirely joking. As a child, Paco had a nearly debilitating stammer, and, he says, "since I was never that comfortable talking I always relied on my eyes to understand things." That much is obvious from the first moment you meet him: Paco is one of those people who look right at you, soaking up every nuance and detail. It isn't a hostile gaze, because Paco isn't hostile at all. He has a big smile, and he'll call you "chief" and use your first name a lot and generally act as if he knew you well. But that's the awkward thing: he has looked at you so closely that you're sure he does know you well, and you, meanwhile, hardly know him at all. This kind of asymmetry is even more pronounced when you watch his shopping videos with him, because every movement or gesture means something to Paco-he has spent his adult life deconstructing the shopping experience-but nothing to the outsider, or, at least, not at first. Paco had to keep stopping the video to get me to see things through his eyes before I began to understand. In one sequence, for example, a camera mounted high on the wall outside the changing rooms documented a man and a woman shopping for a pair of pants for what appeared to be their daughter, a girl in her mid-teens. The tapes are soundless, but the basic steps of the shopping dance are so familiar to Paco that, once I'd grasped the general idea, he was able to provide a running commentary on what was being said and thought. There is the girl emerging from the changing room wearing her first pair. There she is glancing at her reflection in the mirror, then turning to see herself from the back. There is the mother looking on. There is the father-or, as fathers are known in the trade, the "wallet carrier"-stepping forward and pulling up the jeans. There's the girl trying on another pair. There's the primp again. The twirl. The mother. The wallet carrier. And then again, with another pair. The full sequence lasted twenty minutes, and at the end came the take-home lesson, for which Paco called in one of his colleagues, Tom Moseman, who had supervised the project. "This is a very critical moment," Tom, a young, intense man wearing little round glasses, said, and he pulled up a chair next to mine. "She's saying, 'I don't know whether I should wear a belt.' Now here's the salesclerk. The girl says to him, 'I need a belt,' and he says, 'Take mine.' Now there he is taking her back to the full-length mirror." A moment later, the girl returns, clearly happy with the purchase. She wants the jeans. The wallet carrier turns to her, and then gestures to the salesclerk. The wallet carrier is telling his daughter to give back the belt. The girl gives back the belt. Tom stops the tape. He's leaning forward now, a finger jabbing at the screen. Beside me, Paco is shaking his head. I don't get it-at least, not at first-and so Tom replays that last segment. The wallet carrier tells the girl to give back the belt. She gives back the belt. And then, finally, it dawns on me why this store has an average purchase number of only 1.33. "Don't you see?" Tom said. "She wanted the belt. A great opportunity to make an add-on sale . . . lost!"
3.
Should we be afraid of Paco Underhill? One of the fundamental anxieties of the American consumer, after all, has always been that beneath the pleasure and the frivolity of the shopping experience runs an undercurrent of manipulation, and that anxiety has rarely seemed more justified than today. The practice of prying into the minds and habits of American consumers is now a multibillion-dollar business. Every time a product is pulled across a supermarket checkout scanner, information is recorded, assembled, and sold to a market-research firm for analysis. There are companies that put tiny cameras inside frozen-food cases in supermarket aisles; market-research firms that feed census data and behavioral statistics into algorithms and come out with complicated maps of the American consumer; anthropologists who sift through the garbage of carefully targeted households to analyze their true consumption patterns; and endless rounds of highly organized focus groups and questionnaire takers and phone surveyors. That some people are now tracking our every shopping move with video cameras seems in many respects the last straw: Paco's movies are, after all, creepy. They look like the surveillance videos taken during convenience-store holdups-hazy and soundless and slightly warped by the angle of the lens. When you watch them, you find yourself waiting for something bad to happen, for someone to shoplift or pull a gun on a cashier.
The more time you spend with Paco's videos, though, the less scary they seem. After an hour or so, it's no longer clear whether simply by watching people shop-and analyzing their every move-you can learn how to control them. The shopper that emerges from the videos is not pliable or manipulable. The screen shows people filtering in and out of stores, petting and moving on, abandoning their merchandise because checkout lines are too long, or leaving a store empty-handed because they couldn't fit their stroller into the aisle between two shirt racks. Paco's shoppers are fickle and headstrong, and are quite unwilling to buy anything unless conditions are perfect-unless the belt is presented at exactly the right moment. His theories of the butt-brush and petting and the Decompression Zone and the Invariant Right seek not to make shoppers conform to the desires of sellers but to make sellers conform to the desires of shoppers. What Paco is teaching his clients is a kind of slavish devotion to the shopper's every whim. He is teaching them humility. Paco has worked with supermarket chains, and when you first see one of his videos of grocery aisles it looks as if he really had-at least in this instance-got one up on the shopper. The clip he showed me was of a father shopping with a small child, and it was an example of what is known in the trade as "advocacy," which basically means what happens when your four-year-old goes over and grabs a bag of cookies that the store has conveniently put on the bottom shelf, and demands that it be purchased. In the clip, the father takes what the child offers him. "Generally, dads are not as good as moms at saying no," Paco said as we watched the little boy approach his dad. "Men tend to be more impulse-driven than women in grocery stores. We know that they tend to shop less often with a list. We know that they tend to shop much less frequently with coupons, and we know, simply by watching them shop, that they can be marching down the aisle and something will catch their eye and they will stop and buy." This kind of weakness on the part of fathers might seem to give the supermarket an advantage in the cookie-selling wars, particularly since more and more men go grocery shopping with their children. But then Paco let drop a hint about a study he'd just done in which he discovered, to his and everyone else's amazement, that shoppers had already figured this out, that they were already one step ahead-that families were avoiding the cookie aisle. This may seem like a small point. But it begins to explain why, even though retailers seem to know more than ever about how shoppers behave, even though their efforts at intelligence-gathering have rarely seemed more intrusive and more formidable, the retail business remains in crisis. The reason is that shoppers are a moving target. They are becoming more and more complicated, and retailers need to know more and more about them simply to keep pace. This fall, for example, Estée Lauder is testing in a Toronto shopping mall a new concept in cosmetics retailing. Gone is the enclosed rectangular counter, with the sales staff on one side, customers on the other, and the product under glass in the middle. In its place the company has provided an assortment of product-display, consultation, and testing kiosks arranged in a broken circle, with a service desk and a cashier in the middle.
One of the kiosks is a "makeup play area," which allows customers to experiment on their own with a hundred and thirty different shades of lipstick. There are four self-service displays-for perfumes, skin-care products, and makeup-which are easily accessible to customers who have already made up their minds. And, for those who haven't, there is a semiprivate booth for personal consultations with beauty advisers and makeup artists. The redesign was prompted by the realization that the modern working woman no longer had the time or the inclination to ask a salesclerk to assist her in every purchase, that choosing among shades of lipstick did not require the same level of service as, say, getting up to speed on new developments in skin care, that a shopper's needs were now too diverse to be adequately served by just one kind of counter. "I was going from store to store, and the traffic just wasn't there," Robin Burns, the president and C.E.O. of Estée Lauder U.S.A. and Canada, told me. "We had to get rid of the glass barricade." The most interesting thing about the new venture, though, is what it says about the shifting balance of power between buyer and seller. Around the old rectangular counter, the relationship of clerk to customer was formal and subtly paternalistic. If you wanted to look at a lipstick, you had to ask for it. "Twenty years ago, the sales staff would consult with you and tell you what you needed, as opposed to asking and recommending," Burns said. "And in those days people believed what the salesperson told them." Today, the old hierarchy has been inverted. "Women want to draw their own conclusions," Burns said. Even the architecture of the consultation kiosk speaks to the transformation: the beauty adviser now sits beside the customer, not across from her.
4.
This doesn't mean that marketers and retailers have stopped trying to figure out what goes on in the minds of shoppers. One of the hottest areas in market research, for example, is something called typing, which is a sophisticated attempt to predict the kinds of products that people will buy or the kind of promotional pitch they will be susceptible to on the basis of where they live or how they score on short standardized questionnaires. One market-research firm in Virginia, Claritas, has divided the entire country, neighborhood by neighborhood, into sixty-two different categories-Pools & Patios, Shotguns & Pickups, Bohemia Mix, and so on-using census data and results from behavioral surveys. On the basis of my address in Greenwich Village, Claritas classifies me as Urban Gold Coast, which means that I like Kellogg's Special K, spend more than two hundred and fifty dollars on sports coats, watch "Seinfeld," and buy metal polish. Such typing systems-and there are a number of them-can be scarily accurate. I actually do buy Kellogg's Special K, have spent more than two hundred and fifty dollars on a sports coat, and watch "Seinfeld." (I don't buy metal polish.) In fact, when I was typed by a company called Total Research, in Princeton, the results were so dead-on that I got the same kind of creepy feeling that I got when I first watched Paco's videos. On the basis of a seemingly innocuous multiple-choice test, I was scored as an eighty-nine-per-cent Intellect and a seven-per-cent Relief Seeker (which I thought was impressive until John Morton, who developed the system, told me that virtually everyone who reads The New Yorker is an Intellect). When I asked Morton to guess, on the basis of my score, what kind of razor I used, he riffed, brilliantly, and without a moment's hesitation. "If you used an electric razor, it would be a Braun," he began. "But, if not, you're probably shaving with Gillette, if only because there really isn't an Intellect safety-razor positioning out there. Schick and Bic are simply not logical choices for you, although I'm thinking, You're fairly young, and you've got that Relief Seeker side. It's possible you would use Bic because you don't like that all-American, overly confident masculine statement of Gillette. It's a very, very conventional positioning that Gillette uses. But then they've got the technological angle with the Gillette Sensor. . . . I'm thinking Gillette. It's Gillette."
He was right. I shave with Gillette-though I didn't even know that I do. I had to go home and check. But information about my own predilections may be of limited usefulness in predicting how I shop. In the past few years, market researchers have paid growing attention to the role in the shopping experience of a type of consumer known as a Market Maven. "This is a person you would go to for advice on a car or a new fashion," said Linda Price, a marketing professor at the University of South Florida, who first came up with the Market Maven concept, in the late eighties. "This is a person who has information on a lot of different products or prices or places to shop. This is a person who likes to initiate discussions with consumers and respond to requests. Market Mavens like to be helpers in the marketplace. They take you shopping. They go shopping for you, and it turns out they are a lot more prevalent than you would expect." Mavens watch more television than almost anyone else does, and they read more magazines and open their junk mail and look closely at advertisements and have an awful lot of influence on everyone else. According to Price, sixty per cent of Americans claim to know a Maven.
The key question, then, is not what I think but what my Mavens think. The challenge for retailers and marketers, in turn, is not so much to figure out and influence my preferences as to figure out and influence the preferences of my Mavens, and that is a much harder task. "What's really interesting is that the distribution of Mavens doesn't vary by ethnic category, by income, or by professional status," Price said. "A working woman is just as likely to be a Market Maven as a nonworking woman. You might say that Mavens are likely to be older, unemployed people, but that's wrong, too. There is simply not a clear demographic guide to how to find these people." More important, Mavens are better consumers than most of the rest of us. In another of the typing systems, developed by the California-based SRI International, Mavens are considered to be a subcategory of the consumer type known as Fulfilled, and Fulfilleds, one SRI official told me, are "the consumers from Hell-they are very feature oriented." He explained, "They are not pushed by promotions. You can reach them, but it's an intellectual argument." As the complexity of the marketplace grows, in other words, we have responded by appointing the most skeptical and the most savvy in our midst to mediate between us and sellers. The harder stores and manufacturers work to sharpen and refine their marketing strategies, and the harder they try to read the minds of shoppers, the more we hide behind Mavens.
5.
Imagine that you want to open a clothing store, men's and women's, in the upper-middle range-say, khakis at fifty dollars, dress shirts at forty dollars, sports coats and women's suits at two hundred dollars and up. The work of Paco Underhill would suggest that in order to succeed you need to pay complete and concentrated attention to the whims of your customers. What does that mean, in practical terms? Well, let's start with what's called the shopping gender gap. In the retail-store study that Paco showed me, for example, male buyers stayed an average of nine minutes and thirty-nine seconds in the store and female buyers stayed twelve minutes and fifty-seven seconds. This is not atypical. Women always shop longer than men, which is one of the major reasons that in the standard regional mall women account for seventy per cent of the dollar value of all purchases. "Women have more patience than men," Paco says. "Men are more distractible. Their tolerance level for confusion or time spent in a store is much shorter than women's." If you wanted, then, you could build a store designed for men, to try to raise that thirty-per-cent sales figure to forty or forty-five per cent. You could make the look more masculine-more metal, darker woods. You could turn up the music. You could simplify the store, put less product on the floor. "I'd go narrow and deep," says James Adams, the design director for NBBJ Retail Concepts, a division of one of the country's largest retail-design firms. "You wouldn't have fifty different cuts of pants. You'd have your four basics with lots of color. You know the Garanimals they used to do to help kids pick out clothes, where you match the giraffe top with the giraffe bottom? I'm sure every guy is like 'I wish I could get those, too.' You'd want to stick with the basics. Making sure most of the color story goes together. That is a big deal with guys, because they are always screwing the colors up." When I asked Carrie Gennuso, the Gap's regional vice-president for New York, what she would do in an all-male store, she laughed and said, "I might do fewer displays and more signage. Big signs. Men! Smalls! Here!" As a rule, though, you wouldn't want to cater to male customers at the expense of female ones. It's no accident that many clothing stores have a single look in both men's and women's sections, and that the quintessential nineties look-light woods, white walls-is more feminine than masculine. Women are still the shoppers in America, and the real money is to be made by making retailing styles more female-friendly, not less. Recently, for example, NBBJ did a project to try to increase sales of the Armstrong flooring chain. Its researchers found that the sales staff was selling the flooring based on its functional virtues-the fact that it didn't scuff, that it was long-lasting, that it didn't stain, that it was easy to clean. It was being sold by men to men, as if it were a car or a stereo. And that was the problem. "It's a wonder product technologically," Adams says. "But the woman is the decision-maker on flooring, and that's not what she's looking for. This product is about fashion, about color and design. You don't want to get too caught up in the man's way of thinking."
To appeal to men, then, retailers do subtler things. At the Banana Republic store on Fifth Avenue in midtown, the men's socks are displayed near the shoes and between men's pants and the cash register (or cash/wrap, as it is known in the trade), so that the man can grab them easily as he rushes to pay. Women's accessories are by the fitting rooms, because women are much more likely to try on pants first, and then choose an item like a belt or a bag. At the men's shirt table, the display shirts have matching ties on them-the tie table is next to it-in a grownup version of the Garanimals system. But Banana Republic would never match scarves with women's blouses or jackets. "You don't have to be that direct with women," Jeanne Jackson, the president of Banana Republic, told me. "In fact, the Banana woman is proud of her sense of style. She puts her own looks together." Jackson said she liked the Fifth Avenue store because it's on two floors, so she can separate men's and women's sections and give men what she calls "clarity of offer," which is the peace of mind that they won't inadvertently end up in, say, women's undergarments. In a one-floor store, most retailers would rather put the menswear up front and the women's wear at the back (that is, if they weren't going to split the sexes left and right), because women don't get spooked navigating through apparel of the opposite sex, whereas men most assuredly do. (Of course, in a store like the Gap at Thirty-ninth and Fifth, where, Carrie Gennuso says, "I don't know if I've ever seen a man," the issue is moot. There, it's safe to put the women's wear out front.)
The next thing retailers want to do is to encourage the shopper to walk deep into the store. The trick there is to put "destination items"-basics, staples, things that people know you have and buy a lot of-at the rear of the store. Gap stores, invariably, will have denim, which is a classic destination item for them, on the back wall. Many clothing stores also situate the cash/wrap and the fitting rooms in the rear of the store, to compel shoppers to walk back into Zone 3 or 4. In the store's prime real estate-which, given Paco's theory of the Decompression Zone and the Invariant Right, is to the right of the front entrance and five to fifteen paces in-you always put your hottest and newest merchandise, because that's where the maximum number of people will see it. Right now, in virtually every Gap in the country, the front of the store is devoted to the Gap fall look-casual combinations in black and gray, plaid shirts and jackets, sweaters, black wool and brushed-twill pants. At the Gap at Fifth Avenue and Seventeenth Street, for example, there is a fall ensemble of plaid jacket, plaid shirt, and black pants in the first prime spot, followed, three paces later, by an ensemble of gray sweater, plaid shirt, T-shirt, and black pants, followed, three paces after that, by an ensemble of plaid jacket, gray sweater, white T-shirt, and black pants. In all, three variations on the same theme, each placed so that the eye bounces naturally from the first to the second to the third, and then, inexorably, to a table deep inside Zone 1 where merchandise is arrayed and folded for petting. Every week or ten days, the combinations will change, the "look" highlighted at the front will be different, and the entryway will be transformed.
Through all of this, the store environment-the lighting, the colors, the fixtures-and the clothes have to work together. The point is not so much beauty as coherence. The clothes have to match the environment. "In the nineteen-seventies, you didn't have to have a complete wardrobe all the time," Gabriella Forte, the president and chief operating officer of Calvin Klein, says. "I think now the store has to have a complete point of view. It has to have all the options offered, so people have choices. It's the famous one-stop shopping. People want to come in, be serviced, and go out. They want to understand the clear statement the designer is making."
At the new Versace store on Fifth Avenue, in the restored neoclassical Vanderbilt mansion, Gianni Versace says that the "statement" he is making with the elaborate mosaic and parquet floors, the marble façade and the Corinthian columns is "quality-my message is always a scream for quality." At her two new stores in London, Donna Karan told me, she never wants "customers to think that they are walking into a clothing store." She said, "I want them to think that they are walking into an environment, that I am transforming them out of their lives and into an experience, that it's not about clothes, it's about who they are as people." The first thing the shopper sees in her stark, all-white DKNY store is a video monitor and café: "It's about energy," Karan said, "and nourishment." In her more sophisticated, "collection" store, where the walls are black and ivory and gold, the first thing that the customer notices is the scent of a candle: "I wanted a nurturing environment where you feel that you will be taken care of." And why, at a Giorgio Armani store, is there often only a single suit in each style on display? Not because the store has only the one suit in stock but because the way the merchandise is displayed has to be consistent with the message of the designers: that Armani suits are exclusive, that the Armani customer isn't going to run into another man wearing his suit every time he goes to an art opening at Gagosian.
The best stores all have an image-or what retailers like to call a "point of view." The flagship store for Ralph Lauren's Polo collection, for example, is in the restored Rhinelander mansion, on Madison Avenue and Seventy-second Street. The Polo Mansion, as it is known, is alive with color and artifacts that suggest a notional prewar English gentility. There are fireplaces and comfortable leather chairs and deep-red Oriental carpets and soft, thick drapes and vintage photographs and paintings of country squires and a color palette of warm crimsons and browns and greens-to the point that after you've picked out a double-breasted blazer or a cashmere sweater set or an antique silver snuffbox you feel as though you ought to venture over to Central Park for a vigorous morning of foxhunting. The Calvin Klein flagship store, twelve blocks down Madison Avenue, on the other hand, is a vast, achingly beautiful minimalist temple, with white walls, muted lighting, soaring ceilings, gray stone flooring, and, so it seems, less merchandise in the entire store than Lauren puts in a single room. The store's architect, John Pawson, says, "People who enter are given a sense of release. They are getting away from the hustle and bustle of the street and New York. They are in a calm space. It's a modern idea of luxury, to give people space."
The first thing you see when you enter the Polo Mansion is a display of two hundred and eight sweaters, in twenty-eight colors, stacked in a haberdasher's wooden fixture, behind an antique glass counter; the first thing you see at the Klein store is a white wall, and then, if you turn to the right, four clear-glass shelves, each adorned with three solitary-looking black handbags. The Polo Mansion is an English club. The Klein store, Pawson says, is the equivalent of an art gallery, a place where "neutral space and light make a work of art look the most potent." When I visited the Polo Mansion, the stereo was playing Bobby Short. At Klein, the stereo was playing what sounded like Brian Eno. At the Polo Mansion, I was taken around by Charles Fagan, a vice-president at Polo Ralph Lauren. He wore pale-yellow socks, black loafers, tight jeans, a pale-purple polo shirt, blue old-school tie, and a brown plaid jacket-which sounds less attractive on paper than it was in reality. He looked, in a very Ralph Lauren way, fabulous. He was funny and engaging and bounded through the store, keeping up a constant patter ("This room is sort of sportswear, Telluride-y, vintage"), all the while laughing and hugging people and having his freshly cut red hair tousled by the sales assistants in each section. At the Calvin Klein store, the idea that the staff-tall, austere, sombre-suited-might laugh and hug and tousle each other's hair is unthinkable. Lean over and whisper, perhaps. At the most, murmur discreetly into tiny black cellular phones. Visiting the Polo Mansion and the Calvin Klein flagship in quick succession is rather like seeing a "Howards End"-"The Seventh Seal" double feature.
Despite their differences, though, these stores are both about the same thing-communicating the point of view that shoppers are now thought to demand. At Polo, the "life style" message is so coherent and all-encompassing that the store never has the 1.33 items-per-purchase problem that Paco saw in the retailer he studied. "We have multiple purchases in excess-it's the cap, it's the tie, it's the sweater, it's the jacket, it's the pants," Fagan told me, plucking each item from its shelf and tossing it onto a tartan-covered bench seat. "People say, 'I have to have the belt.' It's a life-style decision."
As for the Klein store, it's really concerned with setting the tone for the Calvin Klein clothes and products sold outside the store-including the designer's phenomenally successful underwear line, the sales of which have grown nearly fivefold in the past two and a half years, making it one of the country's dominant brands. Calvin Klein underwear is partly a design triumph: lowering the waistband just a tad in order to elongate, and flatter, the torso. But it is also a triumph of image-transforming, as Gabriella Forte says, a "commodity good into something desirable," turning a forgotten necessity into fashion. In the case of women's underwear, Bob Mazzoli, president of Calvin Klein Underwear, told me that the company "obsessed about the box being a perfect square, about the symmetry of it all, how it would feel in a woman's hand." He added, "When you look at the boxes they are little works of art." And the underwear itself is without any of the usual busyness-without, in Mazzoli's words, "the excessive detail" of most women's undergarments. It's a clean look, selling primarily in white, heather gray, and black. It's a look, in other words, not unlike that of the Calvin Klein flagship store, and it exemplifies the brilliance of the merchandising of the Calvin Klein image: preposterous as it may seem, once you've seen the store and worn the underwear, it's difficult not to make a connection between the two.
All this imagemaking seeks to put the shopping experience in a different context, to give it a story line. "I wish that the customers who come to my stores feel the same comfort they would entering a friend's house-that is to say, that they feel at ease, without the impression of having to deal with the 'sanctum sanctorum' of a designer," Giorgio Armani told me. Armani has a house. Donna Karan has a kitchen and a womb. Ralph Lauren has a men's club. Calvin Klein has an art gallery. These are all very different points of view. What they have in common is that they have nothing to do with the actual act of shopping. (No one buys anything at a friend's house or a men's club.) Presumably, by engaging in this kind of misdirection designers aim to put us at ease, to create a kind of oasis. But perhaps they change the subject because they must, because they cannot offer an ultimate account of the shopping experience itself. After all, what do we really know, in the end, about why people buy? We know about the Invariant Right and the Decompression Zone. We know to put destination items at the back and fashion at the front, to treat male shoppers like small children, to respect the female derrière, and to put the socks between the cash/wrap and the men's pants. But this is grammar; it's not prose. It is enough. But it is not much.
6.
One of the best ways to understand the new humility in shopping theory is to go back to the work of William Whyte. Whyte put his cameras in parks and in the plazas in front of office buildings because he believed in the then radical notion that the design of public spaces had been turned inside out-that planners were thinking of their designs first and of people second, when they should have been thinking of people first and of design second. In his 1980 classic, "The Social Life of Small Urban Spaces," for example, Whyte trained his cameras on a dozen or so of the public spaces and small parks around Manhattan, like the plaza in front of the General Motors Building, on Fifth Avenue, and the small park at 77 Water Street, downtown, and Paley Park, on Fifty-third Street, in order to determine why some, like the tiny Water Street park, averaged well over a hundred and fifty people during a typical sunny lunch hour and others, like the much bigger plaza at 280 Park Avenue, were almost empty. He concluded that all the things used by designers to attempt to lure people into their spaces made little or no difference. It wasn't the size of the space, or its beauty, or the presence of waterfalls, or the amount of sun, or whether a park was a narrow strip along the sidewalk or a pleasing open space. What mattered, overwhelmingly, was that there were plenty of places to sit, that the space was in some way connected to the street, and-the mystical circularity-that it was already well frequented. "What attracts people most, it would appear, is other people," Whyte noted:
If I labor the point, it is because many urban spaces still are being designed as though the opposite were true-as though what people liked best were the places they stay away from. People often do talk along such lines, and therefore their responses to questionnaires can be entirely misleading. How many people would say they like to sit in the middle of a crowd? Instead, they speak of "getting away from it all," and use words like "escape," "oasis," "retreat." What people do, however, reveals a different priority.
Whyte's conclusions demystified the question of how to make public space work. Places to sit, streets to enjoy, and people to watch turned out to be the simple and powerful rules for park designers to follow, and these rules demolished the orthodoxies and theoretical principles of conventional urban design. But in a more important sense-and it is here that Whyte's connection with Paco Underhill and retail anthropology and the stores that line Fifth and Madison is most striking-what Whyte did was to remystify the art of urban planning. He said, emphatically, that people could not be manipulated, that they would enter a public space only on their own terms, that the goal of observers like him was to find out what people wanted, not why they wanted it. Whyte, like Paco, was armed with all kinds of facts and observations about what it took to build a successful public space. He had strict views on how wide ledges had to be to lure passersby (at least thirty inches, or two backsides deep), and what the carrying capacity of prime outdoor sitting space is (total number of square feet divided by three). But, fundamentally, he was awed by the infinite complexity and the ultimate mystery of human behavior. He took people too seriously to think that he could control them. Here is Whyte, in "The Social Life of Small Urban Spaces," analyzing hours of videotape and describing what he has observed about the way men stand in public. He's talking about feet. He could just as easily be talking about shopping:
Foot movements . . . seem to be a silent language. Often, in a schmoozing group, no one will be saying anything. Men stand bound in amiable silence, surveying the passing scene. Then, slowly, rhythmically, one of the men rocks up and down; first on the ball of the foot, then back on the heel. He stops. Another man starts the same movement. Sometimes there are reciprocal gestures. One man makes a half turn to the right. Then, after a rhythmic interval, another responds with a half turn to the left. Some kind of communication seems to be taking place here, but I've never broken the code.
Damaged
February 24, 1997
CRIME AND SCIENCE
Why do some people turn into violent criminals?
New evidence suggests that it may all be in the brain
1.
On the morning of November 18, 1996, Joseph Paul Franklin was led into Division 15 of the St. Louis County Courthouse, in Clayton, Missouri. He was wearing a pair of black high-top sneakers, an orange jumpsuit with short sleeves that showed off his prison biceps, and a pair of thick black-rimmed glasses. There were two guards behind him, two guards in front of him, and four more guards stationed around the courtroom, and as he walked into the room-or, rather, shuffled, since his feet were manacled-Franklin turned to one of them and said "Wassup?" in a loud, Southern-accented voice. Then he sat down between his attorneys and stared straight ahead at the judge, completely still except for his left leg, which bounced up and down in an unceasing nervous motion.
Joseph Franklin takes credit for shooting and paralyzing Larry Flynt, the publisher of Hustler, outside a Lawrenceville, Georgia, courthouse in March of 1978, apparently because Flynt had printed photographs of a racially mixed couple. Two years later, he says, he gunned down the civil-rights leader Vernon Jordan outside a Marriott in Fort Wayne, Indiana, tearing a hole in Jordan's back the size of a fist. In the same period in the late seventies, as part of what he later described as a "mission" to rid America of blacks and Jews and of whites who like blacks and Jews, Franklin says that he robbed several banks, bombed a synagogue in Tennessee, killed two black men jogging with white women in Utah, shot a black man and a white woman coming out of a Pizza Hut in a suburb of Chattanooga, Tennessee, and on and on-a violent spree that may have spanned ten states and claimed close to twenty lives, and, following Franklin's arrest, in 1980, earned him six consecutive life sentences.
Two years ago, while Franklin was imprisoned in Marion Federal Penitentiary, in Illinois, he confessed to another crime. He was the one, he said, who had hidden in the bushes outside a synagogue in suburban St. Louis in the fall of 1977 and opened fire on a group of worshippers, killing forty-two-year-old Gerald Gordon. After the confession, the State of Missouri indicted him on one count of capital murder and two counts of assault. He was moved from Marion to the St. Louis County jail, and from there, on a sunny November morning last year, he was brought before Judge Robert Campbell, of the St. Louis County Circuit Court, so that it could be determined whether he was fit to stand trial-whether, in other words, embarking on a campaign to rid America of Jews and blacks was an act of evil or an act of illness.
The prosecution went first. On a television set at one side of the courtroom, two videotapes were shown-one of an interview with Franklin by a local news crew and the other of Franklin's formal confession to the police. In both, he seems lucid and calm, patiently retracing how he planned and executed his attack on the synagogue. He explains that he bought the gun in a suburb of Dallas, answering a classified ad, so the purchase couldn't be traced. He drove to the St. Louis area and registered at a Holiday Inn. He looked through the Yellow Pages to find the names of synagogues. He filed the serial number off his rifle and bought a guitar case to carry the rifle in. He bought a bicycle. He scouted out a spot near his chosen synagogue from which he could shoot without being seen. He parked his car in a nearby parking lot and rode his bicycle to the synagogue. He lay in wait in the bushes for several hours, until congregants started to emerge. He fired five shots. He rode the bicycle back to the parking lot, climbed into his car, pulled out of the lot, checked his police scanner to see if he was being chased, then drove south, down I-55, back home toward Memphis.
In the interview with the news crew, Franklin answered every question, soberly and directly. He talked about his tattoos ("This one is the Grim Reaper. I got it in Dallas") and his heroes ("One person I like is Howard Stern. I like his honesty"), and he respectfully disagreed with the media's description of racially motivated crimes as "hate crimes," since, he said, "every murder is committed out of hate." In his confession to the police, after he detailed every step of the synagogue attack, Franklin was asked if there was anything he'd like to say. He stared thoughtfully over the top of his glasses. There was a long silence. "I can't think of anything," he answered. Then he was asked if he felt any remorse. There was another silence. "I can't say that I do," he said. He paused again, then added, "The only thing I'm sorry about is that it's not legal."
"What's not legal?"
Franklin answered as if he'd just been asked the time of day: "Killing Jews."
After a break for lunch, the defense called Dorothy Otnow Lewis, a psychiatrist at New York's Bellevue Hospital and a professor at New York University School of Medicine. Over the past twenty years, Lewis has examined, by her own rough estimate, somewhere between a hundred and fifty and two hundred murderers. She was the defense's only expert witness in the trial of Arthur Shawcross, the Rochester serial killer who strangled eleven prostitutes in the late eighties. She examined Joel Rifkin, the Long Island serial killer, and Mark David Chapman, who shot John Lennon-both for the defense. Once, in a Florida prison, she sat for hours talking with Ted Bundy. It was the day before his execution, and when they had finished Bundy bent down and kissed her cheek. "Bundy thought I was the only person who didn't want something from him," Lewis says. Frequently, Lewis works with Jonathan Pincus, a neurologist at Georgetown University. Lewis does the psychiatric examination; Pincus does the neurological examination. But Franklin put his foot down. He could tolerate being examined by a Jewish woman, evidently, but not by a Jewish man. Lewis testified alone.
Lewis is a petite woman in her late fifties, with short dark hair and large, liquid brown eyes. She was wearing a green blazer and a black skirt with a gold necklace, and she was so dwarfed by the witness stand that from the back of the courtroom only her head was visible. Under direct examination she said that she had spoken with Franklin twice-once for six hours and once for less than half an hour-and had concluded that he was a paranoid schizophrenic: a psychotic whose thinking was delusional and confused, a man wholly unfit to stand trial at this time. She talked of brutal physical abuse he had suffered as a child. She mentioned scars on his scalp from blows Franklin had told her were inflicted by his mother. She talked about his obsessive desire to be castrated, his grandiosity, his belief that he may have been Jewish in an earlier life, his other bizarre statements and beliefs. At times, Lewis seemed nervous, her voice barely audible, but perhaps that was because Franklin was staring at her unblinkingly, his leg bouncing faster and faster under the table. After an hour, Lewis stepped down. She paused in front of Franklin and, ever the psychiatrist, suggested that when everything was over they should talk. Then she walked slowly through the courtroom, past the defense table and the guards, and out the door.
Later that day, on the plane home to New York City, Lewis worried aloud that she hadn't got her point across. Franklin, at least as he sat there in the courtroom, didn't seem insane. The following day, Franklin took the stand himself for two hours, during which he did his own psychiatric diagnosis, confessing to a few "minor neuroses," but not to being "stark raving mad," as he put it. Of the insanity defense, he told the court, "I think it is hogwash, to tell you the truth. I knew exactly what I was doing." During his testimony, Franklin called Lewis "a well-intentioned lady" who "seems to embellish her statements somewhat." Lewis seemed to sense that that was the impression she'd left: that she was overreaching, that she was some kind of caricature-liberal Jewish New York psychiatrist comes to Middle America to tell the locals to feel sorry for a murderer. Sure enough, a week later the judge rejected Lewis's arguments and held Franklin competent to stand trial. But, flying back to New York, Lewis insisted that she wasn't making an ideological point about Franklin; rather, she was saying that she didn't feel that Franklin's brain worked the way brains are supposed to work-that he had identifiable biological and psychiatric problems that diminished his responsibility for his actions. "I just don't believe people are born evil," she said. "To my mind, that is mindless. Forensic psychiatrists tend to buy into the notion of evil. I felt that that's no explanation. The deed itself is bizarre, grotesque. But it's not evil. To my mind, evil bespeaks conscious control over something. Serial murderers are not in that category. They are driven by forces beyond their control."
The plane was in the air now. By some happy set of circumstances, Lewis had been bumped up to first class. She was sipping champagne. Her shoes were off. "You know, when I was leaving our last interview, he sniffed me right here," she said, and she touched the back of her neck and flared her nostrils in mimicry of Franklin's gesture. "He'd said to his attorney, 'You know, if you weren't here, I'd make a play for her.' " She had talked for six hours to this guy who hated Jews so much that he hid in the bushes and shot at them with a rifle, and he had come on to her, just like that. She shivered at the memory: "He said he wanted some pussy."
2.
When Dorothy Lewis graduated from Yale School of Medicine, in 1963, neurology, the study of the brain and the rest of the nervous system, and psychiatry, the study of behavior and personality, were entirely separate fields. This was still the Freudian era. Little attempt was made to search for organic causes of criminality. When, after medical school, she began working with juvenile delinquents in New Haven, the theory was that these boys were robust, healthy. According to the prevailing wisdom, a delinquent was simply an ordinary kid who had been led astray by a troubled home life-by parents who were too irresponsible or too addled by drugs and alcohol to provide proper discipline. Lewis came from the archetypal do-gooding background-reared on Central Park West; schooled at Ethical Culture; a socialist mother who as a child had once introduced Eugene V. Debs at a political rally; father in the garment business; heated dinner-table conversations about the Rosenbergs-and she accepted this dogma. Criminals were just like us, only they had been given bad ideas about how to behave. The trouble was that when she began working with delinquents they didn't seem like that at all. They didn't lack for discipline. If anything, she felt, they were being disciplined too much. And these teenagers weren't robust and rowdy; on the contrary, they seemed to be damaged and impaired. "I was studying for my boards in psychiatry, and in order to do a good job you wanted to do a careful medical history and a careful mental-status exam," she says. "I discovered that many of these kids had had serious accidents, injuries, or illnesses that seemed to have affected the central nervous system and that hadn't been identified previously."
In 1976, she was given a grant by the State of Connecticut to study a group of nearly a hundred juvenile delinquents. She immediately went to see Pincus, then a young professor of neurology at Yale. They had worked together once before. "Dorothy came along and said she wanted to do this project with me," Pincus says. "She wanted to look at violence. She had this hunch that there was something physically wrong with these kids. I said, 'That's ridiculous. Everyone knows violence has nothing to do with neurology.' " At that point, Pincus recalls, he went to his bookshelf and began reading out loud from what was then the definitive work in the field: "Criminality and Psychiatric Disorders," by Samuel Guze, the chairman of the psychiatry department of Washington University, in St. Louis. "Sociopathy, alcoholism, and drug dependence are the psychiatric disorders characteristically associated with serious crime," he read. "Schizophrenia, primary affective disorders, anxiety neurosis, obsessional neurosis, phobic neurosis, and"-and there he paused-"brain syndromes are not." But Lewis would have none of it. "She said, 'We should do it anyway.' I said, 'I don't have the time.' She said, 'Jonathan, I can pay you.' So I would go up on Sunday, and I would examine three or four youths, just give them a standard neurological examination." But, after seeing the kids for himself, Pincus, too, became convinced that the prevailing wisdom about juvenile delinquents-and, by extension, about adult criminals-was wrong, and that Lewis was right. "Almost all the violent ones were damaged," Pincus recalls, shaking his head.
Over the past twenty years, Lewis and Pincus have testified for the defense in more than a dozen criminal cases, most of them death-penalty appeals. Together, they have published a series of groundbreaking studies on murderers and delinquents, painstakingly outlining the medical and psychiatric histories of the very violent; one of their studies has been cited twice in United States Supreme Court opinions. Of the two, Pincus is more conservative. He doesn't have doubts about evil the way Lewis does, and sharply disagrees with her on some of the implications of their work. On the core conclusions, however, they are in agreement. They believe that the most vicious criminals are, overwhelmingly, people with some combination of abusive childhoods, brain injuries, and psychotic symptoms (in particular, paranoia), and that while each of these problems individually has no connection to criminality (most people who have been abused or have brain injuries or psychotic symptoms never end up harming anyone else), somehow these factors together create such terrifying synergy as to impede these individuals' ability to play by the rules of society.
Trying to determine the causes of human behavior is, of course, a notoriously tricky process. Lewis and Pincus haven't done the kind of huge, population-wide studies that could definitively answer just how predictive of criminality these factors are. Their findings are, however, sufficiently tantalizing that their ideas have steadily gained ground in recent years. Other researchers have now done some larger studies supporting their ideas. Meanwhile, a wave of new findings in the fields of experimental psychiatry and neurology has begun to explain why it is that brain dysfunction and child abuse can have such dire effects. The virtue of this theory is that it sidesteps all the topics that so cripple contemporary discussions of violence-genetics, biological determinism, and, of course, race. In a sense, it's a return to the old liberal idea that environment counts, and that it is possible to do something significant about crime by changing the material conditions of people's lives. Only, this time the maddening imprecision of the old idea (what, exactly, was it about bad housing, say, that supposedly led to violent crime?) has been left behind. Lewis and Pincus and other neurologists and psychiatrists working in the field of criminal behavior think they are beginning to understand what it is that helps to turn some people into violent criminals-right down to which functions of the brain are damaged by abuse and injury. That's what Lewis means when she says she doesn't think that people are intrinsically evil. She thinks that some criminals simply suffer from a dysfunction of the brain, the way cardiac patients suffer from a dysfunction of the heart, and this is the central and in some ways disquieting thing about her. When she talks about criminals as victims, she doesn't use the word in the standard liberal metaphorical sense. She means it literally.
Lewis works out of a tiny set of windowless offices on the twenty-first floor of the new wing of Bellevue Hospital, in Manhattan's East Twenties. The offices are decorated in institutional colors-gray carpeting and bright-orange trim-and since they're next to the children's psychiatric ward you can sometimes hear children crying out. Lewis's desk is stacked high with boxes of medical and court records from cases she has worked on, and also with dozens of videotapes of interviews with murderers which she has conducted over the years. She talks about some of her old cases-especially some of her death-row patients-as if they had just happened, going over and over details, sometimes worrying about whether she made the absolutely correct diagnosis. The fact that everyone else has long since given up on these people seems to be just what continues to attract her. Years ago, when she was in college, Lewis found herself sitting next to the Harvard theologian Paul Tillich on the train from New York to Boston. "When you read about witches being burned at the stake," Tillich asked her, in the midst of a long and wide-ranging conversation, "do you identify with the witch or with the people looking on?" Tillich said he himself identified with the crowd. Not Lewis. She identified with the witch.
In her offices, Lewis has tapes of her interviews with Shawcross, the serial killer, and also tapes of Shawcross being interviewed by Park Elliott Dietz, the psychiatrist who testified for the prosecution in that case. Dietz is calm, in control, and has a slightly bored air, as if he had heard everything before. By contrast, Lewis, in her interviews, has a kind of innocence about her. She seems completely caught up in what is happening, and at one point, when Shawcross makes some particularly outrageous comment on what he did to one of the prostitutes he murdered, she looks back at the camera wide-eyed, as if to say "Wow!" When Dietz was on the stand, his notes were beside him in one of those rolling evidence carts, where everything is labelled and items are distinguished by color-coded dividers, so that he had the entire case at his fingertips. When Lewis testified, she kept a big stack of untidy notes on her lap and fussed through them after she was asked a question. She is like that in everyday life as well-a little distracted and spacey, wrapped up in the task at hand. It makes her so approachable and so unthreatening that it's no wonder she gets hardened criminals to tell her their secrets. It's another way of identifying with the witch. Once, while talking with Bundy, Lewis looked up after several hours and found that she had been so engrossed in their conversation that she hadn't noticed that everyone outside the soundproof glass of the interview booth-the guard, the prison officials at their desks-had left for lunch. She and Bundy were utterly alone. Terrified, Lewis stayed glued to her seat, her eyes never leaving his. "I didn't bat an eyelash," she recalls. Another time, after Lewis had interviewed a murderer in a Tennessee prison, she returned to her hotel room to find out that there had been a riot in the prison while she was there.
3.
The human brain comprises, in the simplest terms, four interrelated regions, stacked up in ascending order of complexity. At the bottom is the brain stem, which governs the most basic and primitive functions-breathing, blood pressure, and body temperature. Above that is the diencephalon, the seat of sleep and appetite. Then comes the limbic region, the seat of sexual behavior and instinctual emotions. And on top, covering the entire outside of the brain in a thick carpet of gray matter, is the cortex, the seat of both concrete and abstract thought. It is the function of the cortex-and, in particular, those parts of the cortex beneath the forehead, known as the frontal lobes-to modify the impulses that surge up from within the brain, to provide judgment, to organize behavior and decision-making, to learn and adhere to rules of everyday life. It is the dominance of the cortex and the frontal lobes, in other words, that is responsible for making us human; and the central argument of the school to which Lewis and Pincus belong is that what largely distinguishes many violent criminals from the rest of us is that something has happened inside their brains to throw the functioning of the cortex and the frontal lobes out of whack. "We are a highly socialized animal. We can sit in theatres with strangers and not fight with each other," Stuart Yudofsky, the chairman of psychiatry at Baylor College of Medicine, in Houston, told me. "Many other mammals could never crowd that closely together. Our cortex helps us figure out when we are and are not in danger. Our memory tells us what we should be frightened of and angry with and what we shouldn't. But if there are problems there-if it's impaired-one can understand how that would lead to confusion, to problems with disinhibition, to violence." One of the most important things that Lewis and Pincus have to do, then, when they evaluate a murderer is check for signs of frontal-lobe impairment. This, the neurological exam, is Pincus's task.
Pincus begins by taking a medical history: he asks about car accidents and falls from trees and sports injuries and physical abuse and problems at birth and any blows to the head of a kind that might have caused damage to the frontal lobes. He asks about headaches, tests for reflexes and sensorimotor functions, and compares people's right and left sides and observes gait. "I measure the head circumference-if it's more than two standard deviations below the normal brain circumference, there may be some degree of mental retardation, and, if it's more than two standard deviations above, there may be hydrocephalus," Pincus told me. "I also check gross motor coördination. I ask people to spread their fingers and hold their hands apart and look for choreiform movements-discontinuous little jerky movements of the fingers and arms." We were in Pincus's cluttered office at Georgetown University Medical Center, in Washington, D.C., and Pincus, properly professorial in a gray Glen-plaid suit, held out his hand to demonstrate. "Then I ask them to skip, to hop," he went on, and he hopped up and down in a small space on the floor between papers and books.
Pincus stands just over six feet, has the long-limbed grace of an athlete, and plays the part of neurologist to perfection: calm, in command, with a distinguished sprinkle of white hair. At the same time, he has a look of mischief in his eyes, a streak of irreverence that allows him to jump up and down in his office before a total stranger. It's an odd combination, like Walter Matthau playing Sigmund Freud.
"Then I check for mixed dominance, to see if the person is, say, right-eyed, left-footed," he said. "If he is, it might mean that his central nervous system hasn't differentiated the way it should." He was sitting back down now. "No one of these by itself means he is damaged. But they can tell us something in aggregate."
At this point, Pincus held up a finger forty-five degrees to my left and moved it slowly to the right. "Now we're checking for frontal functions," he said. "A person should be able to look at the examiner's finger and follow it smoothly with his eyes. If he can only follow it jerkily, the frontal eye fields are not working properly. Then there's upward gaze." He asked me to direct my eyes to the ceiling. "The eye should go up five millimetres and a person should also be able to direct his gaze laterally and maintain it for twenty seconds. If he can't, that's motor impersistence." Ideally, Pincus will attempt to amplify his results with neuropsychological testing, an EEG (an electroencephalogram, which measures electrical patterns in the brain), and an MRI scan (that's magnetic resonance imaging), to see if he can spot scarring or lesions in any of the frontal regions which might contribute to impairment.
Pincus is also interested in measuring judgment. But since there is no objective standard for judgment, he tries to pick up evidence of an inability to cope with complexity, a lack of connection between experience and decision-making which is characteristic of cortical dysfunction. Now he walked behind me, reached over the top of my head, and tapped the bridge of my nose in a steady rhythm. I blinked once, then stopped. That, he told me, was normal.
"When you tap somebody on the bridge of the nose, it's reasonable for a person to blink a couple of times, because there is a threat from the outside," Pincus said. "When it's clear there is no threat, the subject should be able to accommodate that. But, if the subject blinks more than three times, that's 'insufficiency of suppression,' which may reflect frontal-lobe dysfunction. The inability to accommodate means you can't adapt to a new situation. There's a certain rigidity there."
Arthur Shawcross, who had a cyst pressing on one temporal lobe and scarring in both frontal lobes (probably from, among other things, being hit on the head with a sledgehammer and with a discus, and falling on his head from the top of a forty-foot ladder), used to walk in absolutely straight lines, splashing through puddles instead of walking around them, and he would tear his pants on a barbed-wire fence instead of using a gate a few feet away. That's the kind of behavior Pincus tries to correlate with abnormalities on the neurological examination. "In the Wisconsin Card Sorting Test, the psychologist shows the subject four playing cards-three red ones, one black one-and asks which doesn't fit," Pincus said. "Then he shows the subject, say, the four of diamonds, the four of clubs, the four of hearts, and the three of diamonds. Somebody with frontal-lobe damage who correctly picked out the black one the first time-say, the four of clubs-is going to pick the four of clubs the second time. But the rules have changed. It's now a three we're after. We're going by numbers now, not color. It's that kind of change that people with frontal-lobe damage can't make. They can't change the rules. They get stuck in a pattern. They keep using rules that are demonstrably wrong. Then there's the word-fluency test. I ask them to name in one minute as many different words as they can think of which begin with the letter 'f.' Normal is fourteen, plus or minus five. Anyone who names fewer than nine is abnormal."
This is not an intelligence test. People with frontal-lobe damage might do just as well as anyone else if they were asked, say, to list the products they might buy in a supermarket. "Under those rules, most people can think of at least sixteen products in a minute and rattle them off," Pincus said. But that's a structured test, involving familiar objects, and it's a test with rules. The thing that people with frontal-lobe damage can't do is cope with situations where there are no rules, where they have to improvise, where they need to make unfamiliar associations. "Very often, they get stuck on one word-they'll say 'four,' 'fourteen,' 'forty-four,' " Pincus said. "They'll use the same word again and again-'farm' and then 'farming.' Or, as one fellow in a prison once said to me, 'fuck,' 'fucker,' 'fucking.' They don't have the ability to come up with something else."
What's at stake, fundamentally, with frontal-lobe damage is the question of inhibition. A normal person is able to ignore the tapping after one or two taps, the same way he can ignore being jostled in a crowded bar. A normal person can screen out and dismiss irrelevant aspects of the environment. But if you can't ignore the tapping, if you can't screen out every environmental annoyance and stimulus, then you probably can't ignore being jostled in a bar, either. It's living life with a hair trigger.
A recent study of two hundred and seventy-nine veterans who suffered penetrating head injuries in Vietnam showed that those with frontal-lobe damage were anywhere from two to six times as violent and aggressive as veterans who had not suffered such injuries. This kind of aggression is what is known as neurological, or organic, rage. Unlike normal anger, it's not calibrated by the magnitude of the original insult. It's explosive and uncontrollable, the anger of someone who no longer has the mental equipment to moderate primal feelings of fear and aggression.
"There is a reactivity to it, in which a modest amount of stimulation results in a severe overreaction," Stuart Yudofsky told me. "Notice that reactivity implies that, for the most part, this behavior is not premeditated. The person is rarely violent and frightening all the time. There are often brief episodes of violence punctuating stretches when the person does not behave violently at all. There is also not any gain associated with organic violence. The person isn't using the violence to manipulate someone else or get something for himself. The act of violence does just the opposite. It is usually something that causes loss for the individual. He feels that it is out of his control and unlike himself. He doesn't blame other people for it. He often says, 'I hate myself for acting this way.' The first person with organic aggression I ever treated was a man who had been inflating a truck tire when the tire literally exploded and the rim was driven into his prefrontal cortex. He became extraordinarily aggressive. It was totally uncharacteristic: he had been a religious person with strong values. But now he would not only be physically violent-he would curse. When he came to our unit, a nurse offered him some orange juice. He was calm at that moment. But then he realized that the orange juice was warm, and in one quick motion he threw it back at her, knocking her glasses off and injuring her cornea. When we asked him why, he said, 'The orange juice was warm.' But he also said, 'I don't know what got into me.' It wasn't premeditated. It was something that accelerated quickly. He went from zero to a hundred in a millisecond." At that point, I asked Yudofsky an obvious question. Suppose you had a person from a difficult and disadvantaged background, who had spent much of his life on the football field, getting his head pounded by the helmets of opposing players. Suppose he was involved in a tempestuous on-again, off-again relationship with his ex-wife. Could a vicious attack on her and another man fall into the category of neurological rage? "You're not the first person to ask that question," Yudofsky replied dryly, declining to comment further.
Pincus has found that when he examines murderers neurological problems of this kind come up with a frequency far above what would be expected in the general population. For example, Lewis and Pincus published a study of fifteen death-row inmates randomly referred to them for examination; they were able to verify forty-eight separate incidents of significant head injury. Here are the injuries suffered by just the first three murderers examined:
I.
three years: beaten almost to death by father
(multiple facial scars)
early childhood: thrown into sink onto head
(palpable scar)
late adolescence: one episode of loss of consciousness while boxing
II.
childhood: beaten in head with two-by-fours by parents
childhood: fell into pit, unconscious for several hours
seventeen years: car accident with injury to right eye
eighteen years: fell from roof apparently because of a blackout
III.
six years: glass bottle deliberately dropped onto head from tree (palpable scar on top of cranium)
eight years: hit by car
nine years: fell from platform, received head injury
fourteen years: jumped from moving car, hit head.
4.
Dorothy Lewis's task is harder than Jonathan Pincus's. He administers relatively straightforward tests of neurological function. But she is interested in the psychiatric picture, which means getting a murderer to talk about his family, his feelings and behavior, and, perhaps most important, his childhood. It is like a normal therapy session, except that Lewis doesn't have weeks in which to establish intimacy. She may have only a session or two. On one occasion, when she was visiting a notorious serial killer at San Quentin, she got lucky. "By chance, one of the lawyers had sent me some clippings from the newspaper, where I read that when he was caught he had been carrying some Wagner records," she told me. "For some reason, that stuck in my mind. The first time I went to see him, I started to approach him and he pointed at me and said, 'What's happening on June 18th?' And I said, 'That's the first night PBS is broadcasting "Der Ring des Nibelungen." ' You know, we'd studied Wagner at Ethical Culture. Granted, it was a lucky guess. But I showed him some respect, and you can imagine the rapport that engendered." Lewis says that even after talking for hours with someone guilty of horrendous crimes she never gets nightmares. She seems to be able to separate her everyday life from the task at hand-to draw a curtain between her home and her work. Once, I visited Lewis at her home: she and her husband, Mel, who is a professor of psychiatry at Yale, live in New Haven. The two dote on each other ("When I met Mel, I knew within a week that this was the man I wanted to marry," she says, flushing, "and I've never forgiven him, because it took him two weeks to ask me"), and as soon as I walked in they insisted on giving me a detailed tour of their house, picking up each memento, pointing out their children's works of art, and retelling the stories behind thirty years of anniversaries and birthdays: sometimes they told their stories in alternating sentences, and sometimes they told a story twice, first from Dorothy's perspective and then from Mel's. All in all, it was a full hour of domestic vaudeville. Then Dorothy sat on her couch, with her cat, Ptolemy, on her lap, and began to talk about serial killers, making a seamless transition from the sentimental to the unspeakable.
At the heart of Lewis's work with murderers is the search for evidence of childhood abuse. She looks for scars. She combs through old medical records for reports of suspicious injuries. She tries to talk to as many family members and friends as possible. She does all this because, of course, child abuse has devastating psychological consequences for children and the adults they become. But there is the more important reason-the one at the heart of the new theory of violence-which is that finding evidence of prolonged child abuse is a key to understanding criminal behavior because abuse also appears to change the anatomy of the brain.
When a child is born, the parts of his brain that govern basic physiological processes-that keep him breathing and keep his heart beating-are fully intact. But a newborn can't walk, can't crawl, can't speak, can't reason or do much of anything besides sleep and eat, because the higher regions of his brain-the cortex, in particular-aren't developed yet. In the course of childhood, neurons in the cortex begin to organize themselves-to differentiate and make connections-and that maturation process is in large part responsive to what happens in the child's environment. Bruce Perry, a psychiatrist at Baylor College of Medicine, has done brain scans of children who have been severely neglected, and has found that their cortical and sub-cortical areas never developed properly, and that, as a result, those regions were roughly twenty or thirty per cent smaller than normal. This kind of underdevelopment doesn't affect just intelligence; it affects emotional health. "There are parts of the brain that are involved in attachment behavior-the connectedness of one individual to another-and in order for that to be expressed we have to have a certain nature of experience and have that experience at the right time," Perry told me. "If early in life you are not touched and held and given all the somatosensory stimuli that are associated with what we call love, that part of the brain is not organized in the same way."
According to Perry, the section of the brain involved in attachment-which he places just below the cortex, in the limbic region-would look different in someone abused or neglected. The wiring wouldn't be as dense or as complex. "Such a person is literally lacking some brain organization that would allow him to actually make strong connections to other human beings. Remember the orphans in Romania? They're a classic example of children who, by virtue of not being touched and held and having their eyes gazed into, didn't get the somatosensory bath. It doesn't matter how much you love them after age two-they've missed that critical window."
In a well-known paper in the field of child abuse, Mary Main, a psychologist at Berkeley, and Carol George, now at Mills College, studied a group of twenty disadvantaged toddlers, half of whom had been subjected to serious physical abuse and half of whom had not. Main and George were interested in how the toddlers responded to a classmate in distress. What they found was that almost all the nonabused children responded to a crying or otherwise distressed peer with concern or sadness or, alternatively, showed interest and made some attempt to provide comfort. But not one of the abused toddlers showed any concern. At the most, they showed interest. The majority of them either grew distressed and fearful themselves or lashed out with threats, anger, and physical assaults. Here is the study's description of Martin, an abused boy of two and a half, who-emotionally retarded in the way that Perry describes-seemed incapable of normal interaction with another human being:
Martin . . . tried to take the hand of the crying other child, and when she resisted, he slapped her on the arm with his open hand. He then turned away from her to look at the ground and began vocalizing very strongly. "Cut it out! cut it out!," each time saying it a little faster and louder. He patted her, but when she became disturbed by his patting, he retreated "hissing at her and baring his teeth." He then began patting her on the back again, his patting became beating, and he continued beating her despite her screams.
Abuse also disrupts the brain's stress-response system, with profound results. When something traumatic happens-a car accident, a fight, a piece of shocking news-the brain responds by releasing several waves of hormones, the last of which is cortisol. The problem is that cortisol can be toxic. If someone is exposed to too much stress over too long a time, one theory is that all that cortisol begins to eat away at the organ of the brain known as the hippocampus, which serves as the brain's archivist: the hippocampus organizes and shapes memories and puts them in context, placing them in space and time and tying together visual memory with sound and smell. J. Douglas Bremner, a psychiatrist at Yale, has measured the damage that cortisol apparently does to the hippocampus by taking M.R.I. scans of the brains of adults who suffered severe sexual or physical abuse as children and comparing them with the brains of healthy adults. An M.R.I. scan is a picture of a cross-section of the brain-as if someone's head had been cut into thin slices like a tomato, and then each slice had been photographed-and in the horizontal section taken by Bremner the normal hippocampus is visible as two identical golf-ball-size organs, one on the left and one on the right, and each roughly even with the ear. In child-abuse survivors, Bremner found, the golf ball on the left is on average twelve per cent smaller than that of a healthy adult, and the theory is that it was shrunk by cortisol. Lewis says that she has examined murderers with dozens of scars on their backs, and that they have no idea how the scars got there. They can't remember their abuse, and if you look at Bremner's scans that memory loss begins to make sense: the archivist in their brain has been crippled.
Abuse also seems to affect the relationship between the left hemisphere of the brain, which plays a large role in logic and language, and the right hemisphere, which is thought to play a large role in creativity and depression. Martin Teicher, a professor of psychiatry at Harvard and McLean Hospital, recently gave EEGs to a hundred and fifteen children who had been admitted to a psychiatric facility, some of whom had a documented history of abuse. Not only did the rate of abnormal EEGs among the abused turn out to be twice that of the non-abused but all those abnormal brain scans turned out to be a result of problems on the left side of the brain. Something in the brain's stress response, Teicher theorized, was interfering with the balanced development of the brain's hemispheres.
Then Teicher did M.R.I.s of the brains of a subset of the abused children, looking at what is known as the corpus callosum. This is the fibre tract-the information superhighway-that connects the right and the left hemispheres. Sure enough, he found that parts of the corpus callosum of the abused kids were smaller than they were in the nonabused children. Teicher speculated that these abnormalities were a result of something wrong with the sheathing-the fatty substance, known as myelin, that coats the nerve cells of the corpus callosum. In a healthy person, the myelin helps the neuronal signals move quickly and efficiently. In the abused kids, the myelin seemed to have been eaten away, perhaps by the same excess cortisol that is thought to attack the hippocampus.
Taken together, these changes in brain hardware are more than simple handicaps. They are, in both subtle and fundamental ways, corrosive of self. Richard McNally, a professor of psychology at Harvard, has done memory studies with victims of serious trauma, and he has discovered that people with post-traumatic-stress disorder, or P.T.S.D., show marked impairment in recalling specific autobiographical memories. A healthy trauma survivor, asked to name an instance when he exhibited kindness, says, "Last Friday, I helped a neighbor plow out his driveway." But a trauma survivor with P.T.S.D. can only say something like "I was kind to people when I was in high school." This is what seems to happen when your hippocampus shrinks: you can't find your memories. "The ability to solve problems in the here and now depends on one's ability to access specific autobiographical memories in which one has encountered similar problems in the past," McNally says. "It depends on knowing what worked and what didn't." With that ability impaired, abuse survivors cannot find coherence in their lives. Their sense of identity breaks down.
It is a very short walk from this kind of psychological picture to a diagnosis often associated with child abuse; namely, dissociative identity disorder, or D.I.D. Victims of child abuse are thought sometimes to dissociate, as a way of coping with their pain, of distancing themselves from their environment, of getting away from the danger they faced. It's the kind of disconnection that would make sense if a victim's memories were floating around without context and identification, his left and right hemispheres separated and unequal, and his sense of self fragmented and elusive. It's also a short walk from here to understanding how someone with such neurological problems could become dangerous. Teicher argues that in some of his EEG and M.R.I. analyses of the imbalance between the left and the right hemispheres he is describing the neurological basis for the polarization so often observed in psychiatrically disturbed patients-the mood swings, the sharply contrasting temperaments. Instead of having two integrated hemispheres, these patients have brains that are, in some sense, divided down the middle. "What you get is a kind of erratic-ness," says Frank Putnam, who heads the Unit on Developmental Traumatology at the National Institute of Mental Health, in Maryland. "These kinds of people can be very different in one situation compared with another. There is the sense that they don't have a larger moral compass."
Several years ago, Lewis and Pincus worked together on an appeal for David Wilson, a young black man on death row in Louisiana. Wilson had been found guilty of murdering a motorist, Stephen Stinson, who had stopped to help when the car Wilson was in ran out of gas on I-10 outside New Orleans; and the case looked, from all accounts, almost impossible to appeal. Wilson had Stinson's blood on his clothes, in his pocket he had a shotgun shell of the same type and gauge as the one found in the gun at the murder scene, and the prosecution had an eyewitness to the whole shooting. At the trial, Wilson denied that the bloody clothes were his, denied that he had shot Stinson, denied that a tape-recorded statement the prosecution had played for the jury was of his voice, and claimed he had slept through the entire incident. It took the jury thirty-five minutes to convict him of first-degree murder and sixty-five minutes more, in the sentencing phase, to send him to the electric chair.
But when Lewis and Pincus examined him they became convinced that his story was actually much more complicated. In talking to Wilson's immediate family and other relatives, they gathered evidence that he had never been quite normal-that his personality had always seemed fractured and polarized. His mother recalled episodes from a very early age during which he would become "glassy-eyed" and seem to be someone else entirely. "David had, like, two personalities," his mother said. At times, he would wander off and be found, later, miles away, she recalled. He would have violent episodes during which he would attack his siblings' property, and subsequently deny that he had done anything untoward at all. Friends would say that they had seen someone who looked just like Wilson at a bar, but weren't sure that it had been Wilson, because he'd been acting altogether differently. On other occasions, Wilson would find things in his pockets and have no idea how they got there. He sometimes said he was born in 1955 and at other times said 1948.
What he had, in other words, were the classic symptoms of dissociation, and when Lewis and Pincus dug deeper into his history they began to understand why. Wilson's medical records detailed a seemingly endless list of hospitalizations for accidents, falls, periods of unconsciousness, and "sunstroke," dating from the time Wilson was two through his teens-the paper trail of a childhood marked by extraordinary trauma and violence. In his report to Wilson's attorneys, based on his examination of Wilson, Pincus wrote that there had been "many guns" in the home and that Wilson was often shot at as a child. He was also beaten "with a bull whip, 2x4's, a hose, pipes, a tree approximately 4 inches in diameter, wire, a piece of steel and belt buckles . . . on his back, legs, abdomen and face," until "he couldn't walk." Sometimes, when the beatings became especially intense, Wilson would have to "escape from the house and live in the fields for as long as two weeks." A kindly relative would leave food surreptitiously for him. The report goes on:
As a result of his beatings David was ashamed to go to school lest he be seen with welts. He would "lie down in the cold sand in a hut" near his home to recuperate for several days rather than go to school.
At the hearing, Lewis argued that when Wilson said he had no memory of shooting Stinson he was actually telling the truth. The years of abuse had hurt his ability to retrieve memories. Lewis also argued that Wilson had a violent side that he was, quite literally, unaware of; that he had the classic personality polarization of the severely abused who develop dissociative identity disorder. Lewis has videotapes of her sessions with Wilson: he is a handsome man with long fine black hair, sharply defined high cheekbones, and large, soft eyes. In the videotapes, he looks gentle. "During the hearing," Lewis recalls, "I was testifying, and I looked down at the defense table and David wasn't there. You know, David is a sweetie. He has a softness and a lovable quality. Instead, seated in his place there was this glowering kind of character, and I interrupted myself. I said, 'Excuse me, Your Honor, I just wanted to call to your attention that that is not David.' Everyone just looked." In the end, the judge vacated Wilson's death sentence.
Lewis talks a great deal about the Wilson case. It is one of the few instances in which she and Pincus succeeded in saving a defendant from the death penalty, and when she talks about what happened she almost always uses one of her favorite words-"poignant," spoken with a special emphasis, with a hesitation just before and just afterward. "In the course of evaluating someone, I always look for scars," Lewis told me. We were sitting in her Bellevue offices, watching the video of her examination of Wilson, and she was remembering the poignant moment she first met him. "Since I was working with a male psychologist, I said to him, 'Would you be good enough to go into the bathroom and look at David's back?' So he did that, and then he came back out and said, 'Dorothy! You must come and see this.' David had scars all over his back and chest. Burn marks. Beatings. I've seen a lot. But that was really grotesque."
5.
Abuse, in and of itself, does not necessarily result in violence, any more than neurological impairment or psychosis does. Lewis and Pincus argue, however, that if you mix these conditions together they become dangerous, that they have a kind of pathological synergy, that, like the ingredients of a bomb, they are troublesome individually but explosive in combination.
Several years ago, Lewis and some colleagues did a followup study of ninety-five male juveniles she and Pincus had first worked with in the late nineteen-seventies, in Connecticut. She broke the subjects into several groups: Group 1 consisted of those who did not have psychiatric or neurological vulnerabilities or an abusive childhood; Group 2 consisted of those with vulnerabilities but no abuse at home; Group 3 consisted of those with abuse but no vulnerabilities; yet another group consisted of those with abuse and extensive vulnerabilities. Seven years later, as adults, those in Group 1 had been arrested for an average of just over two criminal offenses, none of which were violent, so the result was essentially no jail time. Group 2, the psychiatrically or neurologically impaired kids, had been convicted of an average of almost ten offenses, two of which were violent, the result being just under a year of jail time. Group 3, the abused kids, had 11.9 offenses, 1.9 of them violent, the result being five hundred and sixty-two days in jail. But the group of children who had the most vulnerabilities and abuse were in another league entirely. In the intervening seven years, they had been arrested for, on average, 16.8 crimes, 5.4 of which were violent, the result being a thousand two hundred and fourteen days in prison.
In another study on this topic, a University of Southern California psychologist named Adrian Raine looked at four thousand two hundred and sixty-nine male children born and living in Denmark, and classified them according to two variables. The first was whether there were complications at birth-which correlates, loosely, with neurological impairment. The second was whether the child had been rejected by the mother (whether the child was unplanned, unwanted, and so forth)-which correlates, loosely, with abuse and neglect. Looking back eighteen years later, Raine found that those children who had not been rejected and had had no birth complications had roughly the same chance of becoming criminally violent as those with only one of the risk factors-around three per cent. For the children with both complications and rejection, however, the risk of violence tripled: in fact, the children with both problems accounted for eighteen per cent of all the violent crimes, even though they made up only 4.5 per cent of the group.
There is in these statistics a powerful and practical suggestion for how to prevent crime. In the current ideological climate, liberals argue that fighting crime requires fighting poverty, and conservatives argue that fighting crime requires ever more police and prisons; both of these things may be true, but both are also daunting. The studies suggest that there may be instances in which more modest interventions can bring large dividends. Criminal behavior that is associated with specific neurological problems is behavior that can, potentially, be diagnosed and treated like any other illness. Already, for example, researchers have found drugs that can mimic the cortical function of moderating violent behavior. The work is preliminary but promising. "We are on the cusp of a revolution in treating these conditions," Stuart Yudofsky told me. "We can use anticonvulsants, antidepressants, anti-hypertensive medications. There are medications out there that are F.D.A.-approved for other conditions which have profound effects on mitigating aggression." At the prevention end, as well, there's a strong argument for establishing aggressive child-abuse-prevention programs. Since 1992, for example, the National Committee to Prevent Child Abuse, a not-for-profit advocacy group based in Chicago, has been successfully promoting a program called Healthy Families America, which, working with hospitals, prenatal clinics, and physicians, identifies mothers in stressful and potentially abusive situations either before they give birth or immediately afterward, and then provides them with weekly home visits, counselling, and support for as long as five years. The main thing holding back nationwide adoption of programs like this is money: Healthy Families America costs up to two thousand dollars per family per year, but if we view it as a crime-prevention measure that's not a large sum.
These ideas, however, force a change in the way we think about criminality. Advances in the understanding of human behavior are necessarily corrosive of the idea of free will. That much is to be expected, and it is why courts have competency hearings, and legal scholars endlessly debate the definition and the use of the insanity defense. But the new research takes us one step further. If the patient of Yudofsky's who lashed out at his nurse because his orange juice was warm had, in the process, accidentally killed her, could we really hold him criminally responsible? Yudofsky says that that scenario is no different from one involving a man who is driving a car, has a heart attack, and kills a pedestrian. "Would you put him in jail?" he asks. Or consider Joseph Paul Franklin. By all accounts, he suffered through a brutal childhood on a par with that of David Wilson. What if he has a lesion on one of his frontal lobes, an atrophied hippocampus, a damaged and immature corpus callosum, a maldeveloped left hemisphere, a lack of synaptic complexity in the precortical limbic area, a profound left-right hemisphere split? What if in his remorselessness he was just the grownup version of the little boy Martin, whose ability to understand and relate to others was so retarded that he kept on hitting and hitting, even after the screams began? What if a history of abuse had turned a tendency toward schizophrenia-recall Franklin's colorful delusions-from a manageable impairment into the engine of murderousness? Such a person might still be sane, according to the strict legal definition. But that kind of medical diagnosis suggests, at the very least, that his ability to live by the rules of civilized society, and to understand and act on the distinctions between right and wrong, is quite different from that of someone who had a normal childhood and a normal brain.
What is implied by these questions is a far broader debate over competency and responsibility-an attempt to make medical considerations far more central to the administration of justice, so that we don't bring in doctors only when the accused seems really crazy but, rather, bring in doctors all the time, to add their expertise to the determination of responsibility.
One of the state-of-the-art diagnostic tools in neurology and psychiatry is the PET scan, a computerized imaging technique that tracks the movement and rate of the body's metabolism. When you sing, for instance, the neurons in the specific regions that govern singing will start to fire. Blood will flow toward those regions, and if you take a PET scan at that moment the specific areas responsible for singing will light up on the PET computer monitor. Bremner, at Yale, has done PET scans of Vietnam War veterans suffering from post-traumatic-stress disorder. As he scanned the vets, he showed them a set of slides of Vietnam battle scenes accompanied by an appropriate soundtrack of guns and helicopters. Then he did the same thing with vets who were not suffering from P.T.S.D. Bremner printed out the results of the comparison for me, and they are fascinating. The pictures are color-coded. Blue shows the parts of the brain that were being used identically in the two groups of veterans, and most of each picture is blue. A few parts are light blue or green, signifying that the P.T.S.D. vets were using those regions a little less than the healthy vets were. The key color, however, is white. White shows brain areas that the healthy vets were using as they watched the slide show and the unhealthy vets were hardly using at all; in Bremner's computer printout, there is a huge white blob in the front of every non-P.T.S.D. scan.
"That's the orbitofrontal region," Bremner told me. "It's responsible for the extinction of fear." The orbitofrontal region is the part of your brain that evaluates the primal feelings of fear and anxiety which come up from the brain's deeper recesses. It's the part that tells you that you're in a hospital watching a slide show of the Vietnam War, not in Vietnam living through the real thing. The vets with P.T.S.D. weren't using that part of their brain. That's why every time a truck backfires or they see a war picture in a magazine they are forced to relive their wartime experiences: they can't tell the difference.
It doesn't take much imagination to see that this technique might someday be used to evaluate criminals-to help decide whether to grant parole, for example, or to find out whether some kind of medical treatment might aid reëntry into normal society. We appear to be creating a brand-new criminal paradigm: the research suggests that instead of thinking about and categorizing criminals merely by their acts-murder, rape, armed robbery, and so on-we ought to categorize criminals also according to their disabilities, so that a murderer with profound neurological damage and a history of vicious childhood abuse is thought of differently from a murderer with no brain damage and mild child abuse, who is, in turn, thought of differently from a murderer with no identifiable impairment at all. This is a more flexible view. It can be argued that it is a more sophisticated view. But even those engaged in such research-for example, Pincus-confess to discomfort at its implications, since something is undoubtedly lost in the translation. The moral force of the old standard, after all, lay in its inflexibility. Murder was murder, and the allowances made for aggravated circumstances were kept to a minimum. Is a moral standard still a moral standard when it is freighted with exceptions and exemptions and physiological equivocation?
When Lewis went to see Bundy, in Florida, on the day before his execution, she asked him why he had invited her-out of a great many people lining up outside his door-to see him. He answered, "Because everyone else wants to know what I did. You are the only one who wants to know why I did it." It's impossible to be sure what the supremely manipulative Bundy meant by this: whether he genuinely appreciated Lewis, or whether he simply regarded her as his last conquest. What is clear is that, over the four or five times they met in Bundy's last years, the two reached a curious understanding: he was now part of her scientific enterprise.
"I wasn't writing a book about him," Lewis recalls. "That he knew. The context in which he had first seen me was a scientific study, and this convinced him that I wasn't using him. In the last meeting, as I recall, he said that he wanted any material that I found out about him to be used to understand what causes people to be violent. We even discussed whether he would allow his brain to be studied. It was not an easy thing to talk about with him, let me tell you." At times, Lewis says, Bundy was manic, "high as a kite." On one occasion, he detailed to her just how he had killed a woman, and, on another occasion, he stared at her and stated flatly, "The man sitting across from you did not commit any murders." But she says that at the end she sensed a certain breakthrough. "The day before he was executed, he asked me to turn off the tape recorder. He said he wanted to tell me things that he didn't want recorded, so I didn't record them. It was very confidential." To this day, Lewis has never told anyone what Bundy said. There is something almost admirable about this. But there is also something strange about extending the physician-patient privilege to a killer like Bundy-about turning the murderer so completely into a patient. It is not that the premise is false, that murderers can't also be patients. It's just that once you make that leap-once you turn the criminal into an object of medical scrutiny-the crime itself inevitably becomes pushed aside and normalized. The difference between a crime of evil and a crime of illness is the difference between a sin and a symptom. And symptoms don't intrude in the relationship between the murderer and the rest of us: they don't force us to stop and observe the distinctions between right and wrong, between the speakable and the unspeakable, the way sins do. It was at the end of that final conversation that Bundy reached down and kissed Lewis on the cheek. But that was not all that happened. Lewis then reached up, put her arms around him, and kissed him back.
Listening to Khakis
July 28, 1997
ANNALS OF STYLE
What America's most popular pants tell us about the way guys think.
1.
In the fall of 1987, Levi Strauss & Co. began running a series of national television commercials to promote Dockers, its new brand of men's khakis. All the spots-and there were twenty-eight-had the same basic structure. A handheld camera would follow a group of men as they sat around a living room or office or bar. The men were in their late thirties, but it was hard to tell, because the camera caught faces only fleetingly. It was trained instead on the men from the waist down-on the seats of their pants, on the pleats of their khakis, on their hands going in and out of their pockets. As the camera jumped in quick cuts from Docker to Docker, the men chatted in loose, overlapping non sequiturs-guy-talk fragments that, when they are rendered on the page, achieve a certain Dadaist poetry. Here is the entire transcript of "Poolman," one of the first-and, perhaps, best-ads in the series:
"She was a redhead about five foot six inches tall."
"And all of a sudden this thing starts spinning, and it's going round and round."
"Is that Nelson?"
"And that makes me safe, because with my wife, I'll never be that way."
"It's like your career, and you're frustrated. I mean that-that's-what you want."
"Of course, that's just my opinion."
"So money's no object."
"Yeah, money's no object."
"What are we going to do with our lives, now?"
"Well . . ."
"Best of all . . ."
[Voice-over] "Levi's one-hundred-per-cent-cotton Dockers. If you're not wearing Dockers, you're just wearing pants."
"And I'm still paying the loans off."
"You've got all the money in the world."
"I'd like to at least be your poolman."
By the time the campaign was over, at the beginning of the nineties, Dockers had grown into a six-hundred-million-dollar business-a brand that if it had spun off from Levi's would have been (and would still be) the fourth-largest clothing brand in the world. Today, seventy per cent of American men between the ages of twenty-five and forty-five own a pair of Dockers, and khakis are expected to be as popular as blue jeans by the beginning of the next century. It is no exaggeration to call the original Dockers ads one of the most successful fashion-advertising campaigns in history.
This is a remarkable fact for a number of reasons, not the least of which is that the Dockers campaign was aimed at men, and no one had ever thought you could hit a home run like that by trying to sell fashion to the American male. Not long ago, two psychologists at York University, in Toronto-Irwin Silverman and Marion Eals-conducted an experiment in which they had men and women sit in an office for two minutes, without any reading material or distraction, while they ostensibly waited to take part in some kind of academic study. Then they were taken from the office and given the real reason for the experiment: to find out how many of the objects in the office they could remember. This was not a test of memory so much as it was a test of awareness-of the kind and quality of unconscious attention that people pay to the particulars of their environment. If you think about it, it was really a test of fashion sense, because, at its root, this is what fashion sense really is-the ability to register and appreciate and remember the details of the way those around you look and dress, and then reinterpret those details and memories yourself.
When the results of the experiment were tabulated, it was found that the women were able to recall the name and the placement of seventy per cent more objects than the men, which makes perfect sense. Women's fashion, after all, consists of an endless number of subtle combinations and variations-of skirt, dress, pants, blouse, T-shirt, hose, pumps, flats, heels, necklace, bracelet, cleavage, collar, curl, and on and on-all driven by the fact that when a woman walks down the street she knows that other women, consciously or otherwise, will notice the name and the placement of what she is wearing. Fashion works for women because women can appreciate its complexity. But when it comes to men what's the point? How on earth do you sell fashion to someone who has no appreciation for detail whatsoever?
The Dockers campaign, however, proved that you could sell fashion to men. But that was only the first of its remarkable implications. The second-which remains as weird and mysterious and relevant to the fashion business today as it was ten years ago-was that you could do this by training a camera on a man's butt and having him talk in yuppie gibberish.
2.
I watched "Poolman" with three members of the new team handling the Dockers account at Foote, Cone & Belding (F.C.B.), Levi's ad agency. We were in a conference room at Levi's Plaza, in downtown San Francisco, a redbrick building decorated (appropriately enough) in khaki like earth tones, with the team members-Chris Shipman, Iwan Thomis, and Tanyia Kandohla-forming an impromptu critical panel. Shipman, who had thick black glasses and spoke in an almost inaudible laid-back drawl, put a videocassette of the first campaign into a VCR-stopping, starting, and rewinding-as the group analyzed what made the spots so special.
"Remember, this is from 1987," he said, pointing to the screen, as the camera began its jerky dance. "Although this style of film making looks everyday now, that kind of handheld stuff was very fresh when these were made."
"They taped real conversations," Kandohla chimed in. "Then the footage was cut together afterward. They were thrown areas to talk about. It was very natural, not at all scripted. People were encouraged to go off on tangents."
After "Poolman," we watched several of the other spots in the original group-"Scorekeeper" and "Dad's Chair," "Flag Football," and "The Meaning of Life"-and I asked about the headlessness of the commercials, because if you watch too many in a row all those anonymous body parts begin to get annoying. But Thomis maintained that the headlessness was crucial, because it was the absence of faces that gave the dialogue its freedom. "They didn't show anyone's head because if they did the message would have too much weight," he said. "It would be too pretentious. You know, people talking about their hopes and dreams. It seems more genuine, as opposed to something stylized."
The most striking aspect of the spots is how different they are from typical fashion advertising. If you look at men's fashion magazines, for example, at the advertisements for the suits of Ralph Lauren or Valentino or Hugo Boss, they almost always consist of a beautiful man, with something interesting done to his hair, wearing a gorgeous outfit. At the most, the man may be gesturing discreetly, or smiling in the demure way that a man like that might smile after, say, telling the supermodel at the next table no thanks he has to catch an early-morning flight to Milan. But that's all. The beautiful face and the clothes tell the whole story. The Dockers ads, though, are almost exactly the opposite. There's no face. The camera is jumping around so much that it's tough to concentrate on the clothes. And instead of stark simplicity, the fashion image is overlaid with a constant, confusing patter. It's almost as if the Dockers ads weren't primarily concerned with clothes at all-and in fact that's exactly what Levi's intended. What the company had discovered, in its research, was that baby-boomer men felt that the chief thing missing from their lives was male friendship. Caught between the demands of the families that many of them had started in the eighties and career considerations that had grown more onerous, they felt they had lost touch with other men. The purpose of the ads-the chatter, the lounging around, the quick cuts-was simply to conjure up a place where men could put on one-hundred-per-cent-cotton khakis and reconnect with one another. In the original advertising brief, that imaginary place was dubbed Dockers World.
This may seem like an awfully roundabout way to sell a man a pair of pants. But that was the genius of the campaign. One of the truisms of advertising is that it's always easier to sell at the extremes than in the middle, which is why the advertisements for Valentino and Hugo Boss are so simple. The man in the market for a thousand-dollar suit doesn't need to be convinced of the value of nice clothes. The man in the middle, though-the man in the market for a forty-dollar pair of khakis-does. In fact, he probably isn't comfortable buying clothes at all. To sell him a pair of pants you have to take him somewhere he is comfortable, and that was the point of Dockers World. Even the apparent gibberish of lines like " 'She was a redhead about five foot six inches tall.' / 'And all of a sudden this thing starts spinning, and it's going round and round.' / 'Is that Nelson?' " has, if you listen closely enough, a certain quintessentially guy-friendly feel. It's the narrative equivalent of the sports-highlight reel-the sequence of five-second film clips of the best plays from the day's basketball or football or baseball games, which millions of American men watch every night on television. This nifty couplet from "Scorekeeper," for instance-" 'Who remembers their actual first girlfriend?'/ 'I would have done better, but I was bald then, too' "-is not nonsense but a twenty-minute conversation edited down to two lines. A man schooled in the highlight reel no more needs the other nineteen minutes and fifty-eight seconds of that exchange than he needs to see the intervening catch and throw to make sense of a sinking liner to left and a close play at the plate.
"Men connected to the underpinnings of what was being said," Robert Hanson, the vice-president of marketing for Dockers, told me. "These guys were really being honest and genuine and real with each other, and talking about their lives. It may not have been the truth, but it was the fantasy of what a lot of customers wanted, which was not just to be work-focussed but to have the opportunity to express how you feel about your family and friends and lives. The content was very important. The thing that built this brand was that we absolutely nailed the emotional underpinnings of what motivates baby boomers."
Hanson is a tall, striking man in his early thirties. He's what Jeff Bridges would look like if he had gone to finishing school. Hanson said that when he goes out on research trips to the focus groups that Dockers holds around the country he often deliberately stays in the background, because if the men in the group see him "they won't necessarily respond as positively or as openly." When he said this, he was wearing a pair of stone-white Dockers, a deep-blue shirt, a navy blazer, and a brilliant-orange patterned tie, and these worked so well together that it was obvious what he meant. When someone like Hanson dresses up that fabulously in Dockers, he makes it clear just how many variations and combinations are possible with a pair of khakis-but that, of course, defeats the purpose of the carefully crafted Dockers World message, which is to appeal to the man who wants nothing to do with fashion's variations and combinations. It's no coincidence that every man in every one of the group settings profiled in each commercial is wearing-albeit in different shades-exactly the same kind of pants. Most fashion advertising sells distinctiveness. (Can you imagine, say, an Ann Taylor commercial where a bunch of thirtyish girlfriends are lounging around chatting, all decked out in matching sweater sets?) Dockers was selling conformity.
"We would never do anything with our pants that would frighten anyone away," Gareth Morris, a senior designer for the brand, told me. "We'd never do too many belt loops, or an unusual base cloth. Our customers like one-hundred-per-cent-cotton fabrics. We would never do a synthetic. That's definitely in the market, but it's not where we need to be. Styling-wise, we would never do a wide, wide leg. We would never do a peg-legged style. Our customers seem to have a definite idea of what they want. They don't like tricky openings or zips or a lot of pocket flaps and details on the back. We've done button-through flaps, to push it a little bit. But we usually do a welt pocket-that's a pocket with a button-through. It's funny. We have focus groups in New York, Chicago, and San Francisco, and whenever we show them a pocket with a flap-it's a simple thing-they hate it. They won't buy the pants. They complain, 'How do I get my wallet?' So we compromise and do a welt. That's as far as they'll go. And there's another thing. They go, 'My butt's big enough. I don't want flaps hanging off of it, too.' They like inseam pockets. They like to know where they put their hands." He gestured to the pair of experimental prototype Dockers he was wearing, which had pockets that ran almost parallel to the waistband of the pants. "This is a stretch for us," he said. "If you start putting more stuff on than we have on our product, you're asking for trouble."
The apotheosis of the notion of khakis as nonfashion-guy fashion came several years after the original Dockers campaign, when Haggar Clothing Co. hired the Goodby, Silverstein & Partners ad agency, in San Francisco, to challenge Dockers' khaki dominance. In retrospect, it was an inspired choice, since Goodby, Silverstein is Guy Central. It does Porsche ("Kills Bugs Fast") and Isuzu and the recent "Got Milk?" campaign and a big chunk of the Nike business, and it operates out of a gutted turn-of-the-century building downtown, refurbished in what is best described as neo-Erector set. The campaign that it came up with featured voice-overs by Roseanne's television husband, John Goodman. In the best of the ads, entitled "I Am," a thirtyish man wakes up, his hair all mussed, pulls on a pair of white khakis, and half sleepwalks outside to get the paper. "I am not what I wear. I'm not a pair of pants, or a shirt," Goodman intones. The man walks by his wife, handing her the front sections of the paper. "I'm not in touch with my inner child. I don't read poetry, and I'm not politically correct." He heads away from the kitchen, down a hallway, and his kid grabs the comics from him. "I'm just a guy, and I don't have time to think about what I wear, because I've got a lot of important guy things to do." All he has left now is the sports section and, gripping it purposefully, he heads for the bathroom. "One-hundred-per-cent-cotton wrinkle-free khaki pants that don't require a lot of thought. Haggar. Stuff you can wear."
"We softened it," Richard Silverstein told me as we chatted in his office, perched on chairs in the midst of-among other things--a lacrosse stick, a bike stand, a gym bag full of yesterday's clothing, three toy Porsches, and a giant model of a Second World War Spitfire hanging from the ceiling. "We didn't say 'Haggar Apparel' or 'Haggar Clothing.' We said, 'Hey, listen, guys, don't worry. It's just stuff. Don't worry about it.' The concept was 'Make it approachable.' " The difference between this and the Dockers ad is humor. F.C.B. assiduously documented men's inner lives. Goodby, Silverstein made fun of them. But it's essentially the same message. It's instructive, in this light, to think about the Casual Friday phenomenon of the past decade, the loosening of corporate dress codes that was spawned by the rise of khakis. Casual Fridays are commonly thought to be about men rejecting the uniform of the suit. But surely that's backward. Men started wearing khakis to work because Dockers and Haggar made it sound as if khakis were going to be even easier than a suit. The khaki-makers realized that men didn't want to get rid of uniforms; they just wanted a better uniform.
The irony, of course, is that this idea of nonfashion-of khakis as the choice that diminishes, rather than enhances, the demands of fashion-turned out to be a white lie. Once you buy even the plainest pair of khakis, you invariably also buy a sports jacket and a belt and a whole series of shirts to go with it-maybe a polo knit for the weekends, something in plaid for casual, and a button-down for a dressier look-and before long your closet is thick with just the kinds of details and options that you thought you were avoiding. You may not add these details as brilliantly or as consciously as, say, Hanson does, but you end up doing it nonetheless. In the past seven years, sales of men's clothing in the United States have risen an astonishing twenty-one per cent, in large part because of this very fact-that khakis, even as they have simplified the bottom half of the male wardrobe, have forced a steady revision of the top. At the same time, even khakis themselves-within the narrow constraints of khakidom-have quietly expanded their range. When Dockers were launched, in the fall of 1986, there were just three basic styles: the double-pleated Docker in khaki, olive, navy, and black; the Steamer, in cotton canvas; and the more casual flat-fronted Docker. Now there are twenty-four. Dockers and Haggar and everyone else has been playing a game of bait and switch: lure men in with the promise of a uniform and then slip them, bit by bit, fashion. Put them in an empty room and then, ever so slowly, so as not to scare them, fill the room with objects.
3.
There is a puzzle in psychology known as the canned-laughter problem, which has a deeper and more complex set of implications about men and women and fashion and why the Dockers ads were so successful. Over the years, several studies have been devoted to this problem, but perhaps the most instructive was done by two psychologists at the University of Wisconsin, Gerald Cupchik and Howard Leventhal. Cupchik and Leventhal took a stack of cartoons (including many from The New Yorker), half of which an independent panel had rated as very funny and half of which it had rated as mediocre. They put the cartoons on slides, had a voice-over read the captions, and presented the slide show to groups of men and women. As you might expect, both sexes reacted pretty much the same way. Then Cupchik and Leventhal added a laugh track to the voice-over-the subjects were told that it was actual laughter from people who were in the room during the taping-and repeated the experiment. This time, however, things got strange. The canned laughter made the women laugh a little harder and rate the cartoons as a little funnier than they had before. But not the men. They laughed a bit more at the good cartoons but much more at the bad cartoons. The canned laughter also made them rate the bad cartoons as much funnier than they had rated them before, but it had little or no effect on their ratings of the good cartoons. In fact, the men found a bad cartoon with a laugh track to be almost as funny as a good cartoon without one. What was going on?
The guru of male-female differences in the ad world is Joan Meyers-Levy, a professor at the University of Chicago business school. In a groundbreaking series of articles written over the past decade, Meyers-Levy has explained the canned-laughter problem and other gender anomalies by arguing that men and women use fundamentally different methods of processing information. Given two pieces of evidence about how funny something is-their own opinion and the opinion of others (the laugh track)-the women came up with a higher score than before because they added the two clues together: they integrated the information before them. The men, on the other hand, picked one piece of evidence and ignored the other. For the bad cartoons, they got carried away by the laugh track and gave out hugely generous scores for funniness. For the good cartoons, however, they were so wedded to their own opinion that suddenly the laugh track didn't matter at all.
This idea-that men eliminate and women integrate-is called by Meyers-Levy the "selectivity hypothesis." Men are looking for a way to simplify the route to a conclusion, so they seize on the most obvious evidence and ignore the rest, while women, by contrast, try to process information comprehensively. So-called bandwidth research, for example, has consistently shown that if you ask a group of people to sort a series of objects or ideas into categories, the men will create fewer and larger categories than the women will. They use bigger mental bandwidths. Why? Because the bigger the bandwidth the less time and attention you have to pay to each individual object. Or consider what is called the invisibility question. If a woman is being asked a series of personal questions by another woman, she'll say more if she's facing the woman she's talking to than she will if her listener is invisible. With men, it's the opposite. When they can't see the person who's asking them questions, they suddenly and substantially open up. This, of course, is a condition of male communication which has been remarked on by women for millennia. But the selectivity hypothesis suggests that the cause of it has been misdiagnosed. It's not that men necessarily have trouble expressing their feelings; it's that in a face-to-face conversation they experience emotional overload. A man can't process nonverbal information (the expression and body language of the person asking him questions) and verbal information (the personal question being asked) at the same time any better than he can process other people's laughter and his own laughter at the same time. He has to select, and it is Meyers-Levy's contention that this pattern of behavior suggests significant differences in the way men and women respond to advertising.
Joan Meyers-Levy is a petite woman in her late thirties, with a dark pageboy haircut and a soft voice. She met me in the downtown office of the University of Chicago with three large folders full of magazine advertisements under one arm, and after chatting about the origins and the implications of her research she handed me an ad from several years ago for Evian bottled water. It has a beautiful picture of the French Alps and, below that, in large type, "Our factory." The text ran for several paragraphs, beginning:
You're not just looking at the French Alps. You're looking at one of the most pristine places on earth. And the origin of Evian Natural Spring Water.
Here, it takes no less than 15 years for nature to purify every drop of Evian as it flows through mineral-rich glacial formations deep within the mountains. And it is here that Evian acquires its unique balance of minerals.
"Now, is that a male or a female ad?" she asked. I looked at it again. The picture baffled me. But the word "factory" seemed masculine, so I guessed male.
She shook her head. "It's female. Look at the picture. It's just the Alps, and then they label it 'Our factory.' They're using a metaphor. To understand this, you're going to have to engage in a fair amount of processing. And look at all the imagery they're encouraging you to build up. You're not just looking at the French Alps. It's 'one of the most pristine places on earth' and it will take nature 'no less than fifteen years' to purify." Her point was that this is an ad that works only if the viewer appreciates all its elements-if the viewer integrates, not selects. A man, for example, glancing at the ad for a fraction of a second, might focus only on the words "Our factory" and screen out the picture of the Alps entirely, the same way he might have screened out the canned laughter. Then he wouldn't get the visual metaphor. In fact, he might end up equating Evian with a factory, and that would be a disaster. Anyway, why bother going into such detail about the glaciers if it's just going to get lost in the big male bandwidth?
Meyers-Levy handed me another Evian advertisement. It showed a man-the Olympic Gold Medal swimmer Matt Biondi-by a pool drinking Evian, with the caption "Revival of the fittest." The women's ad had a hundred and nineteen words of text. This ad had just twenty-nine words: "No other water has the unique, natural balance of minerals that Evian achieves during its 15-year journey deep within the French Alps. To be the best takes time." Needless to say, it came from a men's magazine. "With men, you don't want the fluff," she said. "Women, though, participate a lot more in whatever they are processing. By giving them more cues, you give them something to work with. You don't have to be so literal. With women you can be more allusive, so you can draw them in. They will engage in elaboration, and the more associations they make the easier it is to remember and retrieve later on."
Meyers-Levy took a third ad from her pile, this one for the 1997 Mercury Mountaineer four-wheel-drive sport-utility vehicle. It covers two pages, has the heading "Take the Rough with the Smooth," and shows four pictures-one of the vehicle itself, one of a mother and her child, one of a city skyline, and a large one of the interior of the car, over which the ad's text is superimposed. Around the border of the ad are forty-four separate, tiny photographs of roadways and buildings and construction sites and manhole covers. Female. Next to it on the table she put another ad-this one a single page, with a picture of the Mountaineer's interior, fifteen lines of text, a picture of the car's exterior, and, at the top, the heading: "When the Going Gets Tough, the Tough Get Comfortable." Male. "It's details, details. They're saying lots of different stuff," she said, pointing to the female version. "With men, instead of trying to cover everything in a single execution, you'd probably want to have a whole series of ads, each making a different point."
After a while, the game got very easy-if a bit humiliating. Meyers-Levy said that her observations were not antimale-that both the male and the female strategies have their strengths and their weaknesses-and, of course, she's right. On the other hand, reading the gender of ads makes it painfully obvious how much the advertising world-consciously or not-talks down to men. Before I met Meyers-Levy, I thought that the genius of the famous first set of Dockers ads was their psychological complexity, their ability to capture the many layers of eighties guyness. But when I thought about them again after meeting Meyers-Levy, I began to think that their real genius lay in their heroic simplicity-in the fact that F.C.B. had the self-discipline to fill the allotted thirty seconds with as little as possible. Why no heads? The invisibility rule. Guys would never listen to that Dadaist extemporizing if they had to process nonverbal cues, too. Why were the ads set in people's living rooms and at the office? Bandwidth. The message was that khakis were wide-bandwidth pants. And why were all the ads shot in almost exactly the same way, and why did all the dialogue run together in one genial, faux-philosophical stretch of highlight reel? Because of canned laughter. Because if there were more than one message to be extracted men would get confused.
4.
In the early nineties, Dockers began to falter. In 1992, the company sold sixty-six million pairs of khakis, but in 1993, as competition from Haggar and the Gap and other brands grew fiercer, that number slipped to fifty-nine million six hundred thousand, and by 1994 it had fallen to forty-seven million. In marketing-speak, user reality was encroaching on brand personality; that is, Dockers were being defined by the kind of middle-aged men who wore them, and not by the hipper, younger men in the original advertisements. The brand needed a fresh image, and the result was the "Nice Pants" campaign currently being shown on national television-a campaign widely credited with the resurgence of Dockers' fortunes. In one of the spots, "Vive la France," a scruffy young man in his early twenties, wearing Dockers, is sitting in a café in Paris. He's obviously a tourist. He glances up and sees a beautiful woman (actually, the supermodel Tatjana Patitz) looking right at him. He's in heaven. She starts walking directly toward him, and as she passes by she says, "Beau pantalon." As he looks frantically through his French phrase book for a translation, the waiter comes by and cuffs him on the head: "Hey, she says, 'Nice pants.' " Another spot in the series, "Subway Love," takes place on a subway car in Chicago. He (a nice young man wearing Dockers) spots her (a total babe), and their eyes lock. Romantic music swells. He moves toward her, but somehow, in a sudden burst of pushing and shoving, they get separated. Last shot: she's inside the car, her face pushed up against the glass. He's outside the car, his face pushed up against the glass. As the train slowly pulls away, she mouths two words: "Nice pants."
It may not seem like it, but "Nice Pants" is as radical a campaign as the original Dockers series. If you look back at the way that Sansabelt pants, say, were sold in the sixties, each ad was what advertisers would call a pure "head" message: the pants were comfortable, durable, good value. The genius of the first Dockers campaign was the way it combined head and heart: these were all-purpose, no-nonsense pants that connected to the emotional needs of baby boomers. What happened to Dockers in the nineties, though, was that everyone started to do head and heart for khakis. Haggar pants were wrinkle-free (head) and John Goodman-guy (heart). The Gap, with its brilliant billboard campaign of the early nineties-"James Dean wore khakis," "Frank Lloyd Wright wore khakis"-perfected the heart message by forging an emotional connection between khakis and a particular nostalgic, glamorous all-Americanness. To reassert itself, Dockers needed to go an extra step. Hence "Nice Pants," a campaign that for the first time in Dockers history raises the subject of sex.
"It's always been acceptable for a man to be a success in business," Hanson said, explaining the rationale behind "Nice Pants." "It's always been expected of a man to be a good provider. The new thing that men are dealing with is that it's O.K. for men to have a sense of personal style, and that it's O.K. to be seen as sexy. It's less about the head than about the combination of the head, the heart, and the groin. It's those three things. That's the complete man."
The radical part about this, about adding the groin to the list, is that almost no other subject for men is as perilous as the issue of sexuality and fashion. What "Nice Pants" had to do was talk about sex the same way that "Poolman" talked about fashion, which was to talk about it by not talking about it-or, at least, to talk about it in such a coded, cautious way that no man would ever think Dockers was suggesting that he wear khakis in order to look pretty. When I took a videotape of the "Nice Pants" campaign to several of the top agencies in New York and Los Angeles, virtually everyone agreed that the spots were superb, meaning that somehow F.C.B. had managed to pull off this balancing act.
What David Altschiller, at Hill, Holliday/Altschiller, in Manhattan, liked about the spots, for example, was that the hero was naïve: in neither case did he know that he had on nice pants until a gorgeous woman told him so. Naïveté, Altschiller stressed, is critical. Several years ago, he did a spot for Claiborne for Men cologne in which a great-looking guy in a bar, wearing a gorgeous suit, was obsessing neurotically about a beautiful woman at the other end of the room: "I see this woman. She's perfect. She's looking at me. She's smiling. But wait. Is she smiling at me? Or laughing at me? . . . Or looking at someone else?" You'd never do this in an ad for women's cologne. Can you imagine? "I see this guy. He's perfect. Ohmigod. Is he looking at me?" In women's advertising, self-confidence is sexy. But if a man is self-confident-if he knows he is attractive and is beautifully dressed-then he's not a man anymore. He's a fop. He's effeminate. The cologne guy had to be neurotic or the ad wouldn't work. "Men are still abashed about acknowledging that clothing is important," Altschiller said. "Fashion can't be important to me as a man. Even when, in the first commercial, the waiter says 'Nice pants,' it doesn't compute to the guy wearing the nice pants. He's thinking, What do you mean, 'Nice pants'?" Altschiller was looking at a videotape of the Dockers ad as he talked-standing at a forty-five-degree angle to the screen, with one hand on the top of the monitor, one hand on his hip, and a small, bemused smile on his lips. "The world may think they are nice, but so long as he doesn't think so he doesn't have to be self-conscious about it, and the lack of self-consciousness is very important to men. Because 'I don't care.' Or 'Maybe I care, but I can't be seen to care.' " For the same reason, Altschiller liked the relative understatement of the phrase "nice pants," as opposed to something like "great pants," since somewhere between "nice" and "great" a guy goes from just happening to look good to the unacceptable position of actually trying to look good. "In focus groups, men said that to be told you had 'nice pants' was one of the highest compliments a man could wish for," Tanyia Kandohla told me later, when I asked about the slogan. "They wouldn't want more attention drawn to them than that."
In many ways, the "Nice Pants" campaign is a direct descendant of the hugely successful campaign that Rubin-Postaer & Associates, in Santa Monica, did for Bugle Boy Jeans in the early nineties. In the most famous of those spots, the camera opens on an attractive but slightly goofy-looking man in a pair of jeans who is hitchhiking by the side of a desert highway. Then a black Ferrari with a fabulous babe at the wheel drives by, stops, and backs up. The babe rolls down the window and says, "Excuse me. Are those Bugle Boy Jeans that you're wearing?" The goofy guy leans over and pokes his head in the window, a surprised half smile on his face: "Why, yes, they are Bugle Boy Jeans."
"Thank you," the babe says, and she rolls up the window and drives away.
This is really the same ad as "Nice Pants"-the babe, the naïve hero, the punch line. The two ads have something else in common. In the Bugle Boy spot, the hero wasn't some stunning male model. "I think he was actually a box boy at Vons in Huntington Beach," Larry Postaer, the creative director of Rubin-Postaer & Associates, told me. "I guess someone"-at Bugle Boy-"liked him." He's O.K.-looking, but not nearly in the same class as the babe in the Ferrari. In "Subway Love," by the same token, the Dockers man is medium-sized, almost small, and gets pushed around by much tougher people in the tussle on the train. He's cute, but he's a little bit of a wimp. Kandohla says that F.C.B. tried very hard to find someone with that look-someone who was, in her words, "aspirational real," not some "buff, muscle-bound jock." In a fashion ad for women, you can use Claudia Schiffer to sell a cheap pair of pants. But not in a fashion ad for men. The guy has to be believable. "A woman cannot be too gorgeous," Postaer explained. "A man, however, can be too gorgeous, because then he's not a man anymore. It's pretty rudimentary. Yet there are people who don't buy that, and have gorgeous men in their ads. I don't get it. Talk to Barneys about how well that's working. It couldn't stay in business trying to sell that high-end swagger to a mass market. The general public wouldn't accept it. Look at beer commercials. They always have these gorgeous girls-even now, after all the heat-and the guys are always just guys. That's the way it is. We only reflect what's happening out there, we're not creating it. Those guys who run the real high-end fashion ads-they don't understand that. They're trying to remold how people think about gender. I can't explain it, though I have my theories. It's like a Grecian ideal. But you can't be successful at advertising by trying to re-create the human condition. You can't alter men's minds, particularly on subjects like sexuality. It'll never happen."
Postaer is a gruff, rangy guy, with a Midwestern accent and a gravelly voice, who did Budweiser commercials in Chicago before moving West fifteen years ago. When he wasn't making fun of the pretentious style of East Coast fashion advertising, he was making fun of the pretentious questions of East Coast writers. When, for example, I earnestly asked him to explain the logic behind having the goofy guy screw up his face in such a-well, goofy-way when he says, "Why, yes, they are Bugle Boy Jeans," Postaer took his tennis shoes off his desk, leaned forward bemusedly in his chair, and looked at me as if my head came to a small point. "Because that's the only way he could say it," he said. "I suppose we might have had him say it a little differently if he could actually act."
Incredibly, Postaer said, the people at Bugle Boy wanted the babe to invite the goofy guy into the car, despite the fact that this would have violated the most important rule that governs this new style of groin messages in men's-fashion advertising, which is that the guy absolutely cannot ever get the girl. It's not just that if he got the girl the joke wouldn't work anymore; it's that if he got the girl it might look as if he had deliberately dressed to get the girl, and although at the back of every man's mind as he's dressing in the morning there is the thought of getting the girl, any open admission that that's what he's actually trying to do would undermine the whole unself-conscious, antifashion statement that men's advertising is about. If Tatjana Patitz were to say "Beau garçon" to the guy in "Vive la France," or the babe on the subway were to give the wimp her number, Dockers would suddenly become terrifyingly conspicuous-the long-pants equivalent of wearing a tight little Speedo to the beach. And if the Vons box boy should actually get a ride from the Ferrari babe, the ad would suddenly become believable only to that thin stratum of manhood which thinks that women in Ferraris find twenty-four-dollar jeans irresistible. "We fought that tooth and nail," Postaer said. "And it more or less cost us the account, even though the ad was wildly successful." He put his tennis shoes back up on the desk. "But that's what makes this business fun-trying to prove to clients how wrong they are."
5.
The one ad in the "Nice Pants" campaign which isn't like the Bugle Boy spots is called "Motorcycle." In it a nice young man happens upon a gleaming Harley on a dark back street of what looks like downtown Manhattan. He strokes the seat and then, unable to contain himself, climbs aboard the bike and bounces up and down, showing off his Dockers (the "product shot") but accidentally breaking a mirror on the handlebar. He looks up. The Harley's owner-a huge, leather-clad biker-is looking down at him. The biker glowers, looking him up and down, and says, "Nice pants." Last shot: the biker rides away, leaving the guy standing on the sidewalk in just his underwear.
What's surprising about this ad is that, unlike "Vive la France" and "Subway Love," it does seem to cross the boundaries of acceptable sex talk. The rules of guy advertising so carefully observed in those spots-the fact that the hero has to be naïve, that he can't be too good-looking, that he can't get the girl, and that he can't be told anything stronger than "Nice pants"-are all, in some sense, reactions to the male fear of appearing too concerned with fashion, of being too pretty, of not being masculine. But what is "Motorcycle"? It's an ad about a sweet-looking guy down in the Village somewhere who loses his pants to a butch-looking biker in leather. "I got so much feedback at the time of 'Well, God, that's kind of gay, don't you think?' " Robert Hanson said. "People were saying, 'This buff guy comes along and he rides off with the guy's pants. I mean, what the hell were they doing?' It came from so many different people within the industry. It came from some of our most conservative retailers. But do you know what? If you put these three spots up-'Vive la France,' 'Subway Love,' and 'Motorcycle'-which one do you think men will talk about ad nauseam? 'Motorcycle.' It's No. 1. It's because he's really cool. He's in a really cool environment, and it's every guy's fantasy to have a really cool, tricked-out fancy motorcycle."
Hanson paused, as if he recognized that what he was saying was quite sensitive. He didn't want to say that men failed to pick up the gay implications of the ad because they're stupid, because they aren't stupid. And he didn't want to sound condescending, because Dockers didn't build a six-hundred-million-dollar business in five years by sounding condescending. All he was trying to do was point out the fundamental exegetical error in calling this a gay ad, because the only way for a Dockers man to be offended by "Motorcycle" would be if he thought about it with a little imagination, if he picked up on some fairly subtle cues, if he integrated an awful lot of detail. In other words, a Dockers man could only be offended if he did precisely what, according to Meyers-Levy, men don't do. It's not a gay ad because it's a guy ad. "The fact is," Hanson said, "that most men's interpretation of that spot is: You know what? Those pants must be really cool, because they prevented him from getting the shit kicked out of him."
The Coolhunt
March 17, 1997
ANNALS OF STYLE
Who decides what's cool?
Certain kids in certain places--
and only the coolhunters know who they are.
1.
Baysie Wightman met DeeDee Gordon, appropriately enough, on a coolhunt. It was 1992. Baysie was a big shot for Converse, and DeeDee, who was barely twenty-one, was running a very cool boutique called Placid Planet, on Newbury Street in Boston. Baysie came in with a camera crew-one she often used when she was coolhunting-and said, "I've been watching your store, I've seen you, I've heard you know what's up," because it was Baysie's job at Converse to find people who knew what was up and she thought DeeDee was one of those people. DeeDee says that she responded with reserve-that "I was like, 'Whatever' "-but Baysie said that if DeeDee ever wanted to come and work at Converse she should just call, and nine months later DeeDee called. This was about the time the cool kids had decided they didn't want the hundred-and-twenty-five-dollar basketball sneaker with seventeen different kinds of high-technology materials and colors and air-cushioned heels anymore. They wanted simplicity and authenticity, and Baysie picked up on that. She brought back the Converse One Star, which was a vulcanized, suède, low-top classic old-school sneaker from the nineteen-seventies, and, sure enough, the One Star quickly became the signature shoe of the retro era. Remember what Kurt Cobain was wearing in the famous picture of him lying dead on the ground after committing suicide? Black Converse One Stars. DeeDee's big score was calling the sandal craze. She had been out in Los Angeles and had kept seeing the white teen-age girls dressing up like cholos, Mexican gangsters, in tight white tank tops known as "wife beaters," with a bra strap hanging out, and long shorts and tube socks and shower sandals. DeeDee recalls, "I'm like, 'I'm telling you, Baysie, this is going to hit. There are just too many people wearing it. We have to make a shower sandal.' " So Baysie, DeeDee, and a designer came up with the idea of making a retro sneaker-sandal, cutting the back off the One Star and putting a thick outsole on it. It was huge, and, amazingly, it's still huge.
Today, Baysie works for Reebok as general-merchandise manager-part of the team trying to return Reebok to the position it enjoyed in the mid-nineteen-eighties as the country's hottest sneaker company. DeeDee works for an advertising agency in Del Mar called Lambesis, where she puts out a quarterly tip sheet called the L Report on what the cool kids in major American cities are thinking and doing and buying. Baysie and DeeDee are best friends. They talk on the phone all the time. They get together whenever Baysie is in L.A. (DeeDee: "It's, like, how many times can you drive past O. J. Simpson's house?"), and between them they can talk for hours about the art of the coolhunt. They're the Lewis and Clark of cool.
What they have is what everybody seems to want these days, which is a window on the world of the street. Once, when fashion trends were set by the big couture houses-when cool was trickle-down-that wasn't important. But sometime in the past few decades things got turned over, and fashion became trickle-up. It's now about chase and flight-designers and retailers and the mass consumer giving chase to the elusive prey of street cool-and the rise of coolhunting as a profession shows how serious the chase has become. The sneakers of Nike and Reebok used to come out yearly. Now a new style comes out every season. Apparel designers used to have an eighteen-month lead time between concept and sale. Now they're reducing that to a year, or even six months, in order to react faster to new ideas from the street. The paradox, of course, is that the better coolhunters become at bringing the mainstream close to the cutting edge, the more elusive the cutting edge becomes. This is the first rule of the cool: The quicker the chase, the quicker the flight. The act of discovering what's cool is what causes cool to move on, which explains the triumphant circularity of coolhunting: because we have coolhunters like DeeDee and Baysie, cool changes more quickly, and because cool changes more quickly, we need coolhunters like DeeDee and Baysie.
DeeDee is tall and glamorous, with short hair she has dyed so often that she claims to have forgotten her real color. She drives a yellow 1977 Trans Am with a burgundy stripe down the center and a 1973 Mercedes 450 SL, and lives in a spare, Japanese-style cabin in Laurel Canyon. She uses words like "rad" and "totally," and offers non-stop, deadpan pronouncements on pop culture, as in "It's all about Pee-wee Herman." She sounds at first like a teen, like the same teens who, at Lambesis, it is her job to follow. But teen speech-particularly girl-teen speech, with its fixation on reported speech ("so she goes," "and I'm like," "and he goes") and its stock vocabulary of accompanying grimaces and gestures-is about using language less to communicate than to fit in. DeeDee uses teen speech to set herself apart, and the result is, for lack of a better word, really cool. She doesn't do the teen thing of climbing half an octave at the end of every sentence. Instead, she drags out her vowels for emphasis, so that if she mildly disagreed with something I'd said she would say "Maalcolm" and if she strongly disagreed with what I'd said she would say "Maaalcolm."
Baysie is older, just past forty (although you would never guess that), and went to Exeter and Middlebury and had two grandfathers who went to Harvard (although you wouldn't guess that, either). She has curly brown hair and big green eyes and long legs and so much energy that it is hard to imagine her asleep, or resting, or even standing still for longer than thirty seconds. The hunt for cool is an obsession with her, and DeeDee is the same way. DeeDee used to sit on the corner of West Broadway and Prince in SoHo-back when SoHo was cool-and take pictures of everyone who walked by for an entire hour. Baysie can tell you precisely where she goes on her Reebok coolhunts to find the really cool alternative white kids ("I'd maybe go to Portland and hang out where the skateboarders hang out near that bridge") or which snowboarding mountain has cooler kids-Stratton, in Vermont, or Summit County, in Colorado. (Summit, definitely.) DeeDee can tell you on the basis of the L Report's research exactly how far Dallas is behind New York in coolness (from six to eight months). Baysie is convinced that Los Angeles is not happening right now: "In the early nineteen-nineties a lot more was coming from L.A. They had a big trend with the whole Melrose Avenue look-the stupid goatees, the shorter hair. It was cleaned-up aftergrunge. There were a lot of places you could go to buy vinyl records. It was a strong place to go for looks. Then it went back to being horrible." DeeDee is convinced that Japan is happening: "I linked onto this future-technology thing two years ago. Now look at it, it's huge. It's the whole resurgence of Nike-Nike being larger than life. I went to Japan and saw the kids just bailing the most technologically advanced Nikes with their little dresses and little outfits and I'm like, 'Whoa, this is trippy!' It's performance mixed with fashion. It's really superheavy." Baysie has a theory that Liverpool is cool right now because it's the birthplace of the whole "lad" look, which involves soccer blokes in the pubs going superdressy and wearing Dolce & Gabbana and Polo Sport and Reebok Classics on their feet. But when I asked DeeDee about that, she just rolled her eyes: "Sometimes Baysie goes off on these tangents. Man, I love that woman!"
I used to think that if I talked to Baysie and DeeDee long enough I could write a coolhunting manual, an encyclopedia of cool. But then I realized that the manual would have so many footnotes and caveats that it would be unreadable. Coolhunting is not about the articulation of a coherent philosophy of cool. It's just a collection of spontaneous observations and predictions that differ from one moment to the next and from one coolhunter to the next. Ask a coolhunter where the baggy-jeans look came from, for example, and you might get any number of answers: urban black kids mimicking the jailhouse look, skateboarders looking for room to move, snowboarders trying not to look like skiers, or, alternatively, all three at once, in some grand concordance.
Or take the question of exactly how Tommy Hilfiger-a forty-five-year-old white guy from Greenwich, Connecticut, doing all-American preppy clothes-came to be the designer of choice for urban black America. Some say it was all about the early and visible endorsement given Hilfiger by the hip-hop auteur Grand Puba, who wore a dark-green-and-blue Tommy jacket over a white Tommy T-shirt as he leaned on his black Lamborghini on the cover of the hugely influential "Grand Puba 2000" CD, and whose love for Hilfiger soon spread to other rappers. (Who could forget the rhymes of Mobb Deep? "Tommy was my nigga /And couldn't figure /How me and Hilfiger / used to move through with vigor.") Then I had lunch with one of Hilfiger's designers, a twenty-six-year-old named Ulrich (Ubi) Simpson, who has a Puerto Rican mother and a Dutch-Venezuelan father, plays lacrosse, snowboards, surfs the long board, goes to hip-hop concerts, listens to Jungle, Edith Piaf, opera, rap, and Metallica, and has working with him on his design team a twenty-seven-year-old black guy from Montclair with dreadlocks, a twenty-two-year-old Asian-American who lives on the Lower East Side, a twenty-five-year-old South Asian guy from Fiji, and a twenty-one-year-old white graffiti artist from Queens. That's when it occurred to me that maybe the reason Tommy Hilfiger can make white culture cool to black culture is that he has people working for him who are cool in both cultures simultaneously. Then again, maybe it was all Grand Puba. Who knows?
One day last month, Baysie took me on a coolhunt to the Bronx and Harlem, lugging a big black canvas bag with twenty-four different shoes that Reebok is about to bring out, and as we drove down Fordham Road, she had her head out the window like a little kid, checking out what everyone on the street was wearing. We went to Dr. Jay's, which is the cool place to buy sneakers in the Bronx, and Baysie crouched down on the floor and started pulling the shoes out of her bag one by one, soliciting opinions from customers who gathered around and asking one question after another, in rapid sequence. One guy she listened closely to was maybe eighteen or nineteen, with a diamond stud in his ear and a thin beard. He was wearing a Polo baseball cap, a brown leather jacket, and the big, oversized leather boots that are everywhere uptown right now. Baysie would hand him a shoe and he would hold it, look at the top, and move it up and down and flip it over. The first one he didn't like: "Oh-kay." The second one he hated: he made a growling sound in his throat even before Baysie could give it to him, as if to say, "Put it back in the bag-now!" But when she handed him a new DMX RXT-a low-cut run/walk shoe in white and blue and mesh with a translucent "ice" sole, which retails for a hundred and ten dollars-he looked at it long and hard and shook his head in pure admiration and just said two words, dragging each of them out: "No doubt."
Baysie was interested in what he was saying, because the DMX RXT she had was a girls' shoe that actually hadn't been doing all that well. Later, she explained to me that the fact that the boys loved the shoe was critical news, because it suggested that Reebok had a potential hit if it just switched the shoe to the men's section. How she managed to distill this piece of information from the crowd of teenagers around her, how she made any sense of the two dozen shoes in her bag, most of which (to my eyes, anyway) looked pretty much the same, and how she knew which of the teens to really focus on was a mystery. Baysie is a Wasp from New England, and she crouched on the floor in Dr. Jay's for almost an hour, talking and joking with the homeboys without a trace of condescension or self-consciousness.
Near the end of her visit, a young boy walked up and sat down on the bench next to her. He was wearing a black woollen cap with white stripes pulled low, a blue North Face pleated down jacket, a pair of baggy Guess jeans, and, on his feet, Nike Air Jordans. He couldn't have been more than thirteen. But when he started talking you could see Baysie's eyes light up, because somehow she knew the kid was the real thing.
"How many pairs of shoes do you buy a month?" Baysie asked.
"Two," the kid answered. "And if at the end I find one more I like I get to buy that, too."
Baysie was onto him. "Does your mother spoil you?"
The kid blushed, but a friend next to him was laughing. "Whatever he wants, he gets."
Baysie laughed, too. She had the DMX RXT in his size. He tried them on. He rocked back and forth, testing them. He looked back at Baysie. He was dead serious now: "Make sure these come out."
Baysie handed him the new "Rush" Emmitt Smith shoe due out in the fall. One of the boys had already pronounced it "phat," and another had looked through the marbleized-foam cradle in the heel and cried out in delight, "This is bug!" But this kid was the acid test, because this kid knew cool. He paused. He looked at it hard. "Reebok," he said, soberly and carefully, "is trying to get butter."
In the car on the way back to Manhattan, Baysie repeated it twice. "Not better. Butter! That kid could totally tell you what he thinks." Baysie had spent an hour coolhunting in a shoe store and found out that Reebok's efforts were winning the highest of hip-hop praise. "He was so fucking smart."
2.
If you want to understand how trends work, and why coolhunters like Baysie and DeeDee have become so important, a good place to start is with what's known as diffusion research, which is the study of how ideas and innovations spread. Diffusion researchers do things like spending five years studying the adoption of irrigation techniques in a Colombian mountain village, or developing complex matrices to map the spread of new math in the Pittsburgh school system. What they do may seem like a far cry from, say, how the Tommy Hilfiger thing spread from Harlem to every suburban mall in the country, but it really isn't: both are about how new ideas spread from one person to the next.
One of the most famous diffusion studies is Bruce Ryan and Neal Gross's analysis of the spread of hybrid seed corn in Greene County, Iowa, in the nineteen-thirties. The new seed corn was introduced there in about 1928, and it was superior in every respect to the seed that had been used by farmers for decades. But it wasn't adopted all at once. Of two hundred and fifty-nine farmers studied by Ryan and Gross, only a handful had started planting the new seed by 1933. In 1934, sixteen took the plunge. In 1935, twenty-one more followed; the next year, there were thirty-six, and the year after that a whopping sixty-one. The succeeding figures were then forty-six, thirty-six, fourteen, and three, until, by 1941, all but two of the two hundred and fifty-nine farmers studied were using the new seed. In the language of diffusion research, the handful of farmers who started trying hybrid seed corn at the very beginning of the thirties were the "innovators," the adventurous ones. The slightly larger group that followed them was the "early adopters." They were the opinion leaders in the community, the respected, thoughtful people who watched and analyzed what those wild innovators were doing and then did it themselves. Then came the big bulge of farmers in 1936, 1937, and 1938-the "early majority" and the "late majority," which is to say the deliberate and the skeptical masses, who would never try anything until the most respected farmers had tried it. Only after they had been converted did the "laggards," the most traditional of all, follow suit. The critical thing about this sequence is that it is almost entirely interpersonal. According to Ryan and Gross, only the innovators relied to any great extent on radio advertising and farm journals and seed salesmen in making their decision to switch to the hybrid. Everyone else made his decision overwhelmingly because of the example and the opinions of his neighbors and peers.
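Those year-by-year figures trace the classic S-shaped diffusion curve: a slow start among the innovators, a steep middle as the early and late majorities convert, and a long flat tail of laggards. As a rough illustration (my own sketch, not part of Ryan and Gross's study), the short Python snippet below simply tallies the yearly counts into a cumulative share of the two hundred and fifty-nine farmers; the size of the initial "handful" of innovators is an assumed number, since the article doesn't give an exact count.

# Rough sketch: tally the hybrid-corn adoption figures into a cumulative curve.
# The 1933 "handful" is an assumption; the article gives no exact count for it.
yearly_adopters = {
    1933: 10,  # assumed size of the initial "handful" of innovators
    1934: 16,
    1935: 21,
    1936: 36,
    1937: 61,
    1938: 46,
    1939: 36,
    1940: 14,
    1941: 3,
}

total_farmers = 259
cumulative = 0
for year in sorted(yearly_adopters):
    cumulative += yearly_adopters[year]
    share = cumulative / total_farmers
    # A crude text bar makes the slow-start, steep-middle, flat-tail shape
    # of the S-curve visible at a glance.
    print(f"{year}: {cumulative:3d} farmers ({share:4.0%}) " + "#" * round(share * 40))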
Isn't this just how fashion works? A few years ago, the classic brushed-suède Hush Puppies with the lightweight crêpe sole-the moc-toe oxford known as the Duke and the slip-on with the golden buckle known as the Columbia-were selling barely sixty-five thousand pairs a year. The company was trying to walk away from the whole suède casual look entirely. It wanted to do "aspirational" shoes: "active casuals" in smooth leather, like the Mall Walker, with a Comfort Curve technology outsole and a heel stabilizer-the kind of shoes you see in Kinney's for $39.95. But then something strange started happening. Two Hush Puppies executives-Owen Baxter and Jeff Lewis-were doing a fashion shoot for their Mall Walkers and ran into a creative consultant from Manhattan named Jeffrey Miller, who informed them that the Dukes and the Columbias weren't dead, they were dead chic. "We were being told," Baxter recalls, "that there were areas in the Village, in SoHo, where the shoes were selling-in resale shops-and that people were wearing the old Hush Puppies. They were going to the ma-and-pa stores, the little stores that still carried them, and there was this authenticity of being able to say, 'I am wearing an original pair of Hush Puppies.' "
Baxter and Lewis-tall, solid, fair-haired Midwestern guys with thick, shiny wedding bands-are shoe men, first and foremost. Baxter was working the cash register at his father's shoe store in Mount Prospect, Illinois, at the age of thirteen. Lewis was doing inventory in his father's shoe store in Pontiac, Michigan, at the age of seven. Baxter was in the National Guard during the 1968 Democratic Convention, in Chicago, and was stationed across the street from the Conrad Hilton downtown, right in the middle of things. Today, the two men work out of Rockford, Michigan (population thirty-eight hundred), where Hush Puppies has been making the Dukes and the Columbias in an old factory down by the Rogue River for almost forty years. They took me to the plant when I was in Rockford. In a crowded, noisy, low-slung building, factory workers stand in long rows, gluing, stapling, and sewing together shoes in dozens of bright colors, and the two executives stopped at each production station and described it in detail. Lewis and Baxter know shoes. But they would be the first to admit that they don't know cool. "Miller was saying that there is something going on with the shoes-that Isaac Mizrahi was wearing the shoes for his personal use," Lewis told me. We were seated around the conference table in the Hush Puppies headquarters in Rockford, with the snow and the trees outside and a big water tower behind us. "I think it's fair to say that at the time we had no idea who Isaac Mizrahi was."
By late 1994, things had begun to happen in a rush. First, the designer John Bartlett called. He wanted to use Hush Puppies as accessories in his spring collection. Then Anna Sui called. Miller, the man from Manhattan, flew out to Michigan to give advice on a new line ("Of course, packing my own food and thinking about 'Fargo' in the corner of my mind"). A few months later, in Los Angeles, the designer Joel Fitzpatrick put a twenty-five-foot inflatable basset hound on the roof of his store on La Brea Avenue and gutted his adjoining art gallery to turn it into a Hush Puppies department, and even before he opened-while he was still painting and putting up shelves-Pee-wee Herman walked in and asked for a couple of pairs. Pee-wee Herman! "It was total word of mouth. I didn't even have a sign back then," Fitzpatrick recalls. In 1995, the company sold four hundred and thirty thousand pairs of the classic Hush Puppies. In 1996, it sold a million six hundred thousand, and that was only scratching the surface, because in Europe and the rest of the world, where Hush Puppies have a huge following-where they might outsell the American market four to one-the revival was just beginning.
The cool kids who started wearing old Dukes and Columbias from thrift shops were the innovators. Pee-wee Herman, wandering in off the street, was an early adopter. The million six hundred thousand people who bought Hush Puppies last year are the early majority, jumping in because the really cool people have already blazed the trail. Hush Puppies are moving through the country just the way hybrid seed corn moved through Greene County-all of which illustrates what coolhunters can and cannot do. If Jeffrey Miller had been wrong-if cool people hadn't been digging through the thrift shops for Hush Puppies-and he had arbitrarily decided that Baxter and Lewis should try to convince non-cool people that the shoes were cool, it wouldn't have worked. You can't convince the late majority that Hush Puppies are cool, because the late majority makes its coolness decisions on the basis of what the early majority is doing, and you can't convince the early majority, because the early majority is looking at the early adopters, and you can't convince the early adopters, because they take their cues from the innovators. The innovators do get their cool ideas from people other than their peers, but the fact is that they are the last people who can be convinced by a marketing campaign that a pair of suède shoes is cool. These are, after all, the people who spent hours sifting through thrift-store bins. And why did they do that? Because their definition of cool is doing something that nobody else is doing. A company can intervene in the cool cycle. It can put its shoes on really cool celebrities and on fashion runways and on MTV. It can accelerate the transition from the innovator to the early adopter and on to the early majority. But it can't just manufacture cool out of thin air, and that's the second rule of cool.
At the peak of the Hush Puppies craziness last year, Hush Puppies won the prize for best accessory at the Council of Fashion Designers' awards dinner, at Lincoln Center. The award was accepted by the Hush Puppies president, Louis Dubrow, who came out wearing a pair of custom-made black patent-leather Hush Puppies and stood there blinking and looking at the assembled crowd as if it were the last scene of "Close Encounters of the Third Kind." It was a strange moment. There was the president of the Hush Puppies company, of Rockford, Michigan, population thirty-eight hundred, sharing a stage with Calvin Klein and Donna Karan and Isaac Mizrahi-and all because some kids in the East Village began combing through thrift shops for old Dukes. Fashion was at the mercy of those kids, whoever they were, and it was a wonderful thing if the kids picked you, but a scary thing, too, because it meant that cool was something you could not control. You needed someone to find cool and tell you what it was.
3.
When Baysie Wightman went to Dr. Jay's, she was looking for customer response to the new shoes Reebok had planned for the fourth quarter of 1997 and the first quarter of 1998. This kind of customer testing is critical at Reebok, because the last decade has not been kind to the company. In 1987, it had a third of the American athletic-shoe market, well ahead of Nike. Last year, it had sixteen per cent. "The kid in the store would say, 'I'd like this shoe if your logo wasn't on it,' " E. Scott Morris, who's a senior designer for Reebok, told me. "That's kind of a punch in the mouth. But we've all seen it. You go into a shoe store. The kid picks up the shoe and says, 'Ah, man, this is nice.' He turns the shoe around and around. He looks at it underneath. He looks at the side and he goes, 'Ah, this is Reebok,' and says, 'I ain't buying this,' and puts the shoe down and walks out. And you go, 'You was just digging it a minute ago. What happened?' " Somewhere along the way, the company lost its cool, and Reebok now faces the task not only of rebuilding its image but of making the shoes so cool that the kids in the store can't put them down.
Every few months, then, the company's coolhunters go out into the field with prototypes of the upcoming shoes to find out what kids really like, and come back to recommend the necessary changes. The prototype of one recent Emmitt Smith shoe, for example, had a piece of molded rubber on the end of the tongue as a design element; it was supposed to give the shoe a certain "richness," but the kids said they thought it looked overbuilt. Then Reebok gave the shoes to the Boston College football team for wear-testing, and when they got the shoes back they found out that all the football players had cut out the rubber component with scissors. As messages go, this was hard to miss. The tongue piece wasn't cool, and on the final version of the shoe it was gone. The rule of thumb at Reebok is that if the kids in Chicago, New York, and Detroit all like a shoe, it's a guaranteed hit. More than likely, though, the coolhunt is going to turn up subtle differences from city to city, so that once the coolhunters come back the designers have to find out some way to synthesize what was heard, and pick out just those things that all the kids seemed to agree on. In New York, for example, kids in Harlem are more sophisticated and fashion-forward than kids in the Bronx, who like things a little more colorful and glitzy. Brooklyn, meanwhile, is conservative and preppy, more like Washington, D.C. For reasons no one really knows, Reeboks are coolest in Philadelphia. In Philly, in fact, the Reebok Classics are so huge they are known simply as National Anthems, as in "I'll have a pair of blue Anthems in nine and a half." Philadelphia is Reebok's innovator town. From there trends move along the East Coast, trickling all the way to Charlotte, North Carolina.
Reebok has its headquarters in Stoughton, Massachusetts, outside Boston-in a modern corporate park right off Route 24. There are basketball and tennis courts next to the building, and a health club on the ground floor that you can look directly into from the parking lot. The front lobby is adorned with shrines for all of Reebok's most prominent athletes-shrines complete with dramatic action photographs, their sports jerseys, and a pair of their signature shoes-and the halls are filled with so many young, determinedly athletic people that when I visited Reebok headquarters I suddenly wished I'd packed my gym clothes in case someone challenged me to wind sprints. At Stoughton, I met with a handful of the company's top designers and marketing executives in a long conference room on the third floor. In the course of two hours, they put one pair of shoes after another on the table in front of me, talking excitedly about each sneaker's prospects, because the feeling at Reebok is that things are finally turning around. The basketball shoe that Reebok brought out last winter for Allen Iverson, the star rookie guard for the Philadelphia 76ers, for example, is one of the hottest shoes in the country. Dr. Jay's sold out of Iversons in two days, compared with the week it took the store to sell out of Nike's new Air Jordans. Iverson himself is brash and charismatic and faster from foul line to foul line than anyone else in the league. He's the equivalent of those kids in the East Village who began wearing Hush Puppies way back when. He's an innovator, and the hope at Reebok is that if he gets big enough the whole company can ride back to coolness on his coattails, the way Nike rode to coolness on the coattails of Michael Jordan. That's why Baysie was so excited when the kid said Reebok was trying to get butter when he looked at the Rush and the DMX RXT: it was a sign, albeit a small one, that the indefinable, abstract thing called cool was coming back.
When Baysie comes back from a coolhunt, she sits down with marketing experts and sales representatives and designers, and reconnects them to the street, making sure they have the right shoes going to the right places at the right price. When she got back from the Bronx, for example, the first thing she did was tell all these people they had to get a new men's DMX RXT out, fast, because the kids on the street loved the women's version. "It's hotter than we realized," she told them. The coolhunter's job in this instance is very specific. What DeeDee does, on the other hand, is a little more ambitious. With the L Report, she tries to construct a kind of grand matrix of cool, comprising not just shoes but everything kids like, and not just kids of certain East Coast urban markets but kids all over. DeeDee and her staff put it out four times a year, in six different versions-for New York, Los Angeles, San Francisco, Austin-Dallas, Seattle, and Chicago-and then sell it to manufacturers, retailers, and ad agencies (among others) for twenty thousand dollars a year. They go to each city and find the coolest bars and clubs, and ask the coolest kids to fill out questionnaires. The information is then divided into six categories-You Saw It Here First, Entertainment and Leisure, Clothing and Accessories, Personal and Individual, Aspirations, and Food and Beverages-which are, in turn, broken up into dozens of subcategories, so that Personal and Individual, for example, includes Cool Date, Cool Evening, Free Time, Favorite Possession, and on and on. The information in those subcategories is subdivided again by sex and by age bracket (14-18, 19-24, 25-30), and then, as a control, the L Report gives you the corresponding set of preferences for "mainstream" kids.
Few coolhunters bother to analyze trends with this degree of specificity. DeeDee's biggest competitor, for example, is something called the Hot Sheet, out of Manhattan. It uses a panel of three thousand kids a year from across the country and divides up their answers by sex and age, but it doesn't distinguish between regions, or between trendsetting and mainstream respondents. So what you're really getting is what all kids think is cool-not what cool kids think is cool, which is a considerably different piece of information. Janine Misdom and Joanne DeLuca, who run the Sputnik coolhunting group out of the garment district in Manhattan, meanwhile, favor an entirely impressionistic approach, sending out coolhunters with video cameras to talk to kids on the ground that it's too difficult to get cool kids to fill out questionnaires. Once, when I was visiting the Sputnik girls-as Misdom and DeLuca are known on the street, because they look alike and their first names are so similar and both have the same awesome New York accents-they showed me a video of the girl they believe was the patient zero of the whole eighties revival going on right now. It was back in September of 1993. Joanne and Janine were on Seventh Avenue, outside the Fashion Institute of Technology, doing random street interviews for a major jeans company, and, quite by accident, they ran into this nineteen-year-old raver. She had close-cropped hair, which was green at the top, and at the temples was shaved even closer and dyed pink. She had rings and studs all over her face, and a thick collection of silver tribal jewelry around her neck, and vintage jeans. She looked into the camera and said, "The sixties came in and then the seventies came in and I think it's ready to come back to the eighties. It's totally eighties: the eye makeup, the clothes. It's totally going back to that." Immediately, Joanne and Janine started asking around. "We talked to a few kids on the Lower East Side who said they were feeling the need to start breaking out their old Michael Jackson jackets," Joanne said. "They were joking about it. They weren't doing it yet. But they were going to, you know? They were saying, 'We're getting the urge to break out our Members Only jackets.' " That was right when Joanne and Janine were just starting up; calling the eighties revival was their first big break, and now they put out a full-blown videotaped report twice a year which is a collection of clips of interviews with extremely progressive people.
What DeeDee argues, though, is that cool is too subtle and too variegated to be captured with these kinds of broad strokes. Cool is a set of dialects, not a language. The L Report can tell you, for example, that nineteen-to-twenty-four-year-old male trendsetters in Seattle would most like to meet, among others, King Solomon and Dr. Seuss, and that nineteen-to-twenty-four-year-old female trendsetters in San Francisco have turned their backs on Calvin Klein, Nintendo Gameboy, and sex. What's cool right now? Among male New York trendsetters: North Face jackets, rubber and latex, khakis, and the rock band Kiss. Among female trendsetters: ska music, old-lady clothing, and cyber tech. In Chicago, snowboarding is huge among trendsetters of both sexes and all ages. Women over nineteen are into short hair, while those in their teens have embraced mod culture, rock climbing, tag watches, and bootleg pants. In Austin-Dallas, meanwhile, twenty-five-to-thirty-year-old women trendsetters are into hats, heroin, computers, cigars, Adidas, and velvet, while men in their twenties are into video games and hemp. In all, the typical L Report runs over one hundred pages. But with that flood of data comes an obsolescence disclaimer: "The fluctuating nature of the trendsetting market makes keeping up with trends a difficult task." By the spring, in other words, everything may have changed.
The key to coolhunting, then, is to look for cool people first and cool things later, and not the other way around. Since cool things are always changing, you can't look for them, because the very fact they are cool means you have no idea what to look for. What you would be doing is thinking back on what was cool before and extrapolating, which is about as useful as presuming that because the Dow rose ten points yesterday it will rise another ten points today. Cool people, on the other hand, are a constant.
When I was in California, I met Salvador Barbier, who had been described to me by a coolhunter as "the Michael Jordan of skateboarding." He was tall and lean and languid, with a cowboy's insouciance, and we drove through the streets of Long Beach at fifteen miles an hour in a white late-model Ford Mustang, a car he had bought as a kind of ironic status gesture ("It would look good if I had a Polo jacket or maybe Nautica," he said) to go with his '62 Econoline van and his '64 T-bird. Sal told me that he and his friends, who are all in their mid-twenties, recently took to dressing up as if they were in eighth grade again and gathering together-having a "rally"-on old BMX bicycles in front of their local 7-Eleven. "I'd wear muscle shirts, like Def Leppard or Foghat or some old heavy-metal band, and tight, tight tapered Levi's, and Vans on my feet-big, like, checkered Vans or striped Vans or camouflage Vans-and then wristbands and gloves with the fingers cut off. It was total eighties fashion. You had to look like that to participate in the rally. We had those denim jackets with patches on the back and combs that hung out the back pocket. We went without I.D.s, because we'd have to have someone else buy us beers." At this point, Sal laughed. He was driving really slowly and staring straight ahead and talking in a low drawl-the coolhunter's dream. "We'd ride to this bar and I'd have to carry my bike inside, because we have really expensive bikes, and when we got inside people would freak out. They'd say, 'Omigod,' and I was asking them if they wanted to go for a ride on the handlebars. They were like, 'What is wrong with you. My boyfriend used to dress like that in the eighth grade!' And I was like, 'He was probably a lot cooler then, too.' "
This is just the kind of person DeeDee wants. "I'm looking for somebody who is an individual, who has definitely set himself apart from everybody else, who doesn't look like his peers. I've run into trendsetters who look completely Joe Regular Guy. I can see Joe Regular Guy at a club listening to some totally hardcore band playing, and I say to myself 'Omigod, what's that guy doing here?' and that totally intrigues me, and I have to walk up to him and say, 'Hey, you're really into this band. What's up?' You know what I mean? I look at everything. If I see Joe Regular Guy sitting in a coffee shop and everyone around him has blue hair, I'm going to gravitate toward him, because, hey, what's Joe Regular Guy doing in a coffee shop with people with blue hair?"
We were sitting outside the Fred Segal store in West Hollywood. I was wearing a very conservative white Brooks Brothers button-down and a pair of Levi's, and DeeDee looked first at my shirt and then my pants and dissolved into laughter: "I mean, I might even go up to you in a cool place."
Picking the right person is harder than it sounds, though. Piney Kahn, who works for DeeDee, says, "There are a lot of people in the gray area. You've got these kids who dress ultra funky and have their own style. Then you realize they're just running after their friends." The trick is not just to be able to tell who is different but to be able to tell when that difference represents something truly cool. It's a gut thing. You have to somehow just know. DeeDee hired Piney because Piney clearly knows: she is twenty-four and used to work with the Beastie Boys and has the formidable self-possession of someone who is not only cool herself but whose parents were cool. "I mean," she says, "they named me after a tree."
Piney and DeeDee said that they once tried to hire someone as a coolhunter who was not, himself, cool, and it was a disaster.
"You can give them the boundaries," Piney explained. "You can say that if people shop at Banana Republic and listen to Alanis Morissette they're probably not trendsetters. But then they might go out and assume that everyone who does that is not a trendsetter, and not look at the other things."
"I mean, I myself might go into Banana Republic and buy a T-shirt," DeeDee chimed in.
Their non-cool coolhunter just didn't have that certain instinct, that sense that told him when it was O.K. to deviate from the manual. Because he wasn't cool, he didn't know cool, and that's the essence of the third rule of cool: you have to be one to know one. That's why Baysie is still on top of this business at forty-one. "It's easier for me to tell you what kid is cool than to tell you what things are cool," she says. But that's all she needs to know. In this sense, the third rule of cool fits perfectly into the second: the second rule says that cool cannot be manufactured, only observed, and the third says that it can only be observed by those who are themselves cool. And, of course, the first rule says that it cannot accurately be observed at all, because the act of discovering cool causes cool to take flight, so if you add all three together they describe a closed loop, the hermeneutic circle of coolhunting, a phenomenon whereby not only can the uncool not see cool but cool cannot even be adequately described to them. Baysie says that she can see a coat on one of her friends and think it's not cool but then see the same coat on DeeDee and think that it is cool. It is not possible to be cool, in other words, unless you are-in some larger sense-already cool, and so the phenomenon that the uncool cannot see and cannot have described to them is also something that they cannot ever attain, because if they did it would no longer be cool. Coolhunting represents the ascendancy, in the marketplace, of high school.
Once, I was visiting DeeDee at her house in Laurel Canyon when one of her L Report assistants, Jonas Vail, walked in. He'd just come back from Niketown on Wilshire Boulevard, where he'd bought seven hundred dollars' worth of the latest sneakers to go with the three hundred dollars' worth of skateboard shoes he'd bought earlier in the afternoon. Jonas is tall and expressionless, with a peacoat, dark jeans, and short-cropped black hair. "Jonas is good," DeeDee says. "He works with me on everything. That guy knows more pop culture. You know: What was the name of the store Mrs. Garrett owned on 'The Facts of Life'? He knows all the names of the extras from eighties sitcoms. I can't believe someone like him exists. He's fucking unbelievable. Jonas can spot a cool person a mile away."
Jonas takes the boxes of shoes and starts unpacking them on the couch next to DeeDee. He picks up a pair of the new Nike ACG hiking boots, and says, "All the Japanese in Niketown were really into these." He hands the shoes to DeeDee.
"Of course they were!" she says. "The Japanese are all into the tech-looking shit. Look how exaggerated it is, how bulbous." DeeDee has very ambivalent feelings about Nike, because she thinks its marketing has got out of hand. When she was in the New York Niketown with a girlfriend recently, she says, she started getting light-headed and freaked out. "It's cult, cult, cult. It was like, 'Hello, are we all drinking the Kool-Aid here?' " But this shoe she loves. It's Dr. Jay's in the Bronx all over again. DeeDee turns the shoe around and around in the air, tapping the big clear-blue plastic bubble on the side-the visible Air-Sole unit- with one finger. "It's so fucking rad. It looks like a platypus!" In front of me, there is a pair of Nike's new shoes for the basketball player Jason Kidd.
I pick it up. "This looks . . . cool," I venture uncertainly.
DeeDee is on the couch, where she's surrounded by shoeboxes and sneakers and white tissue paper, and she looks up reprovingly because, of course, I don't get it. I can't get it. "Beyooond cool, Maalcolm. Beyooond cool."
The Sports Taboo
May 19, 1997
DEPT. OF DISPUTATION
Why blacks are like boys and whites are like girls.
1.
The education of any athlete begins, in part, with an education in the racial taxonomy of his chosen sport-in the subtle, unwritten rules about what whites are supposed to be good at and what blacks are supposed to be good at. In football, whites play quarterback and blacks play running back; in baseball whites pitch and blacks play the outfield. I grew up in Canada, where my brother Geoffrey and I ran high-school track, and in Canada the rule of running was that anything under the quarter-mile belonged to the West Indians. This didn't mean that white people didn't run the sprints. But the expectation was that they would never win, and, sure enough, they rarely did. There was just a handful of West Indian immigrants in Ontario at that point-clustered in and around Toronto-but they owned Canadian sprinting, setting up under the stands at every major championship, cranking up the reggae on their boom boxes, and then humiliating everyone else on the track. My brother and I weren't from Toronto, so we weren't part of that scene. But our West Indian heritage meant that we got to share in the swagger. Geoffrey was a magnificent runner, with powerful legs and a barrel chest, and when he was warming up he used to do that exaggerated, slow-motion jog that the white guys would try to do and never quite pull off. I was a miler, which was a little outside the West Indian range. But, the way I figured it, the rules meant that no one should ever outkick me over the final two hundred metres of any race. And in the golden summer of my fourteenth year, when my running career prematurely peaked, no one ever did.
When I started running, there was a quarter-miler just a few years older than I was by the name of Arnold Stotz. He was a bulldog of a runner, hugely talented, and each year that he moved through the sprinting ranks he invariably broke the existing four-hundred-metre record in his age class. Stotz was white, though, and every time I saw the results of a big track meet I'd keep an eye out for his name, because I was convinced that he could not keep winning. It was as if I saw his whiteness as a degenerative disease, which would eventually claim and cripple him. I never asked him whether he felt the same anxiety, but I can't imagine that he didn't. There was only so long that anyone could defy the rules. One day, at the provincial championships, I looked up at the results board and Stotz was gone.
Talking openly about the racial dimension of sports in this way, of course, is considered unseemly. It's all right to say that blacks dominate sports because they lack opportunities elsewhere. That's the "Hoop Dreams" line, which says whites are allowed to acknowledge black athletic success as long as they feel guilty about it. What you're not supposed to say is what we were saying in my track days-that we were better because we were black, because of something intrinsic to being black. Nobody said anything like that publicly last month when Tiger Woods won the Masters or when, a week later, African men claimed thirteen out of the top twenty places in the Boston Marathon. Nor is it likely to come up this month, when African-Americans will make up eighty per cent of the players on the floor for the N.B.A. playoffs. When the popular television sports commentator Jimmy (the Greek) Snyder did break this taboo, in 1988-infamously ruminating on the size and significance of black thighs-one prominent N.A.A.C.P. official said that his remarks "could set race relations back a hundred years." The assumption is that the whole project of trying to get us to treat each other the same will be undermined if we don't all agree that under the skin we actually are the same.
The point of this, presumably, is to put our discussion of sports on a par with legal notions of racial equality, which would be a fine idea except that civil-rights law governs matters like housing and employment and the sports taboo covers matters like what can be said about someone's jump shot. In his much-heralded new book "Darwin's Athletes," the University of Texas scholar John Hoberman tries to argue that these two things are the same, that it's impossible to speak of black physical superiority without implying intellectual inferiority. But it isn't long before the argument starts to get ridiculous. "The spectacle of black athleticism," he writes, inevitably turns into "a highly public image of black retardation." Oh, really? What, exactly, about Tiger Woods's victory in the Masters resembled "a highly public image of black retardation"? Today's black athletes are multimillion-dollar corporate pitchmen, with talk shows and sneaker deals and publicity machines and almost daily media opportunities to share their thoughts with the world, and it's very hard to see how all this contrives to make them look stupid. Hoberman spends a lot of time trying to inflate the significance of sports, arguing that how we talk about events on the baseball diamond or the track has grave consequences for how we talk about race in general. Here he is, for example, on Jackie Robinson:
The sheer volume of sentimental and intellectual energy that has been invested in the mythic saga of Jackie Robinson has discouraged further thinking about what his career did and did not accomplish. . . . Black America has paid a high and largely unacknowledged price for the extraordinary prominence given the black athlete rather than other black men of action (such as military pilots and astronauts), who represent modern aptitudes in ways that athletes cannot.
Please. Black America has paid a high and largely unacknowledged price for a long list of things, and having great athletes is far from the top of the list. Sometimes a baseball player is just a baseball player, and sometimes an observation about racial difference is just an observation about racial difference. Few object when medical scientists talk about the significant epidemiological differences between blacks and whites-the fact that blacks have a higher incidence of hypertension than whites and twice as many black males die of diabetes and prostate cancer as white males, that breast tumors appear to grow faster in black women than in white women, that black girls show signs of puberty sooner than white girls. So why aren't we allowed to say that there might be athletically significant differences between blacks and whites?
According to the medical evidence, African-Americans seem to have, on the average, greater bone mass than do white Americans-a difference that suggests greater muscle mass. Black men have slightly higher circulating levels of testosterone and human-growth hormone than their white counterparts, and blacks over all tend to have proportionally slimmer hips, wider shoulders, and longer legs. In one study, the Swedish physiologist Bengt Saltin compared a group of Kenyan distance runners with a group of Swedish distance runners and found interesting differences in muscle composition: Saltin reported that the Africans appeared to have more blood-carrying capillaries and more mitochondria (the body's cellular power plant) in the fibres of their quadriceps. Another study found that, while black South African distance runners ran at the same speed as white South African runners, they were able to use more oxygen-eighty-nine per cent versus eighty-one per cent-over extended periods: somehow, they were able to exert themselves more. Such evidence suggests that there are physical differences in black athletes which have a bearing on activities like running and jumping, which should hardly come as a surprise to anyone who follows competitive sports.
To use track as an example-since track is probably the purest measure of athletic ability-Africans recorded fifteen out of the twenty fastest times last year in the men's ten-thousand-metre event. In the five thousand metres, eighteen out of the twenty fastest times were recorded by Africans. In the fifteen hundred metres, thirteen out of the twenty fastest times were African, and in the sprints, in the men's hundred metres, you have to go all the way down to the twenty-third place in the world rankings-to Geir Moen, of Norway-before you find a white face. There is a point at which it becomes foolish to deny the fact of black athletic prowess, and even more foolish to banish speculation on the topic. Clearly, something is going on. The question is what.
2.
If we are to decide what to make of the differences between blacks and whites, we first have to decide what to make of the word "difference," which can mean any number of things. A useful case study is to compare the ability of men and women in math. If you give a large, representative sample of male and female students a standardized math test, their mean scores will come out pretty much the same. But if you look at the margins, at the very best and the very worst students, sharp differences emerge. In the math portion of an achievement test conducted by Project Talent-a nationwide survey of fifteen-year-olds-there were 1.3 boys for every girl in the top ten per cent, 1.5 boys for every girl in the top five per cent, and seven boys for every girl in the top one per cent. In the fifty-six-year history of the Putnam Mathematical Competition, which has been described as the Olympics of college math, all but one of the winners have been male. Conversely, if you look at people with the very lowest math ability, you'll find more boys than girls there, too. In other words, although the average math ability of boys and girls is the same, the distribution isn't: there are more males than females at the bottom of the pile, more males than females at the top of the pile, and fewer males than females in the middle. Statisticians refer to this as a difference in variability.
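The same point can be made with a toy simulation: two populations with identical average ability but different spread look very different at the extremes. The sketch below, in Python with NumPy, is purely illustrative; the normal distributions and the size of the spread gap are assumptions chosen for the example, not parameters from Project Talent or the Putnam.

# Illustrative only: equal means, unequal variability.
# The distributions and the spread difference are assumptions for the
# sake of the example, not figures from Project Talent.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
more_variable = rng.normal(loc=100, scale=17.0, size=n)   # "boys"
less_variable = rng.normal(loc=100, scale=15.0, size=n)   # "girls"

combined = np.concatenate([more_variable, less_variable])
for pct in (90, 95, 99):
    cutoff = np.percentile(combined, pct)
    ratio = (more_variable > cutoff).sum() / (less_variable > cutoff).sum()
    print(f"top {100 - pct}%: about {ratio:.1f} from the more variable group for every one from the less variable")

Both groups have the same mean, yet the more variable group is overrepresented at the top (and, symmetrically, at the bottom), and the imbalance grows the further out into the tail you look-which is exactly the pattern the Project Talent figures show.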
This pattern, as it turns out, is repeated in almost every conceivable area of gender difference. Boys are more variable than girls on the College Board entrance exam and in routine elementary-school spelling tests. Male mortality patterns are more variable than female patterns; that is, many more men die in early and middle age than women, who tend to die in more of a concentrated clump toward the end of life. The problem is that variability differences are regularly confused with average differences. If men had higher average math scores than women, you could say they were better at the subject. But because they are only more variable the word "better" seems inappropriate.
The same holds true for differences between the races. A racist stereotype is the assertion of average difference-it's the claim that the typical white is superior to the typical black. It allows a white man to assume that the black man he passes on the street is stupider than he is. By contrast, if what racists believed was that black intelligence was simply more variable than white intelligence, then it would be impossible for them to construct a stereotype about black intelligence at all. They wouldn't be able to generalize. If they wanted to believe that there were a lot of blacks dumber than whites, they would also have to believe that there were a lot of blacks smarter than they were. This distinction is critical to understanding the relation between race and athletic performance. What are we seeing when we remark black domination of élite sporting events-an average difference between the races or merely a difference in variability?
This question has been explored by geneticists and physical anthropologists, and some of the most notable work has been conducted over the past few years by Kenneth Kidd, at Yale. Kidd and his colleagues have been taking DNA samples from two African Pygmy tribes in Zaire and the Central African Republic and comparing them with DNA samples taken from populations all over the world. What they have been looking for is variants-subtle differences between the DNA of one person and another-and what they have found is fascinating. "I would say, without a doubt, that in almost any single African population-a tribe or however you want to define it-there is more genetic variation than in all the rest of the world put together," Kidd told me. In a sample of fifty Pygmies, for example, you might find nine variants in one stretch of DNA. In a sample of hundreds of people from around the rest of the world, you might find only a total of six variants in that same stretch of DNA-and probably every one of those six variants would also be found in the Pygmies. If everyone in the world was wiped out except Africans, in other words, almost all the human genetic diversity would be preserved.
The likelihood is that these results reflect Africa's status as the homeland of Homo sapiens: since every human population outside Africa is essentially a subset of the original African population, it makes sense that everyone in such a population would be a genetic subset of Africans, too. So you can expect groups of Africans to be more variable in respect to almost anything that has a genetic component. If, for example, your genes control how you react to aspirin, you'd expect to see more Africans than whites for whom one aspirin stops a bad headache, more for whom no amount of aspirin works, more who are allergic to aspirin, and more who need to take, say, four aspirin at a time to get any benefit-but far fewer Africans for whom the standard two-aspirin dose would work well. And to the extent that running is influenced by genetic factors you would expect to see more really fast blacks-and more really slow blacks-than whites but far fewer Africans of merely average speed. Blacks are like boys. Whites are like girls.
There is nothing particularly scary about this fact, and certainly nothing to warrant the kind of gag order on talk of racial differences which is now in place. What it means is that comparing élite athletes of different races tells you very little about the races themselves. A few years ago, for example, a prominent scientist argued for black athletic supremacy by pointing out that there had never been a white Michael Jordan. True. But, as the Yale anthropologist Jonathan Marks has noted, until recently there was no black Michael Jordan, either. Michael Jordan, like Tiger Woods or Wayne Gretzky or Cal Ripken, is one of the best players in his sport not because he's like the other members of his own ethnic group but precisely because he's not like them-or like anyone else, for that matter. Élite athletes are élite athletes because, in some sense, they are on the fringes of genetic variability. As it happens, African populations seem to create more of these genetic outliers than white populations do, and this is what underpins the claim that blacks are better athletes than whites. But that's all the claim amounts to. It doesn't say anything at all about the rest of us, of all races, muddling around in the genetic middle.
3.
There is a second consideration to keep in mind when we compare blacks and whites. Take the men's hundred-metre final at the Atlanta Olympics. Every runner in that race was of either Western African or Southern African descent, as you would expect if Africans had some genetic affinity for sprinting. But suppose we forget about skin color and look just at country of origin. The eight-man final was made up of two African-Americans, two Africans (one from Namibia and one from Nigeria), a Trinidadian, a Canadian of Jamaican descent, an Englishman of Jamaican descent, and a Jamaican. The race was won by the Jamaican-Canadian, in world-record time, with the Namibian coming in second and the Trinidadian third. The sprint relay-the 4 x 100-was won by a team from Canada, consisting of the Jamaican-Canadian from the final, a Haitian-Canadian, a Trinidadian-Canadian, and another Jamaican-Canadian. Now it appears that African heritage is important as an initial determinant of sprinting ability, but also that the most important advantage of all is some kind of cultural or environmental factor associated with the Caribbean.
Or consider, in a completely different realm, the problem of hypertension. Black Americans have a higher incidence of hypertension than white Americans, even after you control for every conceivable variable, including income, diet, and weight, so it's tempting to conclude that there is something about being of African descent that makes blacks prone to hypertension. But it turns out that although some Caribbean countries have a problem with hypertension, others-Jamaica, St. Kitts, and the Bahamas-don't. It also turns out that people in Liberia and Nigeria-two countries where many New World slaves came from-have similar and perhaps even lower blood-pressure rates than white North Americans, while studies of Zulus, Indians, and whites in Durban, South Africa, showed that urban white males had the highest hypertension rates and urban white females had the lowest. So it's likely that the disease has nothing at all to do with Africanness.
The same is true for the distinctive muscle characteristic observed when Kenyans were compared with Swedes. Saltin, the Swedish physiologist, subsequently found many of the same characteristics in Nordic skiers who train at high altitudes and Nordic runners who train in very hilly regions-conditions, in other words, that resemble the mountainous regions of Kenya's Rift Valley, where so many of the country's distance runners come from. The key factor seems to be Kenya, not genes.
Lots of things that seem to be genetic in origin, then, actually aren't. Similarly, lots of things that we wouldn't normally think might affect athletic ability actually do. Once again, the social-science literature on male and female math achievement is instructive. Psychologists argue that when it comes to subjects like math, boys tend to engage in what's known as ability attribution. A boy who is doing well will attribute his success to the fact that he's good at math, and if he's doing badly he'll blame his teacher or his own lack of motivation-anything but his ability. That makes it easy for him to bounce back from failure or disappointment, and gives him a lot of confidence in the face of a tough new challenge. After all, if you think you do well in math because you're good at math, what's stopping you from being good at, say, algebra, or advanced calculus? On the other hand, if you ask a girl why she is doing well in math she will say, more often than not, that she succeeds because she works hard. If she's doing poorly, she'll say she isn't smart enough. This, as should be obvious, is a self-defeating attitude. Psychologists call it "learned helplessness"-the state in which failure is perceived as insurmountable. Girls who engage in effort attribution learn helplessness because in the face of a more difficult task like algebra or advanced calculus they can conceive of no solution. They're convinced that they can't work harder, because they think they're working as hard as they can, and that they can't rely on their intelligence, because they never thought they were that smart to begin with. In fact, one of the fascinating findings of attribution research is that the smarter girls are, the more likely they are to fall into this trap. High achievers are sometimes the most helpless. Here, surely, is part of the explanation for greater math variability among males. The female math whizzes, the ones who should be competing in the top one and two per cent with their male counterparts, are the ones most often paralyzed by a lack of confidence in their own aptitude. They think they belong only in the intellectual middle.
The striking thing about these descriptions of male and female stereotyping in math, though, is how similar they are to black and white stereotyping in athletics-to the unwritten rules holding that blacks achieve through natural ability and whites through effort. Here's how Sports Illustrated described, in a recent article, the white basketball player Steve Kerr, who plays alongside Michael Jordan for the Chicago Bulls. According to the magazine, Kerr is a "hard-working overachiever," distinguished by his "work ethic and heady play" and by a shooting style "born of a million practice shots." Bear in mind that Kerr is one of the best shooters in basketball today, and a key player on what is arguably one of the finest basketball teams in history. Bear in mind, too, that there is no evidence that Kerr works any harder than his teammates, least of all Jordan himself, whose work habits are legendary. But you'd never guess that from the article. It concludes, "All over America, whenever quicker, stronger gym rats see Kerr in action, they must wonder, How can that guy be out there instead of me?"
There are real consequences to this stereotyping. As the psychologists Carol Dweck and Barbara Licht write of high-achieving schoolgirls, "[They] may view themselves as so motivated and well disciplined that they cannot entertain the possibility that they did poorly on an academic task because of insufficient effort. Since blaming the teacher would also be out of character, blaming their abilities when they confront difficulty may seem like the most reasonable option." If you substitute the words "white athletes" for "girls" and "coach" for "teacher," I think you have part of the reason that white athletes are so underrepresented at the highest levels of professional sports. Whites have been saddled with the athletic equivalent of learned helplessness-the idea that it is all but fruitless to try and compete at the highest levels, because they have only effort on their side. The causes of athletic and gender discrimination may be diverse, but its effects are not. Once again, blacks are like boys, and whites are like girls.
4.
When I was in college, I once met an old acquaintance from my high-school running days. Both of us had long since quit track, and we talked about a recurrent fantasy we found we'd both had for getting back into shape. It was that we would go away somewhere remote for a year and do nothing but train, so that when the year was up we might finally know how good we were. Neither of us had any intention of doing this, though, which is why it was a fantasy. In adolescence, athletic excess has a certain appeal-during high school, I happily spent Sunday afternoons running up and down snow-covered sandhills-but with most of us that obsessiveness soon begins to fade. Athletic success depends on having the right genes and on a self-reinforcing belief in one's own ability. But it also depends on a rare form of tunnel vision. To be a great athlete, you have to care, and what was obvious to us both was that neither of us cared anymore. This is the last piece of the puzzle about what we mean when we say one group is better at something than another: sometimes different groups care about different things. Of the seven hundred men who play major-league baseball, for example, eighty-six come from either the Dominican Republic or Puerto Rico, even though those two islands have a combined population of only eleven million. But then baseball is something that Dominicans and Puerto Ricans care about-and you can say the same thing about African-Americans and basketball, West Indians and sprinting, Canadians and hockey, and Russians and chess. Desire is the great intangible in performance, and unlike genes or psychological affect we can't measure it and trace its implications. This is the problem, in the end, with the question of whether blacks are better at sports than whites. It's not that it's offensive, or that it leads to discrimination. It's that, in some sense, it's not a terribly interesting question; "better" promises a tidier explanation than can ever be provided.
I quit competitive running when I was sixteen-just after the summer I had qualified for the Ontario track team in my age class. Late that August, we had travelled to St. John's, Newfoundland, for the Canadian championships. In those days, I was whippet-thin, as milers often are, five feet six and not much more than a hundred pounds, and I could skim along the ground so lightly that I barely needed to catch my breath. I had two white friends on that team, both distance runners, too, and both, improbably, even smaller and lighter than I was. Every morning, the three of us would run through the streets of St. John's, charging up the hills and flying down the other side. One of these friends went on to have a distinguished college running career, the other became a world-class miler; that summer, I myself was the Canadian record holder in the fifteen hundred metres for my age class. We were almost terrifyingly competitive, without a shred of doubt in our ability, and as we raced along we never stopped talking and joking, just to prove how absurdly easy we found running to be. I thought of us all as equals. Then, on the last day of our stay in St. John's, we ran to the bottom of Signal Hill, which is the town's principal geographical landmark-an abrupt outcrop as steep as anything in San Francisco. We stopped at the base, and the two of them turned to me and announced that we were all going to run straight up Signal Hill backward. I don't know whether I had more running ability than those two or whether my Africanness gave me any genetic advantage over their whiteness. What I do know is that such questions were irrelevant, because, as I realized, they were willing to go to far greater lengths to develop their talent. They ran up the hill backward. I ran home.
Love that bomb
May 25, 1998
It's not just the Indians who fetishize nukes.
In the American hierarchy of things that kill people, bombs have never had much moral cachet. There is no National Bomb Association dedicated to the inalienable right of hunters to off Bambi's mom with, say, fragmentation grenades. Even in the shrillest debates on Capitol Hill, no freshman Republican has ever got up to declare that "bombs don't kill people, people do." In the movies, it's the bad guy who plants the bomb in the good guy's car; the good guy uses a .44 Magnum. When it comes to how we think about blowing each other away, there is an unspoken assumption that the kind of weapon you point is nobler than the kind you wire up to an alarm clock.
A similar ordering obtains on the level of international relations. "Weapons of mass destruction"--poison gas, germ-warfare agents, and, above all, nuclear bombs and missiles--are bad. Guns are, if not exactly good, the backbone of one of America's most profitable export businesses. Such distinctions have little to do with how deadly guns and bombs have proved to be in real life. On that score, firearms, which claimed tens of thousands of American lives last year, win hands down. In most states, they can be bought almost as easily as toaster ovens. (We will ban cigarettes long before we will ban handguns.) But a bomb is something that evil geniuses like the Unabomber use. Bombs have elaborate detonators and timing devices, and must be defused by experts. Guns are dumb; bombs are brainy (and nuclear bombs are the brainiest of all).
Imagine if, last week, India had conducted provocative military maneuvers along the Pakistani border, or had started infiltrating commandos across it, or, for that matter, had merely continued along the pugnacious path that the new government there had publicly charted. None of those conventionally bellicose acts would have turned India into International Public Enemy No. 1.
This is not to minimize or excuse what India has done in conducting the underground nuclear tests that disturbed the world's peace last week. Nuclear weapons have always received special consideration, for very good and very obvious reasons, and the fact that not one has been fired in anger since the Second World War is among the greatest successes of modern diplomacy. That India has chosen to flaunt its sinister expertise in this area is rightly a cause for indignation: India has triggered an arms race on the Asian subcontinent; it has further destabilized an already unstable region. The imposition of sanctions is entirely justified. Still, it is worth asking why this particular act--among the infinite variety of nasty things that countries do to one another and to their own citizens--is treated as uniquely outrageous. For if there is a lesson to be learned from the last fifty years--from what has been done by Hitler, Stalin, Mao, Pol Pot, the Rwandans, and the Bosnians, among others--it is that human beings don't need weapons of mass destruction in order to engage in mass destruction. Human ingenuity and human depravity are such that guns and machetes will do. It is true that with a bomb you can kill people faster and more emphatically. But here we risk reënacting, on the international stage, the inanity of our domestic things-that-kill-people hierarchy. Pol Pot, instead of shooting and starving millions of his countrymen in the course of several years, could presumably have herded them together and detonated a small nuclear device in their midst. That the former genocidal act did not compel us to action but the latter most assuredly would have is not evidence of moral seriousness on our part. It is evidence of moral myopia.
The Indians are not the only ones to fetishize the bomb. A decade after the fading of the Cold War, the United States and Russia continue to maintain arsenals of tens of thousands of atomic and hydrogen bombs, for what purpose no one can say--unless it is to serve some sort of vague national prestige. To the Indians, this action, or inaction, evidently speaks louder than all our anti-proliferation words. No wonder the announcement of the tests last week was greeted with euphoria by so many Indians. To them, the bomb is a way of earning respect--a salve for what President Clinton quickly diagnosed as their belief that "they have been underappreciated in the world." Clinton spoke of India as a school principal might speak of a troubled but promising adolescent suffering from low self-esteem. If this was diplomacy as guidance counselling, it was nonetheless a shrewd insight.
It is important that India be firmly divested of the nuclear illusion, but it is equally important that we divest ourselves of it as well. The bomb fetish is a James Bondish fantasy. It is also an embodiment of the high-modernist creed that form--in this case, the mastery of the scientific--carries automatic moral weight. And it is this creed that has led us to channel our indignation disproportionately into those instances in which evil meets a predetermined set of technical criteria. To the victims of mass slaughter, the distinction is without a difference.
The Estrogen Question
June 9, 1997
MEDICAL DISPATCHES
How wrong is Dr. Susan Love?
1.
When Dr. Susan Love gives speeches, she stands informally, with her feet slightly apart and her hands in casual motion. She talks without notes, as if she were holding a conversation, and translates the complexities of medicine and women's health with clarity and wit. "I see my role as a truthteller," she told a sold-out audience of middle-aged women at the Smithsonian, in Washington, last month, and everybody roared with approval, because that's what they've come to expect of Susan Love. She was, as usual, dressed simply that day--in a blue pants suit, with no makeup and with her hair in a short perm that looked as if she had combed it with her fingers. She had no briefcase or purse or adornment of any sort, and certainly none of a surgeon's customary hauteur, since it is Love's belief that medicine has for too long condescended to women. She was there to promote her new best-seller on estrogen therapy and menopause, but she made it clear right away that she wasn't about to preach. "Don't expect to leave here tonight knowing what to do," she said. Love wanted her audience to hear the evidence but, above all, to listen to their own feelings. "You have lived in your body a long time," she told the crowd, smiling warmly and reassuringly. "You know it pretty well--you know how it reacts to things, and you can trust it."
There are at least three doctors in America who fall into the category of media-celebrity--who can reliably write a best-seller or fill a lecture hall. The first is Deepak Chopra, practitioner of quantum healing and mind-body medicine. The second is Andrew Weil, whose seventh book, "Eight Weeks to Optimum Health," has been on the best-seller lists since March. And the third is Susan Love, breast surgeon, co-founder of the National Breast Cancer Coalition, and the author of two hugely successful books: "Dr. Susan Love's Breast Book," in 1990, and this year's "Dr. Susan Love's Hormone Book."
These celebrity doctors are all, in one way or another, proponents of what is fast becoming a basic tenet of popular medicine: that the system of health care devised by doctors and drug companies and hospitals is close-minded, arrogant, and paternalistic--dismissive of the role that nontraditional remedies, and patients themselves, can play in treating illness. "Blind faith in professional medicine is not healthy," Weil states flatly in "Natural Health, Natural Medicine"--and it's a sentence that could easily have appeared in any of the books by Love or Chopra.
Of the three, though, Love's critique is the most sophisticated. She's not a hippie, like Weil, or a mystic, like Chopra. She's a respected clinician--the former director of the Revlon-U.C.L.A. Breast Center, and an adviser to the National Institutes of Health's vast Women's Health Initiative--and her criticisms have the power of the insider. Karen Stabiner writes, in "To Dance with the Devil," her brilliant, recently published account of Love's tenure at U.C.L.A.:
Love had a set of immutable rules about proper examining room behavior, all designed to even out what she saw as an impossibly inequitable relationship. She always had the patient get dressed after an exam and threatened that otherwise she would have to disrobe to even things out. She never stood with her hands folded across her chest, which would make her seem inaccessible. She tried never to stand near the door, which made the patient feel that the doctor was in a hurry. Love had been known to breeze into a room and sit on the floor, legs splayed, her notes on her lap. She often sat on the footstool the patients used to step up onto the table. It was a conscious maneuver. These women felt helpless enough without having to assume a supplicant's posture, staring up at the all-knowing physician.
In "Dr. Susan Love's Hormone Book" Love applies this skepticism to perhaps the most important topic in women's health today: whether older women should take estrogen. The medical establishment and the pharmaceutical industry, she says, have told women that they have a disease, menopause, and have then given them a cure, estrogen, even though it's not clear that the disease is a disease or that the cure is a cure. "The reason I got into this is that a lot of the books out there were 'Don't worry, dear, we'll take care of it,' " Love told me just before she took the stage at the Smithsonian. "Women were dying to get more information, literally and figuratively. They weren't hearing the voice that says, 'You can figure this out. This is your choice, this is your body, this is your life. You don't have to do what the doctor says. You can do what feels right for you.' That's the voice that was missing." It's a nearly irresistible argument, made all the more so by the way Love presents it--honestly, passionately, forthrightly. So why, after even the slightest scrutiny, does so much of what Love has to say begin to fall apart?
2.
Estrogen, or Premarin (the trade name under which it is principally sold), is the most widely used prescription drug in the United States. Taken in the short term, during the onset of menopause, it offers relief from hot flashes and other symptoms. Taken over the long term, as part of a regimen of hormone-replacement therapy (H.R.T.), it has been shown to reduce the risk of hip and spinal fractures in older women by as much as half, to lower the risk of heart disease by somewhere between forty and fifty per cent, and even--in recent and very preliminary work--to either forestall or modify the ravages of Alzheimer's disease and osteoarthritis. H.R.T. has two potential side effects, however. It raises the risk of uterine cancer, and that's why many women who take Premarin add a dose of the hormone progestin, which blocks the action of estrogen in the uterus. Long-term H.R.T. may also lead to higher rates of breast cancer.
It is the last fact that Love considers the most important. She has spent almost all her professional career fighting breast cancer, and was one of the earliest and most vocal opponents of radical mastectomies. Through the National Breast Cancer Coalition, she helped lead the fight to increase government funding for breast-cancer research, and it's hardly an exaggeration to say that her first book, "Dr. Susan Love's Breast Book," is to women's health what Benjamin Spock's "Baby and Child Care" was to parenting. Love is concerned about breast cancer above all else: she's worried about anything that might increase the risk of such an implacable disease. What's more, she believes that the benefits of estrogen are vastly exaggerated. Women humped over with osteoporosis are, according to Love, "far more common in Premarin ads than in everyday life," and she says that, since serious bone loss doesn't occur until very late in life, taking estrogen over the long term is unnecessary. On the question of heart disease, she says that the studies purporting to show estrogen's benefits are critically flawed. In any case, she points out, there are ways women can cut their risk of heart attack which don't involve taking drugs--such as eating right and exercising. So why take the chance? "It's only very recently that we've started talking about using drugs for prevention, and that's O.K. when we talk about high cholesterol or high blood pressure," she told me. "Those are people who have something wrong. But when you talk about H.R.T. for postmenopausal women, you're talking about women who have nothing wrong, who are normal, who may never get these diseases, and who are not necessarily at high risk. There is no drug that is a free lunch. There are always side effects, so why would we put women on a drug that has the side effect of a potentially life-threatening disease?"
What Love has done is recalculate the risk/benefit equation for estrogen, which is fine, except that she consistently overstates the risks and understates the benefits. In the case of osteoporosis, for example, it is true that most women don't experience the effects of bone loss until their seventies. But some--about ten to fifteen per cent of women--do, with quite serious consequences. It's also the case that the maximum protection against hip fractures comes only after ten years of H.R.T., which, considering how debilitating hip fractures are to the well-being of the elderly, is a strong argument for long-term estrogen use. Or consider how Love deals with the question of heart disease. All the major studies from which conclusions have been drawn are what are called observational studies: epidemiologists have found a large group of women who were taking estrogen, followed them for a number of years, and then determined that those women had about half the number of heart attacks that women who weren't taking estrogen had. The problem with this kind of study, of course, is that it doesn't tell you whether estrogen lowers the risk of heart disease or whether the kind of women who have the lower risk of heart disease are the kind of women who take estrogen. Love suspects that it's the latter. In all the studies, she points out in her new book, "the women who took estrogen were of higher socioeconomic status, better educated, thinner, more likely to be non-smokers . . . more likely to go to doctors . . . and therefore more likely to have had overall preventative care, such as having their blood pressure checked and their cholesterol monitored."
What Love doesn't point out, though, is that over the past decade estrogen researchers have been scrupulously attempting to account for this problem, by breaking down the data in order to match up the estrogen users more closely with the nonusers. Women on hormones who smoke, have a college degree, and have high blood pressure, say, are matched up with women who smoke, have a college degree and high blood pressure, and don't take hormones. It's an imperfect way of breaking down the data, since the resulting samples are not always large enough to be statistically significant. But it gives you some idea of how real the effect of estrogen is, and when researchers have done this kind of reanalysis they've found that estrogen cuts heart attacks by about forty per cent, which is a lower figure than before the reanalysis but still awfully impressive.
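To see how that kind of reanalysis works, here is a minimal sketch in Python, with invented numbers (none of the figures below come from any actual estrogen study): a crude comparison that pools all women overstates the benefit, because healthier women are overrepresented among the estrogen users, while a stratified comparison that matches like with like yields a smaller, but still substantial, reduction.

```python
# Illustration only: invented counts, not data from any estrogen study.
# Each stratum pairs estrogen users with nonusers who share the same risk
# factors; counts are (heart attacks, women followed).
strata = {
    "low-risk":  {"users": (20, 12_000), "nonusers": (24, 8_000)},
    "high-risk": {"users": (30, 3_000),  "nonusers": (140, 7_000)},
}

def rate(events, n):
    return events / n

# Crude comparison: pool everyone and ignore risk factors.
crude_users = sum(s["users"][0] for s in strata.values()) / sum(s["users"][1] for s in strata.values())
crude_nonusers = sum(s["nonusers"][0] for s in strata.values()) / sum(s["nonusers"][1] for s in strata.values())
print(f"crude risk ratio: {crude_users / crude_nonusers:.2f}")    # ~0.30 (looks like a 70% reduction)

# Stratified comparison: compare like with like, then average the ratios,
# weighting each stratum by its number of estrogen users.
weights = [s["users"][1] for s in strata.values()]
ratios = [rate(*s["users"]) / rate(*s["nonusers"]) for s in strata.values()]
stratified = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
print(f"stratified risk ratio: {stratified:.2f}")                 # ~0.54 (smaller, but still a real reduction)
```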
With breast cancer, Love takes the opposite approach--taking a relatively weak piece of evidence and making it appear more robust than it is. Her logic goes something like this. We know that hysterectomies, regular exercise, and early pregnancy--all things that lower a woman's exposure to her own estrogen--reduce the risk of breast cancer. We also know that having one's first period before the age of twelve, having children late or never having children, reaching menopause late, drinking a lot of alcohol, or being overweight--things that raise a woman's exposure to her own estrogen--increase the risk of breast cancer. "Since your body's own hormones can cause breast cancer," Love writes, "it makes sense to conclude that hormones taken as drugs will also increase your risk."
That sounds persuasive. But where's the clinical evidence to support it? "I just reviewed the hormone/breast-cancer research from the last five years," Trudy Bush, an epidemiologist at the University of Maryland, told me. "I found one report, from the Nurses' Health Study, showing a forty-percent increase in breast-cancer risk. I found four reports--two very large and well done--showing no effect, and I found another study showing that estrogen gave women significant protection against breast cancer. They're all over the place."
The problem is that measuring the link between estrogen and breast-cancer risk is tricky. The Nurses' Health Study, for example, which showed that women on H.R.T. had a forty-per-cent greater chance of getting breast cancer, is the study that has received the most media attention and the one that preoccupies Love: it is among the largest and best of the studies, and its conclusions are worrying. But it has some of the same selection-bias problems as the heart-disease studies. The estrogen users in the study, for example, had fewer pregnancies, got their periods earlier, and had other differences from the control group which would lead you to believe that they might have had a higher risk of breast cancer anyway.
There is another possible complication: estrogen does such a good job of fighting heart disease that most women who are on H.R.T. live substantially longer than women who aren't. In a recent computer analysis, Nananda Col, who teaches at the Tufts School of Medicine, and her colleagues there took the most conservative possible estimates--the highest available estimate for breast-cancer risk and the lowest one for heart-disease benefit--and devised an H.R.T. risk/benefit table, from which any woman can figure out on the basis of her own risk factors what her expected benefit would be. It shows that a woman who smokes, has relatively high cholesterol, high blood pressure, and moderate breast-cancer risk can expect to live two and a half years longer if she takes estrogen. That's two and a half years in which she has a chance to develop another disease of old age--for example, breast cancer. In other words, you'd expect to see more breast cancer in women who are on estrogen than in women who aren't, even if estrogen has nothing whatever to do with cancer, for the simple reason that women on estrogen live so much longer that they have a greater chance of getting the disease naturally.
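The effect is easy to demonstrate with a toy calculation. The sketch below (Python, with invented numbers; only the two-and-a-half-year figure comes from the article) holds the annual breast-cancer risk constant for both groups and varies only how long the women live; the longer-lived group still ends up with more diagnoses.

```python
# Illustration only: the annual risk and the baseline years at risk are invented;
# the 2.5 extra years of life expectancy is the figure quoted above.
annual_breast_cancer_risk = 0.003     # identical for both groups, per woman per year
years_at_risk_without_hrt = 20.0      # hypothetical remaining life expectancy
years_at_risk_with_hrt = 22.5         # 2.5 extra years for the H.R.T. group

def cases_per_10000(per_year_risk, years):
    # probability of at least one diagnosis over the remaining years, scaled to a cohort
    return 10_000 * (1 - (1 - per_year_risk) ** years)

print(round(cases_per_10000(annual_breast_cancer_risk, years_at_risk_without_hrt)))  # ~583
print(round(cases_per_10000(annual_breast_cancer_risk, years_at_risk_with_hrt)))     # ~654
# More breast-cancer diagnoses among the women on estrogen, even though the
# underlying annual risk is exactly the same for both groups.
```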
Most experts agree that, in the end, H.R.T. is probably linked to some increased breast-cancer risk. What all the questions suggest, though, is that the effect is probably not huge and is certainly nowhere close to cancelling out the benefits of estrogen in fighting heart disease. Col, in her computer analysis, estimates that only about one per cent of women--those with the very highest risk of breast cancer and only a slight heart-disease risk--can expect no gain, or even a loss, in life expectancy from H.R.T. Everybody else--even those who have a close relative with breast cancer--is likely to benefit from the drug, and for some women taking estrogen is as good a way of living longer as quitting smoking. It is, unfortunately, very hard to convince most women of this fact. As few as a quarter of those who begin H.R.T. stay on the treatment for more than two years, and much of that has to do with the persistent inclination of many women to overestimate their risks of getting breast cancer and underestimate their risks of developing heart disease. In one recent study of several hundred educated middle-aged women, almost three-quarters of those polled thought that their risk of developing heart disease by age seventy was less than one per cent--when, in fact, statistically, it's more like twenty per cent. In making her argument the way she does, then, Love is not "truthtelling"; she's simply furthering an existing--and dangerous--myth. "You can understand where she's coming from," Trudy Bush says. "Fourteen hours a day, six days a week, she sees women with breast cancer, and that's all she sees. Your world becomes very narrowly defined. It happens with everyone who is a breast surgeon. But I also think that there is a perception on the part of some women who are activists that there is a conspiracy to force women to buy these hormones and force doctors to prescribe them. Instead of the military-industrial complex, it would be the A.M.A.-pharmaceutical complex. But things aren't so simple. In my opinion, we're all struggling here, trying to tease this out. We can only look at the data."
In March, Love published an Op-Ed piece in the Times, in which she directly addressed the question of the relative risks facing women. "Pharmaceutical companies defend their products by pointing out that one in three women dies of heart disease, while one in eight women gets breast cancer," she wrote. "Although this is true, it is important to note that in women younger than age 75 there are actually three times as many deaths from breast cancer as there are from heart disease."
This statistic is central to Love's argument. She is saying that it makes no sense to avoid something that will kill you tomorrow if it increases your chances of dying of something else today. Incredibly, however, Love has her numbers backward: in women younger than seventy-five, there are actually more than three times as many deaths from heart disease as from breast cancer. (In 1993, about ninety-six thousand women between thirty-five and seventy-four died of heart disease, while twenty-eight thousand died of breast cancer.) Even the general idea behind this argument--that heart disease is more of a problem for older women and breast cancer is more of a problem for younger women--is wrong. In every menopausal and postmenopausal age category, more women die of heart attacks than die of breast cancer. For women between the ages of forty-five and fifty-four, death rates for heart disease are roughly 1.4 times those for breast cancer. For women between the ages of fifty-five and sixty-four, it's nearly three times the problem; for women between the ages of sixty-five and seventy-four, it's five and a half times the problem; and for women seventy-five and older it's almost twenty times the problem.
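For the record, the arithmetic on the 1993 figures cited above is straightforward:

```python
# Deaths in 1993 among women aged thirty-five to seventy-four, as cited above.
heart_disease_deaths = 96_000
breast_cancer_deaths = 28_000
print(round(heart_disease_deaths / breast_cancer_deaths, 1))
# 3.4 -- more than three times as many deaths from heart disease as from
# breast cancer, the reverse of the ratio asserted in the Op-Ed piece.
```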
It's hard to know what to make of this kind of error. The Harvard epidemiologist Meir Stampfer was so dismayed by it that he wrote a letter to the editor of the Times, which was published a week after Love's article appeared. But he didn't think that her mistake was deliberate. He thought that she had just looked at the government's mortality tables and confused the heart-disease category with the breast-cancer category. "Somebody told me that they heard her on the radio or TV giving those wrong numbers, and I was pretty astonished," Stampfer told me. "And then, when I saw it in print, I flipped my lid a little bit. I'm assuming that it's just an unwitting transposition of the numbers."
That, at least, is the charitable explanation. When I met with Love, a month or so after Stampfer's letter appeared, I asked her about the relative risks of breast cancer and heart disease. We were sitting together in a booth at a hotel coffee shop in downtown Washington. It was the kind of situation, you'd think, where she might have felt free to admit to embarrassment or to offer some kind of candid explanation for what went wrong. But that's not what happened. "One of the problems with that comparison is that they act like these diseases are all at the same time," she answered. "Most women at fifty know someone who has died of breast cancer. Most women at fifty don't know someone who has had heart disease." Her eyes locked reassuringly on mine. "That's because under seventy-five there are three times as many deaths from breast cancer as from heart disease."
3.
There is an even more striking problem with the anti-estrogen movement, and that is the way that it ignores the next generation of H.R.T., the compounds known as serms (for "selective estrogen-receptor modulators"). For many years, it was thought that estrogen was a kind of blunt instrument, that if a woman took the hormone it would affect her bones and her breasts and her heart and her uterus in the same way. In other words, it was thought that a woman's body had one kind of molecular switch that would be turned on all over the body whenever she took the hormone. But when scientists were testing the drug Tamoxifen on women with breast cancer several years ago, they found out something unexpected. Tamoxifen was supposed to turn off the estrogen switch, so someone with breast cancer would take it on the theory that starving breast tissue of natural estrogen would help shrink or prevent tumors. "Everyone expected that the bone quality in these women on Tamoxifen would not be good," Donald McDonnell, a pharmacologist at Duke University, told me, explaining that people had assumed that if there was no estrogen going to the breasts there would be none going to the bones, either. In fact, though, the women's bones were fine. Somehow, Tamoxifen was turning off the estrogen switch in the breasts by acting just like estrogen in the bones. "What that suggested for the first time was that maybe estrogen doesn't work the same way in every cell and maybe we could use this information to build better compounds that would be tissue-selective," McDonnell said.
What researchers now believe is that there are many kinds of estrogen switches in the body, and that whether they turn on or off is highly dependent on the type of estrogen-like compound that they are presented with. Tamoxifen, by purest chance, happens to be a compound that turns the switch on in the bones and off in the breasts. Unfortunately, it also turns the switch on in the uterus, raising the risk of uterine cancer. But a second generation of serms is now in development; these act like estrogen in the heart and the bones but block the harmful effects of estrogen in the breasts and the uterus. McDonnell has one such compound that is about to go into clinical trials. The Indianapolis-based drug firm Eli Lilly has another--Raloxifene--that is much further along and could be on the market within a year or so. Before very long, in short, women worried about raising their breast-cancer risk will have the option of taking a different kind of hormone that doesn't affect their breasts at all--or that may even protect against breast cancer.
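Put schematically, the tissue-by-tissue behavior described above amounts to a small table. The sketch below is a simplification based only on the effects mentioned in this article ("on" meaning the compound acts like estrogen in that tissue, "off" meaning it blocks estrogen there); it is not a pharmacological reference, and effects the article does not discuss are left blank.

```python
# Simplified summary of the tissue-selective effects described above.
# None marks a tissue whose effect is not discussed in the article.
effects = {
    "estrogen":               {"bones": "on", "heart": "on", "breasts": "on",  "uterus": "on"},
    "tamoxifen":              {"bones": "on", "heart": None, "breasts": "off", "uterus": "on"},
    "second-generation serm": {"bones": "on", "heart": "on", "breasts": "off", "uterus": "off"},
}

for compound, by_tissue in effects.items():
    summary = ", ".join(f"{tissue}: {effect}" for tissue, effect in by_tissue.items() if effect)
    print(f"{compound:25s} {summary}")
```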
"In the past, what you were looking at was a risk/benefit game," John D. Termine, a vice-president at Lilly's research laboratories, told me. "There was estrogen with all these terrific properties, but at the same time there was this downside, that women were afraid of breast cancer. Now Raloxifene and the other serms come along, and we're going to have alternatives. Now the risks and benefits are much different, because we have something else. . . . One of the physicians on our advisory board said that it's like when beta-blockers were introduced for heart diseases. It changed the game completely."
You might think that this development would be of enormous significance to Love, answering, as it does, her great worry about the potential side effects of H.R.T. In fact, she mentions serms just twice in her book and, each time, only briefly. It's a bit as if someone had written a book about computers in 1984 and mentioned the personal computer only in passing. Here is how her book treats the prospect:
Scientists are hoping to use some of this new information to design the perfect hormone: one that will protect the uterus and breast from cancer, stop hot flashes, and prevent osteoporosis and heart disease. It would be lovely--could it do housework too?--but I'm skeptical. It would still be a drug. And I have yet to see a drug that doesn't have some side effects.
This is an extraordinary passage. It would still be a drug? What form does a successful medical intervention have to take before Love finds it acceptable? And, for that matter, since when does the possibility of side effects negate the usefulness of a drug? Drugs have side effects, but we take them anyway, because in most cases the side effects are a lot less significant than the main effects. (That's why they're called side effects.) At one point in her speech in Washington, Love spoke of her daily breakfast of soy milk and flax-seed granola, and boasted jokingly that it was so rich in natural plant estrogens that "one bowl is as good as a Premarin pill." Now it turns out that one bowl is not as good as a Premarin pill, because plant estrogens are much weaker than animal estrogens. Nor are plant estrogens exactly "natural," because plant estrogens are, technically, nonsteroidal while Premarin--like the estrogen a woman makes herself--is a steroid. But Love wasn't really intending to enter into a discussion of estrogen chemistry. She was simply expressing her skepticism of modern medicine--of the idea that medical salvation can come in the form of a pill. Her objection is not to Premarin itself so much as it is to the idea that postmenopausal women should rely on any sort of drug at all.
This is where, sooner or later, you end up when you start down the path of people like Andrew Weil and Deepak Chopra and Susan Love. To read the health books on the best-seller lists right now is to be left with the impression that exercise and a good diet are all that matter--that medicine is too ineffectual to help us if we do not first help ourselves. That's one of the reasons these books are so successful: they take the language of emotional and spiritual fulfillment and apply it to medicine, prompting people to find and follow their own instincts about health in the same way they have been taught to find and follow their own instincts in relationships, say. When Love told me in Washington that "this is your choice, this is your body, this is your life," that's what she meant--that the medical was the personal. This kind of talk may inspire people to shape up, which is all to the good. But it does not begin to reflect how sophisticated and powerful medicine has become. In the introduction to his 1990 book "Natural Health, Natural Medicine," Weil claims that "professional medicine" is "bad" at treating, among other diseases, cancer and viral infections. Yet today, just a few years after he wrote those words, not only are we on the verge of getting a new class of anti-influenza drugs but a combination therapy for H.I.V. appears to have dramatically extended the lives of AIDS patients, and over the next several years the biotechnology industry is likely to get approval for almost two dozen new cancer drugs, representing a second generation of treatments, to replace chemotherapy and radiation. The list of things that traditional medicine is bad at gets shorter all the time.
Earlier this year, a study appeared in the Journal of the American Medical Association that put many of these changes in perspective. The study, conducted by a team from Harvard University's School of Public Health, attempted to figure out why the mortality rate from coronary heart disease dropped so dramatically in the nineteen-eighties. In that decade, the decline averaged 3.4 per cent a year, which means that in 1990 there were about a hundred and thirty thousand fewer deaths from heart disease in America than there would have been if the mortality rate had been the same as it was in 1980. Most people, I think, would credit this extraordinary achievement to our getting more exercise in the nineteen-eighties and losing weight. But we didn't, much. Smoking, which is obviously a major risk factor for heart disease, was down, but not by a lot: the Harvard group estimated that declines in smoking probably account for about six per cent of the decrease. People did eat better as the decade progressed, but better diet probably accounts for only about a quarter of the difference. Most of the drop--about seventy per cent of the total--happened because of the increased use of procedures like angioplasty and coronary bypass and, more important, the advent of a new class of powerful clot-dissolving drugs, like streptokinase and tissue-plasminogen activator.
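Taking the study's attributions at face value, those shares translate into rough numbers of deaths averted. A back-of-the-envelope sketch, using only the figures quoted above (the shares are approximate and need not sum exactly to one hundred per cent):

```python
# Rough apportionment of the ~130,000 fewer heart-disease deaths in 1990,
# using the shares quoted above from the Harvard analysis.
fewer_deaths = 130_000
shares = {
    "medical care (angioplasty, bypass, clot-dissolving drugs)": 0.70,
    "better diet": 0.25,
    "decline in smoking": 0.06,
}
for cause, share in shares.items():
    print(f"{cause}: roughly {share * fewer_deaths:,.0f} deaths averted")
```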
This does not, of course, change the fact that people should exercise and eat properly and take charge of their lives. We should all listen to our bodies and make our own choices. But there are times when what we can find out about our bodies and do for ourselves pales in comparison to what we do not know and cannot do--when we have to rely on doctors and medicine to do things for us. There is more to medicine than can be explained by the language of personal fulfillment.
The Dead Zone
September 29, 1997
A REPORTER AT LARGE
Seven bodies buried in the Arctic tundra might
solve the riddle of the worst flu pandemic
in history -- and might help us prevent it
from happening again.
I--Permafrost
On September 24, 1918, three days after setting sail from Norway's northern coast, the Forsete arrived in Longyearbyen, a tiny mining town on one of the Norwegian islands north of the Arctic Circle. It was the last ship of the year, before ice made the Arctic fjords impassable, and it carried among its passengers a number of fishermen and farmers going north for the winter to earn extra money in Longyearbyen's coal mines. During the voyage, however, the ship had been hit with an outbreak of flu. Upon landing, many of the passengers had to be taken to the local hospital, and over the next two weeks seven of them died. They were buried side by side in the local cemetery, their graves marked by six white crosses and one headstone:
Ole Kristoffersen
February 1, 1896-October 1, 1918
Magnus Gabrielsen
May 10, 1890-October 2, 1918
Hans Hansen
September 14, 1891-October 3, 1918
Tormod Albrigtsen
February 2, 1899-October 3, 1918
Johan Bjerk
July 3, 1892-October 4, 1918
William Henry Richardsen
April 7, 1893-October 4, 1918
Kristian Hansen
March 10, 1890-October 7, 1918
The Longyearbyen cemetery is at the base of a steep hill, just beyond the town limits. If you look up from the cemetery, you can see the gray wooden skeleton of the coal mine that used to burrow into the side of the hill, and if you look to your left you can see the icy fringes of a glacier. Farther down the mountain are a shallow stream, a broad shale plain, and then, half a mile or so across the valley, Longyearbyen itself: a small cluster of red-roofed, brightly painted frame buildings. There are no trees, because Longyearbyen is many miles above the tree line, and from almost anywhere in the valley the cemetery is in plain view. Each grave site is slightly elevated and surrounded by rocks, and there are well-worn pathways among the rows of crosses. A chain-link fence rings the periphery. When I was there in late August, the ground had been warmed by the Arctic summer sun and was soft and spongy, carpeted with orange and red and white lichen. In the last row I found the miners' graves--seven deaths separated by six days.
It is possible to go to almost any cemetery in the world and find a similar cluster of graves from the fall of 1918. Between September and November of that year, as the First World War came to an end, an extraordinarily lethal strain of influenza swept the globe, killing between twenty million and forty million people. More Americans died of the flu over the next few months than were killed during the First World War, the Second World War, the Korean War, and the Vietnam War combined. The Spanish flu, as it came to be known, reached every continent and virtually every country on the map, going wherever ships sailed or cars or trucks or trains travelled, killing so many so quickly that some cities were forced to convert streetcars into hearses, and others buried their dead in mass graves, because they ran out of coffins. Of all those millions of graves, though, the seven in Longyearbyen stand apart. There, less than eight hundred miles from the North Pole, the ground beneath the lichen is hard-frozen permafrost. The bodies of the seven miners may well be intact, cryogenically preserved in the tundra, and, if so, the flu virus they caught on board the Forsete--the deadliest virus that the world has ever known--may still be down there with them.
At the beginning of next month, a scientific team led by the Canadian geographer Kirsty Duncan will fly to Longyearbyen and set up a workstation in the church graveyard. The team will map the site, and then scan it with ground-penetrating radar, passing what looks like a small black vacuum cleaner over the tundra to see how deep the bodies are buried. If the radar shows that they are below the active layer of the permafrost--that is, below the layer that thaws each summer--the team will return next fall with enough medical equipment and gear to outfit a small morgue. The site will be prepared with tarpaulins and duckboards. Pavement breakers--electric jackhammers--will be used to break up the tundra, and the chunks of earth will be scooped out with a shovel. As the excavation gets close to the coffins, the diggers will don biohazard spacesuits, and a dome or a tent will be erected over the operation.
To minimize the possibility of infection, the bodies will be left where they are, in their coffins, and autopsies will be performed in the ground. If the clothes on the corpses are frozen to the skin or tightly matted, someone on the team might run a hair dryer over the material to loosen it up--but only briefly. "If the bodies are thawed out and this material is taken out, it will melt, and then there is always the chance of the spread of microdroplets," Peter Lewin, one of the team members, told me. Lewin is a pediatrician at Toronto's Hospital for Sick Children who doubles as a medical archeologist, and he has earned international renown for his pioneering CAT scans of Egyptian mummies. (He helped determine that Ramses V died of smallpox.) "Say you're doing an autopsy"--he gestured to indicate a body spread out on the desk in front of him--"if it melts, there may be a mucousy, secondary blood product--some type of liquid exudation. The liquid seeping out of that material may suddenly, by mistake, be aerosolized and someone inhales it. You just don't want to take any chances."
From the ad-hoc morgue in the Longyearbyen cemetery, the samples will be flown to a BSL-4 facility--4 is the highest level of biological containment--either in England or at the United States Army's infectious-disease research facility, at Fort Detrick, Maryland. There's a small possibility that what scientists will find is a live virus--a virus that, once thawed, could be as deadly and infectious as it was in 1918. If they don't, the hope is that they'll at least be able to recover the virus's genetic footprint--what scientists call RNA residue. Samples of the virus will then be sent to laboratories around the world. Its genetic code will be sequenced and compared with every major sample of the flu virus on file in the world's virological centers.
This task has a certain urgency. Scientists know that global outbreaks of deadly influenza go back at least four hundred years, and that there have been two more since 1918--the Asian flu, of 1957, which killed seventy thousand Americans, and the Hong Kong flu, which killed thirty-three thousand during the winter of 1968-69. With luck, we'll be able to anticipate the next Spanish flu before it does much damage. The problem is that we're not really sure what to look for. No one kept a sample of the virus in 1918, because the flu virus wasn't isolated until fifteen years later. And, because influenza mutates so rapidly, there's almost nothing to be learned about the peculiarities of the 1918 virus from looking at the influenzas in circulation today. The only way to find out about the 1918 virus is to find the 1918 virus.
"We've designed core-biopsy-removal equipment to take core samples," Peter Lewin said. "You drill into the body, because it's solid. It's a technique taken from forestry. You use what's called a hole-saw tube." He drew a diagram on the back of a file folder, outlining a long, hollow cylinder, with circular, screw like grooves on its outside, a serrated edge on its tip, and a T-shaped handle at its other end. "It's about nine inches long, about a quarter inch in diameter," he went on, explaining that as the tube is twisted into a body it will collect a long cross-sectional sample of tissue. "We'll probably take four core samples of the lung"--he pointed at the upper and lower chambers of his left and right lung--"one of the brain, one of the trachea, perhaps two of the bowel and liver."
Lewin was raised in Egypt, where his father was a British military officer--two of Lewin's schoolmates were Adnan Khashoggi and the future King Hussein--and he has the unflappable, genteel air of a nineteenth-century colonial explorer. He ticked off the details of the exhumation in Longyearbyen as if he were reciting a grocery list. "We're doing some practice runs on frozen material--basically, on frozen pigs--to see if this thing works. We were initially going to use a drill. But the drill goes so fast that it heats the tissue up, and, of course, we don't want that. So why not just slowly twist it in?" He rotated his hand. "They use hole saws on trees to get core samples of rings. They're very useful. But no one has ever used them here. I mean"--he laughed--"how often do you do core samples of frozen bodies?"
II--The Second Wave
The first known case of Spanish flu was reported on March 4, 1918, at Camp Funston, in Kansas. By April, it had spread to most cities in America and had reached Europe, following the trail of the hundreds of thousands of American soldiers who crossed the Atlantic that spring for the closing offensives of the First World War. The spring wave was serious but not disastrous, and by midsummer it had subsided. A month or so later, however, the Spanish flu resurfaced. It was the same virus in the sense that if you'd got the flu in the spring you were resistant to it in the fall. But somehow over the summer it had mutated. Now it was a killer.
The first case of the second wave was recorded on August 22nd, in Brest, a major port for incoming American troops. Within days, it appeared simultaneously in Boston and Freetown, Sierra Leone, carried in the former case by returning American soldiers and in the latter by H.M.S. Mantua, a British navy ship. The virus crossed Europe in a matter of weeks. It attacked Spain through Portugal, in the west, and across the Pyrenees, in the north, lingering long enough to be dubbed--erroneously, as it turned out--the Spanish flu. Scandinavia was infected by England; Italy was infected by France; and Sicily was infected by Italy. Allied soldiers coming to the aid of anti-Bolshevik forces during the Russian Revolution carried the flu to the White Sea area of northwestern Russia. European and American ships brought the flu to Iceland in mid-October, and American ships brought the flu to New Zealand at around the same time. In India, the virus came by sea and raced inland along the country's railroad lines. As many as half of all those who died in the pandemic died within India's borders. In America, an estimated six hundred and seventy-five thousand people died. In Philadelphia, seventy-six hundred people died within fourteen days. Putrefying bodies were stacked up three and four deep in the corridors of the city morgue, creating such a stench that the morgue was forced to throw open its doors for ventilation. In "America's Forgotten Pandemic" (1976), a definitive history of the Spanish flu, the historian Alfred Crosby offered this description of the flu's advance on Alaska:
On or about November 1 the virus reached the finest medium for its propagation in Nome and vicinity, the city's Eskimo village. Few Eskimos escaped infection. In a single eight-day period 162 of them died. Some Eskimos, hounded by superstitious horror, fled from cabin to cabin, infecting increasing numbers with disease and panic. The temperature fell below freezing, and when rescuers broke into cabins from whose chimney came no sign of smoke, they found many, sometimes whole families, who had been too sick to renew their fires and who had frozen to death. When a number of Eskimos were rounded up from their separate cabins and placed in a single large building so they could be cared for efficiently, several of them responded to what they apparently perceived as incarceration in a death house by hanging themselves.
This was not the flu as we normally think of it. Typically, influenza infects the inner lining of the respiratory tract, damaging the air-filled cells of the lungs known as alveoli. Sometimes it brings on pneumonia. Usually it passes. This was much worse. "If you autopsied some of the worst cases, you'd find the lungs very red and very firm," said Jeffery Taubenberger, a pathologist at the Armed Forces Institute of Pathology, in Washington, D.C. "The lungs are normally filled with air, so they are compressible. These would be very heavy and very dense. It's the difference between a dry sponge and a wet sponge. A normal piece of lung would float in water because it was basically filled with air. These would sink. Microscopically, you would see that the alveoli would be filled with fluid, which made it impossible to breathe. These people were drowning. There was so much liquid in the air spaces of their lungs that patients would have bloody fluid coming out of their noses. When they died, it would often drench the bedsheets."
Without sufficient oxygen, patients would suffer from cyanosis--a discoloration of the skin. "Two hours after admission they have the mahogany spots over the cheek bones," a physician wrote at the time, describing the epidemic at Camp Devens, Massachusetts. "And in a few hours you can begin to see the cyanosis extended from the ears and spreading all over the face, until it is hard to distinguish the colored man from the white." Nurses would triage incoming flu patients by looking at the color of their feet. Patients whose feet were black were considered as good as dead.
Something else was strange about the 1918 strain, and that was its choice of victim. Flu epidemics kill mostly at the demographic fringes--the very old, whose immune systems are the least robust, and the very young. Other adults do get sick, but they rarely die. In 1918, however, the usual pattern of mortality was reversed. The Longyearbyen seven, for example, were all between the ages of nineteen and twenty-eight, and that was by no means unusual. In the United States, men between twenty-five and twenty-nine died of the Spanish flu at several times the rate of men between seventy and seventy-four. This wasn't just a deadly infectious disease. It was a deadly infectious disease with the singular and terrifying quality of being better at killing the young and healthy than the old and the infirm.
III--Process of Elimination
Kirsty Duncan, the leader of the Longyearbyen expedition, is a medical geographer and climatologist by training, with dual appointments at the University of Windsor and the University of Toronto. We met in her parents' house, a bungalow in the Toronto suburb of Etobicoke, she on one side of the family dining-room table, I on the other. Between us were five overstuffed black binders, filled with the results of four and a half years that Duncan had spent searching for frozen flu victims. In the kitchen behind us, Duncan's mother prepared lunch. Whenever the phone rang, or the banging from the kitchen got too loud, or Duncan was coming to a critical part of her story, she dropped her voice almost to a whisper, so that I had to lean forward to hear what she was saying. She has large dark eyes and straight dark-brown hair that runs so far down her back that once when she got up her hair got caught in the chair. She's thirty, but she looks much younger. When I first walked up to the house, I approached a woman watering the flowers and said, "Professor Duncan?" The woman replied, "Oh no. I'm her mother. Kirsty's inside."
Duncan's obsession with the Spanish flu began when she read Crosby's book on the pandemic. "I was absolutely fascinated--horrified, more than anything--that we didn't know what caused this disease," she told me. "I said to my husband, 'I'm going to find the cause of the Spanish flu.'" The logical place to start, it seemed to her at the time, was Alaska, so she wrote to the Alaska bureau of Vital Statistics and had it ship her records from 1918. "I went through thousands of death certificates, and I found all kinds of cases of Spanish flu. The problem was trying to decipher where the permafrost was." In 1951, the Army had led a secret expedition to a grave site near Marks Air Force Base, in Nome, to dig up 1918 corpses, but the mission--code-named Project George--failed for that very reason: the bodies weren't in permafrost, and they had melted and decomposed. After Alaska, Duncan thought of Iceland. "But, of course, with all that geothermal energy it's too warm," she said. "Then I had a friend returning from Norway, and he mentioned permafrost, and I became excited, because I knew flu had been in Norway."
Duncan's focus was on the huge archipelago of islands, about six hundred miles north of Norway, that is known as Svalbard--and, in particular, on the town of Longyearbyen, a settlement of just over a thousand people which has served as Svalbard's major port for the better part of the century. "I knew that people used to do coal mining in Svalbard," she said. "I contacted the Norwegian Polar Institute. But they told me I had a really difficult task ahead of me. There are no medical records, because the hospital was bombed in the Second World War; no church records, because the first minister didn't come out until the nineteen-twenties; and no government records, because Svalbard didn't officially become part of Norway until 1925. They said there are these diaries, though, that the coal company kept." Duncan called the coal company, which referred her to a schoolteacher in Longyearbyen. She called the schoolteacher. He found, in the 1918 entries, a record of the deaths of seven young miners from Spanish flu. "So now I knew that there were seven bodies, and that they were buried in the churchyard in Longyearbyen," Duncan said. "I contacted the minister at the church. I said I wanted to know if the graves were marked. He said they were."
The bodies of the seven miners are not, in all likelihood, perfectly preserved. Prolonged freezing desiccates soft tissue, so the best Duncan's team can hope for is, essentially, natural mummies. "In a frozen state, the fluids in the body simply evaporate," Michael Zimmerman, an anthropologist at the University of Pennsylvania and an expert on mummification, explained to me. "The process is called sublimation. It's the change from the solid state to a gaseous state without going through a liquid state. If you put a tray of ice cubes in your freezer and go back two weeks later, they're a lot smaller. That's what we're talking about." Zimmerman estimated that the Longyearbyen seven, if they had been properly buried, would probably be down to about half their original weight, and maybe even less, so that their skin would be stretched tight over their bones, and every one of their ribs would be showing, as if they had been deprived of food for an extended period. "The eyes collapse, because there is a large fat pad behind the eye that's mostly water, and when that dries the eye falls back into the socket," he said. "Like everything else, the lips will tend to retract, so the teeth will become more prominent." Nonetheless, Zimmerman thought that a full autopsy would still be possible. "I don't see a problem," he went on, "especially given that these bodies were buried only about eighty years ago. The tissues are probably still fairly flexible. They're not like Egyptian mummies. Their tissues are like old leather, like an ancient book, and unless you're careful they'll crumble. Frozen bodies, since they don't completely desiccate until they've been frozen for a thousand years, are still flexible. You can get big pieces out pretty easily." There is a catch, though. During the summer months, the top layer of the permafrost thaws. In Longyearbyen, that layer is between one and 1.2 metres deep. If the miners were buried in that layer--if the gravediggers in 1918 hadn't gone to the trouble of blasting or pickaxing their way deep into the tundra--the bodies would be dust and bone by now. "I contacted the Norwegian authorities and asked what depth the bodies would have been buried, and they said, 'Well, no one knows,'" Duncan went on. "Back then, that was no man's land. But they assumed they would have followed the practice of the time, which was about two metres. The church minister believes they will be at two metres."
This was more than simply a guess. In the permafrost, anything buried in the active layer will, over time, "float"--that is, be pushed up toward the surface by the continual expansion and contraction of the ground. For that reason, it's relatively common in the hills around Longyearbyen to stumble across skeletons.
"If you go places where trappers are buried, you often see the coffin, open on the ground," Kjell Mork, a Longyearbyen high-school teacher who serves as the town's unofficial historian, told me. Mork is the man who gave the coal-company diaries to Duncan. He's a dead ringer for the novelist Robertson Davies, and has in his house a polar-bear pelt that takes up almost an entire wall. "I see it all the time. Back in the sixteenth, seventeenth, and eighteenth centuries, the trapping teams had only two or three people, so they couldn't take the effort to bury the bodies deep enough. Up at the northwest corner of the fjord, there used to be plenty of them. But now there's a new ethic--to cover them up again. I think the polar bears were going there." In the Longyearbyen churchyard, however, nothing has ever floated. Next to one of the crosses, just a few feet beyond the fence, I had seen a pile of fairly sizable white bones, including what looked like a human-size femur. But when I asked Mork about this he shook his head. "I think that's just reindeer," he said. "They come down the mountain to die."
Duncan's next big problem was to find out what had happened to the bodies before they were buried. The flu virus, after all, is notoriously unstable. It's an RNA virus, as opposed to a DNA virus, and that means that instead of being composed of double strands of genetic code it has just one strand, and is much more vulnerable. The moment someone dies, enzymes are released that begin breaking down these nucleic acids and the genetic information they carry. A DNA virus, like herpes or hepatitis, could probably last in a body for a few days before being totally destroyed. But an RNA virus, like flu, would last between twelve and twenty-four hours at the most. The diaries kept by the manager of the coal company show that the bodies of the Longyearbyen seven were not buried until October 17th, ten days after the last of them died. What happened in those ten days? Did the bodies start to decompose before they were buried? Duncan was told that orderlies at the Longyearbyen hospital would have taken the bodies to an outdoor morgue while they waited for the graves to be dug. For there still to be RNA residue, the weather would have had to have been cold enough in those first two weeks of October to keep the RNA-dissolving enzymes at bay. She checked the weather records. The average temperature in early October was minus five degrees Celsius. Duncan had her bodies, and she knew where to find them.
IV--Wax Museum
The Spanish-flu virus has been glimpsed just once, and that was in a scrap of lung tissue found two years ago in the National Tissue Repository, a division of the Armed Forces Institute of Pathology. The repository is in an annex of the Walter Reed Army Medical Center, in Maryland, just across the District of Columbia line, in a windowless corrugated-steel building behind a former elementary school. At the side are a parking lot and a loading dock, and there is an ill-kempt lawn out front. It looks like an industrial warehouse. Inside, there are three rooms, the largest of which is filled with rows of tall metal shelves, all stacked high with small brown cardboard boxes. Inside each of those boxes are pieces of human tissue about the size of a fingernail which have been preserved in formaldehyde and encased in a block of transparent paraffin wax. The repository holds more than two and a half million samples--some pressed between glass slides, some in boxes, some fully preserved organs--from autopsies on soldiers dating back to before the First World War. It's the world's largest library of death.
The supervisor of the repository is Al Riddick, a powerfully built black man in his mid-forties with a bald head, a gold chain, and glasses. When I toured the repository in late summer, Riddick took me to the back of the main room and pulled out a cardboard box from one of the shelves. Inside it were seventeen wax blocks, measuring roughly an inch by an inch by half an inch. "This is from a 1958 autopsy," he said. He picked up one of the blocks, tilting it so that I could see a speckled, bright-orange sliver of tissue embedded in the wax. "That's a brain block right there," he said. Then he picked out another block, this one encasing a dark-reddish pockmarked rectangle that looked like a dried scab. "I would say that's liver."
Next, we walked into an adjacent room, where the Army keeps its collection of organs. On a lab bench was a plastic Ziploc bag with a heavy-looking, linen-wrapped object inside. "That's a large surgical case," Riddick said. "Could be a breast. Could be a lung. It's big. Looks like a lung." He picked it up in his hands, and began to knead the package delicately, as one might check a mango for ripeness. "No, there's some bone in there." He was as matter-of-fact as Peter Lewin had been in describing how to use a hole saw on a frozen corpse. I asked Riddick whether he was ever spooked, working in roomfuls of human parts. He shook his head. "Son," he said. "I'm a Vietnam vet. It's the people who move that bother me."
In March of 1995, Jeffery Taubenberger, who heads the institute's division of molecular pathology, called over to the repository to see whether it had any tissue samples from Spanish-flu victims. Taubenberger is not a "flu man," meaning that he is not one of the small circle of scientists who have devoted their lives to influenza research. But he is one of the world's experts in the arcane art of recovering genetic information from preserved tissue samples, and it occurred to him that he stood as good a chance of finding the Spanish flu as the scientists looking for frozen bodies. The archivists told Taubenberger that they had a hundred and twenty autopsy samples of flu victims. Some, though, were just microscopic slices of tissue between glass slides, and they didn't give him enough material to work with. Taubenberger wanted wax blocks, which reduced his choices to seventy. Taubenberger and Ann Reid, a technician who worked with him on the project, randomly selected the medical records of thirty of those seventy cases. Of those, in turn, they rejected all the soldiers whose disease had not progressed rapidly, on the theory that those victims were less likely to have had the virus in their lungs when they died. That left them with seven cases. They were ready to begin.
Taubenberger and Reid started by taking lung samples from all seven and slicing off a microscopically thin sliver from the end of each block. "You take that slice and put it in a test tube and get rid of the wax," Taubenberger explained. "And you take that tissue and spin it really fast, so it all goes to the bottom of the tube. You digest it in chemicals to chew up the membranes and the proteins, you go through a series of chemical purifications, and what you end up with is something highly enriched with the RNA." Over an entire year, Taubenberger, Reid, and other members of their team worked to perfect a method of genetic analysis that could isolate the right material and stretch the tiny pieces of tissue they had as far as possible. Given the fragility of RNA, it was not an easy task. No one had ever recovered RNA from a sample so old. Early last year, they began testing the seven samples. One turned up positive.
Taubenberger is wiry and intense, with thick dark-brown hair and a patient, precise manner. He speaks in complete sentences and strings them together in complete paragraphs, until he has made even the most abstruse point crystal clear. I met him and Ann Reid in his office at the institute, a squat, five-story concrete bunker originally built as a nuclear shelter for President Eisenhower. The building has no windows, only a battered concrete doorway, and the walls inside are covered with tiles of a disorienting government-issue yellow. Taubenberger, Reid, and I sat in a circle, and the room was so small and cluttered that our feet were nearly touching. "He was a twenty-one-year-old army private," Taubenberger said. "We know he died in South Carolina, at Fort Jackson. I believe he was from the state of New York. He had no prior medical history. He got sick at the height of the pandemic at Fort Jackson and had a fast downhill course. He presented with massive pneumonia and died six days later, on September 26th, at six-thirty in the morning. His autopsy was performed around noon."
V--Viral Sex
One would think that, with the soldier's sample in hand, many of the questions that surround the Spanish flu could be answered. In a certain sense, that's true. Taubenberger and Reid have so far decoded about fifteen per cent of the genes in the soldier's virus, and their work has made possible a few preliminary conclusions about the Spanish flu. It had already been hypothesized, for example, that the Spanish flu originated--at least, in part--with a bird, probably a wild duck. Waterfowl are what virologists call the "reservoir" for influenza. They carry most of the known subtypes of influenza--without apparent ill effect--and excrete them all in their feces, thereby spreading them through land and water to the rest of the animal kingdom. All animals that get the flu--horses, ferrets, seals, pigs, among others--and human beings probably get it originally from birds.
"At this time of the year in Canada, if you look at the wild ducks that are about to migrate south before the winter, around thirty per cent probably have the flu," I was told by Robert Webster, a leading flu expert at St. Jude Children's Research Hospital, in Memphis. "They're popping it out in the water. If you sampled the lakes in Canada, you'd find all kinds of avian influenza." At some point prior to the spring of 1918, then, a flu-carrying duck must have shed feces while flying over or nesting in some inhabited part of the world. If, in fact, the pandemic started in the place where the first case was reported--Camp Funston--the precipitating event was probably somewhere in or around Kansas.
That bird virus probably didn't directly infect a human being, though, because human beings generally can't catch flu directly from birds. Viruses are particular in that way. A virus infects and takes over a cell by latching onto what is called a receptor, but--as far as we know--there isn't a receptor for avian flu in human beings. So how did the 1918 virus get from ducks to people? One possibility, according to Taubenberger's analysis, is through pigs--one of the genes he studied looks like classic swine flu. This makes sense, because pigs, uniquely, have both human and avian flu receptors; they're the perfect bridge between species. So perhaps the flu-contaminated duck feces dropped into a barnyard, whereupon a pig became infected while nosing in the dirt and passed the virus on to a farmer.
It's not quite as simple as that, though, since another of the flu genes analyzed by Taubenberger looks very much as if it came from human flu. This wasn't just bird flu passed on by a pig, in other words. This could well have been bird, pig, and human flu that somehow got mixed up together. The pig must have already been infected with one flu when it picked up the other: what it passed on to the farmer was a hybrid.
This is not as far-fetched as it sounds. A flu virus consists of eight gene segments that are so loosely bundled that they are like pieces of a jigsaw puzzle thrown together in a bag. If a pig got infected with avian and human flu simultaneously, the eight jigsaw pieces from the duck and the eight jigsaw pieces from the human being would be thrown together, and an entirely new puzzle could emerge.
Some scientists call this process of two viruses combining "viral sex," which is an apt term, because, as in human reproduction, offspring split the genetic inheritance of mother and father. According to many influenza experts, this flukish interaction of separate species is probably how almost all the pandemic strains that periodically sweep the world first arise. The Hong Kong flu, for example, consisted of seven genes from an everyday human virus and one gene from a duck that combined inside a pig to create a nasty new hybrid. The Asian flu resulted from the same kind of reassortment.
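The arithmetic behind that reshuffling is easy to sketch. What follows is a minimal illustration, in Python, offered purely as a reader's aid and not anything from the virologists quoted here: with eight segments that can each come from either parent, a co-infected cell can in principle package 2^8, or 256, different viruses, eight of which are the seven-plus-one mixtures of the Hong Kong type.
# Illustrative only: counts the possible reassortants when a cell is
# co-infected with a human and an avian flu virus, each contributing
# eight gene segments. Segment names are the standard flu gene labels.
from itertools import product
SEGMENTS = ["PB2", "PB1", "PA", "HA", "NP", "NA", "M", "NS"]
# Each segment can come from either parent, so a hybrid is a choice of
# "human" or "avian" at each of the eight positions.
hybrids = list(product(["human", "avian"], repeat=len(SEGMENTS)))
print(len(hybrids))  # 2**8 = 256 possible segment combinations
# Hong Kong-flu-style mixtures: seven human segments plus one avian one.
hk_like = [h for h in hybrids if h.count("avian") == 1]
print(len(hk_like))  # 8 such combinations, one per segment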
Taubenberger couldn't tell from his sample, though, what everyone really wants to know, which is what made the Spanish flu so devastating. It is possible to look at the flu strains that have proved deadly in domestic poultry and to explain their lethality almost entirely by pointing to an insertion mutation in one of the genes--a curious genetic glitch that allows the virus to attack almost any cell. One idea had been that the Spanish flu shared this same mutation. But Taubenberger showed that this wasn't the case. The relevant segment of the soldier's virus showed no such anomalies, and that meant that the secret of the Spanish flu's lethality is probably somewhere else. Perhaps it lies in one of the genes that Taubenberger and Reid haven't looked at yet. Or perhaps it's not one mutation at all, but several, all combining in some subtle way.
"One thing to keep in mind is whether the virus that Taubenberger has is just a precursor," Webster pointed out. "We've only got one virus so far. It might have been early in the pandemic's evolution. Have we yet looked at the nasty bastard? We need more than one virus. One's not enough." With an earlier or later sample, Taubenberger could see what specific changes the virus made over the summer to become a killer. Just as good would be to find a strain from another part of the world which might have had a slightly different evolution, so that Taubenberger could eliminate the differences, and focus only on what the viruses had in common. But finding that second virus has proved difficult, since only the United States Army seems to have been so assiduous in hanging on to autopsy samples from the First World War. One famous pathology archive in Germany was destroyed in the Second World War. A handful of samples found in England have yet to turn up anything. Taubenberger has put out feelers to Spain and Italy, and found nothing. I asked him about Russia, since the Russians were also pioneers in medical record-keeping, but at that he and Reid burst out laughing. "There is an epidemiologist at the Centers for Disease Control who is Russian, and I spent some time talking to him about this when I was down there," Taubenberger said. At this point, he dropped his voice an octave, imitating a thick Russian accent, and said, "October, 1918. Very bad time for Russia. Very bad time."
This spring, Taubenberger met Duncan at a conference on the Spanish flu at the Centers for Disease Control, in Atlanta, and agreed to join her team. His lab will analyze whatever frozen samples she collects. The best hope for another copy of the Spanish flu, a second copy that will help make sense of the first, may well be lying in the permafrost of Longyearbyen.
VI--Drift and Shift
Every year, early in the winter, the Food and Drug Administration hosts what some call the Flu Meeting, to insure that if the Spanish flu ever happened again we would not be unprepared. This year, the meeting took place on January 30th in the Versailles Ballroom of the Holiday Inn in Bethesda, Maryland, beginning at eight in the morning and ending at four. At the front of the auditorium, twenty or so medical experts sat behind a long table. Off to the side was a lectern, where throughout the day officials from the Centers for Disease Control and the World Health Organization gave presentations. The audience was large--well over a hundred--and included public-health officials from around the world, and vaccine manufacturers eager to get guidance from the government about what kinds of flu strains to put in the upcoming fall flu shot. Video cameras recorded the proceedings for those who couldn't attend. Of the dozens of daylong conferences that the F.D.A. hosts every year, none are as important.
The first two speakers at this year's meeting were from the surveillance section of the C.D.C.'s flu division--the eyes and ears of the flu world. Flu surveillance is critical because the flu virus comes in so many shapes and varieties. All flu viruses wear a kind of protective coat--an outer covering made up of two proteins known as hemagglutinin (h) and neuraminidase (n). That's how you can tell a flu virus under a microscope. But there are at least fifteen varieties of h and nine varieties of n, and any one of the former can combine with any one of the latter to create a different virus family. For the past twenty years, the world has been dominated by two of these flu families--the descendants of the Hong Kong flu of 1968 and the Russian flu of 1977--and every year each of them spawns dozens of offspring: genetic variants that result as individual viruses spread from person to person and change to stay one step ahead of the human immune system. Whenever a new offspring emerges, virologists say the virus has "drifted." At the same time, there is always the possibility that another avian strain will get mixed up with a human strain inside a pig and an entirely new family will emerge. If that were to happen, virologists would say the virus had "shifted."
It's this constant drifting and shifting that makes the flu so dangerous. If the flu stayed the same each year, you could be vaccinated against it the way you can be vaccinated against polio--for life. But, since the flu is always changing, the World Health Organization has had to set up a far-flung international surveillance network. Every day, in Moscow or Berlin or Iowa City or some distant Chinese province, doctors take nose and throat samples from flu sufferers, pack them in plastic vials, and send them to laboratories to be tested. The labs send isolates of the most interesting cases to the C.D.C. or to one of three other national labs working with W.H.O., in Tokyo, Melbourne, and London, for complete analysis, from which virus family trees are constructed.
Every known subtype of h and n has been identified and numbered, and every known strain has been labelled as well, with the city or the place-name where it was first isolated. If you got the flu last winter, for example, chances are you came down with h3n2 A/Wuhan/359/95; that is, a virus with No. 3 hemagglutinin, No. 2 neuraminidase, which was the three-hundred-and-fifty-ninth sample isolated from the Wuhan area of China in 1995. (The Wuhans were very big last year.) If you got the flu two years ago, on the other hand, chances are that you came down with something very similar to h3n2 A/Johannesburg/33/94.
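Purely as a reader's aid, here is a minimal sketch of how a label in that format breaks apart; the field layout is simply the one described above, and the function and its names are illustrative assumptions rather than anything the surveillance labs actually use.
# Illustrative parser for the naming convention described in the text,
# e.g. "h3n2 A/Wuhan/359/95": subtype, then type/place/sample-number/year.
def parse_strain(label: str) -> dict:
    subtype, name = label.split(" ", 1)
    flu_type, place, sample, year = name.split("/")
    return {
        "subtype": subtype.upper(),    # e.g. H3N2
        "type": flu_type,              # influenza type A
        "place": place,                # where the sample was first isolated
        "sample_number": int(sample),  # e.g. the 359th isolate from that area
        "year": 1900 + int(year),      # two-digit year; works for these nineties labels
    }
print(parse_strain("h3n2 A/Wuhan/359/95"))
print(parse_strain("h3n2 A/Johannesburg/33/94"))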
At the Flu Meeting, the C.D.C. presented a road map of where the virus had travelled, and what forms it had taken during the previous year. It's a fantastically detailed account, in which the flu virus comes across as a malevolent hitchhiker, stopping only to infect the locals before moving on. "February, there was a ship outbreak, the U.S.S. Arkansas, so severe that they brought the ship back into port," Helen Regnery, the chief of the C.D.C.'s surveillance section, told the meeting as she explained the travels of the h3n2 strain throughout America last year. "The people on board the ship, almost one hundred per cent, were ill, with varying degrees of severity of illness." On an overhead projector was a list of all the known offspring of the American h3n2 family, and Regnery pointed to another strain. "The Alaska/02 was an isolate in July. It is from a sporadic case and it has been sequenced, and will be on the sequencing tree. Hawaii in July had a nursing-home outbreak and increased activities.... Wisconsin, at a university, had an outbreak in September. New York/43 is from an H.I.V.-positive patient in November." New Jersey followed, and then Indiana and Texas.
Regnery's road map was intended to give the F.D.A. and vaccine makers a guide to the upcoming flu season. Vaccines consist, essentially, of a dose of virus that has been chemically deactivated, so that it will stimulate the immune system without causing disease. She was helping them to decide what virus strain to use. But if you want to inoculate a hundred million people you've got to grow enough virus to make a hundred million flu shots, and that takes time. Drug companies grow the virus in chicken eggs, injecting a microscopic droplet of flu virus into the air sac above the embryo and the yolk. There, in the nutrient-rich membrane of the sac, the virus grows until, after two or three days, the original droplet has become a tablespoonful. At that point, the tops of the eggs are lopped off and the virus is suctioned out. Mary Ritchey, an executive at Wyeth-Ayerst, one of the nation's biggest flu-vaccine makers, told me that her company might use a hundred and fifty thousand eggs at a time, from which it might harvest two hundred and fifty gallons of pure virus. To supply the entire country with enough virus, vaccine makers have to do dozens of those batches, totalling millions of eggs. Then they have to purify the virus, test it, run it by the F.D.A., and then have it packaged, labelled, and sent to clinics around the country--all of which takes at least six months.
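A bit of back-of-the-envelope arithmetic gives a feel for the scale. In the sketch below, the per-batch figures come from the paragraph above, while the number of batches is an assumed stand-in for "dozens":
# Back-of-the-envelope numbers from the text: one production run might use
# 150,000 eggs and yield about 250 gallons of pure virus. "Dozens" of such
# batches are needed; 24 is an assumed stand-in for "dozens".
eggs_per_batch = 150_000
gallons_per_batch = 250
batches = 24  # assumption, not a figure from the article
total_eggs = eggs_per_batch * batches        # 3,600,000 eggs -- "millions of eggs"
total_gallons = gallons_per_batch * batches  # 6,000 gallons of harvested virus
print(total_eggs, total_gallons)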
If the drug companies are going to have a flu shot ready for the fall flu season, then, they have to be told what strains to use by February or March. That means that the W.H.O.'s international surveillance teams have to guess what's going to happen in the fall based on what they have seen the previous winter. There was a time, ten or twenty years ago, when this process was notoriously inexact: a flu shot might be prepared in the summer that offered only marginal protection against the flu strains that surfaced the following fall. With an improved surveillance system and more sophisticated genetic analysis, though, that has now changed. Every year, the C.D.C. gives itself a grade based on how closely the guesses made at the Flu Meeting correlate with the actual flu in the fall. For the last four years, those grades have been perfect.
If something like the Spanish flu ever came back, this is the system we are relying on to protect us. Right now, virtually all the flu in the human population is either h1n1 or h3n2, so the road map presented by the C.D.C. at the Flu Meeting was almost entirely an account of genetic drift within those two families. The minute that the C.D.C. or a W.H.O. laboratory received a flu that didn't fall into the h1n1 or h3n2 families, it would sound the alarm. The surveillance system is also specifically focussed on those parts of the world where flu is prevalent and the inter-species movement that creates pandemic strains is more likely to occur. That means China, where there are as many ducks as people, and where pigs are often raised on farms in close proximity to wild and domestic poultry. China has been the source of the last two pandemics, and most observers think it likely that the next will be from there as well, possibly arising out of the marshy resting sites for ducks both along the nation's eastern seaboard and inland in an arc extending from Gansu Province to Guangxi, on the southern coast. Over the past few years, the Centers for Disease Control has funded ten flu laboratories in China. The number of strains sent to the C.D.C. from China every year has now reached two hundred, up from about a dozen several years ago.
Perhaps more important, flu-watchers have a sense of when to be on the lookout for new and vicious flu strains, because any kind of major social upheaval can serve as a pandemic breeding ground. This is probably what happened with the Spanish flu: the 1918 virus was the result of a shift to h1n1. But that alone doesn't explain its lethality. In the 1957 Asian-flu epidemic, h1n1 shifted to h2n2, and not nearly as many people died. The difference was probably the First World War.
As the Amherst College biologist Paul Ewald argues in his brilliant 1994 book, "Evolution of Infectious Diseases," under normal circumstances the mildest offspring of any flu family will always triumph, because people who are infected with the worst strains go home and go to bed, whereas people infected with the mild strains go to work, ride the bus, and go to the movies. You're much more likely, in other words, to catch a mild virus than a nasty virus because you're more likely to run into someone with a mild case of flu than with a nasty case of flu. In 1918, Ewald says, these rules got inverted by the war. The Spanish flu turned nasty in the late summer in France. A mild strain of flu spreading from soldier to soldier in the trenches stayed in the trenches because none of the soldiers got so sick that they had to leave their posts. A debilitating strain, though, resulted in a soldier's being shipped out in a crowded troop transport, then moved to an even more crowded hospital, where he had every opportunity to infect others. Wars and refugee camps and urban overcrowding give the worst flu strains a huge evolutionary advantage. If there were ever again a civil war in China, flu-watchers would be on full alert.
It doesn't take much, however, to see that our pandemic preparedness is not foolproof. What if, for example, the new strain emerges not in the spring but in midsummer? How, under those circumstances, could a vaccine be made in time for the fall, which is when--for reasons that are unclear--any flu in temperate zones tends to strike in earnest? And what if it didn't emerge from China, where there is good surveillance, but from Africa, say, where neither the C.D.C. nor any other W.H.O. center has the infrastructure to monitor flu strains? Most troubling, though, is that knowing a virus's type and source doesn't tell you nearly enough.
On May 10th of this year, for example, a three-year-old boy from Hong Kong's New Territories came down with the flu. He died on May 21st with what looked like an unusual and severe case of viral pneumonia, compounded by Reye's syndrome. A routine respiratory sample was taken from the boy's body and analyzed at Hong Kong's Queen Mary Hospital. It didn't seem to be either h1n1 or h3n2, though, and the doctor, puzzled, forwarded the sample to the Centers for Disease Control in early July, and also to other flu labs in the Netherlands and London. "The doctor said she had a virus that was reacting differently with the reagents," the C.D.C.'s Helen Regnery told me. "This happens sometimes, and all viruses that don't behave well are sent here right away. But at the time the kind of reaction was such that we thought it might be a human strain. Then we got another batch from Hong Kong reacting the same way. Now it was a red flag: she'd identified two others. She E-mailed me, asking if we could confirm that particular isolate." This was on the first Friday in August. The lab staff worked through the weekend. On Monday, Regnery got a call from researchers at the flu lab in the Netherlands. They had identified the boy's virus. It was a pure avian flu, one that had never been seen in humans before: h5n1. "He may have had direct contact with chickens that were sick," Regnery said. "They had chickens at his preschool. We found this out in a conference call the other night. The epidemiologists we have over there tracked down the day-care center, and they found some sick chickens."
When I visited the C.D.C. recently, Regnery and other senior C.D.C. officials I spoke with were careful not to be alarmist when they discussed the case. The two subsequent isolates sent from Hong Kong turned out to be human flu, they told me, and none of the other children at the preschool got sick. Some members of the boy's family had mild respiratory ailments immediately before his death, but none appeared to have caught the same flu. A C.D.C. team of three epidemiologists and a virologist recently returned from a three-week trip to Hong Kong and mainland China to help the local health authorities investigate and collect serum samples to see if anyone else was exposed, but so far nothing has come up. "For there to be a pandemic, there has to be a strain to which all or most of the population has no immunity, and that is capable of spreading from person to person," Nancy Arden, a senior epidemiologist at the C.D.C., told me. "So far, this doesn't meet the second criterion."
Nonetheless, the situation was a little disturbing. Ducks fly across virtually every continent in the world, dive-bombing the landmass with flu virus. They pass it to chickens and chickens come in close contact with humans every day--on farms, in poultry markets, in chicken-processing plants. If avian flu can't infect humans, of course, this is irrelevant. But what if the Hong Kong case means that there may now be strains of avian flu which can infect humans directly? Arden said, "There's still a question of whether avian strains may be evolving to the point where they can replicate in humans and where a strain could be transmitted from bird to person. And, once an avian virus gets into a mammal, it's possible that the evolution would be speeded up. No one's so alarmed that they're saying this is the start of the next pandemic. But it's not something anyone would want to be complacent about."
Then, there's the type of flu the boy got. Avian h5 is famous among virologists as the strain that passed to domestic chickens in Pennsylvania in 1983, apparently from wild ducks. Originally, it was harmless. But as it raced from chicken to chicken in the giant commercial chicken warehouses it underwent an unusual mutation. Instead of just infecting the cells of the chicken's intestinal tract, the virus became systemic--capable of infecting all the cells in a chicken's body. The eyes of the chickens became swollen. The chickens had difficulty breathing. They stopped laying eggs. They became weak, and in some cases blood spots appeared in their eyes and on their legs. Upon autopsy, it turned out that they had been hemorrhaging throughout their bodies. In a matter of months, seventeen million chickens died or had to be destroyed. "It was chicken Ebola," the flu expert Robert Webster told me. The little boy's h5 doesn't seem capable of the same destruction in humans. But is there any other type of h5 that would be? And, if there is, what genetic clues in the virus would tip us off? This is the big unanswered question behind our plans for the next pandemic, and it is also, of course, the big unanswered question that drives the search for the Spanish flu. What is it, specifically, that turns influenza into a killer?
To get to Longyearbyen, you fly from Oslo for about an hour and a half to Tromsø, a small town on Norway's northern coast, and then for another ninety minutes over the Norwegian Sea to the Longyearbyen airport. The second leg of the trip retraces by air the route that the Forsete took seventy-nine years ago. It is an extraordinary journey. First, the choppy, frigid waters of the Norwegian Sea and then, out of the Arctic mist, the forbidding mountains and glaciers of Svalbard. Longyearbyen is at Svalbard's southernmost point, huddled on the fringes of the island of Spitsbergen, a small gray stain in a blanket of white. On a clear day, from the air, it seems as if you could see the North Pole.
All this, of course, is what is so strange about the Spanish flu--that after killing so many it must now be sought out at the ends of the earth. Crosby, in the final chapter of his book on the pandemic, wonders about the disappearance of the pandemic from the American memory as well. In the Readers' Guide to Periodical Literature, 1919-21, he reports, there are thirteen inches of column space devoted to citations of baseball stories, forty-seven inches devoted to Prohibition, twenty inches devoted to Bolshevism, and eight inches devoted to the flu. John Dos Passos, who crossed the Atlantic on a troopship on which soldiers were dying of the Spanish flu every day, has just one reference to the flu in his novel "1919" and a brief mention of the pandemic in his fictionalized war memoir, "Three Soldiers." The pandemic is largely absent from the writing of Fitzgerald, Faulkner, and Hemingway as well, all of whom witnessed its savagery at first hand. "The average college graduate born since 1918 literally knows more about the Black Death of the fourteenth century than the World War I pandemic," Crosby writes. He offers a number of explanations for this. In the end, though, he concludes that the virus's figurative disappearance is of a piece with its literal disappearance, that we don't remember it because we can't find it. The Longyearbyen expedition is, if nothing else, an attempt to recover our memory of the Spanish flu.
Kirsty Duncan made the trip to Svalbard for the first time last spring. She had written to the governor of Svalbard seven months before, who had, in turn, approached the Norwegian medical-research community, the church in Longyearbyen, the church council, the bishop, the town council, and the victims' families and secured approval from each. She flew to Norway in May. "I had been in Longyearbyen about a day before I went to see the minister of the church, and I was really concerned about meeting him, because of what I was asking to do," she told me. "I introduced myself, and I said, 'I hope in no way have I offended you or the church,' and he said, 'No. This is exciting, this is important work. It has to be done,' and I was so relieved, and then he asked me if I had been to the cemetery, and I said no, that I had no right to go there until I had spoken to him, and he said, 'You go.'"
The cemetery is a ten-minute walk from the church, along a gravel road that runs parallel to the mountain. You walk away from the water and the docks, and toward the glacier, and for the entire walk the cemetery is straight ahead--a lonely stand of crosses climbing up the side of the mountain. When Duncan talked about that walk from the church to the graveyard, she looked away, her eyes misting up and her voice catching with emotion. "It was May, and everything was completely ice-covered. Completely white. Longyearbyen is in a valley, and the cemetery is up on the side of the valley floor. I knew that the seven graves I was interested in were the last seven graves at the top of the cemetery, and walking up there"--she stopped for a moment--"walking up there was really hard. I was just one year older than the oldest of them, and going to look at them made me realize that they had just come of age. You think about how they were just beginning their lives. And then you see those crosses."
The Pima Paradox
February 2, 1998
ANNALS OF MEDICINE
Can we learn how to lose weight from one of
the most obese people in the world?
1.
Sacaton lies in the center of Arizona, just off Interstate 10, on the Gila River reservation of the Pima Indian tribe. It is a small town, dusty and unremarkable, which looks as if it had been blown there by a gust of desert wind. Shacks and plywood bungalows are scattered along a dirt-and-asphalt grid. Dogs crisscross the streets. Back yards are filled with rusted trucks and junk. The desert in these parts is scruffy and barren, drained of water by the rapid growth of Phoenix, just half an hour's drive to the north. The nearby Gila River is dry, and the fields of wheat and cushaw squash and tepary beans which the Pima used to cultivate are long gone. The only prepossessing building in Sacaton is a gleaming low-slung modern structure on the outskirts of town--the Hu Hu Kam Memorial Hospital. There is nothing bigger or more impressive for miles, and that is appropriate, since medicine is what has brought Sacaton any wisp of renown it has.
Thirty-five years ago, a team of National Institutes of Health researchers arrived in Sacaton to study rheumatoid arthritis. They wanted to see whether the Pima had higher or lower rates of the disease than the Blackfoot of Montana. A third of the way through their survey, however, they realized that they had stumbled on something altogether strange--a population in the grip of a plague. Two years later, the N.I.H. returned to the Gila River Indian Reservation in force. An exhaustive epidemiological expedition was launched, in which thousands of Pima were examined every two years by government scientists, their weight and height and blood pressure checked, their blood sugar monitored, and their eyes and kidneys scrutinized. In Phoenix, a modern medical center devoted to Native Americans was built; on its top floor, the N.I.H. installed a state-of-the-art research lab, including the first metabolic chamber in North America--a sealed room in which to measure the precise energy intake and expenditure of Pima research subjects. Genetic samples were taken; family histories were mapped; patterns of illness and death were traced from relative to relative and generation to generation. Today, the original study group has grown from four thousand people to seven thousand five hundred, and so many new studies have been added to the old that the total number of research papers arising from the Gila River reservation takes up almost forty feet of shelf space in the N.I.H. library in Phoenix.
The Pima are famous now--famous for being fatter than any other group in the world, with the exception only of the Nauru islanders of the West Pacific. Among those over thirty-five on the reservation, the rate of diabetes, the disease most closely associated with obesity, is fifty per cent, eight times the national average and a figure unmatched in medical history. It is not unheard of in Sacaton for adults to weigh five hundred pounds, for teen-agers to be suffering from diabetes, or for relatively young men and women to be already disabled by the disease--to be blind, to have lost a limb, to be confined to a wheelchair, or to be dependent on kidney dialysis.
When I visited the town, on a monotonously bright desert day not long ago, I watched a group of children on a playing field behind the middle school moving at what seemed to be half speed, their generous shirts and baggy jeans barely concealing their bulk. At the hospital, one of the tribe's public-health workers told me that when she began an education program on nutrition several years ago she wanted to start with second graders, to catch the children before it was too late. "We were under the delusion that kids didn't gain weight until the second grade," she said, shaking her head. "But then we realized we'd have to go younger. Those kids couldn't run around the block."
From the beginning, the N.I.H. researchers have hoped that if they can understand why the Pima are so obese they can better understand obesity in the rest of us; the assumption is that obesity in the Pima is different only in degree, not in kind. One hypothesis for the Pima's plight, favored by Eric Ravussin, of the N.I.H.'s Phoenix team, is that after generations of living in the desert the only Pima who survived famine and drought were those highly adept at storing fat in times of plenty. Under normal circumstances, this disposition was kept in check by the Pima's traditional diet: cholla-cactus buds, honey mesquite, povertyweed, and prickly pears from the desert floor; mule deer, white-winged dove, and black-tailed jackrabbit; squawfish from the Gila River; and wheat, squash, and beans grown in irrigated desert fields. By the end of the Second World War, however, the Pima had almost entirely left the land, and they began to eat like other Americans. Their traditional diet had been fifteen to twenty per cent fat. Their new diet was closer to forty per cent fat. Famine, which had long been a recurrent condition, gave way to permanent plenty, and so the Pima's "thrifty" genes, once an advantage, were now a liability. N.I.H. researchers are trying to find these genes, on the theory that they may be the same genes that contribute to obesity in the rest of us. Their studies at Sacaton have also uncovered valuable clues to how diabetes works, how obesity in pregnant women affects their children, and how human metabolism is altered by weight gain. All told, the collaboration between the N.I.H. and the Pima is one of the most fruitful relationships in modern medical science--with one fateful exception. After thirty-five years, no one has had any success helping the Pima lose weight. For all the prodding and poking, the hundreds of research papers describing their bodily processes, and the determined efforts of health workers, year after year the tribe grows fatter.
"I used to be a nurse, I used to work in the clinic, I used to be all gung ho about going out and teaching people about diabetics and obesity," Teresa Wall, who heads the tribe's public-health department, told me. "I thought that was all people needed--information. But they weren't interested. They had other issues." Wall is a Pima, short and stocky, who has long, straight black hair, worn halfway down her back. She spoke softly. "There's something missing. It's one thing to say to people, 'This is what you should do.' It's another to actually get them to take it in."
The Pima have built a new wellness center in downtown Sacaton, with a weight room and a gymnasium. They now have an education program on nutrition aimed at preschoolers and first graders, and at all tribal functions signs identify healthful food choices--a tray of vegetables or of fruit, say. They are doing, in other words, what public-health professionals are supposed to be doing. But results are hard to see.
"We've had kids who were diabetic, whose mothers had diabetes and were on dialysis and had died of kidney failure," one of the tribe's nutritionists told me. "You'd think that that would make a difference--that it would motivate them to keep their diet under control. It doesn't." She got up from her desk, walked to a bookshelf, and pulled out two bottles of Coca-Cola. One was an old glass bottle. The other was a modern plastic bottle, which towered over it. "The original Coke bottle, in the nineteen-thirties, was six and a half ounces." She held up the plastic bottle. "Now they are marketing one litre as a single serving. That's five times the original serving size. The McDonald's regular hamburger is two hundred and sixty calories, but now you've got the double cheeseburger, which is four hundred and forty-five calories. Portion sizes are getting way out of whack. Eating is not about hunger anymore. The fact that people are hungry is way down on the list of why they eat." I told her that I had come to Sacaton, the front lines of the weight battle, in order to find out what really works in fighting obesity. She looked at me and shrugged. "We're the last people who could tell you that," she said.
In the early nineteen-sixties, at about the time the N.I.H. team stumbled on the Pima, seventeen per cent of middle-aged Americans met the clinical definition of obesity. Today, that figure is 32.3 per cent. Between the early nineteen-seventies and the early nineteen-nineties, the percentage of preschool girls who were overweight went from 5.8 per cent to ten per cent. The number of Americans who fall into what epidemiologists call Class Three Obesity--that is, people too grossly overweight, say, to fit into an airline seat--has risen three hundred and fifty per cent in the past thirty years. "We've looked at trends by educational level, race, and ethnic group, we've compared smokers and non-smokers, and it's very hard to say that there is any group that is not experiencing this kind of weight gain," Katherine Flegal, a senior research epidemiologist at the National Center for Health Statistics, says. "It's all over the world. In China, the prevalence of obesity is vanishingly low, yet they are showing an increase. In Western Samoa, it is very high, and they are showing an increase." In the same period, science has unlocked many of obesity's secrets, the American public has been given a thorough education in the principles of good nutrition, health clubs have sprung up from one end of the country to another, dieting has become a religion, and health food a marketing phenomenon. None of it has mattered. It is the Pima paradox: in the fight against obesity all the things that worked in curbing behaviors like drunk driving and smoking and in encouraging things like safe sex and the use of seat belts--education, awareness, motivation--don't seem to work. For one reason or another, we cannot stop eating. "Since many people cannot lose much weight no matter how hard they try, and promptly regain whatever they do lose," the editors of The New England Journal of Medicine wearily concluded last month, "the vast amount of money spent on diet clubs, special foods and over-the-counter remedies, estimated to be on the order of $30 billion to $50 billion yearly, is wasted." Who could argue? If the Pima--who are surrounded by the immediate and tangible consequences of obesity, who have every conceivable motivation--can't stop themselves from eating their way to illness, what hope is there for the rest of us?
In the scientific literature, there is something called Gourmand Syndrome--a neurological condition caused by anterior brain lesions and characterized by an unusual passion for eating. The syndrome was described in a recent issue of the journal Neurology, and the irrational, seemingly uncontrollable obsession with food evinced by its victims seems a perfect metaphor for the irrational, apparently uncontrollable obsession with food which seems to have overtaken American society as a whole. Here is a diary entry from a Gourmand Syndrome patient, a fifty-five-year-old stroke victim who had previously displayed no more than a perfunctory interest in food.
After I could stand on my feet again, I dreamt to go downtown and sit down in this well-known restaurant. There I would get a beer, sausage, and potatoes. Slowly my diet improved again and thus did quality of life. The day after discharge, my first trip brought me to this restaurant, and here I order potato salad, sausage, and a beer. I feel wonderful. My spouse anxiously registers everything I eat and nibble. It irritates me. A few steps down the street, we enter a coffee-house. My hand is reaching for a pastry, my wife's hand reaches between. Through the window I see my bank. If I choose, I could buy all the pastry I wanted, including the whole store. The creamy pastry slips from the foil like a mermaid. I take a bite.
2.
Is there an easy way out of this problem? Every year, millions of Americans buy books outlining new approaches to nutrition and diet, nearly all of which are based on the idea that overcoming our obsession with food is really just a matter of technique: that the right foods eaten in the right combination can succeed where more traditional approaches to nutrition have failed. A cynic would say, of course, that the seemingly endless supply of these books proves their lack of efficacy, since if one of these diets actually worked there would be no need for another. But that's not quite fair. After all, the medical establishment, too, has been giving Americans nutritional advice without visible effect. We have been told that we must not take in more calories than we burn, that we cannot lose weight if we don't exercise consistently, that an excess of eggs, red meat, cheese, and fried food clogs arteries, that fresh vegetables and fruits help to ward off cancer, that fibre is good and sugar is bad and whole-wheat bread is better than white bread. That few of us are able to actually follow this advice is either our fault or the fault of the advice. Medical orthodoxy, naturally, tends toward the former position. Diet books tend toward the latter. Given how often the medical orthodoxy has been wrong in the past, that position is not, on its face, irrational. It's worth finding out whether it is true.
Arguably the most popular diet of the moment, for example, is one invented by the biotechnology entrepreneur Barry Sears. Sears's first book, "The Zone," written with Bill Lawren, sold a million and a half copies and has been translated into fourteen languages. His second book, "Mastering the Zone," was on the best-seller lists for eleven weeks. Madonna is rumored to be on the Zone diet, and so are Howard Stern and President Clinton, and if you walk into almost any major bookstore in the country right now Sears's two best-sellers--plus a new book, "Zone Perfect Meals in Minutes"--will quite likely be featured on a display table near the front. They are ambitious books, filled with technical discussions of food chemistry, metabolism, evolutionary theory, and obscure scientific studies, all apparently serving as proof of the idea that through careful management of "the most powerful and ubiquitous drug we have: food" we can enter a kind of high-efficiency, optimal metabolic state--the Zone.
The key to entering the Zone, according to Sears, is limiting your carbohydrates. When you eat carbohydrates, he writes, you stimulate the production of insulin, and insulin is a hormone that evolved to put aside excess carbohydrate calories in the form of fat in case of future famine. So the insulin that's stimulated by excess carbohydrates aggressively promotes the accumulation of body fat. In other words, when we eat too much carbohydrate, we're essentially sending a hormonal message, via insulin, to the body (actually to the adipose cells). The message: "Store fat."
His solution is a diet in which carbohydrates make up no more than forty per cent of all calories consumed (as opposed to the fifty per cent or more consumed by most Americans), with fat and protein coming to thirty per cent each. Maintaining that precise four-to-three ratio between carbohydrates and protein is, in Sears's opinion, critical for keeping insulin in check. "The Zone" includes all kinds of complicated instructions to help readers figure out how to do things like calculate their precise protein requirements in restaurants. ("Start with the protein, using the palm of your hand as a guide. The amount of protein that can fit into your palm is usually four protein blocks. That's about one chicken breast or 4 ounces sliced turkey.")
It should be said that the kind of diet Sears suggests is perfectly nutritious. Following the Zone diet, you'll eat lots of fibre, fresh fruit, fresh vegetables, and fish, and very little red meat. Good nutrition, though, isn't really the point. Sears's argument is that being in the Zone can induce permanent weight loss--that by controlling carbohydrates and the production of insulin you can break your obsession with food and fundamentally alter the way your body works. "Weight loss . . . can be an ongoing and usually frustrating struggle for most people," he writes. "In the Zone it is painless, almost automatic."
Does the Zone exist? Yes and no. Certainly, if people start eating a more healthful diet they'll feel better about themselves. But the idea that there is something magical about keeping insulin within a specific range is a little strange. Insulin is simply a hormone that regulates the storage of energy. Precisely how much insulin you need to store carbohydrates is dependent on all kinds of things, including how fit you are and whether, like many diabetics, you have a genetic predisposition toward insulin resistance. Generally speaking, the heavier and more out of shape you are, the more insulin your body needs to do its job. The Pima have a problem with obesity and that makes their problem with diabetes worse--not the other way around. High levels of insulin are the result of obesity. They aren't the cause of obesity. When I read the insulin section of "The Zone" to Gerald Reaven, an emeritus professor of medicine at Stanford University, who is acknowledged to be the country's leading insulin expert, I could hear him grinding his teeth. "I had the experience of being on a panel discussion with Sears, and I couldn't believe the stuff that comes out of this guy's mouth," he said. "I think he's full of it."
What Sears would have us believe is that when it comes to weight loss your body treats some kinds of calories differently from others--that the combination of the food we eat is more critical than the amount. To this end, he cites what he calls an "amazing" and "landmark" study published in 1956 in the British medical journal Lancet. (It should be a tipoff that the best corroborating research he can come up with here is more than forty years old.) In the study, a couple of researchers compared the effects of two different thousand-calorie diets--the first high in fat and protein and low in carbohydrates, and the second low in fat and protein and high in carbohydrates--on two groups of obese men. After eight to ten days, the men on the low-carbohydrate diet had lost more weight than the men on the high-carbohydrate diet. Sears concludes from the study that if you want to lose weight you should eat protein and shun carbohydrates. Actually, it shows nothing of the sort. Carbohydrates promote water retention; protein acts like a diuretic. Over a week or so, someone on a high-protein diet will always look better than someone on a high-carbohydrate diet, simply because of dehydration. When a similar study was conducted several years later, researchers found that after about three weeks--when the effects of dehydration had evened out--the weight loss on the two diets was virtually identical. The key isn't how you eat, in other words; it's how much you eat. Calories, not carbohydrates, are still what matters. The dirty little secret of the Zone system is that, despite Sears's expostulations about insulin, all he has done is come up with another low-calorie diet. He doesn't do the math for his readers, but some nutritionists have calculated that if you follow Sears's prescriptions religiously you'll take in at most seventeen hundred calories a day, and at seventeen hundred calories a day virtually anyone can lose weight. The problem with low-calorie diets, of course, is that no one can stay on them for very long. Just ask Sears. "Diets based on choice restriction and calorie limits usually fail," he writes in the second chapter of "The Zone," just as he is about to present his own choice-restricted and calorie-limited diet. "People on restrictive diets get tired of feeling hungry and deprived. They go off their diets, put the weight back on (primarily, as increased body fat) and then feel bad about themselves for not having enough will power, discipline, or motivation."
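Since Sears doesn't do the math, here is a minimal sketch of it, assuming the roughly seventeen-hundred-calorie day the nutritionists arrive at and the textbook conversion of four calories per gram of carbohydrate or protein and nine per gram of fat (those conversion figures are standard values, not something Sears supplies):
# The Zone split described above: 40% of calories from carbohydrate,
# 30% from protein, 30% from fat, applied to the ~1,700-calorie day that
# nutritionists say the plan works out to.
daily_calories = 1700
carb_cal = 0.40 * daily_calories     # 680 calories
protein_cal = 0.30 * daily_calories  # 510 calories
fat_cal = 0.30 * daily_calories      # 510 calories
print(carb_cal / protein_cal)                      # 1.33... -- the four-to-three ratio
print(carb_cal / 4, protein_cal / 4, fat_cal / 9)  # ~170 g, ~128 g, ~57 g per day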
These are not, however, the kinds of contradiction that seem to bother Sears. His first book's dust jacket claims that in the Zone you can "reset your genetic code" and "burn more fat watching TV than by exercising." By the time he's finished, Sears has held up his diet as the answer to virtually every medical ill facing Western society, from heart disease to cancer and on to alcoholism and PMS. He writes, "Dr. Paul Kahl, the same physician with whom I did the AIDS pilot study"--yes, Sears's diet is just the thing for AIDS, too--"told me the story of one of his patients, a fifty-year-old woman with MS."
Paul put her on a Zone-favorable diet, and after a few months on the program she came in for a checkup. Paul asked the basic question: "How are you feeling?" Her answer was "Great!" Noticing that she was still using a cane for stability, Paul asked her, "If you're feeling so great, why are you still using the cane?" Her only response was that since developing MS she always had. Paul took the cane away and told her to walk to the end of the hallway and back. After a few tentative steps, she made the round trip quickly. When Paul asked her if she wanted her cane back, she just smiled and told him to keep it for someone who really needed it.
Put down your carbohydrates and walk!
It is hard, while reading this kind of thing, to escape the conclusion that what is said in a diet book somehow matters less than how it's said. Sears, after all, isn't the only diet specialist who seems to be making things up. They all seem to be making things up. But if you read a large number of popular diet books in succession, what is striking is that they all seem to be making things up in precisely the same way. It is as if the diet-book genre had an unspoken set of narrative rules and conventions, and all that matters is how skillfully those rules and conventions are adhered to. Sears, for example, begins fearful and despondent, his father dead of a heart attack at fifty-three, a "sword of Damocles" over his head. Judy Moscovitz, author of "The Rice Diet Report" (three months on the Times best-seller list), tells us, "I was always the fattest kid in the class, and I knew all the pain that only a fat kid can know.... I was always the last one reluctantly chosen for the teams." Martin Katahn, in his best-seller "The Rotation Diet," writes, "I was one of those fat kids who had no memory of ever being thin. Instead, I have memories such as not being able to run fast enough to keep up with my playmates, being chosen last for all games that required physical movement."
Out of that darkness comes light: the Eureka Moment, when the author explains how he stumbled on the radical truth that inspired his diet. Sears found himself in the library of the Boston University School of Medicine, reading everything he could on the subject: "I had no preconceptions, no base of knowledge to work from, so I read everything. I eventually came across an obscure report..." Rachael Heller, who was a co-author of the best-selling "The Carbohydrate Addict's Diet" (and, incidentally, so fat growing up that she was "always the last one picked for the team"), was at home in bed when her doctor called, postponing her appointment and thereby setting in motion an extraordinary chain of events that involved veal parmigiana, a Greek salad, and two French crullers: "I will always be grateful for that particular arrangement of circumstances.... Sometimes we are fortunate enough to recognize and take advantage of them, sometimes not. This time I did. I believe it saved my life." Harvey Diamond, the co-author of the three-million-copy-selling "Fit for Life," was at a music festival two thousand miles from home, when he happened to overhear two people in front of him discussing the theories of a friend in Santa Barbara: "'Excuse me,' I interrupted, 'who is this fellow you are discussing?' In less than twenty-four hours I was on my way to Santa Barbara. Little did I know that I was on the brink of one of the most remarkable discoveries of my life."
The Eureka Moment is followed, typically within a few pages, by the Patent Claim--the point at which the author shows why his Eureka Moment, which explains how weight can be lost without sacrifice, is different from the Eureka Moment of all those other diet books explaining how weight can be lost without sacrifice. This is harder than it appears. Dieters are actually attracted to the idea of discipline, because they attribute their condition to a failure of discipline. It's just that they know themselves well enough to realize that if a diet requires discipline they won't be able to follow it. At the same time, of course, even as the dieter realizes that what he is looking for--discipline without the discipline--has never been possible, he still clings to the hope that someday it might be. The Patent Claim must negotiate both paradoxes. Here is Sears, in his deft six-paragraph Patent Claim: "These are not unique claims. The proponents of every new diet that comes along say essentially the same thing. But if you're reading this book, you probably know that these diets don't really work." Why don't they work? Because they "violate the basic biochemical laws required to enter the Zone." Other diets don't have discipline. The Zone does. Yet, he adds, "The beauty of the dietary system presented in this book is that . . . it doesn't call for a great deal of the kind of unrealistic self-sacrifice that causes many people to fall off the diet wagon. . . . In fact, I can even show you how to stay within these dietary guidelines while eating at fast-food restaurants." It is the very discipline of the Zone system that allows its adherent to lose weight without discipline.
Or consider this from Adele Puhn's recent runaway best-seller, "The 5-Day Miracle Diet." America's No. 1 diet myth, she writes, is that "you have to deprive yourself to lose weight":
Even though countless diet programs have said you can have your cake and eat it, too, in your heart of hearts, you have that "nibbling" doubt: For a diet to really work, you have to sacrifice. I know. I bought into this myth for a long time myself. And the fact is that on every other diet, deprivation is involved. Motivation can only take you so far. Eventually you're going to grab for that extra piece of cake, that box of cookies, that cheeseburger and fries. But not the 5-Day Miracle Diet.
Let us pause and savor the five-hundred-and-forty-degree rhetorical triple gainer taken in those few sentences: (1) the idea that diet involves sacrifice is a myth; (2) all diets, to be sure, say that on their diets dieting without sacrifice is not a myth; (3) but you believe that dieting without sacrifice is a myth; (4) and I, too, believed that dieting without sacrifice is a myth; (5) because in fact on all diets dieting without sacrifice is a myth; (6) except on my diet, where dieting without sacrifice is not a myth.
The expository sequence that these books follow--last one picked, moment of enlightenment, assertion of the one true way--finally amounts to nothing less than a conversion narrative. In conception and execution, diet books are self-consciously theological. (Whom did Harvey Diamond meet after his impulsive, desperate mission to Santa Barbara? A man he will only identify, pseudonymously and mysteriously, as Mr. Jensen, an ethereal figure with "clear eyes, radiant skin, serene demeanor and well-proportioned body.") It is the appropriation of this religious narrative that permits the suspension of disbelief.
There is a more general explanation for all this in the psychological literature--a phenomenon that might be called the Photocopier Effect, after the experiments of the Harvard social scientist Ellen Langer. Langer examined the apparently common-sense idea that if you are trying to persuade someone to do something for you, you are always better off if you provide a reason. She went up to a group of people waiting in line to use a library copying machine and said, "Excuse me, I have five pages. May I use the Xerox machine?" Sixty per cent said yes. Then she repeated the experiment on another group, except that she changed her request to "Excuse me, I have five pages. May I use the Xerox machine, because I'm in a rush?" Ninety-four per cent said yes. This much sounds like common sense: if you say, "because I'm in a rush"--if you explain your need--people are willing to step aside. But here's where the study gets interesting. Langer then did the experiment a third time, in this case replacing the specific reason with a statement of the obvious: "Excuse me, I have five pages. May I use the Xerox machine, because I have to make some copies?" The percentage who let her do so this time was almost exactly the same as the one in the previous round--ninety-three per cent. The key to getting people to say yes, in other words, wasn't the explanation "because I'm in a rush" but merely the use of the word "because." What mattered wasn't the substance of the explanation but merely the rhetorical form--the conjunctional footprint--of an explanation.
Isn't this how diet books work? Consider the following paragraph, taken at random from "The Zone":
In paracrine hormonal responses, the hormone travels only a very short distance from a secreting cell to a target cell. Because of the short distance between the secreting cell and the target cell, paracrine responses don't need the long-distance capabilities of the bloodstream. Instead, they use the body's version of a regional system: the paracrine system. Finally, there are the autocrine hormone systems, analogous to the cord that links the handset of the phone to the phone itself. Here the secreting cells release a hormone that comes immediately back to affect the secreting cell itself.
Don't worry if you can't follow what Sears is talking about here--following isn't really the point. It is enough that he is using the word "because."
3.
If there is any book that defines the diet genre, however, it is "Dr. Atkins' New Diet Revolution." Here is the conversion narrative at its finest. Dr. Atkins, a humble corporate physician, is fat. ("I had three chins.") He begins searching for answers. ("One evening I read about the work that Dr. Garfield Duncan had done in nutrition at the University of Pennsylvania. Fasting patients, he reported, lose all sense of hunger after forty-eight hours without food. That stunned me. . . . That defied logic.") He tests his unorthodox views on himself. As if by magic, he loses weight. He tests his unorthodox views on a group of executives at A.T. & T. As if by magic, they lose weight. Incredibly, he has come up with a diet that "produces steady weight loss" while setting "no limit on the amount of food you can eat." In 1972, inspired by his vision, he puts pen to paper. The result is "Dr. Atkins' Diet Revolution," one of the fifty best-selling books of all time. In the early nineties, he publishes "Dr. Atkins' New Diet Revolution," which sells more than three million copies and is on the Times best-seller list for almost all of 1997. More than two decades of scientific research into health and nutrition have elapsed in the interim, but Atkins' message has remained the same. Carbohydrates are bad. Everything else is good. Eat the hamburger, hold the bun. Eat the steak, hold the French fries. Here is the list of ingredients for one of his breakfast "weight loss" recommendations: scrambled eggs for six. Keep in mind that Atkins is probably the most influential diet doctor in the world.
12 link sausages (be sure they contain no sugar)
1 3-ounce package cream cheese
1 tablespoon butter
3/4 cup cream
1/4 cup water
1 teaspoon seasoned salt
2 teaspoons parsley
8 eggs, beaten
Atkins' Patent Claim centers on the magical weight-loss properties of something called "ketosis." When you eat carbohydrates, your body converts them into glycogen and stores them for ready use. If you are deprived of carbohydrates, however, your body has to turn to its own stores of fat and muscle for energy. Among the intermediate metabolic products of this fat breakdown are ketones, and when you produce lots of ketones, you're in ketosis. Since an accumulation of these chemicals swiftly becomes toxic, your body works very hard to get rid of them, either through the kidneys, as urine, or through the lungs, by exhaling, so people in ketosis commonly spend a lot of time in the bathroom and have breath that smells like rotten apples. Ketosis can also raise the risk of bone fracture and cardiac arrhythmia and can result in light-headedness, nausea, and the loss of nutrients like potassium and sodium. There is no doubt that you can lose weight while you're in ketosis. Between all that protein and those trips to the bathroom, you'll quickly become dehydrated and drop several pounds just through water loss. The nausea will probably curb your appetite. And if you do what Atkins says, and suddenly cut out virtually all carbohydrates, it will take a little while for your body to compensate for all those lost calories by demanding extra protein and fat. The weight loss isn't permanent, though. After a few weeks your body adjusts, and the weight--and your appetite--comes back.
For Atkins, however, ketosis is as "delightful as sex and sunshine," which is why he wants dieters to cut out carbohydrates almost entirely. (To avoid bad breath he recommends carrying chlorophyll tablets and purse-size aerosol breath fresheners at all times; to avoid other complications, he recommends regular blood tests.) Somehow, he has convinced himself that his kind of ketosis is different from the bad kind of ketosis, and that his ketosis can actually lead to permanent weight loss. Why he thinks this, however, is a little unclear. In "Dr. Atkins' Diet Revolution" he thought that the key was in the many trips to the bathroom: "Hundreds of calories are sneaked out of your body every day in the form of ketones and a host of other incompletely broken down molecules of fat. You are disposing of these calories not by work or violent exercise--but just by breathing and allowing your kidneys to function. All this is achieved merely by cutting out your carbohydrates." Unfortunately, the year after that original edition of Atkins' book came out, the American Medical Association published a devastating critique of this theory, pointing out, among other things, that ketone losses in the urine and the breath rarely exceed a hundred calories a day--a quantity, the A.M.A. pointed out, "that could not possibly account for the dramatic results claimed for such diets." In "Dr. Atkins' New Diet Revolution," not surprisingly, he's become rather vague on the subject, mysteriously invoking something he calls Fat Mobilizing Substance. Last year, when I interviewed him, he offered a new hypothesis: that ketosis takes more energy than conventional food metabolism does, and that it is "a much less efficient pathway to burn up your calories via stored fat than it is via glucose." But he didn't want to be pinned down. "Nobody has really been able to work out that mechanism as well as I would have liked," he conceded.
Atkins is a big, white-haired man in his late sixties, well over six feet, with a barrel chest and a gruff, hard-edged voice. On the day we met, he was wearing a high-lapelled, four-button black suit. Given a holster and a six-shooter, he could have passed for the sheriff in a spaghetti western. He is an intimidating figure, his manner brusque and impatient. He gives the impression that he doesn't like having to explain his theories, that he finds the details tedious and unnecessary. Given the Photocopier Effect, of course, he is quite right. The appearance of an explanation is more important than the explanation itself. But Atkins seems to take this principle farther than anyone else.
For example, in an attempt to convince his readers that eating pork chops, steaks, duck, and rack of lamb in abundance is good for them, Atkins points out that primitive Eskimo cultures had virtually no heart disease, despite a high-fat diet of fish and seal meat. But one obvious explanation for the Eskimo paradox is that cold-water fish and seal meat are rich in n-3 fatty acids--the "good" kind of fat. Red meat, on the other hand, is rich in saturated fat--the "bad" kind of fat. That dietary fats come in different forms, some of which are particularly bad for you and some of which are not, is the kind of basic fact that seventh graders are taught in Introduction to Nutrition. Atkins has a whole chapter on dietary fat in "New Diet Revolution" and doesn't make the distinction once. All diet-book authors profit from the Photocopier Effect. Atkins lives it.
I watched Atkins recently as he conducted his daily one-hour radio show on New York's WEVD. We were in a Manhattan town house in the East Fifties, where he has his headquarters, in a sleek, modernist office filled with leather furniture and soapstone sculpture. He sat behind his desk--John Wayne in headphones--as his producer perched in front of him. It was a bravura performance. He spoke quickly and easily, glancing at his notes only briefly, and then deftly gave counsel to listeners around the region.
The first call came from George, on his car phone. George told Atkins his ratio of triglycerides to cholesterol. It wasn't good. George was a very unhealthy man. "You're in big trouble," Atkins said. "You have to change your diet. What do you generally eat? What's your breakfast?"
"I've stopped taking junk foods," George says. "I don't eat eggs. I don't eat bacon."
"Then that's-- See there." Atkins' voice rose in exasperation. "What do you have for breakfast?"
"I have skim milk, cereal, with banana."
"That's three carbs!" Atkins couldn't believe that in this day and age people were still consuming fruit and skim milk. "That's how you are getting into trouble!... What you need to do, George, seriously, is get ahold of'New Diet Revolution' and just read what it says."
Atkins took another call. This time, it was from Robert, forty-one years old, three hundred pounds, and possessed of a formidable Brooklyn accent. He was desperate to lose weight--up on a ledge and wanting Atkins to talk him down. "I really don't know anything about dieting," he said. "I'm getting a little discouraged."
"It's really very easy," Atkins told him, switching artfully to the Socratic method. "Do you like meat?"
"Yes."
"Could you eat a steak?"
"Yes."
"All by itself, without any French fries?"
"Yes."
"And let's say we threw in a salad, but you couldn't have any bread or anything else."
"Yeah, I could do that."
"Well, if you could go through life like that.... Do you like eggs in the morning? Or a cheese omelette?"
"Yes,"Robert said, his voice almost giddy with relief. He called expecting a life sentence of rice cakes. Now he was being sent forth to eat cheeseburgers. "Yes, I do!"
"If you just eat that way," Atkins told him, "you'll have eighty pounds off in six months."
When I first arrived at Atkins' headquarters, two members of his staff took me on a quick tour of the facility, a vast medical center, where Atkins administers concoctions of his own creation to people suffering from a variety of disorders. Starting from the fifth floor, we went down to the third, and then from the third to the second, taking the elevator each time. It's a small point, but it did strike me as odd that I should be in the headquarters of the world's most popular weight-loss expert and be taking the elevator one floor at a time. After watching Atkins' show, I was escorted out by his public-relations assistant. We were on the second floor. He pressed the elevator button, down. "Why don't we take the stairs?" I asked. It was just a suggestion. He looked at me and then at the series of closed doors along the corridor. Tentatively, he opened the second. "I think this is it," he said, and we headed down, first one flight and then another. At the base of the steps was a door. The P.R. man, a slender fellow in a beautiful Italian suit, peered through it: for the moment, he was utterly lost. We were in the basement. It seemed as if nobody had gone down those stairs in a long time.
4.
Why are the Pima so fat? The answer that diet books would give is that the Pima don't eat as well as they used to. But that's what is ultimately wrong with diet books. They talk as if food were the only cause of obesity and its only solution, and we know, from just looking at the Pima, that things are not that simple. The diet of the Pima is bad, but no worse than anyone else's diet.
Exercise is also clearly part of the explanation for why obesity has become epidemic in recent years. Half as many Americans walk to work today as did twenty years ago. Over the same period, the number of calories burned by the average American every day has dropped by about two hundred and fifty. But this doesn't explain why obesity has hit the Pima so hard, either, since they don't seem to be any less active than the rest of us.
The answer, of course, is that there is something beyond diet and exercise that influences obesity--that can make the consequences of a bad diet or of a lack of exercise much worse than they otherwise would be--and this is genetic inheritance. Claude Bouchard, a professor of social and preventive medicine at Laval University, in Quebec City, and one of the world's leading obesity specialists, estimates that we human beings probably carry several dozen genes that are directly related to our weight. "Some affect appetite, some affect satiety. Some affect metabolic rate, some affect the partitioning of excess energy in fat or lean tissue," he told me. "There are also reasons to believe that there are genes affecting physical-activity level." Bouchard did a study not long ago in which he took a group of men of similar height, weight, and life style and overfed them by a thousand calories a day, six days a week, for a hundred days. The average weight gain in the group was eighteen pounds. But the range was from nine to twenty-six pounds. Clearly, the men who gained just nine pounds were the ones whose genes had given them the fastest possible metabolism--the ones who burn the most calories in daily living and are the least efficient at storing fat. These are people who have the easiest time staying thin. The men at the other end of the scale are closer to the Pima in physiology. Their obesity genes thriftily stored away as much of the thousand extra calories a day as possible.
One of the key roles for genes appears to be in determining what obesity researchers refer to as setpoints. In the classic experiment in the field, researchers took a group of rats and made a series of lesions in the base of each rat's brain. As a result, the rats began overeating and ended up much more obese than normal rats. The first conclusion is plain: there is a kind of thermostat in the brain that governs appetite and weight, and if you change the setting on that thermostat appetite and weight will change accordingly. With that finding in mind, the researchers took a second step. They took those same brain-damaged rats and put them on a diet, severely limiting the amount of food they could eat. What happened? The rats didn't lose weight. In fact, after some initial fluctuations, they ended up at exactly the same weight as before. Only, this time, being unable to attain their new thermostat setting by eating, they reached it by becoming less active--by burning less energy.
Two years ago, a group at Rockefeller University in New York published a landmark study essentially duplicating in human beings what had been done years ago in rats. They found that if you lose weight your body responds by starting to conserve energy: your metabolism slows down; your muscles seem to work more efficiently, burning fewer calories to do the same work. "Let's say you have two people, side by side, and these people have exactly the same body composition," Jules Hirsch, a member of the Rockefeller team, says. "They both weigh a hundred and thirty pounds. But there is one difference--the first person maintains his weight effortlessly, while the second person, who used to weigh two hundred pounds, is trying to maintain a lower weight. The second will need fifteen per cent fewer calories per day to do his work. He needs less oxygen and will burn less energy." The body of the second person is backpedalling furiously in response to all that lost weight. It is doing everything it can to gain it back. In response to weight gain, by contrast, the Rockefeller team found that the body speeds up metabolism and burns more calories during exercise. It tries to lose that extra weight. Human beings, like rats, seem to have a predetermined setpoint, a weight that their body will go to great lengths to maintain.
One key player in this regulatory system may be a chemical called leptin--or, as it is sometimes known, Ob protein--whose discovery four years ago, by Jeff Friedman, of the Howard Hughes Medical Institute at Rockefeller University, prompted a flurry of headlines. In lab animals, leptin tells the brain to cut back on appetite, to speed up metabolism, and to burn stored fat. The theory is that the same mechanism may work in human beings. If you start to overeat, your fat cells will produce more leptin, so your body will do everything it can to get back to the setpoint. That's why after gaining a few pounds over the holiday season most of us soon return to our normal weight. But if you eat too little or exercise too much, the theory goes, the opposite happens: leptin levels fall. "This is probably the reason that virtually every weight-loss program known to man fails," José F. Caro, vice-president of endocrine research and clinical investigation at Eli Lilly & Company, told me. "You go to Weight Watchers. You start losing weight. You feel good. But then your fat cells stop producing leptin. Remember, leptin is the hormone that decreases appetite and increases energy expenditure, so just as you are trying to lose weight you lose the hormone that helps you lose weight."
Obviously, our body's fat thermostat doesn't keep us at one weight all our adult lives. "There isn't a single setpoint for a human being or an animal," Thomas Wadden, the director of the Weight and Eating Disorders Clinic at the University of Pennsylvania, told me. "The body will regulate a stable weight but at very different levels, depending on food intake--quality of the diet, high fat versus low fat, high sweet versus low sweet--and depending on the amount of physical activity." It also seems to be a great deal easier to move the setpoint up than to move it down--which, if you think about the Pima, makes perfect sense. In their long history in the desert, those Pima who survived were the ones who were very good at gaining weight during times of plenty--very good, in other words, at overriding the leptin system at the high end. But there would have been no advantage for the ones who were good at losing weight in hard times. The same is probably true for the rest of us, albeit in a less dramatic form. In our evolutionary history, there was advantage in being able to store away whatever calorific windfalls came our way. To understand this interplay between genes and environment, imagine two women, both five feet five. The first might have a setpoint range of a hundred and ten to a hundred and fifty pounds; the second a range of a hundred and twenty-five to a hundred and eighty. The difference in the ranges of the two women is determined by their genes. Where they are in that range is determined by their life styles.
Not long after leptin was discovered, researchers began testing obese people for the hormone, to see whether a fat person was fat because his body didn't produce enough leptin. They found the opposite: fat people had lots of leptin. Some of the researchers thought this meant that the leptin theory was wrong--that leptin didn't do what it was supposed to do. But some other scientists now think that as people get fatter and fatter, their bodies simply get less and less sensitive to leptin. The body still pumps out messages to the brain calling for the metabolism to speed up and the appetite to shrink, but the brain just doesn't respond to those messages with as much sensitivity as it did. This is probably why it is so much easier to gain weight than it is to lose it. The fatter you get, the less effective your own natural weight-control system becomes.
This doesn't mean that diets can't work. In those instances in which dieters have the discipline and the will power to restrict their calories permanently, to get regular and vigorous exercise, and to fight the attempt by their own bodies to maintain their current weight, pounds can be lost. (There is also some evidence that if you can keep weight off for an extensive period--three years, say--a lower setpoint can be established.) Most people, though, don't have that kind of discipline, and even if they do have it the amount of weight that most dieters can expect to lose on a permanent basis may be limited by their setpoint range. The N.I.H. has a national six-year diabetes-prevention study going on right now, in which it is using a program of intensive, one-on-one counselling, dietary modification, and two and a half hours of exercise weekly to see if it can get overweight volunteers to lose seven per cent of their body weight. If that sounds like a modest goal, it should. "A lot of studies look at ten-per-cent weight loss," said Mary Hoskin, who is coördinating the section of the N.I.H. study involving the Pima. "But if you look at long-term weight loss nobody can maintain ten per cent. That's why we did seven."
On the other hand, now that we're coming to understand the biology of weight gain, it is possible to conceive of diet drugs that would actually work. If your body sabotages your diet by lowering leptin levels as you lose weight, why not give extra leptin to people on diets? That's what a number of drug companies, including Amgen and Eli Lilly, are working on now. They are trying to develop a leptin or leptin-analogue pill that dieters could take to fool their bodies into thinking they're getting fatter when they're actually getting thinner. "It is very easy to lose weight," José Caro told me. "The difficult thing is to maintain your weight loss. The thinking is that people fail because their leptin goes down. Here is where replacement therapy with leptin or an Ob-protein analogue might prevent the relapse. It is a subtle and important concept. What it tells you is that leptin is not going to be a magic bullet that allows you to eat whatever you want. You have to initiate the weight loss. Then leptin comes in."
Another idea, which the Hoffmann-La Roche company is exploring, is to focus on the problems obese people have with leptin. Just as Type II diabetics can become resistant to insulin, many overweight people may become resistant to leptin. So why not try to resensitize them? The idea is to find the leptin receptor in the brain and tinker with it to make it work as well in a fat person as it does in a thin person. (Drug companies have actually been pursuing the same strategy with the insulin receptors of diabetics.) Arthur Campfield, who heads the leptin project for Roche, likens the process by which leptin passes the signal about fat to the brain to a firemen's bucket brigade, where water is passed from hand to hand. "If you have all tall people, you can pass the bucket and it's very efficient," he said. "But if two of the people in the chain are small children, then you're going to spill a lot of water and slow everything down. We want to take a tablet or a capsule that goes into your brain and puts a muscular person in the chain and overcomes that weakness. The elegant solution is to find the place in the chain where we are losing water."
The steps that take place in the brain when it receives the leptin message are known as the Ob pathway, and any number of these steps may lend themselves to pharmaceutical intervention. Using the Ob pathway to fight obesity represents a quantum leap beyond the kinds of diet drugs that have been available so far. Fen-phen, the popular medication removed from the market last year because of serious side effects, was, by comparison, a relatively crude product, which worked indirectly to suppress appetite. Hoffmann-La Roche is working now on a drug called Xenical, a compound that blocks the absorption of dietary fat by the intestine. You can eat fat; you just don't keep as much of it in your system. The drug is safe and has shown real, if modest, success in helping chronically obese patients lose weight. It will probably be the next big diet drug. But no one is pretending that it has anywhere near the potential of, say, a drug that would resensitize your leptin receptors.
Campfield talks about the next wave of drug therapy as the third leg of a three-legged stool--as the additional element that could finally make diet and exercise an easy and reliable way to lose weight. Wadden speaks of the new drugs as restoring sanity: "What I think will happen is that people on these medications will report that they are less responsive to their environment. They'll say that they are not as turned on by Wendy's or McDonald's. Food in America has become a recreational activity. It is divorced from nutritional need and hunger. We eat to kill time, to stimulate ourselves, to alter our mood. What these drugs may mean is that we're going to become less susceptible to these messages." In the past thirty years, the natural relationship between our bodies and our environment--a relation that was developed over thousands of years--has fallen out of balance. For people who cannot restore that natural balance themselves--who lack the discipline, the wherewithal, or, like the Pima, the genes--drugs could be a way of restoring it for them.
5.
Seven years ago, Peter Bennett, the epidemiologist who first stumbled on the Gila River Pima twenty-eight years earlier, led an N.I.H. expedition to Mexico's Sierra Madre Mountains. Their destination was a tiny Indian community on the border of Sonora and Chihuahua, seven thousand feet above the desert. "I had known about their existence for at least fifteen years before that," Bennett says. "The problem was that I could never find anyone who knew much about them. In 1991, it just happened that we linked up with an investigator down in Mexico." The journey was a difficult one, but the Mexican government had just built a road linking Sonora and Chihuahua, so the team didn't have to make the final fifty- or sixty-mile trek on horseback. "They were clearly a group who have got along together for a very long time," Bennett recalls. "My reaction as a stranger going in was: Gee, I think these people are really very friendly, very coöperative. They seem to be interested in what we want to do, and they are willing to stick their arms out and let us take blood samples." He laughed. "Which is always a good sign."
The little town in the Sierra Madre is home to the Mexican Pima, the southern remnants of a tribe that once stretched from present-day Arizona down to central Mexico. Like the Pima of the Gila River reservation, they are farmers, living in small clusters of wood-and-adobe rancherías among the pine trees, cultivating beans, corn, and potatoes in the valleys. On that first trip, the N.I.H. team examined no more than a few dozen Pima. Since then, the team has been back five or six times, staying for as many as ten days at a time. Two hundred and fifty of the mountain Pima have now been studied. They have been measured and weighed, their blood sugar has been checked, and their kidneys and eyes have been examined for signs of damage. Genetic samples have been taken and their metabolism has been monitored. The Mexican Pima, it turns out, eat a diet consisting almost entirely of beans, potatoes, and corn tortillas, with chicken perhaps once a month. They take in twenty-two hundred calories a day, which is slightly more than the Pima of Arizona do. But on the average each of them puts in twenty-three hours a week of moderate to hard physical labor, whereas the average Arizona Pima puts in two hours. The Mexican Pima's rates of diabetes are normal. They are slightly shorter than their American counterparts. In weight, there is no comparison: "I would say they are thin," Bennett says. "Thin. Certainly by American standards."
There are, of course, a hundred reasons not to draw any great lessons from this. Subsistence farming is no way to make a living in America today, nor are twenty-three hours of hard physical labor feasible in a society where most people sit at a desk from nine to five. And even if the Arizona Pima wanted to return to the land, they couldn't. It has been more than a hundred years since the Gila River, which used to provide the tribe with fresh fish and with water for growing beans and squash, was diverted upstream for commercial farming. Yet there is value in the example of the Mexican Pima. People who work with the Pima of Arizona say that the biggest problem they have in trying to fight diabetes and obesity is fatalism--a sense among the tribe that nothing can be done, that the way things are is the way things have to be. It is possible to see in the attitudes of Americans toward weight loss the same creeping resignation. As the world grows fatter, and as one best-selling diet scheme after another inevitably fails, the idea that being slender is an attainable--or even an advisable--condition is slowly receding. Last month, when The New England Journal of Medicine published a study suggesting that the mortality costs of obesity had been overstated, the news was greeted with resounding relief, as if we were all somehow off the hook, as if the issue with obesity were only mortality and not the thousand ways in which being fat undermines our quality of life: the heightened risk of heart disease, hypertension, diabetes, cancer, arthritis, gallbladder disease, trauma, gout, blindness, birth defects, and other aches, pains, and physical indignities too numerous to mention. What we are in danger of losing in the epidemic of obesity is not merely our health but our memory of health. Those Indian towns high in the Sierra Madre should remind the people of Sacaton--and all the rest of us as well--that it is still possible, even for a Pima, to be fit.
The Spin Myth
download pdf
homethe dogoutliersblinkthe tipping pointthe new yorker archiveetc.blog
July 6, 1998
A CRITIC AT LARGE
Are our spin meisters just spinning one another?
On Easter Sunday, 1929, the legendary public-relations man Edward L. Bernays rounded up ten carefully chosen women, put cigarettes in their hands, and sent them down Fifth Avenue in what was billed as the Torches of Freedom march. The marchers were given detailed instructions, including when and how their cigarettes should be lit. Spokeswomen were enlisted to describe the protest as an advance for feminism. Photographers were hired to take pictures. It was an entirely contrived event that nonetheless looked so "real" that the next day it made front-page headlines across the country, prompting a debate over whether women should be allowed to smoke as freely as men, and--some historians believe--forever changing the social context of cigarettes. What Bernays never told anyone was that he was working for the American Tobacco Company.
It is difficult to appreciate how brazen Bernays's ruse was at the time. In the twenties, the expectation was that if you were trying to sell people something--even if you were planning to deceive them in the process--you had at least to admit that you were trying to sell them something. Bernays was guided by the principle that this wasn't true: that sometimes the best way to sell something (cigarettes, say) was to pretend to be selling something else (freedom, say).
Bernays helped the brewing industry establish beer as "the beverage of moderation." For Dixie cups, he founded the Committee for the Study and Promotion of the Sanitary Dispensing of Food and Drink. For the Mack truck company, he drummed up national support for highway construction through front groups called the Trucking Information Service, the Trucking Service Bureau, and Better Living Through Increased Highway Transportation. In a torrent of books and articles (including one book, "Crystallizing Public Opinion," that was found in Joseph Goebbels's library) he argued that the P.R. professional could "continuously and systematically" perform the task of "regimenting the public mind." He wasn't talking about lying. He was talking about artful, staged half-truth. It's the kind of sly deception that we've come to associate with the Reagan Administration's intricately scripted photo ops (the cowboy hats, the flannel shirts, the horse), with the choreographed folksiness of Clinton's Town Hall meetings, with the "Wag the Dog" world of political operatives, and with the Dilbertian byways of boardroom euphemism, in which firing is "rightsizing" and dismembering companies becomes "unlocking shareholder value." Edward L. Bernays invented spin.
Today, we're told, Bernays's touch is everywhere. The advertising critic Randall Rothenberg has suggested that there is something called a Media-Spindustrial Complex, which encompasses advertising, P.R., lobbying, polling, direct mail, investor relations, focus groups, jury consulting, speechwriting, radio and television stations, and newspapers--all in the business of twisting and turning and gyrating. Argument now masquerades as conversation. Spin, the political columnist E.J. Dionne wrote recently, "obliterates the distinction between persuasion and deception." Should P.R. people tell "the whole truth about our clients? No sirree!" Thomas Madden, the chairman of one of the largest P.R. firms in the country, declares in his recent memoir, entitled "Spin Man." In the best-seller "Spin Cycle: Inside the Clinton Propaganda Machine," Howard Kurtz, the media critic for the Washington Post, even describes as spin the White House's decision in the spring of 1997 to release thousands of pages of documents relating to the Democratic fund-raising scandal.
This was the documentation that the press had been clamoring for. You might have thought that it was full disclosure. Not so, says Kurtz, who dubs the diabolical plan Operation Candor. In playing the honesty card, he argues, the White House preëmpted embarrassing leaks by congressional investigators and buried incriminating documents under an avalanche of paper. Of course, not releasing any documents at all would also have been spin (Stonewall Spin), and so would releasing only a handful of unrepresentative documents (Selection Spin). But, if you think that calling everything "spin" renders the term meaningless (if this is all spin, then what is not spin?), you've missed the point. The notion that this is the age of spin rests on the premise that everything, including the truth, is potentially an instrument of manipulation.
In "P.R.!:ASocial History of Spin," the media critic Stuart Ewen describes how, in 1990, he went to visit Bernays at his home near Harvard Square, in Cambridge. He was ushered in by a maid and waited in the library, looking, awestruck, at the shelves. "It was a remarkable collection of books, thousands of them: about public opinion, individual and social psychology, survey research, propaganda, psychological warfare, and so forth--a comprehensive library spanning matters of human motivation and strategies of influence, scanning a period of more than one hundred years," he writes. "These were not the bookshelves of some shallow huckster, but the arsenal of an intellectual. The cross- hairs of nearly every volume were trained on the target of forging public attitudes. Here--in a large white room in Cambridge, Massachusetts--was the constellation of ideas that had inspired and informed a twentieth century preoccupation: the systematic molding of public opinion."
Suddenly, Ewen's reverie was broken. In walked Bernays, a "puckish little man" of ninety-eight, with "swift eyes," who looked like "an aged Albert Einstein." Bernays led Ewen past his picture gallery--Bernays and Henry Ford, Bernays and Thomas Edison, Bernays and Eisenhower, Bernays en route to the 1919 Paris Peace Conference, an autographed photo of Sigmund Freud, who was Bernays's uncle. And for four hours Bernays and Ewen talked. Ewen was "entranced": he had located the fountainhead of all spin. At one point, Bernays hypothesized about how he might have promoted Ewen's previous book, which was an account of consumer imagery in the modern economy. He would, he said, have called the big consumer organizations and suggested to them that they devote one of their annual meetings to a discussion of consumers and images. Ewen thought nothing of it. Then, three months later, he got a call from the president of the Consumer Federation of America asking him if he wanted to be the keynote speaker at its annual meeting. Was Bernays behind it? Was he still spinning, even as he approached his hundredth birthday? Ewen never found out. "Yet the question remained, and remains, open," he writes, in the breathless opening chapter of his book. "Things had uncannily come to pass much as Bernays had described in his hypothetical disquisition on the work of a P.R. practitioner, and I was left to ponder whether there is any reality anymore, save the reality of public relations."
The curious thing about our contemporary obsession with spin, however, is that we seldom consider whether spin works. We simply assume that, because people are everywhere trying to manipulate us, we're being manipulated. Yet it makes just as much sense to assume the opposite: that the reason spin is everywhere today is that it doesn't work--that, because the public is getting increasingly inured to spin, spinners feel they must spin even harder, on and on, in an ever-escalating arms race. The Torches of Freedom march worked because nobody had ever pulled a stunt like that before. Today, those same marchers would be stopped cold at ten feet. (First question at the press conference: Who put you up to this?) Once spun, twice shy. When, last week, the Clinton spokesman Rahm Emanuel called Steven Brill's revelations about Kenneth Starr's leaking to the press a "bombshell," that was spin, but we are so accustomed to Rahm Emanuel's spinning that the principal effect of his comment was to prompt a meta-discussion about, of all things, his comment. ("If the wonderful word oleaginous didn't exist," Frank Rich wrote in the Times, "someone would have to invent it to describe Rahm Emanuel.") Emanuel might have been better off saying nothing at all, except that--under the Howard Kurtz rule--this, too, would have been decoded as an attempt to spin us, by ostentatiously letting the Brill revelations speak for themselves: Silent Spin, perhaps. Spin sets into motion a never-ending cycle of skepticism.
There is a marvellous illustration of this arms-race problem in the work of two psychology professors, Deborah Gruenfeld and Robert Wyer, Jr. They gave people statements that were said to be newspaper headlines, and asked them to rate their plausibility, on a scale of zero to ten. Since the headlines basically stated the obvious--for example, "Black Democrats supported Jesse Jackson for President in 1988"--the scores were all quite high. The readers were then given a series of statements that contradicted the headlines. Not surprisingly, the belief scores went down significantly. Then another group of people was asked to read a series of statements that supported the headlines--statements like "Black Democrats presently support Jesse Jackson for President." This time, the belief scores still dropped. Telling people that what they think is true actually is true, in other words, has almost the same effect as telling them that what they think is true isn't true. Gruenfeld and Wyer call this a "boomerang effect," and it suggests that people are natural skeptics. How we respond to a media proposition has at least as much to do with its pragmatic meaning (why we think the statement is being made) as with its semantic meaning (what is literally being said). And when the pragmatic meaning is unclear--why, for example, would someone tell us over and over that Jesse Jackson has the support of black Democrats--we start to get suspicious. This is the dilemma of spin. When Rahm Emanuel says "bombshell," we focus not on the actual bombshell but on why he used the word "bombshell."
The point is that spin is too clever by half. In a forthcoming biography, "The Father of Spin," Larry Tye writes that in 1930 Bernays went to work for a number of major book publishers, including Simon & Schuster and Harcourt Brace: "'Where there are bookshelves,' he reasoned, 'there will be books.' So he got respected public figures to endorse the importance of books to civilization, and then he persuaded architects, contractors, and decorators to put up shelves on which to store the precious volumes--which is why so many homes from that era have built-in bookshelves."
This is the kind of slick move that makes Bernays such an inspiration for contemporary spin meisters. (Tye, admiringly, calls it "infinitely more effective" than simply promoting books one by one, in the conventional way.) But wait a minute. Did Bernays really reach all these architects and contractors? If so, how? Wouldn't there have been thousands of them? And, if he did, why would they ever have listened to him? (My limited experience with contractors and architects is that advice from someone outside their field has the opposite of its intended effect.) And, even if we assume that he did cause a surge in bookshelf building, is there a magical relationship between built-in shelves and the purchase of books? Most of us, I think, acquire books because we like books and we want to read them--not because we have customized space to fill in our apartments. The best way to promote cigarettes probably isn't to subsidize ashtrays.
People who worry about spin have bought into a particular mythology about persuasion--a mythology that runs from Tom Sawyer to Vance Packard--according to which the best way to persuade someone to do something is to hide the act of persuasion. The problem is, though, that if the seller is too far removed from the transaction, if his motives are too oblique, there's a good chance that his message will escape the buyer entirely. (People don't always think books when they think shelves.) In fact, successful persuasion today is characterized by the opposite principle--that it is better to be obvious and get your message across than it is to pull invisible strings and risk having your message miss the mark. Bernays sacrificed clarity for subtlety. Most effective advertising today sacrifices subtlety for clarity. Recently, at a Robert Wood Johnson Foundation conference on how to fight teen-age smoking, one prominent California ad executive talked about the reason for the success of the Marlboro and the Camel brands. It was not, he said, because of any of the fancy behind-the-scenes psychological tricks that Big Tobacco is so often accused of by its critics. On the contrary. The tobacco companies, he said, understand what Nike and Coca-Cola understand: that if they can make their brands ubiquitous--if they can plaster them on billboards, on product displays inside grocery stores, on convenience-store windows, on the sides of buildings, on T-shirts and baseball caps, on the hoods and the roofs of racing cars, in colorful spreads in teen magazines--they can make their message impossible to ignore. The secret is not deception but repetition, not artful spinning but plain speaking.
There's a second, related difficulty with spin--one that people in the marketing business call the internal-audience problem. Let's say you are the head of the ad agency that has the Burger King account. Your ultimate goal is to make ads that appeal to the kind of people who buy Burger King burgers. But, in order to keep Burger King's business and get your commercials on the air, you must first appeal to Burger King's marketing executives, who are probably quite different in temperament and taste from the target Burger King customer. Ideally, your ads will appeal to the folks at Burger King because they appeal to the Burger King customers; that is, the internal audience will be pleased because the external audience is pleased. But it has always been extremely difficult to measure the actual impact of a television commercial (especially, as is the case with many ads, where the aim is simply to maintain the current market share). Unless you're careful, then, you may start creating ads that appeal only to your internal audience, with the unfortunate result that the relationship between ad agency and ad buyer becomes a kind of closed loop. The internal audience supplants the real audience.
The internal-audience effect can be seen in all sorts of businesses. The reason so many magazines look alike is that their Manhattan-based editors and writers end up trying to impress not readers but other Manhattan-based editors and writers. It was in an effort to avoid this syndrome that Lincoln Mercury recently decided to move its headquarters from Detroit to California. The company said that the purpose was to get closer to its customers; more precisely, the purpose was to get away from people who weren't its customers. Why do you think it took so long to get Detroit to install seat belts? Because to the internal audience a seat belt is a cost center. It is only to the external audience that it's a life saver.
Edward Bernays was a master of the internal audience. He was intellectually indefatigable, a diminutive, mustachioed, impatient dervish. Larry Tye writes that as Bernays sat in his office "four or five young staff members, their chairs pulled close, would have been listening to him spew forth a stream of thoughts about peddling Ivory or keeping Luckies number one. With each new idea he'd scratch out a note, wad it up, and toss it on the floor." Afterward, the floor looked blanketed by snow. But it was all an inside joke. The wadded-up pieces of paper were, Tye quotes one former employee as saying, "a trick to demonstrate all the ideas he was generating." To promote bacon, Bernays persuaded prominent doctors to testify to the benefits of a hearty breakfast. His client, a bacon producer, no doubt regarded this as a dazzling feat. But does a hearty breakfast mean bacon? And does bacon mean his client's bacon? Bernays's extraordinary success is proof that in the P.R. world, where no hard-and-fast measures exist to gauge the true effectiveness of a message, he could prosper by playing only to his internal audience. But often the very things that make you successful with that audience prevent you from being successful with your real audience. To Simon & Schuster--to people in the book business--bookshelves really do mean books. To the rest of us, a bookshelf may be no more than a place to put unopened mail.
This is the mistake Howard Kurtz makes in "Spin Cycle." His book is a detailed account of how in the year following the 1996 elections Clinton's spokesman Mike McCurry successfully spun the White House press corps during the fund-raising and Whitewater scandals. Kurtz tries to argue that this, in turn, reflects Clinton's ability to manage his image with the wider public--with the external audience. In fact, "Spin Cycle" reads more like an extended treatise on the internal-audience problem, a three-hundred-page account of how McCurry's heroic attempts to spin the White House press corps had the effect of, well, spinning the White House press corps.
For example, Kurtz recounts the story of Rita Braver, a former White House correspondent for CBS television. Braver believed that the Clinton Administration would go to "unbelievable lengths" to keep her from breaking a story--on the ground, Kurtz says, that "bad stories came across as more sensational on television." In one instance, early in Clinton's second term, the White House announced that it was turning over a large number of Whitewater documents to the Justice Department. Braver, according to Kurtz, smelled a rat. She knew that you don't just turn over documents to the Justice Department. She made some calls and found out that, sure enough, the White House had actually been subpoenaed. Braver wrote a script: "CBS News has learned..." Then disaster struck. "Half an hour before the evening news began," Kurtz writes, "White House officials publicly announced the subpoena. No way they were going to let her break the news and look like they were hiding something, which they had been. They were determined to beat her to the punch."
Let's deconstruct this episode. Braver wanted to write a story that said, in effect, The documents the White House said it is handing over to the Justice Department today are, I have learned, being handed over because of a subpoena. Instead, she was forced to say, The documents that the White House is handing over to the Justice Department today are, the White House said, being handed over because of a subpoena. To the internal audience--to Braver and her colleagues--there is a real distinction between Statements A and B. In the first case, the White House is seen as reluctant to disclose the existence of a subpoena. In the second, it is not. More important, in the first case it is clear that the subpoena story is the result of the efforts of Rita Braver--of the efforts, in other words, of the White House press corps--and in the second that role has been erased. This distinction also matters to Clinton, McCurry's other internal audience. But why does this matter to the rest of us? The news of interest to the external audience is not the nuance of the White House's reaction to a subpoena, or the particular reporting talents of Rita Braver; it is the fact of the subpoena itself. Kurtz is entirely correct that the Braver episode is an example of the ascendancy of spin. But the only thing that's being spun here is ten square blocks in the center of Washington, D.C. This is dog-whistle politics.
The irony of Edward L. Bernays's enshrinement in the spin literature is that, in fact, he is not the father of contemporary persuasion. That honor belongs--if it belongs to anyone--to the wizard of direct marketing, Lester Wunderman. Wunderman was Bernays's antithesis. He was born in a tenement in the East Bronx, far from the privilege and wealth of Bernays's Manhattan. While Bernays was sending women marching down Fifth Avenue, Wunderman was delivering chickens for Izzy, a local kosher butcher. He started off in advertising making twenty-five dollars a week at Casper Pinsker's mail-order ad agency, in lower Manhattan, and in one of his first successes he turned the memoirs of Hitler's personal physician into a wartime best-seller by promoting them on the radio with some of the first-ever infomercials. If Bernays was the master of what Tye calls Big Think--splashy media moments, behind-the-scenes manipulations, concocted panels of "experts"--Wunderman, in the course of his career, established himself as the genius of Little Think, of the small but significant details that turn a shopper into a buyer. He was the person who first put bound-in subscription cards in magazines, who sold magazines on late-night television with an 800 number, who invented the forerunner of the scratch-'n'-sniff ad, who revolutionized the mail-order business, and who, in a thousand other ways, perfected the fine detail of true salesmanship.
In "Being Direct," his recent autobiography, Wunderman relates the story of how he turned the Columbia Record Club into the largest marketing club of its kind in the world. It's a story worth retelling, if only because it provides such an instructive counterpoint to the ideas of Edward Bernays. The year was 1955. Wunderman was already the acknowledged king of mail order, long since gone from Casper Pinsker, and by then a senior vice-president at the ad firm Maxwell Sackheim & Company, and Columbia came to him with a problem. Independent mail-order companies, using the model of the Book-of-the-Month Club, were starting to chip away at retail sales of records. (In those days, record companies sold records through dealers, the same way that car companies sell cars.) To stem the tide, Columbia wanted to start a club of its own.
Wunderman's response was to create a kind of mail-order department store, with four sections--classical, Broadway, jazz, and listening and dancing--and a purchase plan that offered a free record for joining. The offer was then advertised in magazines, with a coupon to clip and mail back. This initial campaign did respectably, but not well enough to break even. Wunderman went back to the drawing board. In 1956, he began testing hundreds of different kinds of ads in different publications and in different markets, comparing the response rate to each. The best response was to a plan that allowed the customer, for every four records purchased, to choose three free records from a list of twelve options. He went with that nationwide. By 1957, the club had a million members. But that summer the club-advertising response rate suddenly fell off by twenty per cent. Wunderman, who was travelling in Europe at the time, had another brainstorm:
What I had discovered in Italy was antipasto.... The idea of so many choices intrigued me, and the larger the selection, the longer the line at the antipasto table. Restaurant owners seemed to know this, because antipasto carts and tables were usually displayed prominently at the entrance. I made a point of counting the number of individual antipasto choices people took in relation to the number that were offered, and I discovered that they helped themselves to about the same number of dishes no matter how many were set in front of them.
Wunderman rushed back to New York with the solution. The free records Columbia offered to new members were "antipasto." But three free records from a list of twelve weren't enough, Wunderman argued. That wasn't a true antipasto bar. He persuaded Columbia that it should test an ad that increased the choice from twelve records to thirty-two. The response rate doubled. (Today, Columbia members get to choose from more than four hundred albums.) Columbia scrapped its old ad run, and replaced it with the antipasto campaign. The year 1958 was the best one in the club's history.
The next challenge Wunderman faced was that the club seemed stalled at a million members, so he began searching for new ways to get his message across. Taking an idea he had pioneered several years earlier, while he was selling mail-order roses for Jackson & Perkins, he persuaded a number of publishers to put post-paid insert cards in magazines--the now familiar little cards that are an ad on one side and a coupon on the other. Columbia jumped to two million members. In Life he inserted a sheet of "value stamps," each with the title of an album on it, which readers could stick to a response card. He was also the first to use an "answer card" in newspapers, and among the first to get newspapers to insert freestanding four- and eight-page special advertising sections in their Sunday editions. On another occasion, he put a little gold box in the corner of all Columbia's print ads, and, in a series of television commercials, instructed viewers to look in the ads for the "buried treasure"; if they found the box, they could get another free record. The gold-box campaign raised responses by eighty per cent. "The Gold Box," he writes, "had made the reader/viewer part of an interactive advertising system. Viewers were not just an audience but had become participants. It was like playing a game."
All these strategies amount to a marketing system of extraordinary sensitivity. Answer cards and gold boxes and antipasto and the other techniques of Little Think are sophisticated ways of listening, of overcoming the problems of distance and distortion which so handicap other forms of persuasion. There are times when we all get annoyed at the business reply cards that Wunderman invented. But at a conceptual level, surely, those cards are a thing of beauty. To the consumer--to us--they offer almost perfect convenience. Is there an easier way to subscribe to a magazine? To the client, they offer ubiquity: it knows that every time a magazine is opened a response card falls on someone's lap. And to the ad agency they offer a finely calibrated instrument to measure effectiveness: the adman can gauge precisely how successful his campaign is merely by counting the number of cards that come back. Much of the apparatus of modern-day marketing--the computer databases, the psychographic profiles, the mailing lists, the market differentiations, the focus groups--can be seen, in some sense, as an attempt to replicate the elegance and transparency of this model. Marketers don't want to spin us. They want to hold us perfectly still, so they can figure out who we are, what we want, and how to reach us.
There is a moment in Kurtz's book in which he stumbles on this truth: that it isn't spin, after all, that accounts for Clinton's popularity but, rather, the opposite of spin--the President's ability to listen, to offer his agenda like antipasto, to sidestep the press and speak directly to the public. Kurtz writes that the former White House communications director Don Baer believed that Clinton "cut through what was said about him":
He was having his own conversation with America, one that, if all went well, sailed over the heads of the journalists, who were nothing but theater critics and did little to shape public opinion. Baer saw the phenomenon time and again. When Clinton unveiled his plan for hope scholarships, which would give parents of college students up to $1500 a year in tax credits, the media verdict was swift: cynical political ploy to pander to middle-class voters. But what the public heard was that Clinton was concerned about the difficulty of sending kids to college and was willing to help them with tax credits. Voters got it. They liked constructive proposals and hated partisan sniping.
But Kurtz doesn't believe Baer, and why would he? The spin fantasy offers a far more satisfying explanation for the world around us. Spin suggests a drama, a script to decode, a game played at the highest of levels. Spinning is the art of telling a story, even when there is no story to tell, and this is irresistible (particularly to journalists, who make a living by telling stories even when there is no story to tell). In truth, the world of persuasion is a good deal more prosaic. Ideas and candidacies--not to mention albums--are sold by talking plainly and clearly, and the louder and faster the whirring of the spinners becomes, the more effective this clarity and plainspokenness will be. We think we belong to the world of Edward L. Bernays. We don't. We are all Wundermanians now.
Perils of a Parable
download pdf
homethe dogoutliersblinkthe tipping pointthe new yorker archiveetc.blog
January 11, 1999
COMMENT
Science and the Perils of a Parable
In the movie "A Civil Action," the families of eight leukemia victims accuse two major corporations of contaminating the drinking water of Woburn, Massachusetts. John Travolta's portrayal of the lawyer who argues their case has been justifiably praised by critics for its subtlety: he is neither a villain nor a hero but an uncomfortable and ambiguous combination of the two--a man of equal parts greed and idealism who is in the grip of a powerful obsession. Curiously, though, when it comes to the scientific premise of the story, "A Civil Action" (like Jonathan Harr's best-seller, on which it is based) permits no ambiguity at all. It is taken as a given that the chemical allegedly dumped, trichloroethylene (TCE), is a human carcinogen-- even though, in point of fact, TCE is only a probable human carcinogen: tests have been made on animals, but no human-based data have tied it to cancer. It is also taken as a given that the particular carcinogenic properties of TCE were what resulted in the town's leukemia outbreak, even though the particular causes and origins of that form of cancer remain mysterious. The best that can be said is that there might be a link between TCE and disease. But the difference between what "might be" and what "is"--which in scientific circles is all the difference in the world--does not appear to amount to much among the rest of us. We know that human character can be complex and ambiguous. But we want science to conform to a special kind of narrative simplicity: to begin from obvious premises and proceed, tidily and expeditiously, to morally satisfying conclusions.
Consider the strange saga of silicone breast implants. Almost seven years ago, the Food and Drug Administration placed a moratorium on most uses of silicone implants, because the devices had been inadequately tested and the agency wanted to give researchers time to gather new data on their safety. Certain that the data would indict implants in the end, personal-injury lawyers rounded up hundreds of thousands of women in a massive class-action suit. By 1994, four manufacturers of implants had been instructed to pay out the largest class-action settlement in history: $4.25 billion. And when that amount proved insufficient for all the plaintiffs, the largest of the defendants--Dow Corning--filed for Chapter 11, offering $3.2 billion last November to settle its part of the suit.
Now, however, we actually have the evidence on implant safety. More than twenty studies have been completed, by institutions ranging from Harvard Medical School to the Mayo Clinic. The governments of Germany, Australia, and Britain have convened scientific panels. The American College of Rheumatology, the American Academy of Neurology, and the Council on Scientific Affairs of the American Medical Association have published reviews of the evidence, and last month, in a long-awaited decision, an independent scientific panel, appointed by a federal court, released its findings. All of the groups have reached the same conclusion: there is little or no reason to believe that breast implants cause disease of any kind. The author of the toxicological section of the federal court's panel concluded, "There is no evidence silicone breast implants precipitate novel immune responses or induce systemic inflammation," and the author of the immunology section of the same report stated, "Women with silicone breast implants do not display a silicone-induced systemic abnormality in the types or functions of cells of the immune system."
There is some sense now that with the unequivocal findings of the December report, the tide against implants may finally be turning. But that is small consolation. For almost seven years, at a cost of billions and in the face of some twenty-odd studies to the contrary, the courts and the public clung to a conclusion with no particular merit other than that it sounded as if it might be true. Here, after all, was a group of profit-driven multinationals putting gooey, leaky, largely untested patties of silicone into the chests of a million American women. In the narrative we have imposed on science, that act ought to have consequences, just as the contamination of groundwater by a profit-seeking multinational ought to cause leukemia. Our moral sense said so, and, apparently, that was enough. Of course, if science always made moral sense we would not need scientists. We could staff our laboratories with clergy.
It may be hard to shed a tear for implant manufacturers like Dow Corning, even though their shareholders have been royally ransomed for no good reason. Those who sell drugs and medical devices must expect to be held hostage, from time to time, by the irrationalities of the legal system. The women in this country with breast implants do, however, deserve our compassion. They chose cosmetic surgery in order to feel better about themselves. For this, they were first accused of an unnatural vanity and then warned that they had placed themselves in physical peril, and the first charge informed the second until the imagined threat of silicone implants took on the force of moral judgment: they asked for it, these women. They should have been satisfied with what God gave them and stayed home to reread "The Beauty Myth." Well, they didn't ask for anything, and what they did with their bodies turns out to have no larger meaning at all. Science, tempting though it is to believe otherwise, is not in the business of punishing the politically retrograde, nor is it a means of serving retribution to the wicked and the irresponsible. In the end, one may find that the true health toll of breast implants was the seven years of needless anxiety suffered by implant wearers at the hands of all those lawyers and health "advocates" who were ostensibly acting on their behalf.
Six Degrees of Lois Weisberg
January 11, 1999
ANNALS OF SOCIETY
She's a grandmother, she lives in a
big house in Chicago, and you've never
heard of her. Does she run the world?
1.
Everyone who knows Lois Weisberg has a story about meeting Lois Weisberg, and although she has done thousands of things in her life and met thousands of people, all the stories are pretty much the same. Lois (everyone calls her Lois) is invariably smoking a cigarette and drinking one of her dozen or so daily cups of coffee. She will have been up until two or three the previous morning, and up again at seven or seven-thirty, because she hardly seems to sleep. In some accounts -- particularly if the meeting took place in the winter -- she'll be wearing her white, fur-topped Dr. Zhivago boots with gold tights; but she may have on her platform tennis shoes, or the leather jacket with the little studs on it, or maybe an outrageous piece of costume jewelry, and, always, those huge, rhinestone-studded glasses that make her big eyes look positively enormous. "I have no idea why I asked you to come here, I have no job for you," Lois told Wendy Willrich when Willrich went to Lois's office in downtown Chicago a few years ago for an interview. But by the end of the interview Lois did have a job for her, because for Lois meeting someone is never just about meeting someone. If she likes you, she wants to recruit you into one of her grand schemes -- to sweep you up into her world. A while back, Lois called up Helen Doria, who was then working for someone on Chicago's city council, and said, "I don't have a job for you. Well, I might have a little job. I need someone to come over and help me clean up my office." By this, she meant that she had a big job for Helen but just didn't know what it was yet. Helen came, and, sure enough, Lois got her a big job.
Cindy Mitchell first met Lois twenty-three years ago, when she bundled up her baby and ran outside into one of those frigid Chicago winter mornings because some people from the Chicago Park District were about to cart away a beautiful sculpture of Carl von Linné from the park across the street. Lois happened to be driving by at the time, and, seeing all the commotion, she slammed on her brakes, charged out of her car -- all five feet of her -- and began asking Cindy questions, rat-a-tat-tat: "Who are you? What's going on here? Why do you care?" By the next morning, Lois had persuaded two Chicago Tribune reporters to interview Cindy and turn the whole incident into a cause célèbre, and she had recruited Cindy to join an organization she'd just started called Friends of the Parks, and then, when she found out that Cindy was a young mother at home who was too new in town to have many friends, she told her, "I've found a friend for you. Her name is Helen, and she has a little boy your kid's age, and you will meet her next week and the two of you will be best friends." That's exactly what happened, and, what's more, Cindy went on to spend ten years as president of Friends of the Park. "Almost everything that I do today and eighty to ninety per cent of my friends came about because of her, because of that one little chance meeting," Cindy says. "That's a scary thing. Try to imagine what would have happened if she had come by five minutes earlier."
It could be argued, of course, that even if Cindy hadn't met Lois on the street twenty-three years ago she would have met her somewhere else, maybe a year later or two years later or ten years later, or, at least, she would have met someone who knew Lois or would have met someone who knew someone who knew Lois, since Lois Weisberg is connected, by a very short chain, to nearly everyone. Weisberg is now the Commissioner of Cultural Affairs for the City of Chicago. But in the course of her seventy-three years she has hung out with actors and musicians and doctors and lawyers and politicians and activists and environmentalists, and once, on a whim, she opened a secondhand-jewelry store named for her granddaughter Becky Fyffe, and every step of the way Lois has made friends and recruited people, and a great many of those people have stayed with her to this day. "When we were doing the jazz festival, it turned out -- surprise, surprise -- that she was buddies with Dizzy Gillespie," one of her friends recalls. "This is a woman who cannot carry a tune. She has no sense of rhythm. One night Tony Bennett was in town, and so we hang out with Tony Bennett, hearing about the old days with him and Lois."
Once, in the mid-fifties, on a whim, Lois took the train to New York to attend the World Science Fiction Convention and there she met a young writer by the name of Arthur C. Clarke. Clarke took a shine to Lois, and next time he was in Chicago he called her up. "He was at a pay phone," Lois recalls. "He said, 'Is there anyone in Chicago I should meet?' I told him to come over to my house." Lois has a throaty voice, baked hard by half a century of nicotine, and she pauses between sentences to give herself the opportunity for a quick puff. Even when she's not smoking, she pauses anyway, as if to keep in practice. "I called Bob Hughes, one of the people who wrote for my paper." Pause. "I said, 'Do you know anyone in Chicago interested in talking to Arthur Clarke?' He said, 'Yeah, Isaac Asimov is in town. And this guy Robert, Robert...Robert Heinlein.' So they all came over and sat in my study." Pause. "Then they called over to me and they said, 'Lois' -- I can't remember the word they used. They had some word for me. It was something about how I was the kind of person who brings people together."
This is in some ways the archetypal Lois Weisberg story. First, she reaches out to somebody -- somebody outside her world. (At the time, she was running a drama troupe, whereas Arthur C. Clarke wrote science fiction.) Equally important, that person responds to her. Then there's the fact that when Arthur Clarke came to Chicago and wanted to meet someone Lois came up with Isaac Asimov. She says it was a fluke that Asimov was in town. But if it hadn't been Asimov it would have been someone else. Lois ran a salon out of her house on the North Side in the late nineteen-fifties, and one of the things that people remember about it is that it was always, effortlessly, integrated. Without that salon, blacks would still have socialized with whites on the North Side -- though it was rare back then, it happened. But it didn't happen by accident: it happened because a certain kind of person made it happen. That's what Asimov and Clarke meant when they said that Lois has this thing -- whatever it is -- that brings people together.
2.
Lois is a type -- a particularly rare and extraordinary type, but a type nonetheless. She's the type of person who seems to know everybody, and this type can be found in every walk of life. Someone I met at a wedding (actually, the wedding of the daughter of Lois's neighbors, the Newbergers) told me that if I ever went to Massapequa I should look up a woman named Marsha, because Marsha was the type of person who knew everybody. In Cambridge, Massachusetts, the word is that a tailor named Charlie Davidson knows everybody. In Houston, I'm told, there is an attorney named Harry Reasoner who knows everybody. There are probably Lois Weisbergs in Akron and Tucson and Paris and in some little town in the Yukon Territory, up by the Arctic Circle. We've all met someone like Lois Weisberg. Yet, although we all know a Lois Weisberg type, we don't know much about the Lois Weisberg type. Why is it, for example, that these few, select people seem to know everyone and the rest of us don't? And how important are the people who know everyone? This second question is critical, because once you begin even a cursory examination of the life of someone like Lois Weisberg you start to suspect that he or she may be far more important than we would ever have imagined -- that the people who know everyone, in some oblique way, may actually run the world. I don't mean that they are the sort who head up the Fed or General Motors or Microsoft, but that, in a very down-to-earth, day-to-day way, they make the world work. They spread ideas and information. They connect varied and isolated parts of society. Helen Doria says someone high up in the Chicago government told her that Lois is "the epicenter of the city administration," which is the right way to put it. Lois is far from being the most important or the most powerful person in Chicago. But if you connect all the dots that constitute the vast apparatus of government and influence and interest groups in the city of Chicago you'll end up coming back to Lois again and again. Lois is a connector.
Lois, it must be said, did not set out to know everyone. "She doesn't network for the sake of networking," says Gary Johnson, who was Lois's boss years ago, when she was executive director of the Chicago Council of Lawyers. "I just think she has the confidence that all the people in the world, whether she's met them or not, are in her Rolodex already, and that all she has to do is figure out how to reach them and she'll be able to connect with them."
Nor is Lois charismatic -- at least, not in the way that we think of extroverts and public figures as being charismatic. She doesn't fill a room; eyes don't swivel toward her as she makes her entrance. Lois has frizzy blond hair, and when she's thinking -- between her coffee and her cigarette -- she kneads the hair on the top of her head, so that by the end of a particularly difficult meeting it will be standing almost straight up. "She's not like the image of the Washington society doyenne," Gary Johnson says. "You know, one of those people who identify you, take you to lunch, give you the treatment. Her social life is very different. When I bump into her and she says, 'Oh, we should catch up,' what she means is that someday I should go with her to her office, and we'd go down to the snack bar and buy a muffin and then sit in her office while she answered the phone. For a real treat, when I worked with her at the Council of Lawyers she would take me to the dining room in the Wieboldt's department store." Johnson is an old-school Chicago intellectual who works at a fancy law firm and has a corner office with one of those Midwestern views in which, if you look hard enough, you can almost see Nebraska, and the memory of those lunches at Wieboldt's seems to fill him with delight. "Now, you've got to understand that the Wieboldt's department store -- which doesn't exist anymore -- was a notch below Field's, where the suburban society ladies have their lunch, and it's also a notch below Carson's," he says. "There was a kind of room there where people who bring their own string bags to go shopping would have a quick lunch. This was her idea of a lunch out. We're not talking Pamela Harriman here."
In the mid-eighties, Lois quit a job she'd had for four years, as director of special events in the administration of Harold Washington, and somehow hooked up with a group of itinerant peddlers who ran the city's flea markets. "There was this lady who sold jewelry," Lois said. "She was a person out of Dickens. She was bedraggled. She had a houseful of cats. But she knew how to buy jewelry, and I wanted her to teach me. I met her whole circle of friends, all these old gay men who had antique stores. Once a week, we would go to the Salvation Army." Lois was arguably the most important civic activist in the city. Her husband was a judge. She lived in a huge house in one of Chicago's nicest neighborhoods. Yet somehow she managed to be plausible as a flea-market peddler to a bunch of flea-market peddlers, the same way she managed to be plausible as a music lover to a musician like Tony Bennett. It doesn't matter who she's with or what she's doing; she always manages to be in the thick of things. "There was a woman I knew -- Sandra -- who had a kid in school with my son Joseph," Lois told me. Lois has a habit of telling stories that appear to be tangential and digressive but, on reflection, turn out to be parables of a sort. "She helped all these Asians living uptown. One day, she came over here and said there was this young Chinese man who wanted to meet an American family and learn to speak English better and was willing to cook for his room and board. Well, I'm always eager to have a cook, and especially a Chinese cook, because my family loves Chinese food. They could eat it seven days a week. So Sandra brought this man over here. His name was Shi Young. He was a graduate student at the Art Institute of Chicago." Shi Young lived with Lois and her family for two years, and during that time Chicago was in the midst of political turmoil. Harold Washington, who would later become the first black mayor of the city, was attempting to unseat the remains of the Daley political machine, and Lois's house, naturally, was the site of late-night, top-secret strategy sessions for the pro-Washington reformers of Chicago's North Side. "We'd have all these important people here, and Shi Young would come down and listen," Lois recalls. "I didn't think anything of it." But Shi Young, as it turns out, was going back up to his room and writing up what he heard for the China Youth Daily, a newspaper with a circulation in the tens of millions. Somehow, in the improbable way that the world works, a portal was opened up, connecting Chicago's North Side reform politics and the readers of the China Youth Daily, and that link was Lois's living room. You could argue that this was just a fluke -- just as it was a fluke that Isaac Asimov was in town and that Lois happened to be driving by when Cindy Mitchell came running out of her apartment. But sooner or later all those flukes begin to form a pattern.
3.
In the late nineteen-sixties, a Harvard social psychologist named Stanley Milgram conducted an experiment in an effort to find an answer to what is known as the small-world problem, though it could also be called the Lois Weisberg problem. It is this: How are human beings connected? Do we belong to separate worlds, operating simultaneously but autonomously, so that the links between any two people, anywhere in the world, are few and distant? Or are we all bound up together in a grand, interlocking web? Milgram's idea was to test this question with a chain letter. For one experiment, he got the names of a hundred and sixty people, at random, who lived in Omaha, Nebraska, and he mailed each of them a packet. In the packet was the name and address of a stockbroker who worked in Boston and lived in Sharon, Massachusetts. Each person was instructed to write his name on a roster in the packet and send it on to a friend or acquaintance who he thought would get it closer to the stockbroker. The idea was that when the letters finally arrived at the stockbroker's house Milgram could look at the roster of names and establish how closely connected someone chosen at random from one part of the country was to another person chosen at random in another part. Milgram found that most of the letters reached the stockbroker in five or six steps. It is from this experiment that we got the concept of six degrees of separation.
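It is worth pausing on the arithmetic here, because a toy simulation makes the result less mysterious. The sketch below is only an illustration, not Milgram's procedure: it assumes Python with the networkx library, and the population size and wiring parameters are invented. It builds a "small-world" acquaintance network -- mostly local ties, plus a handful of random shortcuts -- and measures the separation between randomly chosen pairs. (Milgram's subjects forwarded letters by guesswork, so his five or six steps are, if anything, longer than the true shortest paths.)

    import random
    import networkx as nx

    random.seed(0)
    # 5,000 people, each with about ten acquaintances; five per cent of ties are
    # rewired to random strangers -- the shortcuts that shrink the world.
    G = nx.connected_watts_strogatz_graph(n=5000, k=10, p=0.05)

    pairs = [(random.randrange(5000), random.randrange(5000)) for _ in range(200)]
    steps = [nx.shortest_path_length(G, a, b) for a, b in pairs if a != b]
    print("average separation:", sum(steps) / len(steps))  # typically a single-digit figure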
That phrase is now so familiar that it is easy to lose sight of how surprising Milgram's finding was. Most of us don't have particularly diverse groups of friends. In one well-known study, two psychologists asked people living in the Dyckman public-housing project, in uptown Manhattan, about their closest friend in the project; almost ninety per cent of the friends lived in the same building, and half lived on the same floor. In general, people chose friends of similar age and race. But if the friend lived down the hall, both age and race became a lot less important. Proximity overpowered similarity. Another study, involving students at the University of Utah, found that if you ask someone why he is friendly with someone else he'll say that it is because they share similar attitudes. But if you actually quiz the pairs of students on their attitudes you'll find out that this is an illusion, and that what friends really tend to have in common are activities. We're friends with the people we do things with, not necessarily with the people we resemble. We don't seek out friends; we simply associate with the people who occupy the same physical places that we do: People in Omaha are not, as a rule, friends with people who live in Sharon, Massachusetts. So how did the packets get halfway across the country in just five steps? "When I asked an intelligent friend of mine how many steps he thought it would take, he estimated that it would require 100 intermediate persons or more to move from Nebraska to Sharon," Milgram wrote. "Many people make somewhat similar estimates, and are surprised to learn that only five intermediaries will -- on the average -- suffice. Somehow it does not accord with intuition."
The explanation is that in the six degrees of separation not all degrees are equal. When Milgram analyzed his experiments, for example, he found that many of the chains reaching to Sharon followed the same asymmetrical pattern. Twenty-four packets reached the stockbroker at his home, in Sharon, and sixteen of those were given to him by the same person, a clothing merchant whom Milgram calls Mr. Jacobs. The rest of the packets were sent to the stockbroker at his office, and of those the majority came through just two men, whom Milgram calls Mr. Brown and Mr. Jones. In all, half of the responses that got to the stockbroker were delivered to him by these three people. Think of it. Dozens of people, chosen at random from a large Midwestern city, sent out packets independently. Some went through college acquaintances. Some sent their packets to relatives. Some sent them to old workmates. Yet in the end, when all those idiosyncratic chains were completed, half of the packets passed through the hands of Jacobs, Jones, and Brown. Six degrees of separation doesn't simply mean that everyone is linked to everyone else in just six steps. It means that a very small number of people are linked to everyone else in a few steps, and the rest of us are linked to the world through those few.
There's an easy way to explore this idea. Suppose that you made a list of forty people whom you would call your circle of friends (not including family members or co-workers), and you worked backward from each person until you could identify who was ultimately responsible for setting in motion the series of connections which led to that friendship. I met my oldest friend, Bruce, for example, in first grade, so I'm the responsible party. That's easy. I met my college friend Nigel because he lived down the hall in the dormitory from Tom, whom I had met because in my freshman year he invited me to play touch football. Tom, then, is responsible for Nigel. Once you've made all the connections, you will find the same names coming up again and again. I met my friend Amy when she and her friend Katie came to a restaurant where I was having dinner. I know Katie because she is best friends with my friend Larissa, whom I know because I was told to look her up by a mutual friend, Mike A., whom I know because he went to school with another friend of mine, Mike H., who used to work at a political weekly with my friend Jacob. No Jacob, no Amy. Similarly, I met my friend Sarah S. at a birthday party a year ago because she was there with a writer named David, who was there at the invitation of his agent, Tina, whom I met through my friend Leslie, whom I know because her sister Nina is best friends with my friend Ann, whom I met through my old roommate Maura, who was my roommate because she had worked with a writer named Sarah L., who was a college friend of my friend Jacob. No Jacob, no Sarah S. In fact, when I go down my list of forty friends, thirty of them, in one way or another, lead back to Jacob. My social circle is really not a circle but an inverted pyramid. And the capstone of the pyramid is a single person, Jacob, who is responsible for an overwhelming majority of my relationships. Jacob's full name, incidentally, is Jacob Weisberg. He is Lois Weisberg's son.
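The bookkeeping behind this exercise is simple enough to sketch in a few lines of Python. The names are the ones from the paragraph above, and the chains are abbreviated; the point is only the procedure -- record who introduced whom, walk each chain back to its origin, and tally the origins.

    from collections import Counter

    # Who introduced me to each friend (None means I made the acquaintance on my own).
    introduced_by = {
        "Bruce": None, "Tom": None, "Jacob": None,
        "Nigel": "Tom",
        "Mike H.": "Jacob", "Mike A.": "Mike H.",
        "Larissa": "Mike A.", "Katie": "Larissa", "Amy": "Katie",
    }

    def origin(friend):
        # Walk the chain of introductions back to whoever set it in motion.
        while introduced_by.get(friend) is not None:
            friend = introduced_by[friend]
        return friend

    tally = Counter(origin(f) for f in introduced_by)
    print(tally.most_common())  # [('Jacob', 6), ('Tom', 2), ('Bruce', 1)]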
This isn't to say, though, that Jacob is just like Lois. Jacob may be the capstone of my pyramid, but Lois is the capstone of lots and lots of people's pyramids, and that makes her social role different. In Milgram's experiment, Mr. Jacobs the clothing merchant was the person to go through to get to the stockbroker. Lois is the kind of person you would use to get to the stockbrokers of Sharon and also the cabaret singers of Sharon and the barkeeps of Sharon and the guy who gave up a thriving career in orthodontics to open a small vegetarian falafel hut.
4.
There is another way to look at this question, and that's through the popular parlor game Six Degrees of Kevin Bacon. The idea behind the game is to try to link in fewer than six steps any actor or actress, through the movies they've been in, to the actor Kevin Bacon. For example, O. J. Simpson was in "Naked Gun" with Priscilla Presley, who was in "The Adventures of Ford Fairlane" with Gilbert Gottfried, who was in "Beverly Hills Cop II" with Paul Reiser, who was in "Diner" with Kevin Bacon. That's four steps. Mary Pickford was in "Screen Snapshots" with Clark Gable, who was in "Combat America" with Tony Romano, who, thirty-five years later, was in "Starting Over" with Bacon. That's three steps. What's funny about the game is that Bacon, although he is a fairly young actor, has already been in so many movies with so many people that there is almost no one to whom he can't be easily connected. Recently, a computer scientist at the University of Virginia by the name of Brett Tjaden actually sat down and figured out what the average degree of connectedness is for the quarter million or so actors and actresses listed in the Internet Movie Database: he came up with 2.8312 steps. That sounds impressive, except that Tjaden then went back and performed an even more heroic calculation, figuring out what the average degree of connectedness was for everyone in the database. Bacon, it turns out, ranks only six hundred and sixty-eighth. Martin Sheen, by contrast, can be connected, on average, to every other actor, in 2.63681 steps, which puts him almost six hundred and fifty places higher than Bacon. Elliott Gould can be connected even more quickly, in 2.63601. Among the top fifteen are people like Robert Mitchum, Gene Hackman, Donald Sutherland, Rod Steiger, Shelley Winters, and Burgess Meredith.
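Tjaden's numbers come from a straightforward, if enormous, graph calculation: link any two actors who have shared a film, then average each actor's distance to every other actor. Here is a toy version, assuming Python with networkx; the cast lists are invented stand-ins for the quarter-million-name movie database.

    from itertools import combinations
    import networkx as nx

    films = {  # hypothetical cast lists, for illustration only
        "Film A": ["Bacon", "Reiser", "Gould"],
        "Film B": ["Gould", "Meredith"],
        "Film C": ["Meredith", "Sheen", "Hackman"],
        "Film D": ["Hackman", "Bacon"],
    }

    G = nx.Graph()
    for cast in films.values():
        G.add_edges_from(combinations(cast, 2))  # co-stars become linked actors

    for actor in sorted(G):
        distances = nx.shortest_path_length(G, source=actor)  # steps to every other actor
        others = [d for other, d in distances.items() if other != actor]
        print(actor, round(sum(others) / len(others), 4))  # lower score = better connected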
Why is Kevin Bacon so far behind these actors? Recently, in the journal Nature, the mathematicians Duncan Watts and Steven Strogatz published a dazzling theoretical explanation of connectedness, but a simpler way to understand this question is to look at who Bacon is. Obviously, he is a lot younger than the people at the top of the list are and has made fewer movies. But that accounts for only some of the difference. A top-twenty person, like Burgess Meredith, made a hundred and fourteen movies in the course of his career. Gary Cooper, though, starred in about the same number of films and ranks only eight hundred and seventy-eighth, with a 2.85075 score. John Wayne made a hundred and eighty-three movies in his fifty-year career and still ranks only a hundred and sixteenth, at 2.7173. What sets someone like Meredith apart is his range. More than half of John Wayne's movies were Westerns, and that means he made the same kind of movie with the same kind of actors over and over again. Burgess Meredith, by contrast, was in great movies, like the Oscar-winning "Of Mice and Men" (1939), and in dreadful movies, like "Beware! The Blob" (1972). He was nominated for an Oscar for his role in "The Day of the Locust" and also made TV commercials for Skippy peanut butter. He was in four "Rocky" movies, and also played Don Learo in Godard's "King Lear." He was in schlocky made-for-TV movies, in B movies that pretty much went straight to video, and in pictures considered modern classics. He was in forty-two dramas, twenty-two comedies, eight adventure films, seven action films, five sci-fi films, five horror flicks, five Westerns, five documentaries, four crime movies, four thrillers, three war movies, three films noir, two children's films, two romances, two mysteries, one musical, and one animated film. Burgess Meredith was the kind of actor who was connected to everyone because he managed to move up and down and back and forth among all the different worlds and subcultures that the acting profession has to offer. When we say, then, that Lois Weisberg is the kind of person who "knows everyone," we mean it in precisely this way. It is not merely that she knows lots of people. It is that she belongs to lots of different worlds.
In the nineteen-fifties, Lois started her drama troupe in Chicago. The daughter of a prominent attorney, she was then in her twenties, living in one of the suburbs north of the city with two small children. In 1956, she decided to stage a festival to mark the centenary of George Bernard Shaw's birth. She hit up the reclusive billionaire John D. MacArthur for money. ("I go to the Pump Room for lunch. Booth One. There is a man, lurking around a pillar, with a cowboy hat and dirty, dusty boots. It's him.") She invited William Saroyan and Norman Thomas to speak on Shaw's legacy; she put on Shaw plays in theatres around the city; and she got written up in Life. She then began putting out a newspaper devoted to Shaw, which mutated into an underground alternative weekly called the Paper. By then, Lois was living in a big house on Chicago's near North Side, and on Friday nights people from the Paper gathered there for editorial meetings. William Friedkin, who went on to direct "The French Connection" and "The Exorcist," was a regular, and so were the attorney Elmer Gertz (who won parole for Nathan Leopold) and some of the editors from Playboy, which was just up the street. People like Art Farmer and Thelonious Monk and Dizzy Gillespie and Lenny Bruce would stop by when they were in town. Bruce actually lived in Lois's house for a while. "My mother was hysterical about it, especially one day when she rang the doorbell and he answered in a bath towel," Lois told me. "We had a window on the porch, and he didn't have a key, so the window was always left open for him. There were a lot of rooms in that house, and a lot of people stayed there and I didn't know they were there." Pause. Puff. "I never could stand his jokes. I didn't really like his act. I couldn't stand all the words he was using."
Lois's first marriage -- to a drugstore owner named Leonard Solomon -- was breaking up around this time, so she took a job doing public relations for an injury-rehabilitation institute. From there, she went to work for a public-interest law firm called B.P.I., and while she was at B.P.I. she became concerned about the fact that Chicago's parks were neglected and crumbling, so she gathered together a motley collection of nature lovers, historians, civic activists, and housewives, and founded the lobbying group Friends of the Parks. Then she became alarmed on discovering that a commuter railroad that ran along the south shore of Lake Michigan -- from South Bend to Chicago -- was about to shut down, so she gathered together a motley collection of railroad enthusiasts and environmentalists and commuters, and founded South Shore Recreation, thereby saving the railroad. Lois loved the railroad buffs. "They were all good friends of mine," she says. "They all wrote to me. They came from California. They came from everywhere. We had meetings. They were really interesting. I came this close" -- and here she held her index finger half an inch above her thumb -- "to becoming one of them." Instead, though, she became the executive director of the Chicago Council of Lawyers, a progressive bar association. Then she ran Congressman Sidney Yates's reëlection campaign. Then her sister June introduced her to someone who got her the job with Mayor Washington. Then she had her flea-market period. Finally, she went to work for Mayor Daley as Chicago's Commissioner of Cultural Affairs.
If you go through that history and keep count, the number of worlds that Lois has belonged to comes to eight: the actors, the writers, the doctors, the lawyers, the park lovers, the politicians, the railroad buffs, and the flea-market aficionados. When I asked Lois to make her own list, she added musicians and the visual artists and architects and hospitality-industry people whom she works with in her current job. But if you looked harder at Lois's life you could probably subdivide her experiences into fifteen or twenty worlds. She has the same ability to move among different subcultures and niches that the busiest actors do. Lois is to Chicago what Burgess Meredith is to the movies.
Lois was, in fact, a friend of Burgess Meredith. I learned this by accident, which is the way I learned about most of the strange celebrity details of Lois's life, since she doesn't tend to drop names. It was when I was with her at her house one night, a big, rambling affair just off the lakeshore, with room after room filled with odds and ends and old photographs and dusty furniture and weird bric-a-brac, such as a collection of four hundred antique egg cups. She was wearing bluejeans and a flowery-print top and she was smoking Carlton Menthol 100s and cooking pasta and holding forth to her son Joe on the subject of George Bernard Shaw, when she started talking about Burgess Meredith. "He was in Chicago in a play called 'Teahouse of the August Moon,' in 1956," she said, "and he came to see my production of 'Back to Methuselah,' and after the play he came up to me and said he was teaching acting classes, and asked would I come and talk to his class about Shaw. Well, I couldn't say no." Meredith liked Lois, and when she was running her alternative newspaper he would write letters and send in little doodles, and later she helped him raise money for a play he was doing called "Kicks and Company." It starred a woman named Nichelle Nichols, who lived at Lois's house for a while. "Nichelle was a marvellous singer and dancer," Lois said. "She was the lead. She was also the lady on the first..." Lois was doing so many things at once -- chopping and stirring and smoking and eating and talking -- that she couldn't remember the name of the show that made Nichols a star. "What's that space thing?" She looked toward Joe for help. He started laughing. "Star something," she said. "'Star...Star Trek'! Nichelle was Lieutenant Uhura!"
5.
On a sunny morning not long ago, Lois went to a little café just off the Magnificent Mile, in downtown Chicago, to have breakfast with Mayor Daley. Lois drove there in a big black Mercury, a city car. Lois always drives big cars, and, because she is so short and the cars are so big, all that you can see when she drives by is the top of her frizzy blond head and the lighted ember of her cigarette. She was wearing a short skirt and a white vest and was carrying a white cloth shopping bag. Just what was in the bag was unclear, since Lois doesn't have a traditional relationship to the trappings of bureaucracy. Her office, for example, does not have a desk in it, only a sofa and chairs and a coffee table. At meetings, she sits at the head of a conference table in the adjoining room, and, as often as not, has nothing in front of her except a lighter, a pack of Carltons, a cup of coffee, and an octagonal orange ceramic ashtray, which she moves a few inches forward or a few inches back when she's making an important point, or moves a few inches to the side when she is laughing at something really funny and feels the need to put her head down on the table.
Breakfast was at one of the city's tourist centers. The Mayor was there in a blue suit, and he had two city officials by his side and a very serious and thoughtful expression on his face. Next to him was a Chicago developer named Al Friedman, a tall and slender and very handsome man who is the chairman of the Commission on Chicago Landmarks. Lois sat across from them, and they all drank coffee and ate muffins and batted ideas back and forth in the way that people do when they know each other very well. It was a "power breakfast," although if you went around the table you'd find that the word "power" meant something very different to everyone there. Al Friedman is a rich developer. The Mayor, of course, is the administrative leader of one of the largest cities in the country. When we talk about power, this is usually what we're talking about: money and authority. But there is a third kind of power as well -- the kind Lois has -- which is a little less straightforward. It's social power.
At the end of the nineteen-eighties, for example, the City of Chicago razed an entire block in the heart of downtown and then sold it to a developer. But before he could build on it the real-estate market crashed. The lot was an eyesore. The Mayor asked for ideas about what to do with it. Lois suggested that they cover the block with tents. Then she heard that Keith Haring had come to Chicago in 1989 and worked with Chicago high-school students to create a giant five-hundred-foot-long mural. Lois loved the mural. She began to think. She'd long had a problem with the federal money that Chicago got every year to pay for summer jobs for disadvantaged kids. She didn't think it helped any kid to be put to work picking up garbage. So why not pay the kids to do arts projects like the Haring mural, and put the whole program in the tents? She called the program Gallery 37, after the number of the block. She enlisted the help of the Mayor's wife, Maggie Daley, whose energy and clout were essential in order to make the program a success. Lois hired artists to teach the kids. She realized, though, that the federal money was available only for poor kids, and, Lois says, "I don't believe poor kids can advance in any way by being lumped together with other poor kids." So Lois raised money privately to bring in middle-income kids, to mix with the poor kids and be put in the tents with the artists. She started small, with two hundred and sixty "apprentices" the first year, 1990. This year, there were more than three thousand. The kids study sculpture, painting, drawing, poetry, theatre, graphic design, dance, textile design, jewelry-making, and music. Lois opened a store downtown, where students' works of art are sold. She has since bought two buildings to house the project full time. She got the Parks Department to run Gallery 37 in neighborhoods around the city, and the Board of Education to let them run it as an after-school program in public high schools. It has been copied all around the world. Last year, it was given the Innovations in American Government Award by the Ford Foundation and the Harvard school of government.
Gallery 37 is at once a jobs program, an arts program, a real-estate fix, a schools program, and a parks program. It involves federal money and city money and private money, stores and buildings and tents, Maggie Daley and Keith Haring, poor kids and middle-class kids. It is everything, all at once -- a jumble of ideas and people and places which Lois somehow managed to make sense of. The ability to assemble all these disparate parts is, as should be obvious, a completely different kind of power from the sort held by the Mayor and Al Friedman. The Mayor has key allies on the city council or in the statehouse. Al Friedman can do what he does because, no doubt, he has a banker who believes in him, or maybe a lawyer whom he trusts to negotiate the twists and turns of the zoning process. Their influence is based on close relationships. But when Lois calls someone to help her put together one of her projects, chances are she's not calling someone she knows particularly well. Her influence suggests something a little surprising -- that there is also power in relationships that are not close at all.
6.
The sociologist Mark Granovetter examined this question in his classic 1974 book "Getting a Job." Granovetter interviewed several hundred professional and technical workers from the Boston suburb of Newton, asking them in detail about their employment history. He found that almost fifty-six per cent of those he talked to had found their jobs through a personal connection, about twenty per cent had used formal means (advertisements, headhunters), and another twenty per cent had applied directly. This much is not surprising: the best way to get in the door is through a personal contact. But the majority of those personal connections, Granovetter found, did not involve close friends. They were what he called "weak ties." Of those who used a contact to find a job, for example, only 16.7 per cent saw that contact "often," as they would have if the contact had been a good friend; 55.6 per cent saw their contact only "occasionally"; and 27.8 per cent saw the contact "rarely." People were getting their jobs not through their friends but through acquaintances.
Granovetter argues that when it comes to finding out about new jobs -- or, for that matter, gaining new information, or looking for new ideas -- weak ties tend to be more important than strong ties. Your friends, after all, occupy the same world that you do. They work with you, or live near you, and go to the same churches, schools, or parties. How much, then, do they know that you don't know? Mere acquaintances, on the other hand, are much more likely to know something that you don't. To capture this apparent paradox, Granovetter coined a marvellous phrase: "the strength of weak ties." The most important people in your life are, in certain critical realms, the people who aren't closest to you, and the more people you know who aren't close to you the stronger your position becomes.
Granovetter then looked at what he called "chain lengths" -- that is, the number of people who had to pass along the news about your job before it got to you. A chain length of zero means that you learned about your job from the person offering it. A chain length of one means that you heard about the job from someone who had heard about the job from the employer. The people who got their jobs from a zero chain were the most satisfied, made the most money, and were unemployed for the shortest amount of time between jobs. People with a chain of one stood second in the amount of money they made, in their satisfaction with their jobs, and in the speed with which they got their jobs. People with a chain of two stood third in all three categories, and so on. If you know someone who knows someone who knows someone who has lots of acquaintances, in other words, you have a leg up. If you know someone who knows someone who has lots of acquaintances, your chances are that much better. But if you know someone who has lots of acquaintances -- if you know someone like Lois -- you are still more fortunate, because suddenly you are just one step away from musicians and actors and doctors and lawyers and park lovers and politicians and railroad buffs and flea-market aficionados and all the other weak ties that make Lois so strong.
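The comparison behind those rankings can be pictured with a few invented survey records -- these are stand-ins, not Granovetter's data: each respondent reports the chain length behind the job offer, and the records are then grouped by that length.

    from statistics import mean

    # (chain length, how often the respondent saw the contact, starting salary) -- invented records
    records = [
        (0, "occasionally", 62000), (0, "rarely", 60000),
        (1, "occasionally", 54000), (1, "rarely", 52000),
        (2, "rarely", 47000), (2, "occasionally", 46000),
    ]

    for length in sorted({r[0] for r in records}):
        salaries = [salary for chain, _, salary in records if chain == length]
        print("chain length", length, "-> average salary", mean(salaries))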
This sounds like a reformulation of the old saw that it's not what you know, it's who you know. It's much more radical than that, though. The old idea was that people got ahead by being friends with rich and powerful people -- which is true, in a limited way, but as a practical lesson in how the world works is all but useless. You can expect that Bill Gates's godson is going to get into Harvard and have a fabulous job waiting for him when he gets out. And, of course, if you play poker with the Mayor and Al Friedman it is going to be a little easier to get ahead in Chicago. But how many godsons can Bill Gates have? And how many people can fit around a poker table? This is why affirmative action seems pointless to so many people: It appears to promise something -- entry to the old-boy network -- that it can't possibly deliver. The old-boy network is always going to be just for the old boys.
Granovetter, by contrast, argues that what matters in getting ahead is not the quality of your relationships but the quantity -- not how close you are to those you know but, paradoxically, how many people you know whom you aren't particularly close to. What he's saying is that the key person at that breakfast in downtown Chicago is not the Mayor or Al Friedman but Lois Weisberg, because Lois is the kind of person who it really is possible for most of us to know. If you think about the world in this way, the whole project of affirmative action suddenly starts to make a lot more sense. Minority-admissions programs work not because they give black students access to the same superior educational resources as white students, or access to the same rich cultural environment as white students, or any other formal or grandiose vision of engineered equality. They work by giving black students access to the same white students as white students -- by allowing them to make acquaintances outside their own social world and so shortening the chain lengths between them and the best jobs.
This idea should also change the way we think about helping the poor. When we're faced with an eighteen-year-old high-school dropout whose only career option is making five dollars and fifty cents an hour in front of the deep fryer at Burger King, we usually talk about the importance of rebuilding inner-city communities, attracting new jobs to depressed areas, and re-investing in neglected neighborhoods. We want to give that kid the option of another, better-paying job, right down the street. But does that really solve his problem? Surely what that eighteen-year-old really needs is not another marginal inducement to stay in his neighborhood but a way to get out of his neighborhood altogether. He needs a school system that provides him with the skills to compete for jobs with middle-class kids. He needs a mass-transit system to take him to the suburbs, where the real employment opportunities are. And, most of all, he needs to know someone who knows someone who knows where all those good jobs are. If the world really is held together by people like Lois Weisberg, in other words, how poor you are can be defined quite simply as how far you have to go to get to someone like her. Wendy Willrich and Helen Doria and all the countless other people in Lois's circle needed to make only one phone call. They are well-off. The dropout wouldn't even know where to start. That's why he's poor. Poverty is not deprivation. It is isolation.
7.
I once met a man named Roger Horchow. If you ever go to Dallas and ask around about who is the kind of person who might know everyone, chances are you will be given his name. Roger is slender and composed. He talks slowly, with a slight Texas drawl. He has a kind of wry, ironic charm that is utterly winning. If you sat next to him on a plane ride across the Atlantic, he would start talking as the plane taxied to the runway, you would be laughing by the time the seat-belt sign was turned off, and when you landed at the other end you'd wonder where the time had gone.
I met Roger through his daughter Sally, whose sister Lizzie went to high school in Dallas with my friend Sara M., whom I know because she used to work with Jacob Weisberg. (No Jacob, no Roger.) Roger spent at least part of his childhood in Ohio, which is where Lois's second husband, Bernie Weisberg, grew up, so I asked Roger if he knew Bernie. It would have been a little too apt if he did -- that would have made it all something out of "The X-Files" -- but instead of just answering, "Sorry, I don't," which is what most of us would have done, he paused for a long time, as if to flip through the "W"s in his head, and then said, "No, but I'm sure if I made two phone calls..."
Roger has a very good memory for names. One time, he says, someone was trying to talk him into investing his money in a business venture in Spain, and when he asked the names of the other investors he recognized one of them as the same man with whom one of his ex-girlfriends had had a fling during her junior year abroad, fifty years before. Roger sends people cards on their birthdays: he has a computerized Rolodex with sixteen hundred names on it. When I met him, I became convinced that these techniques were central to the fact that he knew everyone -- that knowing everyone was a kind of skill. Horchow is the founder of the Horchow Collection, the first high-end mail-order catalogue, and I kept asking him how all the connections in his life had helped him in the business world, because I thought that this particular skill had to have been cultivated for a reason. But the question seemed to puzzle him. He didn't think of his people collection as a business strategy, or even as something deliberate. He just thought of it as something he did -- as who he was. One time, Horchow said, a close friend from childhood suddenly resurfaced. "He saw my catalogue and knew it had to be me, and when he was out here he showed up on my doorstep. I hadn't seen him since I was seven. We had zero in common. It was wonderful." The juxtaposition of those last two sentences was not ironic; he meant it.
In the book "The Language Instinct," the psychologist Steven Pinker argues against the idea that language is a cultural artifact -- something that we learn "the way we learn to tell time." Rather, he says, it is innate. Language develops "spontaneously," he writes, "without conscious effort or formal instruction," and "is deployed without awareness of its underlying logic.... People know how to talk in more or less the sense that spiders know how to spin webs." The secret to Roger Horchow and Lois Weisberg is, I think, that they have a kind of social equivalent of that instinct -- an innate and spontaneous and entirely involuntary affinity for people. They know everyone because -- in some deep and less than conscious way -- they can't help it.
8.
Once, in the very early nineteen-sixties, after Lois had broken up with her first husband, she went to a party for Ralph Ellison, who was then teaching at the University of Chicago. There she spotted a young lawyer from the South Side named Bernie Weisberg. Lois liked him. He didn't notice her, though, so she decided to write a profile of him for the Hyde Park Herald. It ran with a huge headline. Bernie still didn't call. "I had to figure out how I was going to get to meet him again, so I remembered that he was standing in line at the reception with Ralph Ellison," Lois says. "So I called up Ralph Ellison" -- whom she had never met -- "and said, 'It's so wonderful that you are in Chicago. You really should meet some people on the North Side. Would it be O.K. if I have a party for you?'" He said yes, and Lois sent out a hundred invitations, including one to Bernie. He came. He saw Dizzy Gillespie in the kitchen and Ralph Ellison in the living room. He was impressed. He asked Lois to go with him to see Lenny Bruce. Lois was mortified; she didn't want this nice Jewish lawyer from the South Side to know that she knew Lenny Bruce, who was, after all, a drug addict. "I couldn't get out of it," she said. "They sat us down at a table right at the front, and Lenny keeps coming over to the edge of the stage and saying" -- here Lois dropped her voice down very low -- "'Hello, Lois.' I was sitting there like this." Lois put her hands on either side of her face. "Finally I said to Bernie, 'There are some things I should tell you about. Lenny Bruce is a friend of mine. He's staying at my house. The second thing is I'm defending a murderer.'" (But that's another story.) Lois and Bernie were married a year later.
The lesson of this story isn't obvious until you diagram it culturally: Lois got to Bernie through her connections with Ralph Ellison and Lenny Bruce, one of whom she didn't know (although later, naturally, they became great friends) and one of whom she was afraid to say that she knew, and neither of whom, it is safe to speculate, had ever really been connected with each other before. It seems like an absurdly roundabout way to meet someone. Here was a thirtyish liberal Jewish intellectual from the North Side of Chicago trying to meet a thirtyish liberal Jewish intellectual from the South Side of Chicago, and to get there she charted a cross-cultural social course through a black literary lion and an avant-garde standup comic. Yet that's a roundabout journey only if you perceive the worlds of Lenny Bruce and Ralph Ellison and Bernie Weisberg to be impossibly isolated. If you don't -- if, like Lois, you see them all as three points of an equilateral triangle -- then it makes perfect sense. The social instinct makes everyone seem like part of a whole, and there is something very appealing about this, because it means that people like Lois aren't bound by the same categories and partitions that defeat the rest of us. This is what the power of the people who know everyone comes down to in the end. It is not -- as much as we would like to believe otherwise -- something rich and complex, some potent mixture of ambition and energy and smarts and vision and insecurity. It's much simpler than that. It's the same lesson they teach in Sunday school. Lois knows lots of people because she likes lots of people. And all those people Lois knows and likes invariably like her, too, because there is nothing more irresistible to a human being than to be unqualifiedly liked by another.
Not long ago, Lois took me to a reception at the Museum of Contemporary Art, in Chicago -- a brand-new, Bauhaus-inspired building just north of the Loop. The gallery space was impossibly beautiful -- cool, airy, high-ceilinged. The artist on display was Chuck Close. The crowd was sleek and well groomed. Black-clad young waiters carried pesto canapés and glasses of white wine. Lois seemed a bit lost. She can be a little shy sometimes, and at first she stayed on the fringes of the room, standing back, observing. Someone important came over to talk to her. She glanced up uncomfortably. I walked away for a moment to look at the show, and when I came back her little corner had become a crowd. There was her friend from the state legislature. A friend in the Chicago Park District. A friend from her neighborhood. A friend in the consulting business. A friend from Gallery 37. A friend from the local business-development group. And on and on. They were of all ages and all colors, talking and laughing, swirling and turning in a loose circle, and in the middle, nearly hidden by the commotion, was Lois, clutching her white bag, tiny and large-eyed, at that moment the happiest person in the room.
Running from Ritalin
February 2, 1999
BOOKS
Is the hectic pace of contemporary
life really to blame for A.D.D.? Not so fast.
1.
There has always been a temptation in American culture to think of drugs as social metaphors. In the early sixties, the pharmaceutical metaphor for the times was Valium. During the sexual revolution, it was the Pill, and that was followed, in quick succession, by marijuana in the nineteen-seventies, cocaine in the nineteen-eighties, and Prozac in the early nineteen-nineties. Today, of course, the drug that has come to symbolize our particular predicaments is Ritalin, the widely prescribed treatment for attention-deficit hyperactivity disorder, or attention-deficit disorder, as it is more popularly known. In his new book, "The Hyperactivity Hoax," the neuropsychiatrist Sydney Walker calls attention disorders and the rise of Ritalin "symptoms of modern life, rather than symptoms of modern disease." In "Ritalin Nation" the psychologist Richard DeGrandpre argues that Ritalin and A.D.H.D. are the inevitable by-products of a culture-wide addiction to speed--to cellular phones and beepers and faxes and overnight mail and computers with powerful chips and hard-driving rock music and television shows that splice together images at hundredth-of-a-second intervals, and a thousand other social stimulants that have had the effect of transforming human expectations. The soaring use of Ritalin, the physician Lawrence Diller concludes in his new book, "Running on Ritalin," reveals something about the kind of society we are at the turn of the millennium.... It throws a spotlight on some of our most sensitive issues: what kind of parents we are, what kind of schools we have, what kind of health care is available to us. It brings into question our cultural standards for behavior, performance, and punishment; it reaches into the workplace, the courts and the halls of Congress. It highlights the most basic psychological conundrum of nature versus nurture, and it raises fundamental philosophical questions about the nature of free will and responsibility.
In a recent Time cover story on Ritalin, the mother of a child with A.D.H.D. is described as tearing up her daughter's Ritalin prescription. "I thought, maybe there is something else we can do," she says. "I knew that medicine can mask things." That is the kind of question that Ritalin provokes--not the simple, traditional "Will this cure my child?" but the harder, postmodern question "In curing my child, what deeper pathology might this drug be hiding?"
It's important that we ask questions like this, particularly of drugs that are widely used. The problem with Ritalin is that many of the claims made to support the drug's status as a symbol turn out, on closer examination, to be vague or confusing. Diller, DeGrandpre, and Walker are all, for example, deeply suspicious of our reliance on Ritalin. They think that it is overprescribed--that it is being used to avoid facing broader questions about our values and our society. This sounds plausible: the amount of Ritalin consumed in the United States has more than tripled since 1990. Then again, it has been only in the last ten years that clinical trials have definitively proved that Ritalin is effective in treating A.D.H.D. And, even with that dramatic increase, the proportion of American children taking Ritalin is estimated to be one or two per cent. Given that most estimates put the incidence of A.D.H.D. at between three and five per cent, are too many children taking the drug--or too few? "You really run into problems with teen-agers," William Pelham, a professor of psychology at SUNY Buffalo and a prominent A.D.H.D. expert, told me. "They don't want to take this medication. They don't feel they need to. It's part of the oppositional stuff you run into. The kids whom you most want to take it are the ones who are aggressive, and they are the most likely to blow it off."
Or consider how A.D.H.D. is defined. According to the Diagnostic and Statistical Manual-IV, a child has A.D.H.D. if, for a period of six months, he or she exhibits at least six symptoms from a list of behavioral signs. Among them: "often has difficulty organizing tasks and activities," "often does not seem to listen when spoken to directly," "is often easily distracted by extraneous stimuli," "is often 'on the go' or acts as if 'driven by a motor,'" "often blurts out answers before questions have been completed," and so on. "Ritalin Nation" argues that all these are essentially symptoms of boredom--the impatience of those used to the rapid-fire pace of MTV, Nintendo, and the rest of contemporary culture. The A.D.H.D. child blurts out answers before questions have been completed because, DeGrandpre says, "listening is usually a waiting situation that provides a low level of stimulation." The A.D.H.D. child is easily distracted because, "by definition, extraneous stimuli are novel." Give A.D.H.D. kids something novel to do, something that can satisfy their addiction, DeGrandpre argues, and they'll be fine. Diller works with a different definition of A.D.H.D. but comes to some of the same conclusions. High-stimulus activities like TV and video games "constitute a strange sort of good-fit situation for distractible children," he writes. "These activities are among the few things they can concentrate on well."
2.
When A.D.H.D. kids are actually tested on activities like video games, however, this alleged "good fit" disappears. Rosemary Tannock, a behavioral scientist at the Hospital for Sick Children, in Toronto, recently looked at how well a group of boys between the ages of eight and twelve actually did at Pac Man and Super Mario World, and she found that the ones with A.D.H.D. completed fewer levels and had to restart more games than their unaffected peers. "They often failed to inhibit their forward trajectory and crashed headlong into obstacles," she explained. A.D.H.D. kids may like the stimulation of a video game, but that doesn't mean they can handle it. Tannock has also given a group of A.D.H.D. children what's called a letter-naming test. The child is asked to read as quickly as he can five rows of letters, each of which consists of five letters repeated in different orders--"A, B, C, D, E," for example, followed by "D, E, B, A, C," and so on. A normal eight-year-old might take twenty-five seconds to complete the list. His counterpart with attention deficit might take thirty-five seconds, which is the kind of performance usually associated with dyslexia. "Some of our most articulate [A.D.H.D.] youngsters describe how doing this test is like speaking a foreign language in a foreign land," Tannock told me. "You get exhausted. That's how they feel. They have a thousand different ideas crowding into their heads at the same time." This doesn't sound like a child attuned to the quicksilver rhythms of the modern age. This sounds like a garden-variety learning disorder.
What further confounds the culture-of-Ritalin school is that A.D.H.D. turns out to have a considerable genetic component. As a result of numerous studies of twins conducted around the world over the past decade, scientists now estimate that A.D.H.D. is about seventy per cent heritable. This puts it up there with the most genetically influenced of traits--traits such as blood pressure, height, and weight. Meanwhile, the remaining thirty per cent--the environmental contribution to the disorder--seems to fall under what behavioral geneticists call "non-shared environment," meaning that it is likely to be attributable to such factors as fetal environment or illness and injury rather than factors that siblings share, such as parenting styles or socioeconomic class. That's why the way researchers describe A.D.H.D. has changed over the past decade. There is now less discussion of the role of bad parents, television, and diet and a lot more discussion of neurology and the role of specific genes.
This doesn't mean that there is no social role at all in the expression of A.D.H.D. Clearly, something has happened to make us all suddenly more aware of the disorder. But when, for instance, Diller writes that "the conditions that have fueled the A.D.D. epidemic and the Ritalin boom" will not change until "America somehow regains its balance between material gain and emotional and spiritual satisfaction," it's clear that he is working with a definition of A.D.H.D. very different from that of the scientific mainstream. In fact, books like "Running on Ritalin" and "Ritalin Nation" don't seem to have a coherent definition of A.D.H.D. at all. This is what is so confusing about the popular debate over this disorder: it's backward. We've become obsessed with what A.D.H.D. means. Don't we first have to figure out what it is?
3.
One of the tests researchers give to children with A.D.H.D. is called a stop-signal task. A child sits down at a computer and is told to hit one key if he sees an "X" on the screen and another key if he sees an "O." If he hears a tone, however, he is to refrain from hitting the key. By changing the timing of the tone--playing it just before or just as or just a millisecond after the "X" or "O" appears on the screen--you can get a very good idea of how quickly someone can stop a response that is already under way. "Kids with A.D.H.D. have a characteristically longer reaction time," Gordon Logan, a cognitive psychologist at the University of Illinois, told me. "They're fifty per cent slower than other kids." Unless the tone is played very early, giving them plenty of warning, they can't stop themselves from hitting the keys.
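The logic of the task is easy to see in miniature. Here is a minimal sketch in Python--a toy simulation, not Logan's actual laboratory software, and every number in it (the reaction times, the tone delay, the share of trials with a tone) is an illustrative assumption rather than a figure from his research. It simply plays out the race between the urge to hit the key and the effort to stop: the response is withheld only when the stop process finishes before the key press does, so a slower stopper fails more often at the same tone delay.

import random

# Toy simulation of the stop-signal task described above (illustrative only;
# all timing numbers are made-up assumptions, not data from the article).

def proportion_stopped(stop_time, n_trials=10000, tone_delay=0.20):
    """Fraction of tone trials on which the key press is successfully withheld.

    Each trial is a race: a 'go' process (pressing the X/O key) against a
    'stop' process that starts when the tone sounds. The press is withheld
    only if stopping finishes first."""
    stopped = 0
    tone_trials = 0
    for _ in range(n_trials):
        go_time = max(0.15, random.gauss(0.50, 0.10))  # seconds to press the key
        if random.random() < 0.25:                     # tone sounds on 25% of trials
            tone_trials += 1
            if tone_delay + stop_time < go_time:       # stop process wins the race
                stopped += 1
    return stopped / tone_trials

if __name__ == "__main__":
    print("typical stopper :", round(proportion_stopped(stop_time=0.25), 2))
    print("50% slower stop :", round(proportion_stopped(stop_time=0.375), 2))

Run as written, the slower stopper withholds far fewer responses, which is the pattern Logan describes--though the exact proportions here depend entirely on the assumed numbers.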
The results may seem a relatively trivial matter--these are differences measured in fractions of a second, after all. But for many researchers the idea that children with A.D.H.D. lack some fundamental ability to inhibit themselves, to stop a pre-programmed action, is at the heart of the disorder. Suppose, for example, that you have been given a particularly difficult math problem. Your immediate, impulsive response might be to throw down your pencil in frustration. But most of us wouldn't do that. We would check those impulses, and try to slog our way through the problem, and, with luck, maybe get it right. Part of what it takes to succeed in a complex world, in other words, is the ability to inhibit our impulses. But the child with A.D.H.D., according to the official diagnosis, "often does not follow through on instructions and fails to finish schoolwork, chores, or duties in the workplace" and "often runs about or climbs excessively in situations in which it is inappropriate." He cannot apply himself because he cannot regulate his behavior in a consistent manner. He is at the mercy of the temptations and distractions in his immediate environment. "It's not that a child or an individual is always hyperactive or always inattentive or distracted," Tannock says. "The same individual can one minute be restless and fidgeting or the next minute lethargic or yawning. The individual can be overfocussed one minute and incredibly distractible the next. It is this variability, from day to day and moment to moment, that is the most robust finding we have."
Russell Barkley, a professor of psychiatry at the University of Massachusetts at Worcester, has done experiments that look at the way A.D.H.D. kids experience time, and the results demonstrate how this basic problem with self-regulation can have far-reaching consequences. In one experiment, he turns on a light for a predetermined length of time and then asks a child to turn the light back on and off for what the child guesses to be the same interval. Children without A.D.H.D. perform fairly consistently. At twelve seconds, for example, their guesses are just a little low. At thirty-six seconds, they are slightly less accurate--still on the low side--and at sixty seconds their guesses are coming in at about fifty seconds. A.D.H.D. kids, on the other hand, are terrible at this game. At twelve seconds, they are well over; apparently, twelve seconds seems much, much longer to them. But at sixty seconds their guesses are much lower than everyone else's; apparently, the longer interval is impossible to comprehend. The consequences of having so profoundly subjective a sense of time are obvious. It's no surprise that people with A.D.H.D. often have problems with punctuality and with patience. An accurate sense of time is a function of a certain kind of memory--an ability to compare the duration of ongoing events with that of past events, so that a red light doesn't seem like an outrageous imposition, or five minutes doesn't seem so long that you imagine you could get from one side of town to the other in that amount of time. Time is about imposing order, about exercising control over one's perceptions, and that's something that people with attention deficit have trouble with.
This way of thinking about A.D.H.D. clarifies some of the more confusing aspects of the disorder. In DeGrandpre's formulation, the A.D.H.D. child can't follow through on instructions or behaves inappropriately because there isn't enough going on in his environment. What the inhibition theory implies is the opposite: that the A.D.H.D. child can't follow through or behaves inappropriately because there is too much going on; he falters in situations that require him to exercise self-control and his higher cognitive skills. DeGrandpre cannot explain why A.D.H.D. kids like video games but are also so bad at them. Shouldn't they thrive in that most stimulating of environments? If their problem is self-control, that apparent contradiction makes perfect sense. The A.D.H.D. child likes video games because they permit--even encourage--him to play impulsively. But he's not very good at them because to succeed at Pac Man or Super Mario World a child must learn to overcome the temptation posed by those games to respond impulsively to every whiz and bang: the child has to learn to stop and think (ever so quickly) before reacting.
At the same time, this theory makes it a lot clearer what kind of problem A.D.H.D. represents. The fact that children with the disorder can't finish the hard math problem doesn't mean that they're not smart enough to know the answer. It means they can't focus long enough to get to the answer. As Barkley puts it, A.D.H.D. is a problem not of knowing what you should do but, rather, of doing what you know. Motivation and memory and higher cognitive skills are intact in people with attention deficit. "But they are secondarily delayed," Barkley says. "They have no chance. They are rarely engaged and highly ineffective, because impulsive actions take precedence." The inability to stop pressing that "X" or "O" key ends up causing much more serious problems down the road.
This way of thinking about A.D.H.D. also demystifies Ritalin. Implicit in the popular skepticism about the drug has always been the idea that you cannot truly remedy something as complicated as A.D.H.D. with a pill. That's why the mother quoted in the Time story ripped up her child's Ritalin prescription, and why Diller places so much emphasis on the need for "real" social and spiritual solutions. But if A.D.H.D. is merely a discrete problem of inhibition, why couldn't Ritalin be a complete solution? People with A.D.H.D. don't need a brain overhaul. They just need a little help with stopping.
4.
There is another way to look at the A.D.H.D.-Ritalin question, which is known as the dopamine theory. This is by no means a conclusive account of A.D.H.D., but it may help clarify some of the issues surrounding the disorder. Dopamine is the chemical in the brain--the neurotransmitter--that appears to play a major role in things like attention and inhibition. When you tackle a difficult task or pay attention to a complex social situation, you are essentially generating dopamine in the parts of the brain that deal with higher cognitive tasks. If you looked at a thousand people at random, you would find a huge variation in their dopamine systems, just as you would if you looked at, say, blood pressure in a random population. A.D.H.D., according to this theory, is the name we give to people whose dopamine levels fall at the lower end of the scale, the same way we say that people suffer from hypertension if their blood pressure is above a certain point. In order to get normal levels of attention and inhibition, you have to produce normal levels of dopamine.
This is what Ritalin does. Dopamine is manufactured in the brain by specialized neurons, and each of those neurons has a "transporter," a kind of built-in vacuum cleaner that sucks up any excess dopamine floating around and stores it back inside the neuron. Ritalin blocks that transporter, so the amount of dopamine available for cognition remains higher than it would be otherwise. Ritalin appears to make about sixty-five per cent of those who take it "normal," and in an additional ten per cent it appears to bring about substantial improvement. It does have a few minor side effects--appetite loss and insomnia, in some users--but by and large it's a remarkably safe drug, with remarkably specific effects.
So what does the fact that we seem to be relying more and more on Ritalin mean? The beginning of the answer, I think, lies in the fact that Ritalin is not the only drug in existence that enhances dopamine. Cocaine affects the brain in almost exactly the same way. Nicotine, too, is a dopamine booster, although its mechanism is somewhat different. Obviously, taking Ritalin doesn't have the same consequences as snorting cocaine or smoking a cigarette. It's not addictive, and its effect is a lot more specific. Still, nicotine, cocaine, and Ritalin are all performing the same basic neurological function.
What, for instance, was the appeal of cocaine at the beginning of the coke epidemic of the eighties? It was a feel-good drug. But it was a feel-good drug of a certain kind--a drug that people thought would help them master the complexity and the competitive pressures of the world around them. In the now infamous Time story on cocaine that ran in the summer of 1981, there is a picture of a "freelance artist" in Manhattan doing lines on his lunch break, with the caption "Feeling stronger, smarter, faster, more able to cope." Cocaine, the article begins, "is becoming the all-American drug," and its popularity, in the words of one expert, is a symptom of the fact that "right from childhood in this country there is pressure for accomplishment." At the moment of its greatest popularity, cocaine was considered a thinking drug, an achievement drug, a drug for the modern world. Does that sound familiar?
Nicotine has a similar profile. Cigarettes aid concentration. Understandably, this isn't a fact that has received much publicity in recent years. But there are plenty of data showing that nicotine does exactly what you would expect a dopamine enhancer to do. In one experiment, for example, smokers were given three minutes to read randomly ordered letters, in rows of thirty, and cross out the letter "e" every time they encountered it. The smokers took the test twice, before and after smoking a cigarette, and, on average, they were able to read 161.5 more letters--or more than five extra lines--after smoking than before. It's no surprise that this test sounds a lot like the test that A.D.H.D. kids do so poorly on, because we are really talking about the same set of cognitive skills--the ability to concentrate and screen out distractions. Numerous studies have shown that children with A.D.H.D. are much more likely to smoke and take illegal drugs in later life; what the dopamine theory suggests is that many people resort to such substances as a way of medicating themselves. Nora Volkow, the chairman of medicine at Brookhaven National Laboratory, says that between ten and twenty per cent of drug addicts have A.D.H.D. "In studies, when they were given Ritalin they would stop taking cocaine," she told me. Timothy Wilens, a psychiatrist at Harvard Medical School, presented data at a recent National Institutes of Health conference on A.D.H.D. which showed that treating A.D.H.D. kids with Ritalin and the like lowered the risk of their developing drug problems in adolescence by an extraordinary sixty-eight per cent. Among people with dopamine deficits, Ritalin is becoming a safe pharmaceutical alternative to the more dangerous dopamine boosters of the past.
Here, surely, is one of the deeper implications of the rise of Ritalin--particularly among adults, whose use of the drug has increased rapidly in recent years. For decades, in this country and around the world, millions of people used smoking as a way of boosting their dopamine and sharpening focus and concentration. Over the past twenty years, we have gradually taken away that privilege, by making it impossible for people to smoke at work and by marshalling an array of medical evidence to convince people that they should not start at all. From a public-health standpoint, this has been of critical importance: countless lives have been saved. But the fact remains that millions of people have lost a powerful pharmacological agent--nicotine--that they had been using to cope with the world around them. In fact, they have lost it precisely at a moment when the rising complexity of modern life would seem to make dopamine enhancement more important than ever. Among adults, Ritalin is a drug that may fill the void left by nicotine.
Among children, Ritalin is clearly performing a similar function. We are extending to the young cognitive aids of a kind that used to be reserved exclusively for the old. It is this reliance on a drug--the idea that children should have to be medicated--that, of course, people like Diller, Walker, and DeGrandpre find so upsetting. If some children need to take a drug in order to be "normal," then, these critics argue, the problem is with our definition of "normal." Diller asks, "Is there still a place for childhood in the anxious, downsizing America of the late nineteen-nineties? What if Tom Sawyer or Huckleberry Finn were to walk into my office tomorrow? Tom's indifference to schooling and Huck's 'oppositional' behavior would surely have been cause for concern. Would I prescribe Ritalin for them, too?" But this is just the point. Huck Finn and Tom Sawyer lived in an age in which difficult children simply dropped out of school, or worked on farms, or drifted into poverty and violence. The "childhood" Diller romanticizes was a ruthlessly Darwinian place, which provided only for the most economically--and genetically--privileged. Children are now being put into situations that demand attention and intellectual consideration, and it is no longer considered appropriate simply to cast aside those who, because of some neurological quirk, have difficulty coping. Only by a strange inversion of moral responsibility do books like "Ritalin Nation" and "Running on Ritalin" seek to make those parents and physicians trying to help children with A.D.H.D. feel guilty for doing so. The rise of A.D.H.D. is a consequence of what might otherwise be considered a good thing: that the world we live in increasingly values intellectual consideration and rationality--increasingly demands that we stop and focus. Modernity didn't create A.D.H.D. It revealed it.
Drunk Drivers
March 8, 1999
THE TALK OF THE TOWN
Drunk Drivers and Other Dangers
Last week, New York City began confiscating the automobiles of people caught drinking and driving. On the first day of the crackdown, the police seized three cars, including one from a man who had been arrested for drunk driving on eight previous occasions. The tabloids cheered. Mothers Against Drunk Driving nodded in approval. After a recent series of brutal incidents involving the police tarnished the Giuliani administration, the Mayor's anti-crime crusade appeared to right itself. The city now has the toughest anti-drunk-driving policy in the country, and the public was given a welcome reminder that the vast majority of the city's thirty-eight thousand cops are neither racist nor reckless and that the justice they mete out is largely deserved. "There's a very simple way to stay out of this problem, for you, your family, and anyone else," a triumphant Giuliani said. "Do not drink and get behind the wheel of a car."
Let's leave aside, for a moment, the question of whether the new policy is constitutional. That is a matter for the courts. A more interesting issue is what the willing acceptance of such a hard-line stance on drunk driving says about the sometimes contradictory way we discuss traffic safety. Suppose, for example, that I was stopped by the police for running a red light on Madison Avenue. I would get points on my license and receive a fine. If I did the same thing while my blood-alcohol level was above the prescribed limit, however, I would be charged with drunk driving and lose my car. The behavior is the same in both cases, but the consequences are very different. We believe, as a society, that the combination of alcohol and driving deserves particular punishment. And that punishment isn't necessarily based on what you have actually done. It's often based on what you could do--or, to be more precise, on the extra potential for harm that your drinking poses.
There is nothing wrong with this approach. We have laws against threatening people with guns for the same reason. It hardly makes sense to wait for drunks or people waving guns to kill someone before we arrest them. But if merely posing a threat to others on the road is the threshold for something as drastic as civil forfeiture, then why are we stopping with drunks? Fifty per cent of all car accidents in the United States are attributed to driver inattention, for example. Some of that inattention is caused by inebriation, but there are other common and obvious distractions. Two studies made in the past three years--the first conducted at the Rochester Institute of Technology and the second published in the New England Journal of Medicine-- suggest that the use of car phones is associated with a four-to-fivefold increase in the risk of accidents, and that hands-free phones may not be any safer than conventional ones. The driver on the phone is a potential risk to others, just as the driver who has been drinking is. It is also now abundantly clear that sport-utility vehicles and pickup trucks can--by virtue of their weight, high clearance, and structural rigidity--do far more damage in an accident than conventional automobiles can. S.U.V.s and light trucks account for about a third of the vehicles on the road. But a disproportionate number of the fatalities in two-vehicle crashes are caused by collisions between those bigger vehicles and conventional automobiles, and the people riding in the cars make up a stunning eighty-one per cent of those killed.
The reason we don't like drunk drivers is that by making the decision to drink and drive, an individual deliberately increases his or her chance of killing someone else with a vehicle. But how is the moral culpability of the countless Americans who have walked into a dealership and made a decision to buy a fifty-six-hundred-pound sport utility any different? Of course, there are careful S.U.V. drivers and careful car-phone users. Careful people can get drunk, too, and overcompensate for their impairment by creeping along at twenty-five miles an hour, and in New York City we won't hesitate to take away their vehicles. Obviously, Giuliani, even in his most crusading moments, isn't about to confiscate all the car phones and S.U.V.s on the streets of New York. States should, however, stop drivers from using car phones while the car is in motion, as some countries, including England, do. And a prohibitive weight tax on sport utilities would probably be a good idea. The moneys collected could be used to pay the medical bills and compensate the family of anyone hit by some cell-phone-wielding yuppie in a four-wheeled behemoth.
True Colors
March 22, 1999
ANNALS OF ADVERTISING
Hair dye and the hidden history of postwar America.
1.
During the Depression--long before she became one of the most famous copywriters of her day--Shirley Polykoff met a man named George Halperin. He was the son of an Orthodox rabbi from Reading, Pennsylvania, and soon after they began courting he took her home for Passover to meet his family. They ate roast chicken, tzimmes, and sponge cake, and Polykoff hit it off with Rabbi Halperin, who was warm and funny. George's mother was another story. She was Old World Orthodox, with severe, tightly pulled-back hair; no one was good enough for her son.
"How'd I do, George?" Shirley asked as soon as they got in the car for the drive home. "Did your mother like me?"
He was evasive. "My sister Mildred thought you were great."
"That's nice, George," she said. "But what did your mother say?"
There was a pause. "She says you paint your hair." Another pause. "Well, do you?"
Shirley Polykoff was humiliated. In her mind she could hear her future mother-in-law: Fahrbt zi der huer? Oder fahrbt zi nisht? Does she color her hair? Or doesn't she?
The answer, of course, was that she did. Shirley Polykoff always dyed her hair, even in the days when the only women who went blond were chorus girls and hookers. At home in Brooklyn, starting when she was fifteen, she would go to Mr. Nicholas's beauty salon, one flight up, and he would "lighten the back" until all traces of her natural brown were gone. She thought she ought to be a blonde-or, to be more precise, she thought that the decision about whether she could be a blonde was rightfully hers, and not God's. Shirley dressed in deep oranges and deep reds and creamy beiges and royal hues. She wore purple suede and aqua silk, and was the kind of person who might take a couture jacket home and embroider some new detail on it. Once, in the days when she had her own advertising agency, she was on her way to Memphis to make a presentation to Maybelline and her taxi broke down in the middle of the expressway. She jumped out and flagged down a Pepsi-Cola truck, and the truck driver told her he had picked her up because he'd never seen anyone quite like her before. "Shirley would wear three outfits, all at once, and each one of them would look great," Dick Huebner, who was her creative director, says. She was flamboyant and brilliant and vain in an irresistible way, and it was her conviction that none of those qualities went with brown hair. The kind of person she spent her life turning herself into did not go with brown hair. Shirley's parents were Hyman Polykoff, small-time necktie merchant, and Rose Polykoff, housewife and mother, of East New York and Flatbush, by way of the Ukraine. Shirley ended up on Park Avenue at Eighty-second. "If you asked my mother 'Are you proud to be Jewish?' she would have said yes," her daughter, Alix Nelson Frick, says. "She wasn't trying to pass. But she believed in the dream, and the dream was that you could acquire all the accouterments of the established affluent class, which included a certain breeding and a certain kind of look. Her idea was that you should be whatever you want to be, including being a blonde."
In 1956, when Shirley Polykoff was a junior copywriter at Foote, Cone & Belding, she was given the Clairol account. The product the company was launching was Miss Clairol, the first hair-color bath that made it possible to lighten, tint, condition, and shampoo at home, in a single step--to take, say, Topaz (for a champagne blond) or Moon Gold (for a medium ash), apply it in a peroxide solution directly to the hair, and get results in twenty minutes. When the Clairol sales team demonstrated their new product at the International Beauty Show, in the old Statler Hotel, across from Madison Square Garden, thousands of assembled beauticians jammed the hall and watched, openmouthed, demonstration after demonstration. "They were astonished," recalls Bruce Gelb, who ran Clairol for years, along with his father, Lawrence, and his brother Richard. "This was to the world of hair color what computers were to the world of adding machines. The sales guys had to bring buckets of water and do the rinsing off in front of everyone, because the hairdressers in the crowd were convinced we were doing something to the models behind the scenes."
Miss Clairol gave American women the ability, for the first time, to color their hair quickly and easily at home. But there was still the stigma-the prospect of the disapproving mother-in-law. Shirley Polykoff knew immediately what she wanted to say, because if she believed that a woman had a right to be a blonde she also believed that a woman ought to be able to exercise that right with discretion. "Does she or doesn't she?" she wrote, translating from the Yiddish to the English. "Only her hairdresser knows for sure." Clairol bought thirteen ad pages in Life in the fall of 1956, and Miss Clairol took off like a bird. That was the beginning. For Nice 'n Easy, Clairol's breakthrough shampoo-in hair color, she wrote, "The closer he gets, the better you look." For Lady Clairol, the cream-and-bleach combination that brought silver and platinum shades to Middle America, she wrote, "Is it true blondes have more fun?" and then, even more memorably, "If I've only one life, let me live it as a blonde!" (In the summer of 1962, just before "The Feminine Mystique" was published, Betty Friedan was, in the words of her biographer, so "bewitched" by that phrase that she bleached her hair.) Shirley Polykoff wrote the lines; Clairol perfected the product. And from the fifties to the seventies, when Polykoff gave up the account, the number of American women coloring their hair rose from seven per cent to more than forty per cent.
Today, when women go from brown to blond to red to black and back again without blinking, we think of hair-color products the way we think of lipstick. On drugstore shelves there are bottles and bottles of hair-color products with names like Hydrience and Excellence and Preference and Natural Instincts and Loving Care and Nice 'n Easy, and so on, each in dozens of different shades. Feria, the new, youth-oriented brand from L'Oreal, comes in Chocolate Cherry and Champagne Cocktail--colors that don't ask "Does she or doesn't she?" but blithely assume "Yes, she does." Hair dye is now a billion-dollar-a-year commodity.
Yet there was a time, not so long ago--between, roughly speaking, the start of Eisenhower's Administration and the end of Carter's--when hair color meant something. Lines like "Does she or doesn't she?" or the famous 1973 slogan for L'Oreal's Preference--"Because I'm worth it"--were as instantly memorable as "Winston tastes good like a cigarette should" or "Things go better with Coke." They lingered long after advertising usually does and entered the language; they somehow managed to take on meanings well outside their stated intention. Between the fifties and the seventies, women entered the workplace, fought for social emancipation, got the Pill, and changed what they did with their hair. To examine the hair-color campaigns of the period is to see, quite unexpectedly, all these things as bound up together, the profound with the seemingly trivial. In writing the history of women in the postwar era, did we forget something important? Did we leave out hair?
2.
When the "Does she or doesn't she?" campaign first ran, in 1956, most advertisements that were aimed at women tended to be high glamour-"cherries in the snow, fire and ice," as Bruce Gelb puts it. But Shirley Polykoff insisted that the models for the Miss Clairol campaign be more like the girl next door-"Shirtwaist types instead of glamour gowns," she wrote in her original memo to Clairol. "Cashmere-sweater-over-the-shoulder types. Like larger-than-life portraits of the proverbial girl on the block who's a little prettier than your wife and lives in a house slightly nicer than yours." The model had to be a Doris Day type-not a Jayne Mansfield-because the idea was to make hair color as respectable and mainstream as possible. One of the earliest "Does she or doesn't she?" television commercials featured a housewife, in the kitchen preparing hors d'ouvres for a party. She is slender and pretty and wearing a black cocktail dress and an apron. Her husband comes in, kisses her on the lips, approvingly pats her very blond hair, then holds the kitchen door for her as she takes the tray of hors d'ouvres out for her guests. It is an exquisitely choreographed domestic tableau, down to the little dip the housewife performs as she hits the kitchen light switch with her elbow on her way out the door. In one of the early print ads-which were shot by Richard Avedon and then by Irving Penn-a woman with strawberry-blond hair is lying on the grass, holding a dandelion between her fingers, and lying next to her is a girl of about eight or nine. What's striking is that the little girl's hair is the same shade of blond as her mother's. The "Does she or doesn't she?" print ads always included a child with the mother to undercut the sexual undertones of the slogan-to make it clear that mothers were using Miss Clairol, and not just "fast" women-and, most of all, to provide a precise color match. Who could ever guess, given the comparison, that Mom's shade came out of a bottle?
The Polykoff campaigns were a sensation. Letters poured in to Clairol. "Thank you for changing my life," read one, which was circulated around the company and used as the theme for a national sales meeting. "My boyfriend, Harold, and I were keeping company for five years but he never wanted to set a date. This made me very nervous. I am twenty-eight and my mother kept saying soon it would be too late for me." Then, the letter writer said, she saw a Clairol ad in the subway. She dyed her hair blond, and "that is how I am in Bermuda now on my honeymoon with Harold." Polykoff was sent a copy with a memo: "It's almost too good to be true!" With her sentimental idyll of blond mother and child, Shirley Polykoff had created something iconic.
"My mother wanted to be that woman in the picture," Polykoff's daughter, Frick, says. "She was wedded to the notion of that suburban, tastefully dressed, well-coddled matron who was an adornment to her husband, a loving mother, a long-suffering wife, a person who never overshadowed him.She wanted the blond child. In fact, I was blond as a kid, but when I was about thirteen my hair got darker and my mother started bleaching it." Of course-and this is the contradiction central to those early Clairol campaigns-Shirley Polykoff wasn't really that kind of woman at all. She always had a career. She never moved to the suburbs. "She maintained that women were supposed to be feminine, and not too dogmatic and not overshadow their husband, but she greatly overshadowed my father, who was a very pure, unaggressive, intellectual type," Frick says. "She was very flamboyant, very emotional, very dominating."
One of the stories Polykoff told about herself repeatedly--and that even appeared after her death last year, in her Times obituary--was that she felt that a woman never ought to make more than her husband, and that only after George's death, in the early sixties, would she let Foote, Cone & Belding raise her salary to its deserved level. "That's part of the legend, but it isn't the truth," Frick says. "The ideal was always as vividly real to her as whatever actual parallel reality she might be living. She never wavered in her belief in that dream, even if you would point out to her some of the fallacies of that dream, or the weaknesses, or the internal contradictions, or the fact that she herself didn't really live her life that way." For Shirley Polykoff, the color of her hair was a kind of useful fiction, a way of bridging the contradiction between the kind of woman she was and the kind of woman she felt she ought to be. It was a way of having it all. She wanted to look and feel like Doris Day without having to be Doris Day. In twenty-seven years of marriage, during which she bore two children, she spent exactly two weeks as a housewife, every day of which was a domestic and culinary disaster. "Listen, sweetie," an exasperated George finally told her. "You make a lousy little woman in the kitchen." She went back to work the following Monday.
This notion of the useful fiction-of looking the part without being the part-had a particular resonance for the America of Shirley Polykoff's generation. As a teen-ager, Shirley Polykoff tried to get a position as a clerk at an insurance agency and failed. Then she tried again, at another firm, applying as Shirley Miller. This time, she got the job. Her husband, George, also knew the value of appearances. The week Polykoff first met him, she was dazzled by his worldly sophistication, his knowledge of out-of-the-way places in Europe, his exquisite taste in fine food and wine. The second week, she learned that his expertise was all show, derived from reading the Times. The truth was that George had started his career loading boxes in the basement of Macy's by day and studying law at night. He was a faker, just as, in a certain sense, she was, because to be Jewish-or Irish or Italian or African-American or, for that matter, a woman of the fifties caught up in the first faint stirrings of feminism--was to be compelled to fake it in a thousand small ways, to pass as one thing when, deep inside, you were something else. "That's the kind of pressure that comes from the immigrants' arriving and thinking that they don't look right, that they are kind of funny-looking and maybe shorter than everyone else, and their clothes aren't expensive," Frick says. "That's why many of them began to sew, so they could imitate the patterns of the day. You were making yourself over. You were turning yourself into an American." Frick, who is also in advertising (she's the chairman of Spier NY), is a forcefully intelligent woman, who speaks of her mother with honesty and affection. "There were all those phrases that came to fruition at that time-you know, 'clothes make the man' and 'first impressions count.'" So the question "Does she or doesn't she?" wasn't just about how no one could ever really know what you were doing. It was about how no one could ever really know who you were. It really meant not "Does she?" but "Is she?" It really meant "Is she a contented homemaker or a feminist, a Jew or a Gentile--or isn't she?"
3. I am Ilon Specht, hear me roar
In 1973, Ilon Specht was working as a copywriter at the McCann-Erickson advertising agency, in New York. She was a twenty-three-year-old college dropout from California. She was rebellious, unconventional, and independent, and she had come East to work on Madison Avenue, because that's where people like that went to work back then. "It was a different business in those days," Susan Schermer, a long-time friend of Specht's, says. "It was the seventies. People were wearing feathers to work." At her previous agency, while she was still in her teens, Specht had written a famous television commercial for the Peace Corps. (Single shot. No cuts. A young couple lying on the beach. "It's a big, wide wonderful world" is playing on a radio. Voice-over recites a series of horrible facts about less fortunate parts of the world: in the Middle East half the children die before their sixth birthday, and so forth. A news broadcast is announced as the song ends, and the woman on the beach changes the station.)
"Ilon? Omigod! She was one of the craziest people I ever worked with," Ira Madris, another colleague from those years, recalls, using the word "crazy" as the highest of compliments. "And brilliant. And dogmatic. And highly creative. We all believed back then that having a certain degree of neurosis made you interesting. Ilon had a degree of neurosis that made her very interesting."
At McCann, Ilon Specht was working with L'Oreal, a French company that was trying to challenge Clairol's dominance in the American hair-color market. L'Oreal had originally wanted to do a series of comparison spots, presenting research proving that their new product--Preference--was technologically superior to Nice 'n Easy, because it delivered a more natural, translucent color. But at the last minute the campaign was killed because the research hadn't been done in the United States. At McCann, there was panic. "We were four weeks before air date and we had nothing--nada," Michael Sennott, a staffer who was also working on the account, says. The creative team locked itself away: Specht, Madris--who was the art director on the account--and a handful of others. "We were sitting in this big office," Specht recalls. "And everyone was discussing what the ad should be. They wanted to do something with a woman sitting by a window, and the wind blowing through the curtains. You know, one of those fake places with big, glamorous curtains. The woman was a complete object. I don't think she even spoke. They just didn't get it. We were in there for hours."
Ilon Specht is now the executive creative director of Jordan, McGrath, Case & Partners, in the Flatiron district, with a big office overlooking Fifth Avenue. She has long, thick black hair, held in a loose knot at the top of her head, and lipstick the color of maraschino cherries. She talks fast and loud, and swivels in her chair as she speaks, and when people walk by her office they sometimes bang on her door, as if the best way to get her attention is to be as loud and emphatic as she is. Reminiscing not long ago about the seventies, she spoke about the strangeness of corporate clients in shiny suits who would say that all the women in the office looked like models. She spoke about what it meant to be young in a business dominated by older men, and about what it felt like to write a line of copy that used the word "woman" and have someone cross it out and write "girl."
"I was a twenty-three-year-old girl-a woman," she said. "What would my state of mind have been? I could just see that they had this traditional view of women, and my feeling was that I'm not writing an ad about looking good for men, which is what it seems to me that they were doing. I just thought, Fuck you. I sat down and did it, in five minutes. It was very personal. I can recite to you the whole commercial, because I was so angry when I wrote it."
Specht sat stock-still and lowered her voice: "I use the most expensive hair color in the world. Preference, by L'Oreal. It's not that I care about money. It's that I care about my hair. It's not just the color. I expect great color. What's worth more to me is the way my hair feels. Smooth and silky but with body. It feels good against my neck. Actually, I don't mind spending more for L'Oreal. Because I'm"--and here Specht took her fist and struck her chest--"worth it."
The power of the commercial was originally thought to lie in its subtle justification of the fact that Preference cost ten cents more than Nice 'n Easy. But it quickly became obvious that the last line was the one that counted. On the strength of "Because I'm worth it," Preference began stealing market share from Clairol. In the nineteen-eighties, Preference surpassed Nice 'n Easy as the leading hair-color brand in the country, and two years ago L'Oreal took the phrase and made it the slogan for the whole company. An astonishing seventy-one per cent of American women can now identify that phrase as the L'Oreal signature, which, for a slogan--as opposed to a brand name--is almost without precedent.
4.
From the very beginning, the Preference campaign was unusual. Polykoff's Clairol spots had male voice-overs. In the L'Oreal ads, the model herself spoke, directly and personally. Polykoff's commercials were "other-directed"--they were about what the group was saying ("Does she or doesn't she?") or what a husband might think ("The closer he gets, the better you look"). Specht's line was what a woman says to herself. Even in the choice of models, the two campaigns diverged. Polykoff wanted fresh, girl-next-door types. McCann and L'Oreal wanted models who somehow embodied the complicated mixture of strength and vulnerability implied by "Because I'm worth it." In the late seventies, Meredith Baxter Birney was the brand spokeswoman. At that time, she was playing a recently divorced mom going to law school on the TV drama "Family." McCann scheduled her spots during "Dallas" and other shows featuring so-called "silk blouse" women--women of strength and independence. Then came Cybill Shepherd, at the height of her run as the brash, independent Maddie on "Moonlighting," in the eighties. Now the brand is represented by Heather Locklear, the tough and sexy star of "Melrose Place." All the L'Oreal spokeswomen are blondes, but blondes of a particular type. In his brilliant 1995 book, "Big Hair: A Journey into the Transformation of Self," the Canadian anthropologist Grant McCracken argued for something he calls the "blondness periodic table," in which blondes are divided into six categories: the "bombshell blonde" (Mae West, Marilyn Monroe), the "sunny blonde" (Doris Day, Goldie Hawn), the "brassy blonde" (Candice Bergen), the "dangerous blonde" (Sharon Stone), the "society blonde" (C.Z. Guest), and the "cool blonde" (Marlene Dietrich, Grace Kelly). L'Oreal's innovation was to carve out a niche for itself in between the sunny blondes--the "simple, mild, and innocent" blondes--and the smart, bold, brassy blondes, who, in McCracken's words, "do not mediate their feelings or modulate their voices."
This is not an easy sensibility to capture. Countless actresses have auditioned for L'Oreal over the years and been turned down. "There was one casting we did with Brigitte Bardot," Ira Madris recalls (this was for another L'Oreal product), "and Brigitte, being who she is, had the damnedest time saying that line. There was something inside of her that didn't believe it. It didn't have any conviction." Of course it didn't: Bardot is bombshell, not sassy. Clairol made a run at the Preference sensibility for itself, hiring Linda Evans in the eighties as the pitchwoman for Ultress, the brand aimed at Preference's upscale positioning. This didn't work, either. Evans, who played the adoring wife of Blake Carrington on "Dynasty," was too sunny. ("The hardest thing she did on that show," Michael Sennott says, perhaps a bit unfairly, "was rearrange the flowers.")
Even if you got the blonde right, though, there was still the matter of the slogan. For a Miss Clairol campaign in the seventies, Polykoff wrote a series of spots with the tag line "This I do for me." But "This I do for me" was at best a halfhearted approximation of "Because I'm worth it"--particularly for a brand that had spent its first twenty years saying something entirely different. "My mother thought there was something too brazen about 'I'm worth it,'" Frick told me. "She was always concerned with what people around her might think. She could never have come out with that bald-faced an equation between hair color and self-esteem."
The truth is that Polykoff's sensibility--which found freedom in assimilation--had been overtaken by events. In one of Polykoff's "Is it true blondes have more fun?" commercials for Lady Clairol in the sixties, for example, there is a moment that by 1973 must have been painful to watch. A young woman, radiantly blond, is by a lake, being swung around in the air by a darkly handsome young man. His arms are around her waist. Her arms are around his neck, her shoes off, her face aglow. The voice-over is male, deep and sonorous. "Chances are," the voice says, "she'd have gotten the young man anyhow, but you'll never convince her of that." Here was the downside to Shirley Polykoff's world. You could get what you wanted by faking it, but then you would never know whether it was you or the bit of fakery that made the difference. You ran the risk of losing sight of who you really were. Shirley Polykoff knew that the all-American life was worth it, and that "he"--the handsome man by the lake, or the reluctant boyfriend who finally whisks you off to Bermuda--was worth it. But, by the end of the sixties, women wanted to know that they were worth it, too.
5. What Herta Herzog knew
Why are Shirley Polykoff and Ilon Specht important? That seems like a question that can easily be answered in the details of their campaigns. They were brilliant copywriters, who managed in the space of a phrase to capture the particular feminist sensibilities of the day. They are an example of a strange moment in American social history when hair dye somehow got tangled up in the politics of assimilation and feminism and self-esteem. But in a certain way their stories are about much more: they are about the relationship we have to the products we buy, and about the slow realization among advertisers that unless they understood the psychological particulars of that relationship--unless they could dignify the transactions of everyday life by granting them meaning--they could not hope to reach the modern consumer. Shirley Polykoff and Ilon Specht perfected a certain genre of advertising which did just this, and one way to understand the Madison Avenue revolution of the postwar era is as a collective attempt to define and extend that genre. The revolution was led by a handful of social scientists, chief among whom was an elegant, Viennese-trained psychologist by the name of Herta Herzog. What did Herta Herzog know? She knew--or, at least, she thought she knew--the theory behind the success of slogans like "Does she or doesn't she?" and "Because I'm worth it," and that makes Herta Herzog, in the end, every bit as important as Shirley Polykoff and Ilon Specht.
Herzog worked at a small advertising agency called Jack Tinker & Partners, and people who were in the business in those days speak of Tinker the way baseball fans talk about the 1927 Yankees. Tinker was the brainchild of the legendary adman Marion Harper, who came to believe that the agency he was running, McCann-Erickson, was too big and unwieldy to be able to consider things properly. His solution was to pluck a handful of the very best and brightest from McCann and set them up, first in the Waldorf Towers (in the suite directly below the Duke and Duchess of Windsor's and directly above General Douglas MacArthur's) and then, more permanently, in the Dorset Hotel, on West Fifty-fourth Street, overlooking the Museum of Modern Art. The Tinker Group rented the penthouse, complete with a huge terrace, Venetian-tiled floors, a double-height living room, an antique French polished-pewter bar, a marble fireplace, spectacular skyline views, and a rotating exhibit of modern art (hung by the partners for motivational purposes), with everything--walls, carpets, ceilings, furnishings--a bright, dazzling white. It was supposed to be a think tank, but Tinker was so successful so fast that clients were soon lined up outside the door. When Buick wanted a name for its new luxury coupé, the Tinker Group came up with Riviera. When Bulova wanted a name for its new quartz watch, Tinker suggested Accutron. Tinker also worked with Coca-Cola and Exxon and Westinghouse and countless others, whose names--according to the strict standards of secrecy observed by the group--they would not divulge. Tinker started with four partners and a single phone. But by the end of the sixties it had taken over eight floors of the Dorset.
What distinguished Tinker was its particular reliance on the methodology known as motivational research, which was brought to Madison Avenue in the nineteen-forties by a cadre of European intellectuals trained at the University of Vienna. Advertising research up until that point had been concerned with counting heads--with recording who was buying what. But the motivational researchers were concerned with why: Why do people buy what they do? What motivates them when they shop? The researchers devised surveys, with hundreds of questions, based on Freudian dynamic psychology. They used hypnosis, the Rosenzweig Picture-Frustration Study, role-playing, and Rorschach blots, and they invented what we now call the focus group. There was Paul Lazarsfeld, one of the giants of twentieth-century sociology, who devised something called the Lazarsfeld-Stanton Program Analyzer, a little device with buttons to record precisely the emotional responses of research subjects. There was Hans Zeisel, who had been a patient of Alfred Adler's in Vienna, and went to work at McCann-Erickson. There was Ernest Dichter, who had studied under Lazarsfeld at the Psychological Institute in Vienna, and who did consulting for hundreds of the major corporations of the day. And there was Tinker's Herta Herzog, perhaps the most accomplished motivational researcher of all, who trained dozens of interviewers in the Viennese method and sent them out to analyze the psyche of the American consumer.
"For Puerto Rican rum once, Herta wanted to do a study of why people drink, to tap into that below-the-surface kind of thing," Rena Bartos, a former advertising executive who worked with Herta in the early days, recalls. "We would would invite someone out to drink and they would order whatever they normally order, and we would administer a psychological test. Then we'd do it again at the very end of the discussion, after the drinks. The point was to see how people's personality was altered under the influence of alcohol." Herzog helped choose the name of Oasis cigarettes, because her psychological research suggested that the name-with its connotations of cool, bubbling springs-would have the greatest appeal to the orally-fixated smoker.
"Herta was graceful and gentle and articulate," Herbert Krugman, who worked closely with Herzog in those years, says. "She had enormous insights. Alka-Seltzer was a client of ours, and they were discussing new approaches for the next commercial. She said, 'You show a hand dropping an Alka-Seltzer tablet into a glass of water. Why not show the hand dropping two? You'll double sales.' And that's just what happened. Herta was the gray eminence. Everybody worshipped her."
Herta Herzog is now eighty-nine. After retiring from Tinker, she moved back to Europe, first to Germany and then to Austria, her homeland. She wrote an analysis of the TV show "Dallas" for the academic journal Society. She taught college courses on communications theory. She conducted a study on the Holocaust for the Vidal Sassoon Center for the Study of Anti-Semitism, in Jerusalem. Today, she lives in the mountain village of Leutasch, half an hour's hard drive up into the Alps from Innsbruck, in a white picture-book cottage with a sharply pitched roof. She is a small woman, slender and composed, her once dark hair now streaked with gray. She speaks in short, clipped, precise sentences, in flawless, though heavily accented, English. If you put her in a room with Shirley Polykoff and Ilon Specht, the two of them would talk and talk and wave their long, bejeweled fingers in the air, and she would sit unobtrusively in the corner and listen. "Marion Harper hired me to do qualitative research--the qualitative interview, which was the specialty that had been developed in Vienna at the Österreichische Wirtschaftspsychologische Forschungsstelle," Herzog told me. "It was interviewing not with direct questions and answers but where you open some subject of the discussion relevant to the topic and then let it go. You have the interviewer not talk but simply help the person with little questions like 'And anything else?' As an interviewer, you are not supposed to influence me. You are merely trying to help me. It was a lot like the psychoanalytic method." Herzog was sitting, ramrod straight, in a chair in her living room. She was wearing a pair of black slacks and a heavy brown sweater to protect her against the Alpine chill. Behind her was row upon row of bookshelves, filled with the books of a postwar literary and intellectual life: Mailer in German, Riesman in English. Open and face down on a long couch perpendicular to her chair was the latest issue of the psychoanalytic journal Psyche. "Later on, I added all kinds of psychological things to the process, such as word-association tests, or figure drawings with a story. Suppose you are my respondent and the subject is soap. I've already talked to you about soap. What you see in it. Why you buy it. What you like about it. Dislike about it. Then at the end of the interview I say, 'Please draw me a figure--anything you want--and after the figure is drawn tell me a story about the figure.'"
When Herzog asked her subjects to draw a figure at the end of an interview, she was trying to extract some kind of narrative from them, something that would shed light on their unstated desires. She was conducting, as she says, a psychoanalytic session. But she wouldn't ask about hair-color products in order to find out about you, the way a psychoanalyst might; she would ask about you in order to learn about hair-color products. She saw that the psychoanalytic interview could go both ways. You could use the techniques of healing to figure out the secrets of selling. "Does she or doesn't she?" and "Because I'm worth it" did the same thing: they not only carried a powerful and redemptive message, but-and this was their real triumph-they succeeded in attaching that message to a five-dollar bottle of hair dye. The lasting contribution of motivational research to Madison Avenue was to prove that you could do this for just about anything-that the products and the commercial messages with which we surround ourselves are as much a part of the psychological furniture of our lives as the relationships and emotions and experiences that are normally the subject of psychoanalytic inquiry.
"There is one thing we did at Tinker that I remember well,"Herzog told me, returning to the theme of one of her, and Tinker's, coups. "I found out that people were using Alka-Seltzer for stomach upset, but also for headaches," Herzog said. "We learned that the stomach ache was the kind of ache where many people tended to say 'It was my fault.' Alka-Seltzer had been mostly advertised in those days as a cure for overeating, and overeating is something you have done. But the headache is quite different. It is something imposed on you." This was, to Herzog, the classic psychological insight. It revealed Alka-Seltzer users to be divided into two apparently incompatible camps-the culprit and the victim-and it suggested that the company had been wooing one at the expense of the other. More important, it suggested that advertisers, with the right choice of words, could resolve that psychological dilemma with one or, better yet, two little white tablets. Herzog allowed herself a small smile. "So I said the nice thing would be if you could find something that combines these two elements. The copywriter came up with 'the blahs.'" Herzog repeated the phrase, "the blahs," because it was so beautiful. "The blahs was not one thing or the other-it was not the stomach or the head. It was both."
6.
This notion of household products as psychological furniture is, when you think about it, a radical idea. When we give an account of how we got to where we are, we're inclined to credit the philosophical over the physical, and the products of art over the products of commerce. In the list of sixties social heroes, there are musicians and poets and civil-rights activists and sports figures. Herzog's implication is that such a high-minded list is incomplete. What, say, of Vidal Sassoon? In the same period, he gave the world the Shape, the Acute Angle, and the One-Eyed Ungaro. In the old "cosmology of cosmetology," McCracken writes, "the client counted only as a plinth...the conveyor of the cut." But Sassoon made individualization the hallmark of the haircut, liberating women's hair from the hair styles of the times-from, as McCracken puts it, those "preposterous bits of rococo shrubbery that took their substance from permanents, their form from rollers, and their rigidity from hair spray." In the Herzogian world view, the reasons we might give to dismiss Sassoon's revolution-that all he was dispensing was a haircut, that it took just half an hour, that it affects only the way you look, that you will need another like it in a month-are the very reasons that Sassoon is important. If a revolution is not accessible, tangible, and replicable, how on earth can it be a revolution?
"Because I'm worth it" and "Does she or doesn't she?" were powerful, then, precisely because they were commercials, for commercials come with products attached, and products offer something that songs and poems and political movements and radical ideologies do not, which is an immediate and affordable means of transformation. "We discovered in the first few years of the 'Because I'm worth it' campaign that we were getting more than our fair share of new users to the category-women who were just beginning to color their hair," Sennott told me. "And within that group we were getting those undergoing life changes, which usually meant divorce. We had far more women who were getting divorced than Clairol had. Their children had grown, and something had happened, and they were reinventing themselves." They felt different, and Ilon Specht gave them the means to look different-and do we really know which came first, or even how to separate the two? They changed their lives and their hair. But it wasn't one thing or the other. It was both.
7.
Since the mid-nineties, the spokesperson for Clairol's Nice 'n Easy has been Julia Louis-Dreyfus, better known as Elaine, from "Seinfeld." In the Clairol tradition, she is the girl next door-a postmodern Doris Day. But the spots themselves could not be less like the original Polykoff campaigns for Miss Clairol. In the best of them, Louis-Dreyfus says to the dark-haired woman in front of her on a city bus, "You know, you'd look great as a blonde." Louis-Dreyfus then shampoos in Nice 'n Easy Shade 104 right then and there, to the gasps and cheers of the other passengers. It is Shirley Polykoff turned upside down: funny, not serious; public, not covert.
L'Oreal, too, has changed. Meredith Baxter Birney said "Because I'm worth it" with an earnestness appropriate to the line. By the time Cybill Shepherd became the brand spokeswoman, in the eighties, it was almost flip-a nod to the materialism of the times-and today, with Heather Locklear, the spots have a lush, indulgent feel. "New Preference by L'Oreal," she says in one of the current commercials. "Pass it on. You're worth it." The "because"-which gave Ilon Specht's original punch line such emphasis-is gone. The forceful "I'm" has been replaced by "you're." The Clairol and L'Oreal campaigns have converged. According to the Spectra marketing firm, there are almost exactly as many Preference users as Nice 'n Easy users who earn between fifty thousand and seventy-five thousand dollars a year, listen to religious radio, rent their apartments, watch the Weather Channel, bought more than six books last year, are fans of professional football, and belong to a union.
But it is a tribute to Ilon Specht and Shirley Polykoff's legacy that there is still a real difference between the two brands. It's not that there are Clairol women or L'Oreal women. It's something a little subtler. As Herzog knew, all of us, when it comes to constructing our sense of self, borrow bits and pieces, ideas and phrases, rituals and products from the world around us-over-the-counter ethnicities that shape, in some small but meaningful way, our identities. Our religion matters, the music we listen to matters, the clothes we wear matter, the food we eat matters-and our brand of hair dye matters, too. Carol Hamilton, L'Oreal's vice-president of marketing, says she can walk into a hair-color focus group and instantly distinguish the Clairol users from the L'Oreal users. "The L'Oreal user always exhibits a greater air of confidence, and she usually looks better-not just her hair color, but she always has spent a little more time putting on her makeup, styling her hair," Hamilton told me. "Her clothing is a little bit more fashion-forward. Absolutely, I can tell the difference." Jeanne Matson, Hamilton's counterpart at Clairol, says she can do the same thing. "Oh, yes," Matson told me. "There's no doubt. The Clairol woman would represent more the American-beauty icon, more naturalness. But it's more of a beauty for me, as opposed to a beauty for the external world. L'Oreal users tend to be a bit more aloof. There is a certain warmth you see in the Clairol people. They interact with each other more. They'll say, 'I use Shade 101.' And someone else will say, 'Ah, I do, too!' There is this big exchange."
These are not exactly the brand personalities laid down by Polykoff and Specht, because this is 1999, and not 1956 or 1973. The complexities of Polykoff's artifice have been muted. Specht's anger has turned to glamour. We have been left with just a few bars of the original melody. But even that is enough to insure that "Because I'm worth it" will never be confused with "Does she or doesn't she?" Specht says, "It meant I know you don't think I'm worth it, because that's what it was with the guys in the room. They were going to take a woman and make her the object. I was defensive and defiant. I thought, I'll fight you. Don't you tell me what I am. You've been telling me what I am for generations." As she said "fight," she extended the middle finger of her right hand. Shirley Polykoff would never have given anyone the finger. She was too busy exulting in the possibilities for self-invention in her America-a land where a single woman could dye her hair and end up lying on a beach with a ring on her finger. At her retirement party, in 1973, Polykoff reminded the assembled executives of Clairol and of Foote, Cone & Belding about the avalanche of mail that arrived after their early campaigns: "Remember that letter from the girl who got to a Bermuda honeymoon by becoming a blonde?"
Everybody did.
"Well," she said, with what we can only imagine was a certain sweet vindication, "I wrote it."
Dept. of Finales
May 24, 1999
talk of the town
"Melrose Place," 1992-1999, R.I.P.
During the 1995-96 season of "Melrose Place"--unquestionably the finest in the seven-year run of the prime-time soap, which comes to an end this week--the winsome redhead known as Dr. Kimberly Shaw experienced a sudden breakthrough in her therapy with Dr. Peter Burns, whom, according to the convolutions of the Melrose narrative, she happened to be living with at the moment. Burns was also acting as her lawyer and guardian, in addition to being her lover and therapist, although those last two descriptions are not quite accurate, since Kimberly and Dr. Burns weren't sleeping together at the time of her therapy breakthrough and, what's more, Burns wasn't really a therapist. From all appearances, he was actually a surgeon, or--since he also treated the show's central figure (and his future lover), Amanda, when she had her cancer scare--an oncologist, or, at the very least, a hunky guy with a stethoscope and a pager, which, in the Melrose universe, is all you really need to be to pass your medical boards.
In any case, in the first or second session between Kimberly and Dr. Peter Burns--her lawyer, suitor, guardian, non-therapist therapist, landlord, and room-mate--Kimberly realized that the reason she had been exhibiting strong homicidal tendencies was that she had been suppressing the childhood memory of having killed a very evil man who bore a distinct resemblance to a ferret. In a daring plot twist, Michael and Sydney--Kimberly's ex-lover and her romantic rival, respectively--got hold of a sketch she had made of her ferret-faced tormentor and hired an actor to impersonate him in an effort to make Kimberly think that she was still as crazy as ever. And that's exactly what happened, until Kimberly, toward the end of one episode, confronted the actor playing the ferret man and peeled off his prosthetic makeup, vanquishing her personal demon once and for all.
If you talk to aficionados of "Melrose Place," they will tell you that the ferret-man moment, more than any other, captured what was truly important about the series: here was an actor playing a doctor, in therapy with another actor playing a doctor who was himself impersonating a therapist, confronting an actor playing an actor playing her own personal demon, and when she unmasked him she found . . . that he was just another actor! Or something like that. The wonderful thing about "Melrose Place" was that just when you thought that the show was about to make some self-consciously postmodern commentary on, say, the relationship between art and life, it had the courage to take the easy way out and go for the laugh.
"Melrose Place" was often, mistakenly, lumped with its sister show on Fox, "Beverly Hills, 90210," which, like "Melrose," was an Aaron Spelling Production. At one point, Fox even ran the two shows back to back on Wednesday nights. But they were worlds apart. "90210" was the most conventional kind of television. It played to the universal desire of adolescents to be grownups, and it presented the world inside West Beverly High as one driven by the same social and ethical and political issues as the real world. "90210" was all about teens behaving like adults. "Melrose" was the opposite. It started with a group of adults--doctors, advertising executives, fashion designers--and dared to have them behave as foolishly and as naively as adolescents. Most of them lived in the same apartment building, where they fought and drank and wore really tight outfits and slept together in every conceivable permutation. They were all dumb, and the higher they rose in the outside world the dumber they got when they came home to Melrose Place.
In the mid-nineteen-nineties, when a generation of Americans reached adulthood and suddenly realized that they didn't want to be there, the inverted world of Melrose was a wonderfully soothing place. Here, after all, was a show that ostensibly depicted sophisticated grownup society, and every viewer was smarter than the people on the screen. Could anyone believe, for example, that when Kimberly came back from her breakthrough session with Peter Burns and went home to make dinner for Peter Burns, and Peter Burns sidled up to kiss her as she was slicing carrots, bra-less, he never stopped to think that here was his client and patient and tenant and analysand--a woman who had just tried to kill all kinds of people--and she was in his kitchen holding a knife? Peter! You moron! Watch the knife!
Dept. of Straight Thinking
July 12, 1999
talk of the town
Is the Belgian Coca-Cola hysteria the real thing?
The wave of illness among Belgian children last month had the look and feel--in the beginning, at least--of an utterly typical food poisoning outbreak. First, forty-two children in the Belgian town of Bornem became mysteriously ill after drinking Coca-Cola and had to be hospitalized. Two days later, eight more schoolchildren fell sick in Bruges, followed by thirteen in Harelbeke the next day and forty-two in Lochristi three days after that--and on and on in a widening spiral that, in the end, sent more than one hundred children to the hospital complaining of nausea, dizziness, and headaches, and forced Coca-Cola into the biggest product recall in its hundred-and-thirteen-year history. Upon investigation, an apparent culprit was found. In the Coca-Cola plant in Antwerp, contaminated carbon dioxide had been used to carbonate a batch of the soda's famous syrup. With analysts predicting that the scare would make a dent in Coca-Cola's quarterly earnings, the soft-drink giant apologized to the Belgian people, and the world received a sobering reminder of the fragility of food safety.
The case isn't as simple as it seems, though. A scientific study ordered by Coca-Cola found that the contaminants in the carbon dioxide were sulfur compounds left over from the production process. In the tainted bottles of Coke, these residues were present at between five and seventeen parts per billion. These sulfides can cause illness, however, only at levels about a thousand times greater than that. At seventeen parts per billion, they simply leave a bad smell--like rotten eggs--which means that Belgium should have experienced nothing more than a minor epidemic of nose-wrinkling. More puzzling is the fact that, in four of the five schools where the bad Coke allegedly struck, half of the kids who got sick hadn't drunk any Coke that day. Whatever went on in Belgium, in other words, probably wasn't Coca-Cola poisoning. So what was it? Maybe nothing at all.
"You know, when this business started I bet two of my friends a bottle of champagne each that I knew the cause," Simon Wessely, a psychiatrist who teaches at the King's College School of Medicine in London, said.
"It's quite simple. It's just mass hysteria. These things usually are."
Wessely has been collecting reports of this kind of hysteria for about ten years and now has hundreds of examples, dating back as far as 1787, when millworkers in Lancashire suddenly took ill after they became persuaded that they were being poisoned by tainted cotton. According to Wessely, almost all cases fit a pattern. Someone sees a neighbor fall ill and becomes convinced that he is being contaminated by some unseen evil--in the past it was demons and spirits; nowadays it tends to be toxins and gases--and his fear makes him anxious. His anxiety makes him dizzy and nauseous. He begins to hyperventilate. He collapses. Other people hear the same allegation, see the "victim" faint, and they begin to get anxious themselves. They feel nauseous. They hyperventilate. They collapse, and before you know it everyone in the room is hyperventilating and collapsing. These symptoms, Wessely stresses, are perfectly genuine. It's just that they are manifestations of a threat that is wholly imagined. "This kind of thing is extremely common," he says, "and it's almost normal. It doesn't mean that you are mentally ill or crazy."
Mass hysteria comes in several forms. Mass motor hysteria, for example, involves specific physical movements: shaking, tremors, and convulsions. According to the sociologist Robert Bartholomew, motor hysteria often occurs in environments of strict emotional repression; it was common in medieval nunneries and in nineteenth-century European schools, and it is seen today in some Islamic cultures. What happened in Belgium, he says, is a fairly typical example of a more standard form of contagious anxiety, possibly heightened by the recent Belgian scare over dioxin-contaminated animal feed. The students' alarm over the rotten-egg odor of their Cokes, for example, is straight out of the hysteria textbooks. "The vast majority of these events are triggered by some abnormal but benign smell," Wessely said. "Something strange, like a weird odor coming from the air conditioning."
The fact that the outbreaks occurred in schools is also typical of hysteria cases. "The classic ones always involve schoolchildren," Wessely continued. "There is a famous British case involving hundreds of schoolgirls who collapsed during a 1980 Nottinghamshire jazz festival. They blamed it on a local farmer spraying pesticides." Bartholomew has just published a paper on a hundred and fifteen documented hysteria cases in schools over the past three hundred years. As anyone who has ever been to a rock concert knows, large numbers of adolescents in confined spaces seem to be particularly susceptible to mass hysteria. Those intent on pointing the finger at Coca-Cola in this sorry business ought to remember that. "We let the people of Belgium down," Douglas Ivester, the company's chairman, said in the midst of the crisis. Or perhaps it was the other way around.
The Science of the Sleeper
October 4, 1999
ANNALS OF MARKETING
How the Information Age
could blow away the blockbuster.
1.
In 1992, a sometime actress named Rebecca Wells published a novel called "Little Altars Everywhere" with a small, now defunct press in Seattle. Wells was an unknown author, and the press had no money for publicity. She had a friend, however, who spent that Thanksgiving with a friend who was a producer of National Public Radio's "All Things Considered." The producer read the book and passed it on to Linda Wertheimer, a host of the show, and she liked it so much that she put Wells on her program. That interview, in turn, was heard by a man who was listening to the radio in Blytheville, Arkansas, and whose wife, Mary Gay Shipley, ran the town bookstore. He bought the book and gave it to her; she loved it, and, with that, the strange and improbable rise of Rebecca Wells, best-selling author, began. Blytheville is a sleepy little town about an hour or so up the Mississippi from Memphis, and Mary Gay Shipley's bookstore--That Bookstore in Blytheville--sits between the Red Ball Barber Shop and Westbrook's shoe store on a meandering stretch of Main Street. The store is just one long room in a slightly shabby storefront, with creaky floors and big overhead fans and subject headings on the shelves marked with Post-it notes. Shipley's fiction section takes up about as much shelf space as a typical Barnes & Noble devotes to, say, homeopathic medicine. That's because Shipley thinks that a book buyer ought to be able to browse and read the jacket flap of everything that might catch her eye, without being overwhelmed by thousands of choices. Mostly, though, people come to Mary Gay Shipley's store in order to find out what Mary Gay thinks they ought to be reading, and in 1993 Mary Gay Shipley thought people ought to be reading "Little Altars Everywhere." She began ordering it by the dozen, which, Shipley says, "for us, is huge." She put it in the little rack out front where she lists her current favorites. She wrote about it in the newsletter she sends to her regular customers. "We could tell it was going to have a lot of word of mouth," she says. "It was the kind of book where you could say, 'You'll love it. Take it home.' " The No. 1 author at That Bookstore in Blytheville in 1993 was John Grisham, as was the case in nearly every bookstore in the country. But No. 2 was Rebecca Wells.
"Little Altars Everywhere" was not a best-seller. But there were pockets of devotees around the country--in Blytheville; at the Garden District Book Shop, in New Orleans; at Parkplace books, in Kirkland, Washington--and those pockets created a buzz that eventually reached Diane Reverand, an editor in New York. Reverand published Wells's next book, "Divine Secrets of the Ya-Ya Sisterhood," and when it hit the bookshelves the readers and booksellers of Blytheville, the Garden District, and Kirkland were ready. "When 'The Ya-Ya Sisterhood' came out, I met with an in-store sales rep from HarperCollins," Shipley said. She is a tall woman with graying hair and a quiet, dignified bearing. "I'm not real sure he knew what a hot book this was. When he came in the store, I just turned the page of the catalogue and said, 'I want one hundred copies,' and his jaw fell to the table, because I usually order four or two or one. And I said, 'I want her to come here! And if you go anywhere, tell people this woman sells in Blytheville!'"
Wells made the trip to Arkansas and read in the back of Shipley's store; the house was packed, and the women in the front row wore placards saying "Ya-Ya." She toured the country, and the crowds grew steadily bigger. "Before the numbers really showed it, I'd be signing books and there would be groups of women who would come together, six or seven, and they would have me sign anywhere between three and ten books," Wells recalls. "And then, after that, I started noticing mothers and daughters coming. Then I noticed that the crowds started to be three-generational--there would be teen-agers and sixth graders." "Ya-Ya" sold fifteen thousand copies in hardcover. The paperback sold thirty thousand copies in its first two months. Diane Reverand took out a single-column ad next to the contents page of The New Yorker--the first dollar she'd spent on advertising for the paperback--and sales doubled to sixty thousand in a month. It sold and sold, and by February of 1998, almost two years after the book was published, it reached the best-seller lists. There are now nearly three million copies in print. Rebecca Wells, needless to say, has a warm spot in her heart for people like Mary Gay Shipley. "Mary Gay is a legend," she says. "She just kept putting my books in people's hands."
2.
In the book business, as in the movie business, there are two kinds of hits: sleepers and blockbusters. John Grisham and Tom Clancy and Danielle Steel write blockbusters. Their books are announced with huge publicity campaigns. Within days of publication, they leap onto the best-seller lists. Sales start high--hundreds of thousands of copies in the first few weeks--and then taper off. People who buy or watch blockbusters have a clear sense of what they are going to get: a Danielle Steel novel is always--well, a Danielle Steel novel. Sleepers, on the other hand, are often unknown quantities. Sales start slowly and gradually build; publicity, at least early on, is often nonexistent. Sleepers come to your attention by a slow, serendipitous path: a friend who runs into a friend who sets up the interview that just happens to be heard by a guy married to a bookseller. Sleepers tend to emerge from the world of independent bookstores, because independent bookstores are the kinds of places where readers go to ask the question that launches all sleeper hits: Can you recommend a book to me? Shipley was plugging Terry Kay's "To Dance with the White Dog" long before it became a best-seller. She had Melinda Haynes lined up to do a reading at her store before Oprah tapped "Mother of Pearl" as one of her recommended books and it shot onto the best-seller lists. She read David Guterson's "Snow Falling on Cedars" in manuscript and went crazy for it. "I called the publisher, and they said, 'We think it's a regional book.' And I said, 'Write it down. "M.G.S. says this is an important book."'" All this makes it sound as if she has a sixth sense for books that will be successful, but that's not quite right. People like Mary Gay Shipley don't merely predict sleeper hits; they create sleeper hits.
Most of us, of course, don't have someone like Mary Gay Shipley in our lives, and with the decline of the independent bookstore in recent years the number of Shipleys out there creating sleeper hits has declined as well. The big chain bookstores that have taken over the bookselling business are blockbuster factories, since the sheer number of titles they offer can make browsing an intimidating proposition. As David Gernert, who is John Grisham's agent and editor, explains, "If you walk into a superstore, that's where being a brand makes so much more of a difference. There is so much more choice it's overwhelming. You see walls and walls of books. In that kind of environment, the reader is drawn to the known commodity. The brand-name author is now a safe haven." Between 1986 and 1996, the share of book sales represented by the thirty top-selling hardcover books in America nearly doubled.
The new dominance of the blockbuster is part of a familiar pattern. The same thing has happened in the movie business, where a handful of heavily promoted films featuring "bankable" stars now command the lion's share of the annual box-office. We live, as the economists Robert Frank and Philip Cook have argued, in a "winner-take-all society," which is another way of saying that we live in the age of the blockbuster. But what if there were a way around the blockbuster? What if there were a simple way to build your very own Mary Gay Shipley? This is the promise of a new technology called collaborative filtering, one of the most intriguing developments to come out of the Internet age.
3.
If you want a recommendation about what product to buy, you might want to consult an expert in the field. That's a function that magazines like Car and Driver and Sound & Vision perform. Another approach is to poll users or consumers of a particular product or service and tabulate their opinions. That's what the Zagat restaurant guides and consumer-ratings services like J. D. Power and Associates do. It's very helpful to hear what an "expert" audiophile has to say about the newest DVD player, or what the thousands of owners of the new Volkswagen Passat have to say about reliability and manufacturing defects. But when it comes to books or movies--what might be called "taste products"--these kinds of recommendations aren't nearly as useful. Few moviegoers, for example, rely on the advice of a single movie reviewer. Most of us gather opinions from a variety of sources--from reviewers whom we have agreed with in the past, from friends who have already seen the movie, or from the presence of certain actors or directors whom we already like--and do a kind of calculation in our heads. It's an imperfect procedure. You can find out a great deal about what various critics have to say. But they're strangers, and, to predict correctly whether you'll like something, the person making the recommendation really has to know something about you.
That's why Shipley is such a powerful force in touting new books. She has lived in Blytheville all her life and has run the bookstore there for twenty-three years, and so her customers know who she is. They trust her recommendations. At the same time, she knows who they are, so she knows how to match up the right book with the right person. For example, she really likes David Guterson's new novel, "East of the Mountains," but she's not about to recommend it to anyone. It's about a doctor who has cancer and plans his own death and, she says, "there are some people dealing with a death in their family for whom this is not the book to read right now." She had similar reservations about Charles Frazier's "Cold Mountain." "There were people I know who I didn't think would like it," Shipley said. "And I'd tell them that. It's a journey story. It's not what happens at the end that matters, and there are some people for whom that's just not satisfying. I don't want them to take it home, try to read it, not like it, then not go back to that writer." Shipley knows what her customers will like because she knows who they are.
Collaborative filtering is an attempt to approximate this kind of insider knowledge. It works as a kind of doppelgänger search engine. All of us have had the experience of meeting people and discovering that they appear to have the very same tastes we do--that they really love the same obscure foreign films that we love, or that they are fans of the same little-known novelist whom we are obsessed with. If such a person recommended a book to you, you'd take that recommendation seriously, because cultural tastes seem to run in patterns. If you and your doppelgänger love the same ten books, chances are you'll also like the eleventh book he likes. Collaborative filtering is simply a system that sifts through the opinions and preferences of thousands of people and systematically finds your doppelgänger--and then tells you what your doppelgänger's eleventh favorite book is.
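Reduced to code, the doppelgänger search is a short exercise. The sketch below is only an illustration of the principle, not a description of any actual recommendation system; the readers, titles, and scores are all invented.

    # A toy doppelganger search: find the reader whose ratings most resemble yours,
    # then recommend that reader's favorite book you haven't rated yet.
    # Every name, title, and score below is invented for illustration.

    ratings = {
        "you":   {"Book A": 5, "Book B": 4, "Book C": 1, "Book D": 5},
        "alice": {"Book A": 5, "Book B": 4, "Book C": 1, "Book D": 5, "Book E": 5},
        "bob":   {"Book A": 1, "Book B": 2, "Book C": 5, "Book E": 4},
    }

    def similarity(a, b):
        """Higher when two readers agree on the books they have both rated."""
        shared = set(a) & set(b)
        if not shared:
            return 0.0
        mean_sq_diff = sum((a[t] - b[t]) ** 2 for t in shared) / len(shared)
        return 1.0 / (1.0 + mean_sq_diff)

    def recommend(target, table):
        # Rank everyone else by how closely their tastes track the target's.
        ranked = sorted(table, key=lambda u: similarity(table[target], table[u]), reverse=True)
        doppelganger = next(u for u in ranked if u != target)
        # Recommend the doppelganger's highest-rated book the target hasn't read.
        unseen = {t: r for t, r in table[doppelganger].items() if t not in table[target]}
        return max(unseen, key=unseen.get) if unseen else None

    print(recommend("you", ratings))   # prints "Book E": alice's favorite that "you" haven't rated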
John Riedl, a University of Minnesota computer scientist who is one of the pioneers of this technology, has set up a Web site called MovieLens, which is a very elegant example of collaborative filtering at work. Everyone who logs on--and tens of thousands of people have already done so--is asked to rate a series of movies on a scale of 1 to 5, where 5 means "must see" and 1 means "awful." For example, I rated "Rushmore" as a 5, which meant that I was put into the group of people who loved "Rushmore." I then rated "Summer of Sam" as a 1, which put me into the somewhat smaller and more select group that both loved "Rushmore" and hated "Summer of Sam." Collaborative-filtering systems don't work all that well at first, because, obviously, in order to find someone's cultural counterparts you need to know a lot more about them than how they felt about two movies. Even after I had given the system seven opinions (including "Election," 4; "Notting Hill," 2; "The Sting," 4; and "Star Wars," 1), it was making mistakes. It thought I would love "Titanic" and "Zero Effect," and I disliked them both. But after I had plugged in about fifteen opinions--which Riedl says is probably the minimum--I began to notice that the rating that MovieLens predicted I would give a movie and the rating I actually gave it were nearly always, almost eerily, the same. The system had found a small group of people who feel exactly the same way I do about a wide range of popular movies.
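Predicting a numeric rating, as MovieLens does, is conventionally handled by averaging the opinions of your nearest neighbors, weighted by how similar they are to you. The sketch below shows that textbook approach (user-based filtering with cosine similarity); it is a generic illustration, not MovieLens's actual engine, and the ratings matrix is made up.

    import numpy as np

    # Toy ratings matrix: rows are users, columns are movies, 0 means "not yet rated".
    # All numbers are invented; a real system would have thousands of rows.
    R = np.array([
        [5, 1, 4, 0, 2],   # user 0: the prediction target (movie 3 is unrated)
        [5, 1, 5, 4, 2],
        [4, 2, 4, 5, 1],
        [1, 5, 2, 1, 5],
    ], dtype=float)

    def predict(R, user, movie, k=2):
        """Estimate R[user, movie] from the k most similar users who rated that movie."""
        target, scored = R[user], []
        for other in range(len(R)):
            if other == user or R[other, movie] == 0:
                continue
            both = (target > 0) & (R[other] > 0)            # movies both users rated
            if not both.any():
                continue
            a, b = target[both], R[other][both]
            sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))   # cosine similarity
            scored.append((sim, R[other, movie]))
        neighbors = sorted(scored, reverse=True)[:k]
        return sum(s * r for s, r in neighbors) / sum(s for s, _ in neighbors)

    print(round(predict(R, user=0, movie=3), 1))   # about 4.5: the like-minded users rated it 4 and 5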
What makes this collaborative-filtering system different from those you may have encountered on Amazon.com or Barnesandnoble.com? In order to work well, collaborative filtering requires a fairly representative sample of your interests or purchases. But most of us use retailers like Amazon only for a small percentage of our purchases. For example, I buy the fiction I read at the Barnes & Noble around the corner from where I live. I buy most of my nonfiction in secondhand bookstores, and I use Amazon for gifts and for occasional work-related books that I need immediately, often for a specific and temporary purpose. That's why, bizarrely, Amazon currently recommends that I buy a number of books by the radical theorist Richard Bandler, none of which I have any desire to read. But if I were to buy a much bigger share of my books on-line, or if I "educated" the filter--as Amazon allows every customer to do--and told it what I think of its recommendations, it's easy to see how, over time, it could turn out to be a powerful tool.
In a new book, "Net Worth," John Hagel, an E-commerce consultant with McKinsey & Company, and his co-author, Marc Singer, suggest that we may soon see the rise of what they call "infomediaries," which are essentially brokers who will handle our preference information. Imagine, for example, that I had set up a company that collected and analyzed all your credit-card transactions. That information could be run through a collaborative filter, and the recommendations could be sold to retailers in exchange for discounts. Steve Larsen, the senior vice-president of marketing for Net Perceptions--a firm specializing in collaborative filtering which was started by Riedl and the former Microsoft executive Steven Snyder, among others--says that someday there might be a kiosk at your local video store where you could rate a dozen or so movies and have the computer generate recommendations for you from the movies the store has in stock. "Better yet, when I go there with my wife we put in my card and her card and say, 'Find us a movie we both like,'" he elaborates. "Or, even better yet, when we go with my fifteen-year-old daughter, 'Find us a movie all three of us like.'" Among marketers, the hope is that such computerized recommendations will increase demand. Right now, for example, thirty-five per cent of all people who enter a video store leave empty-handed, because they can't figure out what they want; the point of putting kiosks in those stores would be to lower that percentage. "It means that people might read more, or listen to music more, or watch videos more, because of the availability of an accurate and dependable and reliable method for them to learn about things that they might like," Snyder says.
One of Net Perceptions' clients is SkyMall, which is a company that gathers selections from dozens of mail-order catalogues--from Hammacher Schlemmer and L. L. Bean to the Wine Enthusiast--and advertises them in the magazines that you see in the seat pockets of airplanes. SkyMall licensed the system both for their Web site and for their 800-number call center, where the software looks for your doppelgänger while you are calling in with your order, and a few additional recommendations pop up on the operator's screen. SkyMall's system is still in its infancy, but, in a test, the company found that it has increased the total sales per customer somewhere between fifteen and twenty-five per cent. What's remarkable about the SkyMall system is that it links products from many different categories. It's one thing, after all, to surmise that if someone likes "The Remains of the Day" he is also going to like "A Room with a View." But it's quite another to infer that if you liked a particular item from the Orvis catalogue there's a certain item from Reliable Home Office that you'll also be interested in. "Their experience has been absolutely hilarious," Larsen says. "One of the very first recommendations that came out of the engine was for a gentleman who was ordering a blue cloth shirt, a twenty-eight-dollar shirt. Our engine recommended a hundred-and-thirty-five-dollar cigar humidor--and he bought it! I don't think anybody put those two together before."
The really transformative potential of collaborative filtering, however, has to do with the way taste products--books, plays, movies, and the rest--can be marketed. Marketers now play an elaborate game of stereotyping. They create fixed sets of groups--middle-class-suburban, young-urban-professional, inner-city-working-class, rural-religious, and so on--and then find out enough about us to fit us into one of those groups. The collaborative-filtering process, on the other hand, starts with who we are, then derives our cultural "neighborhood" from those facts. And these groups aren't permanent. They change as we change. I have never seen a film by Luis Buñuel, and I have no plans to. I don't put myself in the group of people who like Buñuel. But if I were to see "That Obscure Object of Desire" tomorrow and love it, and enter my preference on MovieLens, the group of people they defined as "just like me" would immediately and subtly change.
A group at Berkeley headed by the computer scientist Ken Goldberg has, for instance, developed a collaborative-filtering system for jokes. If you log on to the site, known as Jester, you are given ten jokes to rate. (Q.: Did you hear about the dyslexic devil worshipper? A.: He sold his soul to Santa.) These jokes aren't meant to be especially funny; they're jokes that reliably differentiate one "sense of humor" from another. On the basis of the humor neighborhood you fall into, Jester gives you additional jokes that it thinks you'll like. Goldberg has found that when he analyzes the data from the site--and thirty-six thousand people so far have visited Jester--the resulting neighborhoods are strikingly amorphous. In other words, you don't find those thirty-six thousand people congregating into seven or eight basic humor groups--off-color, say, or juvenile, or literary. "What we'd like to see is nice little clusters," Goldberg says. "But, when you look at the results, what you see is something like a cloud with sort of bunches, and nothing that is nicely defined. It's kind of like looking into the night sky. It's very hard to identify the constellations." The better you understand someone's particular taste pattern--the deeper you probe into what he finds interesting or funny--the less predictable and orderly his preferences become.
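Goldberg's "night sky" observation can be restated as a measurable property of the ratings data: however you carve the users into groups, the groups are barely tighter than the cloud as a whole. A rough way to check this is sketched below, with fabricated data standing in for Jester's and scikit-learn assumed to be available: run k-means for several cluster counts and look at a cohesion measure such as the silhouette score.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    # Fabricated stand-in for a joke-ratings matrix: 500 "users" rating 30 "jokes".
    # Real Jester data is not used here; this only illustrates the test itself.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 30))

    # If humor really split into a handful of tidy categories, some value of k would
    # produce a silhouette score well above zero (near 1.0 = crisp, well-separated
    # clusters; near 0.0 = the amorphous cloud Goldberg describes).
    for k in (4, 8, 16):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        print(f"k={k}  silhouette={silhouette_score(X, labels):.3f}")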
Collaborative filtering underscores a lesson that, for the better part of history, humans have been stubbornly resistant to learning: if you want to understand what one person thinks or feels or likes or does it isn't enough to draw inferences from the general social or demographic category to which he belongs. You cannot tell, with any reasonable degree of certainty, whether someone will like "The Girl's Guide to Hunting and Fishing" by knowing that the person is a single twenty-eight-year-old woman who lives in Manhattan, any more than you can tell whether somebody will commit a crime knowing only that he's a twenty-eight-year-old African-American male who lives in the Bronx. Riedl has taken demographic data from the people who log on to MovieLens--such as their age and occupation and sex--but he has found that it hardly makes his predictions any more accurate. "What you tell us about what you like is far more predictive of what you will like in the future than anything else we've tried," he says. "It seems almost dumb to say it, but you tell that to marketers sometimes and they look at you puzzled."
None of this means that standard demographic data is useless. If you were trying to figure out how to market a coming-of-age movie, you'd be most interested in collaborative-filtering data from people below, say, the age of twenty-eight. Facts such as age and sex and place of residence are useful in sorting the kinds of information you get from a recommendation engine. But the central claim of the collaborative-filtering movement is that, head to head, the old demographic and "psychographic" data cannot compete with preference data. This is a potentially revolutionary argument. Traditionally, there has been almost no limit to the amount of information marketers have wanted about their customers: academic records, work experience, marital status, age, sex, race, Zip Code, credit records, focus-group sessions--everything has been relevant, because in trying to answer the question of what we want marketers have taken the long way around and tried to find out first who we are. Collaborative filtering shows that, in predicting consumer preferences, none of this information is all that important. In order to know what someone wants, what you really need to know is what they've wanted.
4.
How will this affect the so-called blockbuster complex? When a bookstore's sales are heavily driven by the recommendations of a particular person--a Mary Gay Shipley--sleepers, relatively speaking, do better and blockbusters do worse. If you were going to read only Clancy and Grisham and Steel, after all, why would you need to ask Shipley what to read? This is what David Gernert, Grisham's agent, meant when he said that in a Barnes & Noble superstore a brand like Grisham enjoys a "safe haven." It's a book you read when there is no one, like Shipley, with the credibility to tell you what else you ought to read. Gernert says that at this point in Grisham's career each of his novels follows the same general sales pattern. It rides high on the best-seller lists for the first few months, of course, but, after that, "his sales pick up at very specific times--notably, Father's Day and Mother's Day, and then it will sell well again for Christmas." That description makes it clear that Grisham's books are frequently bought as gifts. And that's because gifts are the trickiest of all purchases. They require a guess about what somebody else likes, and in conditions of uncertainty the logical decision is to buy the blockbuster, the known quantity.
Collaborative filtering is, in effect, anti-blockbuster. The more information the system has about you, the more narrow and exclusive its recommendations become. It's just like Shipley: it uses its knowledge about you to steer you toward choices you wouldn't normally know about. I gave MovieLens my opinions on fifteen very mainstream American movies. I'm a timid and unsophisticated moviegoer. I rarely see anything but very commercial Hollywood releases. It told me, in return, that I would love "C'est Arrivé Près de Chez Vous," an obscure 1992 Belgian comedy, and "Shall We Dance," the 1937 Fred and Ginger vehicle. In other words, among my moviegoing soul mates are a number of people who share my views on mainstream fare but who also have much greater familiarity with foreign and classic films. The system essentially put me in touch with people who share my tastes but who happen to know a good deal more about movies. Collaborative filtering gives voice to the expert in every preference neighborhood. A world where such customized recommendations were available would allow Shipley's well-read opinions to be known not just in Blytheville but wherever there are people who share her taste in books.
Collaborative filtering, in short, has the ability to reshape the book market. When customized recommendations are available, choices become more heterogeneous. Big bookstores lose their blockbuster bias, because customers now have a way of narrowing down their choices to the point where browsing becomes easy again. Of the top hundred best-selling books of the nineteen-nineties, there are only a handful that can accurately be termed sleepers--Robert James Waller's "The Bridges of Madison County," James Redfield's "The Celestine Prophecy," John Berendt's "Midnight in the Garden of Good and Evil," Charles Frazier's "Cold Mountain." Just six authors--John Grisham, Tom Clancy, Stephen King, Michael Crichton, Dean Koontz, and Danielle Steel--account for sixty-three of the books on the list. In a world more dependent on collaborative filtering, Grisham, Clancy, King, and Steel would still sell a lot of books. But you'd expect to see many more books like "Divine Secrets of the Ya-Ya Sisterhood"--many more new writers--make their way onto the best-seller list. And the gap between the very best-selling books and those in the middle would narrow. Collaborative filtering, Hagel says, "favors the smaller, the more talented, more quality products that may have a hard time getting visibility because they are not particularly good at marketing."
5.
In recent years, That Bookstore in Blytheville has become a mecca for fiction in the South. Prominent writers drop by all the time to give readings in the back, by the potbellied stove. John Grisham himself has been there nine times, beginning with his tour for "The Firm," which was the hit that turned him into a blockbuster author. Melinda Haynes, Bobbie Ann Mason, Roy Blount, Jr., Mary Higgins Clark, Billie Letts, Sandra Brown, Jill Conner Browne, and countless others have recently made the drive up from Memphis. Sometimes Shipley will host a supper for them after the reading, and send the proceeds from the event to a local literacy program.
There seems, in this era of mega-stores, something almost impossibly quaint about That Bookstore in Blytheville. The truth is, though, that the kind of personalized recommendation offered by Mary Gay Shipley represents the future of marketing, not its past. The phenomenal success in recent years of Oprah Winfrey's book club--which created one best-seller after another on the strength of its nominations--suggests that, in this age of virtually infinite choice, readers are starved for real advice, desperate for a recommendation from someone they know and who they feel knows them. "Certain people don't want to waste their time experimenting with new books, and the function we provide here is a filter," Shipley says, and as she speaks you can almost hear the makings of another sleeper on the horizon. "If we like something, we get behind it. I'm reading a book right now called 'Nissa's Place,' by Alexandria LaFaye. She's a woman I think we're going to be hearing more from."
Clicks and Mortar
December 6, 1999
ANNALS OF RETAIL
Don't believe the Internet hype:
the real E-commerce revolution happened off-line.
1.
At the turn of this century, a Missouri farmer named D. Ward King invented a device that came to be known, in his honor, as the King Road Drag. It consisted of two wooden rails that lay side by side about three feet apart, attached by a series of wooden braces. If you pulled the King Drag along a muddy road, it had the almost magical effect of smoothing out the ruts and molding the dirt into a slight crown, so that the next time it rained the water would drain off to the sides. In 1906, when King demonstrated his device to a group of farmers in Wellsville, Kansas, the locals went out and built a hundred King Drags of their own within the week, which makes sense, because if you had asked a farmer at the turn of the century what single invention could make his life easier he would probably have wanted something that improved the roads. They were, in the late nineteenth century, a disaster: of the country's two million miles of roads, fewer than a hundred and fifty thousand had been upgraded with gravel or oil. The rest were dirt. They turned into rivers of mud when it was raining, and hardened into an impassable sea of ruts when it was not. A trip to church or to go shopping was an exhausting ordeal for many farmers. At one point in the early part of this century, economists estimated that it cost more to haul a bushel of wheat along ten miles of American dirt road than it did to ship it across the ocean from New York to Liverpool.
The King Road Drag was a simple invention that had the effect of reducing the isolation of the American farmer, and soon that simple invention led to all kinds of dramatic changes. Ever since the Post Office was established, for example, farmers had to make the difficult trek into town to pick up their mail. In the eighteen-nineties, Congress pledged that mail would be delivered free to every farmer's home, but only so long as rural communities could demonstrate that their roads were good enough for a mailman to pass by every day--which was a Catch-22 neatly resolved by the King Road Drag. And once you had rural free delivery and good roads, something like parcel post became inevitable. Through the beginning of the century, all packages that weighed more than four pounds were carried by private-express services, which were unreliable and expensive and would, outside big cities, deliver only to a set of depots. But if the mail was being delivered every day to rural dwellers, why not have the mailman deliver packages, too? In 1912, Congress agreed, and with that the age of the mail-order house began: now a farmer could look through a catalogue that contained many thousands of products and have them delivered right to his door. Smaller companies, with limited resources, had a way to bypass the middleman and reach customers all over the country. You no longer needed to sell to the consumer through actual stores made of bricks and mortar. You could build a virtual store!
In the first fifteen years of this century, in other words, America underwent something of a revolution. Before rural free delivery, if you didn't live in a town--and most Americans didn't--it wasn't really practical to get a daily newspaper. It was only after daily delivery that the country became "wired," in the sense that if something happened in Washington or France or the Congo one evening, everyone would know about it by the next morning. In 1898, mailmen were delivering about eighteen thousand pieces of mail per rural route. Within five years, that number had more than doubled, and by 1929 it had topped a hundred thousand.
Here was the dawn of the modern consumer economy--an economy in which information moved freely around the country, in which retailers and consumers, buyers and sellers became truly connected for the first time. "You may go to an average store, spend valuable time and select from a limited stock at retail prices," the fall 1915 Sears, Roebuck catalogue boasted, "or have our Big Store of World Wide Stocks at Economy Prices come to you in this catalog--the Modern Way." By the turn of the century, the Sears catalogue had run to over a thousand pages, listing tens of thousands of items in twenty-four departments: music, buggies, stoves, carriage hardware, drugs, vehicles, shoes, notions, sewing machines, cloaks, sporting goods, dry goods, hardware, groceries, furniture and baby carriages, jewelry, optical goods, books, stereopticons, men's clothing, men's furnishings, bicycles, gramophones, and harnesses. Each page was a distinct site, offering a reader in-depth explanations and descriptions well beyond what he would expect if he went to a store, talked to a sales clerk, and personally examined a product. To find all those products, the company employed scores of human search engines--"missionaries" who, the historians Boris Emmet and John Jeuck write, were "said to travel constantly, inspecting the stocks of virtually all retail establishments in the country, conversing with the public at large to discover their needs and desires, and buying goods 'of all kinds and descriptions'" in order to post them on the World Wide Stock.
The catalogue, as economists have argued, represented a radical transformation in the marketing and distribution of consumer goods. But, of course, that transformation would not have been possible unless you had parcel post, and you couldn't have had parcel post unless you had rural free delivery, and you could not have had rural free delivery without good roads, and you would not have had good roads without D. Ward King. So what was the genuine revolution? Was it the World Wide Stock or was it the King Road Drag?
2.
We are now, it is said, in the midst of another business revolution. "This new economy represents a tectonic upheaval in our commonwealth, a far more turbulent reordering than mere digital hardware has produced," Kevin Kelly, a former executive editor of Wired, writes in his book "New Rules for the New Economy." In "Cyber Rules," the software entrepreneurs Thomas M. Siebel and Pat House compare the advent of the Internet to the invention of writing, the appearance of a metal currency in the eastern Mediterranean several thousand years ago, and the adoption of the Arabic zero. "Business," Bill Gates states flatly in the opening sentence of "Business @ the Speed of Thought," "is going to change more in the next ten years than it has in the last fifty."
The revolution of today, however, turns out to be as difficult to define as the revolution of a hundred years ago. Kelly, for example, writes that because of the Internet "the new economy is about communication, deep and wide." Communication, he maintains, "is not just a sector of the economy. Communication is the economy." But which is really key--how we communicate, or what we communicate? Gates, meanwhile, is preoccupied with the speed of interaction in the new economy. Going digital, he writes, will "shatter the old way of doing business" because it will permit almost instant communication. Yet why is the critical factor how quickly I communicate some decision or message to you--as opposed to how long it takes me to make that decision, or how long it takes you to act on it? Gates called his book "Business @ the Speed of Thought," but thought is a slow and messy thing. Computers do nothing to speed up our thought process; they only make it a lot faster to communicate our thoughts once we've had them. Gates should have called his book "Business @ the Speed of Typing." In "Growing Up Digital," Don Tapscott even goes so far as to claim that the rise of the Internet has created an entirely new personality among the young. N-Geners, as Tapscott dubs the generation, have a different set of assumptions about work than their parents have. They thrive on collaboration, and many find the notion of a boss somewhat bizarre....They are driven to innovate and have a mindset of immediacy requiring fast results. They love hard work because working, learning, and playing are the same thing to them. They are creative in ways their parents could only imagine....Corporations who hire them should be prepared to have their windows and walls shaken.
Let's leave aside the fact that the qualities Tapscott ascribes to the Net Generation--energy, a "mindset of immediacy," creativity, a resistance to authority, and (of all things) sharp differences in outlook from their parents--could safely have been ascribed to every upcoming generation in history. What's interesting here is the blithe assumption, which runs through so much of the thinking and talking about the Internet, that this new way of exchanging information must be at the root of all changes now sweeping through our economy and culture. In these last few weeks before Christmas, as the country's magazines and airwaves become crowded with advertisements for the fledgling class of dot coms, we may be tempted to concur. But is it possible that, once again, we've been dazzled by the catalogues and forgotten the roads?
3.
The world's largest on-line apparel retailer is Lands' End, in Wisconsin. Lands' End began in 1963 as a traditional mail-order company. It mailed you its catalogue, and you mailed back your order along with a check. Then, in the mid-nineteen-eighties, Lands' End, like the rest of the industry, reinvented itself. It mailed you its catalogue, and you telephoned an 800 number with your order and paid with a credit card. Now Lands' End has moved on line. In the first half of this year, E-commerce sales accounted for ten per cent of Lands' End's total business, up two hundred and fifty per cent from last year. What has this move to the Web meant?
Lands' End has its headquarters in the tiny farming town of Dodgeville, about an hour's drive west of Madison, through the rolling Midwestern countryside. The main Lands' End campus is composed of half a dozen modern, low-slung buildings, clustered around a giant parking lot. In one of those buildings, there is a huge open room filled with hundreds of people sitting in front of computer terminals and wearing headsets. These are the people who take your orders. Since the bulk of Lands' End's business is still driven by the catalogue and the 800 number, most of those people are simply talking on the phone to telephone customers. But a growing percentage of the reps are now part of the company's Internet team, serving people who use the Lands' End Live feature on the company's Web site. Lands' End Live allows customers, with the click of a mouse, to start a live chat with a Lands' End representative or get a rep to call them at home, immediately.
On a recent fall day, a Lands' End Live user--let's call her Betty--was talking to one of the company's customer-service reps, a tall, red-haired woman named Darcia. Betty was on the Lands' End Web site to buy a pair of sweatpants for her young daughter, and had phoned to ask a few questions.
"What size did I order last year?" Betty asked. "I think I need one size bigger." Darcia looked up the record of Betty's purchase. Last year, she told Betty, she bought the same pants in big- kid's small.
"I'm thinking medium or large," Betty said. She couldn't decide.
"The medium is a ten or a twelve, really closer to a twelve," Darcia told her. "I'm thinking if you go to a large, it will throw you up to a sixteen, which is really big."
Betty agreed. She wanted the medium. But now she had a question about delivery. It was Thursday morning, and she needed the pants by Tuesday. Darcia told her that the order would go out on Friday morning, and with U.P.S. second-day air she would almost certainly get it by Tuesday. They briefly discussed spending an extra six dollars for the premium, next-day service, but Darcia talked Betty out of it. It was only an eighteen-dollar order, after all.
Betty hung up, her decision made, and completed her order on the Internet. Darcia started an on-line chat with a woman from the East Coast. Let's call her Carol. Carol wanted to buy the forty-nine-dollar attaché case but couldn't decide on a color. Darcia was partial to the dark olive, which she said was "a professional alternative to black." Carol seemed convinced, but she wanted the case monogrammed and there were eleven monogramming styles on the Web-site page.
"Can I have a personal suggestion?" she wrote.
"Sure," Darcia typed back. "Who is the case for?"
"A conservative psychiatrist," Carol replied.
Darcia suggested block initials, in black. Carol agreed, and sent the order in herself on the Internet. "All right," Darcia said, as she ended the chat. "She feels better." The exchange had taken twenty-three minutes.
Notice that in each case the customer filled out the actual order herself and sent it in to the Lands' End computer electronically--which is, of course, the great promise of E-commerce. But that didn't make the human element irrelevant. The customers still needed Darcia for advice on colors and styles, or for reassurance that a daughter was a medium and not a large. In each case, the sale was closed because that human interaction allayed the last-minute anxieties and doubts that so many of us have at the point of purchase. It's a mistake, in other words, to think that E-commerce will entirely automate the retail process. It just turns reps from order-takers into sales advisers.
"One of the big fallacies when the Internet came along was that you could get these huge savings by eliminating customer- service costs," Bill Bass, the head of E-commerce for Lands' End, says. "People thought the Internet was self-service, like a gas station. But there are some things that you cannot program a computer to provide. People will still have questions, and what you get are much higher-level questions. Like, 'Can you help me come up with a gift?' And they take longer."
Meanwhile, it turns out, Internet customers at Lands' End aren't much different from 800-number customers. Both groups average around a hundred dollars an order, and they have the same rate of returns. Call volume on the 800 numbers is highest on Mondays and Tuesdays, from ten in the morning until one in the afternoon. So is E-commerce volume. In the long term, of course, the hope is that the Web site will reduce dependence on the catalogue, and that would be a huge efficiency. Given that last year the company mailed two hundred and fifty million catalogues, costing about a dollar each, the potential savings could be enormous. And yet customers' orders on the Internet spike just after a new catalogue arrives at people's homes in exactly the same way that the 800-number business spikes just after the catalogue arrives. E-commerce users, it seems, need the same kind of visual, tangible prompting to use Lands' End as traditional customers. If Lands' End did all its business over the Internet, it would still have to send out something in the mail--a postcard or a bunch of fabric swatches or a slimmed-down catalogue. "We thought going into E-commerce it would be a different business," Tracy Schmit, an Internet analyst at the company, says. "But it's the same business, the same patterns, the same contacts. It's an extension of what we already do."
4.
Now consider what happens on what retailers call the "back end"--the customer-fulfillment side--of Lands' End's operations. Say you go to the company's Web site one afternoon and order a blue 32-16 oxford-cloth button-down shirt and a pair of size-9 Top-Siders. At midnight, the computer at Lands' End combines your order with all the other orders for the day: it lumps your shirt order with the hundred other orders, say, that came in for 32-16 blue oxford-cloth button-downs, and lumps your shoe order with the fifty other size-9 Top-Sider orders of the day. It then prints bar codes for every item, so each of those hundred shirts is assigned a sticker listing the location of blue oxford 32-16 shirts in the warehouse, the order that it belongs to, shipping information, and instructions for things like monogramming.
The next morning, someone known as a "picker" finds the hundred oxford-cloth shirts in that size, yours among them, and puts a sticker on each one, as does another picker in the shoe area with the fifty size-9 Top-Siders. Each piece of merchandise is placed on a yellow plastic tray along an extensive conveyor belt, and as the belt passes underneath a bar-code scanner the computer reads the label and assembles your order. The tray with your shirt on it circles the room until it is directly above a bin that has been temporarily assigned to you, and then tilts, sending the package sliding downward. Later, when your shoes come gliding along on the belt, the computer reads the bar code on the box and sends the shoe box tumbling into the same bin. Then the merchandise is packed and placed on another conveyor belt, and a bar-code scanner sorts the packages once again, sending the New York-bound packages to the New York-bound U.P.S. truck, the Detroit packages to the Detroit truck, and so on.
It's an extraordinary operation. When you stand in the middle of the Lands' End warehouse--while shirts and pants and sweaters and ties roll by at a rate that, at Christmas, can reach twenty-five thousand items an hour--you feel as if you're in Willy Wonka's chocolate factory. The warehouses are enormous buildings--as big, in all, as sixteen football fields--and the conveyor belts hang from the ceiling like giant pieces of industrial sculpture. Every so often, a belt lurches to a halt, and a little black scanner box reads the bar code and sends the package off again, directing it left or right or up or down, onto any number of separate sidings and overpasses. In the middle of one of the buildings, there is another huge room where thousands of pants, dangling from a jumbo-sized railing like a dry cleaner's rack, are sorted by color (so sewers don't have to change thread as often) and by style, then hemmed, pressed, bagged, and returned to the order-fulfillment chain--all within a day.
This system isn't unique to Lands' End. If you went to L. L. Bean or J.Crew or, for that matter, a housewares-catalogue company like Pottery Barn, you'd find the same kind of system. It's what all modern, automated warehouses look like, and it is as much a part of E-commerce as a Web site. In fact, it is the more difficult part of E-commerce. Consider the problem of the Christmas rush. Lands' End records something like thirty per cent of its sales during November and December. A well-supported Web site can easily handle those extra hits, but for the rest of the operation that surge in business represents a considerable strain. Lands' End, for example, aims to respond to every phone call or Lands' End Live query within twenty seconds, and to ship out every order within twenty-four hours of its receipt. In August, those goals are easily met. But, to maintain that level of service in November and December, Lands' End must hire an extra twenty-six hundred people, increasing its normal payroll by more than fifty per cent. Since unemployment in the Madison area is hovering around one per cent, this requires elaborate planning: the company charters buses to bring in students from a nearby college, and has made a deal in the past with a local cheese factory to borrow its workforce for the rush. Employees from other parts of the company are conscripted to help out as pickers, while others act as "runners" in the customer-service department, walking up and down the aisles and jumping into any seat made vacant by someone taking a break. Even the structure of the warehouse is driven, in large part, by the demands of the holiday season. Before the popularization of the bar code, in the early nineteen-eighties, Lands' End used what is called an "order picking" method. That meant that the picker got your ticket, then went to the shirt room and got your shirt, and the shoe room and got your shoes, then put your order together. If another shoe-and-shirt order came over next, she would have to go back to the shirts and back to the shoes all over again. A good picker under the old system could pick between a hundred and fifty and a hundred and seventy-five pieces an hour. The new technique, known as "batch picking," is so much more efficient that a good picker can now retrieve between six hundred and seven hundred pieces an hour. Without bar codes, if you placed an order in mid-December, you'd be hard pressed to get it by Christmas.
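How much faster batch picking is than order picking can be sketched in a few lines of code. This is only an illustration of the idea described above--the orders, the SKU names, and the trip-counting are invented for the sketch, not taken from Lands' End's actual software.

from collections import defaultdict

# Each order lists the SKUs it needs.
orders = {
    "order-1": ["oxford-32-16-blue", "topsider-9"],
    "order-2": ["oxford-32-16-blue"],
    "order-3": ["topsider-9", "oxford-32-16-blue"],
}

# Order picking: the picker walks to a stock location once per item per order.
order_picking_trips = sum(len(items) for items in orders.values())

# Batch picking: the overnight run groups the day's orders by SKU, so a picker
# visits each SKU's location once and stickers every unit with the bar code
# that ties it back to its order; the conveyor's scanners do the re-sorting.
batches = defaultdict(list)
for order_id, items in orders.items():
    for sku in items:
        batches[sku].append(order_id)
batch_picking_trips = len(batches)

print(order_picking_trips, batch_picking_trips)  # 5 trips versus 2

The same grouping is why the conveyor and the bar-code scanners matter so much: once picking is done by SKU rather than by order, something else has to reassemble the orders, and that is the job of the tilting trays and the per-order bins.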
None of this is to minimize the significance of the Internet. Lands' End has a feature on its Web site which allows you to try clothes on a virtual image of yourself--a feature that is obviously not possible with a catalogue. The Web site can list all the company's merchandise, whereas a catalogue has space to list only a portion of the inventory. But how big a role does the Internet ultimately play in E-commerce? It doesn't much affect the cost of running a customer-service department. It reduces catalogue costs, but it doesn't eliminate traditional marketing, because you still have to remind people of your Web site. You still need to master batch picking. You still need the Willy Wonka warehouse. You still need dozens of sewers in the inseaming department, and deals with the local cheese factory, and buses to ship in students every November and December. The head of operations for Lands' End is a genial man in his fifties named Phil Schaecher, who works out of a panelled office decorated with paintings of ducks which overlooks the warehouse floor. When asked what he would do if he had to choose between the two great innovations of the past twenty years--the bar code, which has transformed the back end of his business, and the Internet, which is transforming the front end--Schaecher paused, for what seemed a long time. "I'd take the Internet," he said finally, toeing the line that all retailers follow these days. Then he smiled. "But of course if we lost bar codes I'd retire the next day."
5.
On a recent fall morning, a young woman named Charlene got a call from a shipping agent at a firm in Oak Creek, Wisconsin. Charlene is a dispatcher with a trucking company in Akron, Ohio, called Roberts Express. She sits in front of a computer with a telephone headset on, in a large crowded room filled with people wearing headsets in front of computers, not unlike the large crowded room at Lands' End. The shipping agent told Charlene that she had to get seven drums of paint to Muskegon, Michigan, as soon as possible. It was 11:25 a.m. Charlene told the agent she would call her back, and immediately typed those details into her computer, which relayed the message to the two-way-communications satellite that serves as the backbone for the Roberts transportation network. The Roberts satellite, in turn, "pinged" the fifteen hundred independent truckers that Roberts works with, and calculated how far each available vehicle was from the customer in Oak Creek. Those data were then analyzed by proprietary software, which sorted out the cost of the job and the distance between Muskegon and Oak Creek, and sifted through more than fifteen variables governing the optimal distribution of the fleet.
This much--the satellite relay and the probability calculation--took a matter of seconds. The trip, Charlene's screen told her, was two hundred and seventy-four miles and would cost seven hundred and twenty-six dollars. The computer also gave her twenty-three candidates for the run, ranked in order of preference. The first, Charlene realized, was ineligible, because federal regulations limit the number of hours drivers can spend on the road. The second, she found out, was being held for another job. The third, according to the satellite, was fifty miles away, which was too far. But the fourth, a husband-and-wife team named Jerry and Ann Love, seemed ideal. They were just nineteen miles from Oak Creek. "I've worked with them before," Charlene said. "They're really nice people." At eleven-twenty-seven, Charlene sent the Loves an E-mail message, via satellite, that would show up instantly on the computer screens Roberts installs in the cabs of all its contractors. According to Roberts' rules, they had ten minutes to respond. "I'm going to give them a minute or two," Charlene said. There was no answer, so she called the Loves on their cell phone. Ann Love answered. "We'll do that," she said. Charlene chatted with her for a moment and then, as an afterthought, E-mailed the Loves again: "Thank you!" It was eleven-thirty.
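The logic Charlene's screen walks through--drop the trucks that are ineligible, then rank what is left--can be sketched roughly as follows. The field names, the fifty-mile cutoff, and the nearest-truck ranking are assumptions made for illustration; Roberts' proprietary software reportedly weighs more than fifteen variables.

from dataclasses import dataclass

@dataclass
class Truck:
    name: str
    miles_from_pickup: float   # computed from the satellite position "ping"
    hours_remaining: float     # federal hours-of-service limit
    held_for_other_job: bool

def rank_candidates(trucks, trip_hours, max_pickup_miles=50.0):
    eligible = [
        t for t in trucks
        if not t.held_for_other_job
        and t.hours_remaining >= trip_hours
        and t.miles_from_pickup <= max_pickup_miles
    ]
    # Stand-in for the real optimization: simply prefer the nearest eligible truck.
    return sorted(eligible, key=lambda t: t.miles_from_pickup)

fleet = [
    Truck("first candidate", 12.0, 1.0, False),    # out of driving hours
    Truck("second candidate", 15.0, 9.0, True),    # held for another job
    Truck("third candidate", 55.0, 9.0, False),    # too far from the pickup
    Truck("Jerry and Ann Love", 19.0, 9.0, False),
]
print([t.name for t in rank_candidates(fleet, trip_hours=5.0)])
# -> ['Jerry and Ann Love']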
Trucking companies didn't work this way twenty years ago. But Roberts uses its state-of-the-art communications and computer deployment to give the shipping business a new level of precision. If your pickup location is within twenty-five miles of one of the company's express centers--and Roberts has express centers in most major North American cities--Roberts will pick up a package of almost any size within ninety minutes, and it will do so twenty-four hours a day, seven days a week. If the cargo is located between twenty-six and fifty miles from an express center, it will be picked up within two hours. More than half of those deliveries will be made by midnight of the same day. Another twenty-five per cent will be made by eight o'clock the next morning. Ninety-six per cent of all Roberts deliveries are made within fifteen minutes of the delivery time promised when the order is placed. Because of its satellite system, the company knows precisely, within yards, where your order is at all times. The minute the computer tells her your truck is running fifteen minutes behind, Charlene or one of her colleagues will call you to work out some kind of solution. Roberts has been known to charter planes or send in Huey helicopters to rescue time-sensitive cargo stranded in traffic or in a truck that has broken down. The result is a truck-based system so efficient that Roberts estimates it can outperform air freight at distances of up to seven hundred or eight hundred miles.
Roberts, of course, isn't the only company to reinvent the delivery business over the past twenty years. In the same period, Federal Express has put together, from scratch, a network of six hundred and forty-three planes, forty-three thousand five hundred vehicles, fourteen hundred service centers, thirty-four thousand drop boxes, and a hundred and forty-eight thousand employees--all coordinated by satellite links and organized around a series of huge, automated, bar-code-driven Willy Wonka warehouses. Federal Express was even a pioneer in the development of aircraft antifog navigational equipment: if it absolutely, positively has to get there overnight, the weather can't be allowed to get in the way.
E-commerce would be impossible without this extraordinary infrastructure. Would you care that you could order a new wardrobe with a few clicks of a mouse if the package took a couple of weeks to get to you? Lands' End has undergone three major changes over the past couple of decades. The first was the introduction of an 800 number, in 1978; the second was express delivery, in 1994; and the third was the introduction of a Web site, in 1995. The first two innovations cut the average transaction time--the time between the moment of ordering and the moment the goods are received--from three weeks to four days. The third innovation has cut the transaction time from four days to, well, four days.
It isn't just that E-commerce depends on express mail; there's a sense in which E-commerce is express mail. Right now, billions of dollars are being spent around the country on so-called "last-mile delivery systems." Companies such as Webvan, in San Francisco, or Kozmo.com, in New York, are putting together networks of trucks and delivery personnel which can reach almost any home in their area within an hour. What if Webvan or Kozmo were somehow integrated into a huge, national, Roberts-style network of connected trucks? And what if that network were in turn integrated into the operations of a direct merchant like Lands' End? There may soon come a time when a customer from Northampton could order some shirts on LandsEnd.com at the height of the Christmas rush, knowing that the retailer's computer could survey its stock, assess its warehouse capabilities, "ping" a network of thousands of trucks it has at its disposal, look up how many other orders are going to his neck of the woods, check in with his local Kozmo or Webvan, and tell him, right then and there, precisely what time it could deliver those shirts to him that evening or the next morning. It's not hard to imagine, under such a system, that Lands' End's sales would soar; the gap between the instant gratification of a real store and the delayed gratification of a virtual store would narrow even further. It would be a revolution of sorts, a revolution of satellites, probability models, people in headsets, cell phones, truckers, logistics experts, bar codes, deals with the local cheese factory, and--oh yes, the Internet.
The interesting question, of course, is why we persist in identifying the E-commerce boom as an Internet revolution. Part of the reason, perhaps, is simply the convenience of the word "Internet" as a shorthand for all the technological wizardry of the last few decades. But surely whom and what we choose to celebrate in any period of radical change says something about the things we value. This fall, for example, the Goodyear Tire & Rubber Company--a firm with sales of more than thirteen billion dollars--was dropped from the Dow Jones industrial average. After all, Goodyear runs factories, not Web sites. It is based in Akron, not in Silicon Valley. It is part of the highway highway, not the information highway. The manufacturing economy of the early twentieth century, from which Goodyear emerged, belonged to trade unions and blue-collar men. But ours is the first economic revolution in history that the educated classes have sought to claim as wholly their own, a revolution of Kevin Kelly's "communication" and Bill Gates's "thought"--the two activities for which the Net-Geners believe themselves to be uniquely qualified. Today's talkers and thinkers value the conception of ideas, not their fulfillment. They give credit to the catalogue, but not to the postman who delivered it, or to the road he travelled on. The new economy was supposed to erase all hierarchies. Instead, it has devised another one. On the front end, there are visionaries. On the back end, there are drones.
6.
One of the very first packages ever delivered by parcel post, in 1913, was an eight-pound crate of apples sent from New Jersey to President Wilson at the White House. The symbolism of that early delivery was deliberate. When the parcel post was established, the assumption was that it would be used by farmers as a way of sending their goods cheaply and directly to customers in the city. "Let us imagine that the Gotham family," one journalist wrote at the time, "immured in the city by the demands of Father Gotham's business, knew that twice a week during the summer they could get from Farmer Ruralis, forty miles out in the country, a hamper of fresh-killed poultry, green peas, string beans, asparagus, strawberries, lettuce, cherries, summer squash, and what not; that the 'sass' would be only a day from garden to table; that prices would be lower than market prices; that the cost of transportation would be only thirty-five cents in and, say, eleven cents for the empty hamper back again. Would the Gotham family be interested?"
The Post Office told rural mailmen to gather the names and addresses of all those farmers along their routes who wanted to sell their produce by mail. Those lists were given to city mailmen, who delivered them along their routes, so interested customers could get in contact with interested farmers directly. Because customers wanted to know what kind of produce each farmer had to sell, local postmasters began including merchandise information on their lists, essentially creating a farm-produce mail-order catalogue. A California merchant named David Lubin proposed a scheme whereby a farmer would pick up colored cards from the post office--white for eggs, pink for chickens, yellow for butter--mark each card with his prices, and mail the cards back. If he had three chickens that week for a dollar each, he would mail three pink cards to the post office. There they would be put in a pigeonhole with all the other pink cards. Customers could come by and comparison shop, pick out the cards they liked, write their address on these cards, and have the postal clerk mail them back to the farmer. It was a pre-digital eBay. The scheme was adopted in and around Sacramento, and Congress appropriated ten thousand dollars to try a similar version of it on a large scale.
At about the same time, an assistant Postmaster General, James Blakslee, had the bright idea of putting together a fleet of parcel-post trucks, which would pick up farm produce from designated spots along the main roads and ship it directly to town. Blakslee laid out four thousand miles of produce routes around the country, to be covered by fifteen hundred parcel-post trucks. In 1918, in the system's inaugural run, four thousand day-old chicks, two hundred pounds of honey, five hundred pounds of smoked sausage, five hundred pounds of butter, and eighteen thousand eggs were carried from Lancaster, Pennsylvania, to New York City, all for $31.60 in postage. New York's Secretary of State called it "an epoch in the history of the United States and the world."
Only, it wasn't. The Post Office had devised a wonderful way of communicating between farmer and customer. But there is more to a revolution than communication, and within a few years the farm-to-table movement, which started out with such high hopes, was dead. The problem was that Blakslee's trucks began to break down, which meant that the food on board spoiled. Eggs proved hard to package, and so they often arrived damaged. Butter went rancid. In the winter of 1919-20, Blakslee collected a huge number of orders for potatoes, but, as Wayne Fuller writes in his wonderful history of the era, "RFD: The Changing Face of Rural America," "the potatoes that year were scarce, and good ones even scarcer, and when Blakslee's men were able to buy them and attempted delivery, nothing but trouble followed. Some of the potatoes were spoiled to begin with; some froze in transit; prices varied, deliveries went astray, and customers complained loudly enough for Congress to hear. One harried official wrote Blakslee that he could 'fill the mails with complaints from people who have ordered potatoes from October to December.' . . . Some people had been waiting over four months, either to have the potatoes delivered or their money refunded."
Parcel post, in the end, turned out to be something entirely different from what was originally envisioned--a means not to move farm goods from country to town but to move consumer goods from town to country. That is the first lesson from the revolution of a hundred years ago, and it's one that should give pause to all those eager to pronounce on the significance of the Internet age: the nature of revolutions is such that you never really know what they mean until they are over. The other lesson, of course, is that coming up with a new way of connecting buyers and sellers is a very fine thing, but what we care about most of all is getting our potatoes.
John Rock's Error
March 10, 2000
ANNALS OF MEDICINE
What the co-inventor of the Pill
didn't know about menstruation
can endanger women's health.
1.
John Rock was christened in 1890 at the Church of the Immaculate Conception in Marlborough, Massachusetts, and married by Cardinal William O'Connell, of Boston. He had five children and nineteen grandchildren. A crucifix hung above his desk, and nearly every day of his adult life he attended the 7 a.m. Mass at St. Mary's in Brookline. Rock, his friends would say, was in love with his church. He was also one of the inventors of the birth-control pill, and it was his conviction that his faith and his vocation were perfectly compatible. To anyone who disagreed he would simply repeat the words spoken to him as a child by his home-town priest: "John, always stick to your conscience. Never let anyone else keep it for you. And I mean anyone else." Even when Monsignor Francis W. Carney, of Cleveland, called him a "moral rapist," and when Frederick Good, the longtime head of obstetrics at Boston City Hospital, went to Boston's Cardinal Richard Cushing to have Rock excommunicated, Rock was unmoved. "You should be afraid to meet your Maker," one angry woman wrote to him, soon after the Pill was approved. "My dear madam," Rock wrote back, "in my faith, we are taught that the Lord is with us always. When my time comes, there will be no need for introductions."
In the years immediately after the Pill was approved by the F.D.A., in 1960, Rock was everywhere. He appeared in interviews and documentaries on CBS and NBC, in Time, Newsweek, Life, The Saturday Evening Post. He toured the country tirelessly. He wrote a widely discussed book, "The Time Has Come: A Catholic Doctor's Proposals to End the Battle Over Birth Control," which was translated into French, German, and Dutch. Rock was six feet three and rail-thin, with impeccable manners; he held doors open for his patients and addressed them as "Mrs." or "Miss." His mere association with the Pill helped make it seem respectable. "He was a man of great dignity," Dr. Sheldon J. Segal, of the Population Council, recalls. "Even if the occasion called for an open collar, you'd never find him without an ascot. He had the shock of white hair to go along with that. And posture, straight as an arrow, even to his last year." At Harvard Medical School, he was a giant, teaching obstetrics for more than three decades. He was a pioneer in in-vitro fertilization and the freezing of sperm cells, and was the first to extract an intact fertilized egg. The Pill was his crowning achievement. His two collaborators, Gregory Pincus and Min-Chueh Chang, worked out the mechanism. He shepherded the drug through its clinical trials. "It was his name and his reputation that gave ultimate validity to the claims that the pill would protect women against unwanted pregnancy," Loretta McLaughlin writes in her marvellous 1982 biography of Rock. Not long before the Pill's approval, Rock travelled to Washington to testify before the F.D.A. about the drug's safety. The agency examiner, Pasquale DeFelice, was a Catholic obstetrician from Georgetown University, and at one point, the story goes, DeFelice suggested the unthinkable--that the Catholic Church would never approve of the birth-control pill. "I can still see Rock standing there, his face composed, his eyes riveted on DeFelice," a colleague recalled years later, "and then, in a voice that would congeal your soul, he said, 'Young man, don't you sell my church short.' "
In the end, of course, John Rock's church disappointed him. In 1968, in the encyclical "Humanae Vitae," Pope Paul VI outlawed oral contraceptives and all other "artificial" methods of birth control. The passion and urgency that animated the birth-control debates of the sixties are now a memory. John Rock still matters, though, for the simple reason that in the course of reconciling his church and his work he made an error. It was not a deliberate error. It became manifest only after his death, and through scientific advances he could not have anticipated. But because that mistake shaped the way he thought about the Pill--about what it was, and how it worked, and most of all what it meant--and because John Rock was one of those responsible for the way the Pill came into the world, his error has colored the way people have thought about contraception ever since.
John Rock believed that the Pill was a "natural" method of birth control. By that he didn't mean that it felt natural, because it obviously didn't for many women, particularly not in its earliest days, when the doses of hormone were many times as high as they are today. He meant that it worked by natural means. Women can get pregnant only during a certain interval each month, because after ovulation their bodies produce a surge of the hormone progesterone. Progesterone--one of a class of hormones known as progestin--prepares the uterus for implantation and stops the ovaries from releasing new eggs; it favors gestation. "It is progesterone, in the healthy woman, that prevents ovulation and establishes the pre- and post-menstrual 'safe' period," Rock wrote. When a woman is pregnant, her body produces a stream of progestin in part for the same reason, so that another egg can't be released and threaten the pregnancy already under way. Progestin, in other words, is nature's contraceptive. And what was the Pill? Progestin in tablet form. When a woman was on the Pill, of course, these hormones weren't coming in a sudden surge after ovulation and weren't limited to certain times in her cycle. They were being given in a steady dose, so that ovulation was permanently shut down. They were also being given with an additional dose of estrogen, which holds the endometrium together and--as we've come to learn--helps maintain other tissues as well. But to Rock, the timing and combination of hormones wasn't the issue. The key fact was that the Pill's ingredients duplicated what could be found in the body naturally. And in that naturalness he saw enormous theological significance.
In 1951, for example, Pope Pius XII had sanctioned the rhythm method for Catholics because he deemed it a "natural" method of regulating procreation: it didn't kill the sperm, like a spermicide, or frustrate the normal process of procreation, like a diaphragm, or mutilate the organs, like sterilization. Rock knew all about the rhythm method. In the nineteen-thirties, at the Free Hospital for Women, in Brookline, he had started the country's first rhythm clinic for educating Catholic couples in natural contraception. But how did the rhythm method work? It worked by limiting sex to the safe period that progestin created. And how did the Pill work? It worked by using progestin to extend the safe period to the entire month. It didn't mutilate the reproductive organs, or damage any natural process. "Indeed," Rock wrote, oral contraceptives "may be characterized as a 'pill-established safe period,' and would seem to carry the same moral implications" as the rhythm method. The Pill was, to Rock, no more than "an adjunct to nature."
In 1958, Pope Pius XII approved the Pill for Catholics, so long as its contraceptive effects were "indirect"--that is, so long as it was intended only as a remedy for conditions like painful menses or "a disease of the uterus." That ruling emboldened Rock still further. Short-term use of the Pill, he knew, could regulate the cycle of women whose periods had previously been unpredictable. Since a regular menstrual cycle was necessary for the successful use of the rhythm method--and since the rhythm method was sanctioned by the Church--shouldn't it be permissible for women with an irregular menstrual cycle to use the Pill in order to facilitate the use of rhythm? And if that was true why not take the logic one step further? As the federal judge John T. Noonan writes in "Contraception," his history of the Catholic position on birth control:
If it was lawful to suppress ovulation to achieve a regularity necessary for successfully sterile intercourse, why was it not lawful to suppress ovulation without appeal to rhythm? If pregnancy could be prevented by pill plus rhythm, why not by pill alone? In each case suppression of ovulation was used as a means. How was a moral difference made by the addition of rhythm?
These arguments, as arcane as they may seem, were central to the development of oral contraception. It was John Rock and Gregory Pincus who decided that the Pill ought to be taken over a four-week cycle--a woman would spend three weeks on the Pill and the fourth week off the drug (or on a placebo), to allow for menstruation. There was and is no medical reason for this. A typical woman of childbearing age has a menstrual cycle of around twenty-eight days, determined by the cascades of hormones released by her ovaries. As first estrogen and then a combination of estrogen and progestin flood the uterus, its lining becomes thick and swollen, preparing for the implantation of a fertilized egg. If the egg is not fertilized, hormone levels plunge and cause the lining--the endometrium--to be sloughed off in a menstrual bleed. When a woman is on the Pill, however, no egg is released, because the Pill suppresses ovulation. The fluxes of estrogen and progestin that cause the lining of the uterus to grow are dramatically reduced, because the Pill slows down the ovaries. Pincus and Rock knew that the effect of the Pill's hormones on the endometrium was so modest that women could conceivably go for months without having to menstruate. "In view of the ability of this compound to prevent menstrual bleeding as long as it is taken," Pincus acknowledged in 1958, "a cycle of any desired length could presumably be produced." But he and Rock decided to cut the hormones off after three weeks and trigger a menstrual period because they believed that women would find the continuation of their monthly bleeding reassuring. More to the point, if Rock wanted to demonstrate that the Pill was no more than a natural variant of the rhythm method, he couldn't very well do away with the monthly menses. Rhythm required "regularity," and so the Pill had to produce regularity as well.
It has often been said of the Pill that no other drug has ever been so instantly recognizable by its packaging: that small, round plastic dial pack. But what was the dial pack if not the physical embodiment of the twenty-eight-day cycle? It was, in the words of its inventor, meant to fit into a case "indistinguishable" from a woman's cosmetics compact, so that it might be carried "without giving a visual clue as to matters which are of no concern to others." Today, the Pill is still often sold in dial packs and taken in twenty-eight-day cycles. It remains, in other words, a drug shaped by the dictates of the Catholic Church--by John Rock's desire to make this new method of birth control seem as natural as possible. This was John Rock's error. He was consumed by the idea of the natural. But what he thought was natural wasn't so natural after all, and the Pill he ushered into the world turned out to be something other than what he thought it was. In John Rock's mind the dictates of religion and the principles of science got mixed up, and only now are we beginning to untangle them.
2.
In 1986, a young scientist named Beverly Strassmann travelled to Africa to live with the Dogon tribe of Mali. Her research site was the village of Sangui in the Sahel, about a hundred and twenty miles south of Timbuktu. The Sahel is thorn savannah, green in the rainy season and semi-arid the rest of the year. The Dogon grow millet, sorghum, and onions, raise livestock, and live in adobe houses on the Bandiagara escarpment. They use no contraception. Many of them have held on to their ancestral customs and religious beliefs. Dogon farmers, in many respects, live much as people of that region have lived since antiquity. Strassmann wanted to construct a precise reproductive profile of the women in the tribe, in order to understand what female biology might have been like in the millennia that preceded the modern age. In a way, Strassmann was trying to answer the same question about female biology that John Rock and the Catholic Church had struggled with in the early sixties: what is natural? Only, her sense of "natural" was not theological but evolutionary. In the era during which natural selection established the basic patterns of human biology--the natural history of our species--how often did women have children? How often did they menstruate? When did they reach puberty and menopause? What impact did breast-feeding have on ovulation? These questions had been studied before, but never so thoroughly that anthropologists felt they knew the answers with any certainty.
Strassmann, who teaches at the University of Michigan at Ann Arbor, is a slender, soft-spoken woman with red hair, and she recalls her time in Mali with a certain wry humor. The house she stayed in while in Sangui had been used as a shelter for sheep before she came and was turned into a pigsty after she left. A small brown snake lived in her latrine, and would curl up in a camouflaged coil on the seat she sat on while bathing. The villagers, she says, were of two minds: was it a deadly snake--Kere me jongolo, literally, "My bite cannot be healed"--or a harmless mouse snake? (It turned out to be the latter.) Once, one of her neighbors and best friends in the tribe roasted her a rat as a special treat. "I told him that white people aren't allowed to eat rat because rat is our totem," Strassmann says. "I can still see it. Bloated and charred. Stretched by its paws. Whiskers singed. To say nothing of the tail." Strassmann meant to live in Sangui for eighteen months, but her experiences there were so profound and exhilarating that she stayed for two and a half years. "I felt incredibly privileged," she says. "I just couldn't tear myself away."
Part of Strassmann's work focussed on the Dogon's practice of segregating menstruating women in special huts on the fringes of the village. In Sangui, there were two menstrual huts--dark, cramped, one-room adobe structures, with boards for beds. Each accommodated three women, and when the rooms were full, latecomers were forced to stay outside on the rocks. "It's not a place where people kick back and enjoy themselves," Strassmann says. "It's simply a nighttime hangout. They get there at dusk, and get up early in the morning and draw their water." Strassmann took urine samples from the women using the hut, to confirm that they were menstruating. Then she made a list of all the women in the village, and for her entire time in Mali--seven hundred and thirty-six consecutive nights--she kept track of everyone who visited the hut. Among the Dogon, she found, a woman, on average, has her first period at the age of sixteen and gives birth eight or nine times. From menarche, the onset of menstruation, to the age of twenty, she averages seven periods a year. Over the next decade and a half, from the age of twenty to the age of thirty-four, she spends so much time either pregnant or breast-feeding (which, among the Dogon, suppresses ovulation for an average of twenty months) that she averages only slightly more than one period per year. Then, from the age of thirty-five until menopause, at around fifty, as her fertility rapidly declines, she averages four menses a year. All told, Dogon women menstruate about a hundred times in their lives. (Those who survive early childhood typically live into their seventh or eighth decade.) By contrast, the average for contemporary Western women is somewhere between three hundred and fifty and four hundred times.
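Strassmann's lifetime total can be checked with simple arithmetic from the averages she reports. The sketch below uses approximate age ranges read off the description above; it is a back-of-the-envelope check, not her data.

# Rough check of the "about a hundred" lifetime-menses figure for Dogon women,
# using the reported averages (ages and yearly rates are approximations).
segments = [
    (16, 20, 7.0),   # menarche to age twenty: about seven periods a year
    (20, 35, 1.1),   # prime childbearing years: pregnancy and nursing suppress cycling
    (35, 50, 4.0),   # declining fertility to menopause: about four a year
]
lifetime_menses = sum((end - start) * per_year for start, end, per_year in segments)
print(lifetime_menses)   # 104.5 -- consistent with "about a hundred times"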
Strassmann's office is in the basement of a converted stable next to the Natural History Museum on the University of Michigan campus. Behind her desk is a row of battered filing cabinets, and as she was talking she turned and pulled out a series of yellowed charts. Each page listed, on the left, the first names and identification numbers of the Sangui women. Across the top was a time line, broken into thirty-day blocks. Every menses of every woman was marked with an X. In the village, Strassmann explained, there were two women who were sterile, and, because they couldn't get pregnant, they were regulars at the menstrual hut. She flipped through the pages until she found them. "Look, she had twenty-nine menses over two years, and the other had twenty-three." Next to each of their names was a solid line of x's. "Here's a woman approaching menopause," Strassmann went on, running her finger down the page. "She's cycling but is a little bit erratic. Here's another woman of prime childbearing age. Two periods. Then pregnant. I never saw her again at the menstrual hut. This woman here didn't go to the menstrual hut for twenty months after giving birth, because she was breast-feeding. Two periods. Got pregnant. Then she miscarried, had a few periods, then got pregnant again. This woman had three menses in the study period." There weren't a lot of x's on Strassmann's sheets. Most of the boxes were blank. She flipped back through her sheets to the two anomalous women who were menstruating every month. "If this were a menstrual chart of undergraduates here at the University of Michigan, all the rows would be like this."
Strassmann does not claim that her statistics apply to every preindustrial society. But she believes--and other anthropological work backs her up--that the number of lifetime menses isn't greatly affected by differences in diet or climate or method of subsistence (foraging versus agriculture, say). The more significant factors, Strassmann says, are things like the prevalence of wet-nursing or sterility. But over all she believes that the basic pattern of late menarche, many pregnancies, and long menstrual-free stretches caused by intensive breast-feeding was virtually universal up until the "demographic transition" of a hundred years ago from high to low fertility. In other words, what we think of as normal--frequent menses--is in evolutionary terms abnormal. "It's a pity that gynecologists think that women have to menstruate every month," Strassmann went on. "They just don't understand the real biology of menstruation."
To Strassmann and others in the field of evolutionary medicine, this shift from a hundred to four hundred lifetime menses is enormously significant. It means that women's bodies are being subjected to changes and stresses that they were not necessarily designed by evolution to handle. In a brilliant and provocative book, "Is Menstruation Obsolete?," Drs. Elsimar Coutinho and Sheldon J. Segal, two of the world's most prominent contraceptive researchers, argue that this recent move to what they call "incessant ovulation" has become a serious problem for women's health. It doesn't mean that women are always better off the less they menstruate. There are times--particularly in the context of certain medical conditions--when women ought to be concerned if they aren't menstruating: In obese women, a failure to menstruate can signal an increased risk of uterine cancer. In female athletes, a failure to menstruate can signal an increased risk of osteoporosis. But for most women, Coutinho and Segal say, incessant ovulation serves no purpose except to increase the occurrence of abdominal pain, mood shifts, migraines, endometriosis, fibroids, and anemia--the last of which, they point out, is "one of the most serious health problems in the world."
Most serious of all is the greatly increased risk of some cancers. Cancer, after all, occurs because as cells divide and reproduce they sometimes make mistakes that cripple the cells' defenses against runaway growth. That's one of the reasons that our risk of cancer generally increases as we age: our cells have more time to make mistakes. But this also means that any change promoting cell division has the potential to increase cancer risk, and ovulation appears to be one of those changes. Whenever a woman ovulates, an egg literally bursts through the walls of her ovaries. To heal that puncture, the cells of the ovary wall have to divide and reproduce. Every time a woman gets pregnant and bears a child, her lifetime risk of ovarian cancer drops ten per cent. Why? Possibly because, between nine months of pregnancy and the suppression of ovulation associated with breast-feeding, she stops ovulating for twelve months--and saves her ovarian walls from twelve bouts of cell division. The argument is similar for endometrial cancer. When a woman is menstruating, the estrogen that flows through her uterus stimulates the growth of the uterine lining, causing a flurry of potentially dangerous cell division. Women who menstruate less frequently spare the endometrium that risk. Ovarian and endometrial cancer are characteristically modern diseases, consequences, in part, of a century in which women have come to menstruate four hundred times in a lifetime.
In this sense, the Pill really does have a "natural" effect. By blocking the release of new eggs, the progestin in oral contraceptives reduces the rounds of ovarian cell division. Progestin also counters the surges of estrogen in the endometrium, restraining cell division there. A woman who takes the Pill for ten years cuts her ovarian-cancer risk by around seventy per cent and her endometrial-cancer risk by around sixty per cent. But here "natural" means something different from what Rock meant. He assumed that the Pill was natural because it was an unobtrusive variant of the body's own processes. In fact, as more recent research suggests, the Pill is really only natural in so far as it's radical--rescuing the ovaries and endometrium from modernity. That Rock insisted on a twenty-eight-day cycle for his pill is evidence of just how deep his misunderstanding was: the real promise of the Pill was not that it could preserve the menstrual rhythms of the twentieth century but that it could disrupt them.
Today, a growing movement of reproductive specialists has begun to campaign loudly against the standard twenty-eight-day pill regimen. The drug company Organon has come out with a new oral contraceptive, called Mircette, that cuts the seven-day placebo interval to two days. Patricia Sulak, a medical researcher at Texas A.& M. University, has shown that most women can probably stay on the Pill, straight through, for six to twelve weeks before they experience breakthrough bleeding or spotting. More recently, Sulak has documented precisely what the cost of the Pill's monthly "off" week is. In a paper in the February issue of the journal Obstetrics and Gynecology, she and her colleagues documented something that will come as no surprise to most women on the Pill: during the placebo week, the number of users experiencing pelvic pain, bloating, and swelling more than triples, breast tenderness more than doubles, and headaches increase by almost fifty per cent. In other words, some women on the Pill continue to experience the kinds of side effects associated with normal menstruation. Sulak's paper is a short, dry, academic work, of the sort intended for a narrow professional audience. But it is impossible to read it without being struck by the consequences of John Rock's desire to please his church. In the past forty years, millions of women around the world have been given the Pill in such a way as to maximize their pain and suffering. And to what end? To pretend that the Pill was no more than a pharmaceutical version of the rhythm method?
3.
In 1980 and 1981, Malcolm Pike, a medical statistician at the University of Southern California, travelled to Japan for six months to study at the Atomic Bomb Casualty Commission. Pike wasn't interested in the effects of the bomb. He wanted to examine the medical records that the commission had been painstakingly assembling on the survivors of Hiroshima and Nagasaki. He was investigating a question that would ultimately do as much to complicate our understanding of the Pill as Strassmann's research would a decade later: why did Japanese women have breast-cancer rates six times lower than American women?
In the late forties, the World Health Organization began to collect and publish comparative health statistics from around the world, and the breast-cancer disparity between Japan and America had come to obsess cancer specialists. The obvious answer--that Japanese women were somehow genetically protected against breast cancer--didn't make sense, because once Japanese women moved to the United States they began to get breast cancer almost as often as American women did. As a result, many experts at the time assumed that the culprit had to be some unknown toxic chemical or virus unique to the West. Brian Henderson, a colleague of Pike's at U.S.C. and his regular collaborator, says that when he entered the field, in 1970, "the whole viral- and chemical-carcinogenesis idea was huge--it dominated the literature." As he recalls, "Breast cancer fell into this large, unknown box that said it was something to do with the environment--and that word 'environment' meant a lot of different things to a lot of different people. They might be talking about diet or smoking or pesticides."
Henderson and Pike, however, became fascinated by a number of statistical peculiarities. For one thing, the rate of increase in breast-cancer risk rises sharply throughout women's thirties and forties and then, at menopause, it starts to slow down. If a cancer is caused by some toxic outside agent, you'd expect that rate to rise steadily with each advancing year, as the number of mutations and genetic mistakes steadily accumulates. Breast cancer, by contrast, looked as if it were being driven by something specific to a woman's reproductive years. What was more, younger women who had had their ovaries removed had a markedly lower risk of breast cancer; when their bodies weren't producing estrogen and progestin every month, they got far fewer tumors. Pike and Henderson became convinced that breast cancer was linked to a process of cell division similar to that of ovarian and endometrial cancer. The female breast, after all, is just as sensitive to the level of hormones in a woman's body as the reproductive system. When the breast is exposed to estrogen, the cells of the terminal-duct lobular unit--where most breast cancer arises--undergo a flurry of division. And during the mid-to-late stage of the menstrual cycle, when the ovaries start producing large amounts of progestin, the pace of cell division in that region doubles.
It made intuitive sense, then, that a woman's risk of breast cancer would be linked to the amount of estrogen and progestin her breasts have been exposed to during her lifetime. How old a woman is at menarche should make a big difference, because the beginning of puberty results in a hormonal surge through a woman's body, and the breast cells of an adolescent appear to be highly susceptible to the errors that result in cancer. (For more complicated reasons, bearing children turns out to be protective against breast cancer, perhaps because in the last two trimesters of pregnancy the cells of the breast mature and become much more resistant to mutations.) How old a woman is at menopause should matter, and so should how much estrogen and progestin her ovaries actually produce, and even how much she weighs after menopause, because fat cells turn other hormones into estrogen.
Pike went to Hiroshima to test the cell-division theory. With other researchers at the medical archive, he looked first at the age when Japanese women got their period. A Japanese woman born at the turn of the century had her first period at sixteen and a half. American women born at the same time had their first period at fourteen. That difference alone, by their calculation, was sufficient to explain forty per cent of the gap between American and Japanese breast-cancer rates. "They had collected amazing records from the women of that area," Pike said. "You could follow precisely the change in age of menarche over the century. You could even see the effects of the Second World War. The age of menarche of Japanese girls went up right at that point because of poor nutrition and other hardships. And then it started to go back down after the war. That's what convinced me that the data were wonderful."
Pike, Henderson, and their colleagues then folded in the other risk factors. Age at menopause, age at first pregnancy, and number of children weren't sufficiently different between the two countries to matter. But weight was. The average post-menopausal Japanese woman weighed a hundred pounds; the average American woman weighed a hundred and forty-five pounds. That fact explained another twenty-five per cent of the difference. Finally, the researchers analyzed blood samples from women in rural Japan and China, and found that their ovaries--possibly because of their extremely low-fat diet--were producing about seventy-five per cent the amount of estrogen that American women were producing. Those three factors, added together, seemed to explain the breast-cancer gap. They also appeared to explain why the rates of breast cancer among Asian women began to increase when they came to America: on an American diet, they started to menstruate earlier, gained more weight, and produced more estrogen. The talk of chemicals and toxins and power lines and smog was set aside. "When people say that what we understand about breast cancer explains only a small amount of the problem, that it is somehow a mystery, it's absolute nonsense," Pike says flatly. He is a South African in his sixties, with graying hair and a salt-and-pepper beard. Along with Henderson, he is an eminent figure in cancer research, but no one would ever accuse him of being tentative in his pronouncements. "We understand breast cancer extraordinarily well. We understand it as well as we understand cigarettes and lung cancer."
What Pike discovered in Japan led him to think about the Pill, because a tablet that suppressed ovulation--and the monthly tides of estrogen and progestin that come with it--obviously had the potential to be a powerful anti-breast-cancer drug. But the breast was a little different from the reproductive organs. Progestin prevented ovarian cancer because it suppressed ovulation. It was good for preventing endometrial cancer because it countered the stimulating effects of estrogen. But in breast cells, Pike believed, progestin wasn't the solution; it was one of the hormones that caused cell division. This is one explanation for why, after years of studying the Pill, researchers have concluded that it has no effect one way or the other on breast cancer: whatever beneficial effect results from what the Pill does is cancelled out by how it does it. John Rock touted the fact that the Pill used progestin, because progestin was the body's own contraceptive. But Pike saw nothing "natural" about subjecting the breast to that heavy a dose of progestin. In his view, the amount of progestin and estrogen needed to make an effective contraceptive was much greater than the amount needed to keep the reproductive system healthy--and that excess was unnecessarily raising the risk of breast cancer. A truly natural Pill might be one that found a way to suppress ovulation without using progestin. Throughout the nineteen-eighties, Pike recalls, this was his obsession. "We were all trying to work out how the hell we could fix the Pill. We thought about it day and night."
4.
Pike's proposed solution is a class of drugs known as GnRHAs, which has been around for many years. GnRHAs disrupt the signals that the pituitary gland sends when it is attempting to order the manufacture of sex hormones. It's a circuit breaker. "We've got substantial experience with this drug," Pike says. Men suffering from prostate cancer are sometimes given a GnRHA to temporarily halt the production of testosterone, which can exacerbate their tumors. Girls suffering from what's called precocious puberty--puberty at seven or eight, or even younger--are sometimes given the drug to forestall sexual maturity. If you give GnRHA to women of childbearing age, it stops their ovaries from producing estrogen and progestin. If the conventional Pill works by convincing the body that it is, well, a little bit pregnant, Pike's pill would work by convincing the body that it was menopausal.
In the form Pike wants to use it, GnRHA will come in a clear glass bottle the size of a saltshaker, with a white plastic mister on top. It will be inhaled nasally. It breaks down in the body very quickly. A morning dose simply makes a woman menopausal for a while. Menopause, of course, has its risks. Women need estrogen to keep their hearts and bones strong. They also need progestin to keep the uterus healthy. So Pike intends to add back just enough of each hormone to solve these problems, but much less than women now receive on the Pill. Ideally, Pike says, the estrogen dose would be adjustable: women would try various levels until they found one that suited them. The progestin would come in four twelve-day stretches a year. When someone on Pike's regimen stopped the progestin, she would have one of four annual menses.
Pike and an oncologist named Darcy Spicer have joined forces with another oncologist, John Daniels, in a startup called Balance Pharmaceuticals. The firm operates out of a small white industrial strip mall next to the freeway in Santa Monica. One of the tenants is a paint store, another looks like some sort of export company. Balance's offices are housed in an oversized garage with a big overhead door and concrete floors. There is a tiny reception area, a little coffee table and a couch, and a warren of desks, bookshelves, filing cabinets, and computers. Balance is testing its formulation on a small group of women at high risk for breast cancer, and if the results continue to be encouraging, it will one day file for F.D.A. approval.
"When I met Darcy Spicer a couple of years ago," Pike said recently, as he sat at a conference table deep in the Balance garage, "he said, 'Why don't we just try it out? By taking mammograms, we should be able to see changes in the breasts of women on this drug, even if we add back a little estrogen to avoid side effects.' So we did a study, and we found that there were huge changes." Pike pulled out a paper he and Spicer had published in the Journal of the National Cancer Institute, showing breast X-rays of three young women. "These are the mammograms of the women before they start," he said. Amid the grainy black outlines of the breast were large white fibrous clumps--clumps that Pike and Spicer believe are indicators of the kind of relentless cell division that increases breast-cancer risk. Next to those x-rays were three mammograms of the same women taken after a year on the GnRHA regimen. The clumps were almost entirely gone. "This to us represents that we have actually stopped the activity inside the breasts," Pike went on. "White is a proxy for cell proliferation. We're slowing down the breast."
Pike stood up from the table and turned to a sketch pad on an easel behind him. He quickly wrote a series of numbers on the paper. "Suppose a woman reaches menarche at fifteen and menopause at fifty. That's thirty-five years of stimulating the breast. If you cut that time in half, you will change her risk not by half but by half raised to the power of 4.5." He was working with a statistical model he had developed to calculate breast-cancer risk. "That's one-twenty-third. Your risk of breast cancer will be one- twenty-third of what it would be otherwise. It won't be zero. You can't get to zero. If you use this for ten years, your risk will be cut by at least half. If you use it for five years, your risk will be cut by at least a third. It's as if your breast were to be five years younger, or ten years younger--forever." The regimen, he says, should also provide protection against ovarian cancer.
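(To make the arithmetic explicit--a rough sketch, assuming only what Pike describes, namely that his model scales lifetime risk with the total years of ovarian stimulation raised to the power of 4.5: halving a thirty-five-year window gives (17.5/35)^4.5 = (0.5)^4.5, which is about 0.044, or roughly one part in twenty-three. That is where the figure of one-twenty-third comes from; the published model behind his shorter-term "at least" estimates is more detailed than this sketch.)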
Pike gave the sense that he had made this little speech many times before, to colleagues, to his family and friends--and to investors. He knew by now how strange and unbelievable what he was saying sounded. Here he was, in a cold, cramped garage in the industrial section of Santa Monica, arguing that he knew how to save the lives of hundreds of thousands of women around the world. And he wanted to do that by making young women menopausal through a chemical regimen sniffed every morning out of a bottle. This was, to say the least, a bold idea. Could he strike the right balance between the hormone levels women need to stay healthy and those that ultimately make them sick? Was progestin really so important in breast cancer? There are cancer specialists who remain skeptical. And, most of all, what would women think? John Rock, at least, had lent the cause of birth control his Old World manners and distinguished white hair and appeals from theology; he took pains to make the Pill seem like the least radical of interventions--nature's contraceptive, something that could be slipped inside a woman's purse and pass without notice. Pike was going to take the whole forty-year mythology of "natural" and sweep it aside. "Women are going to think, I'm being manipulated here. And it's a perfectly reasonable thing to think." Pike's South African accent gets a little stronger as he becomes more animated. "But the modern way of living represents an extraordinary change in female biology. Women are going out and becoming lawyers, doctors, presidents of countries. They need to understand that what we are trying to do isn't abnormal. It's just as normal as when someone hundreds of years ago had menarche at seventeen and had five babies and had three hundred fewer menstrual cycles than most women have today. The world is not the world it was. And some of the risks that go with the benefits of a woman getting educated and not getting pregnant all the time are breast cancer and ovarian cancer, and we need to deal with it. I have three daughters. The earliest grandchild I had was when one of them was thirty-one. That's the way many women are now. They ovulate from twelve or thirteen until their early thirties. Twenty years of uninterrupted ovulation before their first child! That's a brand-new phenomenon!"
5.
John Rock's long battle on behalf of his birth-control pill forced the Church to take notice. In the spring of 1963, just after Rock's book was published, a meeting was held at the Vatican between high officials of the Catholic Church and Donald B. Straus, the chairman of Planned Parenthood. That summit was followed by another, on the campus of the University of Notre Dame. In the summer of 1964, on the eve of the feast of St. John the Baptist, Pope Paul VI announced that he would ask a committee of church officials to reëxamine the Vatican's position on contraception. The group met first at the Collegio San Jose, in Rome, and it was clear that a majority of the committee were in favor of approving the Pill. Committee reports leaked to the National Catholic Register confirmed that Rock's case appeared to be winning. Rock was elated. Newsweek put him on its cover, and ran a picture of the Pope inside. "Not since the Copernicans suggested in the sixteenth century that the sun was the center of the planetary system has the Roman Catholic Church found itself on such a perilous collision course with a new body of knowledge," the article concluded. Paul VI, however, was unmoved. He stalled, delaying a verdict for months, and then years. Some said he fell under the sway of conservative elements within the Vatican. In the interim, theologians began exposing the holes in Rock's arguments. The rhythm method " 'prevents' conception by abstinence, that is, by the non-performance of the conjugal act during the fertile period," the Catholic journal America concluded in a 1964 editorial. "The pill prevents conception by suppressing ovulation and by thus abolishing the fertile period. No amount of word juggling can make abstinence from sexual relations and the suppression of ovulation one and the same thing." On July 29, 1968, in the "Humanae Vitae" encyclical, the Pope broke his silence, declaring all "artificial" methods of contraception to be against the teachings of the Church.
In hindsight, it is possible to see the opportunity that Rock missed. If he had known what we know now and had talked about the Pill not as a contraceptive but as a cancer drug--not as a drug to prevent life but as one that would save life--the church might well have said yes. Hadn't Pius XII already approved the Pill for therapeutic purposes? Rock would only have had to think of the Pill as Pike thinks of it: as a drug whose contraceptive aspects are merely a means of attracting users, of getting, as Pike put it, "people who are young to take a lot of stuff they wouldn't otherwise take."
But Rock did not live long enough to understand how things might have been. What he witnessed, instead, was the terrible time at the end of the sixties when the Pill suddenly stood accused--wrongly--of causing blood clots, strokes, and heart attacks. Between the mid-seventies and the early eighties, the number of women in the United States using the Pill fell by half. Harvard Medical School, meanwhile, took over Rock's Reproductive Clinic and pushed him out. His Harvard pension paid him only seventy-five dollars a year. He had almost no money in the bank and had to sell his house in Brookline. In 1971, Rock left Boston and retreated to a farmhouse in the hills of New Hampshire. He swam in the stream behind the house. He listened to John Philip Sousa marches. In the evening, he would sit in the living room with a pitcher of martinis. In 1983, he gave his last public interview, and it was as if the memory of his achievements was now so painful that he had blotted it out.
He was asked what the most gratifying time of his life was. "Right now," the inventor of the Pill answered, incredibly. He was sitting by the fire in a crisp white shirt and tie, reading "The Origin," Irving Stone's fictional account of the life of Darwin. "It frequently occurs to me, gosh, what a lucky guy I am. I have no responsibilities, and I have everything I want. I take a dose of equanimity every twenty minutes. I will not be disturbed about things."
Once, John Rock had gone to seven-o'clock Mass every morning and kept a crucifix above his desk. His interviewer, the writer Sara Davidson, moved her chair closer to his and asked him whether he still believed in an afterlife.
"Of course I don't," Rock answered abruptly. Though he didn't explain why, his reasons aren't hard to imagine. The church could not square the requirements of its faith with the results of his science, and if the church couldn't reconcile them how could Rock be expected to? John Rock always stuck to his conscience, and in the end his conscience forced him away from the thing he loved most. This was not John Rock's error. Nor was it his church's. It was the fault of the haphazard nature of science, which all too often produces progress in advance of understanding. If the order of events in the discovery of what was natural had been reversed, his world, and our world, too, would have been a different place.
"Heaven and Hell, Rome, all the Church stuff--that's for the solace of the multitude," Rock said. He had only a year to live. "I was an ardent practicing Catholic for a long time, and I really believed it all then, you see."
The Young Garmentos
April 24, 2000
LETTER FROM LOS ANGELES
The T-shirt trade becomes a calling.
1.
Dov Charney started his T-shirt business, American Apparel, on the corner of Santa Fe Avenue and the 10 Freeway, a mile or so from downtown Los Angeles. Actually, his factory was built directly underneath the eastbound and westbound lanes, and the roof over the room where the cutters and sewers work was basically the freeway itself, so that the clicking and clacking of sewing machines mixed with the rumble of tractor trailers. It was not, as Dov was the first to admit, an ideal location, with the possible exception that it was just two blocks from the Playpen, the neighborhood strip bar, which made it awfully convenient whenever he decided to conduct a fitting. "Big companies tend to hire fitting models at a hundred bucks an hour," Dov explained recently as he headed over to the Playpen to test some of his new T-shirts. "But they only give you one look. At a strip bar, you get a cross-section of chicks. You've got big chicks, little chicks, big-assed chicks, little-assed chicks, chicks with big tits, and chicks with little tits. You couldn't ask for a better place to fit a shirt."
He had three of his staff with him, and half a dozen samples of his breakthrough Classic Girl line of "baby T"s, in this case shirts with ribbed raglan three-quarter sleeves in lilac and pink. He walked quickly, leaning forward slightly, as if to improve his aerodynamics. Dov is thirty-one years old and has thick black hair and blue-tinted aviator glasses, and tends to dress in khakis and knit vintage shirts, with one of his own T-shirts as an undergarment. In front of the Playpen, Dov waved to the owner, a middle-aged Lebanese man in a red guayabera, and ushered his group into the gloom of the bar. At this hour--two o'clock in the afternoon--the Playpen was almost empty; just one girl gyrated for a customer, to what sounded like the music from "Ali Baba and the Forty Thieves." The situation was ideal, because it meant the rest of the girls had time to model.
The first to come over was Diana, dark-haired and buxom. She slipped out of a yellow mesh dress and pulled on one of Dov's baby T's. Dov examined her critically. He was concerned about the collar. The Classic Girl is supposed to have a snug fit, with none of the torquing and bowing that plague lesser shirts. But the prototype was bunching around the neck. Dov gestured to one of his colleagues. "Olin, look what's going on here. I think there's too much binding going into the machine." Diana turned around, and wiggled her behind playfully. Dov pulled the T-shirt tight. "I think it could be a little longer here," he said, pursing his lips. Baby T's, in their earlier incarnation, were short, in some cases above the belly button--something that Dov considers a mistake. The music was now deafening, and over a loudspeaker a "lap-dance promo" was being announced. Dov, oblivious, turned his attention to Mandy, a svelte, long-legged blonde in a black bikini. On her, Dov observed, the shirt did not fit so "emphatically" around the chest as it had on Diana. Dov looked Mandy up and down, tugging and pulling to get the shirt just right. "When you're doing a fitting, often the more oddly shaped girl will tell you a lot more," he said. By now, a crowd of strippers was gathering around him, presumably attracted by the novelty of being asked by a customer to put clothes on. But Dov had seen all he needed to. His life's great cause--which is to produce the world's finest T-shirt for between three and four dollars wholesale--had advanced another step. "What did I learn today?" he asked, as he strode out the door. "I learned that my sleeves are perfect. But I see a quality problem with the collar." He thought for a moment. "And I definitely have to add an inch to the garment."
2.
There is a town in upstate New York, just north and west of Albany, called Gloversville, so named because in the late nineteenth century and the early part of the twentieth century ninety-five per cent of the fine gloves sold in the United States were manufactured there. At one time, there were a hundred and sixteen glove factories in the town, employing twelve thousand people and turning out fifteen million dollars' worth of gloves a year. New glove start-ups appeared all the time, whenever some glove entrepreneur--some ambitious handschumacher--had a better idea about how to make a glove. A trade journal, Glovers Review, covered the industry's every step. Local firms--such as Jacob Adler & Co. and Louis Meyers & Sons and Elite Glove Co.--became nationally known brands. When the pogroms of Eastern Europe intensified, in the eighteen-eighties, the Jewish glove cutters of Warsaw--the finest leather artisans of nineteenth-century Europe--moved en masse to Gloversville, because Gloversville was where you went in those days if you cared about gloves.
It's hard to imagine anyone caring so deeply about gloves, and had we visited Gloversville in its prime most of us would have found it a narrow and provincial place. But if you truly know gloves and think about them and dream about them and, more important, if you are surrounded every day by a community of people who know and think and dream about gloves, a glove becomes more than a glove. In Gloversville, there was an elaborate social hierarchy. The handschumacher considered himself socially and intellectually superior to the schuster and the schneider--the shoemaker and the tailor. To cover the hands, after all, was the highest calling. (As the glover's joke goes, "Did you ever see anyone talk using his boots?") Within the glove world, in turn, the "makers"--the silkers, the closers, and the fourchetters, who sewed the gloves--were inferior to the "cutters," who first confronted the hide, and who advertised their status by going to work wearing white shirts and collars, bow ties or cravats, tigereye cufflinks, and carefully pressed suits. A skilled cutter could glance at a glove and see in it the answers to a hundred questions. Is the leather mocha, the most pliable of all skins, taken from the hide of long-black-haired Arabian sheep? Or is it South African capeskin, the easiest to handle? Is it kid from Spain, peccary from the wild pigs of Brazil and Mexico, chamois from Europe, or cabretta, from a Brazilian hairy sheep? Is the finish "grained"--showing the outside of the hide--or "velvet," meaning that the leather has been buffed? Is it sewn in a full-piqué stitch or a half-piqué, an osann or an overseam? Do the color and texture of the fourchette--the strip of leather that forms the sides of the fingers--match the adjoining leather? The lesson of Gloversville is that behind every ordinary object is a group of people to whom that object is anything but ordinary.
Dov Charney lives in his own, modern-day version of Gloversville. He is part of a world that cares about T-shirts every bit as much as the handschumachers cared about peccary and cabretta. It is impossible to talk about Dov, for example, without talking about his best friend, Rick Klotz, who runs a clothing company named Fresh Jive, about a mile and a half from Dov's factory. Rick, who is thirty-two, designs short-sleeve shirts and baggy pants and pullovers and vests and printed T-shirts with exquisite graphics (featuring everything from an obscure typographical scheme to the Black Panthers). In the eighties, Rick was a punker, at least until everyone else got short hair, at which point he grew his hair long. Later, in his Ted Nugent-and-TransAm phase, he had, he says, a "big, filthy mustache, like Cheech." Now he is perfectly bald, and drives a black custom-made late-model Cadillac Fleetwood Limited, with a VCR in the back, and, because he sits very low in the seat, and bobs up and down to very loud hip-hop as he drives, the effect, from the street, is slightly comic, like that of a Ping-Pong ball in choppy water. When Dov first came to Los Angeles, a few years ago, he crashed at Rick's apartment in Hollywood, and the two grew so close that Rick believes he and Dov were "separated at birth."
"If it wasn't for Rick, I wouldn't have been able to make it," Dov says. "I slept on his couch. I checked in for a few days, stayed for a year." This was after an initial foray that Dov had made into the T-shirt business, in South Carolina in the early nineties, failed. "When he lived with me, he was on the brink," Rick added. "Every day was the same. Go to sleep at two with the phone. Then wake up at six to call back East. One time, he was just crying and losing it. It was just so heavy. I was, like, 'Dude, what are you doing?'"
What do Rick and Dov have in common? It isn't a matter of personality. Dov says that sometimes when he's out with Rick he'll spot one of Rick's T-shirts, and he'll shout, "There's one of your T-shirts!" Rick will look down and away, embarrassed, because he's so acutely aware of how uncool that sounds. Dov couldn't care less. When he spots his own work, he can hardly contain himself. "I always say, 'Hey' "--Dov put on the accent of his native Montreal--"'where did you get that shirt?' Like, if I'm on the subway in New York City. I say, 'You want some more?' I take my bag and give them out for free. I'm excited about it. I could be watching TV at night, or I could be watching a porno, and, boom, there is my T-shirt. I've made millions of them. I always know it!"
What the two of them share is a certain sensibility. Rick grew up in the Valley and Dov grew up in Montreal, but it's as if they were born and raised in the same small town, where the T-shirt was something that you lived and died for. At dinner one recent night in L.A., Rick talked about how he met Dov, several years ago, at a big trade show in Las Vegas. "I'm at this party sitting out on the balcony. I see this guy dancing and he's--what's the word?" And here Rick did a kind of spastic gyration in his seat. "Imbecilic. He didn't care what anybody thought. And he catches me looking and goes like this." Rick made two pistols out of his fingers, and fired one hand after another. "I was, like, in love."
Dov seemed touched. "You know, I knew of Rick long before I ever met him. His T-shirt graphics are some of the most respected T-shirt graphics in the world. I swear to God."
But Rick was being modest again. "No, they're not."
"If you mention Fresh Jive in most industrialized countries to people that know what good graphics are on T-shirts, they're, like . . . " Dov made an appreciative noise. "I swear, it's like a connoisseur's wine."
"Maybe at one time," Rick murmured.
"He is an artist!" Dov went on, his voice rising. "His canvas is fabric!"
3.
On the day that he made his foray to the Playpen, Dov met with a fortyish man named Jhean. In the garment-manufacturing business in Los Angeles, the up-and-coming entrepreneurs are Persian and Korean. (Dov has a partner who is Korean.) The occasional throwback, like Dov, is Jewish. Jhean, however, is Haitian. He used to work in government, but now he is in the garment business, a career change of which Dov heartily approved. Jhean was wearing tight black pants, a red silk shirt open to mid-chest, and a gold chain. Dov put his arm around him affectionately. "Jhean is a crazy man," he announced, to no one in particular. "He was going to be one of my partners. We were going to get this whole Montreal Jewish-Korean-Haitian thing going." Jhean turned away, and Dov lowered his voice to a whisper. "Jhean has it in his blood, you know," he said, meaning a feel for T-shirts.
Dov led Jhean outside, and they sat on a bench, the sun peeking through at them between the off-ramp and the freeway lanes. Jhean handed Dov a men's Fruit of the Loom undershirt, size medium. It was the reason for Jhean's visit. "Who can do this for me?" he asked.
Dov took the shirt and unfolded it slowly. He held it up in front of his eyes, as a mother might hold a baby, and let out a soft whistle. "This is an unbelievable garment," he said. "Nobody has the machines to make it, except for two parties that I'm aware of. Fruit of the Loom. And Hanes. The shirt is a two-by-one rib. They've taken out one or two of the needles. It's a coarse yarn. And it's tubular, so there is no waste. This is one of the most efficient garments in the world. It comes off the tube like a sock."
Some T-shirts have two seams down each side: they are made with "open width" fabric, by sewing together the front and the back of the T-shirt. This T-shirt had no seams. It was cut from cotton fabric that had been knitted into a T-shirt-size tube, which is a trickier procedure but means less wasted fabric, lower sewing costs, and less of the twisting that can distort a garment.
Dov began to run his fingers along the bottom of the shirt, which had been not hemmed but overlocked--with a stitch--to save even more fabric. "This costs, with the right equipment, maybe a dollar. My cost is a dollar-thirty, a dollar-fifty. The finest stuff is two-fifty, two-sixty. If you can make this shirt, you can make millions. But you can't make this shirt. Hanes actually does this even better than Fruit of the Loom. They've got this dialled down." Jhean wondered if he could side-seam it, but Dov just shook his head. "If you side-seam it, you lose the whole energy."
You could tell that Dov was speaking as much to himself as to Jhean. He was saying that he couldn't reproduce a masterpiece like that undershirt, either. But there was no defeat in his voice, because he knew enough about T-shirts to realize that there is more than one way to make a perfect garment. Dov likes to point out that the average American owns twenty-five T-shirts--twenty-five!--and, even if you reckon, as he does, that of those only between four and seven are in regular rotation, that's still an enormous market.
The garment in question was either eighteen- or twenty-singles yarn, which is standard for T-shirts. But what if a T-shirt maker were to use thirty-singles yarn, knitted on a fine-gauge machine, which produces a thinner, more "fashion-forward" fabric? The Fruit of the Loom piece was open-end cotton, and open-end is coarse. Dov likes "ring-spun combed" yarn, which is much softer, and costs an extra eighty cents a pound. Softness also comes from the way the fabric is processed before cutting, and Dov is a stickler for that kind of detail. "I have a lot of secret ingredients," he says. "Just like K.F.C. There is the amount of yarn in one revolution, which determines the tightness. There's the spacing of the needle. Then there's the finishing. What kind of chemicals are you using in the finishing? We think this through. We've developed a neurosis about this." In his teens, Dov hooked up with a friend who was selling printed T's outside the Montreal Forum, and Dov's contribution was to provide American Hanes instead of the Canadian poly-cotton-blend Penmans. The Hanes, Dov says, was "creamier," and he contended that the Canadian T-shirt consumer deserved that extra creaminess. When he's inspecting rolls of fabric, Dov will sometimes break into the plastic package wrap and run his hand over the cotton, palm flat, and if you look behind his tinted aviators you'll see that his eyes have closed slightly. Once, he held two white swatches up to the light, in order to demonstrate how one had "erections"--little fibres that stood up straight on the fabric--and the other did not, and then he ran his hand ever so slightly across the surface of the swatch he liked, letting the fibres tickle his palm. "I'm particular," Dov explained. "Like in my underwear. I'm very committed to Hanes thirty-two. I've been wearing it for twelve years. I sleep in it. And if Hanes makes any adjustments I'm picking it up. I watch. They change their labels, they use different countries to make their shit, I know."
Dov was back inside his factory now, going from the room where all the sewers sit, stitching up T-shirts, to a passageway lined with big rolls of fabric. The fact that Jhean's Fruit of the Loom undershirt was of rib fabric launched him on one of his favorite topics, which was the fabric he personally helped rediscover--baby rib. Baby rib is rib in which the ridges are so close together and the cotton is so fine that it looks like standard T-shirt jersey, and Dov's breakthrough was to realize that because of the way it stretches and supports and feels it was perfect for girls. "See this, that's conventional rib." He pulled on a piece of white fabric, exposing wide ridges of cotton. "It's knitted on larger machines. And it's a larger, bulkier yarn. It's poor-quality cotton. But girls want softness. So, rather than take the cheap road, I've taken the higher road." Dov's baby rib uses finer cotton and tighter stitching, and the fit is tighter across the chest and shoulders, the way he believes a T-shirt ought to look. "There were a few influences," he said, reflecting on the creative process that brought him to baby rib. "I'm not sure which girlfriend, but we can name some." He ticked them off on his fingers. "There's Marcella, from Argentina. I met her in South Beach. She wore these little tops made in South America. And they were finer than the tops that girls were wearing in the States. I got such a boner looking at her in that T-shirt that I thought, This is doing something for me. We've got to explore this opportunity. This was four, five years ago. O.K., I broke up with her, and I started going out with this stripper, Julie, from South Carolina. She had a gorgeous body. She was all-American. And, you know, Julie looked so great in those little T-shirts. She put one on and it meant something."
Dov pulled out a single typewritten page, a draft of a "mission statement" he was preparing for the industry. This was for a new line of Standard American T-shirts he wanted to start making--thirty-singles, ring-spun, tubular shirts knit on custom-made Asian equipment. "Dear Client," it began:
During the last ten years major T-shirt makers such as Hanes and Fruit of the Loom have focused on being "heavier" and generously cut. Innovation and style have been put aside, and there has been a perpetual price war during the last four years. The issues are who can be cheaper, bigger or heavier. . . . Concerns about fit or issues of softness or stretch have been the last priority and have been barely considered. In order to create leadership we have reconstructed the T-shirt and have made a deviation from the traditional "Beefy-T" styled garment. We have redone the typical pattern. It is slightly more fitted--especially in the sleeve and armhole opening. . . . Yes the fabric is lighter, and we think that is a positive aspect of the garment. The garment has a stretch that is reminiscent of T-shirts from decades ago.
Dov was peering over my shoulder as I read. "We're going to kick everybody's ass," he announced. "The finest T-shirts are six dollars a piece wholesale. The shittiest shirts are like two dollars. We're going to come in at three and have the right stuff. I'm making the perfect fit. I'm going to manufacture this like gasoline."
If you ask Dov why he's going to these lengths, he'll tell you that it matters to him that Americans can buy an affordable and high-quality T-shirt. That's an admirable notion, but, of course, most of us don't really know what constitutes a high-quality T-shirt: we don't run our hands over a swatch of cotton and let the little fibres tickle our palm, or ruminate on the difference between side-seaming and tubularity. For that matter, few people who bought dress gloves in 1900 knew the difference between a full-piqué and a half-piqué stitch, between high-grade and merely medium-grade peccary. Producers, the economics textbooks tell us, are disciplined by the scrutiny of the marketplace. Yet what of commonplace articles such as T-shirts and gloves, about which most customers don't know enough or care enough to make fine discriminations? Discipline really comes not from customers but from other producers. And here again the economics textbooks steer us wrong, because they place too much emphasis on the role of formal competitors, the Gap or Hanes or the other big glove-maker in your niche. To be sure, Dov can occasionally be inspired by a truly exceptional garment like, say, a two-by-one ribbed undershirt from Fruit of the Loom. But in Gloversville the critical person is not so much the distant rival as the neighbor who is also a contractor, or the guy at the bar downtown who used to be in the business, or the friend at synagogue who is also an expert glove-maker--all of whom can look at your work with a practiced eye and shame you if it isn't right. Dov is motivated to produce a high-quality T-shirt at three dollars because that would mean something to Jhean and to Olin and, most of all, to Rick, whose T-shirt graphics are respected around the world. In Gloversville, the market is not an economic mechanism but--and this is the real power of a place like that--a social one.
"Everybody got so technically obsessed with reduced shrinkage," Dov went on, and by "everyone" he meant a group of people you could count on the fingers of one hand. "That was a big mistake for the industry because they took away the natural stretch property of a lot of the jersey. If you look at vintage shirts, they had a lot of stretch. Today, they don't. They are like these print boards. They are practically woven in comparison. I say fuck the shrinkage. I have a theory on width shrinkage on rib: I don't care. In fact, you put it on, it will come back." He was pacing back and forth and talking even more rapidly than usual. "I'm concerned about linear shrinkage. But, if it doesn't have any width shrinkage at all, I become concerned, too. I have a fabric I'm working on with a T-shirt engineer. It keeps having zero width shrinkage. That's not desirable!"
Dov stopped. He had spotted something out of the corner of his eye. It was one of his workers, a young man with a mustache and a goatee and slicked-back hair. He was wearing a black custom T, with two white stripes down the arms. Dov started walking toward him. "Oh, my God. You want to see something?" He reached out and flipped up the tag at the back of the cutter's shirt. "It's a Fresh Jive piece. I made it for Rick five years ago. Somehow this shirt just trickled back here." The sweet serendipity of it all brought a smile to his face.
4.
While Dov was perfecting his baby T's, Rick was holding a fashion shoot for his elegant new women's-wear line, Fresh Jive Domestics, which had been conceived by a young designer named Jessica. The shoot was at Rick's friend Deidre's house, a right-angled, white-stuccoed, shag-rugged modernist masterpiece under the Hollywood sign. Deidre rents it from the drummer of the seventies supergroup Bread. Madonna's old house is several hundred yards to the west of Deidre's place, and Aldous Huxley used to live a few hundred yards in the other direction, with the result that her block functions as a kind of architectural enactment of postwar Los Angeles intellectual life. For Rick's purposes, though, the house's main points of attraction were its fabulous details, like the little white Star Trek seats around the kitchen counter and the white baby grand in the window with the autographed Hugh Hefner photo and the feisty brown-haired spitz-collie named Sage barricaded in the kitchen. Rick had a box of disposable cameras, and as he shot the models other people joined in with the disposables, so that in the end Rick would be able to combine both sets of pictures in a brag book. It made for a slightly chaotic atmosphere--particularly since there were at least seven highly active cell phones in the room, each with a different ring, competing with the hip-hop on the stereo--and in the midst of it all Rick walked over to the baby grand and, with a mischievous look on his face, played the opening chords of Beethoven's "Pathétique" sonata.
Rick was talking about his plans to open a Fresh Jive store in Los Angeles. But he kept saying that it couldn't be on Melrose Avenue, where all the street-wear stores are. "Maybe that would be good for sales," he said. Then he shook his head. "No way."
Deidre, who was lounging next to the baby grand, started laughing. "You know what, Rick?" she said. "I think it's all about a Fresh Jive store without any Fresh Jive stuff in it."
It was a joke, but in some way not a joke, because that's the sort of thing that Rick might actually do. He's terrified by the conventional. At dinner the previous evening, for example, he and Dov had talked about a particular piece--the sports-style V-necked raglan custom T with stripes that Dov had spotted on the cutter. Rick introduced it years ago and then stopped making it when everyone else started making it, too.
"One of our biggest retailers takes me into this room last year," Rick explained. "It's full of custom T-shirts. He said, 'You started this, and everybody else took advantage of it. But you didn't go with it.' He was pissed off at me."
The businessman in Rick knew that he shouldn't have given up on the shirt so quickly, that he could have made a lot more money had he stayed and exploited the custom-T market. But he couldn't do that, because if he had been in that room with all the other custom T's he risked being known in his world as the guy who started the custom-T trend and then ran out of new ideas. Retail chains like J.C. Penney and Millers Outpost sometimes come to Rick and ask if they can carry Fresh Jive, or ask if he will sell them a big run of a popular piece, and he usually says no. He will allow his clothes to appear only in certain street-wear boutiques. His ambition is to grow three times as big as he is now--to maybe a thirty-million-dollar company--but no larger.
This is the sensibility of the artisan, and it isn't supposed to play much of a role anymore. We live in the age of the entrepreneur, who responds rationally to global pressures and customer demands in order to maximize profit. To the extent that we still talk of Gloversville--and the glove-making business there has long since faded away--we talk of it as a place that people need to leave behind. There was Lucius N. Littauer, for example, who, having made his fortune with Littauer Brothers Glove Co., in downtown Gloversville, went on to Congress, became a confidant of Presidents McKinley and Roosevelt, and then put up the money for what is now the Kennedy School of Government, at Harvard University. There was Samuel Goldwyn, the motion-picture magnate, who began his career as a cutter with Gloversville's Elite Glove Co. In 1912, he jumped into the movie business. He went to Hollywood. He rode horses and learned to play tennis and croquet. Like so many immigrant Jews in the movie industry, he enacted through his films a very public process of assimilation. This is the oldest of American stories: the heroic young man who leaves the small town to play on the big stage--who wants to be an entrepreneur, not an artisan. But the truth is that we always get the story wrong. It isn't that Littauer and Goldwyn left Gloversville to find the real culture, because the real culture comes from Gloversville, too; places like Washington and Hollywood persist and renew themselves only because Littauers and Goldwyns arrive from time to time, bringing with them a little piece of the real thing.
"The one paranoia Rick has is that, God forbid, he makes something that another company has," Dov said, at dinner with Rick that night.
Rick nodded. "In my personal life. Ask Dov. Every piece of clothing I own. Nobody else can have it."
Rick was wearing a pair of jeans and a plain white T-shirt, but if you looked closely you noticed that it wasn't just any jeans-and-T-shirt ensemble. The pants were an unusual denim chino, from Rick's Beggars Banquet collection. And the shirt?
"That is a very well-thought-out item," Dov said, gesturing toward Rick. "It's a purple-label BVD. It's no longer available. Size medium. Of all the shirts I've studied, this one has a phenomenal fit."He reached across the table and ran his fingers around the lower edge of the sleeve. Dov is a believer in a T-shirt that is snug on the biceps. "It's not the greatest fabric. But it shrinks perfectly. I actually gave him that shirt. I came back from one of my customers in New York City, on Grand Street, that happens to resell that particular garment."
It's all of a piece, in the end: the purple-label BVD, the custom-T that he designed but now won't touch. If in Dov's world the true competitive pressures are not economic but social, Rick's version of Gloversville is driven not by the marketplace but by personality--the particular, restless truculence of the sort of person who will give up almost anything and go to any lengths not to be like anyone else.
"We're doing this line of casual shoes," Rick said, during a rare lull in one of Dov's T-shirt soliloquies. "One is the Crip Slip. It's that corduroy slipper that the gang kids would always wear. The other is the Wino, which is that really cheap canvas slipper that you can buy at K mart for seven dollars and that the winos wear when they're, like, really hung over." His big new idea, Rick explained, was to bring out a line of complementary twelve-inch dolls in those characters. "We could have a guy with baggy pants and a pushcart," he went on. "You know, you pull down his pants and there's skid marks. And we have a full gangster for the Crip Slip."
Rick was so excited about the idea that he was still talking about it the next day at work. He was with a Fresh Jive designer named Jupiter--a skateboarder from Las Vegas of German, Welsh, Irish, French, Chinese, and Spanish extraction--and a guy named Matt, who wore on his chest a gold-plated, diamond-encrusted Star of David the size of a Peppermint Pattie. "The idea is that the doll would pump the shoe, and the shoe would pump the doll," Rick said. "The doll for the Crip Slip would be totally gangster. The handkerchief. The plaid shirt or the wife beater. A forty in his hand. Flashing signs. Wouldn't that be crazy?" And then Rick caught himself. "Omigod. The doll for the Crip Slip will have interchangeable hands, with different gang signs!"
Matt looked awestruck: "Ohhh, that'll be sick!"
"Wooooow." Jupiter dragged the word out, and shook his head slowly. "That's crazy!"
5.
A few days later, Dov drove down to San Diego for Action Sports Retail, a big trade show in the street-wear world. Dov makes the rounds of A.S.R. twice a year, walking up and down through the cavernous conference center, stopping at the booths of hundreds of T-shirt companies and persuading people to buy his shirts wholesale for their lines. This year, he was busy scouting locations for American Apparel's new factory, and so he arrived a day late, clutching a motorized mini-scooter. To his great irritation, he wasn't allowed to carry it in. "This is the most uncool show," he announced, after haggling fruitlessly with the guard at the gate.
But his mood lifted quickly. How could it not? This was A.S.R., and everyone was wearing T-shirts or selling T-shirts, and because this was a place where people knew their T-shirts a lot of those T-shirts were Dov's. He started down one of the aisles. He pointed to a booth on the left. "They use my T-shirts." Next to that booth was another small company. "They use my T-shirts, too." He was wearing khakis and New Balance sneakers and one of his men's T-shirts in baby rib (a controversial piece, because the binding on the collar was a mere half inch). On his back he had a huge orange pack full of catalogues and samples, and every time he spotted a potential customer he would pull the backpack off and rummage through it, and the contents would spill on the floor.
Dov spotted a young woman walking toward him in a baby T. "That's a competitor's shirt. I can tell right away. The spacing of the needle. The fabric is not baby rib." He high-fived someone in another booth. Another young woman, in another T-shirt booth, loomed up ahead. "That's my shirt right there. In the green. I even know the stock number." He turned to her: "You're the girl in the olive forty-three, sixty-six sleeveless V with one-inch binding."
She laughed, but Dov was already off again, plunging back into the fray. "I always have an insecurity that I can be crushed by a bigger business," he said. "Like, Fruit of the Loom decided to do baby T's, and I got a little scared. But then I saw their shirt, and I laughed, because they missed it." Do the suits over at Fruit of the Loom have the same feel for a shirt that Dov does? Were they inspired by Marcella of Argentina and Julie from South Carolina? Those guys were off somewhere in a suburban office park. They weren't in Gloversville. "It was horribly designed," Dov went on. "It was thick, open-end, eighteen-singles coarse rib. It's not the luxury that I offer. See the rib on that collar?" He pulled up the binding on the T-shirt of a friend standing next to him. "Look how thick and spacey it is. That's what they did. They missed the point." Somewhere a cell phone was ringing. A young woman walked past. "Hey!" Dov called out. "That's my T-shirt!"
The New-Boy Network
May 29, 2000
DEPT. OF HUMAN RESOURCES
What do job interviews really tell us?
1.
Nolan Myers grew up in Houston, the elder of two boys in a middle-class family. He went to Houston's High School for the Performing and Visual Arts and then Harvard, where he intended to major in History and Science. After discovering the joys of writing code, though, he switched to computer science. "Programming is one of those things you get involved in, and you just can't stop until you finish," Myers says. "You get involved in it, and all of a sudden you look at your watch and it's four in the morning! I love the elegance of it." Myers is short and slightly stocky and has pale-blue eyes. He smiles easily, and when he speaks he moves his hands and torso for emphasis. He plays in a klezmer band called the Charvard Chai Notes. He talks to his parents a lot. He gets B's and B-pluses.
This spring, in the last stretch of his senior year, Myers spent a lot of time interviewing for jobs with technology companies. He talked to a company named Trilogy, down in Texas, but he didn't think he would fit in. "One of Trilogy's subsidiaries put ads out in the paper saying that they were looking for the top tech students, and that they'd give them two hundred thousand dollars and a BMW," Myers said, shaking his head in disbelief. In another of his interviews, a recruiter asked him to solve a programming problem, and he made a stupid mistake and the recruiter pushed the answer back across the table to him, saying that his "solution" accomplished nothing. As he remembers the moment, Myers blushes. "I was so nervous. I thought, Hmm, that sucks!" The way he says that, though, makes it hard to believe that he really was nervous, or maybe what Nolan Myers calls nervous the rest of us call a tiny flutter in the stomach. Myers doesn't seem like the sort to get flustered. He's the kind of person you would call the night before the big test in seventh grade, when nothing made sense and you had begun to panic.
I like Nolan Myers. He will, I am convinced, be very good at whatever career he chooses. I say those two things even though I have spent no more than ninety minutes in his presence. We met only once, on a sunny afternoon in April at the Au Bon Pain in Harvard Square. He was wearing sneakers and khakis and a polo shirt, in a dark-green pattern. He had a big backpack, which he plopped on the floor beneath the table. I bought him an orange juice. He fished around in his wallet and came up with a dollar to try and repay me, which I refused. We sat by the window. Previously, we had talked for perhaps three minutes on the phone, setting up the interview. Then I E-mailed him, asking him how I would recognize him at Au Bon Pain. He sent me the following message, with what I'm convinced—again, on the basis of almost no evidence—is typical Myers panache: "22ish, five foot seven, straight brown hair, very good-looking. :)." I have never talked to his father, his mother, or his little brother, or any of his professors. I have never seen him ecstatic or angry or depressed. I know nothing of his personal habits, his tastes, or his quirks. I cannot even tell you why I feel the way I do about him. He's good-looking and smart and articulate and funny, but not so good-looking and smart and articulate and funny that there is some obvious explanation for the conclusions I've drawn about him. I just like him, and I'm impressed by him, and if I were an employer looking for bright young college graduates, I'd hire him in a heartbeat.
I heard about Nolan Myers from Hadi Partovi, an executive with Tellme, a highly touted Silicon Valley startup offering Internet access through the telephone. If you were a computer-science major at M.I.T., Harvard, Stanford, Caltech, or the University of Waterloo this spring, looking for a job in software, Tellme was probably at the top of your list. Partovi and I talked in the conference room at Tellme's offices, just off the soaring, open floor where all the firm's programmers and marketers and executives sit, some of them with bunk beds built over their desks. (Tellme recently moved into an old printing plant—a low-slung office building with a huge warehouse attached—and, in accordance with new-economy logic, promptly turned the old offices into a warehouse and the old warehouse into offices.) Partovi is a handsome man of twenty-seven, with olive skin and short curly black hair, and throughout our entire interview he sat with his chair tilted precariously at a forty-five-degree angle. At the end of a long riff about how hard it is to find high-quality people, he blurted out one name: Nolan Myers. Then, from memory, he rattled off Myers's telephone number. He very much wanted Myers to come to Tellme.
Partovi had met Myers in January, during a recruiting trip to Harvard. "It was a heinous day," Partovi remembers. "I started at seven and went until nine. I'd walk one person out and walk the other in." The first fifteen minutes of every interview he spent talking about Tellme—its strategy, its goals, and its business. Then he gave everyone a short programming puzzle. For the rest of the hour-long meeting, Partovi asked questions. He remembers that Myers did well on the programming test, and after talking to him for thirty to forty minutes he became convinced that Myers had, as he puts it, "the right stuff." Partovi spent even less time with Myers than I did. He didn't talk to Myers's family, or see him ecstatic or angry or depressed, either. He knew that Myers had spent last summer as an intern at Microsoft and was about to graduate from an Ivy League school. But virtually everyone recruited by a place like Tellme has graduated from an élite university, and the Microsoft summer-internship program has more than six hundred people in it. Partovi didn't even know why he liked Myers so much. He just did. "It was very much a gut call," he says.
This wasn't so very different from the experience Nolan Myers had with Steve Ballmer, the C.E.O. of Microsoft. Earlier this year, Myers attended a party for former Microsoft interns called Gradbash. Ballmer gave a speech there, and at the end of his remarks Myers raised his hand. "He was talking a lot about aligning the company in certain directions," Myers told me, "and I asked him about how that influences his ability to make bets on other directions. Are they still going to make small bets?" Afterward, a Microsoft recruiter came up to Myers and said, "Steve wants your E-mail address." Myers gave it to him, and soon he and Ballmer were E-mailing. Ballmer, it seems, badly wanted Myers to come to Microsoft. "He did research on me," Myers says. "He knew which group I was interviewing with, and knew a lot about me personally. He sent me an E-mail saying that he'd love to have me come to Microsoft, and if I had any questions I should contact him. So I sent him a response, saying thank you. After I visited Tellme, I sent him an E-mail saying I was interested in Tellme, here were the reasons, that I wasn't sure yet, and if he had anything to say I said I'd love to talk to him. I gave him my number. So he called, and after playing phone tag we talked—about career trajectory, how Microsoft would influence my career, what he thought of Tellme. I was extremely impressed with him, and he seemed very genuinely interested in me."
What convinced Ballmer he wanted Myers? A glimpse! He caught a little slice of Nolan Myers in action and—just like that—the C.E.O. of a four-hundred-billion-dollar company was calling a college senior in his dorm room. Ballmer somehow knew he liked Myers, the same way Hadi Partovi knew, and the same way I knew after our little chat at Au Bon Pain. But what did we know? What could we know? By any reasonable measure, surely none of us knew Nolan Myers at all.
It is a truism of the new economy that the ultimate success of any enterprise lies with the quality of the people it hires. At many technology companies, employees are asked to all but live at the office, in conditions of intimacy that would have been unthinkable a generation ago. The artifacts of the prototypical Silicon Valley office—the videogames, the espresso bar, the bunk beds, the basketball hoops—are the elements of the rec room, not the workplace. And in the rec room you want to play only with your friends. But how do you find out who your friends are? Today, recruiters canvass the country for résumés. They analyze employment histories and their competitors' staff listings. They call references, and then do what I did with Nolan Myers: sit down with a perfect stranger for an hour and a half and attempt to draw conclusions about that stranger's intelligence and personality. The job interview has become one of the central conventions of the modern economy. But what, exactly, can you know about a stranger after sitting down and talking with him for an hour?
2.
Some years ago, an experimental psychologist at Harvard University, Nalini Ambady, together with Robert Rosenthal, set out to examine the nonverbal aspects of good teaching. As the basis of her research, she used videotapes of teaching fellows which had been made during a training program at Harvard. Her plan was to have outside observers look at the tapes with the sound off and rate the effectiveness of the teachers by their expressions and physical cues. Ambady wanted to have at least a minute of film to work with. When she looked at the tapes, though, there was really only about ten seconds when the teachers were shown apart from the students. "I didn't want students in the frame, because obviously it would bias the ratings," Ambady says. "So I went to my adviser, and I said, 'This isn't going to work.'"
But it did. The observers, presented with a ten-second silent video clip, had no difficulty rating the teachers on a fifteen-item checklist of personality traits. In fact, when Ambady cut the clips back to five seconds, the ratings were the same. They were even the same when she showed her raters just two seconds of videotape. That sounds unbelievable unless you actually watch Ambady's teacher clips, as I did, and realize that the eight seconds that distinguish the longest clips from the shortest are superfluous: anything beyond the first flash of insight is unnecessary. When we make a snap judgment, it is made in a snap. It's also, very clearly, a judgment: we get a feeling that we have no difficulty articulating.
Ambady's next step led to an even more remarkable conclusion. She compared those snap judgments of teacher effectiveness with evaluations made, after a full semester of classes, by students of the same teachers. The correlation between the two, she found, was astoundingly high. A person watching a two-second silent video clip of a teacher he has never met will reach conclusions about how good that teacher is that are very similar to those of a student who sits in the teacher's class for an entire semester.
Recently, a comparable experiment was conducted by Frank Bernieri, a psychologist at the University of Toledo. Bernieri, working with one of his graduate students, Neha Gada-Jain, selected two people to act as interviewers, and trained them for six weeks in the proper procedures and techniques of giving an effective job interview. The two then interviewed ninety-eight volunteers, of various ages and backgrounds. The interviews lasted between fifteen and twenty minutes, and afterward each interviewer filled out a six-page, five-part evaluation of the person he'd just talked to. Originally, the intention of the study was to find out whether applicants who had been coached in certain nonverbal behaviors designed to ingratiate themselves with their interviewers—like mimicking the interviewers' physical gestures or posture—would get better ratings than applicants who behaved normally. As it turns out, they didn't. But then another of Bernieri's students, an undergraduate named Tricia Prickett, decided that she wanted to use the interview videotapes and the evaluations that had been collected to test out the adage that "the handshake is everything."
"She took fifteen seconds of videotape showing the applicant as he or she knocks on the door, comes in, shakes the hand of the interviewer, sits down, and the interviewer welcomes the person," Bernieri explained. Then, like Ambady, Prickett got a series of strangers to rate the applicants based on the handshake clip, using the same criteria that the interviewers had used. Once more, against all expectations, the ratings were very similar to those of the interviewers. "On nine out of the eleven traits the applicants were being judged on, the observers significantly predicted the outcome of the interview," Bernieri says. "The strength of the correlations was extraordinary."
This research takes Ambady's conclusions one step further. In the Toledo experiment, the interviewers were trained in the art of interviewing. They weren't dashing off a teacher evaluation on their way out the door. They were filling out a formal, detailed questionnaire, of the sort designed to give the most thorough and unbiased account of an interview. And still their ratings weren't all that different from those of people off the street who saw just the greeting.
This is why Hadi Partovi, Steve Ballmer, and I all agreed on Nolan Myers. Apparently, human beings don't need to know someone in order to believe that they know someone. Nor does it make that much difference, apparently, that Partovi reached his conclusion after putting Myers through the wringer for an hour, I reached mine after ninety minutes of amiable conversation at Au Bon Pain, and Ballmer reached his after watching and listening as Myers asked a question.
Bernieri and Ambady believe that the power of first impressions suggests that human beings have a particular kind of prerational ability for making searching judgments about others. In Ambady's teacher experiments, when she asked her observers to perform a potentially distracting cognitive task—like memorizing a set of numbers—while watching the tapes, their judgments of teacher effectiveness were unchanged. But when she instructed her observers to think hard about their ratings before they made them, their accuracy suffered substantially. Thinking only gets in the way. "The brain structures that are involved here are very primitive," Ambady speculates. "All of these affective reactions are probably governed by the lower brain structures." What we are picking up in that first instant would seem to be something quite basic about a person's character, because what we conclude after two seconds is pretty much the same as what we conclude after twenty minutes or, indeed, an entire semester. "Maybe you can tell immediately whether someone is extroverted, or gauge the person's ability to communicate,"Bernieri says. "Maybe these clues or cues are immediately accessible and apparent." Bernieri and Ambady are talking about the existence of a powerful form of human intuition. In a way, that's comforting, because it suggests that we can meet a perfect stranger and immediately pick up on something important about him. It means that I shouldn't be concerned that I can't explain why I like Nolan Myers, because, if such judgments are made without thinking, then surely they defy explanation.
But there's a troubling suggestion here as well. I believe that Nolan Myers is an accomplished and likable person. But I have no idea from our brief encounter how honest he is, or whether he is self-centered, or whether he works best by himself or in a group, or any number of other fundamental traits. That people who simply see the handshake arrive at the same conclusions as people who conduct a full interview also implies, perhaps, that those initial impressions matter too much—that they color all the other impressions that we gather over time.
For example, I asked Myers if he felt nervous about the prospect of leaving school for the workplace, which seemed like a reasonable question, since I remember how anxious I was before my first job. Would the hours scare him? Oh no, he replied, he was already working between eighty and a hundred hours a week at school. "Are there things that you think you aren't good at, which make you worry?" I continued.
His reply was sharp: "Are there things that I'm not good at, or things that I can't learn? I think that's the real question. There are a lot of things I don't know anything about, but I feel comfortable that given the right environment and the right encouragement I can do well at." In my notes, next to that reply, I wrote "Great answer!" and I can remember at the time feeling the little thrill you experience as an interviewer when someone's behavior conforms with your expectations. Because I had decided, right off, that I liked him, what I heard in his answer was toughness and confidence. Had I decided early on that I didn't like Nolan Myers, I would have heard in that reply arrogance and bluster. The first impression becomes a self-fulfilling prophecy: we hear what we expect to hear. The interview is hopelessly biased in favor of the nice.
3.
When Ballmer and Partovi and I met Nolan Myers, we made a prediction. We looked at the way he behaved in our presence—at the way he talked and acted and seemed to think—and drew conclusions about how he would behave in other situations. I had decided, remember, that Myers was the kind of person you called the night before the big test in seventh grade. Was I right to make that kind of generalization?
This is a question that social psychologists have looked at closely. In the late nineteen-twenties, in a famous study, the psychologist Theodore Newcomb analyzed extroversion among adolescent boys at a summer camp. He found that how talkative a boy was in one setting—say, lunch—was highly predictive of how talkative that boy would be in the same setting in the future. A boy who was curious at lunch on Monday was likely to be curious at lunch on Tuesday. But his behavior in one setting told you almost nothing about how he would behave in a different setting: from how someone behaved at lunch, you couldn't predict how he would behave during, say, afternoon playtime. In a more recent study, of conscientiousness among students at Carleton College, the researchers Walter Mischel, Neil Lutsky, and Philip K. Peake showed that how neat a student's assignments were or how punctual he was told you almost nothing about how often he attended class or how neat his room or his personal appearance was. How we behave at any one time, evidently, has less to do with some immutable inner compass than with the particulars of our situation.
This conclusion, obviously, is at odds with our intuition. Most of the time, we assume that people display the same character traits in different situations. We habitually underestimate the large role that context plays in people's behavior. In the Newcomb summer-camp experiment, for example, the results showing how little consistency there was from one setting to another in talkativeness, curiosity, and gregariousness were tabulated from observations made and recorded by camp counsellors on the spot. But when, at the end of the summer, those same counsellors were asked to give their final impressions of the kids, they remembered the children's behavior as being highly consistent.
"The basis of the illusion is that we are somehow confident that we are getting what is there, that we are able to read off a person's disposition," Richard Nisbett, a psychologist at the University of Michigan, says. "When you have an interview with someone and have an hour with them, you don't conceptualize that as taking a sample of a person's behavior, let alone a possibly biased sample, which is what it is. What you think is that you are seeing a hologram, a small and fuzzy image but still the whole person."
Then Nisbett mentioned his frequent collaborator, Lee Ross, who teaches psychology at Stanford. "There was one term when he was teaching statistics and one term he was teaching a course with a lot of humanistic psychology. He gets his teacher evaluations. The first referred to him as cold, rigid, remote, finicky, and uptight. And the second described this wonderful warmhearted guy who was so deeply concerned with questions of community and getting students to grow. It was Jekyll and Hyde. In both cases, the students thought they were seeing the real Lee Ross."
Psychologists call this tendency—to fixate on supposedly stable character traits and overlook the influence of context—the Fundamental Attri-bution Error, and if you combine this error with what we know about snap judgments the interview becomes an even more problematic encounter. Not only had I let my first impressions color the informationI gathered about Myers, but I had also assumed that the way he behaved with me in an interview setting was indicative of the way he would always behave. It isn't that the interview is useless; what I learned about Myers—that he and I get along well—is something I could never have got from a rĂ©sumĂ© or by talking to his references. It's just that our conversation turns out to have been less useful, and potentially more misleading, than I had supposed. That most basic of human rituals—the conversation with a stranger—turns out to be a minefield.
4.
Not long after I met with Nolan Myers, I talked with a human- resources consultant from Pasadena named Justin Menkes. Menkes's job is to figure out how to extract meaning from face-to-face encounters, and with that in mind he agreed to spend an hour interviewing me the way he thinks interviewing ought to be done. It felt, going in, not unlike a visit to a shrink, except that instead of having months, if not years, to work things out, Menkes was set upon stripping away my secrets in one session. Consider, he told me, a commonly asked question like "Describe a few situations in which your work was criticized. How did you handle the criticism?" The problem, Menkes said, is that it's much too obvious what the interviewee is supposed to say. "There was a situation where I was working on a project, and I didn't do as well as I could have," he said, adopting a mock-sincere singsong. "My boss gave me some constructive criticism. And I redid the project. It hurt. Yet we worked it out." The same is true of the question "What would your friends say about you?"—to which the correct answer (preferably preceded by a pause, as if to suggest that it had never dawned on you that someone would ask such a question) is "My guess is that they would call me a people person—either that or a hard worker."
Myers and I had talked about obvious questions, too. "What is your greatest weakness?" I asked him. He answered, "I tried to work on a project my freshman year, a children's festival. I was trying to start a festival as a benefit here in Boston. And I had a number of guys working with me. I started getting concerned with the scope of the project we were working on—how much responsibility we had, getting things done. I really put the brakes on, but in retrospect I really think we could have done it and done a great job."
Then Myers grinned and said, as an aside, "Do I truly think that is a fault? Honestly, no." And, of course, he's right. All I'd really asked him was whether he could describe a personal strength as if it were a weakness, and, in answering as he did, he had merely demonstrated his knowledge of the unwritten rules of the interview.
But, Menkes said, what if those questions were rephrased so that the answers weren't obvious? For example: "At your weekly team meetings, your boss unexpectedly begins aggressively critiquing your performance on a current project. What do you do?"
I felt a twinge of anxiety. What would I do? I remembered a terrible boss I'd had years ago. "I'd probably be upset," I said. "But I doubt I'd say anything. I'd probably just walk away." Menkes gave no indication whether he was concerned or pleased by that answer. He simply pointed out that another person might well have said something like "I'd go and see my boss later in private, and confront him about why he embarrassed me in front of my team." I was saying that I would probably handle criticism—even inappropriate criticism—from a superior with stoicism; in the second case, the applicant was saying he or she would adopt a more confrontational style. Or, at least, we were telling the interviewer that the workplace demands either stoicism or confrontation—and to Menkes these are revealing and pertinent pieces of information.
Menkes moved on to another area—handling stress. A typical question in this area is something like "Tell me about a time when you had to do several things at once. How did you handle the situation? How did you decide what to do first?" Menkes says this is also too easy. "I just had to be very organized," he began again in his mock-sincere singsong. "I had to multitask. I had to prioritize and delegate appropriately. I checked in frequently with my boss." Here's how Menkes rephrased it: "You're in a situation where you have two very important responsibilities that both have a deadline that is impossible to meet. You cannot accomplish both. How do you handle that situation?"
"Well," I said, "I would look at the two and decide what I was best at, and then go to my boss and say, 'It's better that I do one well than both poorly,' and we'd figure out who else could do the other task."
Menkes immediately seized on a telling detail in my answer. I was in-terested in what job I would do best. But isn't the key issue what job the company most needed to have done? With that comment, I had revealed some-thing valuable: that in a time of work-related crisis I start from a self-centered consideration. "Perhaps you are a bit of a solo practitioner," Menkes said diplomatically. "That's an essential bit of information."
Menkes deliberately wasn't drawing any broad conclusions. If we are not people who are shy or talkative or outspoken but people who are shy in some contexts, talkative in other situations, and outspoken in still other areas, then what it means to know someone is to catalogue and appreciate all those variations. Menkes was trying to begin that process of cataloguing. This interviewing technique is known as "structured interviewing," and in studies by industrial psychologists it has been shown to be the only kind of interviewing that has any success at all in predicting performance in the workplace. In the structured interviews, the format is fairly rigid. Each applicant is treated in precisely the same manner. The questions are scripted. The interviewers are carefully trained, and each applicant is rated on a series of predetermined scales.
What is interesting about the structured interview is how narrow its objectives are. When I interviewed Nolan Myers I was groping for some kind of global sense of who he was; Menkes seemed entirely uninterested in arriving at that same general sense of me—he seemed to realize how foolish that expectation was for an hour-long interview. The structured interview works precisely because it isn't really an interview; it isn't about getting to know someone, in a traditional sense. It's as much concerned with rejecting information as it is with collecting it.
Not surprisingly, interview specialists have found it extraordinarily difficult to persuade most employers to adopt the structured interview. It just doesn't feel right. For most of us, hiring someone is essentially a romantic process, in which the job interview functions as a desexualized version of a date. We are looking for someone with whom we have a certain chemistry, even if the coupling that results ends in tears and the pursuer and the pursued turn out to have nothing in common. We want the unlimited promise of a love affair. The structured interview, by contrast, seems to offer only the dry logic and practicality of an arranged marriage.
5.
Nolan Myers agonized over which job to take. He spent half an hour on the phone with Steve Ballmer, and Ballmer was very persuasive. "He gave me very, very good advice," Myers says of his conversations with the Microsoft C.E.O. "He felt that I should go to the place that excited me the most and that I thought would be best for my career. He offered to be my mentor." Myers says he talked to his parents every day about what to do. In February, he flew out to California and spent a Saturday going from one Tellme executive to another, asking and answering questions. "Basically, I had three things I was looking for. One was long-term goals for the company. Where did they see themselves in five years? Second, what position would I be playing in the company?" He stopped and burst out laughing. "And I forget what the third one is." In March, Myers committed to Tellme.
Will Nolan Myers succeed at Tellme? I think so, although I honestly have no idea. It's a harder question to answer now than it would have been thirty or forty years ago. If this were 1965, Nolan Myers would have gone to work at I.B.M. and worn a blue suit and sat in a small office and kept his head down, and the particulars of his personality would not have mattered so much. It was not so important that I.B.M. understood who you were before it hired you, because you understood what I.B.M. was. If you walked through the door at Armonk or at a branch office in Illinois, you knew what you had to be and how you were supposed to act. But to walk through the soaring, open offices of Tellme, with the bunk beds over the desks, is to be struck by how much more demanding the culture of Silicon Valley is. Nolan Myers will not be provided with a social script, that blue suit and organization chart. Tellme, like any technology startup these days, wants its employees to be part of a fluid team, to be flexible and innovative, to work with shifting groups in the absence of hierarchy and bureaucracy, and in that environment, where the workplace doubles as the rec room, the particulars of your personality matter a great deal.
This is part of the new economy's appeal, because Tellme's soaring warehouse is a more productive and enjoyable place to work than the little office boxes of the old I.B.M. But the danger here is that we will be led astray in judging these newly important particulars of character. If we let personability—some indefinable, prerational intuition, magnified by the Fundamental Attribution Error—bias the hiring process today, then all we will have done is replace the old-boy network, where you hired your nephew, with the new-boy network, where you hire whoever impressed you most when you shook his hand. Social progress, unless we're careful, can merely be the means by which we replace the obviously arbitrary with the not so obviously arbitrary.
Myers has spent much of the past year helping to teach Introduction to Computer Science. He realized, he says, that one of the reasons that students were taking the course was that they wanted to get jobs in the software industry. "I decided that, having gone through all this interviewing, I had developed some expertise, and I would like to share that. There is a real skill and art in presenting yourself to potential employers. And so what we did in this class was talk about the kinds of things that employers are looking for—what are they looking for in terms of personality. One of the most important things is that you have to come across as being confident in what you are doing and in who you are. How do you do that? Speak clearly and smile." As he said that, Nolan Myers smiled. "For a lot of people, that's a very hard skill to learn. But for some reason I seem to understand it intuitively."
The New-Boy Network
May 29, 2000
DEPT. OF HUMAN RESOURCES
What do job interviews really tell us?
1.
Nolan Myers grew up in Houston, the elder of two boys in a middle- class family. He went to Houston's High School for the Performing and Visual Arts and then Harvard, where he intended to major in History and Science. After discovering the joys of writing code, though, he switched to computer science. "Programming is one of those things you get involved in, and you just can't stop until you finish," Myers says. "You get involved in it, and all of a sudden you look at your watch and it's four in the morning! I love the elegance of it." Myers is short and slightly stocky and has pale-blue eyes. He smiles easily, and when he speaks he moves his hands and torso for emphasis. He plays in a klezmer band called the Charvard Chai Notes. He talks to his parents a lot. He gets B's and B-pluses.
This spring, in the last stretch of his senior year, Myers spent a lot of time interviewing for jobs with technology companies. He talked to a company named Trilogy, down in Texas, but he didn't think he would fit in. "One of Trilogy's subsidiaries put ads out in the paper saying that they were looking for the top tech students, and that they'd give them two hundred thousand dollars and a BMW," Myers said, shaking his head in disbelief. In another of his interviews, a recruiter asked him to solve a programming problem, and he made a stupid mistake and the recruiter pushed the answer back across the table to him, saying that his "solution" accomplished nothing. As he remembers the moment, Myers blushes. "I was so nervous. I thought, Hmm, that sucks!" The way he says that, though, makes it hard to believe that he really was nervous, or maybe what Nolan Myers calls nervous the rest of us call a tiny flutter in the stomach. Myers doesn't seem like the sort to get flustered. He's the kind of person you would call the night before the big test in seventh grade, when nothing made sense and you had begun to panic.
I like Nolan Myers. He will, I am convinced, be very good at whatever career he chooses. I say those two things even though I have spent no more than ninety minutes in his presence. We met only once, on a sunny afternoon in April at the Au Bon Pain in Harvard Square. He was wearing sneakers and khakis and a polo shirt, in a dark-green pattern. He had a big backpack, which he plopped on the floor beneath the table. I bought him an orange juice. He fished around in his wallet and came up with a dollar to try and repay me, which I refused. We sat by the window. Previously, we had talked for perhaps three minutes on the phone, setting up the interview. Then I E-mailed him, asking him how I would recognize him at Au Bon Pain. He sent me the following message, with what I'm convinced—again, on the basis of almost no evidence—is typical Myers panache: "22ish, five foot seven, straight brown hair, very good-looking. :)." I have never talked to his father, his mother, or his little brother, or any of his professors. I have never seen him ecstatic or angry or depressed. I know nothing of his personal habits, his tastes, or his quirks. I cannot even tell you why I feel the way I do about him. He's good-looking and smart and articulate and funny, but not so good-looking and smart and articulate and funny that there is some obvious explanation for the conclusions I've drawn about him. I just like him, and I'm impressed by him, and if I were an employer looking for bright young college graduates, I'd hire him in a heartbeat.
I heard about Nolan Myers from Hadi Partovi, an executive with Tellme, a highly touted Silicon Valley startup offering Internet access through the telephone. If you were a computer-science major at M.I.T., Harvard, Stanford, Caltech, or the University of Waterloo this spring, looking for a job in software, Tellme was probably at the top of your list. Partovi and I talked in the conference room at Tellme's offices, just off the soaring, open floor where all the firm's programmers and marketers and executives sit, some of them with bunk beds built over their desks. (Tellme recently moved into an old printing plant—a low-slung office building with a huge warehouse attached—and, in accordance with new-economy logic, promptly turned the old offices into a warehouse and the old warehouse into offices.) Partovi is a handsome man of twenty-seven, with olive skin and short curly black hair, and throughout our entire interview he sat with his chair tilted precariously at a forty-five-degree angle. At the end of a long riff about how hard it is to find high-quality people, he blurted out one name: Nolan Myers. Then, from memory, he rattled off Myers's telephone number. He very much wanted Myers to come to Tellme.
Partovi had met Myers in January, during a recruiting trip to Harvard. "It was a heinous day," Partovi remembers. "I started at seven and went until nine. I'd walk one person out and walk the other in." The first fifteen minutes of every interview he spent talking about Tellme—its strategy, its goals, and its business. Then he gave everyone a short programming puzzle. For the rest of the hour-long meeting, Partovi asked questions. He remembers that Myers did well on the programming test, and after talking to him for thirty to forty minutes he became convinced that Myers had, as he puts it, "the right stuff." Partovi spent even less time with Myers than I did. He didn't talk to Myers's family, or see him ecstatic or angry or depressed, either. He knew that Myers had spent last summer as an intern at Microsoft and was about to graduate from an Ivy League school. But virtually everyone recruited by a place like Tellme has graduated from an Ă©lite university, and the Microsoft summer-internship program has more than six hundred people in it. Partovi didn't even know why he liked Myers so much. He just did. "It was very much a gut call," he says.
This wasn't so very different from the experience Nolan Myers had with Steve Ballmer, the C.E.O. of Microsoft. Earlier this year, Myers attended a party for former Microsoft interns called Gradbash. Ballmer gave a speech there, and at the end of his remarks Myers raised his hand. "He was talking a lot about aligning the company in certain directions," Myers told me, "and I asked him about how that influences his ability to make bets on other directions. Are they still going to make small bets?" Afterward, a Microsoft recruiter came up to Myers and said, "Steve wants your E-mail address." Myers gave it to him, and soon he and Ballmer were E-mailing. Ballmer, it seems, badly wanted Myers to come to Microsoft. "He did research on me," Myers says. "He knew which group I was interviewing with, and knew a lot about me personally. He sent me an E-mail saying that he'd love to have me come to Microsoft, and if I had any questions I should contact him. So I sent him a response, saying thank you. After I visited Tellme, I sent him an E-mail saying I was interested in Tellme, here were the reasons, that I wasn't sure yet, and if he had anything to say I said I'd love to talk to him. I gave him my number. So he called, and after playing phone tag we talked—about career trajectory, how Microsoft would influence my career, what he thought of Tellme. I was extremely impressed with him, and he seemed very genuinely interested in me."
What convinced Ballmer he wanted Myers? A glimpse! He caught a little slice of Nolan Myers in action and—just like that—the C.E.O. of a four-hundred-billion-dollar company was calling a college senior in his dorm room. Ballmer somehow knew he liked Myers, the same way Hadi Partovi knew, and the same way I knew after our little chat at Au Bon Pain. But what did we know? What could we know? By any reasonable measure, surely none of us knew Nolan Myers at all.
It is a truism of the new economy that the ultimate success of any enterprise lies with the quality of the people it hires. At many technology companies, employees are asked to all but live at the office, in conditions of intimacy that would have been unthinkable a generation ago. The artifacts of the prototypical Silicon Valley office—the videogames, the espresso bar, the bunk beds, the basketball hoops—are the elements of the rec room, not the workplace. And in the rec room you want to play only with your friends. But how do you find out who your friends are? Today, recruiters canvass the country for résumés. They analyze employment histories and their competitors' staff listings. They call references, and then do what I did with Nolan Myers: sit down with a perfect stranger for an hour and a half and attempt to draw conclusions about that stranger's intelligence and personality. The job interview has become one of the central conventions of the modern economy. But what, exactly, can you know about a stranger after sitting down and talking with him for an hour?
2.
Some years ago, an experimental psychologist at Harvard University, Nalini Ambady, together with Robert Rosenthal, set out to examine the nonverbal aspects of good teaching. As the basis of her research, she used videotapes of teaching fellows which had been made during a training program at Harvard. Her plan was to have outside observers look at the tapes with the sound off and rate the effectiveness of the teachers by their expressions and physical cues. Ambady wanted to have at least a minute of film to work with. When she looked at the tapes, though, there was really only about ten seconds when the teachers were shown apart from the students. "I didn't want students in the frame, because obviously it would bias the ratings," Ambady says. "So I went to my adviser, and I said, 'This isn't going to work.'"
But it did. The observers, presented with a ten-second silent video clip, had no difficulty rating the teachers on a fifteen-item checklist of personality traits. In fact, when Ambady cut the clips back to five seconds, the ratings were the same. They were even the same when she showed her raters just two seconds of videotape. That sounds unbelievable unless you actually watch Ambady's teacher clips, as I did, and realize that the eight seconds that distinguish the longest clips from the shortest are superfluous: anything beyond the first flash of insight is unnecessary. When we make a snap judgment, it is made in a snap. It's also, very clearly, a judgment: we get a feeling that we have no difficulty articulating.
Ambady's next step led to an even more remarkable conclusion. She compared those snap judgments of teacher effectiveness with evaluations made, after a full semester of classes, by students of the same teachers. The correlation between the two, she found, was astoundingly high. A person watching a two-second silent video clip of a teacher he has never met will reach conclusions about how good that teacher is that are very similar to those of a student who sits in the teacher's class for an entire semester.
Recently, a comparable experiment was conducted by Frank Bernieri, a psychologist at the University of Toledo. Bernieri, working with one of his graduate students, Neha Gada-Jain, selected two people to act as interviewers, and trained them for six weeks in the proper procedures and techniques of giving an effective job interview. The two then interviewed ninety-eight volunteers, of various ages and backgrounds. The interviews lasted between fifteen and twenty minutes, and afterward each interviewer filled out a six-page, five-part evaluation of the person he'd just talked to. Originally, the intention of the study was to find out whether applicants who had been coached in certain nonverbal behaviors designed to ingratiate themselves with their interviewers—like mimicking the interviewers' physical gestures or posture—would get better ratings than applicants who behaved normally. As it turns out, they didn't. But then another of Bernieri's students, an undergraduate named Tricia Prickett, decided that she wanted to use the interview videotapes and the evaluations that had been collected to test out the adage that "the handshake is everything."
"She took fifteen seconds of videotape showing the applicant as he or she knocks on the door, comes in, shakes the hand of the interviewer, sits down, and the interviewer welcomes the person," Bernieri explained. Then, like Ambady, Prickett got a series of strangers to rate the applicants based on the handshake clip, using the same criteria that the interviewers had used. Once more, against all expectations, the ratings were very similar to those of the interviewers. "On nine out of the eleven traits the applicants were being judged on, the observers significantly predicted the outcome of the interview," Bernieri says. "The strength of the correlations was extraordinary."
This research takes Ambady's conclusions one step further. In the Toledo experiment, the interviewers were trained in the art of interviewing. They weren't dashing off a teacher evaluation on their way out the door. They were filling out a formal, detailed questionnaire, of the sort designed to give the most thorough and unbiased account of an interview. And still their ratings weren't all that different from those of people off the street who saw just the greeting.
This is why Hadi Partovi, Steve Ballmer, and I all agreed on Nolan Myers. Apparently, human beings don't need to know someone in order to believe that they know someone. Nor does it make that much difference, apparently, that Partovi reached his conclusion after putting Myers through the wringer for an hour, I reached mine after ninety minutes of amiable conversation at Au Bon Pain, and Ballmer reached his after watching and listening as Myers asked a question.
Bernieri and Ambady believe that the power of first impressions suggests that human beings have a particular kind of prerational ability for making searching judgments about others. In Ambady's teacher experiments, when she asked her observers to perform a potentially distracting cognitive task—like memorizing a set of numbers—while watching the tapes, their judgments of teacher effectiveness were unchanged. But when she instructed her observers to think hard about their ratings before they made them, their accuracy suffered substantially. Thinking only gets in the way. "The brain structures that are involved here are very primitive," Ambady speculates. "All of these affective reactions are probably governed by the lower brain structures." What we are picking up in that first instant would seem to be something quite basic about a person's character, because what we conclude after two seconds is pretty much the same as what we conclude after twenty minutes or, indeed, an entire semester. "Maybe you can tell immediately whether someone is extroverted, or gauge the person's ability to communicate," Bernieri says. "Maybe these clues or cues are immediately accessible and apparent." Bernieri and Ambady are talking about the existence of a powerful form of human intuition. In a way, that's comforting, because it suggests that we can meet a perfect stranger and immediately pick up on something important about him. It means that I shouldn't be concerned that I can't explain why I like Nolan Myers, because, if such judgments are made without thinking, then surely they defy explanation.
But there's a troubling suggestion here as well. I believe that Nolan Myers is an accomplished and likable person. But I have no idea from our brief encounter how honest he is, or whether he is self-centered, or whether he works best by himself or in a group, or any number of other fundamental traits. That people who simply see the handshake arrive at the same conclusions as people who conduct a full interview also implies, perhaps, that those initial impressions matter too much—that they color all the other impressions that we gather over time.
For example, I asked Myers if he felt nervous about the prospect of leaving school for the workplace, which seemed like a reasonable question, since I remember how anxious I was before my first job. Would the hours scare him? Oh no, he replied, he was already working between eighty and a hundred hours a week at school. "Are there things that you think you aren't good at, which make you worry?" I continued.
His reply was sharp: "Are there things that I'm not good at, or things that I can't learn? I think that's the real question. There are a lot of things I don't know anything about, but I feel comfortable that given the right environment and the right encouragement I can do well at." In my notes, next to that reply, I wrote "Great answer!" and I can remember at the time feeling the little thrill you experience as an interviewer when someone's behavior conforms with your expectations. Because I had decided, right off, that I liked him, what I heard in his answer was toughness and confidence. Had I decided early on that I didn't like Nolan Myers, I would have heard in that reply arrogance and bluster. The first impression becomes a self-fulfilling prophecy: we hear what we expect to hear. The interview is hopelessly biased in favor of the nice.
3.
When Ballmer and Partovi and I met Nolan Myers, we made a prediction. We looked at the way he behaved in our presence—at the way he talked and acted and seemed to think—and drew conclusions about how he would behave in other situations. I had decided, remember, that Myers was the kind of person you called the night before the big test in seventh grade. Was I right to make that kind of generalization?
This is a question that social psychologists have looked at closely. In the late nineteen-twenties, in a famous study, the psychologist Theodore Newcomb analyzed extroversion among adolescent boys at a summer camp. He found that how talkative a boy was in one setting—say, lunch—was highly predictive of how talkative that boy would be in the same setting in the future. A boy who was curious at lunch on Monday was likely to be curious at lunch on Tuesday. But his behavior in one setting told you almost nothing about how he would behave in a different setting: from how someone behaved at lunch, you couldn't predict how he would behave during, say, afternoon playtime. In a more recent study, of conscientiousness among students at Carleton College, the researchers Walter Mischel, Neil Lutsky, and Philip K. Peake showed that how neat a student's assignments were or how punctual he was told you almost nothing about how often he attended class or how neat his room or his personal appearance was. How we behave at any one time, evidently, has less to do with some immutable inner compass than with the particulars of our situation.
This conclusion, obviously, is at odds with our intuition. Most of the time, we assume that people display the same character traits in different situations. We habitually underestimate the large role that context plays in people's behavior. In the Newcomb summer-camp experiment, for example, the results showing how little consistency there was from one setting to another in talkativeness, curiosity, and gregariousness were tabulated from observations made and recorded by camp counsellors on the spot. But when, at the end of the summer, those same counsellors were asked to give their final impressions of the kids, they remembered the children's behavior as being highly consistent.
"The basis of the illusion is that we are somehow confident that we are getting what is there, that we are able to read off a person's disposition," Richard Nisbett, a psychologist at the University of Michigan, says. "When you have an interview with someone and have an hour with them, you don't conceptualize that as taking a sample of a person's behavior, let alone a possibly biased sample, which is what it is. What you think is that you are seeing a hologram, a small and fuzzy image but still the whole person."
Then Nisbett mentioned his frequent collaborator, Lee Ross, who teaches psychology at Stanford. "There was one term when he was teaching statistics and one term he was teaching a course with a lot of humanistic psychology. He gets his teacher evaluations. The first referred to him as cold, rigid, remote, finicky, and uptight. And the second described this wonderful warmhearted guy who was so deeply concerned with questions of community and getting students to grow. It was Jekyll and Hyde. In both cases, the students thought they were seeing the real Lee Ross."
Psychologists call this tendency—to fixate on supposedly stable character traits and overlook the influence of context—the Fundamental Attribution Error, and if you combine this error with what we know about snap judgments the interview becomes an even more problematic encounter. Not only had I let my first impressions color the information I gathered about Myers, but I had also assumed that the way he behaved with me in an interview setting was indicative of the way he would always behave. It isn't that the interview is useless; what I learned about Myers—that he and I get along well—is something I could never have got from a résumé or by talking to his references. It's just that our conversation turns out to have been less useful, and potentially more misleading, than I had supposed. That most basic of human rituals—the conversation with a stranger—turns out to be a minefield.
4.
Not long after I met with Nolan Myers, I talked with a human-resources consultant from Pasadena named Justin Menkes. Menkes's job is to figure out how to extract meaning from face-to-face encounters, and with that in mind he agreed to spend an hour interviewing me the way he thinks interviewing ought to be done. It felt, going in, not unlike a visit to a shrink, except that instead of having months, if not years, to work things out, Menkes was set upon stripping away my secrets in one session. Consider, he told me, a commonly asked question like "Describe a few situations in which your work was criticized. How did you handle the criticism?" The problem, Menkes said, is that it's much too obvious what the interviewee is supposed to say. "There was a situation where I was working on a project, and I didn't do as well as I could have," he said, adopting a mock-sincere singsong. "My boss gave me some constructive criticism. And I redid the project. It hurt. Yet we worked it out." The same is true of the question "What would your friends say about you?"—to which the correct answer (preferably preceded by a pause, as if to suggest that it had never dawned on you that someone would ask such a question) is "My guess is that they would call me a people person—either that or a hard worker."
Myers and I had talked about obvious questions, too. "What is your greatest weakness?" I asked him. He answered, "I tried to work on a project my freshman year, a children's festival. I was trying to start a festival as a benefit here in Boston. And I had a number of guys working with me. I started getting concerned with the scope of the project we were working on—how much responsibility we had, getting things done. I really put the brakes on, but in retrospect I really think we could have done it and done a great job."
Then Myers grinned and said, as an aside, "Do I truly think that is a fault? Honestly, no." And, of course, he's right. All I'd really asked him was whether he could describe a personal strength as if it were a weakness, and, in answering as he did, he had merely demonstrated his knowledge of the unwritten rules of the interview.
But, Menkes said, what if those questions were rephrased so that the answers weren't obvious? For example: "At your weekly team meetings, your boss unexpectedly begins aggressively critiquing your performance on a current project. What do you do?"
I felt a twinge of anxiety. What would I do? I remembered a terrible boss I'd had years ago. "I'd probably be upset," I said. "But I doubt I'd say anything. I'd probably just walk away." Menkes gave no indication whether he was concerned or pleased by that answer. He simply pointed out that another person might well have said something like "I'd go and see my boss later in private, and confront him about why he embarrassed me in front of my team." I was saying that I would probably handle criticism—even inappropriate criticism—from a superior with stoicism; in the second case, the applicant was saying he or she would adopt a more confrontational style. Or, at least, we were telling the interviewer that the workplace demands either stoicism or confrontation—and to Menkes these are revealing and pertinent pieces of information.
Menkes moved on to another area—handling stress. A typical question in this area is something like "Tell me about a time when you had to do several things at once. How did you handle the situation? How did you decide what to do first?" Menkes says this is also too easy. "I just had to be very organized," he began again in his mock-sincere singsong. "I had to multitask. I had to prioritize and delegate appropriately. I checked in frequently with my boss." Here's how Menkes rephrased it: "You're in a situation where you have two very important responsibilities that both have a deadline that is impossible to meet. You cannot accomplish both. How do you handle that situation?"
"Well," I said, "I would look at the two and decide what I was best at, and then go to my boss and say, 'It's better that I do one well than both poorly,' and we'd figure out who else could do the other task."
Menkes immediately seized on a telling detail in my answer. I was interested in what job I would do best. But isn't the key issue what job the company most needed to have done? With that comment, I had revealed something valuable: that in a time of work-related crisis I start from a self-centered consideration. "Perhaps you are a bit of a solo practitioner," Menkes said diplomatically. "That's an essential bit of information."
Menkes deliberately wasn't drawing any broad conclusions. If we are not people who are shy or talkative or outspoken but people who are shy in some contexts, talkative in other situations, and outspoken in still other areas, then what it means to know someone is to catalogue and appreciate all those variations. Menkes was trying to begin that process of cataloguing. This interviewing technique is known as "structured interviewing," and in studies by industrial psychologists it has been shown to be the only kind of interviewing that has any success at all in predicting performance in the workplace. In the structured interviews, the format is fairly rigid. Each applicant is treated in precisely the same manner. The questions are scripted. The interviewers are carefully trained, and each applicant is rated on a series of predetermined scales.
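For readers who want the mechanics spelled out, here is a minimal sketch, in Python, of the bookkeeping a structured interview implies: every applicant answers the same scripted questions, and every answer is rated on the same predetermined scale. The question wording, the 1-to-5 anchors, and the scoring function are hypothetical illustrations, not Menkes's actual instrument or any published protocol.

# A minimal, purely illustrative sketch of structured-interview scoring:
# a fixed script, a predetermined rating scale, identical treatment for
# every applicant. Questions and scale anchors here are hypothetical.

SCRIPT = [
    "Two critical deadlines conflict and you cannot meet both. What do you do?",
    "Your boss unexpectedly critiques your work in front of the team. What do you do?",
]

SCALE = {1: "poor", 2: "fair", 3: "good", 4: "strong", 5: "exceptional"}

def score_applicant(ratings):
    """Average the per-question ratings; every applicant is scored the same way."""
    if len(ratings) != len(SCRIPT):
        raise ValueError("one rating is required for each scripted question")
    if any(r not in SCALE for r in ratings):
        raise ValueError("ratings must use the predetermined 1-5 scale")
    return sum(ratings) / len(ratings)

# Two applicants, same script, same scale, directly comparable scores.
print(score_applicant([3, 4]))   # 3.5
print(score_applicant([5, 2]))   # 3.5

The point of the rigidity is comparability: because the script and the scale never change, one applicant's numbers can be set directly beside another's, which is exactly what an impressionistic conversation cannot offer.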
What is interesting about the structured interview is how narrow its objectives are. When I interviewed Nolan Myers I was groping for some kind of global sense of who he was; Menkes seemed entirely uninterested in arriving at that same general sense of me—he seemed to realize how foolish that expectation was for an hour-long interview. The structured interview works precisely because it isn't really an interview; it isn't about getting to know someone, in a traditional sense. It's as much concerned with rejecting information as it is with collecting it.
Not surprisingly, interview specialists have found it extraordinarily difficult to persuade most employers to adopt the structured interview. It just doesn't feel right. For most of us, hiring someone is essentially a romantic process, in which the job interview functions as a desexualized version of a date. We are looking for someone with whom we have a certain chemistry, even if the coupling that results ends in tears and the pursuer and the pursued turn out to have nothing in common. We want the unlimited promise of a love affair. The structured interview, by contrast, seems to offer only the dry logic and practicality of an arranged marriage.
5.
Nolan Myers agonized over which job to take. He spent half an hour on the phone with Steve Ballmer, and Ballmer was very persuasive. "He gave me very, very good advice," Myers says of his conversations with the Microsoft C.E.O. "He felt that I should go to the place that excited me the most and that I thought would be best for my career. He offered to be my mentor." Myers says he talked to his parents every day about what to do. In February, he flew out to California and spent a Saturday going from one Tellme executive to another, asking and answering questions. "Basically, I had three things I was looking for. One was long-term goals for the company. Where did they see themselves in five years? Second, what position would I be playing in the company?" He stopped and burst out laughing. "And I forget what the third one is." In March, Myers committed to Tellme.
Will Nolan Myers succeed at Tellme? I think so, although I honestly have no idea. It's a harder question to answer now than it would have been thirty or forty years ago. If this were 1965, Nolan Myers would have gone to work at I.B.M. and worn a blue suit and sat in a small office and kept his head down, and the particulars of his personality would not have mattered so much. It was not so important that I.B.M. understood who you were before it hired you, because you understood what I.B.M. was. If you walked through the door at Armonk or at a branch office in Illinois, you knew what you had to be and how you were supposed to act. But to walk through the soaring, open offices of Tellme, with the bunk beds over the desks, is to be struck by how much more demanding the culture of Silicon Valley is. Nolan Myers will not be provided with a social script, that blue suit and organization chart. Tellme, like any technology startup these days, wants its employees to be part of a fluid team, to be flexible and innovative, to work with shifting groups in the absence of hierarchy and bureaucracy, and in that environment, where the workplace doubles as the rec room, the particulars of your personality matter a great deal.
This is part of the new economy's appeal, because Tellme's soaring warehouse is a more productive and enjoyable place to work than the little office boxes of the old I.B.M. But the danger here is that we will be led astray in judging these newly important particulars of character. If we let personability—some indefinable, prerational intuition, magnified by the Fundamental Attribution Error—bias the hiring process today, then all we will have done is replace the old-boy network, where you hired your nephew, with the new-boy network, where you hire whoever impressed you most when you shook his hand. Social progress, unless we're careful, can merely be the means by which we replace the obviously arbitrary with the not so obviously arbitrary.
Myers has spent much of the past year helping to teach Introduction to Computer Science. He realized, he says, that one of the reasons that students were taking the course was that they wanted to get jobs in the software industry. "I decided that, having gone through all this interviewing, I had developed some expertise, and I would like to share that. There is a real skill and art in presenting yourself to potential employers. And so what we did in this class was talk about the kinds of things that employers are looking for—what are they looking for in terms of personality. One of the most important things is that you have to come across as being confident in what you are doing and in who you are. How do you do that? Speak clearly and smile." As he said that, Nolan Myers smiled. "For a lot of people, that's a very hard skill to learn. But for some reason I seem to understand it intuitively."
Cheap and Easy
July 10, 2000
COMMENT
Every now and again in politics, there is a moment that captures the temper of the times, and our moment may have come this budget season in Washington. The Centers for Disease Control asked Congress if, for an extra fifteen million dollars in C.D.C. funding, it would like to wipe out syphilis from the United States by 2005. And Congress said no.
The request was not a political ploy to get a bigger budget. Syphilis is an epidemic that, for reasons no one quite understands, runs in cycles, and, after peaking in 1990, the disease is now at its lowest level in United States history. It has retreated to a handful of areas across the country: just twenty-five counties account for half of all cases. In other words, syphilis is very close to that critical point faced by many epidemics, when even the slightest push could tip them into oblivion. That's why the C.D.C. has asked for the extra fifteen million dollars--to supply that final push.
This was all patiently explained to Congress last year as the epidemic first neared its lowest ebb. The C.D.C. proposed the most prosaic and straightforward of public-health efforts--an aggressive regimen of free diagnosis and treatment. The drug of choice? Penicillin, the same antibiotic that has been so successful in fighting syphilis for the past half century. Congress wasn't interested. This year, the C.D.C. made its case again, and again the public-health budgets that emerged from the House and the Senate left the agency well short of the necessary funding. Next year, unfortunately, the moment when syphilis can be easily eliminated will have passed. The disease will have begun its cyclical return, moving out of the familiar, well-defined neighborhoods where it is now sequestered, and presenting a much more formidable target for public-health officials. "If you miss the timing, there is a point when it is no longer feasible to move to elimination," says Judy Wasserheit, who is the head of the C.D.C.'s syphilis-prevention effort. "We're already pushing the limits of that time frame."
Exactly why, in a period of fiscal plenty, Congress cannot find the money for an anti-syphilis campaign is a bit puzzling. The disease plays a major role in the transmission of H.I.V., increasing infection rates between two- and five-fold. It often irreparably harms children born to those who are infected. And it is extremely expensive. Even with the rates as low as they are now, syphilis costs the country two hundred and fourteen million dollars a year. Congress has the opportunity to make history by eliminating a disease that has plagued the West for centuries. Why isn't it taking it?
The truth is, this is the price we pay for the ways in which disease has become steadily politicized. The great insight of the AIDS movement--later picked up by groups concerned about breast cancer and prostate cancer--was that a community afflicted with a specific medical problem could take its case directly to Capitol Hill, bypassing the medical establishment entirely. This has dramatically increased the resources available for medical research. But it has also given Congress an excuse to treat public health as another form of interest-group politics, in which the most deserving constituencies are those which shout the loudest. In fact, when it comes to illness and disease the most deserving constituencies are often those who cannot shout at all. That syphilis is a sexually transmitted disease primarily affecting very poor African-Americans only makes things worse--sex, race, and poverty being words that the present Congress has difficulty pronouncing individually, let alone in combination.
The last time America came so tantalizingly close to the elimination of syphilis was during the mid-fifties, after the introduction of penicillin. "Are Venereal Diseases disappearing?" the American Journal of Syphilis asked in 1951; four years later, the journal itself had disappeared. Such was the certainty that the era of syphilis was ending that the big debate in the public-health field was ethical rather than medical--namely, how the removal of the threat of venereal disease would affect sexual behavior.
As Dr. John Stokes, one of the leading experts of his day on sexually transmitted diseases, wrote, "It is a reasonable question, whether by eliminating disease, without commensurate attention to the development of human idealism, self-control, and responsibility in the sexual life, we are not bringing mankind to its fall instead of fulfillment." Stokes assumed that syphilis would soon vanish, and that we ought to worry about the morality of those who could have got the disease but now wouldn't. As it turns out, he had it backward. Syphilis is still with us. And we ought to worry instead about the morality of those who could have eliminated the disease but chose not to.
The Art of Failure
August 21 & 28, 2000
PERFORMANCE STUDIES
Why some people choke and others panic
There was a moment, in the third and deciding set of the 1993 Wimbledon final, when Jana Novotna seemed invincible. She was leading 4-1 and serving at 40-30, meaning that she was one point from winning the game, and just five points from the most coveted championship in tennis. She had just hit a backhand to her opponent, Steffi Graf, that skimmed the net and landed so abruptly on the far side of the court that Graf could only watch, in flat-footed frustration. The stands at Center Court were packed. The Duke and Duchess of Kent were in their customary place in the royal box. Novotna was in white, poised and confident, her blond hair held back with a headband--and then something happened. She served the ball straight into the net. She stopped and steadied herself for the second serve--the toss, the arch of the back--but this time it was worse. Her swing seemed halfhearted, all arm and no legs and torso. Double fault. On the next point, she was slow to react to a high shot by Graf, and badly missed on a forehand volley. At game point, she hit an overhead straight into the net. Instead of 5-1, it was now 4-2. Graf to serve: an easy victory, 4-3. Novotna to serve. She wasn't tossing the ball high enough. Her head was down. Her movements had slowed markedly. She double-faulted once, twice, three times. Pulled wide by a Graf forehand, Novotna inexplicably hit a low, flat shot directly at Graf, instead of a high crosscourt forehand that would have given her time to get back into position: 4-4. Did she suddenly realize how terrifyingly close she was to victory? Did she remember that she had never won a major tournament before? Did she look across the net and see Steffi Graf--Steffi Graf!--the greatest player of her generation?
On the baseline, awaiting Graf's serve, Novotna was now visibly agitated, rocking back and forth, jumping up and down. She talked to herself under her breath. Her eyes darted around the court. Graf took the game at love; Novotna, moving as if in slow motion, did not win a single point: 5-4, Graf. On the sidelines, Novotna wiped her racquet and her face with a towel, and then each finger individually. It was her turn to serve. She missed a routine volley wide, shook her head, talked to herself. She missed her first serve, made the second, then, in the resulting rally, mis-hit a backhand so badly that it sailed off her racquet as if launched into flight. Novotna was unrecognizable, not an élite tennis player but a beginner again. She was crumbling under pressure, but exactly why was as baffling to her as it was to all those looking on. Isn't pressure supposed to bring out the best in us? We try harder. We concentrate harder. We get a boost of adrenaline. We care more about how well we perform. So what was happening to her?
At championship point, Novotna hit a low, cautious, and shallow lob to Graf. Graf answered with an unreturnable overhead smash, and, mercifully, it was over. Stunned, Novotna moved to the net. Graf kissed her twice. At the awards ceremony, the Duchess of Kent handed Novotna the runner-up's trophy, a small silver plate, and whispered something in her ear, and what Novotna had done finally caught up with her. There she was, sweaty and exhausted, looming over the delicate white-haired Duchess in her pearl necklace. The Duchess reached up and pulled her head down onto her shoulder, and Novotna started to sob.
Human beings sometimes falter under pressure. Pilots crash and divers drown. Under the glare of competition, basketball players cannot find the basket and golfers cannot find the pin. When that happens, we say variously that people have "panicked" or, to use the sports colloquialism, "choked." But what do those words mean? Both are pejoratives. To choke or panic is considered to be as bad as to quit. But are all forms of failure equal? And what do the forms in which we fail say about who we are and how we think? We live in an age obsessed with success, with documenting the myriad ways by which talented people overcome challenges and obstacles. There is as much to be learned, though, from documenting the myriad ways in which talented people sometimes fail.
"Choking" sounds like a vague and all-encompassing term, yet it describes a very specific kind of failure. For example, psychologists often use a primitive video game to test motor skills. They'll sit you in front of a computer with a screen that shows four boxes in a row, and a keyboard that has four corresponding buttons in a row. One at a time, x's start to appear in the boxes on the screen, and you are told that every time this happens you are to push the key corresponding to the box. According to Daniel Willingham, a psychologist at the University of Virginia, if you're told ahead of time about the pattern in which those x's will appear, your reaction time in hitting the right key will improve dramatically. You'll play the game very carefully for a few rounds, until you've learned the sequence, and then you'll get faster and faster. Willingham calls this "explicit learning." But suppose you're not told that the x's appear in a regular sequence, and even after playing the game for a while you're not aware that there is a pattern. You'll still get faster: you'll learn the sequence unconsciously. Willingham calls that "implicit learning"--learning that takes place outside of awareness. These two learning systems are quite separate, based in different parts of the brain. Willingham says that when you are first taught something--say, how to hit a backhand or an overhead forehand--you think it through in a very deliberate, mechanical manner. But as you get better the implicit system takes over: you start to hit a backhand fluidly, without thinking. The basal ganglia, where implicit learning partially resides, are concerned with force and timing, and when that system kicks in you begin to develop touch and accuracy, the ability to hit a drop shot or place a serve at a hundred miles per hour. "This is something that is going to happen gradually," Willingham says. "You hit several thousand forehands, after a while you may still be attending to it. But not very much. In the end, you don't really notice what your hand is doing at all."
Under conditions of stress, however, the explicit system sometimes takes over. That's what it means to choke. When Jana Novotna faltered at Wimbledon, it was because she began thinking about her shots again. She lost her fluidity, her touch. She double-faulted on her serves and mis-hit her overheads, the shots that demand the greatest sensitivity in force and timing. She seemed like a different person--playing with the slow, cautious deliberation of a beginner--because, in a sense, she was a beginner again: she was relying on a learning system that she hadn't used to hit serves and overhead forehands and volleys since she was first taught tennis, as a child. The same thing has happened to Chuck Knoblauch, the New York Yankees' second baseman, who inexplicably has had trouble throwing the ball to first base. Under the stress of playing in front of forty thousand fans at Yankee Stadium, Knoblauch finds himself reverting to explicit mode, throwing like a Little Leaguer again.
Panic is something else altogether. Consider the following account of a scuba-diving accident, recounted to me by Ephimia Morphew, a human-factors specialist at NASA: "It was an open-water certification dive, Monterey Bay, California, about ten years ago. I was nineteen. I'd been diving for two weeks. This was my first time in the open ocean without the instructor. Just my buddy and I. We had to go about forty feet down, to the bottom of the ocean, and do an exercise where we took our regulators out of our mouth, picked up a spare one that we had on our vest, and practiced breathing out of the spare. My buddy did hers. Then it was my turn. I removed my regulator. I lifted up my secondary regulator. I put it in my mouth, exhaled, to clear the lines, and then I inhaled, and, to my surprise, it was water. I inhaled water. Then the hose that connected that mouthpiece to my tank, my air source, came unlatched and air from the hose came exploding into my face.
"Right away, my hand reached out for my partner's air supply, as if I was going to rip it out. It was without thought. It was a physiological response. My eyes are seeing my hand do something irresponsible. I'm fighting with myself. Don't do it. Then I searched my mind for what I could do. And nothing came to mind. All I could remember was one thing: If you can't take care of yourself, let your buddy take care of you. I let my hand fall back to my side, and I just stood there."
This is a textbook example of panic. In that moment, Morphew stopped thinking. She forgot that she had another source of air, one that worked perfectly well and that, moments before, she had taken out of her mouth. She forgot that her partner had a working air supply as well, which could easily be shared, and she forgot that grabbing her partner's regulator would imperil both of them. All she had was her most basic instinct: get air. Stress wipes out short-term memory. People with lots of experience tend not to panic, because when the stress suppresses their short-term memory they still have some residue of experience to draw on. But what did a novice like Morphew have? I searched my mind for what I could do. And nothing came to mind.
Panic also causes what psychologists call perceptual narrowing. In one study, from the early seventies, a group of subjects were asked to perform a visual acuity task while undergoing what they thought was a sixty-foot dive in a pressure chamber. At the same time, they were asked to push a button whenever they saw a small light flash on and off in their peripheral vision. The subjects in the pressure chamber had much higher heart rates than the control group, indicating that they were under stress. That stress didn't affect their accuracy at the visual-acuity task, but they were only half as good as the control group at picking up the peripheral light. "You tend to focus or obsess on one thing," Morphew says. "There's a famous airplane example, where the landing light went off, and the pilots had no way of knowing if the landing gear was down. The pilots were so focussed on that light that no one noticed the autopilot had been disengaged, and they crashed the plane." Morphew reached for her buddy's air supply because it was the only air supply she could see.
Panic, in this sense, is the opposite of choking. Choking is about thinking too much. Panic is about thinking too little. Choking is about loss of instinct. Panic is reversion to instinct. They may look the same, but they are worlds apart.
Why does this distinction matter? In some instances, it doesn't much. If you lose a close tennis match, it's of little moment whether you choked or panicked; either way, you lost. But there are clearly cases when how failure happens is central to understanding why failure happens.
Take the plane crash in which John F. Kennedy, Jr., was killed last summer. The details of the flight are well known. On a Friday evening last July, Kennedy took off with his wife and sister-in-law for Martha's Vineyard. The night was hazy, and Kennedy flew along the Connecticut coastline, using the trail of lights below him as a guide. At Westerly, Rhode Island, he left the shoreline, heading straight out over Rhode Island Sound, and at that point, apparently disoriented by the darkness and haze, he began a series of curious maneuvers: He banked his plane to the right, farther out into the ocean, and then to the left. He climbed and descended. He sped up and slowed down. Just a few miles from his destination, Kennedy lost control of the plane, and it crashed into the ocean.
Kennedy's mistake, in technical terms, was that he failed to keep his wings level. That was critical, because when a plane banks to one side it begins to turn and its wings lose some of their vertical lift. Left unchecked, this process accelerates. The angle of the bank increases, the turn gets sharper and sharper, and the plane starts to dive toward the ground in an ever-narrowing corkscrew. Pilots call this the graveyard spiral. And why didn't Kennedy stop the dive? Because, in times of low visibility and high stress, keeping your wings level--indeed, even knowing whether you are in a graveyard spiral--turns out to be surprisingly difficult. Kennedy failed under pressure.
Had Kennedy been flying during the day or with a clear moon, he would have been fine. If you are the pilot, looking straight ahead from the cockpit, the angle of your wings will be obvious from the straight line of the horizon in front of you. But when it's dark outside the horizon disappears. There is no external measure of the plane's bank. On the ground, we know whether we are level even when it's dark, because of the motion-sensing mechanisms in the inner ear. In a spiral dive, though, the effect of the plane's G-force on the inner ear means that the pilot feels perfectly level even if his plane is not. Similarly, when you are in a jetliner that is banking at thirty degrees after takeoff, the book on your neighbor's lap does not slide into your lap, nor will a pen on the floor roll toward the "down" side of the plane. The physics of flying is such that an airplane in the midst of a turn always feels perfectly level to someone inside the cabin.
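A rough sketch of the geometry helps here. In a bank, the lift of the wings tilts with the plane: only the vertical component of lift, proportional to the cosine of the bank angle, still holds the plane up, while the apparent "down" inside the cabin stays pointed at the floor, which is why nothing slides and nothing feels wrong. The few lines of Python below are a minimal illustrative sketch of that relationship, not anything drawn from the N.T.S.B. analysis; the bank angles are arbitrary examples.

import math

# Minimal sketch: how much of the wings' lift still points "up" at a given bank
# angle, and how hard the pilot would have to pull to hold a turn level at that bank.
# The angles are arbitrary examples; nothing here comes from the accident data.
for bank_deg in (15, 30, 45, 60):
    bank = math.radians(bank_deg)
    vertical_share = math.cos(bank)        # fraction of lift still opposing gravity
    level_turn_g = 1 / math.cos(bank)      # G-load needed to keep the turn level
    print(f"{bank_deg:2d} deg bank: {vertical_share:.2f} of lift is vertical; "
          f"a level turn would take {level_turn_g:.2f} G")

In a spiral dive the pilot isn't pulling that extra load, which is why the cabin still feels like one ordinary G even as the lost vertical lift turns the bank into a dive.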
This is a difficult notion, and to understand it I went flying with William Langewiesche, the author of a superb book on flying, "Inside the Sky." We met at San Jose Airport, in the jet center where the Silicon Valley billionaires keep their private planes. Langewiesche is a rugged man in his forties, deeply tanned, and handsome in the way that pilots (at least since the movie "The Right Stuff") are supposed to be. We took off at dusk, heading out toward Monterey Bay, until we had left the lights of the coast behind and night had erased the horizon. Langewiesche let the plane bank gently to the left. He took his hands off the stick. The sky told me nothing now, so I concentrated on the instruments. The nose of the plane was dropping. The gyroscope told me that we were banking, first fifteen, then thirty, then forty-five degrees. "We're in a spiral dive," Langewiesche said calmly. Our airspeed was steadily accelerating, from a hundred and eighty to a hundred and ninety to two hundred knots. The needle on the altimeter was moving down. The plane was dropping like a stone, at three thousand feet per minute. I could hear, faintly, a slight increase in the hum of the engine, and the wind noise as we picked up speed. But if Langewiesche and I had been talking I would have caught none of that. Had the cabin been unpressurized, my ears might have popped, particularly as we went into the steep part of the dive. But beyond that? Nothing at all. In a spiral dive, the G-load--the force of inertia--is normal. As Langewiesche puts it, the plane likes to spiral-dive. The total time elapsed since we started diving was no more than six or seven seconds. Suddenly, Langewiesche straightened the wings and pulled back on the stick to get the nose of the plane up, breaking out of the dive. Only now did I feel the full force of the G-load, pushing me back in my seat. "You feel no G-load in a bank," Langewiesche said. "There's nothing more confusing for the uninitiated."
I asked Langewiesche how much longer we could have fallen. "Within five seconds, we would have exceeded the limits of the airplane," he replied, by which he meant that the force of trying to pull out of the dive would have broken the plane into pieces. I looked away from the instruments and asked Langewiesche to spiral-dive again, this time without telling me. I sat and waited. I was about to tell Langewiesche that he could start diving anytime, when, suddenly, I was thrown back in my chair. "We just lost a thousand feet," he said.
This inability to sense, experientially, what your plane is doing is what makes night flying so stressful. And this was the stress that Kennedy must have felt when he turned out across the water at Westerly, leaving the guiding lights of the Connecticut coastline behind him. A pilot who flew into Nantucket that night told the National Transportation Safety Board that when he descended over Martha's Vineyard he looked down and there was "nothing to see. There was no horizon and no light.... I thought the island might [have] suffered a power failure." Kennedy was now blind, in every sense, and he must have known the danger he was in. He had very little experience in flying strictly by instruments. Most of the time when he had flown up to the Vineyard the horizon or lights had still been visible. That strange, final sequence of maneuvers was Kennedy's frantic search for a clearing in the haze. He was trying to pick up the lights of Martha's Vineyard, to restore the lost horizon. Between the lines of the National Transportation Safety Board's report on the crash, you can almost feel his desperation:
About 2138 the target began a right turn in a southerly direction. About 30 seconds later, the target stopped its descent at 2200 feet and began a climb that lasted another 30 seconds. During this period of time, the target stopped the turn, and the airspeed decreased to about 153 KIAS. About 2139, the target leveled off at 2500 feet and flew in a southeasterly direction. About 50 seconds later, the target entered a left turn and climbed to 2600 feet. As the target continued in the left turn, it began a descent that reached a rate of about 900 fpm.
But was he choking or panicking? Here the distinction between those two states is critical. Had he choked, he would have reverted to the mode of explicit learning. His movements in the cockpit would have become markedly slower and less fluid. He would have gone back to the mechanical, self-conscious application of the lessons he had first received as a pilot--and that might have been a good thing. Kennedy needed to think, to concentrate on his instruments, to break away from the instinctive flying that served him when he had a visible horizon.
But instead, from all appearances, he panicked. At the moment when he needed to remember the lessons he had been taught about instrument flying, his mind--like Morphew's when she was underwater--must have gone blank. Instead of reviewing the instruments, he seems to have been focussed on one question: Where are the lights of Martha's Vineyard? His gyroscope and his other instruments may well have become as invisible as the peripheral lights in the underwater-panic experiments. He had fallen back on his instincts--on the way the plane felt--and in the dark, of course, instinct can tell you nothing. The N.T.S.B. report says that the last time the Piper's wings were level was seven seconds past 9:40, and the plane hit the water at about 9:41, so the critical period here was less than sixty seconds. At twenty-five seconds past the minute, the plane was tilted at an angle greater than forty-five degrees. Inside the cockpit it would have felt normal. At some point, Kennedy must have heard the rising wind outside, or the roar of the engine as it picked up speed. Again, relying on instinct, he might have pulled back on the stick, trying to raise the nose of the plane. But pulling back on the stick without first levelling the wings only makes the spiral tighter and the problem worse. It's also possible that Kennedy did nothing at all, and that he was frozen at the controls, still frantically searching for the lights of the Vineyard, when his plane hit the water. Sometimes pilots don't even try to make it out of a spiral dive. Langewiesche calls that "one G all the way down."
What happened to Kennedy that night illustrates a second major difference between panicking and choking. Panicking is conventional failure, of the sort we tacitly understand. Kennedy panicked because he didn't know enough about instrument flying. If he'd had another year in the air, he might not have panicked, and that fits with what we believe--that performance ought to improve with experience, and that pressure is an obstacle that the diligent can overcome. But choking makes little intuitive sense. Novotna's problem wasn't lack of diligence; she was as superbly conditioned and schooled as anyone on the tennis tour. And what did experience do for her? In 1995, in the third round of the French Open, Novotna choked even more spectacularly than she had against Graf, losing to Chanda Rubin after surrendering a 5-0 lead in the third set. There seems little doubt that part of the reason for her collapse against Rubin was her collapse against Graf--that the second failure built on the first, making it possible for her to be up 5-0 in the third set and yet entertain the thought I can still lose. If panicking is conventional failure, choking is paradoxical failure.
Claude Steele, a psychologist at Stanford University, and his colleagues have done a number of experiments in recent years looking at how certain groups perform under pressure, and their findings go to the heart of what is so strange about choking. Steele and Joshua Aronson found that when they gave a group of Stanford undergraduates a standardized test and told them that it was a measure of their intellectual ability, the white students did much better than their black counterparts. But when the same test was presented simply as an abstract laboratory tool, with no relevance to ability, the scores of blacks and whites were virtually identical. Steele and Aronson attribute this disparity to what they call "stereotype threat": when black students are put into a situation where they are directly confronted with a stereotype about their group--in this case, one having to do with intelligence--the resulting pressure causes their performance to suffer.
Steele and others have found stereotype threat at work in any situation where groups are depicted in negative ways. Give a group of qualified women a math test and tell them it will measure their quantitative ability and they'll do much worse than equally skilled men will; present the same test simply as a research tool and they'll do just as well as the men. Or consider a handful of experiments conducted by one of Steele's former graduate students, Julio Garcia, a professor at Tufts University. Garcia gathered together a group of white, athletic students and had a white instructor lead them through a series of physical tests: to jump as high as they could, to do a standing broad jump, and to see how many pushups they could do in twenty seconds. The instructor then asked them to do the tests a second time, and, as you'd expect, Garcia found that the students did a little better on each of the tasks the second time around. Then Garcia ran a second group of students through the tests, this time replacing the instructor between the first and second trials with an African-American. Now the white students ceased to improve on their vertical leaps. He did the experiment again, only this time he replaced the white instructor with a black instructor who was much taller and heavier than the previous black instructor. In this trial, the white students actually jumped less high than they had the first time around. Their performance on the pushups, though, was unchanged in each of the conditions. There is no stereotype, after all, that suggests that whites can't do as many pushups as blacks. The task that was affected was the vertical leap, because of what our culture says: white men can't jump.
It doesn't come as news, of course, that black students aren't as good at test-taking as white students, or that white students aren't as good at jumping as black students. The problem is that we've always assumed that this kind of failure under pressure is panic. What is it we tell underperforming athletes and students? The same thing we tell novice pilots or scuba divers: to work harder, to buckle down, to take the tests of their ability more seriously. But Steele says that when you look at the way black or female students perform under stereotype threat you don't see the wild guessing of a panicked test taker. "What you tend to see is carefulness and second-guessing," he explains. "When you go and interview them, you have the sense that when they are in the stereotype-threat condition they say to themselves, 'Look, I'm going to be careful here. I'm not going to mess things up.' Then, after having decided to take that strategy, they calm down and go through the test. But that's not the way to succeed on a standardized test. The more you do that, the more you will get away from the intuitions that help you, the quick processing. They think they did well, and they are trying to do well. But they are not." This is choking, not panicking. Garcia's athletes and Steele's students are like Novotna, not Kennedy. They failed because they were good at what they did: only those who care about how well they perform ever feel the pressure of stereotype threat. The usual prescription for failure--to work harder and take the test more seriously--would only make their problems worse.
That is a hard lesson to grasp, but harder still is the fact that choking requires us to concern ourselves less with the performer and more with the situation in which the performance occurs. Novotna herself could do nothing to prevent her collapse against Graf. The only thing that could have saved her is if--at that critical moment in the third set--the television cameras had been turned off, the Duke and Duchess had gone home, and the spectators had been told to wait outside. In sports, of course, you can't do that. Choking is a central part of the drama of athletic competition, because the spectators have to be there--and the ability to overcome the pressure of the spectators is part of what it means to be a champion. But the same ruthless inflexibility need not govern the rest of our lives. We have to learn that sometimes a poor performance reflects not the innate ability of the performer but the complexion of the audience; and that sometimes a poor test score is the sign not of a poor student but of a good one.
Through the first three rounds of the 1996 Masters golf tournament, Greg Norman held a seemingly insurmountable lead over his nearest rival, the Englishman Nick Faldo. He was the best player in the world. His nickname was the Shark. He didn't saunter down the fairways; he stalked the course, blond and broad-shouldered, his caddy behind him, struggling to keep up. But then came the ninth hole on the tournament's final day. Norman was paired with Faldo, and the two hit their first shots well. They were now facing the green. In front of the pin, there was a steep slope, so that any ball hit short would come rolling back down the hill into oblivion. Faldo shot first, and the ball landed safely long, well past the cup.
Norman was next. He stood over the ball. "The one thing you guard against here is short," the announcer said, stating the obvious. Norman swung and then froze, his club in midair, following the ball in flight. It was short. Norman watched, stone-faced, as the ball rolled thirty yards back down the hill, and with that error something inside of him broke.
At the tenth hole, he hooked the ball to the left, hit his third shot well past the cup, and missed a makable putt. At eleven, Norman had a three-and-a-half-foot putt for par--the kind he had been making all week. He shook out his hands and legs before grasping the club, trying to relax. He missed: his third straight bogey. At twelve, Norman hit the ball straight into the water. At thirteen, he hit it into a patch of pine needles. At sixteen, his movements were so mechanical and out of synch that, when he swung, his hips spun out ahead of his body and the ball sailed into another pond. At that, he took his club and made a frustrated scythelike motion through the grass, because what had been obvious for twenty minutes was now official: he had fumbled away the chance of a lifetime.
Faldo had begun the day six strokes behind Norman. By the time the two started their slow walk to the eighteenth hole, through the throng of spectators, Faldo had a four-stroke lead. But he took those final steps quietly, giving only the smallest of nods, keeping his head low. He understood what had happened on the greens and fairways that day. And he was bound by the particular etiquette of choking, the understanding that what he had earned was something less than a victory and what Norman had suffered was something less than a defeat.
When it was all over, Faldo wrapped his arms around Norman. "I don't know what to say--I just want to give you a hug," he whispered, and then he said the only thing you can say to a choker: "I feel horrible about what happened. I'm so sorry." With that, the two men began to cry.
The Pitchman
October 30, 2000
ANNALS OF ENTERPRISE
Ron Popeil and the conquest of the American kitchen.
The extraordinary story of the Ronco Showtime Rotisserie & BBQ begins with Nathan Morris, the son of the shoemaker and cantor Kidders Morris, who came over from the Old Country in the eighteen-eighties, and settled in Asbury Park, New Jersey. Nathan Morris was a pitchman. He worked the boardwalk and the five-and-dimes and county fairs up and down the Atlantic coast, selling kitchen gadgets made by Acme Metal, out of Newark. In the early forties, Nathan set up N. K. Morris Manufacturing--turning out the KwiKi-Pi and the Morris Metric Slicer--and perhaps because it was the Depression and job prospects were dim, or perhaps because Nathan Morris made such a compelling case for his new profession, one by one the members of his family followed him into the business. His sons Lester Morris and Arnold (the Knife) Morris became his pitchmen. He set up his brother-in-law Irving Rosenbloom, who was to make a fortune on Long Island in plastic goods, including a hand grater of such excellence that Nathan paid homage to it with his own Dutch Kitchen Shredder Grater. He partnered with his brother Al, whose own sons worked the boardwalk, alongside a gangly Irishman by the name of Ed McMahon. Then, one summer just before the war, Nathan took on as an apprentice his nephew Samuel Jacob Popeil. S.J., as he was known, was so inspired by his uncle Nathan that he went on to found Popeil Brothers, based in Chicago, and brought the world the Dial-O-Matic, the Chop-O-Matic, and the Veg-O-Matic. S. J. Popeil had two sons. The elder was Jerry, who died young. The younger is familiar to anyone who has ever watched an infomercial on late- night television. His name is Ron Popeil.
In the postwar years, many people made the kitchen their life's work. There were the Klinghoffers of New York (one of whom, Leon, died tragically in 1985, during the Achille Lauro incident, when he was pushed overboard in his wheelchair by Palestinian terrorists). They made the Roto-Broil 400, back in the fifties, an early rotisserie for the home, which was pitched by Lester Morris. There was Lewis Salton, who escaped the Nazis with an English stamp from his father's collection and parlayed it into an appliance factory in the Bronx. He brought the world the Salton Hotray--a sort of precursor to the microwave--and today Salton, Inc., sells the George Foreman Grill.
But no rival quite matched the Morris-Popeil clan. They were the first family of the American kitchen. They married beautiful women and made fortunes and stole ideas from one another and lay awake at night thinking of a way to chop an onion so that the only tears you shed were tears of joy. They believed that it was a mistake to separate product development from marketing, as most of their contemporaries did, because to them the two were indistinguishable: the object that sold best was the one that sold itself. They were spirited, brilliant men. And Ron Popeil was the most brilliant and spirited of them all. He was the family's Joseph, exiled to the wilderness by his father only to come back and make more money than the rest of the family combined. He was a pioneer in taking the secrets of the boardwalk pitchmen to the television screen. And, of all the kitchen gadgets in the Morris-Popeil pantheon, nothing has ever been quite so ingenious in its design, or so broad in its appeal, or so perfectly representative of the Morris-Popeil belief in the interrelation of the pitch and the object being pitched, as the Ronco Showtime Rotisserie & BBQ, the countertop oven that can be bought for four payments of $39.95 and may be, dollar for dollar, the finest kitchen appliance ever made.
A Rotisserie Is Born
Ron Popeil is a handsome man, thick through the chest and shoulders, with a leonine head and striking, over-size features. He is in his mid-sixties, and lives in Beverly Hills, halfway up Coldwater Canyon, in a sprawling bungalow with a stand of avocado trees and a vegetable garden out back. In his habits he is, by Beverly Hills standards, old school. He carries his own bags. He has been known to eat at Denny's. He wears T-shirts and sweatpants. As often as twice a day, he can be found buying poultry or fish or meat at one of the local grocery stores--in particular, Costco, which he favors because the chickens there are ninety-nine cents a pound, as opposed to a dollar forty-nine at standard supermarkets. Whatever he buys, he brings back to his kitchen, a vast room overlooking the canyon, with an array of industrial appliances, a collection of fifteen hundred bottles of olive oil, and, in the corner, an oil painting of him, his fourth wife, Robin (a former Frederick's of Hollywood model), and their baby daughter, Contessa. On paper, Popeil owns a company called Ronco Inventions, which has two hundred employees and a couple of warehouses in Chatsworth, California, but the heart of Ronco is really Ron working out of his house, and many of the key players are really just friends of Ron's who work out of their houses, too, and who gather in Ron's kitchen when, every now and again, Ron cooks a soup and wants to talk things over.
In the last thirty years, Ron has invented a succession of kitchen gadgets, among them the Ronco Electric Food Dehydrator and the Popeil Automatic Pasta and Sausage Maker, which featured a thrust bearing made of the same material used in bulletproof glass. He works steadily, guided by flashes of inspiration. This past August, for instance, he suddenly realized what product should follow the Showtime Rotisserie. He and his right-hand man, Alan Backus, had been working on a bread-and-batter machine, which would take up to ten pounds of chicken wings or scallops or shrimp or fish fillets and do all the work--combining the eggs, the flour, the breadcrumbs--in a few minutes, without dirtying either the cook's hands or the machine. "Alan goes to Korea, where we have some big orders coming through," Ron explained recently over lunch--a hamburger, medium-well, with fries--in the V.I.P. booth by the door in the Polo Lounge, at the Beverly Hills Hotel. "I call Alan on the phone. I wake him up. It was two in the morning there. And these are my exact words: `Stop. Do not pursue the bread-and-batter machine. I will pick it up later. This other project needs to come first.' " The other project, his inspiration, was a device capable of smoking meats indoors without creating odors that can suffuse the air and permeate furniture. Ron had a version of the indoor smoker on his porch--"a Rube Goldberg kind of thing" that he'd worked on a year earlier--and, on a whim, he cooked a chicken in it. "That chicken was so good that I said to myself"--and with his left hand Ron began to pound on the table--"This is the best chicken sandwich I have ever had in my life." He turned to me: "How many times have you had a smoked-turkey sandwich? Maybe you have a smoked-turkey or a smoked-chicken sandwich once every six months. Once! How many times have you had smoked salmon? Aah. More. I'm going to say you come across smoked salmon as an hors d'oeuvre or an entrée once every three months. Baby-back ribs? Depends on which restaurant you order ribs at. Smoked sausage, same thing. You touch on smoked food"--he leaned in and poked my arm for emphasis--"but I know one thing, Malcolm. You don't have a smoker."
The idea for the Showtime came about in the same way. Ron was at Costco about four years ago when he suddenly realized that there was a long line of customers waiting to buy chickens from the in-store rotisserie ovens. They touched on rotisserie chicken, but Ron knew one thing: they did not have a rotisserie oven. Ron went home and called Backus. Together, they bought a glass aquarium, a motor, a heating element, a spit rod, and a handful of other spare parts, and began tinkering. Ron wanted something big enough for a fifteen-pound turkey but small enough to fit into the space between the base of an average kitchen cupboard and the countertop. He didn't want a thermostat, because thermostats break, and the constant clicking on and off of the heat prevents the even, crispy browning that he felt was essential. And the spit rod had to rotate on the horizontal axis, not the vertical axis, because if you cooked a chicken or a side of beef on the vertical axis the top would dry out and the juices would drain to the bottom. Roderick Dorman, Ron's patent attorney, says that when he went over to Coldwater Canyon he often saw five or six prototypes on the kitchen counter, lined up in a row. Ron would have a chicken in each of them, so that he could compare the consistency of the flesh and the browning of the skin, and wonder if, say, there was a way to rotate a shish kebab as it approached the heating element so that the inner side of the kebab would get as brown as the outer part. By the time Ron finished, the Showtime prompted no fewer than two dozen patent applications. It was equipped with the most powerful motor in its class. It had a drip tray coated with a nonstick ceramic, which was easily cleaned, and the oven would still work even after it had been dropped on a concrete or stone surface ten times in succession, from a distance of three feet. To Ron, there was no question that it made the best chicken he had ever had in his life.
It was then that Ron filmed a television infomercial for the Showtime, twenty-eight minutes and thirty seconds in length. It was shot live before a studio audience, and aired for the first time on August 8, 1998. It has run ever since, often in the wee hours of the morning, or on obscure cable stations, alongside the get-rich schemes and the "Three's Company" reruns. The response to it has been such that within the next three years total sales of the Showtime should exceed a billion dollars. Ron Popeil didn't use a single focus group. He had no market researchers, R. & D. teams, public-relations advisers, Madison Avenue advertising companies, or business consultants. He did what the Morrises and the Popeils had been doing for most of the century, and what all the experts said couldn't be done in the modern economy. He dreamed up something new in his kitchen and went out and pitched it himself.
Pitchmen
Nathan Morris, Ron Popeil's great-uncle, looked a lot like Cary Grant. He wore a straw boater. He played the ukulele, drove a convertible, and composed melodies for the piano. He ran his business out of a low-slung, whitewashed building on Ridge Avenue, near Asbury Park, with a little annex in the back where he did pioneering work with Teflon. He had certain eccentricities, such as a phobia he developed about travelling beyond Asbury Park without the presence of a doctor. He feuded with his brother Al, who subsequently left in a huff for Atlantic City, and then with his nephew S. J. Popeil, whom Nathan considered insufficiently grateful for the start he had given him in the kitchen-gadget business. That second feud led to a climactic legal showdown over S. J. Popeil's Chop-O-Matic, a food preparer with a pleated, W-shaped blade rotated by a special clutch mechanism. The Chop-O-Matic was ideal for making coleslaw and chopped liver, and when Morris introduced a strikingly similar product, called the Roto-Chop, S. J. Popeil sued his uncle for patent infringement. (As it happened, the Chop-O-Matic itself seemed to have been inspired by the Blitzhacker, from Switzerland, and S.J. later lost a patent judgment to the Swiss.)
The two squared off in Trenton, in May of 1958, in a courtroom jammed with Morrises and Popeils. When the trial opened, Nathan Morris was on the stand, being cross-examined by his nephew's attorneys, who were out to show him that he was no more than a huckster and a copycat. At a key point in the questioning, the judge suddenly burst in. "He took the index finger of his right hand and he pointed it at Morris," Jack Dominik, Popeil's longtime patent lawyer, recalls, "and as long as I live I will never forget what he said. `I know you! You're a pitchman! I've seen you on the boardwalk!' And Morris pointed his index finger back at the judge and shouted, `No! I'm a manufacturer. I'm a dignified manufacturer, and I work with the most eminent of counsel!' " (Nathan Morris, according to Dominik, was the kind of man who referred to everyone he worked with as eminent.) "At that moment," Dominik goes on, "Uncle Nat's face was getting red and the judge's was getting redder, so a recess was called." What happened later that day is best described in Dominik's unpublished manuscript, "The Inventions of Samuel Joseph Popeil by Jack E. Dominik--His Patent Lawyer." Nathan Morris had a sudden heart attack, and S.J. was guilt-stricken. "Sobbing ensued," Dominik writes. "Remorse set in. The next day, the case was settled. Thereafter, Uncle Nat's recovery from his previous day's heart attack was nothing short of a miracle."
Nathan Morris was a performer, like so many of his relatives, and pitching was, first and foremost, a performance. It's said that Nathan's nephew Archie (the Pitchman's Pitchman) Morris once sold, over a long afternoon, gadget after gadget to a well-dressed man. At the end of the day, Archie watched the man walk away, stop and peer into his bag, and then dump the whole lot into a nearby garbage can. The Morrises were that good. "My cousins could sell you an empty box," Ron says.
The last of the Morrises to be active in the pitching business is Arnold (the Knife) Morris, so named because of his extraordinary skill with the Sharpcut, the forerunner of the Ginsu. He is in his early seventies, a cheerful, impish man with a round face and a few wisps of white hair, and a trademark move whereby, after cutting a tomato into neat, regular slices, he deftly lines the pieces up in an even row against the flat edge of the blade. Today, he lives in Ocean Township, a few miles from Asbury Park, with Phyllis, his wife of twenty-nine years, whom he refers to (with the same irresistible conviction that he might use to describe, say, the Feather Touch Knife) as "the prettiest girl in Asbury Park." One morning recently, he sat in his study and launched into a pitch for the Dial-O-Matic, a slicer produced by S. J. Popeil some forty years ago.
"Come on over, folks. I'm going to show you the most amazing slicing machine you have ever seen in your life," he began. Phyllis, sitting nearby, beamed with pride. He picked up a package of barbecue spices, which Ron Popeil sells alongside his Showtime Rotisserie, and used it as a prop. "Take a look at this!" He held it in the air as if he were holding up a Tiffany vase. He talked about the machine's prowess at cutting potatoes, then onions, then tomatoes. His voice, a marvellous instrument inflected with the rhythms of the Jersey Shore, took on a singsong quality: "How many cut tomatoes like this? You stab it. You jab it. The juices run down your elbow. With the Dial-O-Matic, you do it a little differently. You put it in the machine and you wiggle"--he mimed fixing the tomato to the bed of the machine. "The tomato! Lady! The tomato! The more you wiggle, the more you get. The tomato! Lady! Every slice comes out perfectly, not a seed out of place. But the thing I love my Dial-O-Matic for is coleslaw. My mother-in-law used to take her cabbage and do this." He made a series of wild stabs at an imaginary cabbage. "I thought she was going to commit suicide. Oh, boy, did I pray--that she wouldn't slip! Don't get me wrong. I love my mother-in-law. It's her daughter I can't figure out. You take the cabbage. Cut it in half. Coleslaw, hot slaw. Pot slaw. Liberty slaw. It comes out like shredded wheat . . ."
It was a vaudeville monologue, except that Arnold wasn't merely entertaining; he was selling. "You can take a pitchman and make a great actor out of him, but you cannot take an actor and always make a great pitchman out of him," he says. The pitchman must make you applaud and take out your money. He must be able to execute what in pitchman's parlance is called "the turn"--the perilous, crucial moment where he goes from entertainer to businessman. If, out of a crowd of fifty, twenty-five people come forward to buy, the true pitchman sells to only twenty of them. To the remaining five, he says, "Wait! There's something else I want to show you!" Then he starts his pitch again, with slight variations, and the remaining four or five become the inner core of the next crowd, hemmed in by the people around them, and so eager to pay their money and be on their way that they start the selling frenzy all over again. The turn requires the management of expectation. That's why Arnold always kept a pineapple tantalizingly perched on his stand. "For forty years, I've been promising to show people how to cut the pineapple, and I've never cut it once," he says. "It got to the point where a pitchman friend of mine went out and bought himself a plastic pineapple. Why would you cut the pineapple? It cost a couple bucks. And if you cut it they'd leave." Arnold says that he once hired some guys to pitch a vegetable slicer for him at a fair in Danbury, Connecticut, and became so annoyed at their lackadaisical attitude that he took over the demonstration himself. They were, he says, waiting for him to fail: he had never worked that particular slicer before and, sure enough, he was massacring the vegetables. Still, in a single pitch he took in two hundred dollars. "Their eyes popped out of their heads," Arnold recalls. "They said, `We don't understand it. You don't even know how to work the damn machine.' I said, `But I know how to do one thing better than you.' They said, `What's that?' I said, `I know how to ask for the money.' And that's the secret to the whole damn business."
Ron Popeil started pitching his father's kitchen gadgets at the Maxwell Street flea market in Chicago, in the mid-fifties. He was thirteen. Every morning, he would arrive at the market at five and prepare fifty pounds each of onions, cabbages, and carrots, and a hundred pounds of potatoes. He sold from six in the morning until four in the afternoon, bringing in as much as five hundred dollars a day. In his late teens, he started doing the state- and county-fair circuit, and then he scored a prime spot in the Woolworth's at State and Washington, in the Loop, which at the time was the top-grossing Woolworth's store in the country. He was making more than the manager of the store, selling the Chop-O-Matic and the Dial-O-Matic. He dined at the Pump Room and wore a Rolex and rented hundred-and-fifty-dollar-a-night hotel suites. In pictures from the period, he is beautiful, with thick dark hair and blue-green eyes and sensuous lips, and, several years later, when he moved his office to 919 Michigan Avenue, he was called the Paul Newman of the Playboy Building. Mel Korey, a friend of Ron's from college and his first business partner, remembers the time he went to see Ron pitch the Chop-O-Matic at the State Street Woolworth's. "He was mesmerizing," Korey says. "There were secretaries who would take their lunch break at Woolworth's to watch him because he was so good-looking. He would go into the turn, and people would just come running." Several years ago, Ron's friend Steve Wynn, the founder of the Mirage resorts, went to visit Michael Milken in prison. They were near a television, and happened to catch one of Ron's infomercials just as he was doing the countdown, a routine taken straight from the boardwalk, where he says, "You're not going to spend two hundred dollars, not a hundred and eighty dollars, not one-seventy, not one-sixty . . ." It's a standard pitchman's gimmick: it sounds dramatic only because the starting price is set way up high. But something about the way Ron did it was irresistible. As he got lower and lower, Wynn and Milken--who probably know as much about profit margins as anyone in America--cried out in unison, "Stop, Ron! Stop!"
Was Ron the best? The only attempt to settle the question definitively was made some forty years ago, when Ron and Arnold were working a knife set at the Eastern States Exposition, in West Springfield, Massachusetts. A third man, Frosty Wishon, who was a legend in his own right, was there, too. "Frosty was a well-dressed, articulate individual and a good salesman," Ron says. "But he thought he was the best. So I said, `Well, guys, we've got a ten-day show, eleven, maybe twelve hours a day. We'll each do a rotation, and we'll compare how much we sell.' " In Morris-Popeil lore, this is known as "the shoot-out," and no one has ever forgotten the outcome. Ron beat Arnold, but only by a whisker--no more than a few hundred dollars. Frosty Wishon, meanwhile, sold only half as much as either of his rivals. "You have no idea the pressure Frosty was under," Ron continues. "He came up to me at the end of the show and said, `Ron, I will never work with you again as long as I live.' "
No doubt Frosty Wishon was a charming and persuasive person, but he assumed that this was enough--that the rules of pitching were the same as the rules of celebrity endorsement. When Michael Jordan pitches McDonald's hamburgers, Michael Jordan is the star. But when Ron Popeil or Arnold Morris pitched, say, the Chop-O-Matic, his gift was to make the Chop-O-Matic the star. It was, after all, an innovation. It represented a different way of dicing onions and chopping liver: it required consumers to rethink the way they went about their business in the kitchen. Like most great innovations, it was disruptive. And how do you persuade people to disrupt their lives? Not merely by ingratiation or sincerity, and not by being famous or beautiful. You have to explain the invention to customers--not once or twice but three or four times, with a different twist each time. You have to show them exactly how it works and why it works, and make them follow your hands as you chop liver with it, and then tell them precisely how it fits into their routine, and, finally, sell them on the paradoxical fact that, revolutionary as the gadget is, it's not at all hard to use.
Thirty years ago, the videocassette recorder came on the market, and it was a disruptive product, too: it was supposed to make it possible to tape a television show so that no one would ever again be chained to the prime-time schedule. Yet, as ubiquitous as the VCR became, it was seldom put to that purpose. That's because the VCR was never pitched: no one ever explained the gadget to American consumers--not once or twice but three or four times--and no one showed them exactly how it worked or how it would fit into their routine, and no pair of hands guided them through every step of the process. All the VCR-makers did was hand over the box with a smile and a pat on the back, tossing in an instruction manual for good measure. Any pitchman could have told you that wasn't going to do it.
Once, when I was over at Ron's house in Coldwater Canyon, sitting on one of the high stools in his kitchen, he showed me what real pitching is all about. He was talking about how he had just had dinner with the actor Ron Silver, who is playing Ron's friend Robert Shapiro in a new movie about the O. J. Simpson trial. "They shave the back of Ron Silver's head so that he's got a bald spot, because, you know, Bob Shapiro's got a bald spot back there, too," Ron said. "So I say to him, `You've gotta get GLH.' " GLH, one of Ron's earlier products, is an aerosol spray designed to thicken the hair and cover up bald spots. "I told him, `It will make you look good. When you've got to do the scene, you shampoo it out.' "
At this point, the average salesman would have stopped. The story was an aside, no more. We had been discussing the Showtime Rotisserie, and on the counter behind us was a Showtime cooking a chicken and next to it a Showtime cooking baby-back ribs, and on the table in front of him Ron's pasta maker was working, and he was frying some garlic so that we could have a little lunch. But now that he had told me about GLH it was unthinkable that he would not also show me its wonders. He walked quickly over to a table at the other side of the room, talking as he went. "People always ask me, `Ron, where did you get that name GLH?' I made it up. Great-Looking Hair." He picked up a can. "We make it in nine different colors. This is silver-black." He picked up a hand mirror and angled it above his head so that he could see his bald spot. "Now, the first thing I'll do is spray it where I don't need it." He shook the can and began spraying the crown of his head, talking all the while. "Then I'll go to the area itself." He pointed to his bald spot. "Right here. O.K. Now I'll let that dry. Brushing is fifty per cent of the way it's going to look." He began brushing vigorously, and suddenly Ron Popeil had what looked like a complete head of hair. "Wow," I said. Ron glowed. "And you tell me `Wow.' That's what everyone says. `Wow.' That's what people say who use it. `Wow.' If you go outside"--he grabbed me by the arm and pulled me out onto the deck--"if you are in bright sunlight or daylight, you cannot tell that I have a big bald spot in the back of my head. It really looks like hair, but it's not hair. It's quite a product. It's incredible. Any shampoo will take it out. You know who would be a great candidate for this? Al Gore. You want to see how it feels?" Ron inclined the back of his head toward me. I had said, "Wow," and had looked at his hair inside and outside, but the pitchman in Ron Popeil wasn't satisfied. I had to feel the back of his head. I did. It felt just like real hair.
The Tinkerer
Ron Popeil inherited more than the pitching tradition of Nathan Morris. He was very much the son of S. J. Popeil, and that fact, too, goes a long way toward explaining the success of the Showtime Rotisserie. S.J. had a ten-room apartment high in the Drake Towers, near the top of Chicago's Magnificent Mile. He had a chauffeured Cadillac limousine with a car phone, a rarity in those days, which he delighted in showing off (as in "I'm calling you from the car"). He wore three-piece suits and loved to play the piano. He smoked cigars and scowled a lot and made funny little grunting noises as he talked. He kept his money in T-bills. His philosophy was expressed in a series of epigrams: To his attorney, "If they push you far enough, sue"; to his son, "It's not how much you spend, it's how much you make." And, to a designer who expressed doubts about the utility of one of his greatest hits, the Pocket Fisherman, "It's not for using; it's for giving." In 1974, S.J.'s second wife, Eloise, decided to have him killed, so she hired two hit men--one of whom, aptly, went by the name of Mr. Peeler. At the time, she was living at the Popeil estate in Newport Beach with her two daughters and her boyfriend, a thirty-seven-year-old machinist. When, at Eloise's trial, S.J. was questioned about the machinist, he replied, "I was kind of happy to have him take her off my hands." That was vintage S.J. But eleven months later, after Eloise got out of prison, S.J. married her again. That was vintage S.J., too. As a former colleague of his puts it, "He was a strange bird."
S. J. Popeil was a tinkerer. In the middle of the night, he would wake up and make frantic sketches on a pad he kept on his bedside table. He would disappear into his kitchen for hours and make a huge mess, and come out with a faraway look on his face. He loved standing behind his machinists, peering over their shoulders while they were assembling one of his prototypes. In the late forties and early fifties, he worked almost exclusively in plastic, reinterpreting kitchen basics with a subtle, modernist flair. "Popeil Brothers made these beautiful plastic flour sifters," Tim Samuelson, a curator at the Chicago Historical Society and a leading authority on the Popeil legacy, says. "They would use contrasting colors, or a combination of opaque plastic with a translucent swirl plastic." Samuelson became fascinated with all things Popeil after he acquired an original Popeil Brothers doughnut maker, in red-and-white plastic, which he felt "had beautiful lines"; to this day, in the kitchen of his Hyde Park high-rise, he uses the Chop-O-Matic in the preparation of salad ingredients. "There was always a little twist to what he did," Samuelson goes on. "Take the Popeil automatic egg turner. It looks like a regular spatula, but if you squeeze the handle the blade turns just enough to flip a fried egg."
Walter Herbst, a designer whose firm worked with Popeil Brothers for many years, says that S.J.'s modus operandi was to "come up with a holistic theme. He'd arrive in the morning with it. It would be something like"--Herbst assumes S.J.'s gruff voice--" 'We need a better way to shred cabbage.' It was a passion, an absolute goddam passion. One morning, he must have been eating grapefruit, because he comes to work and calls me and says, 'We need a better way to cut grapefruit!' " The idea they came up with was a double-bladed paring knife, with the blades separated by a fraction of an inch so that both sides of the grapefruit membrane could be cut simultaneously. "There was a little grocery store a few blocks away," Herbst says. "So S.J. sends the chauffeur out for grapefruit. How many? Six. Well, over the period of a couple of weeks, six turns to twelve and twelve turns to twenty, until we were cutting thirty to forty grapefruits a day. I don't know if that little grocery store ever knew what happened."
S. J. Popeil's finest invention was undoubtedly the Veg-O-Matic, which came on the market in 1960 and was essentially a food processor, a Cuisinart without the motor. The heart of the gadget was a series of slender, sharp blades strung like guitar strings across two Teflon-coated metal rings, which were made in Woodstock, Illinois, from 364 Alcoa, a special grade of aluminum. When the rings were aligned on top of each other so that the blades ran parallel, a potato or an onion pushed through would come out in perfect slices. If the top ring was rotated, the blades formed a crosshatch, and a potato or an onion pushed through would come out diced. The rings were housed in a handsome plastic assembly, with a plunger to push the vegetables through the blades. Technically, the Veg-O-Matic was a triumph: the method of creating blades strong enough to withstand the assault of vegetables received a U.S. patent. But from a marketing perspective it posed a problem. S.J.'s products had hitherto been sold by pitchmen armed with a mound of vegetables meant to carry them through a day's worth of demonstrations. But the Veg-O-Matic was too good. In a single minute, according to the calculations of Popeil Brothers, it could produce a hundred and twenty egg wedges, three hundred cucumber slices, eleven hundred and fifty potato shoestrings, or three thousand onion dices. It could go through what used to be a day's worth of vegetables in a matter of minutes. The pitchman could no longer afford to pitch to just a hundred people at a time; he had to pitch to a hundred thousand. The Veg-O-Matic needed to be sold on television, and one of the very first pitchmen to grasp this fact was Ron Popeil.
In the summer of 1964, just after the Veg-O-Matic was introduced, Mel Korey joined forces with Ron Popeil in a company called Ronco. They shot a commercial for the Veg-O-Matic for five hundred dollars, a straightforward pitch shrunk to two minutes, and set out from Chicago for the surrounding towns of the Midwest. They cold-called local department stores and persuaded them to carry the Veg-O-Matic on guaranteed sale, which meant that whatever the stores didn't sell could be returned. Then they visited the local television station and bought a two- or three-week run of the cheapest airtime they could find, praying that it would be enough to drive traffic to the store. "We got Veg-O-Matics wholesale for $3.42," Korey says. "They retailed for $9.95, and we sold them to the stores for $7.46, which meant that we had four dollars to play with. If I spent a hundred dollars on television, I had to sell twenty-five Veg-O-Matics to break even." It was clear, in those days, that you could use television to sell kitchen products if you were Procter & Gamble. It wasn't so clear that this would work if you were Mel Korey and Ron Popeil, two pitchmen barely out of their teens selling a combination slicer-dicer that no one had ever heard of. They were taking a wild gamble, and, to their amazement, it paid off. "They had a store in Butte, Montana--Hennessy's," Korey goes on, thinking back to those first improbable years. "Back then, people there were still wearing peacoats. The city was mostly bars. It had just a few three-story buildings. There were twenty-seven thousand people, and one TV station. I had the Veg-O-Matic, and I go to the store, and they said, 'We'll take a case. We don't have a lot of traffic here.' I go to the TV station and the place is a dump. The only salesperson was going blind and deaf. So I do a schedule. For five weeks, I spend three hundred and fifty dollars. I figure if I sell a hundred and seventy-four machines--six cases--I'm happy. I go back to Chicago, and I walk into the office one morning and the phone is ringing. They said, 'We sold out. You've got to fly us another six cases of Veg-O-Matics.' The next week, on Monday, the phone rings. It's Butte again: 'We've got a hundred and fifty oversold.' I fly him another six cases. Every few days after that, whenever the phone rang we'd look at each other and say, 'Butte, Montana.' " Even today, thirty years later, Korey can scarcely believe it. "How many homes in total in that town? Maybe several thousand? We ended up selling two thousand five hundred Veg-O-Matics in five weeks!"
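Korey's arithmetic is easy to check. Here is a minimal sketch, assuming nothing beyond the dollar figures he quotes (the variable names are mine):

```python
# A back-of-the-envelope check of Korey's figures; only the dollar amounts
# come from his account, and the variable names are my own.
wholesale_cost = 3.42    # what Ronco paid per Veg-O-Matic
price_to_store = 7.46    # what the department stores paid Ronco
margin = price_to_store - wholesale_cost    # roughly four dollars "to play with"

ad_spend = 100.00                           # a hundred dollars of local airtime
break_even_units = ad_spend / margin        # about twenty-five machines

print(f"margin per machine: ${margin:.2f}")
print(f"machines needed to cover ${ad_spend:.0f} of airtime: {break_even_units:.0f}")
```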
Why did the Veg-O-Matic sell so well? Doubtless, Americans were eager for a better way of slicing vegetables. But it was more than that: the Veg-O-Matic represented a perfect marriage between the medium (television) and the message (the gadget). The Veg-O-Matic was, in the relevant sense, utterly transparent. You took the potato and you pushed it through the Teflon-coated rings and--voilĂ !--you had French fries. There were no buttons being pressed, no hidden and intimidating gears: you could show-and-tell the Veg-O-Matic in a two-minute spot and allay everyone's fears about a daunting new technology. More specifically, you could train the camera on the machine and compel viewers to pay total attention to the product you were selling. TV allowed you to do even more effectively what the best pitchmen strove to do in live demonstrations--make the product the star.
This was a lesson Ron Popeil never forgot. In his infomercial for the Showtime Rotisserie, he opens not with himself but with a series of shots of meat and poultry, glistening almost obscenely as they rotate in the Showtime. A voice-over describes each shot: a "delicious six-pound chicken," a "succulent whole duckling," a "mouthwatering pork-loin roast . . ." Only then do we meet Ron, in a sports coat and jeans. He explains the problems of conventional barbecues, how messy and unpleasant they are. He bangs a hammer against the door of the Showtime, to demonstrate its strength. He deftly trusses a chicken, impales it on the patented two-pronged Showtime spit rod, and puts it into the oven. Then he repeats the process with a pair of chickens, salmon steaks garnished with lemon and dill, and a rib roast. All the time, the camera is on his hands, which are in constant motion, manipulating the Showtime apparatus gracefully, with his calming voice leading viewers through every step: "All I'm going to do here is slide it through like this. It goes in very easily. I'll match it up over here. What I'd like to do is take some herbs and spices here. All I'll do is slide it back. Raise up my glass door here. I'll turn it to a little over an hour. . . . Just set it and forget it."
Why does this work so well? Because the Showtime--like the Veg-O-Matic before it--was designed to be the star. From the very beginning, Ron insisted that the entire door be a clear pane of glass, and that it slant back to let in the maximum amount of light, so that the chicken or the turkey or the baby-back ribs turning inside would be visible at all times. Alan Backus says that after the first version of the Showtime came out Ron began obsessing over the quality and evenness of the browning and became convinced that the rotation speed of the spit wasn't quite right. The original machine moved at four revolutions per minute. Ron set up a comparison test in his kitchen, cooking chicken after chicken at varying speeds until he determined that the optimal speed of rotation was actually six r.p.m. One can imagine a bright-eyed M.B.A. clutching a sheaf of focus-group reports and arguing that Ronco was really selling convenience and healthful living, and that it was foolish to spend hundreds of thousands of dollars retooling production in search of a more even golden brown. But Ron understood that the perfect brown is important for the same reason that the slanted glass door is important: because in every respect the design of the product must support the transparency and effectiveness of its performance during a demonstration--the better it looks onstage, the easier it is for the pitchman to go into the turn and ask for the money.
If Ron had been the one to introduce the VCR, in other words, he would not simply have sold it in an infomercial. He would also have changed the VCR itself, so that it made sense in an infomercial. The clock, for example, wouldn't be digital. (The haplessly blinking unset clock has, of course, become a symbol of frustration.) The tape wouldn't be inserted behind a hidden door--it would be out in plain view, just like the chicken in the rotisserie, so that if it was recording you could see the spools turn. The controls wouldn't be discreet buttons; they would be large, and they would make a reassuring click as they were pushed up and down, and each step of the taping process would be identified with a big, obvious numeral so that you could set it and forget it. And would it be a slender black, low-profile box? Of course not. Ours is a culture in which the term "black box" is synonymous with incomprehensibility. Ron's VCR would be in red-and-white plastic, both opaque and translucent swirl, or maybe 364 Alcoa aluminum, painted in some bold primary color, and it would sit on top of the television, not below it, so that when your neighbor or your friend came over he would spot it immediately and say, "Wow, you have one of those Ronco Tape-O-Matics!"
A Real Piece of Work
Ron Popeil did not have a happy childhood. "I remember baking a potato. It must have been when I was four or five years old," he told me. We were in his kitchen, and had just sampled some baby-back ribs from the Showtime. It had taken some time to draw the memories out of him, because he is not one to dwell on the past. "I couldn't get that baked potato into my stomach fast enough, because I was so hungry." Ron is normally in constant motion, moving his hands, chopping food, bustling back and forth. But now he was still. His parents split up when he was very young. S.J. went off to Chicago. His mother disappeared. He and his older brother, Jerry, were shipped off to a boarding school in upstate New York. "I remember seeing my mother on one occasion. I don't remember seeing my father, ever, until I moved to Chicago, at thirteen. When I was in the boarding school, the thing I remember was a Sunday when the parents visited the children, and my parents never came. Even knowing that they weren't going to show up, I walked out to the perimeter and looked out over the farmland, and there was this road." He made an undulating motion with his hand to suggest a road stretching off into the distance. "I remember standing on the road crying, looking for the movement of a car miles away, hoping that it was my mother and father. And they never came. That's all I remember about boarding school." Ron remained perfectly still. "I don't remember ever having a birthday party in my life. I remember that my grandparents took us out and we moved to Florida. My grandfather used to tie me down in bed--my hands, my wrists, and my feet. Why? Because I had a habit of turning over on my stomach and bumping my head either up and down or side to side. Why? How? I don't know the answers. But I was spread-eagle, on my back, and if I was able to twist over and do it my grandfather would wake up at night and come in and beat the hell out of me." Ron stopped, and then added, "I never liked him. I never knew my mother or her parents or any of that family. That's it. Not an awful lot to remember. Obviously, other things took place. But they have been erased."
When Ron came to Chicago, at thirteen, with his grandparents, he was put to work in the Popeil Brothers factory--but only on the weekends, when his father wasn't there. "Canned salmon and white bread for lunch, that was the diet," he recalls. "Did I live with my father? Never. I lived with my grandparents." When he became a pitchman, his father gave him just one advantage: he extended his son credit. Mel Korey says that he once drove Ron home from college and dropped him off at his father's apartment. "He had a key to the apartment, and when he walked in his dad was in bed already. His dad said, 'Is that you, Ron?' And Ron said, 'Yeah.' And his dad never came out. And by the next morning Ron still hadn't seen him." Later, when Ron went into business for himself, he was persona non grata around Popeil Brothers. "Ronnie was never allowed in the place after that," one of S.J.'s former associates recalls. "He was never let in the front door. He was never allowed to be part of anything." "My father," Ron says simply, "was all business. I didn't know him personally."
Here is a man who constructed his life in the image of his father--who went into the same business, who applied the same relentless attention to the workings of the kitchen, who got his start by selling his father's own products--and where was his father? "You know, they could have done wonders together," Korey says, shaking his head. "I remember one time we talked with K-tel about joining forces, and they said that we would be a war machine--that was their word. Well, Ron and his dad, they could have been a war machine." For all that, it is hard to find in Ron even a trace of bitterness. Once, I asked him, "Who are your inspirations?" The first name came easily: his good friend Steve Wynn. He was silent for a moment, and then he added, "My father." Despite everything, Ron clearly found in his father's example a tradition of irresistible value. And what did Ron do with that tradition? He transcended it. He created the Showtime, which is indisputably a better gadget, dollar for dollar, than the Morris Metric Slicer, the Dutch Kitchen Shredder Grater, the Chop-O-Matic, and the Veg-O-Matic combined.
When I was in Ocean Township, visiting Arnold Morris, he took me to the local Jewish cemetery, Chesed Shel Ames, on a small hilltop just outside town. We drove slowly through the town's poorer sections in Arnold's white Mercedes. It was a rainy day. At the cemetery, a man stood out front in an undershirt, drinking a beer. We entered through a little rusty gate. "This is where it all starts," Arnold said, by which he meant that everyone--the whole spirited, squabbling clan--was buried here. We walked up and down the rows until we found, off in a corner, the Morris headstones. There was Nathan Morris, of the straw boater and the opportune heart attack, and next to him his wife, Betty. A few rows over was the family patriarch, Kidders Morris, and his wife, and a few rows from there Irving Rosenbloom, who made a fortune in plastic goods out on Long Island. Then all the Popeils, in tidy rows: Ron's grandfather Isadore, who was as mean as a snake, and his wife, Mary; S.J., who turned a cold shoulder to his own son; Ron's brother, Jerry, who died young. Ron was from them, but he was not of them. Arnold walked slowly among the tombstones, the rain dancing off his baseball cap, and then he said something that seemed perfectly right. "You know, I'll bet you you'll never find Ronnie here."
On the Air
One Saturday night a few weeks ago, Ron Popeil arrived at the headquarters of the television shopping network QVC, a vast gleaming complex nestled in the woods of suburban Philadelphia. Ron is a regular on QVC. He supplements his infomercials with occasional appearances on the network, and, for twenty-four hours beginning that midnight, QVC had granted him eight live slots, starting with a special "Ronco" hour between midnight and 1 a.m. Ron was travelling with his daughter Shannon, who had got her start in the business selling the Ronco Electric Food Dehydrator on the fair circuit, and the plan was that the two of them would alternate throughout the day. They were pitching a Digital Jog Dial version of the Showtime, in black, available for one day only, at a "special value" of $129.72.
In the studio, Ron had set up eighteen Digital Jog Dial Showtimes on five wood-panelled gurneys. From Los Angeles, he had sent, via Federal Express, dozens of Styrofoam containers with enough meat for each of the day's airings: eight fifteen-pound turkeys, seventy-two hamburgers, eight legs of lamb, eight ducks, thirty-odd chickens, two dozen or so Rock Cornish game hens, and on and on, supplementing them with garnishes, trout, and some sausage bought that morning at three Philadelphia-area supermarkets. QVC's target was thirty-seven thousand machines, meaning that it hoped to gross about $4.5 million during the twenty-four hours--a huge day, even by the network's standards. Ron seemed tense. He barked at the team of QVC producers and cameramen bustling around the room. He fussed over the hero plates--the ready-made dinners that he would use to showcase meat taken straight from the oven. "Guys, this is impossible," he said, peering at a tray of mashed potatoes and gravy. "The level of gravy must be higher." He was limping a little. "You know, there's a lot of pressure on you," he said wearily. " 'How did Ron do? Is he still the best?' "
With just a few minutes to go, Ron ducked into the greenroom next to the studio to put GLH in his hair: a few aerosol bursts, followed by vigorous brushing. "Where is God right now?" his co-host, Rick Domeier, yelled out, looking around theatrically for his guest star. "Is God backstage?" Ron then appeared, resplendent in a chef's coat, and the cameras began to roll. He sliced open a leg of lamb. He played with the dial of the new digital Showtime. He admired the crispy, succulent skin of the duck. He discussed the virtues of the new food-warming feature--where the machine would rotate at low heat for up to four hours after the meat was cooked in order to keep the juices moving--and, all the while, bantered so convincingly with viewers calling in on the testimonial line that it was as if he were back mesmerizing the secretaries in the Woolworth's at State and Washington.
In the greenroom, there were two computer monitors. The first displayed a line graph charting the number of calls that came in at any given second. The second was an electronic ledger showing the total sales up to that point. As Ron took flight, one by one, people left the studio to gather around the computers. Shannon Popeil came first. It was 12:40 a.m. In the studio, Ron was slicing onions with one of his father's Dial-O-Matics. She looked at the second monitor and gave a little gasp. Forty minutes in, and Ron had already passed seven hundred thousand dollars. A QVC manager walked in. It was 12:48 a.m., and Ron was roaring on: $837,650. "It can't be!" he cried out. "That's unbelievable!" Two QVC producers came over. One of them pointed at the first monitor, which was graphing the call volume. "Jump," he called out. "Jump!" There were only a few minutes left. Ron was extolling the virtues of the oven one final time, and, sure enough, the line began to take a sharp turn upward, as all over America viewers took out their wallets. The numbers on the second screen began to change in a blur of recalculation--rising in increments of $129.72 plus shipping and taxes. "You know, we're going to hit a million dollars, just on the first hour," one of the QVC guys said, and there was awe in his voice. It was one thing to talk about how Ron was the best there ever was, after all, but quite another to see proof of it, before your very eyes. At that moment, on the other side of the room, the door opened, and a man appeared, stooped and drawn but with a smile on his face. It was Ron Popeil, who invented a better rotisserie in his kitchen and went out and pitched it himself. There was a hush, and then the whole room stood up and cheered.
Dept. of Useful Things
November 27, 2000
Out of the Frying Pan, Into the Voting Booth
My parents have an electric stove in their kitchen made by a company called Moffat. It has four burners on top and a raised panel that runs across the back with a set of knobs on it, and the way the panel is laid out has always been a bone of contention in our family. The knobs for the two left-hand burners are on the left side of the back panel, stacked one on top of the other, with the top knob controlling the back burner and the bottom knob controlling the front burner--same thing on the right-hand side. My mother finds this logical. But not my father. Every time he looks at the stove, he gets confused and thinks that the top knob controls the front burner.
Does this mean that my mother is more rational than my father? I don't think so. It simply means that any time you create a visual guide to an action that isn't intuitive--that requires some kind of interpretation or physical contortion--you're going to baffle some people. From the perspective of "usability" researchers, my father has fallen victim to an ill-designed interface. People who pop the trunk of their car when they mean to pop the gas-tank lid are experiencing the same kind of confusion, as the singer John Denver did, apparently, when he died in an airplane crash a few years ago. Denver was flying a new, experimental plane, and may not have realized how little fuel he had, since the fuel gauge wasn't linear, the way you'd expect it to be. When the line on that sort of gauge registers one-quarter, for example, it doesn't mean that the twenty-six-gallon tank is a quarter full; it means that the tank has less than five gallons left.
Then, there's the question of voting. Susan King Roth, an associate professor of visual communication at Ohio State University, did an experiment recently with voting machines and found that a surprising number of the people in her study didn't vote on the issues section of the ballot. Why? Because the issues proposals were at the top of the ballot, sixty-seven inches from the floor, and the eye height of the average American woman is sixty inches. Some people in the study simply couldn't see the proposals.
The Florida butterfly ballot may be the textbook example of what can go wrong when design isn't intuitive. The usability expert Kevin Fox has identified three "cognitive paths" that could have led voters to misunderstand the butterfly layout: gestalt grouping, linear visual search, and numeric mapping--all of which point out that the way the butterfly ballot invites itself to be read does not match the way it invites voters to act. In the language of usability studies, there is an incompatibility between input and output. In a sense, it's just like the problem with the Moffat stove. My father hasn't burned down the house yet. But sometimes he puts a pot on the back burner and turns on the front burner. If he's used the stove twenty thousand times in his life, it's a reasonable guess that he's made this mistake maybe a few hundred times. In the grand scheme of things, that's not a very high percentage. Then again, sometimes a few hundred mistakes can turn out to be awfully important.
Designs For Working
December 11, 2000
DEPT. OF HUMAN RESOURCES
Why your bosses want to turn your
new office into Greenwich Village.
1.
In the early nineteen-sixties, Jane Jacobs lived on Hudson Street, in Greenwich Village, near the intersection of Eighth Avenue and Bleecker Street. It was then, as now, a charming district of nineteenth-century tenements and town houses, bars and shops, laid out over an irregular grid, and Jacobs loved the neighborhood. In her 1961 masterpiece, "The Death and Life of Great American Cities," she rhapsodized about the White Horse Tavern down the block, home to Irish longshoremen and writers and intellectuals--a place where, on a winter's night, as "the doors open, a solid wave of conversation and animation surges out and hits you." Her Hudson Street had Mr. Slube, at the cigar store, and Mr. Lacey, the locksmith, and Bernie, the candy-store owner, who, in the course of a typical day, supervised the children crossing the street, lent an umbrella or a dollar to a customer, held on to some keys or packages for people in the neighborhood, and "lectured two youngsters who asked for cigarettes." The street had "bundles and packages, zigzagging from the drug store to the fruit stand and back over to the butcher's," and "teenagers, all dressed up, are pausing to ask if their slips show or their collars look right." It was, she said, an urban ballet.
The miracle of Hudson Street, according to Jacobs, was created by the particular configuration of the streets and buildings of the neighborhood. Jacobs argued that when a neighborhood is oriented toward the street, when sidewalks are used for socializing and play and commerce, the users of that street are transformed by the resulting stimulation: they form relationships and casual contacts they would never have otherwise. The West Village, she pointed out, was blessed with a mixture of houses and apartments and shops and offices and industry, which meant that there were always people "outdoors on different schedules and... in the place for different purposes." It had short blocks, and short blocks create the greatest variety in foot traffic. It had lots of old buildings, and old buildings have the low rents that permit individualized and creative uses. And, most of all, it had people, cheek by jowl, from every conceivable walk of life. Sparsely populated suburbs may look appealing, she said, but without an active sidewalk life, without the frequent, serendipitous interactions of many different people, "there is no public acquaintanceship, no foundation of public trust, no cross-connections with the necessary people--and no practice or ease in applying the most ordinary techniques of city public life at lowly levels."
Jane Jacobs did not win the battle she set out to fight. The West Village remains an anomaly. Most developers did not want to build the kind of community Jacobs talked about, and most Americans didn't want to live in one. To reread "Death and Life" today, however, is to be struck by how the intervening years have given her arguments a new and unexpected relevance. Who, after all, has a direct interest in creating diverse, vital spaces that foster creativity and serendipity? Employers do. On the fortieth anniversary of its publication, "Death and Life" has been reborn as a primer on workplace design.
The parallels between neighborhoods and offices are striking. There was a time, for instance, when companies put their most valued employees in palatial offices, with potted plants in the corner, and secretaries out front, guarding access. Those offices were suburbs--gated communities, in fact--and many companies came to realize that if their best employees were isolated in suburbs they would be deprived of public acquaintanceship, the foundations of public trust, and cross-connections with the necessary people. In the eighties and early nineties, the fashion in corporate America was to follow what designers called "universal planning"--rows of identical cubicles, which resembled nothing so much as a Levittown. Today, universal planning has fallen out of favor, for the same reason that the postwar suburbs like Levittown did: to thrive, an office space must have a diversity of uses--it must have the workplace equivalent of houses and apartments and shops and industry.
If you visit the technology companies of Silicon Valley, or the media companies of Manhattan, or any of the firms that self-consciously identify themselves with the New Economy, you'll find that secluded private offices have been replaced by busy public spaces, open-plan areas without walls, executives next to the newest hires. The hush of the traditional office has been supplanted by something much closer to the noisy, bustling ballet of Hudson Street. Forty years ago, people lived in neighborhoods like the West Village and went to work in the equivalent of suburbs. Now, in one of the odd reversals that mark the current economy, they live in suburbs and, increasingly, go to work in the equivalent of the West Village.
2.
The office used to be imagined as a place where employees punch clocks and bosses roam the halls like high-school principals, looking for miscreants. But when employees sit chained to their desks, quietly and industriously going about their business, an office is not functioning as it should. That's because innovation--the heart of the knowledge economy--is fundamentally social. Ideas arise as much out of casual conversations as they do out of formal meetings. More precisely, as one study after another has demonstrated, the best ideas in any workplace arise out of casual contacts among different groups within the same company. If you are designing widgets for Acme.com, for instance, it is unlikely that a breakthrough idea is going to come from someone else on the widget team: after all, the other team members are as blinkered by the day-to-day demands of dealing with the existing product as you are. Someone from outside Acme.com--your old engineering professor, or a guy you used to work with at Apex.com--isn't going to be that helpful, either. A person like that doesn't know enough about Acme's widgets to have a truly useful idea. The most useful insights are likely to come from someone in customer service, who hears firsthand what widget customers have to say, or from someone in marketing, who has wrestled with the problem of how to explain widgets to new users, or from someone who used to work on widgets a few years back and whose work on another Acme product has given him a fresh perspective. Innovation comes from the interactions of people at a comfortable distance from one another, neither too close nor too far. This is why--quite apart from the matter of logistics and efficiency--companies have offices to begin with. They go to the trouble of gathering their employees under one roof because they want the widget designers to bump into the people in marketing and the people in customer service and the guy who moved to another department a few years back.
The catch is that getting people in an office to bump into people from another department is not so easy as it looks. In the sixties and seventies, a researcher at M.I.T. named Thomas Allen conducted a decade-long study of the way in which engineers communicated in research-and-development laboratories. Allen found that the likelihood that any two people will communicate drops off dramatically as the distance between their desks increases: we are four times as likely to communicate with someone who sits six feet away from us as we are with someone who sits sixty feet away. And people seated more than seventy-five feet apart hardly talk at all.
Allen's second finding was even more disturbing. When the engineers weren't talking to those in their immediate vicinity, many of them spent their time talking to people outside their company--to their old computer-science professor or the guy they used to work with at Apple. He concluded that it was actually easier to make the outside call than to walk across the room. If you constantly ask for advice or guidance from people inside your organization, after all, you risk losing prestige. Your colleagues might think you are incompetent. The people you keep asking for advice might get annoyed at you. Calling an outsider avoids these problems. "The engineer can easily excuse his lack of knowledge by pretending to be an 'expert in something else' who needs some help in 'broadening into this new area,'" Allen wrote. He did his study in the days before E-mail and the Internet, but the advent of digital communication has made these problems worse. Allen's engineers were far too willing to go outside the company for advice and new ideas. E-mail makes it even easier to talk to people outside the company.
The task of the office, then, is to invite a particular kind of social interaction--the casual, nonthreatening encounter that makes it easy for relative strangers to talk to each other. Offices need the sort of social milieu that Jane Jacobs found on the sidewalks of the West Village. "It is possible in a city street neighborhood to know all kinds of people without unwelcome entanglements, without boredom, necessity for excuses, explanations, fears of giving offense, embarrassments respecting impositions or commitments, and all such paraphernalia of obligations which can accompany less limited relationships," Jacobs wrote. If you substitute "office" for "city street neighborhood," that sentence becomes the perfect statement of what the modern employer wants from the workplace.
3.
Imagine a classic big-city office tower, with a floor plate of a hundred and eighty feet by a hundred and eighty feet. The center part of every floor is given over to the guts of the building: elevators, bathrooms, electrical and plumbing systems. Around the core are cubicles and interior offices, for support staff and lower management. And around the edges of the floor, against the windows, are rows of offices for senior staff, each room perhaps two hundred or two hundred and fifty square feet. The best research about office communication tells us that there is almost no worse way to lay out an office. The executive in one corner office will seldom bump into any other executive in a corner office. Indeed, stringing the exterior offices out along the windows guarantees that there will be very few people within the critical sixty-foot radius of those offices. To maximize the amount of contact among employees, you really ought to put the most valuable staff members in the center of the room, where the highest number of people can be within their orbit. Or, even better, put all places where people tend to congregate--the public areas--in the center, so they can draw from as many disparate parts of the company as possible. Is it any wonder that creative firms often prefer loft-style buildings, which have usable centers?
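The geometry behind that claim is worth a quick sketch. The calculation below is my own simplification--it treats the floor as empty and ignores the building core--but it uses the passage's hundred-and-eighty-foot floor plate and sixty-foot radius:

```python
import math

# Illustrative only: the 180 ft x 180 ft floor plate and the "critical
# sixty-foot radius" come from the passage; treating coverage as simple
# circle area on an empty floor is my own simplification.
FLOOR = 180.0
RADIUS = 60.0

floor_area = FLOOR * FLOOR
center_coverage = math.pi * RADIUS**2        # the full circle fits inside the plate
corner_coverage = math.pi * RADIUS**2 / 4    # only a quarter circle lies on the floor

print(f"desk at the center: {center_coverage / floor_area:.0%} of the floor within 60 ft")
print(f"corner office:      {corner_coverage / floor_area:.0%} of the floor within 60 ft")
```

By this crude measure, a desk at the center of the floor has roughly four times as much of the office within conversational range as a corner office does.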
Another way to increase communication is to have as few private offices as possible. The idea is to exchange private space for public space, just as in the West Village, where residents agree to live in tiny apartments in exchange for a wealth of nearby cafés and stores and bars and parks. The West Village forces its residents outdoors. Few people, for example, have a washer and dryer in their apartment, and so even laundry is necessarily a social event: you have to take your clothes to the laundromat down the street. In the office equivalent, designers force employees to move around, too. They build in "functional inefficiencies"; they put kitchens and copiers and printers and libraries in places that can be reached only by a circuitous journey.
A more direct approach is to create an office so flexible that the kinds of people who need to spontaneously interact can actually be brought together. For example, the Ford Motor Company, along with a group of researchers from the University of Michigan, recently conducted a pilot project on the effectiveness of "war rooms" in software development. Previously, someone inside the company who needed a new piece of software written would have a series of meetings with the company's programmers, and the client and the programmers would send messages back and forth. In the war-room study, the company moved the client, the programmers, and a manager into a dedicated conference room, and made them stay there until the project was done. Using the war room cut the software-development time by two-thirds, in part because there was far less time wasted on formal meetings or calls outside the building: the people who ought to have been bumping into each other were now sitting next to each other.
Two years ago, the advertising agency TBWA\Chiat\Day moved into new offices in Los Angeles, out near the airport. In the preceding years, the firm had been engaged in a radical, and in some ways disastrous, experiment with a "nonterritorial" office: no one had a desk or any office equipment of his own. It was a scheme that courted failure by neglecting all the ways in which an office is a sort of neighborhood. By contrast, the new office is an almost perfect embodiment of Jacobsian principles of community. The agency is in a huge old warehouse, three stories high and the size of three football fields. It is informally known as Advertising City, and that's what it is: a kind of artfully constructed urban neighborhood. The floor is bisected by a central corridor called Main Street, and in the center of the room is an open space, with café tables and a stand of ficus trees, called Central Park. There's a basketball court, a game room, and a bar. Most of the employees are in snug workstations known as nests, and the nests are grouped together in neighborhoods that radiate from Main Street like Paris arrondissements. The top executives are situated in the middle of the room. The desk belonging to the chairman and creative director of the company looks out on Central Park. The offices of the chief financial officer and the media director abut the basketball court. Sprinkled throughout the building are meeting rooms and project areas and plenty of nooks where employees can closet themselves when they need to. A small part of the building is elevated above the main floor on a mezzanine, and if you stand there and watch the people wander about with their portable phones, and sit and chat in Central Park, and play basketball in the gym, and you feel on your shoulders the sun from the skylights and listen to the gentle buzz of human activity, it is quite possible to forget that you are looking at an office.
4.
In "The Death and Life of Great American Cities," Jacobs wrote of the importance of what she called "public characters"--people who have the social position and skills to orchestrate the movement of information and the creation of bonds of trust:
A public character is anyone who is in frequent contact with a wide circle of people and who is sufficiently interested to make himself a public character....The director of a settlement on New York's Lower East Side, as an example, makes a regular round of stores. He learns from the cleaner who does his suits about the presence of dope pushers in the neighborhood. He learns from the grocer that the Dragons are working up to something and need attention. He learns from the candy store that two girls are agitating the Sportsmen toward a rumble. One of his most important information spots is an unused breadbox on Rivington Street.... A message spoken there for any teen-ager within many blocks will reach his ears unerringly and surprisingly quickly, and the opposite flow along the grapevine similarly brings news quickly in to the breadbox.
A vital community, in Jacobs's view, required more than the appropriate physical environment. It also required a certain kind of person, who could bind together the varied elements of street life. Offices are no different. In fact, as office designers have attempted to create more vital workplaces, they have become increasingly interested in identifying and encouraging public characters.
One of the pioneers in this way of analyzing offices is Karen Stephenson, a business-school professor and anthropologist who runs a New York-based consulting company called Netform. Stephenson studies social networks. She goes into a company--her clients include J.P. Morgan, the Los Angeles Police Department, T.R.W., and I.B.M.--and distributes a questionnaire to its employees, asking about which people they have contact with. Whom do you like to spend time with? Whom do you talk to about new ideas? Where do you go to get expert advice? Every name in the company becomes a dot on a graph, and Stephenson draws lines between all those who have regular contact with each other. Stephenson likens her graphs to X-rays, and her role to that of a radiologist. What she's depicting is the firm's invisible inner mechanisms, the relationships and networks and patterns of trust that arise as people work together over time, and that are hidden beneath the organization chart. Once, for example, Stephenson was doing an "X-ray" of a Head Start organization. The agency was mostly female, and when Stephenson analyzed her networks she found that new hires and male staffers were profoundly isolated, communicating with the rest of the organization through only a handful of women. "I looked at tenure in the organization, office ties, demographic data. I couldn't see what tied the women together, and why the men were talking only to these women," Stephenson recalls. "Nor could the president of the organization. She gave me a couple of ideas. She said, 'Sorry I can't figure it out.' Finally, she asked me to read the names again, and I could hear her stop, and she said, 'My God, I know what it is. All those women are smokers.'" The X-ray revealed that the men--locked out of the formal power structure of the organization--were trying to gain access and influence by hanging out in the smoking area with some of the more senior women.
What Stephenson's X-rays do best, though, is tell you who the public characters are. In every network, there are always one or two people who have connections to many more people than anyone else. Stephenson calls them "hubs," and on her charts lines radiate out from them like spokes on a wheel. (Bernie the candy-store owner, on Jacobs's Hudson Street, was a hub.) A few people are also what Stephenson calls "gatekeepers": they control access to critical people, and link together a strategic few disparate groups. Finally, if you analyze the graphs there are always people who seem to have lots of indirect links to other people--who are part of all sorts of networks without necessarily being in the center of them. Stephenson calls those people "pulsetakers." (In Silicon Valleyspeak, the person in a sea of cubicles who pops his or her head up over the partition every time something interesting is going on is called a prairie dog: prairie dogs are pulsetakers.)
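It is possible to see, in miniature, how such an X-ray might be read. Stephenson's actual technique is proprietary; the sketch below maps her vocabulary onto textbook centrality measures, which is my own rough analogy, and uses a standard toy network rather than real survey data:

```python
# A toy "X-ray". Mapping Stephenson's vocabulary onto textbook centrality
# measures--degree for hubs, betweenness for gatekeepers, closeness for
# pulsetakers--is my own rough analogy, not her actual method. The network
# is David Krackhardt's classic "kite" example, in which the three measures
# single out three different people.
import networkx as nx

G = nx.Graph([
    ("Andre", "Beverly"), ("Andre", "Carol"), ("Andre", "Diane"), ("Andre", "Fernando"),
    ("Beverly", "Diane"), ("Beverly", "Ed"), ("Beverly", "Garth"),
    ("Carol", "Diane"), ("Carol", "Fernando"),
    ("Diane", "Ed"), ("Diane", "Fernando"), ("Diane", "Garth"),
    ("Ed", "Garth"), ("Fernando", "Garth"), ("Fernando", "Heather"),
    ("Garth", "Heather"), ("Heather", "Ike"), ("Ike", "Jane"),
])

measures = {
    "hub (most direct contacts)": nx.degree_centrality(G),
    "gatekeeper (bridges otherwise separate groups)": nx.betweenness_centrality(G),
    "pulsetaker (indirectly close to everyone)": nx.closeness_centrality(G),
}
for role, scores in measures.items():
    name = max(scores, key=scores.get)
    print(f"{role}: {name}")
```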
5.
In the past year, Stephenson has embarked on a partnership with Steelcase, the world's largest manufacturer of office furniture, in order to use her techniques in the design of offices. Traditionally, office designers would tell a company what furniture should go where. Stephenson and her partners at Steelcase propose to tell a company what people should go where, too. At Steelcase, they call this "floor-casting."
One of the first projects for the group is the executive level at Steelcase's headquarters, a five-story building in Grand Rapids, Michigan. The executive level, on the fourth floor, is a large, open room filled with small workstations. (Jim Hackett, the head of the company, occupies what Steelcase calls a Personal Harbor, a black, freestanding metal module that may be--at seven feet by eight--the smallest office of a Fortune 500 C.E.O.) One afternoon recently, Stephenson pulled out a laptop and demonstrated how she had mapped the communication networks of the leadership group onto a seating chart of the fourth floor. The dots and swirls are strangely compelling--abstract representations of something real and immediate. One executive, close to Hackett, was inundated with lines from every direction. "He's a hub, a gatekeeper, and a pulsetaker across all sorts of different dimensions," Stephenson said. "What that tells you is that he is very strategic. If there is no succession planning around that person, you have got a huge risk to the knowledge base of this company. If he's in a plane accident, there goes your knowledge." She pointed to another part of the floor plan, with its own thick overlay of lines. "That's sales and marketing. They have a pocket of real innovation here. The guy who runs it is very good, very smart." But then she pointed to the lines connecting that department with other departments. "They're all coming into this one place," she said, and she showed how all the lines coming out of marketing converged on one senior executive. "There's very little path redundancy. In human systems, you need redundancy, you need communication across multiple paths." What concerned Stephenson wasn't just the lack of redundancy but the fact that, in her lingo, many of the paths were "unconfirmed": they went only one way. People in marketing were saying that they communicated with the senior management, but there weren't as many lines going in the other direction. The sales-and-marketing team, she explained, had somehow become isolated from senior management. They couldn't get their voices heard when it came to innovation--and that fact, she said, ought to be a big consideration when it comes time to redo the office. "If you ask the guy who heads sales and marketing who he wants to sit next to, he'll pick out all the people he trusts," she said. "But do you sit him with those people? No. What you want to do is put people who don't trust each other near each other. Not necessarily next to each other, because they get too close. But close enough so that when you pop your head up, you get to see people, they are in your path, and all of a sudden you build an inviting space where they can hang out, kitchens and things like that. Maybe they need to take a hub in an innovation network and place the person with a pulsetaker in an expert network--to get that knowledge indirectly communicated to a lot of people."
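An "unconfirmed" path, in other words, is just an asymmetric link: one side reports the contact and the other does not. A minimal sketch of how such links might be flagged, with invented survey entries:

```python
# Finding "unconfirmed" paths: contact reports are directional, so a link
# counts as confirmed only when both sides name each other. The survey
# entries below are invented for illustration.
reports = {
    "marketing": {"senior_mgmt", "sales"},
    "sales": {"marketing"},
    "senior_mgmt": {"finance"},
    "finance": {"senior_mgmt"},
}

for person, named in reports.items():
    for other in named:
        if person not in reports.get(other, set()):
            print(f"unconfirmed: {person} -> {other}")   # the line goes only one way
```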
The work of translating Stephenson's insights onto a new floor plan is being done in a small conference room--a war room--on the second floor of Steelcase headquarters. The group consists of a few key people from different parts of the firm, such as human resources, design, technology, and space-planning research. The walls of the room are cluttered with diagrams and pictures and calculations and huge, blownup versions of Stephenson's X-rays. Team members stress that what they are doing is experimental. They don't know yet how directly they want to translate findings from the communications networks to office plans. After all, you don't want to have to redo the entire office every time someone leaves or joins the company. But it's clear that there are some very simple principles from the study of public characters which ought to drive the design process. "You want to place hubs at the center," Joyce Bromberg, the director of space planning, says. "These are the ones other people go to in order to get information. Give them an environment that allows access. But there are also going to be times that they need to have control--so give them a place where they can get away. Gatekeepers represent the fit between groups. They transmit ideas. They are brokers, so you might want to put them at the perimeter, and give them front porches"--areas adjoining the workspace where you might put little tables and chairs. "Maybe they could have swinging doors with white boards, to better transmit information. As for pulsetakers, they are the roamers. Rather than give them one fixed work location, you might give them a series of touchdown spots--where you want them to stop and talk. You want to enable their meandering."
One of the other team members was a tall, thoughtful man named Frank Graziano. He had a series of pencil drawings--with circles representing workstations of all the people whose minds, as he put it, he wanted to make "explicit." He said that he had done the plan the night before. "I think we can thread innovation through the floor," he went on, and with a pen drew a red line that wound its way through the maze of desks. It was his Hudson Street.
6.
"The Death and Life of Great American Cities" was a controversial book, largely because there was always a whiff of paternalism in Jacobs's vision of what city life ought to be. Chelsea--the neighborhood directly to the north of her beloved West Village--had "mixtures and types of buildings and densities of dwelling units per acre... almost identical with those of Greenwich Village," she noted. But its long-predicted renaissance would never happen, she maintained, because of the "barriers of long, self-isolating blocks." She hated Chatham Village, a planned "garden city" development in Pittsburgh. It was a picturesque green enclave, but it suffered, in Jacobs's analysis, from a lack of sidewalk life. She wasn't concerned that some people might not want an active street life in their neighborhood; that what she saw as the "self-isolating blocks" of Chelsea others would see as a welcome respite from the bustle of the city, or that Chatham Village would appeal to some people precisely because one did not encounter on its sidewalks a "solid wave of conversation and animation." Jacobs felt that city dwellers belonged in environments like the West Village, whether they realized it or not.
The new workplace designers are making the same calculation, of course. The point of the new offices is to compel us to behave and socialize in ways that we otherwise would not--to overcome our initial inclination to be office suburbanites. But, in all the studies of the new workplaces, the reservations that employees have about a more social environment tend to diminish once they try it. Human behavior, after all, is shaped by context, but how it is shaped--and whether we'll be happy with the result--we can understand only with experience. Jane Jacobs knew the virtues of the West Village because she lived there. What she couldn't know was that her ideas about community would ultimately make more sense in the workplace. From time to time, social critics have bemoaned the falling rates of community participation in American life, but they have made the same mistake. The reason Americans are content to bowl alone (or, for that matter, not bowl at all) is that, increasingly, they receive all the social support they need--all the serendipitous interactions that serve to make them happy and productive--from nine to five.
The Trouble with Fries
March 5, 2001
ANNALS OF EATING
Fast food is killing us. Can it be fixed?
1.
In 1954, a man named Ray Kroc, who made his living selling the five-spindle Multimixer milkshake machine, began hearing about a hamburger stand in San Bernardino, California. This particular restaurant, he was told, had no fewer than eight of his machines in operation, meaning that it could make forty shakes simultaneously. Kroc was astounded. He flew from Chicago to Los Angeles, and drove to San Bernardino, sixty miles away, where he found a small octagonal building on a corner lot. He sat in his car and watched as the workers showed up for the morning shift. They were in starched white shirts and paper hats, and moved with a purposeful discipline. As lunchtime approached, customers began streaming into the parking lot, lining up for bags of hamburgers. Kroc approached a strawberry blonde in a yellow convertible.
"How often do you come here?" he asked.
"Anytime I am in the neighborhood," she replied, and, Kroc would say later, "it was not her sex appeal but the obvious relish with which she devoured the hamburger that made my pulse begin to hammer with excitement." He came back the next morning, and this time set up inside the kitchen, watching the griddle man, the food preparers, and, above all, the French-fry operation, because it was the French fries that truly captured his imagination. They were made from top-quality oblong Idaho russets, eight ounces apiece, deep-fried to a golden brown, and salted with a shaker that, as he put it, kept going like a Salvation Army girl's tambourine. They were crispy on the outside and buttery soft on the inside, and that day Kroc had a vision of a chain of restaurants, just like the one in San Bernardino, selling golden fries from one end of the country to the other. He asked the two brothers who owned the hamburger stand if he could buy their franchise rights. They said yes. Their names were Mac and Dick McDonald.
Ray Kroc was the great visionary of American fast food, the one who brought the lessons of the manufacturing world to the restaurant business. Before the fifties, it was impossible, in most American towns, to buy fries of consistent quality. Ray Kroc was the man who changed that. "The french fry," he once wrote, "would become almost sacrosanct for me, its preparation a ritual to be followed religiously." A potato that has too great a percentage of water--and potatoes, even the standard Idaho russet burbank, vary widely in their water content--will come out soggy at the end of the frying process. It was Kroc, back in the fifties, who sent out field men, armed with hydrometers, to make sure that all his suppliers were producing potatoes in the optimal solids range of twenty to twenty-three per cent. Freshly harvested potatoes, furthermore, are rich in sugars, and if you slice them up and deep-fry them the sugars will caramelize and brown the outside of the fry long before the inside is cooked. To make a crisp French fry, a potato has to be stored at a warm temperature for several weeks in order to convert those sugars to starch. Here Kroc led the way as well, mastering the art of "curing" potatoes by storing them under a giant fan in the basement of his first restaurant, outside Chicago.
Perhaps his most enduring achievement, though, was the so-called potato computer--developed for McDonald's by a former electrical engineer for Motorola named Louis Martino--which precisely calibrated the optimal cooking time for a batch of fries. (The key: when a batch of cold raw potatoes is dumped into a vat of cooking oil, the temperature of the fat will drop and then slowly rise. Once the oil has risen three degrees, the fries are ready.) Previously, making high-quality French fries had been an art. The potato computer, the hydrometer, and the curing bins made it a science. By the time Kroc was finished, he had figured out how to turn potatoes into an inexpensive snack that would always be hot, salty, flavorful, and crisp, no matter where or when you bought it.
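The rule itself is simple enough to sketch in a few lines. This is only a paraphrase of the passage--the temperature readings are invented, and Martino's device was analog hardware, not software:

```python
# A sketch of the "potato computer" rule as the passage describes it: when
# cold potatoes hit the oil the temperature dips, then recovers; once it has
# climbed three degrees above its low point, the batch is done.
def fries_ready(temps, rise=3.0):
    """temps: oil-temperature readings taken after the potatoes go in."""
    low = temps[0]
    for t in temps:
        low = min(low, t)
        if t - low >= rise:
            return True
    return False

readings = [325, 318, 311, 309, 308, 309, 310, 311, 312]  # degrees F, invented
print(fries_ready(readings))  # True: the oil climbed three degrees above its 308 low
```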
This was the first fast-food revolution--the mass production of food that had reliable mass appeal. But today, as the McDonald's franchise approaches its fiftieth anniversary, it is clear that fast food needs a second revolution. As many Americans now die every year from obesity-related illnesses--heart disease and complications of diabetes--as from smoking, and the fast-food toll grows heavier every year. In the fine new book "Fast Food Nation," the journalist Eric Schlosser writes of McDonald's and Burger King in the tone usually reserved for chemical companies, sweatshops, and arms dealers, and, as shocking as that seems at first, it is perfectly appropriate. Ray Kroc's French fries are killing us. Can fast food be fixed?
2.
Fast-food French fries are made from a baking potato like an Idaho russet, or any other variety that is mealy, or starchy, rather than waxy. The potatoes are harvested, cured, washed, peeled, sliced, and then blanched--cooked enough so that the insides have a fluffy texture but not so much that the fry gets soft and breaks. Blanching is followed by drying, and drying by a thirty-second deep fry, to give the potatoes a crisp shell. Then the fries are frozen until the moment of service, when they are deep-fried again, this time for somewhere around three minutes. Depending on the fast-food chain involved, there are other steps interspersed in this process. McDonald's fries, for example, are briefly dipped in a sugar solution, which gives them their golden-brown color; Burger King fries are dipped in a starch batter, which is what gives those fries their distinctive hard shell and audible crunch. But the result is similar. The potato that is first harvested in the field is roughly eighty per cent water. The process of creating a French fry consists, essentially, of removing as much of that water as possible--through blanching, drying, and deep-frying--and replacing it with fat.
Elisabeth Rozin, in her book "The Primal Cheeseburger," points out that the idea of enriching carbohydrates with fat is nothing new. It's a standard part of the cuisine of almost every culture. Bread is buttered; macaroni comes with cheese; dumplings are fried; potatoes are scalloped, baked with milk and cheese, cooked in the dripping of roasting meat, mixed with mayonnaise in a salad, or pan-fried in butterfat as latkes. But, as Rozin argues, deep-frying is in many ways the ideal method of adding fat to carbohydrates. If you put butter on a mashed potato, for instance, the result is texturally unexciting: it simply creates a mush. Pan-frying results in uneven browning and crispness. But when a potato is deep-fried the heat of the oil turns the water inside the potato into steam, which causes the hard granules of starch inside the potato to swell and soften: that's why the inside of the fry is fluffy and light. At the same time, the outward migration of the steam limits the amount of oil that seeps into the interior, preventing the fry from getting greasy and concentrating the oil on the surface, where it turns the outer layer of the potato brown and crisp. "What we have with the french fry," Rozin writes, "is a near perfect enactment of the enriching of a starch food with oil or fat."
This is the trouble with the French fry. The fact that it is cooked in fat makes it unhealthy. But the contrast that deep-frying creates between its interior and its exterior--between the golden shell and the pillowy whiteness beneath--is what makes it so irresistible. The average American now eats a staggering thirty pounds of French fries a year, up from four pounds when Ray Kroc was first figuring out how to mass-produce a crisp fry. Meanwhile, fries themselves have become less healthful. Ray Kroc, in the early days of McDonald's, was a fan of a hot-dog stand on the North Side of Chicago called Sam's, which used what was then called the Chicago method of cooking fries. Sam's cooked its fries in animal fat, and Kroc followed suit, prescribing for his franchises a specially formulated beef tallow called Formula 47 (in reference to the forty-seven-cent McDonald's "All-American meal" of the era: fifteen-cent hamburger, twelve-cent fries, twenty-cent shake). Among aficionados, there is general agreement that those early McDonald's fries were the finest mass-market fries ever made: the beef tallow gave them an unsurpassed rich, buttery taste. But in 1990, in the face of public concern about the health risks of cholesterol in animal-based cooking oil, McDonald's and the other major fast-food houses switched to vegetable oil. That wasn't an improvement, however. In the course of making vegetable oil suitable for deep frying, it is subjected to a chemical process called hydrogenation, which creates a new substance called a trans unsaturated fat. In the hierarchy of fats, polyunsaturated fats--the kind found in regular vegetable oils--are the good kind; they lower your cholesterol. Saturated fats are the bad kind. But trans fats are worse: they wreak havoc with the body's ability to regulate cholesterol.
According to a recent study involving some eighty thousand women, for every five-per-cent increase in the amount of saturated fats that a woman consumes, her risk of heart disease increases by seventeen per cent. But only a two-per-cent increase in trans fats will increase her heart-disease risk by ninety-three per cent. Walter Willett, an epidemiologist at Harvard--who helped design the study--estimates that the consumption of trans fats in the United States probably causes about thirty thousand premature deaths a year.
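To see how lopsided those two figures are, it helps to put them on the same scale. The short calculation below is only an illustration of the numbers quoted from the Willett study, naively assuming the risk scales per percentage point of fat; the study itself makes no such linear claim.

    # Risk increases quoted above, expressed per percentage point of dietary change
    saturated_fat_risk = 0.17 / 5    # 17% more heart disease per 5% more saturated fat
    trans_fat_risk = 0.93 / 2        # 93% more heart disease per 2% more trans fat

    # How much more dangerous a point of trans fat is than a point of saturated fat,
    # on this crude per-point reading
    print(round(trans_fat_risk / saturated_fat_risk, 1))    # -> 13.7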
McDonald's and the other fast-food houses aren't the only purveyors of trans fats, of course; trans fats are in crackers and potato chips and cookies and any number of other processed foods. Still, a lot of us get a great deal of our trans fats from French fries, and to read the medical evidence on trans fats is to wonder at the odd selectivity of the outrage that consumers and the legal profession direct at corporate behavior. McDonald's and Burger King and Wendy's have switched to a product, without disclosing its risks, that may cost human lives. What is the difference between this and the kind of thing over which consumers sue companies every day?
3.
The French-fry problem ought to have a simple solution: cook fries in oil that isn't so dangerous. Oils that are rich in monounsaturated fats, like canola oil, aren't nearly as bad for you as saturated fats, and are generally stable enough for deep-frying. It's also possible to "fix" animal fats so that they aren't so problematic. For example, K. C. Hayes, a nutritionist at Brandeis University, has helped develop an oil called Appetize. It's largely beef tallow, which gives it a big taste advantage over vegetable shortening, and makes it stable enough for deep-frying. But it has been processed to remove the cholesterol, and has been blended with pure corn oil, in a combination that Hayes says removes much of the heart-disease risk.
Perhaps the most elegant solution would be for McDonald's and the other chains to cook their fries in something like Olestra, a fat substitute developed by Procter & Gamble. Ordinary fats are built out of a molecular structure known as a triglyceride: it's a microscopic tree, with a trunk made of glycerol and three branches made of fatty acids. Our bodies can't absorb triglycerides, so in the digestive process each of the branches is broken off by enzymes and absorbed separately. In the production of Olestra, the glycerol trunk of a fat is replaced with a sugar, which has room for not three but eight fatty acids. And our enzymes are unable to break down a fat tree with eight branches--so the Olestra molecule can't be absorbed by the body at all. "Olestra" is as much a process as a compound: you can create an "Olestra" version of any given fat. Potato chips, for instance, tend to be fried in cottonseed oil, because of its distinctively clean taste. Frito-Lay's no-fat Wow! chips are made with an Olestra version of cottonseed oil, which behaves just like regular cottonseed oil except that it's never digested. A regular serving of potato chips has a hundred and fifty calories, ninety of which are fat calories from the cooking oil. A serving of Wow! chips has seventy-five calories and no fat. If Procter & Gamble were to seek F.D.A. approval for the use of Olestra in commercial deep-frying (which it has not yet done), it could make an Olestra version of the old McDonald's Formula 47, which would deliver every nuance of the old buttery, meaty tallow at a fraction of the calories.
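The calorie arithmetic behind the Wow! chips is worth spelling out. A small sketch, using only the serving figures given above (the variable names are ours):

    regular_serving_calories = 150
    regular_fat_calories = 90      # calories contributed by the cooking oil
    wow_serving_calories = 75      # the Olestra version: the oil is never digested

    print(regular_fat_calories / regular_serving_calories)    # -> 0.6: fat is 60% of a regular serving
    print(regular_serving_calories - wow_serving_calories)    # -> 75 calories saved per serving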
Olestra, it must be said, does have some drawbacks--in particular, a reputation for what is delicately called "gastrointestinal distress." The F.D.A. has required all Olestra products to carry a somewhat daunting label saying that they may cause "cramping and loose stools." Not surprisingly, sales have been disappointing, and Olestra has never won the full acceptance of the nutrition community. Most of this concern, however, appears to be overstated. Procter & Gamble has done randomized, double-blind studies--one of which involved more than three thousand people over six weeks--and found that people eating typical amounts of Olestra-based chips don't have significantly more gastrointestinal problems than people eating normal chips. Diarrhea is such a common problem in America--nearly a third of adults have at least one episode each month--that even F.D.A. regulators now appear to be convinced that in many of the complaints they received Olestra was unfairly blamed for a problem that was probably caused by something else. The agency has promised Procter & Gamble that the warning label will be reviewed.
Perhaps the best way to put the Olestra controversy into perspective is to compare it to fibre. Fibre is vegetable matter that goes right through you: it's not absorbed by the gastrointestinal tract. Nutritionists tell us to eat it because it helps us lose weight and it lowers cholesterol--even though if you eat too many baked beans or too many bowls of oat bran you will suffer the consequences. Do we put warning labels on boxes of oat bran? No, because the benefits of fibre clearly outweigh its drawbacks. Research has suggested that Olestra, like fibre, helps people lose weight and lowers cholesterol; too much Olestra, like too much fibre, may cause problems. (Actually, too much Olestra may not be as troublesome as too much bran. According to Procter & Gamble, eating a large amount of Olestra--forty grams--causes no more problems than eating a small bowl--twenty grams--of wheat bran.) If we had Olestra fries, then, they shouldn't be eaten for breakfast, lunch, and dinner. In fact, fast-food houses probably shouldn't use hundred-per-cent Olestra; they should cook their fries in a blend, using the Olestra to displace the most dangerous trans and saturated fats. But these are minor details. The point is that it is entirely possible, right now, to make a delicious French fry that does not carry with it a death sentence. A French fry can be much more than a delivery vehicle for fat.
4.
Is it really that simple, though? Consider the cautionary tale of the efforts of a group of food scientists at Auburn University, in Alabama, more than a decade ago to come up with a better hamburger. The Auburn team wanted to create a leaner beef that tasted as good as regular ground beef. They couldn't just remove the fat, because that would leave the meat dry and mealy. They wanted to replace the fat. "If you look at ground beef, it contains moisture, fat, and protein," says Dale Huffman, one of the scientists who spearheaded the Auburn project. "Protein is relatively constant in all beef, at about twenty per cent. The traditional McDonald's ground beef is around twenty per cent fat. The remainder is water. So you have an inverse ratio of water and fat. If you reduce fat, you need to increase water." The goal of the Auburn scientists was to cut about two-thirds of the fat from normal ground beef, which meant that they needed to find something to add to the beef that would hold an equivalent amount of water--and continue to retain that water even as the beef was being grilled. Their choice? Seaweed, or, more precisely, carrageenan. "It's been in use for centuries," Huffman explains. "It's the stuff that keeps the suspension in chocolate milk--otherwise the chocolate would settle at the bottom. It has tremendous water-holding ability. There's a loose bond between the carrageenan and the moisture." They also selected some basic flavor enhancers, designed to make up for the lost fat "taste." The result was a beef patty that was roughly three-quarters water, twenty per cent protein, five per cent or so fat, and a quarter of a per cent seaweed. They called it AU Lean.
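Huffman's arithmetic can be made explicit. The sketch below simply restates the composition shift he describes, with the figures taken from the paragraph above; it is an illustration, not the Auburn team's actual formulation work.

    # Approximate composition of regular fast-food ground beef, per cent by weight
    regular = {"protein": 20, "fat": 20, "water": 60}

    # AU Lean: cut fat to about five per cent, add a quarter of a per cent of
    # carrageenan, and let water make up the rest
    au_lean = {"protein": 20, "fat": 5, "carrageenan": 0.25}
    au_lean["water"] = 100 - sum(au_lean.values())

    print(au_lean["water"])                   # -> 74.75: roughly three-quarters water
    print(regular["fat"] - au_lean["fat"])    # -> 15 percentage points of fat replaced by water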
It didn't take the Auburn scientists long to realize that they had created something special. They installed a test kitchen in their laboratory, got hold of a McDonald's grill, and began doing blind taste comparisons of AU Lean burgers and traditional twenty-per-cent-fat burgers. Time after time, the AU Lean burgers won. Next, they took their invention into the field. They recruited a hundred families and supplied them with three kinds of ground beef for home cooking over consecutive three-week intervals--regular "market" ground beef with twenty per cent fat, ground beef with five per cent fat, and AU Lean. The families were asked to rate the different kinds of beef, without knowing which was which. Again, the AU Lean won hands down--trumping the other two on "likability," "tenderness," "flavorfulness," and "juiciness."
What the Auburn team showed was that, even though people love the taste and feel of fat--and naturally gravitate toward high-fat food--they can be fooled into thinking that there is a lot of fat in something when there isn't. Adam Drewnowski, a nutritionist at the University of Washington, has found a similar effect with cookies. He did blind taste tests of normal and reduced-calorie brownies, biscotti, and chocolate-chip, oatmeal, and peanut-butter cookies. If you cut the sugar content of any of those cookies by twenty-five per cent, he found, people like the cookies much less. But if you cut the fat by twenty-five per cent they barely notice. "People are very finely attuned to how much sugar there is in a liquid or a solid," Drewnowski says. "For fat, there's no sensory break point. Fat comes in so many guises and so many textures it is very difficult to perceive how much is there." This doesn't mean we are oblivious of fat levels, of course. Huffman says that when his group tried to lower the fat in AU Lean below five per cent, people didn't like it anymore. But, within the relatively broad range of between five and twenty-five per cent, you can add water and some flavoring and most people can't tell the difference.
What's more, people appear to be more sensitive to the volume of food they consume than to its calorie content. Barbara Rolls, a nutritionist at Penn State, has demonstrated this principle with satiety studies. She feeds one group of people a high-volume snack and another group a low-volume snack. Even though the two snacks have the same calorie count, she finds that people who eat the high-volume snack feel more satisfied. "People tend to eat a constant weight or volume of food in a given day, not a constant portion of calories," she says. Eating AU Lean, in short, isn't going to leave you with a craving for more calories; you'll feel just as full.
For anyone looking to improve the quality of fast food, all this is heartening news. It means that you should be able to put low-fat cheese and low-fat mayonnaise in a Big Mac without anyone's complaining. It also means that there's no particular reason to use twenty-per-cent-fat ground beef in a fast-food burger. In 1990, using just this argument, the Auburn team suggested to McDonald's that it make a Big Mac out of AU Lean. Shortly thereafter, McDonald's came out with the McLean Deluxe. Other fast-food houses scrambled to follow suit. Nutritionists were delighted. And fast food appeared on the verge of a revolution.
Only, it wasn't. The McLean was a flop, and four years later it was off the market. What happened? Part of the problem appears to have been that McDonald's rushed the burger to market before many of the production kinks had been worked out. More important, though, was the psychological handicap the burger faced. People liked AU Lean in blind taste tests because they didn't know it was AU Lean; they were fooled into thinking it was regular ground beef. But nobody was fooled when it came to the McLean Deluxe. It was sold as the healthy choice--and who goes to McDonald's for health food?
Leann Birch, a developmental psychologist at Penn State, has looked at the impact of these sorts of expectations on children. In one experiment, she took a large group of kids and fed them a big lunch. Then she turned them loose in a room with lots of junk food. "What we see is that some kids eat almost nothing," she says. "But other kids really chow down, and one of the things that predicts how much they eat is the extent to which parents have restricted their access to high-fat, high-sugar food in the past: the more the kids have been restricted, the more they eat." Birch explains the results two ways. First, restricting food makes kids think not in terms of their own hunger but in terms of the presence and absence of food. As she puts it, "The kid is essentially saying, 'If the food's here I better get it while I can, whether or not I'm hungry.' We see these five-year-old kids eating as much as four hundred calories." Birch's second finding, though, is more important. Because the children on restricted diets had been told that junk food was bad for them, they clearly thought that it had to taste good. When it comes to junk food, we seem to follow an implicit script that powerfully biases the way we feel about food. We like fries not in spite of the fact that they're unhealthy but because of it.
That is sobering news for those interested in improving the American diet. For years, the nutrition movement in this country has made transparency one of its principal goals: it has assumed that the best way to help people improve their diets is to tell them precisely what's in their food, to label certain foods good and certain foods bad. But transparency can backfire, because sometimes nothing is more deadly for our taste buds than the knowledge that what we are eating is good for us. McDonald's should never have called its new offering the McLean Deluxe, in other words. They should have called it the Burger Supreme or the Monster Burger, and then buried the news about reduced calories and fat in the tiniest type on the remotest corner of their Web site. And if we were to cook fries in some high-tech, healthful cooking oil--whether Olestrized beef tallow or something else with a minimum of trans and saturated fats--the worst thing we could do would be to market them as healthy fries. They will not taste nearly as good if we do. They have to be marketed as better fries, as Classic Fries, as fries that bring back the rich tallowy taste of the original McDonald's.
What, after all, was Ray Kroc's biggest triumph? A case could be made for the field men with their hydrometers, or the potato-curing techniques, or the potato computer, which turned the making of French fries from an art into a science. But we should not forget Ronald McDonald, the clown who made the McDonald's name irresistible to legions of small children. Kroc understood that taste comprises not merely the food on our plate but also the associations and assumptions and prejudices we bring to the table--that half the battle in making kids happy with their meal was calling what they were eating a Happy Meal. The marketing of healthful fast food will require the same degree of subtlety and sophistication. The nutrition movement keeps looking for a crusader--someone who will bring about better public education and tougher government regulations. But we need much more than that. We need another Ray Kroc.
Wrong Turn
June 11, 2001
A REPORTER AT LARGE
How the fight to make America's
highways safer went off course.
I. BANG
Every two miles, the average driver makes four hundred observations, forty decisions, and one mistake. Once every five hundred miles, one of those mistakes leads to a near collision, and once every sixty-one thousand miles one of those mistakes leads to a crash. When people drive, in other words, mistakes are endemic and accidents inevitable, and that is the first and simplest explanation for what happened to Robert Day on the morning of Saturday, April 9, 1994. He was driving a 1980 Jeep Wagoneer from his home, outside Philadelphia, to spend a day working on train engines in Winslow Township, New Jersey. He was forty-four years old, and made his living as an editor for the Chilton Book Company. His ten-year-old son was next to him, in the passenger seat. It was a bright, beautiful spring day. Visibility was perfect, and the roadway was dry, although one of the many peculiarities of car crashes is that they happen more often under ideal road conditions than in bad weather. Day's route took him down the Atlantic City Expressway to Fleming Pike, a two-lane country road that winds around a sharp curve and intersects, about a mile later, with Egg Harbor Road. In that final stretch of Fleming Pike, there is a scattering of houses and a fairly thick stand of trees on either side of the road, obscuring all sight lines to the left and right. As he approached the intersection, then, Day could not have seen a blue-and-gray 1993 Ford Aerostar minivan travelling between forty and fifty miles per hour southbound on Egg Harbor, nor a white 1984 Mazda 626 travelling at approximately fifty miles per hour in the other direction. Nor, apparently, did he see the stop sign at the corner, or the sign a tenth of a mile before that, warning of the intersection ahead. Day's son, in the confusing aftermath of the accident, told police that he was certain his father had come to a stop at the corner. But the accident's principal witness says he never saw any brake lights on the Wagoneer, and, besides, there is no way that the Jeep could have done the damage that it did from a standing start. Perhaps Day was distracted. The witness says that Day's turn signal had been on since he left the expressway. Perhaps he was looking away and looked back at the road at the wrong time, since there is an area, a few hundred yards before Egg Harbor Road, just on the near side of a little ridge, where the trees and houses make it look as if Fleming Pike ran without interruption well off into the distance. We will never know, and in any case it does not matter much. Day merely did what all of us do every time we get in a car: he made a mistake. It's just that he was unlucky enough that his mistake led him directly into the path of two other cars.
The driver of the Ford Aerostar was Stephen Capoferri, then thirty-nine. He worked in the warehouse of Whitehall Laboratories, in southern New Jersey. He had just had breakfast with his parents and was on his way to the bank. The driver of the Mazda was Elizabeth Wolfrum. She was twenty-four. She worked as the manager of a liquor store. Her eighteen-year-old sister, Julie, was in the passenger seat; a two-year-old girl was in the back seat. Because of the vegetation on either side of Fleming Pike, Capoferri did not see Day's vehicle until it was just eighty-five feet from the point of impact, and if we assume that Day was travelling at forty miles per hour, or fifty-nine feet per second, that means that Capoferri had about 1.5 seconds to react. That is scarcely enough time. The average adult needs about that long simply to translate an observation ("That car is going awfully fast") into an action ("I ought to hit my brake"). Capoferri hit Day broadside, at a slight angle, the right passenger side of the Aerostar taking most of the impact. The Jeep was pushed sidewise, but it kept going forward, pulling off the grille and hood of the Aerostar, and sending it into a two-hundred-and-seventy-degree counterclockwise spin. As the Jeep lurched across the intersection, it slammed into the side of Wolfrum's Mazda. The cars slapped together, and then skidded together across the intersection, ending on the grass on the far, southeastern corner. According to documents filed by Elizabeth Wolfrum's lawyers, Wolfrum suffered eighteen injuries, including a ruptured spleen, multiple liver lacerations, brain damage, and fractures to the legs, ribs, ankles, and nose. Julie Wolfrum was partially ejected from the Mazda and her face hit the ground. She subsequently underwent seventeen separate surgical procedures and remained in intensive care for forty-four days. In post-crash photographs, their car looks as if it had been dropped head first from an airplane. Robert Day suffered massive internal injuries and was pronounced dead two hours later, at West Jersey Hospital. His son was bruised and shaken up. Capoferri walked away largely unscathed.
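The 1.5-second figure follows directly from the distances and speeds just quoted. A minimal sketch of that arithmetic, using nothing but the numbers in the paragraph above:

    FEET_PER_MILE = 5280
    SECONDS_PER_HOUR = 3600

    day_speed_mph = 40          # Day's assumed speed
    sight_distance_feet = 85    # how far the Jeep was from the point of impact when Capoferri saw it

    day_speed_fps = day_speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
    time_to_impact = sight_distance_feet / day_speed_fps

    print(round(day_speed_fps, 1))      # -> 58.7 feet per second, the "fifty-nine" in the text
    print(round(time_to_impact, 2))     # -> 1.45 seconds, roughly an adult's reaction time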
"Once the impact occurred, I did a spin," he remembers. "I don't recall doing that. I may have blacked out. It couldn't have been for very long. I wanted to get out. I was trying to judge how I was. I was having a little trouble breathing. But I knew I could walk. My senses were gradually coming back to normal. I'm pretty sure I went to Day's vehicle first. I went to the driver's side. He was semi-conscious. He had blood coming out of his mouth. I tried to keep him awake. His son was in the passenger seat. He had no injuries. He said, 'Is my father O.K.?' I seem to remember looking in the Mazda. My first impression was that they were dead, because the driver's side of the vehicle was very badly smashed in. I think they needed the 'jaws of life' to get them out. There was a little girl in the back. She was crying."
Capoferri has long black hair and a beard and the build of a wrestler. He is a thoughtful man who chooses his words carefully. As he talked, he was driving his Taurus back toward the scene of the accident, and he was apologetic that he could not recall more details of those moments leading up to the accident. But what is there to remember? In the popular imagination--fuelled by the car crashes of Hollywood movies, with their special effects and complicated stunts--an accident is a protracted sequence, played out in slow motion, over many frames. It is not that way in real life. The time that elapsed between the collision of Capoferri and Day and Day and Wolfrum was probably no more than twenty-five milliseconds, faster than the blinking of an eye, and the time that elapsed between the moment Capoferri struck Day and the moment his van came to a rest, two hundred and seventy degrees later, was probably no more than a second. Capoferri said that a friend of his, who lived right on the corner where the accident happened, told him later that all the crashing and spinning and skidding sounded like a single, sharp explosion--bang!
II. THE PASSIVE APPROACH
In the middle part of the last century, a man named William Haddon changed forever the way Americans think about car accidents. Haddon was, by training, a medical doctor and an epidemiologist and, by temperament, a New Englander--tall and reed-thin, with a crewcut, a starched white shirt, and a bow tie. He was exacting and cerebral, and so sensitive to criticism that it was said of him that he could be "blistered by moonbeams." He would not eat mayonnaise, or anything else subject to bacterial contamination. He hated lawyers, which was ironic, because it was lawyers who became his biggest disciples. Haddon was discovered by Daniel Patrick Moynihan, when Moynihan was working for Averell Harriman, then the Democratic governor of New York State. It was 1958. Moynihan was chairing a meeting on traffic safety, in Albany's old state-executive-office chambers, and a young man at the back of the room kept asking pointed questions. "What's your name?" Moynihan eventually asked, certain he had collared a Republican spy. "Haddon, sir," the young man answered. He was just out of the Harvard School of Public Health, and convinced that what the field of traffic safety needed was the rigor of epidemiology. Haddon asked Moynihan what data he was using. Moynihan shrugged. He wasn't using any data at all.
Haddon and Moynihan went across the street to Yezzi's, a local watering hole, and Moynihan fell under Haddon's spell. The orthodoxy of that time held that safety was about reducing accidents--educating drivers, training them, making them slow down. To Haddon, this approach made no sense. His goal was to reduce the injuries that accidents caused. In particular, he did not believe in safety measures that depended on changing the behavior of the driver, since he considered the driver unreliable, hard to educate, and prone to error. Haddon believed the best safety measures were passive. "He was a gentle man," Moynihan recalls. "Quiet, without being mum. He never forgot that what we were talking about were children with their heads smashed and broken bodies and dead people."
Several years later, Moynihan was working for President Johnson in the Department of Labor, and hired a young lawyer out of Harvard named Ralph Nader to work on traffic-safety issues. Nader, too, was a devotee of Haddon's ideas, and he converted a young congressional aide named Joan Claybrook. In 1959, Moynihan wrote an enormously influential article, articulating Haddon's principles, called "Epidemic on the Highways." In 1965, Nader wrote his own homage to the Haddon philosophy, "Unsafe at Any Speed," which became a best-seller, and in 1966 the Haddon crusade swept Washington. In the House and the Senate, there were packed hearings on legislation to create a federal regulatory agency for traffic safety. Moynihan and Haddon testified, as did a liability lawyer from South Carolina, in white shoes and a white suit, and a Teamsters official, Jimmy Hoffa, whom Claybrook remembers as a "fabulous" witness. It used to be that, during a frontal crash, steering columns in cars were pushed back through the passenger compartment, potentially impaling the driver. The advocates argued that columns should collapse inward on impact. Instrument panels ought to be padded, they said, and knobs shouldn't stick out, where they might cause injury. Doors ought to have strengthened side-impact beams. Roofs should be strong enough to withstand a rollover. Seats should have head restraints to protect against neck injuries. Windshields ought to be glazed, so that if you hit them with your head at high speed your face wasn't cut to ribbons. The bill sailed through both houses of Congress, and a regulatory body, which eventually became the National Highway Traffic Safety Administration, was established. Haddon was made its commissioner, Claybrook his special assistant. "I remember a Senate hearing we had with Warren Magnuson," Nader recalls. "He was listening to a pediatrician who was one of our allies, Seymour Charles, from New Jersey, and Charles was showing how there were two cars that collided, and one had a collapsible steering column and one didn't, and one driver walked away, the other was killed. And, just like that, Magnuson caught on. 'You mean,' he said, 'you can have had a crash without an injury?' That's it! A crash without an injury. That idea was very powerful."
There is no question that the improvements in auto design which Haddon and his disciples pushed for saved countless lives. They changed the way cars were built, and put safety on the national agenda. What they did not do, however, is make American highways the safest in the world. In fact--and this is the puzzling thing about the Haddon crusade--the opposite happened. United States auto-fatality rates were the lowest in the world before Haddon came along. But, since the late nineteen-seventies, just as the original set of N.H.T.S.A. safety standards were having their biggest impact, America's safety record has fallen to eleventh place. According to calculations by Leonard Evans, a longtime General Motors researcher and one of the world's leading experts on traffic safety, if American traffic fatalities had declined at the same rate as Canada's or Australia's between 1979 and 1997, there would have been somewhere in the vicinity of a hundred and sixty thousand fewer traffic deaths in that span.
This is not to suggest, of course, that Haddon's crusade is responsible for a hundred and sixty thousand highway deaths. Traffic safety is the most complex of phenomena--fatality rates can be measured in many ways, and reflect a hundred different variables--and in this period there were numerous factors that distinguished the United States from places like Canada and Australia, including different trends in drunk driving. Nor is it to say that the Haddonites had anything but the highest motives. Still, Evans's figures raise a number of troubling questions. Haddon and Nader and Claybrook told us, after all, that the best way to combat the epidemic on the highways was to shift attention from the driver to the vehicle. No other country pursued the passive strategy as vigorously, and no other country had such high expectations for its success. But America's slipping record on auto safety suggests that somewhere in the logic of that approach there was a mistake. And, if so, it necessarily changes the way we think about car crashes like the one that happened seven years ago on the corner of Fleming Pike and Egg Harbor Road.
"I think that the philosophical argument behind the passive approach is a strong one," Evans says. A physicist by training, he is a compact, spry man in his sixties, with a trace in his voice of his native Northern Ireland. On the walls of his office in suburban Detroit is a lifetime of awards and certifications from safety researchers, but, like many technical types, he is embittered by how hard it has been to make his voice heard in the safety debates of the past thirty years. "Either you can persuade people to boil their own water because there is a typhoid epidemic or you can put chlorine in the water," he went on. "And the second, passive solution is obviously preferred to the first, because there is no way you can persuade everyone to act in a prudent way. But starting from that philosophical principle and then ignoring reality is a recipe for disaster. And that's what happened. Why?" Here Evans nearly leaped out of his chair. "Because there isn't any chlorine for traffic crashes."
III. THE FIRST COLLISION
Robert Day's crash was not the accident of a young man. He was hit from the side, and adolescents and young adults usually have side-impact crashes when their cars slide off the road into a fixed object like a tree, often at reckless speeds. Older people tend to have side-impact crashes at normal speeds, in intersections, and as the result of error, not negligence. In fact, Day's crash was not merely typical in form; it was the result of a common type of driver error. He didn't see something he was supposed to see.
His mistake is, on one level, difficult to understand. There was a sign, clearly visible from the roadway, telling him of an intersection ahead, and then another, in bright red, telling him to stop. How could he have missed them both? From what we know of human perception, though, this kind of mistake happens all the time. Imagine, for instance, that you were asked to look at the shape of a cross, briefly displayed on a computer screen, and report on which arm of the cross was longer. After you did this a few times, another object, like a word or a small colored square--what psychologists call a critical stimulus--flashes next to the cross on the screen, right in front of your eyes. Would you see the critical stimulus? Most of us would say yes. Intuitively, we believe that we "see" everything in our field of vision--particularly things right in front of us--and that the difference between the things we pay attention to and the things we don't is simply that the things we focus on are the things we become aware of. But when experiments to test this assumption were conducted recently by Arien Mack, a psychologist at the New School, in New York, she found, to her surprise, that a significant portion of her observers didn't see the second object at all: it was directly in their field of vision, and yet, because their attention was focussed on the cross, they were oblivious of it. Mack calls this phenomenon "inattentional blindness."
Daniel Simons, a professor of psychology at Harvard, has done a more dramatic set of experiments, following on the same idea. He and a colleague, Christopher Chabris, recently made a video of two teams of basketball players, one team in white shirts and the other in black, each player in constant motion as two basketballs are passed back and forth. Observers were asked to count the number of passes completed by the members of the white team. After about forty-five seconds of passes, a woman in a gorilla suit walks into the middle of the group, stands in front of the camera, beats her chest vigorously, and then walks away. "Fifty per cent of the people missed the gorilla," Simons says. "We got the most striking reactions. We'd ask people, 'Did you see anyone walking across the screen?' They'd say no. Anything at all? No. Eventually, we'd ask them, 'Did you notice the gorilla?' And they'd say, 'The what?'" Simons's experiment is one of those psychological studies which are impossible to believe in the abstract: if you look at the video (called "Gorillas in Our Midst") when you know what's coming, the woman in the gorilla suit is inescapable. How could anyone miss that? But people do. In recent years, there has been much scientific research on the fallibility of memory--on the fact that eyewitnesses, for example, often distort or omit critical details when they recall what they saw. But the new research points to something that is even more troubling: it isn't just that our memory of what we see is selective; it's that seeing itself is selective.
This is a common problem in driving. Talking on a cell phone and trying to drive, for instance, is not unlike trying to count passes in a basketball game and simultaneously keep track of wandering animals. "When you get into a phone conversation, it's different from the normal way we have evolved to interact," David Strayer, a professor of psychology at the University of Utah, says. "Normally, conversation is face to face. There are all kinds of cues. But when you are on the phone you strip that away. It's virtual reality. You attend to that virtual reality, and shut down processing of the here and now." Strayer has done tests of people who were driving and talking on phones, and found that they remember far fewer things than those driving without phones. Their field of view shrinks. In one experiment, he flashed red and green lights at people while they were driving, and those on the phone missed twice as many lights as the others, and responded far more slowly to those lights they did see. "We tend to find the biggest deficits in unexpected events, a child darting onto the road, a light changing," Strayer says. "Someone going into your lane. That's what you don't see. There is a part of driving that is automatic and routine. There is a second part of driving that is completely unpredictable, and that is the part that requires attention." This is what Simons found with his gorilla, and it is the scariest part of inattentional blindness. People allow themselves to be distracted while driving because they think that they will still be able to pay attention to anomalies. But it is precisely those anomalous things, those deviations from the expected script, which they won't see.
Marc Green, a psychologist with an accident-consulting firm in Toronto, once worked on a case where a woman hit a bicyclist with her car. "She was pulling into a gas station," Green says. "It was five o'clock in the morning. She'd done that almost every day for a year. She looks to the left, and then she hears a thud. There's a bicyclist on the ground. She'd looked down that sidewalk nearly every day for a year and never seen anybody. She adaptively learned to ignore what was on that sidewalk because it was useless information. She may actually have turned her eyes toward him and failed to see him." Green says that, once you understand why the woman failed to see the bicyclist, the crash comes to seem almost inevitable.
It's the same conclusion that Haddon reached, and that formed the basis for his conviction that Americans were spending too much time worrying about what happened before an accident and not enough time worrying about what happened during and after an accident. Sometimes crashes happen because people do stupid things that they shouldn't have done--like drink or speed or talk on their cell phone. But sometimes people do stupid things that they cannot help, and it makes no sense to construct a safety program that does not recognize human fallibility. Just imagine, for example, that you're driving down a country road. The radio is playing. You're talking to your son, next to you. There is a highway crossing up ahead, but you can't see it, nor can you see any cars on the roadway, because of a stand of trees on both sides of the road. Maybe you look away from the road, for a moment, to change the dial on the radio, or something catches your eye outside, and when you glance back it happens to be at the very moment when a trick of geography makes it look as if your road stretched without interruption well off into the distance. Suddenly, up ahead, right in front of your eyes looms a bright-red anomalous stop sign--as out of place in the momentary mental universe that you have constructed for yourself as a gorilla in a basketball game--and, precisely because it is so anomalous, it doesn't register. Then--bang! How do you prevent an accident like that?
IV. THE SECOND COLLISION
One day in 1968, a group of engineers from the Cleveland-based auto-parts manufacturer Eaton, Yale & Towne went to Washington, D.C., to see William Haddon. They carried with them a secret prototype of what they called the People Saver. It was a nylon air cushion that inflated on impact, and the instant Haddon saw it he was smitten. "Oh, he was ecstatic, just ecstatic," Claybrook recalls. "I think it was one of the most exciting moments of his life."
The air bag had been invented in the early fifties by a man named John Hetrick, who became convinced, after running his car into a ditch, that drivers and passengers would be much safer if they could be protected by some kind of air cushion. But how could one inflate it in the first few milliseconds of a crash? As he pondered the problem, Hetrick remembered a freak accident that had happened during the war, when he was in the Navy working in a torpedo-maintenance shop. Torpedoes carry a charge of compressed air, and one day a torpedo covered in canvas accidentally released its charge. All at once, Hetrick recalled years later, the canvas "shot up into the air, quicker than you could blink an eye." Thus was the idea for the air bag born.
In its earliest incarnation, the air bag was a crude device; one preliminary test inadvertently killed a baboon, and there were widespread worries about the safety of detonating what was essentially a small bomb inside a car. (Indeed, as a result of numerous injuries to children and small adults, air bags have now been substantially depowered.) But to Haddon the People Saver was the embodiment of everything he believed in--it was the chlorine in the water, and it solved a problem that had been vexing him for years. The Haddonites had always insisted that what was generally called a crash was actually two separate events. The first collision was the initial contact between two automobiles, and in order to prevent the dangerous intrusion of one car into the passenger compartment of another, they argued, cars ought to be built with a protective metal cage around the front and back seats. The second collision, though, was even more important. That was the collision between the occupants of a car and the inside of their own vehicle. If the driver and his passengers were to survive the abrupt impact of a crash, they needed a second safety system, which carefully and gradually decelerated their bodies. The logical choice for that task was seat belts, but Haddon, with his background in public health, didn't trust safety measures that depended on an individual's active cooperation. "The biggest problem we had back then was that only about twelve per cent of the public used seat belts," Claybrook says. "They were terribly designed, and people didn't use them." With the air bag, there was no decision to make. The Haddonites called it a "technological vaccine," and attacked its doubters in Detroit for showing "an absence of moral and ethical leadership." The air bag, they vowed, was going to replace the seat belt. In "Unsafe at Any Speed," Nader wrote:
The seat belt should have been introduced in the twenties and rendered obsolete by the early fifties, for it is only the first step toward a more rational passenger restraint system which modern technology could develop and perfect for mass production. Such a system ideally would not rely on the active participation of the passenger to take effect; it would be the superior passive safety design which would come into use only when needed, and without active participation of the occupant. . . . Protection like this could be achieved by a kind of inflatable air bag restraint which would be actuated to envelop a passenger before a crash.
For the next twenty years, Haddon, Nader, and Claybrook were consumed by the battle to force a reluctant Detroit to make the air bag mandatory equipment. There were lawsuits, and heated debates, and bureaucratic infighting. The automakers, mindful of cost and other concerns, argued that the emphasis ought to be on seat belts. But, to the Haddonites, Detroit was hopelessly in the grip of the old paradigm on auto safety. His opponents, Haddon wrote, with typical hauteur, were like "Malinowski's natives in their approaches to the hazards out the reef which they did not understand." Their attitudes were "redolent of the extranatural, supernatural and the pre-scientific." In 1991, the Haddonites won. That year, a law was passed requiring air bags in every new car by the end of the decade. It sounded like a great victory. But was it?
V. HADDON'S MISTAKE
When Stephen Capoferri's Aerostar hit Robert Day's Jeep Wagoneer, Capoferri's seat belt lay loose across his hips and chest. His shoulder belt probably had about two inches of slack. At impact, his car decelerated, but Capoferri's body kept moving forward, and within thirty milliseconds the slack in his seat belts was gone. In the language of engineers, he "loaded" his restraints. Under the force of Capoferri's onrushing weight, his belts began to stretch--the fabric giving by as much as six inches. As his shoulder belt grew taut, it dug into his chest, compressing it by another two inches, and if you had seen Capoferri at the moment of maximum forward trajectory his shoulder belt around his chest would have looked like a rubber band around a balloon. Simultaneously, within those first few milliseconds, his air bag exploded and rose to meet him at more than a hundred miles per hour. Forty to fifty milliseconds after impact, it had enveloped his face, neck, and upper chest. A fraction of a second later, the bag deflated. Capoferri was thrown back against his seat. Total time elapsed: one hundred milliseconds.
Would Capoferri have lived without an air bag? Probably. He would have stretched his seat belt so far that his head would have hit the steering wheel. But his belts would have slowed him down enough that he might only have broken his nose or cut his forehead or suffered a mild concussion. The other way around, however, with an air bag but not a seat belt, his fate would have been much more uncertain. In the absence of seat belts, air bags work best when one car hits another squarely, so that the driver pitches forward directly into the path of the oncoming bag. But Capoferri hit Day at a slight angle. The front-passenger side of the Aerostar sustained more damage than the driver's side, which means that without his belts holding him in place he would have been thrown away from the air bag off to the side, toward the rearview mirror or perhaps even the front-passenger "A" pillar. Capoferri's air bag protected him only because he was wearing his seat belt. Car-crash statistics show this to be the rule. Wearing a seat belt cuts your chances of dying in an accident by forty-three per cent. If you add the protection of an air bag, your fatality risk is cut by forty-seven per cent. But an air bag by itself reduces the risk of dying in an accident by just thirteen per cent.
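Those statistics reward a second look, because they invert the Haddonites' expectations. A small sketch, using only the three percentages just quoted (a simple reading of the figures, not a formal risk model):

    # Reduction in the risk of dying in a crash, as quoted above
    belt_only = 0.43
    belt_and_bag = 0.47
    bag_only = 0.13

    # What the air bag adds for a driver who is already belted
    print(round(belt_and_bag - belt_only, 2))    # -> 0.04: four extra percentage points
    # What the air bag manages on its own
    print(round(bag_only / belt_only, 2))        # -> 0.3: less than a third as effective as a belt alone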
That the effectiveness of an air bag depended on the use of a seat belt was a concept that the Haddonites, in those early days, never properly understood. They wanted the air bag to replace the seat belt when in fact it was capable only of supplementing it, and they clung to that belief, even in the face of mounting evidence to the contrary. Don Huelke, a longtime safety researcher at the University of Michigan, remembers being on an N.H.T.S.A. advisory committee in the early nineteen-seventies, when people at the agency were trying to come up with statistics for the public on the value of air bags. "Their estimates were that something like twenty-eight thousand people a year could be saved by the air bags," he recalls, "and then someone pointed out to them that there weren't that many driver fatalities in frontal crashes in a year. It was kind of like 'Oops.' So the estimates were reduced." In 1977, Claybrook became the head of N.H.T.S.A. and renewed the push for air bags. The agency's estimate now was that air bags would cut a driver's risk of dying in a crash by forty per cent--a more modest but still implausible figure. "In 1973, there was a study in the open literature, performed at G.M., that estimated that the air bag would reduce the fatality risk to an unbelted driver by eighteen per cent," Leonard Evans says. "N.H.T.S.A. had this information and dismissed it. Why? Because it was from the automobile industry."
The truth is that even today it is seat belts, not air bags, that are providing the most important new safety advances. Had Capoferri been driving a late-model Ford minivan, for example, his seat belt would have had what is called a pretensioner: a tiny explosive device that would have taken the slack out of the belt just after the moment of impact. Without the pretensioner, Stephen Kozak, an engineer at Ford, explains, "you start to accelerate before you hit the belt. You get the clothesline effect." With it, Capoferri's deceleration would have been a bit more gradual. At the same time, belts are now being designed which cut down on chest compression. Capoferri's chest wall was pushed in two inches, and had he been a much older man, with less resilient bones and cartilage, that two-inch compression might have been enough to fracture three or four ribs. So belts now "pay out" extra webbing after a certain point: as Capoferri stretched forward, his belt would have been lengthened by several inches, relieving the pressure on his chest. The next stage in seat-belt design is probably to offer car buyers the option of what is called a four-point belt--two shoulder belts that run down the chest, like suspenders attached to a lap belt. Ford showed a four-point prototype at the auto shows this spring, and early estimates are that it might cut fatality risk by another ten per cent--which would make seat belts roughly five times more effective in saving lives than air bags by themselves. "The best solution is to provide automatic protection, including air bags, as baseline protection for everyone, with seat belts as a supplement for those who will use them," Haddon wrote in 1984. In putting air bags first and seat belts second, he had things backward.
Robert Day suffered a very different kind of accident from Stephen Capoferri's: he was hit from the side, and the physics of a side-impact crash are not nearly so forgiving. Imagine, for instance, that you punched a brick wall as hard as you could. If your fist was bare, you'd break your hand. If you had a glove with two inches of padding, your hand would sting. If you had a glove with six inches of padding, you might not feel much of anything. The more energy-absorbing material--the more space--you can put between your body and the wall, the better off you are. An automobile accident is no different. Capoferri lived, in part, because he had lots of space between himself and Day's Wagoneer. Cars have steel rails connecting the passenger compartment with the bumper, and each of those rails is engineered with what are called convolutions--accordionlike folds designed to absorb, slowly and evenly, the impact of a collision. Capoferri's van was engineered with twenty-seven inches of crumple room, and at the speed he was travelling he probably used about twenty-one inches of that. But Day had four inches, no more, between his body and the door, and perhaps another five to six inches in the door itself. Capoferri hit the wall with a boxing glove. Day punched it with his bare hand.
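The boxing-glove analogy can be given rough numbers. Under a crude constant-deceleration approximation, the average deceleration in a crash is the square of the impact speed divided by twice the stopping distance; the sketch below applies that formula to the crush space each man had. It ignores nearly everything about the real collision (the speeds, in particular, are only rough figures drawn from the paragraph above), and is meant only to show why a few extra inches matter so much.

    G = 32.2    # acceleration of gravity, in feet per second squared

    def average_deceleration_in_g(speed_mph, crush_inches):
        # v^2 / (2d), converted to multiples of g; a rough approximation only
        speed_fps = speed_mph * 5280 / 3600
        crush_feet = crush_inches / 12
        return speed_fps ** 2 / (2 * crush_feet) / G

    # Capoferri: roughly 45 mph into about 21 inches of used crumple room
    print(round(average_deceleration_in_g(45, 21)))    # -> about 39 g
    # Day: a comparable closing speed into perhaps 10 inches of space and door
    print(round(average_deceleration_in_g(45, 10)))    # -> about 81 g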
Day's problems were compounded by the fact that he was not wearing his seat belt. The right-front fender of Capoferri's Aerostar struck his Wagoneer squarely on the driver's door, pushing the Jeep sidewise, and if Day had been belted he would have moved with his vehicle, away from the onrushing Aerostar. But he wasn't, and so the Jeep moved out from under him: within fifteen milliseconds, the four inches of space between his body and the side of the Jeep was gone. The impact of the Aerostar slammed the driver's door against his ribs and spleen.
Day could easily have been ejected from his vehicle at that point. The impact of Capoferri's van shattered the glass in Day's door, and a Wagoneer, like most sports-utility vehicles, has a low belt line--meaning that the side windows are so large that with the glass gone there's a hole big enough for an unrestrained body to fly through. This is what it means to be "thrown clear" of a crash, although when that phrase is used in the popular literature it is sometimes said as if it were a good thing, when of course to be "thrown clear" of a crash is merely to be thrown into some other hard and even more lethal object, like the pavement or a tree or another car. Day, for whatever reason, was not thrown clear, and in that narrow sense he was lucky. This advantage, however, amounted to little. Day's door was driven into him like a sledgehammer.
Would a front air bag have saved Robert Day? Not at all. He wasn't moving forward into the steering wheel. He was moving sidewise into the door. Some cars now have additional air bags that are intended to protect the head as it hits the top of the door frame in a side-impact crash. But Day didn't die of head injuries. He died of abdominal injuries. Conceivably, a side-impact bag might have offered his abdomen some slight protection. But Day's best chance of surviving the accident would have been to wear his seat belt. It would have held him in place in those first few milliseconds of impact. It would have preserved some part of the space separating him from the door, diminishing the impact of the Aerostar. Day made two mistakes that morning, then, the second of which was not buckling up. But this is a point on which the Haddonites were in error as well, because the companion to their obsession with air bags was the equally false belief that encouraging drivers to wear their seat belts was a largely futile endeavor.
In the early nineteen-seventies, just at the moment when Haddon and Claybrook were pushing hardest for air bags, the Australian state of Victoria passed the world's first mandatory seat-belt legislation, and the law was an immediate success. With an aggressive public-education campaign, rates of seat-belt use jumped from twenty to eighty per cent. During the next several years, Canada, New Zealand, Germany, France, and others followed suit. But a similar movement in the United States in the early seventies stalled. James Gregory, who headed the N.H.T.S.A. during the Ford years, says that if Nader had advocated mandatory belt laws they might have carried the day. But Nader, then at the height of his fame and influence, didn't think that belt laws would work in this country. "You push mandatory belts, you might get a very adverse reaction," Nader says today of his thinking back then. "Mindless reaction. And how many tickets do you give out a day? What about back seats? At what point do you require a seat belt for small kids? And it's administratively difficult when people cross state lines. That's why I always focussed on the passive. We have a libertarian streak that Europe doesn't have." Richard Peet, a congressional staffer who helped draft legislation in Congress giving states financial incentives to pass belt laws, founded an organization in the early seventies to promote belt-wearing. "After I did that, some of the people who worked for Nader's organization went after me, saying that I was selling out the air-bag movement," Peet recalls. "That pissed me off. I thought the safety movement was the safety movement and we were all working together for common aims." In "Auto Safety," a history of auto-safety regulation, John Graham, of the Harvard School of Public Health, writes of Claybrook's time at the N.H.T.S.A.:
Her lack of aggressive leadership on safety belt use was a major source of irritation among belt use advocates, auto industry officials, and officials from state safety programs. They saw her pessimistic attitudes as a self-fulfilling prophecy. One of Claybrook's aides at N.H.T.S.A. who worked with state agencies acknowledged: "It is fair to say that Claybrook never made a dedicated effort to get mandatory belt-use laws." Another aide offered the following explanation of her philosophy: "Joan didn't do much on mandatory belt use because her primary interests were in vehicle regulation. She was fond of saying 'it is easier to get twenty auto companies to do something than to get 200 million Americans to do something.' "
Claybrook says that while at the N.H.T.S.A. she mailed a letter to all the state governors encouraging them to pass mandatory seat-belt legislation, and "not one governor would help us." It is clear that she had low expectations for her efforts. Even as late as 1984, Claybrook was still insisting that trying to encourage seat-belt use was a fool's errand. "It is not likely that mandatory seat belt usage laws will be either enacted or found acceptable to the public in large numbers," Claybrook wrote. "There is massive public resistance to adult safety belt usage." In the very year her words were published, however, a coalition of medical groups finally managed to pass the country's first mandatory seat-belt law, in New York, and the results were dramatic. One state after another soon did likewise, and public opinion about belts underwent what the pollster Gary Lawrence has called "one of the most phenomenal shifts in attitudes ever measured." Americans, it turned out, did not have a cultural aversion to seat belts. They just needed some encouragement. "It's not a big Freudian thing whether you buckle up or not," says B. J. Campbell, a former safety researcher at the University of North Carolina, who was one of the veterans of the seat-belt movement. "It's just a habit, and either you're in the habit of doing it or you're not."
Today, belt-wearing rates in the United States are just over seventy per cent, and every year they inch up a little more. But if the seat-belt campaign had begun in the nineteen-seventies, instead of the nineteen-eighties, the use rate in this country would be higher right now, and in the intervening years an awful lot of car accidents might have turned out differently, including one at the intersection of Egg Harbor Road and Fleming Pike.
VI. CRASH TEST
William Haddon died in 1985, of kidney disease, at the age of fifty-eight. From the time he left government until his death, he headed an influential research group called the Insurance Institute for Highway Safety.
Joan Claybrook left the N.H.T.S.A. in 1980 and went on to run Ralph Nader's advocacy group Public Citizen, where she has been a powerful voice on auto safety ever since. In an interview this spring, Claybrook listed the things that she would do if she were back as the country's traffic-safety czar. "I'd issue a rollover standard, and have a thirty-miles-per-hour test for air bags," she said. "Upgrade the seating structure. Integrate the head restraint better. Upgrade the tire-safety standard. Provide much more consumer information. And also do more crash testing, whether it's rollover or offset crash testing and rear-crash testing." The most effective way to reduce automobile fatalities, she went on, would be to focus on rollovers--lowering the center of gravity in S.U.V.s, strengthening doors and roofs. In the course of outlining her agenda, Claybrook did not once mention the words "seat belt."
Ralph Nader, for his part, spends a great deal of time speaking at college campuses about political activism. He remains a distinctive figure, tall and slightly stooped, with a bundle of papers under his arm. His interests have widened in recent years, but he is still passionate about his first crusade. "Haddon was all business--never made a joke, didn't tolerate fools easily," Nader said not long ago, when he was asked about the early days. He has a deep, rumbling press-conference voice, and speaks in sentence fragments, punctuated with long pauses. "Very dedicated. He influenced us all." The auto-safety campaign, he went on, "was a spectacular success of the federal-government mission. When the regulations were allowed, they worked. And it worked because it deals with technology rather than human behavior." Nader had just been speaking in Detroit, at Wayne State University, and was on the plane back to Washington, D.C. He was folded into his seat, his knees butting up against the tray table in front of him, and from time to time he looked enviously over at the people stretching their legs in the exit row. Did he have any regrets? Yes, he said. He wished that back in 1966 he had succeeded in keeping the criminal-penalties provision in the auto-safety bill that Congress passed that summer. "That would have gone right to the executive suite," he said.
There were things, he admitted, that had puzzled him over the years. He couldn't believe the strides that had been made against drunk driving. "You've got to hand it to MADD. It took me by surprise. The drunk-driving culture is deeply embedded. I thought it was too ingrained." And then there was what had happened with seat belts. "Use rates are up sharply," he said. "They're a lot higher than I thought they would be. I thought it would be very hard to hit fifty per cent. The most unlikely people now buckle up." He shook his head, marvelling. He had always been a belt user, and recommends belts to others, but who knew they would catch on?
Other safety activists, who had seen what had happened to driver behavior in Europe and Australia in the seventies, weren't so surprised, of course. But Nader was never the kind of activist who had great faith in the people whose lives he was trying to protect. He and the other Haddonites were sworn to a theory that said that the way to prevent typhoid is to chlorinate the water, even though there are clearly instances where chlorine will not do the trick. This is the blindness of ideology. It is what happens when public policy is conducted by those who cannot conceive that human beings will do willingly what is in their own interest. What was the truly poignant thing about Robert Day, after all? Not just that he was a click away from saving his own life but that his son, sitting right next to him, was wearing his seat belt. In the Days' Jeep Wagoneer, a fight that experts assumed was futile was already half won.
One day this spring, a team of engineers at Ford conducted a crash test on a 2003 Mercury. This was at Ford's test facility in Dearborn, a long, rectangular white steel structure, bisected by a five-hundred-and-fifty-foot runway. Ford crashes as many as two cars a day there, ramming them with specially designed sleds or dragging them down the runway with a cable into a twenty-foot cube of concrete. Along the side of the track were the twisted hulks of previous experiments: a Ford Focus wagon up on blocks; a mangled BMW S.U.V. that had been crashed, out of competitive curiosity, the previous week; a Ford Explorer that looked as though it had been thrown into a blender. In a room at the back, there were fifty or sixty crash-test dummies, propped up on tables and chairs, in a dozen or more configurations--some in Converse sneakers, some in patent-leather shoes, some without feet and legs at all, each one covered with multiple electronic sensors, all designed to measure the kinds of injuries possible in a crash.
The severity of any accident is measured not by the speed of the car at the moment of impact but by what is known as the delta V--the difference between how fast a car is going at the moment of impact and how fast it is moving after the accident. Capoferri's delta V was about twenty-five miles per hour, seven miles per hour higher than the accident average. The delta V of the Mercury test, though, was to be thirty-five miles per hour, which is the equivalent of hitting an identical parked car at seventy miles per hour. The occupants were two adult-size dummies in orange shorts. Their faces were covered in wet paint, red above the upper jaw and blue below it, to mark where their faces hit on the air bag. The back seat carried a full cargo of computers and video cameras. A series of yellow lights began flashing. An engineer stood to the side, holding an abort button. Then a bank of stage lights came on, directly above the point of impact. Sixteen video cameras began rolling. A voice came over a loudspeaker, counting down: five, four, three... There was a blur as the Mercury swept by--then bang, as the car hit the barrier and the dual front air bags exploded. A plastic light bracket skittered across the floor, and the long warehouse was suddenly still.
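To make the delta-V arithmetic concrete, here is a minimal sketch of my own, not anything from Ford's test protocol, assuming a perfectly inelastic head-on collision between two identical cars: conservation of momentum means the pair ends up moving at half the closing speed, so the striking car's delta V is half that speed, which is why a thirty-five-mile-per-hour barrier test stands in for hitting an identical parked car at seventy.

    # Illustrative sketch only (assumed scenario, not Ford's procedure).
    def delta_v_identical_cars(closing_speed_mph):
        # Momentum conservation for equal masses locking together:
        # m*v + m*0 = 2m*v_final, so v_final = v/2.
        # The striking car slows from v to v/2, a change of v/2.
        return closing_speed_mph / 2.0

    print(delta_v_identical_cars(70.0))  # 35.0 m.p.h., the Mercury test's delta V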
It was a moment of extraordinary violence, yet it was also strangely compelling. This was performance art, an abstract and ritualized rendering of reality, given in a concrete-and-steel gallery. The front end of the Mercury was perfectly compressed; the car was thirty inches shorter than it had been a moment before. The windshield was untouched. The "A" pillars and roofline were intact. The passenger cabin was whole. In the dead center of the deflated air bags, right where they were supposed to be, were perfect blue-and-red paint imprints of the dummies' faces.
But it was only a performance, and that was the hard thing to remember. In the real world, people rarely have perfectly square frontal collisions, sitting ramrod straight and ideally positioned; people rarely have accidents that so perfectly showcase the minor talents of the air bag. A crash test is beautiful. In the sequence we have all seen over and over in automobile commercials, the dummy rises magically to meet the swelling cushion, always in slow motion, the bang replaced by Mozart, and on those theatrical terms the dowdy fabric strips of the seat belt cannot compete with the billowing folds of the air bag. This is the image that seduced William Haddon when the men from Eaton, Yale showed him the People Saver so many years ago, and the image that warped auto safety for twenty long years. But real accidents are seldom like this. They are ugly and complicated, shaped by the messy geometries of the everyday world and by the infinite variety of human frailty. A man looks away from the road at the wrong time. He does not see what he ought to see. Another man does not have time to react. The two cars collide, but at a slight angle. There is a two-hundred-and-seventy-degree spin. There is skidding and banging. A belt presses deep into one man's chest--and that saves his life. The other man's unrestrained body smashes against the car door--and that kills him.
"They left pretty early, about eight, nine in the morning," Susan Day, Robert Day's widow, recalls. "I was at home when the hospital called. I went to see my son first. He was pretty much O.K., had a lot of bruising. Then they came in and said, 'Your husband didn't make it.'"
The Mosquito Killer
July 2, 2001
ANNALS OF PUBLIC HEALTH
Millions of people owe their lives to Fred Soper.
Why isn't he a hero?
1.
In the late nineteen-thirties, a chemist who worked for the J.R. Geigy company, in Switzerland, began experimenting with an odorless white crystalline powder called dichloro-diphenyl-trichloroethane. The chemist, Paul Müller, wanted to find a way to protect woollens against moths, and his research technique was to coat the inside of a glass box with whatever chemical he was testing, and then fill it with houseflies. To his dismay, the flies seemed unaffected by the new powder. But, in one of those chance decisions on which scientific discovery so often turns, he continued his experiment overnight--and in the morning all the flies were dead. He emptied the box, and put in a fresh batch of flies. By the next morning, they, too, were dead. He added more flies, and then a handful of other insects. They all died. He scrubbed the box with an acetone solvent, and repeated the experiment with a number of closely related compounds that he had been working with. The flies kept dying. Now he was excited: had he come up with a whole line of potent new insecticides? As it turned out, he hadn't. The new candidate chemicals were actually useless. To his amazement, what was killing the flies in the box were scant traces of the first compound, dichloro-diphenyl-trichloroethane--or, as it would come to be known, DDT.
In 1942, Geigy sent a hundred kilograms of the miracle powder to its New York office. The package lay around, undisturbed, until another chemist, Victor Froelicher, happened to translate the extraordinary claims for DDT into English, and then passed on a sample to the Department of Agriculture, which in turn passed it on to its entomology research station, in Orlando, Florida. The Orlando laboratory had been charged by the Army to develop new pesticides, because the military, by this point in the war, was desperate for a better way to protect its troops against insect-borne disease. Typhus--the lethal fever spread by lice--had killed millions of people during and after the First World War and was lurking throughout the war zones. Worse, in almost every theatre of operations, malaria-carrying mosquitoes were causing havoc. As Robert Rice recounted in this magazine almost fifty years ago, the First Marine Division had to be pulled from combat in 1942 and sent to Melbourne to recuperate because, out of seventeen thousand men, ten thousand were incapacitated with malarial headaches, fevers, and chills. Malaria hit eighty-five per cent of the men holding onto Bataan. In fact, at any one time in the early stages of the war, according to General Douglas MacArthur, two-thirds of his troops in the South Pacific were sick with malaria. Unless something was done, MacArthur complained to the malariologist Paul Russell, it was going to be "a long war." Thousands of candidate insecticides were tested at Orlando, and DDT was by far the best.
To gauge a chemical's potential against insects, the Orlando researchers filled a sleeve with lice and a candidate insecticide, slipped the sleeve over a subject's arm, and taped it down at both ends. After twenty-four hours, the dead lice were removed and fresh lice were added. A single application of DDT turned out to kill lice for a month, almost four times longer than the next-best insecticide. As Rice described it, researchers filled twelve beakers with mosquito larvae, and placed descending amounts of DDT in each receptacle--with the last beaker DDT free. The idea was to see how much chemical was needed to kill the mosquitoes. The mosquito larvae in every beaker died. Why? Because just the few specks of chemical that floated through the air and happened to land in the last beaker while the experiment was being set up were enough to kill the mosquitoes. Quickly, a field test was scheduled. Two duck ponds were chosen, several miles apart. One was treated with DDT. One was not. Spraying was done on a day when the wind could not carry the DDT from the treated to the untreated pond. The mosquito larvae in the first pond soon died. But a week later mosquito larvae in the untreated pond also died: when ducks from the first pond visited the second pond, there was enough DDT residue on their feathers to kill mosquitoes there as well.
The new compound was administered to rabbits and cats. Rice tells how human volunteers slathered themselves with it, and sat in vaults for hours, inhaling the fumes. Tests were done to see how best to apply it. "It was put in solution or suspension, depending on what we were trying to do," Geoffrey Jeffery, who worked on DDT at the Tennessee Valley Authority, recalls. "Sometimes we'd use some sort of petroleum-based carrier, even diesel oil, or add water to a paste or concentration and apply it on the wall with a Hudson sprayer." Under conditions of great secrecy, factories were set up, to manufacture the new chemical by the ton. It was rushed to every Allied theatre. In Naples, in 1944, the Army averted a catastrophic typhus epidemic by "dusting" more than a million people with DDT powder. The Army Air Force built DDT "bombs," attaching six-hundred-and-twenty-five-gallon tanks to the underside of the wings of B-25s and C-47s, and began spraying Pacific beachheads in advance of troop arrivals. In Saipan, invading marines were overtaken by dengue, a debilitating fever borne by the Aedes variety of mosquito. Five hundred men were falling sick every day, each incapacitated for four to five weeks. The medical officer called in a DDT air strike that saturated the surrounding twenty-five square miles with nearly nine thousand gallons of five-per-cent DDT solution. The dengue passed. The marines took Saipan.
It is hard to overestimate the impact that DDT's early success had on the world of public health. In the nineteen-forties, there was still malaria in the American South. There was malaria throughout Europe, Asia, and the Caribbean. In India alone, malaria killed eight hundred thousand people a year. When, in 1920, William Gorgas, the man who cleansed the Panama Canal Zone of malaria, fell mortally ill during a trip through England, he was knighted on his deathbed by King George V and given an official state funeral at St. Paul's Cathedral--and this for an American who just happened to be in town when he died. That is what it meant to be a malaria fighter in the first half of the last century. And now there was a chemical--the first successful synthetic pesticide--that seemed to have an almost magical ability to kill mosquitoes. In 1948, Müller won the Nobel Prize for his work with DDT, and over the next twenty years his discovery became the centerpiece of the most ambitious public-health campaign in history.
Today, of course, DDT is a symbol of all that is dangerous about man's attempts to interfere with nature. Rachel Carson, in her landmark 1962 book, "Silent Spring," wrote memorably of the chemical's environmental consequences, how its unusual persistence and toxicity had laid waste to wildlife and aquatic ecosystems. Only two countries--India and China--continue to manufacture the substance, and only a few dozen more still use it. In May, at the Stockholm Convention on Persistent Organic Pollutants, more than ninety countries signed a treaty, placing DDT on a restricted-use list, and asking all those still using the chemical to develop plans for phasing it out entirely. On the eve of its burial, however--and at a time when the threat of insect-borne disease around the world seems to be resurgent--it is worth remembering that people once felt very differently about DDT, and that between the end of the Second World War and the beginning of the nineteen-sixties it was considered not a dangerous pollutant but a lifesaver. The chief proponent of that view was a largely forgotten man named Fred Soper, who ranks as one of the unsung heroes of the twentieth century. With DDT as his weapon, Soper almost saved the world from one of its most lethal afflictions. Had he succeeded, we would not today be writing DDT's obituary. We would view it in the same heroic light as penicillin and the polio vaccine.
2.
Fred Soper was a physically imposing man. He wore a suit, it was said, like a uniform. His hair was swept straight back from his forehead. His eyes were narrow. He had large wire-rimmed glasses, and a fastidiously maintained David Niven mustache. Soper was born in Kansas in 1893, received a doctorate from the Johns Hopkins School of Public Health, and spent the better part of his career working for the Rockefeller Foundation, which in the years before the Second World War--before the establishment of the United Nations and the World Health Organization--functioned as the world's unofficial public-health directorate, using its enormous resources to fight everything from yellow fever in Colombia to hookworm in Thailand.
In those years, malaria warriors fell into one of two camps. The first held that the real enemy was the malaria parasite--the protozoan that mosquitoes pick up from the blood of an infected person and transmit to others. The best way to break the chain of infection, this group argued, was to treat the sick with antimalarial drugs, to kill the protozoan so there was nothing for mosquitoes to transmit. The second camp held, to the contrary, that the mosquito was the real enemy, since people would not get malaria in the first place if there were no mosquitoes around to bite them. Soper belonged to the latter group, and his special contribution was to raise the killing of mosquitoes to an art. Gorgas, Soper's legendary predecessor, said that in order to fight malaria you had to learn to think like a mosquito. Soper disagreed. Fighting malaria, he said, had very little to do with the intricacies of science and biology. The key was learning to think like the men he hired to go door-to-door and stream-to-stream, killing mosquitoes. His method was a matter of motivation, discipline, organization, and zeal, of understanding human nature. Fred Soper was the General Patton of entomology.
While working in South America in 1930, Soper had enforced a rigorous protocol for inspecting houses for mosquito infestation, which involved checking cisterns and climbing along roof gutters. (He pushed himself so hard perfecting the system in the field that he lost twenty-seven pounds in three months.) He would map an area to be cleansed of mosquitoes, give each house a number, and then assign each number to a sector. A sector, in turn, would be assigned to an inspector, armed with the crude pesticides then available; the inspector's schedule for each day was planned to the minute, in advance, and his work double-checked by a supervisor. If a supervisor found a mosquito that the inspector had missed, he received a bonus. And if the supervisor found that the inspector had deviated by more than ten minutes from his preassigned schedule the inspector was docked a day's pay. Once, in the state of Rio de Janeiro, a large ammunition dump--the Niterói Arsenal--blew up. Soper, it was said, heard the explosion in his office, checked the location of the arsenal on one of his maps, verified by the master schedule that an inspector was at the dump at the time of the accident, and immediately sent condolences and a check to the widow. The next day, the inspector showed up for work, and Soper fired him on the spot--for being alive. Soper, in one memorable description, "seemed equally capable of browbeating man or mosquito." He did not engage in small talk. In 1973, at Soper's eightieth-birthday party, a former colleague recounted how much weight he had lost working for Soper; another told a story of how Soper looked at him uncomprehendingly when he asked to go home to visit his ailing wife; a third spoke of Soper's betting prowess. "He was very cold and very formal," remembers Andrew Spielman, a senior investigator in tropical disease at the Harvard School of Public Health and the author, with Michael D'Antonio, of the marvellous new book "Mosquito: A Natural History of Our Most Persistent and Deadly Foe." "He always wore a suit and tie. With that thin little mustache and big long upper lip, he scared the hell out of me."
One of Soper's greatest early victories came in Brazil, in the late nineteen-thirties, when he took on a particularly vicious strain of mosquito known as Anopheles gambiae. There are about twenty-five hundred species of mosquito in the world, each with its own habits and idiosyncrasies--some like running water, some like standing water, some bite around the ankles, some bite on the arms, some bite indoors, some bite outdoors--but only mosquitoes of the genus Anopheles are capable of carrying the human malaria parasite. And, of the sixty species of Anopheles that can transmit malaria, gambiae is the variety best adapted to spreading the disease. In California, there is a strain of Anopheles known as freeborni, which is capable of delivering a larger dose of malaria parasite than gambiae ever could. But freeborni is not a good malaria vector, because it prefers animals to people. Gambiae, by contrast, bites humans ninety-five per cent of the time. It has long legs and yellow-and-black spotted wings. It likes to breed in muddy pools of water, even in a water-filled footprint. And, unlike many mosquitoes, it is long-lived, meaning that once it has picked up the malaria parasite it can spread the protozoan to many others. Gambiae gathers in neighborhoods in the evenings, slips into houses at dusk, bites quietly and efficiently during the night, digests its "blood meal" while resting on the walls of the house, and then slips away in the morning. In epidemiology, there is a concept known as the "basic reproduction number," or BRN, which refers to the number of people one person can infect with a contagious disease. The number for H.I.V., which is relatively difficult to transmit, is just above one. For measles, the BRN is between twelve and fourteen. But with a vector like gambiae in the picture the BRN for malaria can be more than a hundred, meaning that just one malarious person can be solely responsible for making a hundred additional people sick. The short answer to the question of why malaria is such an overwhelming problem in Africa is that gambiae is an African mosquito.
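A rough back-of-the-envelope sketch, my own illustration rather than anything in the article, shows why the basic reproduction number matters so much: in a fully susceptible population, each generation of infection multiplies roughly by the BRN, so within a few generations a disease with a BRN near one smolders while a disease with a BRN of a hundred explodes.

    # Illustrative only; the BRN figures are the approximate values cited above.
    def expected_cases(brn, generations, initial_cases=1):
        # Geometric growth in a fully susceptible population.
        return initial_cases * brn ** generations

    for disease, brn in [("H.I.V.", 1.1), ("measles", 13.0), ("malaria via gambiae", 100.0)]:
        print(disease, [round(expected_cases(brn, g)) for g in (1, 2, 3)])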
In March, 1930, a Rockefeller Foundation entomologist named Raymond Shannon was walking across tidal flats to the Potengi River, in Natal, Brazil, when he noticed, to his astonishment, two thousand gambiae larvae in a pool of water, thousands of miles from their homeland. Less than a kilometre away was a port where French destroyers brought mail across the Atlantic from Africa, and Shannon guessed that the mosquito larvae had come over, fairly recently, aboard one of the mail ships. He notified Soper, who was his boss, and Soper told Brazilian officials to open the dykes damming the tidal flats, because salt water from the ocean would destroy the gambiae breeding spots. The government refused. Over the next few years, there were a number of small yet worrisome outbreaks of malaria, followed by a few years of drought, which kept the problem in check. Then, in 1938, the worst malaria epidemic in the history of the Americas broke out. Gambiae had spread a hundred and fifty miles along the coast and inland, infecting a hundred thousand people and killing as many as twenty thousand. Soper was called in. This was several years before the arrival of DDT, so he brought with him the only tools malariologists had in those years: diesel oil and an arsenic-based mixture called Paris green, both of which were spread on the pools of water where gambiae larvae bred; and pyrethrum, a natural pesticide made from a variety of chrysanthemum, which was used to fumigate buildings. Four thousand men were put at his disposal. He drew maps and divided up his troops. The men wore uniforms, and carried flags to mark where they were working, and they left detailed written records of their actions, to be reviewed later by supervisors. When Soper discovered twelve gambiae in a car leaving an infected area, he set up thirty de-insectization posts along the roads, spraying the interiors of cars and trucks; seven more posts on the rail lines; and defumigation posts at the ports and airports. In Soper's personal notes, now housed at the National Library of Medicine, in Bethesda, there is a cue card, on which is typed a quotation from a veteran of the Rockefeller Foundation's efforts, in the early twentieth century, to eradicate hookworm. "Experience proved that the best way to popularize a movement so foreign to the customs of the people . . . was to prosecute it as though it were the only thing in the universe left undone." It is not hard to imagine the card tacked above Soper's desk in Rio for inspiration: his goal was not merely to cripple the population of gambiae, since that would simply mean that they would return, to kill again. His goal was to eliminate gambiae from every inch of the region of Brazil that they had colonized--an area covering some eighteen thousand square miles. It was an impossible task. Soper did it in twenty-two months.
3.
While DDT was being tested in Orlando, Soper was in North Africa with the United States Typhus Commission, charged with preventing the kind of louse-spread typhus epidemics that were so devastating during the First World War. His tool of choice was a delousing powder called MYL. Lice live in the folds of clothing, and a previous technique had been to treat the clothing after people had disrobed. But that was clearly not feasible in Muslim cities like Cairo and Algiers, nor was it practical for large-scale use. So Soper devised a new technique. He had people tie their garments at the ankles and wrists, and then he put the powder inside a dust gun, of the sort used in gardening, and blew it down the collar, creating a balloon effect. "We were in Algiers, waiting for Patton to get through Sicily," Thomas Aitken, an entomologist who worked with Soper in those years, remembers. "We were dusting people out in the countryside. This particular day, a little old Arab man, only about so high, came along with his donkey and stopped to talk to us. We told him what we were doing, and we dusted him. The next day, he comes by again and says that that had been the first time in his life that he had ever been able to sleep through the night."
In December of 1943, the typhus team was dispatched to Naples, where in the wake of the departing German Army the beginnings of a typhus epidemic had been detected. The rituals of Cairo were repeated, only this time the typhus fighters, instead of relying on MYL (which easily lost its potency), were using DDT. Men with dusters careered through the narrow cobblestoned streets of the town, amid the wreckage of the war, delousing the apartment buildings of typhus victims. Neapolitans were dusted as they came out of the railway stations in the morning, and dusted in the streets, and dusted in the crowded grottoes that served as bomb shelters beneath the city streets. In the first month, more than 1.3 million people were dusted, saving countless lives.
Soper's diary records a growing fascination with this new weapon. July 25, 1943: "Lunch with L.L. Williams and Justin Andrews. L.L. reports that he has ordered 10,000 lbs of Neocid [DDT] and that Barber reports it to be far superior to [Paris Green] for mosquitoes." February 25, 1944: "Knipling visits laboratory. Malaria results [for DDT] ARE FANTASTIC." When Rome fell, in mid-1944, Soper declared that he wanted to test DDT in Sardinia, the most malarious part of Italy. In 1947, he got his wish. He pulled out his old organization charts from Brazil. The island--a rocky, mountainous region the size of New Hampshire, with few roads--was mapped and divided up hierarchically, the smallest unit being the area that could be covered by a sprayer in a week. Thirty-three thousand people were hired. More than two hundred and eighty-six tons of DDT were acquired. Three hundred and thirty-seven thousand buildings were sprayed. The target Anopheles was labranchiae, which flourishes not just in open water but also in the thick weeds that surround the streams and ponds and marshes of Sardinia. Vegetation was cut back, and a hundred thousand acres of swampland were drained. Labranchiae larvae were painstakingly collected and counted and shipped to a central laboratory, where precise records were kept of the status of the target vector. In 1946, before the campaign started, there were seventy-five thousand malaria cases on the island. In 1951, after the campaign finished, there were nine.
"The locals regarded this as the best thing that had ever happened to them," Thomas Aitken says. He had signed on with the Rockefeller Foundation after the war, and was one of the leaders of the Sardinian effort. "The fact that malaria was gone was welcome," he went on. "But also the DDT got rid of the houseflies. Sardinian houses were made of stone. The wires for the lights ran along the walls near the ceiling. And if you looked up at the wires they were black with housefly droppings from over the years. And suddenly the flies disappeared." Five years ago, Aitken says, he was invited back to Sardinia for a celebration to mark the forty-fifth anniversary of malaria's eradication from the island. "There was a big meeting at our hotel. The public was invited, as well as a whole bunch of island and city officials, the mayor of Cagliari, and representatives of the Italian government. We all sat on a dais, at the side of the room, and I gave a speech there, in Italian, and when I finished everybody got up and clapped their hands and was shouting. It was very embarrassing. I started crying. I couldn't help it. Just reminiscing now . . ."
Aitken is a handsome, courtly man of eighty-eight, lean and patrician in appearance. He lives outside New Haven, in an apartment filled with art and furniture from his time in Sardinia. As he thought back to those years, there were tears in his eyes, and at that moment it was possible to appreciate the excitement that gripped malariologists in the wake of the Second World War. The old-school mosquito men called themselves mud-hen malariologists, because they did their job in swamps and ditches and stagnant pools of water. Paris green and pyrethrum were crude insecticides that had to be applied repeatedly; pyrethrum killed only those mosquitoes that happened to be in the room when you were spraying. But here, seemingly, was a clean, pure, perfectly modern weapon. You could spray a tiny amount on a wall, and that single application would kill virtually every mosquito landing on that surface for the next six months. Who needed a standing army of inspectors anymore? Who needed to slog through swamps? This was an age of heroics in medicine. Sabin and Salk were working on polio vaccines with an eye to driving that disease to extinction. Penicillin was brand new, and so effective that epidemiologists were dreaming of an America without venereal disease. The extinction of smallpox, that oldest of scourges, seemed possible. All the things that we find sinister about DDT today--the fact that it killed everything it touched, and kept on killing everything it touched--were precisely what made it so inspiring at the time. "The public-health service didn't pay us a lot," says McWilson Warren, who spent the early part of his career fighting malaria in the Malaysian jungle. "So why were we there? Because there was something so wonderful about being involved with people who thought they were doing something more important than themselves." In the middle of the war, Soper had gone to Egypt, and warned the government that it had an incipient invasion of gambiae. The government ignored him, and the next year the country was hit with an epidemic that left more than a hundred thousand dead. In his diary, Soper wrote of his subsequent trip to Egypt, "In the afternoon to the Palace where Mr. Jacobs presents me to His Majesty King Faruk. The King says that he is sorry to know that measures I suggested last year were not taken at that time." Soper had triumphed over gambiae in Brazil, driven lice from Cairo and Naples, and had a weapon, DDT, that seemed like a gift from God--and now kings were apologizing to him. Soper started to dream big: Why not try to drive malaria from the entire world?
4.
Fred Soper's big idea came to be known as the Global Malaria Eradication Programme. In the early nineteen-fifties, Soper had been instrumental in getting the Brazilian malariologist Marcolino Candau--whom he had hired during the anti-gambiae campaign of the nineteen-thirties--elected as director-general of the World Health Organization, and, in 1955, with Candau's help, Soper pushed through a program calling on all member nations to begin a rigorous assault on any malaria within their borders. Congress was lobbied, and John Kennedy, then a senator, became an enthusiastic backer. Beginning in 1958, the United States government pledged the equivalent of billions in today's dollars for malaria eradication--one of the biggest commitments that a single country has ever made to international health. The appeal of the eradication strategy was its precision. The idea was not to kill every Anopheles mosquito in a given area, as Soper had done with gambiae in Brazil. That was unnecessary. The idea was to use DDT to kill only those mosquitoes which were directly connected to the spread of malaria--only those which had just picked up the malaria parasite from an infected person and were about to fly off and infect someone else. When DDT is used for this purpose, Spielman writes in "Mosquito," "it is applied close to where people sleep, on the inside walls of houses. After biting, the mosquitoes generally fly to the nearest vertical surface and remain standing there for about an hour, anus down, while they drain the water from their gut contents and excrete it in a copious, pink-tinged stream. If the surfaces the mosquitoes repair to are coated by a poison that is soluble in the wax that covers all insects' bodies, the mosquitoes will acquire a lethal dose." Soper pointed out that people who get malaria, and survive, generally clear their bodies of the parasite after three years. If you could use spraying to create a hiatus during which minimal transmission occurred--and during which anyone carrying the parasite had a chance to defeat it--you could potentially eradicate malaria. You could stop spraying and welcome the mosquitoes back, because there would be no more malaria around for them to transmit. Soper was under no illusions about how difficult this task would be. But, according to his calculations, it was technically possible, if he and his team achieved eighty-per-cent coverage--if they sprayed eight out of every ten houses in infected areas.
Beginning in the late fifties, DDT was shipped out by the ton. Training institutes were opened. In India alone, a hundred and fifty thousand people were hired. By 1960, sixty-six nations had signed up. "What we all had was a handheld pressure sprayer of three-gallon capacity," Jesse Hobbs, who helped run the eradication effort in Jamaica in the early sixties, recalls. "Generally, we used a formulation that was water wettable, meaning you had powder you mixed with water. Then you pressurized the tank. The squad chief would usually have notified the household some days before. The instructions were to take the pictures off the wall, pull everything away from the wall. Take the food and eating utensils out of the house. The spray man would spray with an up-and-down movement--at a certain speed, according to a pattern. You started at a certain point and sprayed the walls and ceiling, then went outside to spray the eaves of the roof. A spray man could cover ten to twelve houses a day. You were using about two hundred milligrams per square foot of DDT, which isn't very much, and it was formulated in a way that you could see where you sprayed. When it dried, it left a deposit, like chalk. It had a bit of a chlorine smell. It's not perfume. It's kind of like swimming-pool water. People were told to wait half an hour for the spray to dry, then they could go back." The results were dramatic. In Taiwan, much of the Caribbean, the Balkans, parts of northern Africa, the northern region of Australia, and a large swath of the South Pacific, malaria was eliminated. Sri Lanka saw its cases drop to about a dozen every year. In India, where malaria infected an estimated seventy-five million and killed eight hundred thousand every year, fatalities had dropped to zero by the early sixties. Between 1945 and 1965, DDT saved millions--even tens of millions--of lives around the world, perhaps more than any other man-made drug or chemical before or since.
What DDT could not do, however, was eradicate malaria entirely. How could you effectively spray eighty per cent of homes in the Amazonian jungle, where communities are spread over hundreds of thousands of highly treacherous acres? Sub-Saharan Africa, the most malarious place on earth, presented such a daunting logistical challenge that the eradication campaign never really got under way there. And, even in countries that seemed highly amenable to spraying, problems arose. "The rich had houses that they didn't want to be sprayed, and they were giving bribes," says Socrates Litsios, who was a scientist with the W.H.O. for many years and is now a historian of the period. "The inspectors would try to double their spraying in the morning so they wouldn't have to carry around the heavy tanks all day, and as a result houses in the afternoon would get less coverage. And there were many instances of corruption with insecticides, because they were worth so much on the black market. People would apply diluted sprays even when they knew they were worthless." Typical of the logistical difficulties is what happened to the campaign in Malaysia. In Malaysian villages, the roofs of the houses were a thatch of palm fronds called atap. They were expensive to construct, and usually lasted five years. But within two years of DDT spraying the roofs started to fall down. As it happened, the atap is eaten by caterpillar larvae, which in turn are normally kept in check by parasitic wasps. But the DDT repelled the wasps, leaving the larvae free to devour the atap. "Then the Malaysians started to complain about bedbugs, and it turns out what normally happens is that ants like to eat bedbug larvae," McWilson Warren said. "But the ants were being killed by the DDT and the bedbugs weren't--they were pretty resistant to it. So now you had a bedbug problem." He went on, "The DDT spray teams would go into villages, and no one would be at home and the doors would be locked and you couldn't spray the house. And, understand, for that campaign to work almost every house had to be sprayed. You had to have eighty-per-cent coverage. I remember there was a malaria meeting in '62 in Saigon, and the Malaysians were saying that they could not eradicate malaria. It was not possible. And everyone was arguing with them, and they were saying, 'Look, it's not going to work.' And if Malaysia couldn't do it--and Malaysia was one of the most sophisticated places in the region--who could?"
At the same time, in certain areas DDT began to lose its potency. DDT kills by attacking a mosquito's nervous system, affecting the nerve cells so that they keep firing and the insect goes into a spasm, lurching, shuddering, and twitching before it dies. But in every population of mosquitoes there are a handful with a random genetic mutation that renders DDT nontoxic--that prevents it from binding to nerve endings. When mass spraying starts, those genetic outliers are too rare to matter. But, as time goes on, they are the only mosquitoes still breeding, and entire new generations of insects become resistant. In Greece, in the late nineteen-forties, for example, a malariologist noticed Anopheles sacharovi mosquitoes flying around a room that had been sprayed with DDT. In time, resistance began to emerge in areas where spraying was heaviest. To the malaria warriors, it was a shock. "Why should they have known?" Janet Hemingway, an expert in DDT resistance at the University of Wales in Cardiff, says. "It was the first synthetic insecticide. They just assumed that it would keep on working, and that the insects couldn't do much about it." Soper and the malariologist Paul Russell, who was his great ally, responded by pushing for an all-out war on malaria. We had to use DDT, they argued, or lose it. "If countries, due to lack of funds, have to proceed slowly, resistance is almost certain to appear and eradication will become economically impossible," Russell wrote in a 1956 report. "TIME IS OF THE ESSENCE because DDT resistance has appeared in six or seven years." But, with the administrative and logistical problems posed by the goal of eighty-per-cent coverage, that deadline proved impossible to meet.
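The selection dynamic described above can be sketched with a toy model, entirely my own construction rather than a claim about real mosquito genetics: if each round of spraying kills most of the susceptible mosquitoes and none of the resistant ones, and the survivors breed, a vanishingly rare resistance trait comes to dominate the population within a handful of generations.

    # Toy model of selection for DDT resistance (assumed, illustrative numbers).
    def resistant_fraction(rounds, initial_resistant=1e-6, kill_rate=0.95):
        r = initial_resistant
        for _ in range(rounds):
            susceptible_survivors = (1.0 - r) * (1.0 - kill_rate)
            # Survivors of this round form the next breeding generation.
            r = r / (r + susceptible_survivors)
        return r

    for n in (1, 5, 10):
        print(n, "rounds:", round(resistant_fraction(n), 5))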
5.
In 1963, the money from Congress ran out. Countries that had been told they could wipe out malaria in four years--and had diverted much of their health budgets to that effort--grew disillusioned as the years dragged on and eradication never materialized. Soon, they put their money back into areas that seemed equally pressing, like maternal and child health. Spraying programs were scaled back. In those countries where the disease had not been completely eliminated, malaria rates began to inch upward. In 1969, the World Health Organization formally abandoned global eradication, and in the ensuing years it proved impossible to muster any great enthusiasm from donors to fund antimalaria efforts. The W.H.O. now recommends that countries treat the disease largely through the health-care system--through elimination of the parasite--but many anti-malarial drugs are no longer effective. In the past thirty years, there have been outbreaks in India, Sri Lanka, Brazil, and South Korea, among other places. "Our troubles with mosquitoes are getting worse," Spielman concludes in "Mosquito," "making more people sick and claiming more lives, millions of lives, every year."
For Soper, the unravelling of his dream was pure torture. In 1959, he toured Asia to check on the eradication campaigns of Thailand, the Philippines, Ceylon, and India, and came back appalled at what he had seen. Again and again, he found, countries were executing his strategy improperly. They weren't spraying for long enough. They didn't realize that unless malaria was ground into submission it would come roaring back. But what could he do? He had prevailed against gambiae in Brazil in the nineteen-thirties because he had been in charge; he had worked with the country's dictator to make it illegal to prevent an inspector from entering a house, and illegal to prevent the inspector from treating any open container of water. Jesse Hobbs tells of running into Soper one day in Trinidad, after driving all day in an open jeep through the tropical heat. Soper drove up in a car and asked Hobbs to get in; Hobbs demurred, gesturing at his sweaty shirt. "Son," Soper responded, "we used to go out in a day like this in Brazil and if we found a sector chief whose shirt was not wet we'd fire him." Killing mosquitoes, Soper always said, was not a matter of knowledge and academic understanding; it was a matter of administration and discipline. "He used to say that if you have a democracy you can't have eradication," Litsios says. "When Soper was looking for a job at Johns Hopkins--this would have been '46--he told a friend that 'they turned me down because they said I was a fascist.'" Johns Hopkins was right, of course: he was a fascist--a disease fascist--because he believed a malaria warrior had to be. But now roofs were falling down in Malaysia, and inspectors were taking bribes, and local health officials did not understand the basic principles of eradication--and his critics had the audacity to blame his ideas, rather than their own weakness.
It was in this same period that Rachel Carson published "Silent Spring," taking aim at the environmental consequences of DDT. "The world has heard much of the triumphant war against disease through the control of insect vectors of infection," she wrote, alluding to the efforts of men like Soper, "but it has heard little of the other side of the story--the defeats, the short-lived triumphs that now strongly support the alarming view that the insect enemy has been made actually stronger by our efforts." There had already been "warnings," she wrote, of the problems created by pesticides:
On Nissan Island in the South Pacific, for example, spraying had been carried on intensively during the Second World War, but was stopped when hostilities came to an end. Soon swarms of a malaria-carrying mosquito reinvaded the island. All of its predators had been killed off and there had not been time for new populations to become established. The way was therefore clear for a tremendous population explosion. Marshall Laird, who had described this incident, compares chemical control to a treadmill; once we have set foot on it we are unable to stop for fear of the consequences.
It is hard to read that passage and not feel the heat of Soper's indignation. He was familiar with "Silent Spring"--everyone in the malaria world was--and what was Carson saying? Of course the mosquitoes came back when DDT spraying stopped. The question was whether the mosquitoes were gone long enough to disrupt the cycle of malaria transmission. The whole point of eradication, to his mind, was that it got you off the treadmill: DDT was so effective that if you used it properly you could stop spraying and not fear the consequences. Hadn't that happened in places like Taiwan and Jamaica and Sardinia?
"Silent Spring" was concerned principally with the indiscriminate use of DDT for agricultural purposes; in the nineteen-fifties, it was being sprayed like water in the Western countryside, in an attempt to control pests like the gypsy moth and the spruce budworm. Not all of Carson's concerns about the health effects of DDT have stood the test of time--it has yet to be conclusively linked to human illness--but her larger point was justified: DDT was being used without concern for its environmental consequences. It must have galled Soper, however, to see how Carson effectively lumped the malaria warriors with those who used DDT for economic gain. Nowhere in "Silent Spring" did Carson acknowledge that the chemical she was excoriating as a menace had, in the two previous decades, been used by malariologists to save somewhere in the vicinity of ten million lives. Nor did she make it clear how judiciously the public-health community was using the chemical. By the late fifties, health experts weren't drenching fields and streams and poisoning groundwater and killing fish. They were leaving a microscopic film on the inside walls of houses; spraying every house in a country the size of Guyana, for example, requires no more DDT in a year than a large cotton farm does. Carson quoted a housewife from Hinsdale, Illinois, who wrote about the damage left by several years of DDT spraying against bark beetles: "The town is almost devoid of robins and starlings; chickadees have not been on my shelf for two years, and this year the cardinals are gone too; the nesting population in the neighborhood seems to consist of one dove pair and perhaps one catbird family. . . . 'Will they ever come back?' [the children]ask, and I do not have the answer." Carson then quoted a bird-lover from Alabama:"There was not a sound of the song of a bird. It was eerie, terrifying. What was man doing to our perfect and beautiful world?" But to Soper the world was neither perfect nor beautiful, and the question of what man could do to nature was less critical than what nature, unimpeded, could do to man. Here, from a well-thumbed page inserted in Soper's diaries, is a description of a town in Egypt during that country's gambiae invasion of 1943--a village in the grip of its own, very different, unnatural silence:
Most houses are without roofs. They are just a square of dirty earth. In those courtyards and behind the doors of these hovels were found whole families lying on the floor; some were just too weakened by illness to get up and others were lying doubled up shaking from head to foot with their teeth chattering and their violently trembling hands trying in vain to draw some dirty rags around them for warmth. They were in the middle of the malaria crisis. There was illness in every house. There was hardly a house which had not had its dead and those who were left were living skeletons, their old clothing in rags, their limbs swollen from undernourishment and too weak to go into the fields to work or even to get food.
It must have seemed to Soper that the ground had shifted beneath his feet--that the absolutes that governed his life, that countenanced even the most extreme of measures in the fight against disease, had suddenly and bewilderingly been set aside. "I was on several groups who evaluated malaria-eradication programs in some of the Central American countries and elsewhere," Geoffrey Jeffery recalls. "Several times we came back with the answer that with the present technology and effort it wasn't going to work. Well, that didn't suit Soper very much. He harangued us. We shouldn't be saying things like that!" Wilbur Downs, a physician who worked for the Rockefeller Foundation in Mexico in the fifties, used to tell of a meeting with Soper and officials of the Mexican government about the eradication of malaria in that country. Soper had come down from Washington, and amid excited talk of ending malaria forever Downs pointed out that there were serious obstacles to eradication--among them the hastened decomposition and absorption of DDT by the clays forming adobe walls. It was all too much for Soper. This was the kind of talk that was impeding eradication--the doubting, the equivocation, the incompetence, the elevation of songbirds over human life. In the middle of the meeting, Soper--ramrod straight, eyes afire--strode over to Downs, put both his hands around his neck, and began to shake.
6.
Fred Soper ran up against the great moral of the late twentieth century--that even the best-intentioned efforts have perverse consequences, that benefits are inevitably offset by risks. This was the lesson of "Silent Spring," and it was the lesson, too, that malariologists would take from the experience with global eradication. DDT, Spielman argues, ought to be used as selectively as possible, to quell major outbreaks. "They should have had a strong rule against spraying the same villages again and again," he says. "But that went against their doctrine. They wanted eighty-per-cent coverage. They wanted eight out of ten houses year after year after year, and that's a sure formula for resistance." Soper and Russell once argued about whether, in addition to house spraying, malaria fighters should continue to drain swamps. Russell said yes; Soper said no, that it would be an unnecessary distraction. Russell was right: it made no sense to use only one weapon against malaria. Spielman points out that malaria transmission in sub-Saharan Africa is powerfully affected by the fact that so many people live in mud huts. The walls of that kind of house need to be constantly replastered, and to do that villagers dig mud holes around their huts. But a mud hole is a prime breeding spot for gambiae. If economic aid were directed at helping villagers build houses out of brick, Spielman argues, malaria could be dealt a blow. Similarly, the Princeton University malariologist Burton Singer says that since the forties it has been well known that mosquito larvae that hatch in rice fields--a major breeding site in southeast Asia--can be killed if the water level in the fields is intermittently drained, a practice that has the additional effect of raising rice yields. Are these perfect measures? No. But, under the right circumstances, they are sustainable. In a speech Soper presented on eradication, he quoted Louis Pasteur: "It is within the power of man to rid himself of every parasitic disease." The key phrase, for Soper, was "within the power." Soper believed that the responsibility of the public-health professional was to make an obligation out of what was possible. He never understood that concessions had to be made to what was practical. "This is the fundamental difference between those of us in public health who have an epidemiological perspective, and people, like Soper, with more of a medical approach," Spielman says. "We deal with populations over time, populations of individuals. They deal with individuals at a moment in time. Their best outcome is total elimination of the condition in the shortest possible period. Our first goal is to cause no outbreaks, no epidemics, to manage, to contain the infection." Bringing the absolutist attitudes of medicine to a malarious village, Spielman says, "is a good way to do a bad thing." The Fred Soper that we needed, in retrospect, was a man of more modest ambitions.
But, of course, Fred Soper with modest ambitions would not be Fred Soper; his epic achievements arose from his fanaticism, his absolutism, his commitment to saving as many lives as possible in the shortest period of time. For all the talk of his misplaced ambition, there are few people in history to whom so many owe their lives. The Global Malaria Eradication Programme helped eliminate the disease from the developed world, and from many parts of the developing world. In a number of cases where the disease returned, it came back at a lower level than it had been in the prewar years, and even in those places where eradication made little headway the campaign sometimes left in place a public infrastructure that had not existed before. The problem was that Soper had raised expectations too high. He had said that the only acceptable outcome for Global Eradication was global eradication, and when that did not happen he was judged--and, most important, he judged himself--a failure. But isn't the urgency Soper felt just what is lacking in the reasonableness of our contemporary attitude--in our caution and thoughtfulness and restraint? In the wake of the failure of eradication, it was popular to say that truly effective malaria control would have to await the development of a public-health infrastructure in poorer countries. Soper's response was, invariably: What about now? In a letter to a friend, he snapped, "The delay in handling malaria until it can be done by local health units is needlessly sacrificing the generation now living." There is something to admire in that attitude; it is hard to look at the devastation wrought by H.I.V. and malaria and countless other diseases in the Third World and not conclude that what we need, more than anything, is someone who will marshal the troops, send them house to house, monitor their every movement, direct their every success, and, should a day of indifference leave their shirts unsullied, send them packing. Toward the end of his life, Soper, who died in 1975, met with an old colleague, M. A. Farid, with whom he had fought gambiae in Egypt years before. "How do things go?" Soper began. "Bad!" Farid replied, for this was in the years when everyone had turned against Soper's vision. "Who will be our ally?" Soper asked. And Farid said simply, "Malaria," and Soper, he remembered, almost hugged him, because it was clear what Farid meant: Someday, when DDT is dead and buried, and the West wakes up to a world engulfed by malaria, we will think back on Fred Soper and wish we had another to take his place.
The Critics
July 16, 2001
"SUPER FRIENDS"
Sumner Redstone and the
rules of the corporate memoir.
1.
In the early nineteen-nineties, Sumner Redstone, the head of Viacom, wanted to merge his company with Paramount Communications. The problem was that the chairman of Paramount, Martin Davis, was being difficult. As Redstone recounts in his new autobiography, "A Passion to Win" (Simon & Schuster; $26), he and Davis would meet at, say, a charitable function, and Davis would make it sound as if the deal were imminent. Then, abruptly, he would back away. According to Redstone, Davis was a ditherer, a complicated and emotionally cold man who couldn't bear to part with his company. Yet, somehow in the course of their dealings, Redstone writes, he and Davis developed a "mutual respect and fond friendship." They became, he says a page later, "friends" who "enjoyed each other's company and were developing a close working rapport," and who had, he tells us two pages after that, "a great affection for each other." The turning point in the talks comes when Davis and Redstone have dinner in a dining room at Morgan Stanley, and Redstone is once more struck by how Davis "had a genuine affection for me." When the two have dinner again, this time at Redstone's suite in the Carlyle Hotel, Davis looks out over the spectacular lights of nighttime Manhattan and says, "You know, Sumner, when this deal gets done, they'll build a big statue of you in the middle of Central Park and I'll be forgotten." "No, Martin," Redstone replies. "They'll build statues of both of us and I will be looking up to you in admiration." Davis laughs. "It was just the right touch," Redstone reports, and one can almost imagine him at that point throwing a brawny forearm around Davis's shoulders and giving him a manly squeeze.
2.
"A Passion to Win," which Redstone wrote with Peter Knobler, is an account of a man's rise to the top of a multibillion-dollar media empire. It is the tale of the complex negotiations, blinding flashes of insight, and lengthy dinners at exclusive Manhattan hotels which created the colossus that is Viacom. But mostly it is a story about the value of friendship, about how very, very powerful tycoons like Redstone have the surprising ability to transcend their own egos and see each other, eye to eye, as human beings.
For instance, Gerald Levin, the head of Time Warner, might look like a rival of Redstone's. Not at all. He is, Redstone tells us, "a very close friend." Redstone says that he and Sherry Lansing, who heads Paramount Pictures, a division of Viacom, "are not just business associates, we are extremely close friends." So, too, with Geraldine Laybourne, who used to head Viacom's Nickelodeon division. "We were not just business associates," he writes, in the plainspoken manner that is his trademark. "We were friends." The singer Tony Bennett was one of Redstone's idols for years, and then one day Redstone's employees threw him a surprise birthday party and there was Bennett, who had come thousands of miles to sing a song. "Now," Redstone says proudly, "he is my friend." The producer Bob Evans? "A good friend." Aaron Spelling? "One of my closest friends." Bill and Hillary? "I have come to know and like the Clintons." Redstone's great friend Martin Davis warned him once about Barry Diller: "Don't trust him. He's got too big an ego." But Redstone disagreed. "Barry Diller and I were extremely friendly," he says. Ted Kennedy he met years ago, at a get-together of business executives. Everyone else was flattering Kennedy. Not so Redstone. A true friend is never disingenuous. As he recalls, he said to Kennedy,
"Look, I don't want to disagree with everybody, but, Senator, the problem is that you believe . . . that you can solve any problem by just throwing money at it. It doesn't work that way."
Conversation ceased, glances were exchanged. Everyone was appalled. Then Senator Kennedy said: "Sumner's right." . . . After that, Senator Kennedy called me regularly when he came to Boston and we developed a lasting friendship.
You might think that Redstone simply becomes friends with anyone he meets who is rich or famous. This is not the case. Once, Redstone was invited to dinner at the office of Robert Maxwell, the British press baron. It was no small matter, since Redstone is one of those moguls for whom dinner has enormous symbolic importance--it is the crucible in which friendships are forged. But Maxwell didn't show up until halfway through the meal: not a friend. On another occasion, Redstone hires a highly touted executive away from the retailing giant Wal-Mart to run his Blockbuster division, and then learns that the man is eating dinner alone in his hotel dining room and isn't inviting his fellow-executives to join him. Dinner alone? Redstone was worried. That was not friendly behavior. The executive, needless to say, was not long for Viacom.
What Redstone likes most in a friend is someone who reminds him of himself. "I respected Malone for having started with nothing and rising to become chairman of the very successful Tele-Communications, Inc.," Redstone writes of John Malone, the billionaire cable titan. "I had admired Kerkorian's success over the years," he says of Kirk Kerkorian, the billionaire corporate raider. "He started with nothing, and I have a special affection for people who start with nothing and create empires. . . . Today Kirk Kerkorian and I are friends." (They are so friendly, in fact, that they recently had a meal together at Spago.) Of his first meeting with John Antioco, whom Redstone would later hire to run Blockbuster--replacing the executive who ate dinner alone--Redstone writes, "We hit it off immediately. . . . He had come from humble beginnings, which I empathized with." Soon, the two men are dining together. Look at my life, Redstone seems to marvel again and again in "A Passion to Win." You think you see a hard-nosed mogul, selflessly wringing the last dollar out of megadeals on behalf of his shareholders. But inside that mogul beats a heart of warmth and compassion. Ever wonder how Redstone was able to pull off his recent mammoth merger with CBS? He happens to have lunch with the CBS chief, Mel Karmazin, in an exclusive Viacom corporate dining room, and discovers that he and Karmazin are kindred spirits. "Both of us had started with nothing," Redstone writes, "and ended up in control of major corporations." Can you believe it?
3.
In 1984, Lee Iacocca, the chairman of the resurgent Chrysler Corporation, published his autobiography, "Iacocca," with the writer William Novak. It was a charming book, in which Iacocca came across as a homespun, no-nonsense man of the people, and it sold more copies than any other business book in history. This was good news for Iacocca, because it made him a household name. But it was bad news for the rest of us, because it meant that an entire class of C.E.O.s promptly signed up ghostwriters and turned out memoirs designed to portray themselves as homespun, no-nonsense men of the people.
"Iacocca" began with a brief, dramatic prologue, in which he described his last day at Ford, where he had worked his entire life. He had just been fired by Henry Ford II, and it was a time of great personal crisis. "Before I left the house," he wrote, establishing the conflict between him and Henry Ford that would serve as the narrative engine of the book, "I kissed my wife, Mary, and my two daughters, Kathi and Lia. . . . Even today, their pain is what stays with me. It's like the lioness and her cubs. If the hunter knows what's good for him, he will leave the little ones alone. Henry Ford made my kids suffer, and for that I'll never forgive him." Now every C.E.O. book begins with a short, dramatic prologue, in which the author describes a day of great personal crisis that is intended to serve as the narrative engine of the book. In "Work in Progress," by Michael Eisner, Disney's C.E.O., it's the day he suffered chest pains at the Herb Allen conference in Sun Valley: "I spent much of dinner at Herb Allen's talking to Tom Brokaw, the NBC anchorman, who told me a long story about fly-fishing with his friend Robert Redford. . . . The pain in my arms returned." In "A Passion to Win," it's the day Redstone clung to a ledge during a fire at the Copley Plaza Hotel, in Boston, eventually suffering third-degree burns over forty-five per cent of his body: "The pain was excruciating but I refused to let go. That way was death."
Iacocca followed the dramatic prologue with a chapter on his humble origins. It opens, "Nicola Iacocca, my father, arrived in this country in 1902 at the age of twelve--poor, alone, and scared." Now every C.E.O. has humble origins. Then Iacocca spoke of an early mentor, a gruff, no-nonsense man who instilled lessons that guide him still. His name was Charlie Beacham, and he was "the kind of guy you'd charge up the hill for even though you knew very well you could get killed in the process. He had the rare gift of being tough and generous at the same time." Sure enough, everywhere now there are gruff, no-nonsense men instilling lessons that guide C.E.O.s to this day. ("Nobbe, who was in his sixties, was a stern disciplinarian and a tough guy who didn't take crap from anyone," writes the former Scott Paper and Sunbeam C.E.O. Al Dunlap, in his book "Mean Business." "He was always chewing me out. . . . Still, Nobbe rapidly won my undying respect and admiration because he wore his bastardness like a well-earned badge of honor.")
The legacy of "Iacocca" wouldn't matter so much if most C.E.O.s were, in fact, homespun men of the people who had gruff mentors, humble beginnings, and searing personal crises that shaped their lives and careers. But they aren't. Redstone's attempt to play the humble-beginnings card, for instance, is compromised by the fact that he didn't exactly have humble beginnings. Although his earliest years were spent in a tenement, his family's fortunes rapidly improved. He went to Harvard and Harvard Law School. His father was a highly successful businessman, and it was his father's company that served as the basis for the Viacom empire. (Just why Redstone continues to think that he comes from nothing, under the circumstances, is an interesting case study in the psychology of success: perhaps, if you are worth many billions, an upper-middle-class upbringing simply feels like nothing.) Eisner's personal crisis ends with him driving himself to Cedars-Sinai Hospital in Los Angeles--one of the best hospitals in the world--where he is immediately met by not one but two cardiologists, who take him to a third cardiologist, who tells Eisner that the procedure he is about to undergo, an angiogram, a common surgical procedure, is ninety-eight per cent safe. In "On the Firing Line," the former Apple C.E.O. Gil Amelio's day of personal crisis is triggered merely by walking down the halls of his new company: "In each of the offices near mine toiled some key executive I was just coming to know, wrestling with problems that would only gradually be revealed to me. I wondered what caged alligators they would let loose at me on some future date." Dunlap, meanwhile, tells us that one of his first acts as C.E.O. of Scott Paper was, in an orgy of unpretentiousness, to throw out the bookshelves in his office and replace them with Aboriginal paintings from Australia: "To me, the paintings made a lot more sense. They showed people who had to survive by their wits, people who couldn't call out for room service." Among Dunlap's gruff mentors was the Australian multimillionaire Kerry Packer, and one day, while playing tennis with Packer, Dunlap has a personal crisis. He pops a tendon. Packer rushes over, picks him up, and carries him to a lounge chair. "This was not only a wealthy man and a man who had political power, this was a physically powerful man," Dunlap reports. "In the end," he adds, taking Iacocca's lioness and Amelio's alligators to the next level, "Kerry and I split because we were just too similar. We were like two strong-willed, dominant animals who hunted together and brought down the biggest prey, but, when not hunting, fought each other." It is hard to read passages like these and not shudder at the prospect of the General Electric chairman Jack Welch's upcoming memoir, for which Warner Books paid a seven-million-dollar advance. Who will be tapped as the gruff mentor? What was Welch's career-altering personal crisis? What wild-animal metaphors will he employ? ("As I looked around the room, I felt like a young wildebeest being surveyed by a group of older and larger--but less nimble--wildebeests, whose superior market share and penetration of the herd were no match for my greater hunger, born of my impoverished middle-class upbringing in the jungles of suburban Boston.")
The shame of it is that many of these books could have been fascinating. Scattered throughout Eisner's "Work in Progress," for example, are numerous hints about how wonderfully weird and obsessive Eisner is. He hears that Universal is thinking of building a rival theme park four miles from Disney in Orlando, and he and his assistant climb the fence at the Universal construction site at three in the morning to check on its progress. He sneaks into performances of the musical "Beauty and the Beast" in Los Angeles at least a dozen times, and when the stage version of "The Lion King" has its first tryout, in Minneapolis, he flies there from Los Angeles half a dozen times during the course of one summer to give his "notes." When he is thinking of building Euro Disney, outside Paris, he is told that it takes half an hour to travel by Métro from the Arc de Triomphe to the end of the line, six miles from the Disney site. Eisner gets on the Métro to see for himself. He sets his watch. It takes twenty-five minutes.
By the end of the book, the truth is spilling out from under the breezy façade: Eisner is a compulsive, detail-oriented control freak. That fact, of course, says a lot about why Disney is successful. But you cannot escape the sense, while reading "Work in Progress," that you weren't supposed to reach that conclusion--that the bit about climbing the fence was supposed to be evidence of Eisner's boyish enthusiasm, and the bit about seeing "Beauty and the Beast" a dozen times was supposed to make it look as if he just loved the theatre. This is the sorry state of C.E.O. memoirs in the post-Iacocca era. It's only when they fail at their intended task that they truly succeed.
4.
"A Passion to Win" ought to have been a terrific book, because Redstone has a terrific story to tell. He graduated from Boston Latin High School with the highest grade-point average in the school's three-hundred-year history. During the war, he was a cryptographer, part of the team that successfully cracked Japanese military and diplomatic codes. After the war, he had a brilliant career as a litigator, arguing a case before the Supreme Court. The mobster Bugsy Siegel once offered him a job. Then Redstone took over his father's business, and, through a series of breathtaking acquisitions--Viacom, Paramount Communications, Blockbuster, and then CBS--turned himself, in the space of twenty years, into one of the richest men in the world.
What links all these successes, it becomes clear, is a very particular and subtle intelligence. Here, in one of the book's best passages, is Redstone's description of his dealings with Wayne Huizenga, the man from whom he bought Blockbuster. Huizenga, he writes, put together his empire by buying out local video stores around the country:
He and his Blockbuster associates would swoop in on some video guy who saw money for his store dangling from Huizenga's pockets. When negotiations came to an impasse, rather than say, "We have a problem with the proposal," and make a counteroffer, he would say, "Sorry we couldn't do a deal. Good luck to you," shake the guy's hand, pull on the leather coat and head for the elevator.
Seeing the deal about to fall apart, the video operator, who only moments before was seeing dollar signs, would run after him. "Wait, don't go. Come back. Let's talk about it." Huizenga hadn't hit the down button. He had been waiting. That's how he got his concessions.
When Redstone was negotiating for Blockbuster, Huizenga pulled the same stunt. It would be 2 a.m., Redstone says, and Huizenga would put on his coat and head for the exit. But Redstone was wise to him:
Huizenga would get to the elevator and no one would run after him. One time he waited there for fifteen minutes before it dawned on him that we weren't going to chase him. He got to his car. Nothing.
He would soon find some excuse to call--he left papers in our office--waiting for us to say, "Why don't you come back." Still, nothing. Once he was literally on his plane, perhaps even circling the neighborhood, when he phoned and said he had to be back in New York for a Merrill Lynch dinner anyway and maybe we could get together.
Redstone has a great intuitive grasp of people. He understood immediately that Huizenga was simply a bully. This kind of insight is hardly rare among people who make their living at the negotiating table. It's the skill of the poker player. But poker is a game of manipulation and exploitation--and Redstone doesn't seem to manipulate or exploit. He persuades and seduces: he would concede that your straight flush beat his three of a kind, but then, over a very long dinner at Spago, he would develop such a rapport with you that you'd willingly split the pot with him. It's no accident that, of Paramount's many suitors, Redstone won the day, because he realized that what Martin Davis needed was the assurance of friendship: he needed to hear about the two statues in Central Park, one gazing in admiration at the other. Redstone's peculiar gift also explains why he seems to have ended up as "friends" with so many of the people with whom he's done business. In Redstone's eyes, these people really are his friends. At the moment when he looked into Davis's eyes that night at the Carlyle, he absolutely believed they had a special bond--and, more important, he made Davis believe it, too. Redstone's heart happily follows his mind, and that's a powerful gift for someone whose success depends on the serial seduction of takeover targets.
Most of us, needless to say, don't think of friendships this way. Our hearts don't always follow our minds; they go off in crazy directions, and we develop loyalties that make no functional sense. But there's little of that fuzziness in Redstone's world, and perhaps that's why "A Passion to Win" is sometimes so chilling. A picture runs in the Post, Redstone tells us, that shows him walking down a street in Paris with "a beautiful woman." Phyllis, his wife of fifty-two years, files for divorce. The news hits him "like a bullet," he says. "I could not believe that she wanted to end it." It takes him only a few sentences, though, to recover from his wounds. "Of course, divorce settlement or no, my interest in Viacom's parent company, National Amusements, had been structured in such a way that events in Phyllis's and my personal life would not affect the ownership, control or management of Viacom," he assures us. Redstone says that he considered Frank Biondi, his longtime deputy at Viacom, "my friend." But one day he decides to get rid of Biondi, and immediately the gibes and cheap shots appear. Biondi is lazy. Biondi cannot negotiate deals. Biondi is not C.E.O. material. "Frank took the news calmly, almost as if he expected it," Redstone writes of the firing. "But I was shocked to learn that the first person he called was not his wife, but his lawyer to determine his rights under his contract. We were prepared to honor his contract to the fullest, so that was not an issue, but I found this implicit statement of his priorities to be revealing." What kind of person says this about a friend? Redstone aligns his passions with his interests, and when his interests change, so do his friendships.
At the very end of "A Passion to Win," Redstone recounts Viacom's merger with CBS. The deal meant that the network's C.E.O., Mel Karmazin, would come aboard as chief operating officer of Viacom. But that in turn meant that two of Redstone's most trusted executives, Tom Dooley and Philippe Dauman, would have to give up their posts as deputy chairmen. Redstone says that he was "shocked" when he was told this. Dooley and Dauman were not just business associates; they were his "close friends." Redstone says that he could not accept this, that there was "no way" he could agree to the deal if it meant losing his deputies. At this point, though, we simply don't believe him--we don't believe that someone as smart as Redstone wouldn't have realized this going into the deal with CBS, and we don't believe that Redstone's entirely instrumental friendships could possibly stand in the way of his getting bigger and richer. "A Passion to Win" would have told us much more about Redstone, and about business, if it had confronted this fact and tried to make sense of it. But Redstone is a supremely unself-conscious man, and that trait, which has served him so well in the business world, is fatal in an author. Karmazin comes. Dauman and Dooley go. Redstone moves blithely on to make new best friends.
Java Man
July 30, 2001
A CRITIC AT LARGE
How caffeine created the modern world.
1.
The original Coca-Cola was a late-nineteenth-century concoction known as Pemberton's French Wine Coca, a mixture of alcohol, the caffeine-rich kola nut, and coca, the raw ingredient of cocaine. In the face of social pressure, first the wine and then the coca were removed, leaving the more banal modern beverage in its place: carbonated, caffeinated sugar water with less kick to it than a cup of coffee. But is that the way we think of Coke? Not at all. In the nineteen-thirties, a commercial artist named Haddon Sundblom had the bright idea of posing a portly retired friend of his in a red Santa Claus suit with a Coke in his hand, and plastering the image on billboards and advertisements across the country. Coke, magically, was reborn as caffeine for children, caffeine without any of the weighty adult connotations of coffee and tea. It was--as the ads with Sundblom's Santa put it--"the pause that refreshes." It added life. It could teach the world to sing.
One of the things that have always made drugs so powerful is their cultural adaptability, their way of acquiring meanings beyond their pharmacology. We think of marijuana, for example, as a drug of lethargy, of disaffection. But in Colombia, the historian David T. Courtwright points out in "Forces of Habit" (Harvard; $24.95), "peasants boast that cannabis helps them to quita el cansancio or reduce fatigue; increase their fuerza and ánimo, force and spirit; and become incansable, tireless." In Germany right after the Second World War, cigarettes briefly and suddenly became the equivalent of crack cocaine. "Up to a point, the majority of the habitual smokers preferred to do without food even under extreme conditions of nutrition rather than to forgo tobacco," according to one account of the period. "Many housewives... bartered fat and sugar for cigarettes." Even a drug as demonized as opium has been seen in a more favorable light. In the eighteen-thirties, Franklin Delano Roosevelt's grandfather Warren Delano II made the family fortune exporting the drug to China, and Delano was able to sugarcoat his activities so plausibly that no one ever accused his grandson of being the scion of a drug lord. And yet, as Bennett Alan Weinberg and Bonnie K. Bealer remind us in their marvellous new book "The World of Caffeine" (Routledge; $27.50), there is no drug quite as effortlessly adaptable as caffeine, the Zelig of chemical stimulants.
At one moment, in one form, it is the drug of choice of café intellectuals and artists; in another, of housewives; in another, of Zen monks; and, in yet another, of children enthralled by a fat man who slides down chimneys. King Gustav III, who ruled Sweden in the latter half of the eighteenth century, was so convinced of the particular perils of coffee over all other forms of caffeine that he devised an elaborate experiment. A convicted murderer was sentenced to drink cup after cup of coffee until he died, with another murderer sentenced to a lifetime of tea drinking, as a control. (Unfortunately, the two doctors in charge of the study died before anyone else did; then Gustav was murdered; and finally the tea drinker died, at eighty-three, of old age--leaving the original murderer alone with his espresso, and leaving coffee's supposed toxicity in some doubt.) Later, the various forms of caffeine began to be divided up along sociological lines. Wolfgang Schivelbusch, in his book "Tastes of Paradise," argues that, in the eighteenth century, coffee symbolized the rising middle classes, whereas its great caffeinated rival in those years--cocoa, or, as it was known at the time, chocolate--was the drink of the aristocracy. "Goethe, who used art as a means to lift himself out of his middle class background into the aristocracy, and who as a member of a courtly society maintained a sense of aristocratic calm even in the midst of immense productivity, made a cult of chocolate, and avoided coffee," Schivelbusch writes. "Balzac, who despite his sentimental allegiance to the monarchy, lived and labored for the literary marketplace and for it alone, became one of the most excessive coffee-drinkers in history. Here we see two fundamentally different working styles and means of stimulation--fundamentally different psychologies and physiologies." Today, of course, the chief cultural distinction is between coffee and tea, which, according to a list drawn up by Weinberg and Bealer, have come to represent almost entirely opposite sensibilities:
Coffee Aspect      Tea Aspect
Male               Female
Boisterous         Decorous
Indulgence         Temperance
Hardheaded         Romantic
Topology           Geometry
Heidegger          Carnap
Beethoven          Mozart
Libertarian        Statist
Promiscuous        Pure
That the American Revolution began with the symbolic rejection of tea in Boston Harbor, in other words, makes perfect sense. Real revolutionaries would naturally prefer coffee. By contrast, the freedom fighters of Canada, a hundred years later, were most definitely tea drinkers. And where was Canada's autonomy won? Not on the blood-soaked fields of Lexington and Concord but in the genteel drawing rooms of Westminster, over a nice cup of Darjeeling and small, triangular cucumber sandwiches.
2.
All this is a bit puzzling. We don't fetishize the difference between salmon eaters and tuna eaters, or people who like their eggs sunny-side up and those who like them scrambled. So why invest so much importance in the way people prefer their caffeine? A cup of coffee has somewhere between a hundred and two hundred and fifty milligrams; black tea brewed for four minutes has between forty and a hundred milligrams. But the disparity disappears if you consider that many tea drinkers drink from a pot, and have more than one cup. Caffeine is caffeine. "The more it is pondered," Weinberg and Bealer write, "the more paradoxical this duality within the culture of caffeine appears. After all, both coffee and tea are aromatic infusions of vegetable matter, served hot or cold in similar quantities; both are often mixed with cream or sugar; both are universally available in virtually any grocery or restaurant in civilized society; and both contain the identical psychoactive alkaloid stimulant, caffeine."
It would seem to make more sense to draw distinctions based on the way caffeine is metabolized rather than on the way it is served. Caffeine, whether it is in coffee or tea or a soft drink, moves easily from the stomach and intestines into the bloodstream, and from there to the organs, and before long has penetrated almost every cell of the body. This is the reason that caffeine is such a wonderful stimulant. Most substances can't cross the blood-brain barrier, which is the body's defensive mechanism, preventing viruses or toxins from entering the central nervous system. Caffeine does so easily. Within an hour or so, it reaches its peak concentration in the brain, and there it does a number of things--principally, blocking the action of adenosine, the neuromodulator that makes you sleepy, lowers your blood pressure, and slows down your heartbeat. Then, as quickly as it builds up in your brain and tissues, caffeine is gone--which is why it's so safe. (Caffeine in ordinary quantities has never been conclusively linked to serious illness.)
But how quickly it washes away differs dramatically from person to person. A two-hundred-pound man who drinks a cup of coffee with a hundred milligrams of caffeine will have a maximum caffeine concentration of one milligram per kilogram of body weight. A hundred-pound woman having the same cup of coffee will reach a caffeine concentration of two milligrams per kilogram of body weight, or twice as high. In addition, when women are on the Pill, the rate at which they clear caffeine from their bodies slows considerably. (Some of the side effects experienced by women on the Pill may in fact be caffeine jitters caused by their sudden inability to tolerate as much coffee as they could before.) Pregnancy reduces a woman's ability to process caffeine still further. The half-life of caffeine in an adult is roughly three and a half hours. In a pregnant woman, it's eighteen hours. (Even a four-month-old child processes caffeine more efficiently.) An average man and woman sitting down for a cup of coffee are thus not pharmaceutical equals: in effect, the woman is under the influence of a vastly more powerful drug. Given these differences, you'd think that, instead of contrasting the caffeine cultures of tea and coffee, we'd contrast the caffeine cultures of men and women.
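To make that arithmetic concrete, here is a minimal sketch in Python of the calculation implied above. The pound-to-kilogram conversion, the function names, and the exponential-decay clearance model are assumptions introduced purely for illustration; only the dose, body-weight, and half-life figures come from the text.

```python
# Illustrative sketch only: peak caffeine concentration scales with body
# weight, and clearance is modeled here as simple exponential decay with a
# person-specific half-life. The decay model is a simplifying assumption.

LB_PER_KG = 2.205  # pounds per kilogram

def peak_concentration(dose_mg, weight_lb):
    """Peak caffeine concentration, in mg per kg of body weight."""
    return dose_mg / (weight_lb / LB_PER_KG)

def remaining_dose(dose_mg, hours, half_life_hours):
    """Caffeine left in the body after `hours`, assuming exponential decay."""
    return dose_mg * 0.5 ** (hours / half_life_hours)

if __name__ == "__main__":
    cup = 100  # mg of caffeine in one cup of coffee
    print(f"200-lb man:   {peak_concentration(cup, 200):.1f} mg/kg")
    print(f"100-lb woman: {peak_concentration(cup, 100):.1f} mg/kg")
    # Typical adult half-life ~3.5 hours vs. ~18 hours during pregnancy
    print(f"After 7 hours (adult):     {remaining_dose(cup, 7, 3.5):.0f} mg left")
    print(f"After 7 hours (pregnancy): {remaining_dose(cup, 7, 18):.0f} mg left")
```

Run as written, the sketch gives roughly 1.1 versus 2.2 mg/kg at the peak, and roughly 25 versus 76 mg still circulating seven hours later, which is the disparity the passage describes.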
3.
But we don't, and with good reason. To parse caffeine along gender lines does not do justice to its capacity to insinuate itself into every aspect of our lives, not merely to influence culture but even to create it. Take coffee's reputation as the "thinker's" drink. This dates from eighteenth-century Europe, where coffeehouses played a major role in the egalitarian, inclusionary spirit that was then sweeping the continent. They sprang up first in London, so alarming Charles II that in 1676 he tried to ban them. It didn't work. By 1700, there were hundreds of coffeehouses in London, their subversive spirit best captured by a couplet from a comedy of the period: "In a coffeehouse just now among the rabble / I bluntly asked, which is the treason table." The movement then spread to Paris, and by the end of the eighteenth century coffeehouses numbered in the hundreds--most famously, the Café de la Régence, near the Palais Royal, which counted among its customers Robespierre, Napoleon, Voltaire, Victor Hugo, Théophile Gautier, Rousseau, and the Duke of Richelieu. Previously, when men had gathered together to talk in public places, they had done so in bars, which drew from specific socioeconomic niches and, because of the alcohol they served, created a specific kind of talk. The new coffeehouses, by contrast, drew from many different classes and trades, and they served a stimulant, not a depressant. "It is not extravagant to claim that it was in these gathering spots that the art of conversation became the basis of a new literary style and that a new ideal of general education in letters was born," Weinberg and Bealer write.
It is worth noting, as well, that in the original coffeehouses nearly everyone smoked, and nicotine also has a distinctive physiological effect. It moderates mood and extends attention, and, more important, it doubles the rate of caffeine metabolism: it allows you to drink twice as much coffee as you could otherwise. In other words, the original coffeehouse was a place where men of all types could sit all day; the tobacco they smoked made it possible to drink coffee all day; and the coffee they drank inspired them to talk all day. Out of this came the Enlightenment. (The next time we so perfectly married pharmacology and place, we got Joan Baez.)
In time, caffeine moved from the café to the home. In America, coffee triumphed because of the country's proximity to the new Caribbean and Latin American coffee plantations, and the fact that throughout the nineteenth century duties were negligible. Beginning in the eighteen-twenties, Courtwright tells us, Brazil "unleashed a flood of slave-produced coffee. American per capita consumption, three pounds per year in 1830, rose to eight pounds by 1859."
What this flood of caffeine did, according to Weinberg and Bealer, was to abet the process of industrialization--to help "large numbers of people to coordinate their work schedules by giving them the energy to start work at a given time and continue it as long as necessary." Until the eighteenth century, it must be remembered, many Westerners drank beer almost continuously, even beginning their day with something called "beer soup." (Bealer and Weinberg helpfully provide the following eighteenth-century German recipe: "Heat the beer in a saucepan; in a separate small pot beat a couple of eggs. Add a chunk of butter to the hot beer. Stir in some cool beer to cool it, then pour over the eggs. Add a bit of salt, and finally mix all the ingredients together, whisking it well to keep it from curdling.") Now they began each day with a strong cup of coffee. One way to explain the industrial revolution is as the inevitable consequence of a world where people suddenly preferred being jittery to being drunk. In the modern world, there was no other way to keep up. That's what Edison meant when he said that genius was ninety-nine per cent perspiration and one per cent inspiration. In the old paradigm, working with your mind had been associated with leisure. It was only the poor who worked hard. (The quintessential pre-industrial narrative of inspiration belonged to Archimedes, who made his discovery, let's not forget, while taking a bath.) But Edison was saying that the old class distinctions no longer held true--that in the industrialized world there was as much toil associated with the life of the mind as there had once been with the travails of the body.
In the twentieth century, the professions transformed themselves accordingly: medicine turned the residency process into an ordeal of sleeplessness, the legal profession borrowed a page from the manufacturing floor and made its practitioners fill out time cards like union men. Intellectual heroics became a matter of endurance. "The pace of computation was hectic," James Gleick writes of the Manhattan Project in "Genius," his biography of the physicist Richard Feynman. "Feynman's day began at 8:30 and ended fifteen hours later. Sometimes he could not leave the computing center at all. He worked through for thirty-one hours once and the next day found that an error minutes after he went to bed had stalled the whole team. The routine allowed just a few breaks." Did Feynman's achievements reflect a greater natural talent than his less productive forebears had? Or did he just drink a lot more coffee? Paul Hoffman, in "The Man Who Loved Only Numbers," writes of the legendary twentieth-century mathematician Paul Erdös that "he put in nineteen-hour days, keeping himself fortified with 10 to 20 milligrams of Benzedrine or Ritalin, strong espresso and caffeine tablets. 'A mathematician,' Erdös was fond of saying, 'is a machine for turning coffee into theorems.'" Once, a friend bet Erdös five hundred dollars that he could not quit amphetamines for a month. Erdös took the bet and won, but, during his time of abstinence, he found himself incapable of doing any serious work. "You've set mathematics back a month," he told his friend when he collected, and immediately returned to his pills.
Erdös's unadulterated self was less real and less familiar to him than his adulterated self, and that is a condition that holds, more or less, for the rest of society as well. Part of what it means to be human in the modern age is that we have come to construct our emotional and cognitive states not merely from the inside out--with thought and intention--but from the outside in, with chemical additives. The modern personality is, in this sense, a synthetic creation: skillfully regulated and medicated and dosed with caffeine so that we can always be awake and alert and focussed when we need to be. On a bet, no doubt, we could walk away from caffeine if we had to. But what would be the point? The lawyers wouldn't make their billable hours. The young doctors would fall behind in their training. The physicists might still be stuck out in the New Mexico desert. We'd set the world back a month.
4.
That the modern personality is synthetic is, of course, a disquieting notion. When we talk of synthetic personality--or of constructing new selves through chemical means--we think of hard drugs, not caffeine. Timothy Leary used to make such claims about LSD, and the reason his revolution never took flight was that most of us found the concept of tuning in, turning on, and dropping out to be a bit creepy. Here was this shaman, this visionary--and yet, if his consciousness was so great, why was he so intent on altering it? More important, what exactly were we supposed to be tuning in to? We were given hints, with psychedelic colors and deep readings of "Lucy in the Sky with Diamonds," but that was never enough. If we are to re-create ourselves, we would like to know what we will become.
Caffeine is the best and most useful of our drugs because in every one of its forms it can answer that question precisely. It is a stimulant that blocks the action of adenosine, and comes in a multitude of guises, each with a ready-made story attached, a mixture of history and superstition and whimsy which infuses the daily ritual of adenosine blocking with meaning and purpose. Put caffeine in a red can and it becomes refreshing fun. Brew it in a teapot and it becomes romantic and decorous. Extract it from little brown beans and, magically, it is hardheaded and potent. "There was a little known Russian émigré, Trotsky by name, who during World War I was in the habit of playing chess in Vienna's Café Central every evening," Bealer and Weinberg write, in one of the book's many fascinating café yarns:
A typical Russian refugee, who talked too much but seemed utterly harmless, indeed, a pathetic figure in the eyes of the Viennese. One day in 1917 an official of the Austrian Foreign Ministry rushed into the minister's room, panting and excited, and told his chief, "Your excellency . . . Your excellency . . . Revolution has broken out in Russia." The minister, less excitable and less credulous than his official, rejected such a wild claim and retorted calmly, "Go away . . . Russia is not a land where revolutions break out. Besides, who on earth would make a revolution in Russia? Perhaps Herr Trotsky from the Café Central?"
The minister should have known better. Give a man enough coffee and he's capable of anything.
Drugstore Athlete
September 10, 2001
THE SPORTING SCENE
To beat the competition,
first you have to beat the drug test.
1.
At the age of twelve, Christiane Knacke-Sommer was plucked from a small town in Saxony to train with the elite SC Dynamo swim club, in East Berlin. After two years of steady progress, she was given regular injections and daily doses of small baby-blue pills, which she was required to take in the presence of a trainer. Within weeks, her arms and shoulders began to thicken. She developed severe acne. Her pubic hair began to spread over her abdomen. Her libido soared out of control. Her voice turned gruff. And her performance in the pool began to improve dramatically, culminating in a bronze medal in the hundred-metre butterfly at the 1980 Moscow Olympics. But then the Wall fell and the truth emerged about those little blue pills. In a new book about the East German sports establishment, "Faust's Gold," Steven Ungerleider recounts the moment in 1998 when Knacke-Sommer testified in Berlin at the trial of her former coaches and doctors:
"Did defendant Gläser or defendant Binus ever tell you that the blue pills were the anabolic steroid known as Oral-Turinabol?" the prosecutor asked. "They told us they were vitamin tablets," Christiane said, "just like they served all the girls with meals." "Did defendant Binus ever tell you the injection he gave was Depot-Turinabol?" "Never," Christiane said, staring at Binus until the slight, middle-aged man looked away. "He said the shots were another kind of vitamin." "He never said he was injecting you with the male hormone testosterone?" the prosecutor persisted. "Neither he nor Herr Gläser ever mentioned Oral-Turinabol or Depot-Turinabol," Christiane said firmly. "Did you take these drugs voluntarily?" the prosecutor asked in a kindly tone. "I was fifteen years old when the pills started," she replied, beginning to lose her composure. "The training motto at the pool was, 'You eat the pills, or you die.' It was forbidden to refuse."
As her testimony ended, Knacke-Sommer pointed at the two defendants and shouted, "They destroyed my body and my mind!" Then she rose and threw her Olympic medal to the floor.
Anabolic steroids have been used to enhance athletic performance since the early sixties, when an American physician gave the drugs to three weight lifters, who promptly jumped from mediocrity to world records. But no one ever took the use of illegal drugs quite so far as the East Germans. In a military hospital outside the former East Berlin, in 1991, investigators discovered a ten-volume archive meticulously detailing every national athletic achievement from the mid-sixties to the fall of the Berlin Wall, each entry annotated with the name of the drug and the dosage given to the athlete. An average teen-age girl naturally produces somewhere around half a milligram of testosterone a day. The East German sports authorities routinely prescribed steroids to young adolescent girls in doses of up to thirty-five milligrams a day. As the investigation progressed, former female athletes, who still had masculinized physiques and voices, came forward with tales of deformed babies, inexplicable tumors, liver dysfunction, internal bleeding, and depression. German prosecutors handed down hundreds of indictments of former coaches, doctors, and sports officials, and won numerous convictions. It was the kind of spectacle that one would have thought would shock the sporting world. Yet it didn't. In a measure of how much the use of drugs in competitive sports has changed in the past quarter century, the trials caused barely a ripple.
Today, coaches no longer have to coerce athletes into taking drugs. Athletes take them willingly. The drugs themselves are used in smaller doses and in creative combinations, leaving few telltale physical signs, and drug testers concede that it is virtually impossible to catch all the cheaters, or even, at times, to do much more than guess when cheating is taking place. Among the athletes, meanwhile, there is growing uncertainty about what exactly is wrong with doping. When the cyclist Lance Armstrong asserted last year, after his second consecutive Tour de France victory, that he was drug-free, some doubters wondered whether he was lying, and others simply assumed he was, and wondered why he had to. The moral clarity of the East German scandal--with its coercive coaches, damaged athletes, and corrupted competitions--has given way to shades of gray. In today's climate, the most telling moment of the East German scandal was not Knacke-Sommer's outburst. It was when one of the system's former top officials, at the beginning of his trial, shrugged and quoted Brecht: "Competitive sport begins where healthy sport ends."
2.
Perhaps the best example of how murky the drug issue has become is the case of Ben Johnson, the Canadian sprinter who won the one hundred metres at the Seoul Olympics, in 1988. Johnson set a new world record, then failed a post-race drug test and was promptly stripped of his gold medal and suspended from international competition. No athlete of Johnson's calibre has ever been exposed so dramatically, but his disgrace was not quite the victory for clean competition that it appeared to be.
Johnson was part of a group of world-class sprinters based in Toronto in the nineteen-seventies and eighties and trained by a brilliant coach named Charlie Francis. Francis was driven and ambitious, eager to give his athletes the same opportunities as their competitors from the United States and Eastern Europe, and in 1979 he began discussing steroids with one of his prize sprinters, Angella Taylor. Francis felt that Taylor had the potential that year to run the two hundred metres in close to 22.90 seconds, a time that would put her within striking distance of the two best sprinters in the world, Evelyn Ashford, of the United States, and Marita Koch, of East Germany. But, seemingly out of nowhere, Ashford suddenly improved her two-hundred-metre time by six-tenths of a second. Then Koch ran what Francis calls, in his autobiography, "Speed Trap," a "science fictional" 21.71. In the sprints, individual improvements are usually measured in hundredths of a second; athletes, once they have reached their early twenties, typically improve their performance in small, steady increments, as experience and strength increase. But these were quantum leaps, and to Francis the explanation was obvious. "Angella wasn't losing ground because of a talent gap," he writes; "she was losing because of a drug gap, and it was widening by the day." (In the case of Koch, at least, he was right. In the East German archives, investigators found a letter from Koch to the director of research at V.E.B. Jenapharm, an East German pharmaceutical house, in which she complained, "My drugs were not as potent as the ones that were given to my opponent Bärbel Eckert, who kept beating me." In East Germany, Ungerleider writes, this particular complaint was known as "dope-envy.") Later, Francis says, he was confronted at a track meet by Brian Oldfield, then one of the world's best shot-putters:
"When are you going to start getting serious?" he demanded. "When are you going to tell your guys the facts of life?" I asked him how he could tell they weren't already using steroids. He replied that the muscle density just wasn't there. "Your guys will never be able to compete against the Americans--their careers will be over," he persisted.
Among world-class athletes, the lure of steroids is not that they magically transform performance--no drug can do that--but that they make it possible to train harder. An aging baseball star, for instance, may realize that what he needs to hit a lot more home runs is to double the intensity of his weight training. Ordinarily, this might actually hurt his performance. "When you're under that kind of physical stress," Charles Yesalis, an epidemiologist at Pennsylvania State University, says, "your body releases corticosteroids, and when your body starts making those hormones at inappropriate times it blocks testosterone. And instead of being anabolic--instead of building muscle--corticosteroids are catabolic. They break down muscle. That's clearly something an athlete doesn't want." Taking steroids counteracts the impact of corticosteroids and helps the body bounce back faster. If that home-run hitter was taking testosterone or an anabolic steroid, he'd have a better chance of handling the extra weight training.
It was this extra training that Francis and his sprinters felt they needed to reach the top. Angella Taylor was the first to start taking steroids. Ben Johnson followed in 1981, when he was twenty years old, beginning with a daily dose of five milligrams of the steroid Dianabol, in three-week on-and-off cycles. Over time, that protocol grew more complex. In 1984, Taylor visited a Los Angeles doctor, Robert Kerr, who was famous for his willingness to provide athletes with pharmacological assistance. He suggested that the Canadians use human growth hormone, the pituitary extract that promotes lean muscle and that had become, in Francis's words, "the rage in elite track circles." Kerr also recommended three additional substances, all of which were believed to promote the body's production of growth hormone: the amino acids arginine and ornithine and the dopamine precursor L-dopa. "I would later learn," Francis writes, "that one group of American women was using three times as much growth hormone as Kerr had suggested, in addition to 15 milligrams per day of Dianabol, another 15 milligrams of Anavar, large amounts of testosterone, and thyroxine, the synthetic thyroid hormone used by athletes to speed the metabolism and keep people lean." But the Canadians stuck to their initial regimen, making only a few changes: Vitamin B12, a non-steroidal muscle builder called inosine, and occasional shots of testosterone were added; Dianabol was dropped in favor of a newer steroid called Furazabol; and L-dopa, which turned out to cause stiffness, was replaced with the blood-pressure drug Dixarit.
Going into the Seoul Olympics, then, Johnson was a walking pharmacy. But--and this is the great irony of his case--none of the drugs that were part of his formal pharmaceutical protocol resulted in his failed drug test. He had already reaped the benefit of the steroids in intense workouts leading up to the games, and had stopped Furazabol and testosterone long enough in advance that all traces of both supplements should have disappeared from his system by the time of his race--a process he sped up by taking the diuretic Moduret. Human growth hormone wasn't--and still isn't--detectable by a drug test, and arginine, ornithine, and Dixarit were legal. Johnson should have been clean. The most striking (and unintentionally hilarious) moment in "Speed Trap" comes when Francis describes his bewilderment at being informed that his star runner had failed a drug test--for the anabolic steroid stanozolol. "I was floored," Francis writes:
To my knowledge, Ben had never injected stanozolol. He occasionally used Winstrol, an oral version of the drug, but for no more than a few days at a time, since it tended to make him stiff. He'd always discontinued the tablets at least six weeks before a meet, well beyond the accepted "clearance time." . . . After seven years of using steroids, Ben knew what he was doing. It was inconceivable to me that he might take stanozolol on his own and jeopardize the most important race of his life.
Francis suggests that Johnson's urine sample might have been deliberately contaminated by a rival, a charge that is less preposterous than it sounds. Documents from the East German archive show, for example, that in international competitions security was so lax that urine samples were sometimes switched, stolen from a "clean" athlete, or simply "borrowed" from a noncompetitor. "The pure urine would either be infused by a catheter into the competitor's bladder (a rather painful procedure) or be held in condoms until it was time to give a specimen to the drug control lab," Ungerleider writes. (The top East German sports official Manfred Höppner was once in charge of urine samples at an international weight-lifting competition. When he realized that several of his weight lifters would not pass the test, he broke open the seal of their specimens, poured out the contents, and, Ungerleider notes, "took a nice long leak of pure urine into them.") It is also possible that Johnson's test was simply botched. Two years later, in 1990, track and field's governing body claimed that Butch Reynolds, the world's four-hundred-metre record holder, had tested positive for the steroid nandrolone, and suspended him for two years. It did so despite the fact that half of his urine-sample data had been misplaced, that the testing equipment had failed during analysis of the other half of his sample, and that the lab technician who did the test identified Sample H6 as positive--and Reynolds's sample was numbered H5. Reynolds lost the prime years of his career.
We may never know what really happened with Johnson's assay, and perhaps it doesn't much matter. He was a doper. But clearly this was something less than a victory for drug enforcement. Here was a man using human growth hormone, Dixarit, inosine, testosterone, and Furazabol, and the only substance that the testers could find in him was stanozolol--which may have been the only illegal drug that he hadn't used. Nor is it encouraging that Johnson was the only prominent athlete caught for drug use in Seoul. It is hard to believe, for instance, that the sprinter Florence Griffith Joyner, the star of the Seoul games, was clean. Before 1988, her best times in the hundred metres and the two hundred metres were, respectively, 10.96 and 21.96. In 1988, a suddenly huskier FloJo ran 10.49 and 21.34, times that no runner since has even come close to equalling. In other words, at the age of twenty-eight--when most athletes are beginning their decline--Griffith Joyner transformed herself in one season from a career-long better-than-average sprinter to the fastest female sprinter in history. Of course, FloJo never failed a drug test. But what does that prove? FloJo went on to make a fortune as a corporate spokeswoman. Johnson's suspension cost him an estimated twenty-five million dollars in lost endorsements. The real lesson of the Seoul Olympics may simply have been that Johnson was a very unlucky man.
3.
The basic problem with drug testing is that testers are always one step behind athletes. It can take years for sports authorities to figure out what drugs athletes are using, and even longer to devise effective means of detecting them. Anabolic steroids weren't banned by the International Olympic Committee until 1975, almost a decade after the East Germans started using them. In 1996, at the Atlanta Olympics, five athletes tested positive for what we now know to be the drug Bromantan, but they weren't suspended, because no one knew at the time what Bromantan was. (It turned out to be a Russian-made psycho-stimulant.) Human growth hormone, meanwhile, has been around for twenty years, and testers still haven't figured out how to detect it.
Perhaps the best example of the difficulties of drug testing is testosterone. It has been used by athletes to enhance performance since the fifties, and the International Olympic Committee announced that it would crack down on testosterone supplements in the early nineteen-eighties. This didn't mean that the I.O.C. was going to test for testosterone directly, though, because the testosterone that athletes were getting from a needle or a pill was largely indistinguishable from the testosterone they produce naturally. What was proposed, instead, was to compare the level of testosterone in urine with the level of another hormone, epitestosterone, to determine what's called the T/E ratio. For most people, under normal circumstances, that ratio is 1:1, and so the theory was that if testers found a lot more testosterone than epitestosterone it would be a sign that the athlete was cheating. Since a small number of people have naturally high levels of testosterone, the I.O.C. avoided the risk of falsely accusing anyone by setting the legal limit at 6:1.
Did this stop testosterone use? Not at all. Through much of the eighties and nineties, most sports organizations conducted their drug testing only at major competitions. Athletes taking testosterone would simply do what Johnson did, and taper off their use in the days or weeks prior to those events. So sports authorities began randomly showing up at athletes' houses or training sites and demanding urine samples. To this, dopers responded by taking extra doses of epitestosterone with their testosterone, so their T/E would remain in balance. Testers, in turn, began treating elevated epitestosterone levels as suspicious, too. But that still left athletes with the claim that they were among the few with naturally elevated testosterone. Testers, then, were forced to take multiple urine samples, measuring an athlete's T/E ratio over several weeks. Someone with a naturally elevated T/E ratio will have fairly consistent ratios from week to week. Someone who is doping will have telltale spikes--times immediately after taking shots or pills when the level of the hormone in his blood soars. Did all these precautions mean that cheating stopped? Of course not. Athletes have now switched from injection to transdermal testosterone patches, which administer a continuous low-level dose of the hormone, smoothing over the old, incriminating spikes. The patch has another advantage: once you take it off, your testosterone level will drop rapidly, returning to normal, depending on the dose and the person, in as little as an hour. "It's the peaks that get you caught," says Don Catlin, who runs the U.C.L.A. Olympic Analytical Laboratory. "If you took a pill this morning and an unannounced test comes this afternoon, you'd better have a bottle of epitestosterone handy. But, if you are on the patch and you know your own pharmacokinetics, all you have to do is pull it off." In other words, if you know how long it takes for you to get back under the legal limit and successfully stall the test for that period, you can probably pass the test. And if you don't want to take that chance, you can just keep your testosterone below 6:1, which, by the way, still provides a whopping performance benefit. "The bottom line is that only careless and stupid people ever get caught in drug tests," Charles Yesalis says. "The élite athletes can hire top medical and scientific people to make sure nothing bad happens, and you can't catch them."
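The screening logic described in this passage amounts to two checks: a single-sample threshold and a consistency check across repeated samples. Here is a minimal sketch of that logic; the 6:1 limit comes from the text, but the sample data, the function names, and the particular "spike" heuristic are assumptions made purely for illustration, not any real testing protocol.

```python
# Sketch of the two-stage T/E screening idea described above: flag a single
# sample over the 6:1 limit, and flag a series of samples whose ratio spikes
# rather than staying consistently (i.e., naturally) elevated. The spike
# heuristic -- comparing the highest ratio to the athlete's median -- is invented.

T_E_LIMIT = 6.0

def over_limit(testosterone, epitestosterone):
    """True if a single urine sample exceeds the 6:1 T/E limit."""
    return testosterone / epitestosterone > T_E_LIMIT

def suspicious_history(ratios, spike_factor=2.0):
    """True if repeated samples show a telltale spike rather than a
    consistently elevated ratio."""
    ordered = sorted(ratios)
    median = ordered[len(ordered) // 2]
    return max(ordered) > spike_factor * median

# A naturally high but steady profile vs. one that jumps right after a dose.
print(suspicious_history([4.8, 5.1, 4.9, 5.0]))   # False: consistent week to week
print(suspicious_history([1.2, 1.1, 5.8, 1.3]))   # True: spike after a shot or pill
```

The point of the sketch is simply that a steady 5:1 ratio looks "natural" under this scheme, while a profile that leaps from 1:1 to nearly 6:1 and back does not, which is exactly why a patch that smooths over the peaks defeats it.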
4.
But here is where the doping issue starts to get complicated, for there's a case to be made that what looks like failure really isn't--that regulating aggressive doping, the way the 6:1 standard does, is a better idea than trying to prohibit drug use. Take the example of erythropoietin, or EPO. EPO is a hormone released by your kidneys that stimulates the production of red blood cells, the body's oxygen carriers. A man-made version of the hormone is given to those with suppressed red-blood-cell counts, like patients undergoing kidney dialysis or chemotherapy. But over the past decade it has also become the drug of choice for endurance athletes, because its ability to increase the amount of oxygen that the blood can carry to the muscles has the effect of postponing fatigue. "The studies that have attempted to estimate EPO's importance say it's worth about a three-, four-, or five-per-cent advantage, which is huge," Catlin says. EPO also has the advantage of being a copy of a naturally occurring substance, so it's very hard to tell if someone has been injecting it. (A cynic would say that this had something to do with the spate of remarkable times in endurance races during that period.)
So how should we test for EPO? One approach, which was used in the late nineties by the International Cycling Union, is a test much like the T/E ratio for testosterone. The percentage of your total blood volume which is taken up by red blood cells is known as your hematocrit. The average adult male has a hematocrit of between thirty-eight and forty-four per cent. Since 1995, the cycling authorities have declared that any rider who had a hematocrit above fifty per cent would be suspended--a deliberately generous standard (like the T/E ratio) meant to avoid falsely accusing someone with a naturally high hematocrit. The hematocrit rule also had the benefit of protecting athletes' health. If you take too much EPO, the profusion of red blood cells makes the blood sluggish and heavy, placing enormous stress on the heart. In the late eighties, at least fifteen professional cyclists died from suspected EPO overdoses. A fifty-per-cent hematocrit limit is below the point at which EPO becomes dangerous.
But, like the T/E standard, the hematocrit standard had a perverse effect: it set the legal limit so high that it actually encouraged cyclists to titrate their drug use up to the legal limit. After all, if you are riding for three weeks through the mountains of France and Spain, there's a big difference between a hematocrit of forty-four per cent and one of 49.9 per cent. This is why Lance Armstrong faced so many hostile questions about EPO from the European press--and why eyebrows were raised at his five-year relationship with an Italian doctor who was thought to be an expert on performance-enhancing drugs. If Armstrong had, say, a hematocrit of forty-four per cent, the thinking went, why wouldn't he have raised it to 49.9, particularly since the rules (at least, in 2000) implicitly allowed him to do so. And, if he didn't, how on earth did he win?
The problems with hematocrit testing have inspired a second strategy, which was used on a limited basis at the Sydney Olympics and this summer's World Track and Field Championships. This test measures a number of physiological markers of EPO use, including the presence of reticulocytes, which are the immature red blood cells produced in large numbers by EPO injections. If you have a lot more reticulocytes than normal, then there's a good chance you've used EPO recently. The blood work is followed by a confirmatory urinalysis. The test has its weaknesses. It's really only useful in picking up EPO used in the previous week or so, whereas the benefits of taking the substance persist for a month. But there's no question that, if random EPO testing were done aggressively in the weeks leading to a major competition, it would substantially reduce cheating.
On paper, this second strategy sounds like a better system. But there's a perverse effect here as well. By discouraging EPO use, the test is simply pushing savvy athletes toward synthetic compounds called hemoglobin-based oxygen carriers, which serve much the same purpose as EPO but for which there is no test at the moment. "I recently read off a list of these new blood-oxygen expanders to a group of toxicologists, and none had heard of any of them," Yesalis says. "That's how fast things are moving." The attempt to prevent EPO use actually promotes inequity: it gives an enormous advantage to those athletes with the means to keep up with the next wave of pharmacology. By contrast, the hematocrit limit, though more permissive, creates a kind of pharmaceutical parity. The same is true of the T/E limit. At the 1986 world swimming championships, the East German Kristin Otto set a world record in the hundred-metre freestyle, with an extraordinary display of power in the final leg of the race. According to East German records, on the day of her race Otto had a T/E ratio of 18:1. Testing can prevent that kind of aggressive doping; it can insure no one goes above 6:1. That is a less than perfect outcome, of course, but international sports is not a perfect world. It is a place where Ben Johnson is disgraced and FloJo runs free, where Butch Reynolds is barred for two years and East German coaches pee into cups--and where athletes without access to the cutting edge of medicine are condemned to second place. Since drug testers cannot protect the purity of sport, the very least they can do is to make sure that no athlete can cheat more than any other.
5.
The first man to break the four-minute mile was the Englishman Roger Bannister, on a windswept cinder track at Oxford, nearly fifty years ago. Bannister is in his early seventies now, and one day last summer he returned to the site of his historic race along with the current world-record holder in the mile, Morocco's Hicham El Guerrouj. The two men chatted and compared notes and posed for photographs. "I feel as if I am looking at my mirror image," Bannister said, indicating El Guerrouj's similarly tall, high-waisted frame. It was a polite gesture, an attempt to suggest that he and El Guerrouj were part of the same athletic lineage. But, as both men surely knew, nothing could be further from the truth.
Bannister was a medical student when he broke the four-minute mile in 1954. He did not have time to train every day, and when he did he squeezed in his running on his hour-long midday break at the hospital. He had no coach or trainer or entourage, only a group of running partners who called themselves "the Paddington lunch time club." In a typical workout, they might run ten consecutive quarter miles--ten laps--with perhaps two minutes of recovery between each repetition, then gobble down lunch and hurry back to work. Today, that training session would be considered barely adequate for a high-school miler. A month or so before his historic mile, Bannister took a few days off to go hiking in Scotland. Five days before he broke the four-minute barrier, he stopped running entirely, in order to rest. The day before the race, he slipped and fell on his hip while working in the hospital. Then he ran the most famous race in the history of track and field. Bannister was what runners admiringly call an "animal," a natural.
El Guerrouj, by contrast, trains five hours a day, in two two-and-a-half-hour sessions. He probably has a team of half a dozen people working with him: at the very least, a masseur, a doctor, a coach, an agent, and a nutritionist. He is not in medical school. He does not go hiking in rocky terrain before major track meets. When Bannister told him, last summer, how he had prepared for his four-minute mile, El Guerrouj was stunned. "For me, a rest day is perhaps when I train in the morning and spend the afternoon at the cinema," he said. El Guerrouj certainly has more than his share of natural ability, but his achievements are a reflection of much more than that: of the fact that he is better coached and better prepared than his opponents, that he trains harder and more intelligently, that he has found a way to stay injury free, and that he can recover so quickly from one day of five-hour workouts that he can follow it, the next day, with another five-hour workout.
Of these two paradigms, we have always been much more comfortable with the first: we want the relation between talent and achievement to be transparent, and we worry about the way ability is now so aggressively managed and augmented. Steroids bother us because they violate the honesty of effort: they permit an athlete to train too hard, beyond what seems reasonable. EPO fails the same test. For years, athletes underwent high-altitude training sessions, which had the same effect as EPO--promoting the manufacture of additional red blood cells. This was considered acceptable, while EPO is not, because we like to distinguish between those advantages which are natural or earned and those which come out of a vial.
Even as we assert this distinction on the playing field, though, we defy it in our own lives. We have come to prefer a world where the distractible take Ritalin, the depressed take Prozac, and the unattractive get cosmetic surgery to a world ruled, arbitrarily, by those fortunate few who were born focussed, happy, and beautiful. Cosmetic surgery is not "earned" beauty, but then natural beauty isn't earned, either. One of the principal contributions of the late twentieth century was the moral deregulation of social competition--the insistence that advantages derived from artificial and extraordinary intervention are no less legitimate than the advantages of nature. All that athletes want, for better or worse, is the chance to play by those same rules.
Operation Rescue
September 17, 2001
COMMENT
One of the most striking aspects of the automobile industry is the precision with which it makes calculations of life and death. The head restraint on the back of a car seat has been determined to reduce an occupant's risk of dying in an accident by 0.36 per cent. The steel beams in a car's side doors cut fatalities by 1.7 per cent. The use of a seat belt in a right-front collision reduces the chances of a front-seat passenger's being killed through ejection by fourteen per cent, with a margin of error of plus or minus one per cent. When auto engineers discuss these numbers, they use detailed charts and draw curves on quadrille paper, understanding that it is through the exact and dispassionate measurement of fatality effects and the resulting technical tinkering that human lives are saved. They could wax philosophical about the sanctity of life, but what would that accomplish? Sometimes progress in matters of social policy occurs when the moralizers step back and the tinkerers step forward. In the face of the right-to-life debate in the country, and of show trials like the Bush Administration's recent handling of the stem-cell controversy, it's worth wondering what would happen if those involved in that debate were to learn the same lesson.
Suppose, for example, that, instead of focussing on the legality of abortion, we focussed on the number of abortions in this country. That's the kind of thing that tinkerers do: they focus not on the formal status of social phenomena but on their prevalence. And the prevalence of abortion in this country is striking. In 1995, for example, American adolescents terminated pregnancies at a rate roughly a third greater than their Canadian, English, and Swedish counterparts, around triple that of French teen-agers, and six times that of Dutch and Italian adolescents.
This is not because abortions are more readily available in America. The European countries with the lowest abortion rates are almost all places where abortions are easier to get than they are in the United States. And it's not because pregnant European teen-agers are more likely to carry a child to term than Americans. (If anything, the opposite is true.) Nor is it because American teen-agers have more sex than Europeans: sexual behavior, in the two places, appears to be much the same. American teen-agers have more abortions because they get pregnant more than anyone else: they simply don't use enough birth control.
Bringing the numbers down is by no means an insurmountable problem. Many Western European countries managed to reduce birth rates among teen-agers by more than seventy per cent between 1970 and 1995, and reproductive-health specialists say that there's no reason we couldn't follow suit. Since the early nineteen-seventies, for instance, the federal Title X program has funded thousands of family-planning clinics around the country, and in the past twenty years the program has been responsible for preventing an estimated nine million abortions. It could easily be expanded. There is also solid evidence that a comprehensive, national sex-education curriculum could help to reduce unintended pregnancies still further. If these steps succeeded in bringing our teen-age-pregnancy rates into line with those in Canada and England, the number of abortions in this country could drop by about five hundred thousand a year. For those who believe that a fetus is a human being, this is like saying that if we could find a few hundred million dollars, and face the fact that, yes, teen-agers have sex, we could save the equivalent of the population of Arizona within a decade.
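The arithmetic behind the Arizona comparison is easy to check. The five-hundred-thousand-a-year figure comes from the estimate above; Arizona's population around the 2000 census, roughly 5.1 million, is the one outside number assumed here.

```python
# Checking the "population of Arizona within a decade" comparison.
# The 500,000-a-year figure is from the passage; Arizona's population
# circa 2000 (~5.1 million) is an assumed outside number.
avoided_per_year = 500_000
years = 10
arizona_population = 5_100_000  # approximate 2000 census figure, assumed

total = avoided_per_year * years
print(total)                                  # 5,000,000
print(round(total / arizona_population, 2))   # 0.98 -- about the whole state
```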
But this is not, unfortunately, the way things are viewed in Washington. Since the eighties, Title X has been under constant attack. Taking inflation into account, its level of funding is now about sixty per cent lower than it was twenty years ago, and the Bush Administration's budget appropriation does little to correct that shortfall. As for sex education, the President's stated preference is that a curriculum instructing teen-agers to abstain from sex be given parity with forms of sex education that mention the option of contraception. The chief distinguishing feature of abstinence-only programs is that there's no solid evidence that they do any good. The right's squeamishness about sex has turned America into the abortion capital of the West.
But, then, this is the same movement that considered Ronald Reagan to be an ally and Bill Clinton a foe. And what does the record actually show? In the eight years of President Reagan's Administration, there was an average of 1.6 million abortions a year; by the end of President Clinton's first term, when the White House was much more favorably disposed toward the kinds of policies that are now anathema in Washington, that annual figure had dropped by more than two hundred thousand. A tinkerer would look at those numbers and wonder whether we need a new definition of "pro-life."
Safety in the Skies
October 1, 2001
ANNALS OF AVIATION
How far can airline safety go?
1.
On November 24, 1971, a man in a dark suit, white shirt, and sunglasses bought a ticket in the name of Dan Cooper on the 2:50 P.M. Northwest Orient flight from Portland to Seattle. Once aboard the plane, he passed a note to a flight attendant. He was carrying a bomb, he said, and he wanted two hundred thousand dollars, four parachutes, and "no funny stuff." In Seattle, the passengers and flight attendants were allowed to leave, and the F.B.I. handed over the parachutes and the money in used twenty-dollar bills. Cooper then told the pilot to fly slowly at ten thousand feet in the direction of Nevada, and not long after takeoff, somewhere over southwest Washington, he gathered up the ransom, lowered the plane's back stairs, and parachuted into the night.
In the aftermath of Cooper's leap, "para-jacking," as it was known, became an epidemic in American skies. Of the thirty-one hijackings in the United States the following year, nineteen were attempts at Cooper-style extortion, and in fifteen of those cases the hijackers demanded parachutes so that they, too, could leap to freedom. It was a crime wave unlike any America had seen, and in response Boeing installed a special latch on its 727 model which prevented the tail stairs from being lowered in flight. The latch was known as the Cooper Vane, and it seemed, at the time, to be an effective response to the reign of terror in the skies. Of course, it was not. The Cooper Vane just forced hijackers to come up with ideas other than parachuting out of planes.
This is the great paradox of law enforcement. The better we are at preventing and solving the crimes before us, the more audacious criminals become. Put alarms and improved locks on cars, and criminals turn to the more dangerous sport of carjacking. Put guards and bulletproof screens in banks, and bank robbery gets taken over by high-tech hackers. In the face of resistance, crime falls in frequency but rises in severity, and few events better illustrate this tradeoff than the hijackings of September 11th. The way in which those four planes were commandeered that Tuesday did not simply reflect a failure of our security measures; it reflected their success. When you get very good at cracking down on ordinary hijacking -- when you lock the stairs at the back of the aircraft with a Cooper Vane -- what you are left with is extraordinary hijacking.
2.
The first serious push for airport security began in late 1972, in the wake of a bizarre hijacking of a DC-9 flight out of Birmingham, Alabama. A group of three men -- one an escaped convict and two awaiting trial for rape -- demanded a ransom of ten million dollars and had the pilot circle the Oak Ridge, Tennessee, nuclear facility for five hours, threatening to crash the plane if their demands were not met. Until that point, security at airports had been minimal, but, as the director of the Federal Aviation Administration said at the time, "The Oak Ridge odyssey has cleared the air." In December of that year, the airlines were given sixty days to post armed security officers at passenger-boarding checkpoints. On January 5, 1973, all passengers and all carry-on luggage were required by law to be screened, and X-ray machines and metal detectors began to be installed in airports.
For a time, the number of hijackings dropped significantly. But it soon became clear that the battle to make flying safer was only beginning. In the 1985 hijacking of TWA Flight 847 out of Athens -- which lasted seventeen days -- terrorists bypassed the X-ray machines and the metal detectors by using members of the cleaning staff to stash guns and grenades in a washroom of the plane. In response, the airlines started to require background checks and accreditation of ground crews. In 1986, El Al security officers at London's Heathrow Airport found ten pounds of high explosives in the luggage of an unwitting and pregnant Irish girl, which had been placed there by her Palestinian boyfriend. Now all passengers are asked if they packed their bags themselves. In a string of bombings in the mid-eighties, terrorists began checking explosives-filled bags onto planes without boarding the planes themselves. Airlines responded by introducing "bag matching" on international flights -- stipulating that no luggage can be loaded on a plane unless its owner is on board as well. As an additional safety measure, the airlines started X-raying and searching checked bags for explosives. But in the 1988 bombing of Pan Am Flight 103 over Lockerbie, Scotland, terrorists beat that system by hiding plastic explosives inside a radio. As a result, the airlines have now largely switched to using CT scanners, a variant of the kind used in medical care, which take a three-dimensional picture of the interior of every piece of luggage and screen it with pattern-recognition software. The days when someone could stroll onto a plane with a bag full of explosives are long gone.
3.
These are the security obstacles that confront terrorists planning an attack on an airline. They can't bomb an international flight with a checked bag, because they know that there is a good chance the bag will be intercepted. They can't check the bag and run, because the bomb will never get on board. And they can't hijack the plane with a gun, because there is no sure way of getting that weapon on board. The contemporary hijacker, in other words, must either be capable of devising a weapon that can get past security or be willing to go down with the plane. Most terrorists have neither the cleverness to meet the first criterion nor the audacity to meet the second, which is why the total number of hijackings has been falling for the past thirty years. During the nineties, in fact, the number of civil aviation "incidents" worldwide -- hijackings, bombings, shootings, attacks, and so forth -- dropped by more than seventy per cent. But this is where the law-enforcement paradox comes in: Even as the number of terrorist acts has diminished, the number of people killed in hijackings and bombings has steadily increased. And, despite all the improvements in airport security, the percentage of terrorist hijackings foiled by airport security in the years between 1987 and 1996 was at its lowest point in thirty years. Airport-security measures have simply chased out the amateurs and left the clever and the audacious. "A look at the history of attacks on commercial aviation reveals that new terrorist methods of attack have virtually never been foreseen by security authorities," the Israeli terrorism expert Ariel Merari writes, in the recent book "Aviation Terrorism and Security."
The security system was caught by surprise when an airliner was first hijacked for political extortion; it was unprepared when an airliner was attacked on the tarmac by a terrorist team firing automatic weapons; when terrorists, who arrived as passengers, collected their luggage from the conveyer belt, took out weapons from their suitcases, and strafed the crowd in the arrivals hall; when a parcel bomb sent by mail exploded in an airliner's cargo hold in mid-flight; when a bomb was brought on board by an unwitting passenger. . . . The history of attacks on aviation is the chronicle of a cat-and-mouse game, where the cat is busy blocking old holes and the mouse always succeeds in finding new ones.
And no hole was bigger than the one found on September 11th.
4.
What the attackers understood was the structural weakness of the passenger-gate security checkpoint, particularly when it came to the detection of knives. Hand-luggage checkpoints use X-ray machines, which do a good job of picking out a large, dense, and predictable object like a gun. Now imagine looking at a photograph of a knife. From the side, the shape is unmistakable. But if the blade edge is directly facing the camera what you'll see is just a thin line. "If you stand the knife on its edge, it could be anything," says Harry Martz, who directs the Center for Nondestructive Characterization at Lawrence Livermore Laboratories. "It could be a steel ruler. Then you put in computers, hair dryers, pens, clothes hangers, and it makes it even more difficult to pick up the pattern."
The challenge of detecting something like a knife blade is made harder still by the psychological demands on X-ray operators. What they are looking for -- weapons -- is called the "signal," and a well-documented principle of human-factors research is that as the "signal rate" declines, detection accuracy declines as well. If there was a gun in every second bag, for instance, you could expect the signals to be detected with almost perfect accuracy: the X-ray operator would be on his toes. But guns are almost never found in bags, which means that the vigilance of the operator inevitably falters. This is a significant problem in many fields, from nuclear-plant inspection to quality-control in manufacturing plants -- where the job of catching defects on, say, a car becomes harder and harder as cars become better made. "I've studied this in people who look for cracks in the rotor disks of airplane engines," says Colin Drury, a human-factors specialist at the University of Buffalo. "Remember the DC-10 crash at Sioux City? That was a rotor disk. Well, the probability of that kind of crack happening is incredibly small. Most inspectors won't see one in their lifetime, so it's very difficult to remain alert to that." The F.A.A. periodically plants weapons in baggage to see whether they are detected. But it's not clear what effect that kind of test has on vigilance. In the wake of the September attacks, some commentators called for increased training for X-ray security operators. Yet the problem is not just a lack of expertise; it is the paucity of signals. "Better training is only going to get you so far," explains Douglas Harris, chairman of Anacapa Sciences, a California-based human-factors firm. "If it now takes a day to teach people the techniques they need, adding another day isn't going to make much difference."
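A toy calculation shows why the paucity of signals wears down even a diligent operator. Every number below is hypothetical, chosen only to convey the scale of the problem Drury and Harris describe.

```python
# An illustration of the "paucity of signals" problem: at plausible rates, an
# X-ray operator can go years between genuine weapons. Every number here is
# hypothetical, chosen only to show the scale of the problem.
bags_per_day = 2_000
signal_rate = 1 / 1_000_000   # one real weapon per million bags, assumed

days_between_signals = 1 / (bags_per_day * signal_rate)
print(round(days_between_signals))         # 500 working days between real signals
print(round(days_between_signals / 250))   # roughly two working years
```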
A sophisticated terrorist wanting to smuggle knives on board, in other words, has a good shot at "gaming" the X-ray machine by packing his bags cleverly and exploiting the limitations of the operator. If he chooses, he can also beat the metal detector by concealing on his person knives made of ceramic or plastic, which wouldn't trip the alarm. The knife strategy has its drawbacks, of course. It's an open question how long a group of terrorists armed only with knives can hold off a cabin full of passengers. But if all they need is to make a short flight from Boston to downtown Manhattan, knives would suffice.
5.
Can we close the loopholes that led to the September 11th attack? Logistically, an all-encompassing security system is probably impossible. A new safety protocol that adds thirty seconds to the check-in time of every passenger would add more than three hours to the preparation time for a 747, assuming that there are no additional checkpoints. Reforms that further encumber the country's already overstressed air-traffic system are hardly reforms; they are self-inflicted wounds. People have suggested that we station armed federal marshals on more flights. This could be an obstacle for some terrorists but an opportunity for others, who could overcome a marshal to gain possession of a firearm.
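The rough arithmetic behind the "more than three hours" claim, for the curious: the thirty seconds per passenger comes from the scenario above, while the seat count (about four hundred on a 747) and the single-queue assumption are assumptions added here.

```python
# Rough arithmetic behind the "more than three hours" claim. The thirty
# seconds per passenger is from the passage; the seat count and the
# single-checkpoint queue are assumptions.
extra_seconds_per_passenger = 30
passengers = 400   # typical 747 load, assumed
checkpoints = 1    # "no additional checkpoints"

extra_hours = extra_seconds_per_passenger * passengers / checkpoints / 3600
print(round(extra_hours, 1))  # about 3.3 hours added to the preparation time
```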
What we ought to do is beef up security for a small percentage of passengers deemed to be high-risk. The airlines already have in place a screening technology of this sort, known as CAPPS -- Computer-Assisted Passenger Prescreening System. When a ticket is purchased on a domestic flight in the United States, the passenger is rated according to approximately forty pieces of data. Though the parameters are classified, they appear to include the traveller's address, credit history, and destination; whether he or she is travelling alone; whether the ticket was paid for in cash; how long before the departure it was bought; and whether it is one way. (A recent review by the Department of Justice affirmed that the criteria are not discriminatory on the basis of ethnicity.) A sixty-eight-year-old male who lives on Park Avenue, has a fifty-thousand-dollar limit on his credit card, and has flown on the Washington-New York shuttle twice a week for the past eight years, for instance, is never going to get flagged by the CAPPS system. Probably no more than a handful of people per domestic flight ever are, but those few have their checked luggage treated with the kind of scrutiny that, until this month, was reserved for international flights. Their bags are screened for explosives and held until the passengers are actually on board. It would be an easy step to use the CAPPS ratings at the gate as well. Those dubbed high-risk could have their hand luggage scrutinized by the slower but much more comprehensive CT scanner, which would make hiding knives or other weapons in hand luggage all but impossible.
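For readers who want a feel for how rule-based prescreening works, here is a toy sketch in the spirit of CAPPS. The real criteria are classified, so every field, weight, and threshold below is hypothetical, invented only to show the general shape of the idea.

```python
# A toy illustration of rule-based passenger prescreening in the spirit of
# CAPPS. The real criteria are classified; every field, weight, and threshold
# here is hypothetical, invented only to show the general shape of the idea.

def risk_score(passenger: dict) -> int:
    score = 0
    if passenger.get("paid_cash"):
        score += 2
    if passenger.get("one_way"):
        score += 2
    if passenger.get("bought_within_days", 30) <= 1:   # last-minute purchase
        score += 1
    if passenger.get("frequent_flyer_years", 0) >= 5:  # long, stable travel history
        score -= 2
    return score

def flagged(passenger: dict, threshold: int = 3) -> bool:
    """High-risk passengers get the slower, more thorough screening."""
    return risk_score(passenger) >= threshold

shuttle_regular = {"paid_cash": False, "one_way": False,
                   "bought_within_days": 14, "frequent_flyer_years": 8}
unknown_cash_buyer = {"paid_cash": True, "one_way": True,
                      "bought_within_days": 0, "frequent_flyer_years": 0}

print(flagged(shuttle_regular))     # False
print(flagged(unknown_cash_buyer))  # True
```

The point of the design is the one made above: the shuttle regular never gets flagged, the anonymous cash buyer on a one-way ticket does, and only the flagged few are routed to the slower, more thorough screening.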
At the same time, high-risk passengers could be asked to undergo an electronic strip search known as a body scan. In a conventional X-ray, the rays pass through the body, leaving an imprint on a detector on the other side. In a body scanner, the X-rays are much weaker, penetrating clothes but not the body, so they bounce back and leave an imprint of whatever lies on the surface of the skin. A body scanner would have picked up a ceramic knife in an instant. Focussing on a smaller group of high-risk people would have the additional benefit of improving the detection accuracy of the security staff: it would raise the signal rate.
We may never know, of course, whether an expanded CAPPS system would have flagged the September 11th terrorists, but certainly those who planned the attack would have had to take that possibility seriously. The chief distinction between American and Israeli airport defense, at the moment, is that the American system focusses on technological examination of the baggage while the Israeli system focusses on personal interrogation and assessment of the passenger -- which has resulted in El Al's having an almost unblemished record against bombings and hijackings over the past twenty years. Wider use of CAPPS profiling would correct that shortcoming, and narrow still further the options available for any would-be terrorist. But we shouldn't delude ourselves that these steps will end hijackings, any more than the Cooper Vane did thirty years ago. Better law enforcement doesn't eliminate crime. It forces the criminals who remain to come up with something else. And, as we have just been reminded, that something else, all too frequently, is something worse.
The Scourge You Know
October 29, 2001
CONTAGIONS
If you are wondering what to worry about when it comes to biological weapons, you should concern yourself, first of all, with things that are easy to deliver. Biological agents are really dangerous only when they can reach lots of people, and very few bioweapons can easily do that. In 1990, members of Japan's Aum Shinrikyo cult drove around the Parliament buildings in Tokyo in an automobile rigged to disseminate botulinum toxin. It didn't work. The same group also tried, repeatedly, to release anthrax from a rooftop, and that didn't work, either. It's simply too complicated to make anthrax in the fine, "mist" form that is the most lethal. And the spores are destroyed so quickly by sunlight that any kind of mass administration of anthrax is extremely difficult.
A much scarier biological weapon would be something contagious: something a few infected people could spread, unwittingly, in ever widening and more terrifying circles. Even with a contagious agent, though, you don't really have to worry about pathogens that are what scientists call stable--that are easy to identify and that don't change from place to place or year to year--because those kinds of biological agents are easy to defend against. That's why you shouldn't worry quite so much about smallpox. Deadly as it is, smallpox is so well understood that the vaccine is readily made and extraordinarily effective, and works for decades. If we wanted to, we could all be inoculated against smallpox in a matter of years.
What you really should worry about, then, is something that is highly contagious and highly unstable, a biological agent that kills lots of people and isn't easy to treat, that mutates so rapidly that each new bout of terror requires a brand-new vaccine. What you should worry about, in other words, is the influenza virus.
If there is an irony to America's current frenzy over anthrax and biological warfare--the paralyzed mailrooms, the endless talk-show discussions, the hoarding of antibiotics, and the closed halls of Congress--it is that it has occurred right at the beginning of the flu season, the time each year when the democracies of the West are routinely visited by one of the most deadly of all biological agents. This year, around twenty thousand Americans will die of the flu, and if this is one of those years, like 1957 or 1968, when we experience an influenza pandemic, that number may hit fifty thousand. The victims will primarily be the very old and the very young, although there will be a significant number of otherwise healthy young adults among them, including many pregnant women. All will die horrible deaths, racked by raging fevers, infections, headaches, chills, and sweats. And the afflicted, as they suffer, will pass their illness on to others, creating a wave of sickness that will cost the country billions of dollars. Influenza "quietly kills tens of thousands of people every year," Edwin Kilbourne, a research professor at New York Medical College and one of the country's leading flu experts, says. "And those who don't die are incapacitated for weeks. It mounts a silent and pervasive assault."
That we have chosen to worry more about anthrax than about the flu is hardly surprising. The novel is always scarier than the familiar, and the flu virus, as far as we know, isn't being sent through the mail by terrorists. But it is a strange kind of public-health policy that concerns itself more with the provenance of illness than with its consequences; and the consequences of the flu, year in, year out, dwarf everything but the most alarmist bioterror scenarios. If even a fraction of the energy and effort now being marshalled against anthrax were directed instead at the flu, we could save thousands of lives. Kilbourne estimates that at least half the deaths each year from the flu are probably preventable: vaccination rates among those most at risk under the age of fifty are a shameful twenty-three per cent, and for asthmatic children, who are also at high risk, the vaccination rate is ten per cent. And vaccination has been shown to save money: the costs of hospitalization for those who get sick far exceed the costs of inoculating everyone else. Why, under the circumstances, this country hasn't mounted an aggressive flu-vaccination program is a question that Congress might want to consider, when it returns to its newly fumigated, anthrax-free chambers. Not all threats to health and happiness come from terrorists in faraway countries. Many are the result of what, through simple indifference, we do to ourselves.
Smaller
November 26, 2001
ANNALS OF TECHNOLOGY
The disposable diaper and the meaning of progress.
1.
The best way to explore the mystery of the Huggies Ultratrim disposable diaper is to unfold it and then cut it in half, widthwise, across what is known as the diaper's chassis. At Kimberly-Clark's Lakeview plant, in Neenah, Wisconsin, where virtually all the Huggies in the Midwest are made, there is a quality-control specialist who does this all day long, culling diapers from the production line, pinning them up against a lightboard, and carefully dismembering them with a pair of scissors. There is someone else who does a "visual cull," randomly picking out Huggies and turning them over to check for flaws. But a surface examination tells you little. A diaper is not like a computer that makes satisfying burbling noises from time to time, hinting at great inner complexity. It feels like papery underwear wrapped around a thin roll of Cottonelle. But peel away the soft fabric on the top side of the diaper, the liner, which receives what those in the trade delicately refer to as the "insult." You'll find a layer of what's called polyfilm, which is thinner than a strip of Scotch tape. This layer is one of the reasons the garment stays dry: it has pores that are large enough to let air flow in, so the diaper can breathe, but small enough to keep water from flowing out, so the diaper doesn't leak.
Or run your hands along that liner. It feels like cloth. In fact, the people at Kimberly-Clark make the liner out of a special form of plastic, a polyresin. But they don't melt the plastic into a sheet, as one would for a plastic bag. They spin the resin into individual fibres, and then use the fibres to create a kind of microscopic funnel, channelling the insult toward the long, thick rectangular pad that runs down the center of the chassis, known as the absorbent core. A typical insult arrives at a rate of seven millilitres a second, and might total seventy millilitres of fluid. The liner can clear that insult in less than twenty seconds. The core can hold three or more of those insults, with a chance of leakage in the single digits. The baby's skin will remain almost perfectly dry, and that is critical, because prolonged contact between the baby and the insult (in particular, ammonium hydroxide, a breakdown product of urine) is what causes diaper rash. And all this will be accomplished by a throwaway garment measuring, in the newborn size, just seven by thirteen inches. This is the mystery of the modern disposable diaper: how does something so small do so much?
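Those figures imply a little arithmetic worth spelling out; all of the numbers below come from the passage itself.

```python
# The arithmetic implied by the figures above. All of these numbers
# (7 mL/s, 70 mL, 20 s, three insults) come from the text.
arrival_rate_ml_per_s = 7
insult_ml = 70
clearance_s = 20
insults_held = 3

print(insult_ml / arrival_rate_ml_per_s)   # 10.0 seconds to deliver a typical insult
print(insult_ml / clearance_s)             # 3.5 mL/s the liner must pass through, on average
print(insults_held * insult_ml)            # 210 mL the core must hold, at a minimum
```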
2.
Thirty-seven years ago, the Silicon Valley pioneer Gordon Moore made a famous prediction. The number of transistors that engineers could fit onto a microchip, he said, would double every two years. It seemed like a foolhardy claim: it was not clear that you could keep making transistors smaller and smaller indefinitely. It also wasn't clear that it would make sense to do so. Most of the time when we make things smaller, after all, we pay a price. A smaller car is cheaper and more fuel-efficient, and easier to park and maneuver, but it will never be as safe as a larger car. In the nineteen-fifties and sixties, the transistor radio was all the rage; it could fit inside your pocket and run on a handful of batteries. But, because it was so small, the sound was terrible, and virtually all the other mini-electronics turn out to be similarly imperfect. Tiny cell phones are hard to dial. Tiny televisions are hard to watch. In making an object smaller, we typically compromise its performance. The remarkable thing about chips, though, was that there was no drawback: if you could fit more and more transistors onto a microchip, then instead of using ten or twenty or a hundred microchips for a task you could use just one. This meant, in turn, that you could fit microchips in all kinds of places (such as cellular phones and laptops) that you couldn't before, and, because you were using one chip and not a hundred, computer power could be had at a fraction of the price, and because chips were now everywhere and in such demand they became even cheaper to make--and so on and so on. Moore's Law, as it came to be called, describes that rare case in which there is no trade-off between size and performance. Microchips are what might be termed a perfect innovation.
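The arithmetic of that doubling is worth pausing over. The sketch below assumes a doubling every two years, as stated above, and borrows the sixty-four-transistor and four-million-transistor chips mentioned later in the piece as start and end points.

```python
# The arithmetic of a doubling every two years, as described above. The start
# and target counts (64 and 4 million transistors) echo figures mentioned
# later in the piece; no specific dates are asserted here.
from math import ceil, log2

doubling_period_years = 2
start, target = 64, 4_000_000

doublings = ceil(log2(target / start))    # how many doublings are needed
print(doublings)                          # 16
print(doublings * doubling_period_years)  # about 32 years at one doubling every two years
print(2 ** (10 // doubling_period_years)) # a single decade multiplies the count ~32x
```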
In the past twenty years, diapers have got smaller and smaller, too. In the early eighties, they were three times bulkier than they are now, thicker and substantially wider in the crotch. But in the mid-eighties Huggies and Procter & Gamble's Pampers were reduced in bulk by fifty per cent; in the mid-nineties they shrank by a third or so; and in the next few years they may shrink still more. It seems reasonable that there should have been a downside to this, just as there was to the shrinking of cars and radios: how could you reduce the amount of padding in a diaper and not, in some way, compromise its ability to handle an insult? Yet, as diapers got smaller, they got better, and that fact elevates the diaper above nearly all the thousands of other products on the supermarket shelf.
Kimberly-Clark's Lakeview plant is a huge facility, just down the freeway from Green Bay. Inside, it is as immaculate as a hospital operating room. The walls and floors have been scrubbed white. The stainless-steel machinery gleams. The employees are dressed in dark-blue pants, starched light-blue button-down shirts, and tissue-paper caps. There are rows of machines in the plant, each costing more than fifteen million dollars--a dizzying combination of conveyor belts and whirling gears and chutes stretching as long as a city block and creating such a din that everyone on the factory floor wears headsets and communicates by radio. Computers monitor a million data points along the way, insuring that each of those components is precisely cut and attached according to principles and processes and materials protected, on the Huggies Ultratrim alone, by hundreds of patents. At the end of the line, the Huggies come gliding out of the machine, stacked upright, one after another in an endless row, looking like exquisitely formed slices of white bread in a toast rack. For years, because of Moore's Law, we have considered the microchip the embodiment of the technological age. But if the diaper is also a perfect innovation, doesn't it deserve a place beside the chip?
3.
The modern disposable diaper was invented twice, first by Victor Mills and then by Carlyle Harmon and Billy Gene Harper. Mills worked for Procter & Gamble, and he was a legend. Ivory soap used to be made in an expensive and time-consuming batch-by-batch method. Mills figured out a simpler, continuous process. Duncan Hines cake mixes used to have a problem blending flour, sugar, and shortening in a consistent mixture. Mills introduced the machines used for milling soap, which ground the ingredients much more finely than before, and the result was New, Improved Duncan Hines cake mix. Ever wonder why Pringles, unlike other potato chips, are all exactly the same shape? Because they are made like soap: the potato is ground into a slurry, then pressed, baked, and wrapped--and that was Victor Mills's idea, too.
In 1957, Procter & Gamble bought the Charmin Paper Company, of Green Bay, Wisconsin, and Mills was told to think of new products for the paper business. Since he was a grandfather--and had always hated washing diapers--he thought of a disposable diaper. "One of the early researchers told me that among the first things they did was go out to a toy store and buy one of those Betsy Wetsy-type dolls, where you put water in the mouth and it comes out the other end," Ed Rider, the head of the archives department at Procter & Gamble, says. "They brought it back to the lab, hooked up its legs on a treadmill to make it walk, and tested diapers on it." The end result was Pampers, which were launched in Peoria, in 1961. The diaper had a simple rectangular shape. Its liner, which lay against the baby's skin, was made of rayon. The outside material was plastic. In between were multiple layers of crêped tissue. The diaper was attached with pins and featured what was known as a Z fold, meaning that the edges of the inner side were pleated, to provide a better fit around the legs.
In 1968, Kimberly-Clark brought out Kimbies, which took the rectangular diaper and shaped it to more closely fit a baby's body. In 1976, Procter & Gamble brought out Luvs, which elasticized the leg openings to prevent leakage. But diapers still adhered to the basic Millsian notion of an absorbent core made out of paper--and that was a problem. When paper gets wet, the fluid soaks right through, which makes diaper rash worse. And if you put any kind of pressure on paper--if you squeeze it, or sit on it--it will surrender some of the water it has absorbed, which creates further difficulties, because a baby, in the usual course of squirming and crawling and walking, might place as much as five kilopascals of pressure on the absorbent core of a diaper. Diaper-makers tried to address this shortcoming by moving from crêped tissue to what they called fluff, which was basically finely shredded cellulose. Then they began to compensate for paper's failing by adding more and more of it, until diapers became huge. But they now had Moore's Law in reverse: in order to get better, they had to get bigger--and bigger still wasn't very good.
Carlyle Harmon worked for Johnson & Johnson and Billy Gene Harper worked for Dow Chemical, and they had a solution. In 1966, each filed separate but virtually identical patent applications, proposing that the best way to solve the diaper puzzle was with a peculiar polymer that came in the form of little pepperlike flakes and had the remarkable ability to absorb up to three hundred times its weight in water.
In the Dow patent, Harper and his team described how they sprinkled two grams of the superabsorbent polymer between two twenty-inch-square sheets of nylon broadcloth, and then quilted the nylon layers together. The makeshift diaper was "thereafter put into use in personal management of a baby of approximately 6 months age." After four hours, the diaper was removed. It now weighed a hundred and twenty grams, meaning the flakes had soaked up sixty times their weight in urine.
Harper and Harmon argued that it was quite unnecessary to solve the paper problem by stuffing the core of the diaper with thicker and thicker rolls of shredded pulp. Just a handful of superabsorbent polymer would do the job. Thus was the modern diaper born. Since the mid-eighties, Kimberly-Clark and Procter & Gamble have made diapers the Harper and Harmon way, pulling out paper and replacing it with superabsorbent polymer. The old, paper-filled diaper could hold, at most, two hundred and seventy-five millilitres of fluid, or a little more than a cup. Today, a diaper full of superabsorbent polymer can handle as much as five hundred millilitres, almost twice that. The chief characteristic of the Mills diaper was its simplicity: the insult fell directly into the core. But the presence of the polymer has made the diaper far more complex. It takes longer for the polymer than for paper to fully absorb an insult, for instance. So another component was added, the acquisition layer, between the liner and the core. The acquisition layer acts like blotting paper, holding the insult while the core slowly does its work, and distributing the fluid over its full length.
Diaper researchers sometimes perform what is called a re-wet test, where they pour a hundred millilitres of fluid onto the surface of a diaper and then apply a piece of filter paper to the diaper liner with five kilopascals of pressure--the average load a baby would apply to a diaper during ordinary use. In a contemporary superabsorbent diaper, like a Huggies or a Pampers, the filter paper will come away untouched after one insult. After two insults, there might be 0.1 millilitres of fluid on the paper. After three insults, the diaper will surrender, at most, only two millilitres of moisture--which is to say that, with the aid of superabsorbents, a pair of Huggies or Pampers can effortlessly hold, even under pressure, a baby's entire night's work.
The heir to the legacy of Billy Gene Harper at Dow Chemical is Fredric Buchholz, who works in Midland, Michigan, a small town two hours northwest of Detroit, where Dow has its headquarters. His laboratory is in the middle of the sprawling chemical works, a mile or two away from corporate headquarters, in a low, unassuming brick building. "We still don't understand perfectly how these polymers work," Buchholz said on a recent fall afternoon. What we do know, he said, is that superabsorbent polymers appear, on a microscopic level, to be like a tightly bundled fisherman's net. In the presence of water, that net doesn't break apart into thousands of pieces and dissolve, like sugar. Rather, it just unravels, the way a net would open up if you shook it out, and as it does the water gets stuck in the webbing. That ability to hold huge amounts of water, he said, could make superabsorbent polymers useful in fire fighting or irrigation, because slightly gelled water is more likely to stay where it's needed. There are superabsorbents mixed in with the sealant on the walls of the Chunnel between England and France, so if water leaks in the polymer will absorb the water and plug the hole.
Right now, one of the major challenges facing diaper technology, Buchholz said, is that urine is salty, and salt impairs the unravelling of the netting: superabsorbents can handle only a tenth as much salt water as fresh water. "One idea is to remove the salt from urine. Maybe you could have a purifying screen," he said. If the molecular structure of the superabsorbent were optimized, he went on, its absorptive capacity could increase roughly fivefold. "Superabsorbents could go from absorbing three hundred times their weight to absorbing fifteen hundred times their weight. We could have just one perfect particle of super-absorbent in a diaper. If you are going to dream, why not make the diaper as thin as a pair of underwear?"
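Buchholz's numbers imply a simple bit of capacity arithmetic. The three-hundred-fold and fifteen-hundred-fold figures and the tenfold salt penalty come from the passage; the rest is just division.

```python
# The capacity arithmetic behind Buchholz's numbers. The 300x and 1,500x
# figures and the tenfold salt penalty are from the passage; nothing else is.
capacity_fresh = 300      # times its own weight, in fresh water
capacity_dreamed = 1500   # the "perfect particle" target
salt_penalty = 10         # urine is salty, so capacity drops roughly tenfold

print(capacity_fresh / salt_penalty)       # 30.0 -- what the polymer manages with salty fluid
print(capacity_dreamed / capacity_fresh)   # 5.0  -- the hoped-for fivefold jump
print(capacity_dreamed / salt_penalty)     # 150.0 -- even salty, a huge absorber
```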
Buchholz was in his laboratory, and he held up a small plastic cup filled with a few tablespoons of superabsorbent flakes, each not much larger than a grain of salt. "It's just a granular material, totally nontoxic," he said. "This is about two grams." He walked over to the sink and filled a large beaker with tap water, and poured the contents of the beaker into the jar of superabsorbent. At first, nothing happened. The amounts were so disproportionate that it looked as if the water would simply engulf the flakes. But, slowly and steadily, the water began to thicken. "Look," Buchholz said. "It's becoming soupy." Sure enough, little beads of gel were forming. Nothing else was happening: there was no gas given off, no burbling or sizzling as the chemical process took place. The superabsorbent polymer was simply swallowing up the water, and within minutes the contents of the cup had thickened into what looked like slightly lumpy, spongy pudding. Buchholz picked up the jar and tilted it, to show that nothing at all was coming out. He pushed and prodded the mass with his finger. The water had disappeared. To soak up that much liquid, the Victor Mills diaper would have needed a thick bundle of paper towelling. Buchholz had used a few tablespoons of superabsorbent flakes. Superabsorbent was not merely better; it was smaller.
4.
Why does it matter that the diaper got so small? It seems a trivial thing, chiefly a matter of convenience to the parent taking a bag of diapers home from the supermarket. But it turns out that size matters a great deal. There's a reason that there are now "new, improved concentrated" versions of laundry detergent, and that some cereals now come in smaller boxes. Smallness is one of those changes that send ripples through the whole economy. The old disposable diapers, for example, created a transportation problem. Tractor-trailers are prohibited by law from weighing more than eighty thousand pounds when loaded. That's why a truck carrying something heavy and compact like bottled water or Campbell's soup is "full," when the truck itself is still half empty. But the diaper of the eighties was what is known as a "high cube" item. It was bulky and not very heavy, meaning that a diaper truck was full before it reached its weight limit. By cutting the size of a diaper in half, companies could fit twice as many diapers on a truck, and cut transportation expenses in half. They could also cut the amount of warehouse space and labor they needed in half. And companies could begin to rethink their manufacturing operations. "Distribution costs used to force you to have plants in lots of places," Dudley Lehman, who heads the Kimberly-Clark diaper business, says. "As that becomes less and less of an issue, you say, 'Do I really need all my plants?' In the United States, it used to take eight. Now it takes five." (Kimberly-Clark didn't close any plants. But other manufacturers did, and here, perhaps, is a partial explanation for the great wave of corporate restructuring that swept across America in the late eighties and early nineties: firms could downsize their workforce because they had downsized their products.) And, because using five plants to make diapers is more efficient than using eight, it became possible to improve diapers without raising diaper prices--which is important, because the sheer number of diapers parents have to buy makes it a price-sensitive product. Until recently, diapers were fastened with little pieces of tape, and if the person changing the diapers got lotion or powder on her fingers the tape wouldn't work. A hook-and-loop, Velcro-like fastener doesn't have this problem. But it was years before the hook-and-loop fastener was incorporated into the diaper chassis: until over-all manufacturing costs were reduced, it was just too expensive.
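A back-of-the-envelope sketch shows what "high cube" means in practice: the old diaper load filled the trailer long before it reached the legal weight limit. The eighty-thousand-pound limit is from the passage; the trailer volume, payload, and per-case figures below are hypothetical, chosen only to illustrate a load that cubes out before it weighs out.

```python
# Why bulk, not weight, limited the old diaper truck. The 80,000-pound gross
# limit is from the passage; the trailer volume, usable payload, and per-case
# size and weight are hypothetical, chosen only to illustrate the point.
trailer_volume_cuft = 3_800   # a typical 53-foot dry van, assumed
usable_payload_lbs = 45_000   # rough payload once truck and trailer are counted, assumed

case_volume_cuft = 4          # one case of 1980s-era bulky diapers, assumed
case_weight_lbs = 20          # light for its size, assumed

cases_by_volume = trailer_volume_cuft // case_volume_cuft   # 950 cases fill the space
cases_by_weight = usable_payload_lbs // case_weight_lbs     # 2,250 cases before the weight limit
print(min(cases_by_volume, cases_by_weight))                # the truck "cubes out" at 950

# Halve the case size and the same trailer carries twice as many cases.
print(trailer_volume_cuft // (case_volume_cuft // 2))       # 1,900
```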
Most important, though, is how size affects the way diapers are sold. The shelves along the aisles of a supermarket are divided into increments of four feet, and the space devoted to a given product category is almost always a multiple of that. Diapers, for example, might be presented as a twenty-foot set. But when diapers were at their bulkiest the space reserved for them was never enough. "You could only get a limited number on the shelf," says Sue Klug, the president of Catalina Marketing Solutions and a former executive for Albertson's and Safeway. "Say you only had six bags. Someone comes in and buys a few, and then someone else comes in and buys a few more. Now you're out of stock until someone reworks the shelf, which in some supermarkets might be a day or two." Out-of-stock rates are already a huge problem in the retail business. At any given time, only about ninety-two per cent of the products that a store is supposed to be carrying are actually on the shelf--which, if you consider that the average supermarket has thirty-five thousand items, works out to twenty-eight hundred products that are simply not there. (For a highly efficient retailer like Wal-Mart, in-stock rates might be as high as ninety-nine per cent; for a struggling firm, they might be in the low eighties.) But, for a fast-moving, bulky item like diapers, the problem of restocking was much worse. Supermarkets could have allocated more shelf space to diapers, of course, but diapers aren't a particularly profitable category for retailers--profit margins are about half what they are for the grocery department. So retailers would much rather give more shelf space to a growing and lucrative category like bottled water. "It's all a trade-off," Klug says. "If you expand diapers four feet, you've got to give up four feet of something else." The only way diaper-makers could insure that their products would actually be on the shelves was to make the products smaller, so they could fit twelve bags into the space of six. And if you can fit twelve bags on a shelf, you can introduce different kinds of diapers. You can add pull-ups and premium diapers and low-cost private-label diapers, all of which give parents more options.
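The out-of-stock arithmetic mentioned above is simple to spell out; the in-stock rates and the thirty-five-thousand-item assortment are all from the passage.

```python
# The out-of-stock arithmetic, spelled out. The in-stock rates and the
# thirty-five-thousand-item assortment are all from the passage.
items_carried = 35_000

print(round(items_carried * (1 - 0.92)))  # 2,800 products missing at any given time
print(round(items_carried * (1 - 0.99)))  # 350 for a highly efficient retailer
print(round(items_carried * (1 - 0.82)))  # 6,300 for a struggling one, "in the low eighties"
```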
"We cut the cost of trucking in half," says Ralph Drayer, who was in charge of logistics for Procter & Gamble for many years and now runs his own supply-chain consultancy in Cincinnati. "We cut the cost of storage in half. We cut handling in half, and we cut the cost of the store shelf in half, which is probably the most expensive space in the whole chain." Everything in the diaper world, from plant closings and trucking routes to product improvements and consumer choice and convenience, turns, in the end, on the fact that Harmon and Harper's absorbent core was smaller than Victor Mills's.
The shame of it, though, is that Harmon and Harper have never been properly celebrated for their accomplishment. Victor Mills is the famous one. When he died, he was given a Times obituary, in which he was called "the father of disposable diapers." When Carlyle Harmon died, seven months earlier, he got four hundred words in Utah's Deseret News, stressing his contributions to the Mormon Church. We tend to credit those who create an idea, not those who perfect it, forgetting that it is often only in the perfection of an idea that true progress occurs. Putting sixty-four transistors on a chip allowed people to dream of the future. Putting four million transistors on a chip actually gave them the future. The diaper is no different. The paper diaper changed parenting. But a diaper that could hold four insults without leakage, keep a baby's skin dry, clear an insult in twenty seconds flat, and would nearly always be in stock, even if you arrived at the supermarket at eight o'clock in the evening--and that would keep getting better at all those things, year in and year out--was another thing altogether. This was more than a good idea. This was something like perfection.
Examined Life
December 17, 2001
A CRITIC AT LARGE
What Stanley H. Kaplan taught us about the SAT
1.
Once, in fourth grade, Stanley Kaplan got a B-plus on his report card and was so stunned that he wandered aimlessly around the neighborhood, ashamed to show his mother. This was in Brooklyn, on Avenue K in Flatbush, between the wars. Kaplan's father, Julius, was from Slutsk, in Belorussia, and ran a plumbing and heating business. His mother, Ericka, ninety pounds and four feet eight, was the granddaughter of the chief rabbi of the synagogue of Prague, and Stanley loved to sit next to her on the front porch, immersed in his schoolbooks while his friends were off playing stickball. Stanley Kaplan had Mrs. Holman for fifth grade, and when she quizzed the class on math equations, he would shout out the answers. If other students were having problems, Stanley would take out pencil and paper and pull them aside. He would offer them a dime, sometimes, if they would just sit and listen. In high school, he would take over algebra class, and the other kids, passing him in the hall, would call him Teach. One classmate, Aimee Rubin, was having so much trouble with math that she was in danger of being dropped from the National Honor Society. Kaplan offered to help her, and she scored a ninety-five on her next exam. He tutored a troubled eleven-year-old named Bob Linker, and Bob Linker ended up a successful businessman. In Kaplan's sophomore year at City College, he got a C in biology and was so certain that there had been a mistake that he marched in to see the professor and proved that his true grade, an A, had accidentally been switched with that of another, not quite so studious, Stanley Kaplan. Thereafter, he became Stanley H. Kaplan, and when people asked him what the "H" stood for he would say "Higher scores!" or, with a sly wink, "Preparation!" He graduated Phi Beta Kappa and hung a shingle outside his parents' house on Avenue K, "Stanley H. Kaplan Educational Center," and started tutoring kids in the basement. In 1946, a high-school junior named Elizabeth, from Coney Island, came to him for help on an exam he was unfamiliar with. It was called the Scholastic Aptitude Test, and from that moment forward the business of getting into college in America was never quite the same.
The S.A.T., at that point, was just beginning to go into widespread use. Unlike existing academic exams, it was intended to measure innate ability--not what a student had learned but what a student was capable of learning--and it stated clearly in the instructions that "cramming or last-minute reviewing" was pointless. Kaplan was puzzled. In Flatbush you always studied for tests. He gave Elizabeth pages of math problems and reading-comprehension drills. He grilled her over and over, doing what the S.A.T. said should not be done. And what happened? On test day, she found the S.A.T. "a piece of cake," and promptly told all her friends, and her friends told their friends, and soon word of Stanley H. Kaplan had spread throughout Brooklyn.
A few years later, Kaplan married Rita Gwirtzman, who had grown up a mile away, and in 1951 they moved to a two-story brick-and-stucco house on Bedford Avenue, a block from his alma mater, James Madison High School. He renovated his basement, dividing it into classrooms. When the basement got too crowded, he rented a podiatrist's office near King's Highway, at the Brighton Beach subway stop. In the nineteen-seventies, he went national, setting up educational programs throughout the country, creating an S.A.T.-preparation industry that soon became crowded with tutoring companies and study manuals. Kaplan has now written a memoir, "Test Pilot" (Simon & Schuster; $19), which has as its subtitle "How I Broke Testing Barriers for Millions of Students and Caused a Sonic Boom in the Business of Education." That actually understates his importance. Stanley Kaplan changed the rules of the game.
2.
The S.A.T. is now seventy-five years old, and it is in trouble. Earlier this year, the University of California--the nation's largest public-university system--stunned the educational world by proposing a move toward a "holistic" admissions system, which would mean abandoning its heavy reliance on standardized-test scores. The school backed up its proposal with a devastating statistical analysis, arguing that the S.A.T. is virtually useless as a tool for making admissions decisions.
The report focussed on what is called predictive validity, a statistical measure of how well a high-school student's performance in any given test or program predicts his or her performance as a college freshman. If you wanted to, for instance, you could calculate the predictive validity of prowess at Scrabble, or the number of books a student reads in his senior year, or, more obviously, high-school grades. What the Educational Testing Service (which creates the S.A.T.) and the College Board (which oversees it) have always argued is that most performance measures are so subjective and unreliable that only by adding aptitude-test scores into the admissions equation can a college be sure it is picking the right students.
This is what the U.C. study disputed. It compared the predictive validity of three numbers: a student's high-school G.P.A., his or her score on the S.A.T. (or, as it is formally known, the S.A.T. I), and his or her score on what is known as the S.A.T. II, which is a so-called achievement test, aimed at gauging mastery of specific areas of the high-school curriculum. Drawing on the transcripts of seventy-eight thousand University of California freshmen from 1996 through 1999, the report found that, over all, the most useful statistic in predicting freshman grades was the S.A.T. II, which explained sixteen per cent of the "variance" (which is another measure of predictive validity). The second most useful was high-school G.P.A., at 15.4 per cent. The S.A.T. was the least useful, at 13.3 per cent. Combining high-school G.P.A. and the S.A.T. II explained 22.2 per cent of the variance in freshman grades. Adding in S.A.T. I scores increased that number by only 0.1 per cent. Nor was the S.A.T. better at what one would have thought was its strong suit: identifying high-potential students from bad schools. In fact, the study found that achievement tests were ten times more useful than the S.A.T. in predicting the success of students from similar backgrounds. "Achievement tests are fairer to students because they measure accomplishment rather than promise," Richard Atkinson, the president of the University of California, told a conference on college admissions last month. "They can be used to improve performance; they are less vulnerable to charges of cultural or socioeconomic bias; and they are more appropriate for schools because they set clear curricular guidelines and clarify what is important for students to learn. Most important, they tell students that a college education is within the reach of anyone with the talent and determination to succeed."
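(A note on the statistics: "per cent of variance explained" is the R-squared of a regression of freshman grades on the predictors, and the U.C. finding is about how little R-squared rises when S.A.T. I scores are added to a model that already contains high-school G.P.A. and the S.A.T. II. Below is a minimal sketch of that calculation on synthetic data; the numbers and the modeling details are invented for illustration and are not the report's.)

```python
import numpy as np

# Synthetic illustration of "per cent of variance explained" (R^2) and of the
# tiny increment from adding a largely redundant predictor. Not the U.C. data.
rng = np.random.default_rng(0)
n = 5_000
hs_gpa = rng.normal(3.0, 0.5, n)
sat2   = 0.7 * hs_gpa + rng.normal(0, 0.5, n)                # achievement test
sat1   = 0.9 * sat2 + rng.normal(0, 0.3, n)                  # aptitude test, correlated with SAT II
frosh  = 0.5 * hs_gpa + 0.5 * sat2 + rng.normal(0, 0.6, n)   # freshman G.P.A.

def r_squared(predictors, y):
    X = np.column_stack([np.ones(len(y)), *predictors])  # intercept + predictors
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

base = r_squared([hs_gpa, sat2], frosh)
full = r_squared([hs_gpa, sat2, sat1], frosh)
print(f"G.P.A. + S.A.T. II explain {base:.1%} of the variance")
print(f"Adding S.A.T. I raises that by {full - base:.2%}")
```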
This argument has been made before, of course. The S.A.T. has been under attack, for one reason or another, since its inception. But what is happening now is different. The University of California is one of the largest single customers of the S.A.T. It was the U.C. system's decision, in 1968, to adopt the S.A.T. that affirmed the test's national prominence in the first place. If U.C. defects from the S.A.T., it is not hard to imagine it being followed by a stampede of other colleges. Seventy-five years ago, the S.A.T. was instituted because we were more interested, as a society, in what a student was capable of learning than in what he had already learned. Now, apparently, we have changed our minds, and few people bear more responsibility for that shift than Stanley H. Kaplan.
3.
From the moment he set up shop on Avenue K, Stanley Kaplan was a pariah in the educational world. Once, in 1956, he went to a meeting for parents and teachers at a local high school to discuss the upcoming S.A.T., and one of the teachers leading the meeting pointed his finger at Kaplan and shouted, "I refuse to continue until THAT MAN leaves the room." When Kaplan claimed that his students routinely improved their scores by a hundred points or more, he was denounced by the testing establishment as a "quack" and "the cram king" and a "snake oil salesman." At the Educational Testing Service, "it was a cherished assumption that the S.A.T. was uncoachable," Nicholas Lemann writes in his history of the S.A.T., "The Big Test":
The whole idea of psychometrics was that mental tests are a measurement of a psychical property of the brain, analogous to taking a blood sample. By definition, the test-taker could not affect the result. More particularly, E.T.S.'s main point of pride about the S.A.T. was its extremely high test-retest reliability, one of the best that any standardized test had ever achieved.... So confident of the S.A.T.'s reliability was E.T.S. that the basic technique it developed for catching cheaters was simply to compare first and second scores, and to mount an investigation in the case of any very large increase. E.T.S. was sure that substantially increasing one's score could be accomplished only by nefarious means.
But Kaplan wasn't cheating. His great contribution was to prove that the S.A.T. was eminently coachable--that whatever it was that the test was measuring was less like a blood sample than like a heart rate, a vital sign that could be altered through the right exercises. In those days, for instance, the test was a secret. Students walking in to take the S.A.T. were often in a state of terrified ignorance about what to expect. (It wasn't until the early eighties that the E.T.S. was forced to release copies of old test questions to the public.) So Kaplan would have "Thank Goodness It's Over" pizza parties after each S.A.T. As his students talked about the questions they had faced, he and his staff would listen and take notes, trying to get a sense of how better to structure their coaching. "Every night I stayed up past midnight writing new questions and study materials," he writes. "I spent hours trying to understand the design of the test, trying to think like the test makers, anticipating the types of questions my students would face." His notes were typed up the next day, cranked out on a Gestetner machine, hung to dry in the office, then snatched off the line and given to waiting students. If students knew what the S.A.T. was like, he reasoned, they would be more confident. They could skip the instructions and save time. They could learn how to pace themselves. They would guess more intelligently. (For a question with five choices, a right answer is worth one point but a wrong answer results in minus one-quarter of a point--which is why students were always warned that guessing was penalized. In reality, of course, if a student can eliminate even one obviously wrong possibility from the list of choices, guessing becomes an intelligent strategy.) The S.A.T. was a test devised by a particular institution, by a particular kind of person, operating from a particular mind-set. It had an ideology, and Kaplan realized that anyone who understood that ideology would have a tremendous advantage.
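(The guessing arithmetic in that parenthesis is worth spelling out: with five choices, blind guessing has an expected value of exactly zero, and eliminating even one choice tips it positive. A minimal check of the expected-value calculation:)

```python
from fractions import Fraction

# Expected score from guessing on a five-choice question: +1 for a right
# answer, -1/4 for a wrong one.
def expected_guess(choices_remaining: int) -> Fraction:
    p_right = Fraction(1, choices_remaining)
    return p_right * 1 + (1 - p_right) * Fraction(-1, 4)

print(expected_guess(5))  # 0     -- blind guessing is a wash
print(expected_guess(4))  # 1/16  -- eliminate one wrong choice and guessing pays
```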
Critics of the S.A.T. have long made a kind of parlor game of seeing how many questions on the reading-comprehension section (where a passage is followed by a series of multiple-choice questions about its meaning) can be answered without reading the passage. David Owen, in the anti-S.A.T. account "None of the Above," gives the following example, adapted from an actual S.A.T. exam:
1.
The main idea of the passage is that:
A) a constricted view of [this novel] is natural and acceptable
B) a novel should not depict a vanished society
C) a good novel is an intellectual rather than an emotional experience
D) many readers have seen only the comedy [in this novel]
E) [this novel] should be read with sensitivity and an open mind
If you've never seen an S.A.T. before, it might be difficult to guess the right answer. But if, through practice and exposure, you have managed to assimilate the ideology of the S.A.T.--the kind of decent, middlebrow earnestness that permeates the test--it's possible to develop a kind of gut feeling for the right answer, the confidence to predict, in the pressure and rush of examination time, what the S.A.T. is looking for. A is suspiciously postmodern. B is far too dogmatic. C is something that you would never say to an eager, college-bound student. Is it D? Perhaps, but D seems too small a point. It's probably E--and, sure enough, it is.
With that in mind, try this question:
2.
The author of [this passage] implies that a work of art is properly judged on the basis of its:
A) universality of human experience truthfully recorded
B) popularity and critical acclaim in its own age
C) openness to varied interpretations, including seemingly contradictory ones
D) avoidance of political and social issues of minor importance
E) continued popularity through different eras and with different societies
Is it any surprise that the answer is A? Bob Schaeffer, the public education director of the anti-test group FairTest, says that when he got a copy of the latest version of the S.A.T. the first thing he did was try the reading comprehension section blind. He got twelve out of thirteen questions right. The math portion of the S.A.T. is perhaps a better example of how coachable the test can be. Here is another question, cited by Owen, from an old S.A.T.:
In how many different color combinations can 3 balls be painted if each ball is painted one color and there are 3 colors available? (Order is not considered; e.g. red, blue, red is considered the same combination as red, red, blue.)
A) 4
B) 6
C) 9
D) 10
E) 27
This was, Owen points out, the twenty-fifth question in a twenty-five-question math section. S.A.T.s--like virtually all standardized tests--rank their math questions from easiest to hardest. If the hardest questions came first, the theory goes, weaker students would be so intimidated as they began the test that they might throw up their hands in despair. So this is a "hard" question. The second thing to understand about the S.A.T. is that it only really works if good students get the hard questions right and poor students get the hard questions wrong. If anyone can guess or blunder his way into the right answer to a hard question, then the test isn't doing its job. So this is the second clue: the answer to this question must not be something that an average student might blunder into answering correctly. With these two facts in mind, Owen says, don't focus on the question. Just look at the numbers: there are three balls and three colors. The average student is most likely to guess by doing one of three things--adding three and three, multiplying three times three, or, if he is feeling more adventurous, multiplying three by three by three. So six, nine, and twenty-seven are out. That leaves four and ten. Now, he says, read the problem. It can't be four, since anyone can think of more than four combinations. The correct answer must be D, 10.
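(Owen's point is about test-taking strategy, but the question also has a clean textbook answer: the number of ways to choose three colors out of three with repetition allowed and order ignored. A minimal check, by brute-force enumeration and by the closed-form count:)

```python
from itertools import combinations_with_replacement
from math import comb

colors = ["red", "blue", "green"]
# Enumerate every unordered way to paint 3 balls with 3 available colors.
print(len(list(combinations_with_replacement(colors, 3))))  # 10
# Closed form: C(n + k - 1, k) with n = 3 colors, k = 3 balls.
print(comb(3 + 3 - 1, 3))                                   # 10
```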
Does being able to answer that question mean that a student has a greater "aptitude" for math? Of course not. It just means that he had a clever teacher. Kaplan once determined that the testmakers were fond of geometric problems involving the Pythagorean theorem. So an entire generation of Kaplan students were taught "boo, boo, boo, square root of two," to help them remember how the Pythagorean formula applies to an isosceles right triangle. "It was usually not lack of ability," Kaplan writes, "but poor study habits, inadequate instruction or a combination of the two that jeopardized students' performance." The S.A.T. was not an aptitude test at all.
4.
In proving that the S.A.T. was coachable, Stanley Kaplan did something else, which was of even greater importance. He undermined the use of aptitude tests as a means of social engineering. In the years immediately before and after the First World War, for instance, the country's élite colleges faced what became known as "the Jewish problem." They were being inundated with the children of Eastern European Jewish immigrants. These students came from the lower middle class and they disrupted the genteel Wasp sensibility that had been so much a part of the Ivy League tradition. They were guilty of "underliving and overworking." In the words of one writer, they "worked far into each night [and] their lessons next morning were letter perfect." They were "socially untrained," one Harvard professor wrote, "and their bodily habits are not good." But how could a college keep Jews out? Columbia University had a policy that the New York State Regents Examinations--the statewide curriculum-based high-school-graduation examination--could be used as the basis for admission, and the plain truth was that Jews did extraordinarily well on the Regents Exams. One solution was simply to put a quota on the number of Jews, which is what Harvard explored. The other idea, which Columbia followed, was to require applicants to take an aptitude test. According to Herbert Hawkes, the dean of Columbia College during this period, because the typical Jewish student was simply a "grind," who excelled on the Regents Exams because he worked so hard, a test of innate intelligence would put him back in his place. "We have not eliminated boys because they were Jews and do not propose to do so," Hawkes wrote in 1918:
We have honestly attempted to eliminate the lowest grade of applicant and it turns out that a good many of the low grade men are New York City Jews. It is a fact that boys of foreign parentage who have no background in many cases attempt to educate themselves beyond their intelligence. Their accomplishment is over 100% of their ability on account of their tremendous energy and ambition. I do not believe however that a College would do well to admit too many men of low mentality who have ambition but not brains.
Today, Hawkes's anti-Semitism seems absurd, but he was by no means the last person to look to aptitude tests as a means of separating ambition from brains. The great selling point of the S.A.T. has always been that it promises to reveal whether the high-school senior with a 3.0 G.P.A. is someone who could have done much better if he had been properly educated or someone who is already at the limit of his abilities. We want to know that information because, like Hawkes, we prefer naturals to grinds: we think that people who achieve based on vast reserves of innate ability are somehow more promising and more worthy than those who simply work hard.
But is this distinction real? Some years ago, a group headed by the British psychologist John Sloboda conducted a study of musical talent. The group looked at two hundred and fifty-six young musicians, between the ages of ten and sixteen, drawn from élite music academies and public-school music programs alike. They interviewed all the students and their parents and recorded how each student did in England's national music-examination system, which, the researchers felt, gave them a relatively objective measure of musical ability. "What we found was that the best predictor of where you were on that scale was the number of hours practiced," Sloboda says. This is, if you think about it, a little hard to believe. We conceive musical ability to be a "talent"--people have an aptitude for music--and so it would make sense that some number of students could excel at the music exam without practicing very much. Yet Sloboda couldn't find any. The kids who scored the best on the test were, on average, practicing eight hundred per cent more than the kids at the bottom. "People have this idea that there are those who learn better than others, can get further on less effort," Sloboda says. "On average, our data refuted that. Whether you're a dropout or at the best school, where you end up can be predicted by how much you practice."
Sloboda found another striking similarity among the "musical" children. They all had parents who were unusually invested in their musical education. It wasn't necessarily the case that the parents were themselves musicians or musically inclined. It was simply that they wanted their children to be that way. "The parents of the high achievers did things that most parents just don't do," he said. "They didn't simply drop their child at the door of the teacher. They went into the practice room. They took notes on what the teacher said, and when they got home they would say, Remember when your teacher said do this and that. There was a huge amount of time and motivational investment by the parents." Does this mean that there is no such thing as musical talent? Of course not. Most of those hardworking children with pushy parents aren't going to turn out to be Itzhak Perlmans; some will be second violinists in their community orchestra. The point is that when it comes to a relatively well-defined and structured task--like playing an instrument or taking an exam--how hard you work and how supportive your parents are have a lot more to do with success than we ordinarily imagine. Ability cannot be separated from effort. The testmakers never understood that, which is why they thought they could weed out the grinds. But educators increasingly do, and that is why college admissions are now in such upheaval. The Texas state-university system, for example, has, since 1997, automatically admitted any student who places in the top ten per cent of his or her high-school class--regardless of S.A.T. score. Critics of the policy said that it would open the door to students from marginal schools whose S.A.T. scores would normally have been too low for admission to the University of Texas--and that is exactly what happened. But so what? The "top ten percenters," as they are known, may have lower S.A.T. scores, but they get excellent grades. In fact, their college G.P.A.s are the equal of students who scored two hundred to three hundred points higher on the S.A.T. In other words, the determination and hard work that propel someone to the top of his high-school class--even in cases where that high school is impoverished--are more important to succeeding in college (and, for that matter, in life) than whatever abstract quality the S.A.T. purports to measure. The importance of the Texas experience cannot be overstated. Here, at last, is an intelligent alternative to affirmative action, a way to find successful minority students without sacrificing academic performance. But we would never have got this far without Stanley Kaplan--without someone first coming along and puncturing the mystique of the S.A.T. "Acquiring test-taking skills is the same as learning to play the piano or ride a bicycle," Kaplan writes. "It requires practice, practice, practice. Repetition breeds familiarity. Familiarity breeds confidence." In this, as in so many things, the grind was the natural.
To read Kaplan's memoir is to be struck by what a representative figure he was in the postwar sociological miracle that was Jewish Brooklyn. This is the lower-middle-class, second- and third-generation immigrant world, stretching from Prospect Park to Sheepshead Bay, that ended up peopling the upper reaches of American professional life. Thousands of students from those neighborhoods made their way through Kaplan's classroom in the fifties and sixties, many along what Kaplan calls the "heavily traveled path" from Brooklyn to Cornell, Yale, and the University of Michigan. Kaplan writes of one student who increased his score by three hundred and forty points, and ended up with a Ph.D. and a position as a scientist at Xerox. "Debbie" improved her S.A.T. by five hundred points, got into the University of Chicago, and earned a Ph.D. in clinical psychology. Arthur Levine, the president of Teachers College at Columbia University, raised his S.A.T.s by two hundred and eighty-two points, "making it possible," he writes on the book's jacket, "for me to attend a better university than I ever would have imagined." Charles Schumer, the senior senator from New York, studied while he worked the mimeograph machine in Kaplan's office, and ended up with close to a perfect sixteen hundred.
These students faced a system designed to thwart the hard worker, and what did they do? They got together with their pushy parents and outworked it. Kaplan says that he knew a "strapping athlete who became physically ill before taking the S.A.T. because his mother was so demanding." There was the mother who called him to say, "Mr. Kaplan, I think I'm going to commit suicide. My son made only a 1000 on the S.A.T." "One mother wanted her straight-A son to have an extra edge, so she brought him to my basement for years for private tutoring in basic subjects," Kaplan recalls. "He was extremely bright and today is one of the country's most successful ophthalmologists." Another student was "so nervous that his mother accompanied him to class armed with a supply of terry-cloth towels. She stood outside the classroom and when he emerged from our class sessions dripping in sweat, she wiped him dry and then nudged him back into the classroom." Then, of course, there was the formidable four-foot-eight figure of Ericka Kaplan, granddaughter of the chief rabbi of the synagogue of Prague. "My mother was a perfectionist whether she was keeping the company books or setting the dinner table," Kaplan writes, still in her thrall today. "She was my best cheerleader, the reason I performed so well, and I constantly strove to please her." What chance did even the most artfully constructed S.A.T. have against the mothers of Brooklyn?
5.
Stanley Kaplan graduated No. 2 in his class at City College, and won the school's Award for Excellence in Natural Sciences. He wanted to be a doctor, and he applied to five medical schools, confident that he would be accepted. To his shock, he was rejected by every single one. Medical schools did not take public colleges like City College seriously. More important, in the forties there was a limit to how many Jews they were willing to accept. "The term meritocracy--or success based on merit rather than heritage, wealth, or social status--wasn't even coined yet," Kaplan writes, "and the methods of selecting students based on talent, not privilege, were still evolving."
That's why Stanley Kaplan was always pained by those who thought that what went on in his basement was somehow subversive. He loved the S.A.T. He thought that the test gave people like him the best chance of overcoming discrimination. As he saw it, he was simply giving the middle-class students of Brooklyn the same shot at a bright future that their counterparts in the private schools of Manhattan had. In 1983, after years of hostility, the College Board invited him to speak at its annual convention. It was one of the highlights of Kaplan's life. "Never, in my wildest dreams," he began, "did I ever think I'd be speaking to you here today."
The truth is, however, that Stanley Kaplan was wrong. What he did in his basement was subversive. The S.A.T. was designed as an abstract intellectual tool. It never occurred to its makers that aptitude was a social matter: that what people were capable of was affected by what they knew, and what they knew was affected by what they were taught, and what they were taught was affected by the industry of their teachers and parents. And if what the S.A.T. was measuring, in no small part, was the industry of teachers and parents, then what did it mean? Stanley Kaplan may have loved the S.A.T. But when he stood up and recited "boo, boo, boo, square root of two," he killed it.
The Social Life of Paper
March 25, 2002
BOOKS
Looking for method in the mess.
1.
On a busy day, a typical air-traffic controller might be in charge of as many as twenty-five airplanes at a time--some ascending, some descending, each at a different altitude and travelling at a different speed. He peers at a large, monochromatic radar console, tracking the movement of tiny tagged blips moving slowly across the screen. He talks to the sector where a plane is headed, and talks to the pilots passing through his sector, and talks to the other controllers about any new traffic on the horizon. And, as a controller juggles all those planes overhead, he scribbles notes on little pieces of paper, moving them around on his desk as he does. Air-traffic control depends on computers and radar. It also depends, heavily, on paper and ink.
When people talk about the need to modernize the American air-traffic-control system, this is, in large part, what they are referring to. Whenever a plane takes off, the basic data about the flight -- the type of plane, the radar I.D. number, the requested altitude, the destination -- are printed out on a stiff piece of paper, perhaps one and a half by six and a half inches, known as a flight strip. And as the plane passes through each sector of the airspace the controller jots down, using a kind of shorthand, everything new that is happening to the plane -- its speed, say, and where it's heading, clearances from ground control, holding instructions, comments on the pilot. It's a method that dates back to the days before radar, and it drives critics of the air-traffic-control system crazy. Why, in this day and age, are planes being handled like breakfast orders in a roadside diner?
This is one of the great puzzles of the modern workplace. Computer technology was supposed to replace paper. But that hasn't happened. Every country in the Western world uses more paper today, on a per-capita basis, than it did ten years ago. The consumption of uncoated free-sheet paper, for instance -- the most common kind of office paper -- rose almost fifteen per cent in the United States between 1995 and 2000. This is generally taken as evidence of how hard it is to eradicate old, wasteful habits and of how stubbornly resistant we are to the efficiencies offered by computerization. A number of cognitive psychologists and ergonomics experts, however, don't agree. Paper has persisted, they argue, for very good reasons: when it comes to performing certain kinds of cognitive tasks, paper has many advantages over computers. The dismay people feel at the sight of a messy desk -- or the spectacle of air-traffic controllers tracking flights through notes scribbled on paper strips -- arises from a fundamental confusion about the role that paper plays in our lives.
2.
The case for paper is made most eloquently in "The Myth of the Paperless Office" (M.I.T.; $24.95), by two social scientists, Abigail Sellen and Richard Harper. They begin their book with an account of a study they conducted at the International Monetary Fund, in Washington, D.C. Economists at the I.M.F. spend most of their time writing reports on complicated economic questions, work that would seem to be perfectly suited to sitting in front of a computer. Nonetheless, the I.M.F. is awash in paper, and Sellen and Harper wanted to find out why. Their answer is that the business of writing reports -- at least at the I.M.F. -- is an intensely collaborative process, involving the professional judgments and contributions of many people. The economists bring drafts of reports to conference rooms, spread out the relevant pages, and negotiate changes with one another. They go back to their offices and jot down comments in the margin, taking advantage of the freedom offered by the informality of the handwritten note. Then they deliver the annotated draft to the author in person, taking him, page by page, through the suggested changes. At the end of the process, the author spreads out all the pages with comments on his desk and starts to enter them on the computer -- moving the pages around as he works, organizing and reorganizing, saving and discarding.
Without paper, this kind of collaborative, iterative work process would be much more difficult. According to Sellen and Harper, paper has a unique set of "affordances" -- that is, qualities that permit specific kinds of uses. Paper is tangible: we can pick up a document, flip through it, read little bits here and there, and quickly get a sense of it. (In another study on reading habits, Sellen and Harper observed that in the workplace, people almost never read a document sequentially, from beginning to end, the way they would read a novel.) Paper is spatially flexible, meaning that we can spread it out and arrange it in the way that suits us best. And it's tailorable: we can easily annotate it, and scribble on it as we read, without altering the original text. Digital documents, of course, have their own affordances. They can be easily searched, shared, stored, accessed remotely, and linked to other relevant material. But they lack the affordances that really matter to a group of people working together on a report. Sellen and Harper write:
Because paper is a physical embodiment of information, actions performed in relation to paper are, to a large extent, made visible to one's colleagues. Reviewers sitting around a desk could tell whether a colleague was turning toward or away from a report; whether she was flicking through it or setting it aside. Contrast this with watching someone across a desk looking at a document on a laptop. What are they looking at? Where in the document are they? Are they really reading their e-mail? Knowing these things is important because they help a group coördinate its discussions and reach a shared understanding of what is being discussed.
3.
Paper enables a certain kind of thinking. Picture, for instance, the top of your desk. Chances are that you have a keyboard and a computer screen off to one side, and a clear space roughly eighteen inches square in front of your chair. What covers the rest of the desktop is probably piles -- piles of papers, journals, magazines, binders, postcards, videotapes, and all the other artifacts of the knowledge economy. The piles look like a mess, but they aren't. When a group at Apple Computer studied piling behavior several years ago, they found that even the most disorderly piles usually make perfect sense to the piler, and that office workers could hold forth in great detail about the precise history and meaning of their piles. The pile closest to the cleared, eighteen-inch-square working area, for example, generally represents the most urgent business, and within that pile the most important document of all is likely to be at the top. Piles are living, breathing archives. Over time, they get broken down and resorted, sometimes chronologically and sometimes thematically and sometimes chronologically and thematically; clues about certain documents may be physically embedded in the file by, say, stacking a certain piece of paper at an angle or inserting dividers into the stack.
But why do we pile documents instead of filing them? Because piles represent the process of active, ongoing thinking. The psychologist Alison Kidd, whose research Sellen and Harper refer to extensively, argues that "knowledge workers" use the physical space of the desktop to hold "ideas which they cannot yet categorize or even decide how they might use." The messy desk is not necessarily a sign of disorganization. It may be a sign of complexity: those who deal with many unresolved ideas simultaneously cannot sort and file the papers on their desks, because they haven't yet sorted and filed the ideas in their head. Kidd writes that many of the people she talked to use the papers on their desks as contextual cues to "recover a complex set of threads without difficulty and delay" when they come in on a Monday morning, or after their work has been interrupted by a phone call. What we see when we look at the piles on our desks is, in a sense, the contents of our brains.
Sellen and Harper arrived at similar findings when they did some consulting work with a chocolate manufacturer. The people in the firm they were most interested in were the buyers -- the staff who handled the company's relationships with its venders, from cocoa and sugar manufacturers to advertisers. The buyers kept folders (containing contracts, correspondence, meeting notes, and so forth) on every supplier they had dealings with. The company wanted to move the information in those documents online, to save space and money, and make it easier for everyone in the firm to have access to it. That sounds like an eminently rational thing to do. But when Sellen and Harper looked at the folders they discovered that they contained all kinds of idiosyncratic material -- advertising paraphernalia, printouts of e-mails, presentation notes, and letters -- much of which had been annotated in the margins with thoughts and amendments and, they write, "perhaps most important, comments about problems and issues with a supplier's performance not intended for the supplier's eyes." The information in each folder was organized -- if it was organized at all -- according to the whims of the particular buyer. Whenever other people wanted to look at a document, they generally had to be walked through it by the buyer who "owned" it, because it simply wouldn't make sense otherwise. The much advertised advantage of digitizing documents -- that they could be made available to anyone, at any time -- was illusory: documents cannot speak for themselves. "All of this emphasized that most of what constituted a buyer's expertise resulted from involvement with the buyer's own suppliers through a long history of phone calls and meetings," Sellen and Harper write:
The correspondence, notes, and other documents such discussions would produce formed a significant part of the documents buyers kept. These materials therefore supported rather than constituted the expertise of the buyers. In other words, the knowledge existed not so much in the documents as in the heads of the people who owned them -- in their memories of what the documents were, in their knowledge of the history of that supplier relationship, and in the recollections that were prompted whenever they went through the files.
4.
This idea that paper facilitates a highly specialized cognitive and social process is a far cry from the way we have historically thought about the stuff. Paper first began to proliferate in the workplace in the late nineteenth century as part of the move toward "systematic management." To cope with the complexity of the industrial economy, managers were instituting company-wide policies and demanding monthly, weekly, or even daily updates from their subordinates. Thus was born the monthly sales report, and the office manual and the internal company newsletter. The typewriter took off in the eighteen-eighties, making it possible to create documents in a fraction of the time it had previously taken, and that was followed closely by the advent of carbon paper, which meant that a typist could create ten copies of that document simultaneously. If you were, say, a railroad company, then you would now have a secretary at the company headquarters type up a schedule every week, setting out what train was travelling in what direction at what time, because in the mid-nineteenth century collisions were a terrible problem. Then the secretary would make ten carbon copies of that schedule and send them out to the stations along your railway line. Paper was important not to facilitate creative collaboration and thought but as an instrument of control.
Perhaps no one embodied this notion more than the turn-of-the-century reformer Melvil Dewey. Dewey has largely been forgotten by history, perhaps because he was such a nasty fellow -- an outspoken racist and anti-Semite -- but in his day he dominated America's thinking about the workplace. He invented the Dewey decimal system, which revolutionized the organization of libraries. He was an ardent advocate of shorthand and of the metric system, and was so obsessed with time-saving and simplification that he changed his first name from Melville to the more logical Melvil. (He also pushed for the adoption of "catalog" in place of "catalogue," and of "thruway" to describe major highways, a usage that survives to this day in New York State). Dewey's principal business was something called the Library Bureau, which was essentially the Office Depot of his day, selling card catalogues, cabinets, office chairs and tables, pre-printed business forms, and, most important, filing cabinets. Previously, businessmen had stored their documents in cumbersome cases, or folded and labelled the pieces of paper and stuck them in the pigeonholes of the secretary desks so common in the Victorian era. What Dewey proposed was essentially an enlarged version of a card catalogue, where paper documents hung vertically in long drawers.
The vertical file was a stunning accomplishment. In those efficiency-obsessed days, it prompted books and articles and debates and ended up winning a gold medal at the 1893 World's Fair, because it so neatly addressed the threat of disorder posed by the proliferation of paper. What good was that railroad schedule, after all, if it was lost on someone's desk? Now a railroad could buy one of Dewey's vertical filing cabinets, and put the schedule under "S," where everyone could find it. In "Scrolling Forward: Making Sense of Documents in the Digital Age" (Arcade; $24.95), the computer scientist David M. Levy argues that Dewey was the anti-Walt Whitman, and that his vision of regularizing and standardizing life ended up being just as big a component of the American psyche as Whitman's appeal to embrace the world just as it is. That seems absolutely right. The fact is, the thought of all those memos and reports and manuals made Dewey anxious, and that anxiety has never really gone away, even in the face of evidence that paper is no longer something to be anxious about.
When Thomas Edison invented the phonograph, for example, how did he imagine it would be used? As a dictation device that a businessman could pass around the office in place of a paper memo. In 1945, the computer pioneer Vannevar Bush imagined what he called a "memex" -- a mechanized library and filing cabinet, on which an office worker would store all his relevant information without the need for paper files at all. So, too, with the information-technology wizards who have descended on the workplace in recent years. Instead of a real desktop, they have offered us the computer desktop, where cookie-cutter icons run in orderly rows across a soothing background, implicitly promising to bring order to the chaos of our offices.
Sellen and Harper include in their book a photograph of an office piled high with stacks of paper. The occupant of the office -- a researcher in Xerox's European research facility -- was considered neither ineffective nor inefficient. Quite the contrary: he was, they tell us, legendary in being able to find any document in his office very quickly. But the managers of the laboratory were uncomfortable with his office because of what it said about their laboratory. They were, after all, an organization looking to develop digital workplace solutions. "They wanted to show that this was a workplace reaching out to the future rather than being trapped in an inefficient past," Sellen and Harper write. "Yet, if this individual's office was anything to go by, the reality was that this workplace of the future was full of paper." Whenever senior colleagues came by the office, then, the man with the messy desk was instructed to put his papers in boxes and hide them under the stairs. The irony is, of course, that it was not the researcher who was trapped in an inefficient past but the managers. They were captives of the nineteenth-century notion that paper was most useful when it was put away. They were channelling Melvil Dewey. But this is a different era. In the tasks that face modern knowledge workers, paper is most useful out in the open, where it can be shuffled and sorted and annotated and spread out. The mark of the contemporary office is not the file. It's the pile.
5.
Air-traffic controllers are quintessential knowledge workers. They perform a rarefied version of the task faced by the economists at the I.M.F. when they sit down at the computer with the comments and drafts of five other people spread around them, or the manager when she gets to her office on Monday morning, looks at the piles of papers on her desk, and tries to make sense of all the things she has to do in the coming week. When an air-traffic controller looks at his radar, he sees a two-dimensional picture of where the planes in his sector are. But what he needs to know is where his planes will be. He has to be able to take the evidence from radar, what he hears from the pilots and other controllers, and what he has written down on the flight strips in front of him, and construct a three-dimensional "picture" of all the planes in his sector. Psychologists call the ability to create that mental picture "situation awareness." "Situation awareness operates on three levels," says Mica Endsley, the president of S.A. Technologies, in Georgia, and perhaps the country's leading expert on the subject. "One is perceiving. Second is understanding what the information means -- analogous to reading comprehension. That's where you or I would have problems. We'd see the blips on the screen, and it wouldn't mean anything to us. The highest level, though, is projection -- the ability to predict which aircraft are coming in and when. You've got to be able to look into the future, probably by as much as five minutes."
Psychologists believe that those so-called flight strips play a major role in helping controllers achieve this situation awareness. Recently, for example, Wendy Mackay, a computer scientist now working in Paris, spent several months at an air-traffic-control facility near Orly Airport, in Paris. The French air-traffic-control system is virtually identical to the American system. One controller, the radar controller, is responsible for the radar. He has a partner, the planning controller, whose job is to alert him to incoming traffic, and what Mackay observed was how beautifully the strips enable efficient interaction between these two people. The planning controller, for instance, overhears what her partner is saying on the radio, and watches him annotate strips. If she has a new strip, she might keep it just out of her partner's visual field until it is relevant. "She [the planner] moves it into his peripheral view if the strip should be dealt with soon, but not immediately," Mackay writes. "If the problem is urgent, she will physically move it into his focal view, placing the strip on top of the stripboard or, rarely, inserting it."
Those strips moving in and out of the peripheral view of the controller serve as cognitive cues, which the controller uses to help keep the "picture" of his sector clear in his head. When taking over a control position, controllers touch and rearrange the strips in front of them. When they are given a new strip, they are forced mentally to register a new flight and the new traffic situation. By writing on the strips, they can off-load information, keeping their minds free to attend to other matters. The controller's flight strips are like the piles of paper on a desk: they are the physical manifestations of what goes on inside his head. Is it any wonder that the modernization of the air-traffic-control system has taken so long? No one wants to do anything that might disrupt that critical mental process.
This is, of course, a difficult conclusion for us to accept. Like the managers of the office-technology lab, we have in our heads the notion that an air-traffic-control center ought to be a pristine and gleaming place, full of the latest electronic gadgetry. We think of all those flight strips as cluttering and confusing the work of the office, and we fret about where all that paper will go. But, as Sellen and Harper point out, we needn't worry. It is only if paper's usefulness is in the information written directly on it that it must be stored. If its usefulness lies in the promotion of ongoing creative thinking, then, once that thinking is finished, the paper becomes superfluous. The solution to our paper problem, they write, is not to use less paper but to keep less paper. Why bother filing at all? Everything we know about the workplace suggests that few if any knowledge workers ever refer to documents again once they have filed them away, which should come as no surprise, since paper is a lousy way to archive information. It's too hard to search and it takes up too much space. Besides, we all have the best filing system ever invented, right there on our desks -- the personal computer. That is the irony of the P.C.: the workplace problem that it solves is the nineteenth-century anxiety. It's a better filing cabinet than the original vertical file, and if Dewey were alive today, he'd no doubt be working very happily in an information-technology department somewhere. The problem that paper solves, by contrast, is the problem that most concerns us today, which is how to support knowledge work. In fretting over paper, we have been tripped up by a historical accident of innovation, confused by the assumption that the most important invention is always the most recent. Had the computer come first -- and paper second -- no one would raise an eyebrow at the flight strips cluttering our air-traffic-control centers.
Blowing Up
April 22 & 29, 2002
DEPARTMENT OF FINANCE
How Nassim Taleb turned the inevitability of disaster into an investment strategy
1.
One day in 1996, a Wall Street trader named Nassim Nicholas Taleb went to see Victor Niederhoffer. Victor Niederhoffer was one of the most successful money managers in the country. He lived and worked out of a thirteen-acre compound in Fairfield County, Connecticut, and when Taleb drove up that day from his home in Larchmont he had to give his name at the gate, and then make his way down a long, curving driveway. Niederhoffer had a squash court and a tennis court and a swimming pool and a colossal, faux-alpine mansion in which virtually every square inch of space was covered with eighteenth- and nineteenth-century American folk art. In those days, he played tennis regularly with the billionaire financier George Soros. He had just written a best-selling book, "The Education of a Speculator," dedicated to his father, Artie Niederhoffer, a police officer from Coney Island. He had a huge and eclectic library and a seemingly insatiable desire for knowledge. When Niederhoffer went to Harvard as an undergraduate, he showed up for the very first squash practice and announced that he would someday be the best in that sport; and, sure enough, he soon beat the legendary Shariff Khan to win the U.S. Open squash championship. That was the kind of man Niederhoffer was. He had heard of Taleb's growing reputation in the esoteric field of options trading, and summoned him to Connecticut. Taleb was in awe.
"He didn't talk much, so I observed him," Taleb recalls. "I spent seven hours watching him trade. Everyone else in his office was in his twenties, and he was in his fifties, and he had the most energy of them all. Then, after the markets closed, he went out to hit a thousand backhands on the tennis court." Taleb is Greek-Orthodox Lebanese and his first language was French, and in his pronunciation the name Niederhoffer comes out as the slightly more exotic Nieder hoffer. "Here was a guy living in a mansion with thousands of books, and that was my dream as a child," Taleb went on. "He was part chevalier, part scholar. My respect for him was intense." There was just one problem, however, and it is the key to understanding the strange path that Nassim Taleb has chosen, and the position he now holds as Wall Street's principal dissident. Despite his envy and admiration, he did not want to be Victor Niederhoffer -- not then, not now, and not even for a moment in between. For when he looked around him, at the books and the tennis court and the folk art on the walls -- when he contemplated the countless millions that Niederhoffer had made over the years -- he could not escape the thought that it might all have been the result of sheer, dumb luck.
Taleb knew how heretical that thought was. Wall Street was dedicated to the principle that when it came to playing the markets there was such a thing as expertise, that skill and insight mattered in investing just as skill and insight mattered in surgery and golf and flying fighter jets. Those who had the foresight to grasp the role that software would play in the modern world bought Microsoft in 1985, and made a fortune. Those who understood the psychology of investment bubbles sold their tech stocks at the end of 1999 and escaped the Nasdaq crash. Warren Buffett was known as the "sage of Omaha" because it seemed incontrovertible that if you started with nothing and ended up with billions then you had to be smarter than everyone else: Buffett was successful for a reason. Yet how could you know, Taleb wondered, whether that reason was responsible for someone's success, or simply a rationalization invented after the fact? George Soros seemed to be successful for a reason, too. He used to say that he followed something called "the theory of reflexivity." But then, later, Soros wrote that in most situations his theory "is so feeble that it can be safely ignored." An old trading partner of Taleb's, a man named Jean-Manuel Rozan, once spent an entire afternoon arguing about the stock market with Soros. Soros was vehemently bearish, and he had an elaborate theory to explain why, which turned out to be entirely wrong. The stock market boomed. Two years later, Rozan ran into Soros at a tennis tournament. "Do you remember our conversation?" Rozan asked. "I recall it very well," Soros replied. "I changed my mind, and made an absolute fortune." He changed his mind! The truest thing about Soros seemed to be what his son Robert had once said:
My father will sit down and give you theories to explain why he does this or that. But I remember seeing it as a kid and thinking, Jesus Christ, at least half of this is bullshit. I mean, you know the reason he changes his position on the market or whatever is because his back starts killing him. It has nothing to do with reason. He literally goes into a spasm, and it's this early warning sign.
For Taleb, then, the question why someone was a success in the financial marketplace was vexing. Taleb could do the arithmetic in his head. Suppose that there were ten thousand investment managers out there, which is not an outlandish number, and that every year half of them, entirely by chance, made money and half of them, entirely by chance, lost money. And suppose that every year the losers were tossed out, and the game replayed with those who remained. At the end of five years, there would be three hundred and thirteen people who had made money in every one of those years, and after ten years there would be nine people who had made money every single year in a row, all out of pure luck. Niederhoffer, like Buffett and Soros, was a brilliant man. He had a Ph.D. in economics from the University of Chicago. He had pioneered the idea that through close mathematical analysis of patterns in the market an investor could identify profitable anomalies. But who was to say that he wasn't one of those lucky nine? And who was to say that in the eleventh year Niederhoffer would be one of the unlucky ones, who suddenly lost it all, who suddenly, as they say on Wall Street, "blew up"?
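(Taleb's back-of-the-envelope numbers follow directly from repeated halving: ten thousand managers halved five times leaves a little over three hundred, and halved ten times leaves a little under ten. A minimal sketch of the coin-flip thought experiment; the simulation is purely illustrative.)

```python
import numpy as np

# Managers who win or lose each year by pure chance; losers are dropped.
managers, years = 10_000, 10
print("expected survivors after 5 years :", managers / 2**5)    # 312.5
print("expected survivors after 10 years:", managers / 2**10)   # ~9.8

rng = np.random.default_rng(1)
wins = rng.integers(0, 2, size=(managers, years))  # 1 = profitable year, by coin flip
print("simulated survivors after 5 years :", int(wins[:, :5].all(axis=1).sum()))
print("simulated survivors after 10 years:", int(wins.all(axis=1).sum()))
```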
Taleb remembered his childhood in Lebanon and watching his country turn, as he puts it, from "paradise to hell" in six months. His family once owned vast tracts of land in northern Lebanon. All of that was gone. He remembered his grandfather, the former Deputy Prime Minister of Lebanon and the son of a Deputy Prime Minister of Lebanon and a man of great personal dignity, living out his days in a dowdy apartment in Athens. That was the problem with a world in which there was so much uncertainty about why things ended up the way they did: you never knew whether one day your luck would turn and it would all be washed away.
So here is what Taleb took from Niederhoffer. He saw that Niederhoffer was a serious athlete, and he decided that he would be, too. He would bicycle to work and exercise in the gym. Niederhoffer was a staunch empiricist, who turned to Taleb that day in Connecticut and said to him sternly, "Everything that can be tested must be tested," and so when Taleb started his own hedge fund, a few years later, he called it Empirica. But that is where it stopped. Nassim Taleb decided that he could not pursue an investment strategy that had any chance of blowing up.
2.
Nassim Taleb is a tall, muscular man in his early forties, with a salt-and-pepper beard and a balding head. His eyebrows are heavy and his nose is long. His skin has the olive hue of the Levant. He is a man of moods, and when his world turns dark the eyebrows come together and the eyes narrow and it is as if he were giving off an electrical charge. It is said, by some of his friends, that he looks like Salman Rushdie, although at his office his staff have pinned to the bulletin board a photograph of a mullah they swear is Taleb's long-lost twin, while Taleb himself maintains, wholly implausibly, that he resembles Sean Connery. He lives in a four-bedroom Tudor with twenty-six Russian Orthodox icons, nineteen Roman heads, and four thousand books, and he rises at dawn to spend an hour writing. He is the author of two books, the first a technical and highly regarded work on derivatives, and the second a treatise entitled "Fooled by Randomness," which was published last year and is to conventional Wall Street wisdom approximately what Martin Luther's ninety-five theses were to the Catholic Church. Some afternoons, he drives into the city and attends a philosophy lecture at City University. During the school year, in the evenings, he teaches a graduate course in finance at New York University, after which he can often be found at the bar at Odeon Café in Tribeca, holding forth, say, on the finer points of stochastic volatility or his veneration of the Greek poet C. P. Cavafy.
Taleb runs Empirica Capital out of an anonymous, concrete office park somewhere in the woods outside Greenwich, Connecticut. His offices consist, principally, of a trading floor about the size of a Manhattan studio apartment. Taleb sits in one corner, in front of a laptop, surrounded by the rest of his team -- Mark Spitznagel, the chief trader, another trader named Danny Tosto, a programmer named Winn Martin, and a graduate student named Pallop Angsupun. Mark Spitznagel is perhaps thirty. Winn, Danny, and Pallop look as if they belonged in high school. The room has an overstuffed bookshelf in one corner, and a television muted and tuned to CNBC. There are two ancient Greek heads, one next to Taleb's computer and the other, somewhat bafflingly, on the floor, next to the door, as if it were being set out for the trash. There is almost nothing on the walls, except for a slightly battered poster for an exhibition of Greek artifacts, the snapshot of the mullah, and a small pen-and-ink drawing of the patron saint of Empirica Capital, the philosopher Karl Popper.
On a recent spring morning, the staff of Empirica were concerned with solving a thorny problem, having to do with the square root of n, where n is a given number of random observations, and what relation n might have to a speculator's confidence in his estimations. Taleb was up at a whiteboard by the door, his marker squeaking furiously as he scribbled possible solutions. Spitznagel and Pallop looked on intently. Spitznagel is blond and from the Midwest and does yoga: in contrast to Taleb, he exudes a certain laconic levelheadedness. In a bar, Taleb would pick a fight. Spitznagel would break it up. Pallop is of Thai extraction and is doing a Ph.D. in financial mathematics at Princeton. He has longish black hair, and a slightly quizzical air. "Pallop is very lazy," Taleb will remark, to no one in particular, several times over the course of the day, although this is said with such affection that it suggests that "laziness," in the Talebian nomenclature, is a synonym for genius. Pallop's computer was untouched and he often turned his chair around, so that he faced completely away from his desk. He was reading a book by the cognitive psychologists Amos Tversky and Daniel Kahneman, whose arguments, he said a bit disappointedly, were "not really quantifiable." The three argued back and forth about the solution. It appeared that Taleb might be wrong, but before the matter could be resolved the markets opened. Taleb returned to his desk and began to bicker with Spitznagel about what exactly would be put on the company boom box. Spitznagel plays the piano and the French horn and has appointed himself the Empirica d.j. He wanted to play Mahler, and Taleb does not like Mahler. "Mahler is not good for volatility," Taleb complained. "Bach is good. St. Matthew's Passion!" Taleb gestured toward Spitznagel, who was wearing a gray woollen turtleneck. "Look at him. He wants to be like von Karajan, like someone who wants to live in a castle. Technically superior to the rest of us. No chitchatting. Top skier. That's Mark!" As Spitznagel rolled his eyes, a man whom Taleb refers to, somewhat mysteriously, as Dr. Wu wandered in. Dr. Wu works for another hedge fund, down the hall, and is said to be brilliant. He is thin and squints through black-rimmed glasses. He was asked his opinion on the square root of n but declined to answer. "Dr. Wu comes here for intellectual kicks and to borrow books and to talk music with Mark," Taleb explained after their visitor had drifted away. He added darkly, "Dr. Wu is a Mahlerian."
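In its textbook form, the square-root-of-n relation that had Taleb at the whiteboard is the familiar rule that the uncertainty of an estimate shrinks with the square root of the number of observations. The sketch below illustrates only that standard relation, assuming independent draws; it does not reconstruct whatever wrinkle the Empirica staff were actually arguing about:

```python
import random
import statistics

random.seed(0)

def spread_of_sample_mean(n, trials=2000):
    """How widely the mean of n random observations varies across many trials."""
    means = [statistics.fmean(random.gauss(0, 1) for _ in range(n)) for _ in range(trials)]
    return statistics.stdev(means)

for n in (25, 100, 400):
    # Standard error falls as 1/sqrt(n): quadrupling the observations halves it.
    print(n, round(spread_of_sample_mean(n), 3), round(1 / n ** 0.5, 3))
```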
Empirica follows a very particular investment strategy. It trades options, which is to say that it deals not in stocks and bonds but with bets on stocks and bonds. Imagine, for example, that General Motors stock is trading at fifty dollars, and imagine that you are a major investor on Wall Street. An options trader comes up to you with a proposition. What if, within the next three months, he decides to sell you a share of G.M. at forty-five dollars? How much would you charge for agreeing to buy it at that price? You would look at the history of G.M. and see that in a three-month period it has rarely dropped ten per cent, and obviously the trader is only going to make you buy his G.M. at forty-five dollars if the stock drops below that point. So you say you'll make that promise, or sell that option, for a relatively small fee, say, a dime. You are betting on the high probability that G.M. stock will stay relatively calm over the next three months, and if you are right you'll pocket the dime as pure profit. The trader, on the other hand, is betting on the unlikely event that G.M. stock will drop a lot, and if that happens his profits are potentially huge. If the trader bought a million options from you at a dime each and G.M. drops to thirty-five dollars, he'll buy a million shares at thirty-five dollars and turn around and force you to buy them at forty-five dollars, making himself suddenly very rich and you substantially poorer.
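The asymmetry in that trade is easy to make concrete. Here is a small sketch of the payoff in the G.M. example above, using only the figures in the paragraph (a forty-five-dollar strike, a ten-cent premium, a million contracts, a fall to thirty-five dollars) and ignoring real-world frictions such as commissions and margin:

```python
def put_seller_pnl(strike, premium, final_price, contracts):
    """Profit or loss for the investor who sold (wrote) the put options."""
    # The buyer exercises only if the stock finishes below the strike price.
    exercise_loss = max(strike - final_price, 0) * contracts
    return premium * contracts - exercise_loss

# G.M. trades at $50; you sell puts struck at $45 for a dime apiece.
quiet = put_seller_pnl(strike=45, premium=0.10, final_price=50, contracts=1_000_000)
crash = put_seller_pnl(strike=45, premium=0.10, final_price=35, contracts=1_000_000)
print(quiet)  # +100,000: G.M. stays calm and you pocket the dimes
print(crash)  # -9,900,000: forced to pay $45 for shares now worth $35
```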
That particular transaction is called, in the argot of Wall Street, an "out-of-the-money option." But an option can be configured in a vast number of ways. You could sell the trader a G.M. option at thirty dollars, or, if you wanted to bet against G.M. stock going up, you could sell a G.M. option at sixty dollars. You could sell or buy options on bonds, on the S. & P. index, on foreign currencies or on mortgages, or on the relationship among any number of financial instruments of your choice; you can bet on the market booming, or the market crashing, or the market staying the same. Options allow investors to gamble heavily and turn one dollar into ten. They also allow investors to hedge their risk. The reason your pension fund may not be wiped out in the next crash is that it has protected itself by buying options. What drives the options game is the notion that the risks represented by all of these bets can be quantified; that by looking at the past behavior of G.M. you can figure out the exact chance of G.M. hitting forty-five dollars in the next three months, and whether at a dollar that option is a good or a bad investment. The process is a lot like the way insurance companies analyze actuarial statistics in order to figure out how much to charge for a life-insurance premium, and to make those calculations every investment bank has, on staff, a team of Ph.D.s, physicists from Russia, applied mathematicians from China, computer scientists from India. On Wall Street, those Ph.D.s are called "quants."
Nassim Taleb and his team at Empirica are quants. But they reject the quant orthodoxy, because they don't believe that things like the stock market behave in the way that physical phenomena like mortality statistics do. Physical events, whether death rates or poker games, are the predictable function of a limited and stable set of factors, and tend to follow what statisticians call a "normal distribution," a bell curve. But do the ups and downs of the market follow a bell curve? The economist Eugene Fama once studied stock prices and pointed out that if they followed a normal distribution you'd expect a really big jump, what he specified as a movement five standard deviations from the mean, once every seven thousand years. In fact, jumps of that magnitude happen in the stock market every three or four years, because investors don't behave with any kind of statistical orderliness. They change their mind. They do stupid things. They copy each other. They panic. Fama concluded that if you charted the ups and downs of the stock market the graph would have a "fat tail," meaning that at the upper and lower ends of the distribution there would be many more outlying events than statisticians used to modeling the physical world would have imagined.
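The "once every seven thousand years" figure follows directly from the bell curve. A quick check, assuming daily price moves really were normally distributed and taking roughly 250 trading days a year (the day count is an assumption supplied here, not a figure from the article):

```python
from math import erf, sqrt

# Probability that a normally distributed daily move lands more than five
# standard deviations from the mean, counting both tails.
p_five_sigma = 1 - erf(5 / sqrt(2))
print(p_five_sigma)  # ~5.7e-07 per trading day

trading_days_per_year = 250  # assumption, not from the article
years_between_events = 1 / (p_five_sigma * trading_days_per_year)
print(round(years_between_events))  # ~7,000 years between five-sigma days
```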
In the summer of 1997, Taleb predicted that hedge funds like Long Term Capital Management were headed for trouble, because they did not understand this notion of fat tails. Just a year later, L.T.C.M. sold an extraordinary number of options, because its computer models told it that the markets ought to be calming down. And what happened? The Russian government defaulted on its bonds; the markets went crazy; and in a matter of weeks L.T.C.M. was finished. Spitznagel, Taleb's head trader, says that he recently heard one of the former top executives of L.T.C.M. give a lecture in which he defended the gamble that the fund had made. "What he said was, Look, when I drive home every night in the fall I see all these leaves scattered around the base of the trees," Spitznagel recounts. "There is a statistical distribution that governs the way they fall, and I can be pretty accurate in figuring out what that distribution is going to be. But one day I came home and the leaves were in little piles. Does that falsify my theory that there are statistical rules governing how leaves fall? No. It was a man-made event." In other words, the Russians, by defaulting on their bonds, did something that they were not supposed to do, a once-in-a-lifetime, rule-breaking event. But this, to Taleb, is just the point: in the markets, unlike in the physical universe, the rules of the game can be changed. Central banks can decide to default on government-backed securities.
One of Taleb's earliest Wall Street mentors was a short-tempered Frenchman named Jean-Patrice, who dressed like a peacock and had an almost neurotic obsession with risk. Jean-Patrice would call Taleb from Regine's at three in the morning, or take a meeting in a Paris nightclub, sipping champagne and surrounded by scantily clad women, and once Jean-Patrice asked Taleb what would happen to his positions if a plane crashed into his building. Taleb was young then and brushed him aside. It seemed absurd. But nothing, Taleb soon realized, is absurd. Taleb likes to quote David Hume: "No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion." Because L.T.C.M. had never seen a black swan in Russia, it thought no Russian black swans existed. Taleb, by contrast, has constructed a trading philosophy predicated entirely on the existence of black swans, on the possibility of some random, unexpected event sweeping the markets. He never sells options, then. He only buys them. He's never the one who can lose a great deal of money if G.M. stock suddenly plunges. Nor does he ever bet on the market moving in one direction or another. That would require Taleb to assume that he understands the market, and he doesn't. He hasn't Warren Buffett's confidence. So he buys options on both sides, on the possibility of the market moving both up and down. And he doesn't bet on minor fluctuations in the market. Why bother? If everyone else is vastly underestimating the possibility of rare events, then an option on G.M. at, say, forty dollars is going to be undervalued. So Taleb buys out-of-the-money options by the truckload. He buys them for hundreds of different stocks, and if they expire before he gets to use them he simply buys more. Taleb doesn't even invest in stocks, not for Empirica and not for his own personal account. Buying a stock, unlike buying an option, is a gamble that the future will represent an improved version of the past. And who knows whether that will be true? So all of Taleb's personal wealth, and the hundreds of millions that Empirica has in reserve, is in Treasury bills. Few on Wall Street have taken the practice of buying options to such extremes. But if anything completely out of the ordinary happens to the stock market, if some random event sends a jolt through all of Wall Street and pushes G.M. to, say, twenty dollars, Nassim Taleb will not end up in a dowdy apartment in Athens. He will be rich.
Not long ago, Taleb went to a dinner in a French restaurant just north of Wall Street. The people at the dinner were all quants: men with bulging pockets and open-collared shirts and the serene and slightly detached air of those who daydream in numbers. Taleb sat at the end of the table, drinking pastis and discussing French literature. There was a chess grand master at the table, with a shock of white hair, who had once been one of Anatoly Karpov's teachers, and another man who over the course of his career had worked, in order, at Stanford University, Exxon, Los Alamos National Laboratory, Morgan Stanley, and a boutique French investment bank. They talked about mathematics and chess and fretted about one of their party who had not yet arrived and who had the reputation, as one of the quants worriedly said, of "not being able to find the bathroom." When the check came, it was given to a man who worked in risk management at a big Wall Street bank, and he stared at it for a long time, with a slight mixture of perplexity and amusement, as if he could not remember what it was like to deal with a mathematical problem of such banality. The men at the table were in a business that was formally about mathematics but was really about epistemology, because to sell or to buy an option requires each party to confront the question of what it is he truly knows. Taleb buys options because he is certain that, at root, he knows nothing, or, more precisely, that other people believe they know more than they do. But there were plenty of people around that table who sold options, who thought that if you were smart enough to set the price of the option properly you could win so many of those one-dollar bets on General Motors that, even if the stock ever did dip below forty-five dollars, you'd still come out far ahead. They believe that the world is a place where, at the end of the day, leaves fall more or less in a predictable pattern.
The distinction between these two sides is the divide that emerged between Taleb and Niederhoffer all those years ago in Connecticut. Niederhoffer's hero is the nineteenth-century scientist Francis Galton. Niederhoffer called his eldest daughter Galt, and there is a full-length portrait of Galton in his library. Galton was a statistician and a social scientist (and a geneticist and a meteorologist), and if he was your hero you believed that by marshalling empirical evidence, by aggregating data points, you could learn whatever it was you needed to know. Taleb's hero, on the other hand, is Karl Popper, who said that you could not know with any certainty that a proposition was true; you could only know that it was not true. Taleb makes much of what he learned from Niederhoffer, but Niederhoffer insists that his example was wasted on Taleb. "In one of his cases, Rumpole of the Bailey talked about being tried by the bishop who doesn't believe in God," Niederhoffer says. "Nassim is the empiricist who doesn't believe in empiricism." What is it that you claim to learn from experience, if you believe that experience cannot be trusted? Today, Niederhoffer makes a lot of his money selling options, and more often than not the person who he sells those options to is Nassim Taleb. If one of them is up a dollar one day, in other words, that dollar is likely to have come from the other. The teacher and pupil have become predator and prey.
3.
Years ago, Nassim Taleb worked at the investment bank First Boston, and one of the things that puzzled him was what he saw as the mindless industry of the trading floor. A trader was supposed to come in every morning and buy and sell things, and on the basis of how much money he made buying and selling he was given a bonus. If he went too many weeks without showing a profit, his peers would start to look at him funny, and if he went too many months without showing a profit he would be gone. The traders were often well educated, and wore Savile Row suits and Ferragamo ties. They dove into the markets with a frantic urgency. They read the Wall Street Journal closely and gathered around the television to catch breaking news. "The Fed did this, the Prime Minister of Spain did that," Taleb recalls. "The Italian Finance Minister says there will be no competitive devaluation, this number is higher than expected, Abby Cohen just said this." It was a scene that Taleb did not understand.
"He was always so conceptual about what he was doing," says Howard Savery, who was Taleb?s assistant at the French bank Indosuez in the nineteen-eighties. "He used to drive our floor trader (his name was Tim) crazy. Floor traders are used to precision: "Sell a hundred futures at eighty-seven." Nassim would pick up the phone and say, "Tim, sell some." And Tim would say, "How many?" And he would say, "Oh, a social amount." It was like saying, "I don't have a number in mind, I just know I want to sell." There would be these heated arguments in French, screaming arguments. Then everyone would go out to dinner and have fun. Nassim and his group had this attitude that we're not interested in knowing what the new trade number is. When everyone else was leaning over their desks, listening closely to the latest figures, Nassim would make a big scene of walking out of the room."
At Empirica, then, there are no Wall Street Journals to be found. There is very little active trading, because the options that the fund owns are selected by computer. Most of those options will be useful only if the market does something dramatic, and, of course, on most days the market doesn't. So the job of Taleb and his team is to wait and to think. They analyze the company's trading policies, back-test various strategies, and construct ever-more sophisticated computer models of options pricing. Danny, in the corner, occasionally types things into the computer. Pallop looks dreamily off into the distance. Spitznagel takes calls from traders, and toggles back and forth between screens on his computer. Taleb answers e-mails and calls one of the firm's brokers in Chicago, affecting, as he does, the kind of Brooklyn accent that people from Brooklyn would have if they were actually from northern Lebanon: "Howyoudoin?" It is closer to a classroom than to a trading floor.
"Pallop, did you introspect?" Taleb calls out as he wanders back in from lunch. Pallop is asked what his Ph.D. is about. "Pretty much this," he says, waving a languid hand around the room.
"It looks like we will have to write it for him," Taleb chimes in, "because Pollop is very lazy."
What Empirica has done is to invert the traditional psychology of investing. You and I, if we invest conventionally in the market, have a fairly large chance of making a small amount of money in a given day from dividends or interest or the general upward trend of the market. We have almost no chance of making a large amount of money in one day, and there is a very small, but real, possibility that if the market collapses we could blow up. We accept that distribution of risks because, for fundamental reasons, it feels right. In the book that Pallop was reading by Kahneman and Tversky, for example, there is a description of a simple experiment, where a group of people were told to imagine that they had three hundred dollars. They were then given a choice between (a) receiving another hundred dollars or (b) tossing a coin, where if they won they got two hundred dollars and if they lost they got nothing. Most of us, it turns out, prefer (a) to (b). But then Kahneman and Tversky did a second experiment. They told people to imagine that they had five hundred dollars, and then asked them if they would rather (c) give up a hundred dollars or (d) toss a coin and pay two hundred dollars if they lost and nothing at all if they won. Most of us now prefer (d) to (c). What is interesting about those four choices is that, from a probabilistic standpoint, they are identical. They all yield an expected outcome of four hundred dollars. Nonetheless, we have strong preferences among them. Why? Because we're more willing to gamble when it comes to losses, but are risk averse when it comes to our gains. That's why we like small daily winnings in the stock market, even if that requires that we risk losing everything in a crash.
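That the four choices are probabilistically identical is easy to verify; here is a quick sketch using only the dollar figures in the paragraph above:

```python
# Expected final wealth for each of the four Kahneman-Tversky choices.
choices = {
    "(a) have $300, receive another $100":      300 + 100,
    "(b) have $300, flip for $200 or nothing":  300 + 0.5 * 200,
    "(c) have $500, give up $100":              500 - 100,
    "(d) have $500, flip to lose $200 or $0":   500 - 0.5 * 200,
}
for label, expected in choices.items():
    print(label, "->", expected)  # every choice comes out to $400 on average
```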
At Empirica, by contrast, every day brings a small but real possibility that they'll make a huge amount of money in a day; no chance that they'll blow up; and a very large possibility that they'll lose a small amount of money. All those dollar, and fifty-cent, and nickel options that Empirica has accumulated, few of which will ever be used, soon begin to add up. By looking at a particular column on the computer screens showing Empirica's positions, anyone at the firm can tell you precisely how much money Empirica has lost or made so far that day. At 11:30 A.M., for instance, they had recovered just twenty-eight per cent of the money they had spent that day on options. By 12:30, they had recovered forty per cent, meaning that the day was not yet half over and Empirica was already in the red to the tune of several hundred thousand dollars. The day before that, it had made back eighty-five per cent of its money; the day before that, forty-eight per cent; the day before that, sixty-five per cent; and the day before that also sixty-five per cent; and, in fact -- with a few notable exceptions, like the few days when the market reopened after September 11th -- Empirica has done nothing but lose money since last April. "We cannot blow up, we can only bleed to death," Taleb says, and bleeding to death, absorbing the pain of steady losses, is precisely what human beings are hardwired to avoid. "Say you've got a guy who is long on Russian bonds," Savery says. "He's making money every day. One day, lightning strikes and he loses five times what he made. Still, on three hundred and sixty-four out of three hundred and sixty-five days he was very happily making money. It's much harder to be the other guy, the guy losing money three hundred and sixty-four days out of three hundred and sixty-five, because you start questioning yourself. Am I ever going to make it back? Am I really right? What if it takes ten years? Will I even be sane ten years from now?" What the normal trader gets from his daily winnings is feedback, the pleasing illusion of progress. At Empirica, there is no feedback. "It's like you're playing the piano for ten years and you still can't play chopsticks," Spitznagel says, "and the only thing you have to keep you going is the belief that one day you'll wake up and play like Rachmaninoff." Was it easy knowing that Niederhoffer -- who represented everything they thought was wrong -- was out there getting rich while they were bleeding away? Of course it wasn't. If you watched Taleb closely that day, you could see the little ways in which the steady drip of losses takes a toll. He glanced a bit too much at the Bloomberg. He leaned forward a bit too often to see the daily loss count. He succumbs to an array of superstitious tics. If the going is good, he parks in the same space every day; he turned against Mahler because he associates Mahler with the last year's long dry spell. "Nassim says all the time that he needs me there, and I believe him," Spitznagel says. He is there to remind Taleb that there is a point to waiting, to help Taleb resist the very human impulse to abandon everything and stanch the pain of losing. "Mark is my cop," Taleb says. So is Pallop: he is there to remind Taleb that Empirica has the intellectual edge.
"The key is not having the ideas but having the recipe to deal with your ideas," Taleb says. "We don't need moralizing. We need a set of tricks." His trick is a protocol that stipulates precisely what has to be done in every situation. "We built the protocol, and the reason we did was to tell the guys, Don't listen to me, listen to the protocol. Now, I have the right to change the protocol, but there is a protocol to changing the protocol. We have to be hard on ourselves to do what we do. The bias we see in Niederhoffer we see in ourselves." At the quant dinner, Taleb devoured his roll, and as the busboy came around with more rolls Taleb shouted out "No, no!" and blocked his plate. It was a never-ending struggle, this battle between head and heart. When the waiter came around with wine, he hastily covered the glass with his hand. When the time came to order, he asked for steak frites -- without the frites, please! -- and then immediately tried to hedge his choice by negotiating with the person next to him for a fraction of his frites.
The psychologist Walter Mischel has done a series of experiments where he puts a young child in a room and places two cookies in front of him, one small and one large. The child is told that if he wants the small cookie he need only ring a bell and the experimenter will come back into the room and give it to him. If he wants the better treat, though, he has to wait until the experimenter returns on his own, which might be anytime in the next twenty minutes. Mischel has videotapes of six-year-olds, sitting in the room by themselves, staring at the cookies, trying to persuade themselves to wait. One girl starts to sing to herself. She whispers what seems to be the instructions -- that she can have the big cookie if she can only wait. She closes her eyes. Then she turns her back on the cookies. Another little boy swings his legs violently back and forth, and then picks up the bell and examines it, trying to do anything but think about the cookie he could get by ringing it. The tapes document the beginnings of discipline and self-control -- the techniques we learn to keep our impulses in check -- and to watch all the children desperately distracting themselves is to experience the shock of recognition: that's Nassim Taleb!
There is something else as well that helps to explain Taleb's resolve -- more than the tics and the systems and the self-denying ordinances. It happened a year or so before he went to see Niederhoffer. Taleb had been working as a trader at the Chicago Mercantile Exchange, and developed a persistently hoarse throat. At first, he thought nothing of it: a hoarse throat was an occupational hazard of spending every day in the pit. Finally, when he moved back to New York, he went to see a doctor, in one of those Upper East Side prewar buildings with a glamorous façade. Taleb sat in the office, staring out at the plain brick of the courtyard, reading the medical diplomas on the wall over and over, waiting and waiting for the verdict. The doctor returned and spoke in a low, grave voice: "I got the pathology report. It's not as bad as it sounds . . ." But, of course, it was: he had throat cancer. Taleb's mind shut down. He left the office. It was raining outside. He walked and walked and ended up at a medical library. There he read frantically about his disease, the rainwater forming a puddle under his feet. It made no sense. Throat cancer was the disease of someone who has spent a lifetime smoking heavily. But Taleb was young, and he barely smoked at all. His risk of getting throat cancer was something like one in a hundred thousand, almost unimaginably small. He was a black swan! The cancer is now beaten, but the memory of it is also Taleb's secret, because once you have been a black swan -- not just seen one, but lived and faced death as one -- it becomes easier to imagine another on the horizon.
As the day came to an end, Taleb and his team turned their attention once again to the problem of the square root of n. Taleb was back at the whiteboard. Spitznagel was looking on. Pallop was idly peeling a banana. Outside, the sun was beginning to settle behind the trees. "You do a conversion to p1 and p2," Taleb said. His marker was once again squeaking across the whiteboard. "We say we have a Gaussian distribution, and you have the market switching from a low-volume regime to a high-volume. P21. P22. You have your eigenvalue." He frowned and stared at his handiwork. The markets were now closed. Empirica had lost money, which meant that somewhere off in the woods of Connecticut Niederhoffer had no doubt made money. That hurt, but if you steeled yourself, and thought about the problem at hand, and kept in mind that someday the market would do something utterly unexpected because in the world we live in something utterly unexpected always happens, then the hurt was not so bad. Taleb eyed his equations on the whiteboard, and arched an eyebrow. It was a very difficult problem. "Where is Dr. Wu? Should we call in Dr. Wu?"
4.
A year after Nassim Taleb came to visit him, Victor Niederhoffer blew up. He sold a very large number of options on the S. & P. index, taking millions of dollars from other traders in exchange for promising to buy a basket of stocks from them at current prices, if the market ever fell. It was an unhedged bet, or what was called on Wall Street a "naked put," meaning that he bet everything on one outcome: he bet in favor of the large probability of making a small amount of money, and against the small probability of losing a large amount of money -- and he lost. On October 27, 1997, the market plummeted eight per cent, and all of the many, many people who had bought those options from Niederhoffer came calling all at once, demanding that he buy back their stocks at pre-crash prices. He ran through a hundred and thirty million dollars -- his cash reserves, his savings, his other stocks -- and when his broker came and asked for still more he didn't have it. In a day, one of the most successful hedge funds in America was wiped out. Niederhoffer had to shut down his firm. He had to mortgage his house. He had to borrow money from his children. He had to call Sotheby's and sell his prized silver collection -- the massive nineteenth-century Brazilian "sculptural group of victory" made for the Visconde De Figueirdeo, the massive silver bowl designed in 1887 by Tiffany & Company for the James Gordon Bennet Cup yacht race, and on and on. He stayed away from the auction. He couldn't bear to watch.
"It was one of the worst things that has ever happened to me in my life, right up there with the death of those closest to me," Niederhoffer said recently. It was a Saturday in March, and he was in the library of his enormous house. Two weary-looking dogs wandered in and out. He is a tall man, an athlete, thick through the upper body and trunk, with a long, imposing face and baleful, hooded eyes. He was shoeless. One collar on his shirt was twisted inward, and he looked away as he talked. "I let down my friends. I lost my business. I was a major money manager. Now I pretty much have had to start from ground zero." He paused. "Five years have passed. The beaver builds a dam. The river washes it away, so he tries to build a better foundation, and I think I have. But I'm always mindful of the possibility of more failures." In the distance, there was a knock on the door. It was a man named Milton Bond, an artist who had come to present Niederhoffer with a painting he had done of Moby Dick ramming the Pequod. It was in the folk-art style that Niederhoffer likes so much, and he went to meet Bond in the foyer, kneeling down in front of the painting as Bond unwrapped it. Niederhoffer has other paintings of the Pequod in his house, and paintings of the Essex, the ship on which Melville's story was based. In his office, on a prominent wall, is a painting of the Titanic. They were, he said, his way of staying humble. "One of the reasons I've paid lots of attention to the Essex is that it turns out that the captain of the Essex, as soon as he got back to Nantucket, was given another job," Niederhoffer said. "They thought he did a good job in getting back after the ship was rammed. The captain was asked, `How could people give you another ship?' And he said, `I guess on the theory that lightning doesn't strike twice.' It was a fairly random thing. But then he was given the other ship, and that one foundered, too. Got stuck in the ice. At that time, he was a lost man. He wouldn't even let them save him. They had to forcibly remove him from the ship. He spent the rest of his life as a janitor in Nantucket. He became what on Wall Street they call a ghost." Niederhoffer was back in his study now, his lanky body stretched out, his feet up on the table, his eyes a little rheumy. "You see? I can't afford to fail a second time. Then I'll be a total washout. That's the significance of the Pequod."
A month or so before he blew up, Taleb had dinner with Niederhoffer at a restaurant in Westport, and Niederhoffer told him that he had been selling naked puts. You can imagine the two of them across the table from each other, Niederhoffer explaining that his bet was an acceptable risk, that the odds of the market going down so heavily that he would be wiped out were minuscule, and Taleb listening and shaking his head, and thinking about black swans. "I was depressed when I left him," Taleb said. "Here is a guy who goes out and hits a thousand backhands. He plays chess like his life depends on it. Here is a guy who, whatever he wakes up in the morning and decides to do, he ends up doing better than anyone else. I was talking to my hero . . ." This was the reason Taleb didn't want to be Niederhoffer when Niederhoffer was at his height -- the reason he didn't want the silver and the house and the tennis matches with George Soros. He could see all too clearly where it all might end up. In his mind's eye, he could envision Niederhoffer borrowing money from his children, and selling off his silver, and talking in a hollow voice about letting down his friends, and Taleb did not know if he had the strength to live with that possibility. Unlike Niederhoffer, Taleb never thought he was invincible. You couldn't if you had watched your homeland blow up, and had been the one person in a hundred thousand who gets throat cancer, and so for Taleb there was never any alternative to the painful process of insuring himself against catastrophe.
This kind of caution does not seem heroic, of course. It seems like the joyless prudence of the accountant and the Sunday-school teacher. The truth is that we are drawn to the Niederhoffers of this world because we are all, at heart, like Niederhoffer: we associate the willingness to risk great failure -- and the ability to climb back from catastrophe--with courage. But in this we are wrong. That is the lesson of Taleb and Niederhoffer, and also the lesson of our volatile times. There is more courage and heroism in defying the human impulse, in taking the purposeful and painful steps to prepare for the unimaginable.
Last fall, Niederhoffer sold a large number of options, betting that the markets would be quiet, and they were, until out of nowhere two planes crashed into the World Trade Center. "I was exposed. It was nip and tuck." Niederhoffer shook his head, because there was no way to have anticipated September 11th. "That was a totally unexpected event."
Personality Plus
September 20, 2004
ANNALS OF PSYCHOLOGY
Employers love personality tests.
But what do they really reveal?
1.
When Alexander (Sandy) Nininger was twenty-three, and newly commissioned as a lieutenant in the United States Army, he was sent to the South Pacific to serve with the 57th Infantry of the Philippine Scouts. It was January, 1942. The Japanese had just seized Philippine ports at Vigan, Legazpi, Lamon Bay, and Lingayen, and forced the American and Philippine forces to retreat into Bataan, a rugged peninsula on the South China Sea. There, besieged and outnumbered, the Americans set to work building a defensive line, digging foxholes and constructing dikes and clearing underbrush to provide unobstructed sight lines for rifles and machine guns. Nininger's men were on the line's right flank. They labored day and night. The heat and the mosquitoes were nearly unbearable.
Quiet by nature, Nininger was tall and slender, with wavy blond hair. As Franklin M. Reck recounts in "Beyond the Call of Duty," Nininger had graduated near the top of his class at West Point, where he chaired the lecture-and-entertainment committee. He had spent many hours with a friend, discussing everything from history to the theory of relativity. He loved the theatre. In the evenings, he could often be found sitting by the fireplace in the living room of his commanding officer, sipping tea and listening to Tchaikovsky. As a boy, he once saw his father kill a hawk and had been repulsed. When he went into active service, he wrote a friend to say that he had no feelings of hate, and did not think he could ever kill anyone out of hatred. He had none of the swagger of the natural warrior. He worked hard and had a strong sense of duty.
In the second week of January, the Japanese attacked, slipping hundreds of snipers through the American lines, climbing into trees, turning the battlefield into what Reck calls a "gigantic possum hunt." On the morning of January 12th, Nininger went to his commanding officer. He wanted, he said, to be assigned to another company, one that was in the thick of the action, so he could go hunting for Japanese snipers.
He took several grenades and ammunition belts, slung a Garand rifle over his shoulder, and grabbed a submachine gun. Starting at the point where the fighting was heaviest—near the position of the battalion's K Company—he crawled through the jungle and shot a Japanese soldier out of a tree. He shot and killed snipers. He threw grenades into enemy positions. He was wounded in the leg, but he kept going, clearing out Japanese positions for the other members of K Company, behind him. He soon ran out of grenades and switched to his rifle, and then, when he ran out of ammunition, used only his bayonet. He was wounded a second time, but when a medic crawled toward him to help bring him back behind the lines Nininger waved him off. He saw a Japanese bunker up ahead. As he leaped out of a shell hole, he was spun around by a bullet to the shoulder, but he kept charging at the bunker, where a Japanese officer and two enlisted men were dug in. He dispatched one soldier with a double thrust of his bayonet, clubbed down the other, and bayonetted the officer. Then, with outstretched arms, he collapsed face down. For his heroism, Nininger was posthumously awarded the Medal of Honor, the first American soldier so decorated in the Second World War.
2.
Suppose that you were a senior Army officer in the early days of the Second World War and were trying to put together a crack team of fearless and ferocious fighters. Sandy Nininger, it now appears, had exactly the right kind of personality for that assignment, but is there any way you could have known this beforehand? It clearly wouldn't have helped to ask Nininger if he was fearless and ferocious, because he didn't know that he was fearless and ferocious. Nor would it have worked to talk to people who spent time with him. His friend would have told you only that Nininger was quiet and thoughtful and loved the theatre, and his commanding officer would have talked about the evenings of tea and Tchaikovsky. With the exception, perhaps, of the Scarlet Pimpernel, a love of music, theatre, and long afternoons in front of a teapot is not a known predictor of great valor. What you need is some kind of sophisticated psychological instrument, capable of getting to the heart of his personality.
Over the course of the past century, psychology has been consumed with the search for this kind of magical instrument. Hermann Rorschach proposed that great meaning lay in the way that people described inkblots. The creators of the Minnesota Multiphasic Personality Inventory believed in the revelatory power of true-false items such as "I have never had any black, tarry-looking bowel movements" or "If the money were right, I would like to work for a circus or a carnival." Today, Annie Murphy Paul tells us in her fascinating new book, "Cult of Personality," that there are twenty-five hundred kinds of personality tests. Testing is a four-hundred-million-dollar-a-year industry. A hefty percentage of American corporations use personality tests as part of the hiring and promotion process. The tests figure in custody battles and in sentencing and parole decisions. "Yet despite their prevalence—and the importance of the matters they are called upon to decide—personality tests have received surprisingly little scrutiny," Paul writes. We can call in the psychologists. We can give Sandy Nininger a battery of tests. But will any of it help?
One of the most popular personality tests in the world is the Myers-Briggs Type Indicator (M.B.T.I.), a psychological-assessment system based on Carl Jung's notion that people make sense of the world through a series of psychological frames. Some people are extroverts, some are introverts. Some process information through logical thought. Some are directed by their feelings. Some make sense of the world through intuitive leaps. Others collect data through their senses. To these three categories—(I)ntroversion/(E)xtroversion, i(N)tuition/(S)ensing, (T)hinking/(F)eeling—the Myers-Briggs test adds a fourth: (J)udging/(P)erceiving. Judgers "like to live in a planned, orderly way, seeking to regulate and manage their lives," according to an M.B.T.I. guide, whereas Perceivers "like to live in a flexible, spontaneous way, seeking to experience and understand life, rather than control it." The M.B.T.I. asks the test-taker to answer a series of "forced-choice" questions, where one choice identifies you as belonging to one of these paired traits. The basic test takes twenty minutes, and at the end you are presented with a precise, multidimensional summary of your personality: your type might be INTJ or ESFP, or some other combination. Two and a half million Americans a year take the Myers-Briggs. Eighty-nine companies out of the Fortune 100 make use of it, for things like hiring or training sessions to help employees "understand" themselves or their colleagues. Annie Murphy Paul says that at the eminent consulting firm McKinsey, " 'associates' often know their colleagues' four-letter M.B.T.I. types by heart," the way they might know their own weight or (this being McKinsey) their S.A.T. scores.
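The four-letter codes are simply one pick from each of the four paired traits, which is why there are exactly sixteen of them. A trivial sketch of that combinatorial point (the letters come from the paragraph above; the actual scoring of the test is, of course, far more involved):

```python
from itertools import product

# One letter from each of the four Myers-Briggs dichotomies.
dichotomies = [("I", "E"), ("N", "S"), ("T", "F"), ("J", "P")]

types = ["".join(combo) for combo in product(*dichotomies)]
print(len(types))                        # 16 possible four-letter types
print("INTJ" in types, "ESFP" in types)  # the two examples given in the text
```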
It is tempting to think, then, that we could figure out the Myers-Briggs type that corresponds best to commando work, and then test to see whether Sandy Nininger fits the profile. Unfortunately, the notion of personality type is not nearly as straightforward as it appears. For example, the Myers-Briggs poses a series of items grouped around the issue of whether you—the test-taker—are someone who likes to plan your day or evening beforehand or someone who prefers to be spontaneous. The idea is obviously to determine whether you belong to the Judger or Perceiver camp, but the basic question here is surprisingly hard to answer. I think I'm someone who likes to be spontaneous. On the other hand, I have embarked on too many spontaneous evenings that ended up with my friends and me standing on the sidewalk, looking at each other and wondering what to do next. So I guess I'm a spontaneous person who recognizes that life usually goes more smoothly if I plan first, or, rather, I'm a person who prefers to be spontaneous only if there's someone around me who isn't. Does that make me spontaneous or not? I'm not sure. I suppose it means that I'm somewhere in the middle.
This is the first problem with the Myers-Briggs. It assumes that we are either one thing or another—Intuitive or Sensing, Introverted or Extroverted. But personality doesn't fit into neat binary categories: we fall somewhere along a continuum.
Here's another question: Would you rather work under a boss (or a teacher) who is good-natured but often inconsistent, or sharp-tongued but always logical?
On the Myers-Briggs, this is one of a series of questions intended to establish whether you are a Thinker or a Feeler. But I'm not sure I know how to answer this one, either. I once had a good-natured boss whose inconsistency bothered me, because he exerted a great deal of day-to-day control over my work. Then I had a boss who was quite consistent and very sharp-tongued—but at that point I was in a job where day-to-day dealings with my boss were minimal, so his sharp tongue didn't matter that much. So what do I want in a boss? As far as I can tell, the only plausible answer is: It depends. The Myers-Briggs assumes that who we are is consistent from one situation to another. But surely what we want in a boss, and how we behave toward our boss, is affected by what kind of job we have.
This is the gist of the now famous critique that the psychologist Walter Mischel has made of personality testing. One of Mischel's studies involved watching children interact with one another at a summer camp. Aggressiveness was among the traits that he was interested in, so he watched the children in five different situations: how they behaved when approached by a peer, when teased by a peer, when praised by an adult, when punished by an adult, and when warned by an adult. He found that how aggressively a child responded in one of those situations wasn't a good predictor of how that same child responded in another situation. Just because a boy was aggressive in the face of being teased by another boy didn't mean that he would be aggressive in the face of being warned by an adult. On the other hand, if a child responded aggressively to being teased by a peer one day, it was a pretty good indicator that he'd respond aggressively to being teased by a peer the next day. We have a personality in the sense that we have a consistent pattern of behavior. But that pattern is complex and that personality is contingent: it represents an interaction between our internal disposition and tendencies and the situations that we find ourselves in.
It's not surprising, then, that the Myers-Briggs has a large problem with consistency: according to some studies, more than half of those who take the test a second time end up with a different score than when they took it the first time. Since personality is continuous, not dichotomous, clearly some people who are borderline Introverts or Feelers one week slide over to Extroversion or Thinking the next week. And since personality is contingent, not stable, how we answer is affected by which circumstances are foremost in our minds when we take the test. If I happen to remember my first boss, then I come out as a Thinker. If my mind is on my second boss, I come out as a Feeler. When I took the Myers-Briggs, I scored as an INTJ. But, if odds are that I'm going to be something else if I take the test again, what good is it?
Once, for fun, a friend and I devised our own personality test. Like the M.B.T.I., it has four dimensions. The first is Canine/Feline. In romantic relationships, are you the pursuer, who runs happily to the door, tail wagging? Or are you the pursued? The second is More/Different. Is it your intellectual style to gather and master as much information as you can or to make imaginative use of a discrete amount of information? The third is Insider/Outsider. Do you get along with your parents or do you define yourself outside your relationship with your mother and father? And, finally, there is Nibbler/Gobbler. Do you work steadily, in small increments, or do everything at once, in a big gulp? I'm quite pleased with the personality inventory we devised. It directly touches on four aspects of life and temperament—romance, cognition, family, and work style—that are only hinted at by Myers-Briggs. And it can be completed in under a minute, nineteen minutes faster than Myers-Briggs, an advantage not to be dismissed in today's fast-paced business environment. Of course, the four traits it measures are utterly arbitrary, based on what my friend and I came up with over the course of a phone call. But then again surely all universal dichotomous typing systems are arbitrary.
Where did the Myers-Briggs come from, after all? As Paul tells us, it began with a housewife from Washington, D.C., named Katharine Briggs, at the turn of the last century. Briggs had a daughter, Isabel, an only child for whom (as one relative put it) she did "everything but breathe." When Isabel was still in her teens, Katharine wrote a book-length manuscript about her daughter's remarkable childhood, calling her a "genius" and "a little Shakespeare." When Isabel went off to Swarthmore College, in 1915, the two exchanged letters nearly every day. Then, one day, Isabel brought home her college boyfriend and announced that they were to be married. His name was Clarence (Chief) Myers. He was tall and handsome and studying to be a lawyer, and he could not have been more different from the Briggs women. Katharine and Isabel were bold and imaginative and intuitive. Myers was practical and logical and detail-oriented. Katharine could not understand her future son-in-law. "When the blissful young couple returned to Swarthmore," Paul writes, "Katharine retreated to her study, intent on 'figuring out Chief.'" She began to read widely in psychology and philosophy. Then, in 1923, she came across the first English translation of Carl Jung's "Psychological Types." "This is it!" Katharine told her daughter. Paul recounts, "In a dramatic display of conviction she burned all her own research and adopted Jung's book as her 'Bible,' as she gushed in a letter to the man himself. His system explained it all: Lyman [Katharine's husband], Katharine, Isabel, and Chief were introverts; the two men were thinkers, while the women were feelers; and of course the Briggses were intuitives, while Chief was a senser." Encouraged by her mother, Isabel—who was living in Swarthmore and writing mystery novels—devised a paper-and-pencil test to help people identify which of the Jungian categories they belonged to, and then spent the rest of her life tirelessly and brilliantly promoting her creation.
The problem, as Paul points out, is that Myers and her mother did not actually understand Jung at all. Jung didn't believe that types were easily identifiable, and he didn't believe that people could be permanently slotted into one category or another. "Every individual is an exception to the rule," he wrote; to "stick labels on people at first sight," in his view, was "nothing but a childish parlor game." Why is a parlor game based on my desire to entertain my friends any less valid than a parlor game based on Katharine Briggs's obsession with her son-in-law?
3.
The problems with the Myers-Briggs suggest that we need a test that is responsive to the complexity and variability of the human personality. And that is why, not long ago, I found myself in the office of a psychologist from New Jersey named Lon Gieser. He is among the country's leading experts on what is called the Thematic Apperception Test (T.A.T.), an assessment tool developed in the nineteen-thirties by Henry Murray, one of the most influential psychologists of the twentieth century.
I sat in a chair facing Gieser, as if I were his patient. He had in his hand two dozen or so pictures—mostly black-and-white drawings—on legal-sized cards, all of which had been chosen by Murray years before. "These pictures present a series of scenes," Gieser said to me. "What I want you to do with each scene is tell a story with a beginning, a middle, and an end." He handed me the first card. It was of a young boy looking at a violin. I had imagined, as Gieser was describing the test to me, that it would be hard to come up with stories to match the pictures. As I quickly discovered, though, the exercise was relatively effortless: the stories just tumbled out.
"This is a young boy," I began. "His parents want him to take up the violin, and they've been encouraging him. I think he is uncertain whether he wants to be a violin player, and maybe even resents the imposition of having to play this instrument, which doesn't seem to have any appeal for him. He's not excited or thrilled about this. He'd rather be somewhere else. He's just sitting there looking at it, and dreading having to fulfill this parental obligation."
I continued in that vein for a few more minutes. Gieser gave me another card, this one of a muscular man clinging to a rope and looking off into the distance. "He's climbing up, not climbing down," I said, and went on:
It's out in public. It's some kind of big square, in Europe, and there is some kind of spectacle going on. It's the seventeenth or eighteenth century. The King is coming by in a carriage, and this man is shimmying up, so he can see over everyone else and get a better view of the King. I don't get the sense that he's any kind of highborn person. I think he aspires to be more than he is. And he's kind of getting a glimpse of the King as a way of giving himself a sense of what he could be, or what his own future could be like.
We went on like this for the better part of an hour, as I responded to twelve cards—each of people in various kinds of ambiguous situations. One picture showed a woman slumped on the ground, with some small object next to her; another showed an attractive couple in a kind of angry embrace, apparently having an argument. (I said that the fight they were having was staged, that each was simply playing a role.) As I talked, Gieser took notes. Later, he called me and gave me his impressions. "What came out was the way you deal with emotion," he said. "Even when you recognized the emotion, you distanced yourself from it. The underlying motive is this desire to avoid conflict. The other thing is that when there are opportunities to go to someone else and work stuff out, your character is always going off alone. There is a real avoidance of emotion and dealing with other people, and everyone goes to their own corners and works things out on their own."
How could Gieser make such a confident reading of my personality after listening to me for such a short time? I was baffled by this, at first, because I felt that I had told a series of random and idiosyncratic stories. When I listened to the tape I had made of the session, though, I saw what Gieser had picked up on: my stories were exceedingly repetitive in just the way that he had identified. The final card that Gieser gave me was blank, and he asked me to imagine my own picture and tell a story about it. For some reason, what came to mind was Andrew Wyeth's famous painting "Christina's World," of a woman alone in a field, her hair being blown by the wind. She was from the city, I said, and had come home to see her family in the country: "I think she is taking a walk. She is pondering some piece of important news. She has gone off from the rest of the people to think about it." Only later did I realize that in the actual painting the woman is not strolling through the field. She is crawling, desperately, on her hands and knees. How obvious could my aversion to strong emotion be?
The T.A.T. has a number of cards that are used to assess achievement—that is, how interested someone is in getting ahead and succeeding in life. One is the card of the man on the rope; another is the boy looking at his violin. Gieser, in listening to my stories, concluded that I was very low in achievement:
Some people say this kid is dreaming about being a great violinist, and he's going to make it. With you, it wasn't what he wanted to do at all. His parents were making him do it. With the rope climbing, some people do this Tarzan thing. They climb the pole and get to the top and feel this great achievement. You have him going up the rope—and why is he feeling the pleasure? Because he's seeing the King. He's still a nobody in the public square, looking at the King.
Now, this is a little strange. I consider myself quite ambitious. On a questionnaire, if you asked me to rank how important getting ahead and being successful was to me, I'd check the "very important" box. But Gieser is suggesting that the T.A.T. allowed him to glimpse another dimension of my personality.
This idea—that our personality can hold contradictory elements—is at the heart of "Strangers to Ourselves," by the social psychologist Timothy D. Wilson. He is one of the discipline's most prominent researchers, and his book is what popular psychology ought to be (and rarely is): thoughtful, beautifully written, and full of unexpected insights. Wilson's interest is in what he calls the "adaptive unconscious" (not to be confused with the Freudian unconscious). The adaptive unconscious, in Wilson's description, is a big computer in our brain which sits below the surface and evaluates, filters, and looks for patterns in the mountain of data that come in through our senses. That system, Wilson argues, has a personality: it has a set of patterns and responses and tendencies that are laid down by our genes and our early-childhood experiences. These patterns are stable and hard to change, and we are only dimly aware of them. On top of that, in his schema we have another personality: it's the conscious identity that we create for ourselves with the choices we make, the stories we tell about ourselves, and the formal reasons we come up with to explain our motives and feelings. Yet this "constructed self" has no particular connection with the personality of our adaptive unconscious. In fact, they could easily be at odds. Wilson writes:
The adaptive unconscious is more likely to influence people's uncontrolled, implicit responses, whereas the constructed self is more likely to influence people's deliberative, explicit responses. For example, the quick, spontaneous decision of whether to argue with a co-worker is likely to be under the control of one's nonconscious needs for power and affiliation. A more thoughtful decision about whether to invite a co-worker over for dinner is more likely to be under the control of one's conscious, self-attributed motives.
When Gieser said that he thought I was low in achievement, then, he presumably saw in my stories an unconscious ambivalence toward success. The T.A.T., he believes, allowed him to go beyond the way I viewed myself and arrive at a reading with greater depth and nuance.
Even if he's right, though, does this help us pick commandos? I'm not so sure. Clearly, underneath Sandy Nininger's peaceful façade there was another Nininger capable of great bravery and ferocity, and a T.A.T. of Nininger might have given us a glimpse of that part of who he was. But let's not forget that he volunteered for the front lines: he made a conscious decision to put himself in the heat of the action. What we really need is an understanding of how those two sides of his personality interact in critical situations. When is Sandy Nininger's commitment to peacefulness more, or less, important than some unconscious ferocity? The other problem with the T.A.T., of course, is that it's a subjective instrument. You could say that my story about the man climbing the rope is evidence that I'm low in achievement or you could say that it shows a strong desire for social mobility. The climber wants to look down—not up—at the King in order to get a sense "of what he could be." You could say that my interpretation that the couple's fighting was staged was evidence of my aversion to strong emotion. Or you could say that it was evidence of my delight in deception and role-playing. This isn't to question Gieser's skill or experience as a diagnostician. The T.A.T. is supposed to do no more than identify themes and problem areas, and I'm sure Gieser would be happy to put me on the couch for a year to explore those themes and see which of his initial hypotheses had any validity. But the reason employers want a magical instrument for measuring personality is that they don't have a year to work through the ambiguities. They need an answer now.
4.
A larger limitation of both Myers-Briggs and the T.A.T. is that they are indirect. Tests of this kind require us first to identify a personality trait that corresponds to the behavior we're interested in, and then to figure out how to measure that trait—but by then we're two steps removed from what we're after. And each of those steps represents an opportunity for error and distortion. Shouldn't we try, instead, to test directly for the behavior we're interested in? This is the idea that lies behind what's known as the Assessment Center, and the leading practitioner of this approach is a company called Development Dimensions International, or D.D.I.
Companies trying to evaluate job applicants send them to D.D.I.'s headquarters, outside Pittsburgh, where they spend the day role-playing as business executives. When I contacted D.D.I., I was told that I was going to be Terry Turner, the head of the robotics division of a company called Global Solutions.
I arrived early in the morning, and was led to an office. On the desk were a computer, a phone, and a tape recorder. In the corner of the room was a video camera, and on my desk was an agenda for the day. I had a long telephone conversation with a business partner from France. There were labor difficulties at an overseas plant. A new product—a robot for the home—had run into a series of technical glitches. I answered e-mails. I prepared and recorded a talk for a product-launch meeting. I gave a live interview to a local television reporter. In the afternoon, I met with another senior Global Solutions manager, and presented a strategic plan for the future of the robotics division. It was a long, demanding day at the office, and when I left, a team of D.D.I. specialists combed through copies of my e-mails, the audiotapes of my phone calls and my speech, and the videotapes of my interviews, and analyzed me across four dimensions: interpersonal skills, leadership skills, business-management skills, and personal attributes. A few weeks later, I was given my report. Some of it was positive: I was a quick learner. I had good ideas. I expressed myself well, and—I was relieved to hear—wrote clearly. But, as the assessment of my performance made plain, I was something less than top management material:
Although you did a remarkable job addressing matters, you tended to handle issues from a fairly lofty perch, pitching good ideas somewhat unilaterally while lobbing supporting rationale down to the team below. . . . Had you brought your team closer to decisions by vesting them with greater accountability, responsibility and decision-making authority, they would have undoubtedly felt more engaged, satisfied and valued. . . . In a somewhat similar vein, but on a slightly more interpersonal level, while you seemed to recognize the value of collaboration and building positive working relationships with people, you tended to take a purely businesslike approach to forging partnerships. You spoke of win/win solutions from a business perspective and your rationale for partnering and collaboration seemed to be based solely on business logic. Additionally, at times you did not respond to some of the softer, subtler cues that spoke to people's real frustrations, more personal feelings, or true point of view.
Ouch! Of course, when the D.D.I. analysts said that I did not respond to "some of the softer, subtler cues that spoke to people's real frustrations, more personal feelings, or true point of view," they didn't mean that I was an insensitive person. They meant that I was insensitive in the role of manager. The T.A.T. and M.B.T.I. aimed to make global assessments of the different aspects of my personality. My day as Terry Turner was meant to find out only what I'm like when I'm the head of the robotics division of Global Solutions. That's an important difference. It respects the role of situation and contingency in personality. It sidesteps the difficulty of integrating my unconscious self with my constructed self by looking at the way that my various selves interact in the real world. Most important, it offers the hope that with experience and attention I can construct a more appropriate executive "self." The Assessment Center is probably the best method that employers have for evaluating personality.
But could an Assessment Center help us identify the Sandy Niningers of the world? The center makes a behavioral prediction, and, as solid and specific as that prediction is, people are least predictable at those critical moments when prediction would be most valuable. The answer to the question of whether my Terry Turner would be a good executive is, once again: It depends. It depends on what kind of company Global Solutions is, and on what kind of respect my co-workers have for me, and on how quickly I manage to correct my shortcomings, and on all kinds of other things that cannot be anticipated. The quality of being a good manager is, in the end, as irreducible as the quality of being a good friend. We think that a friend has to be loyal and nice and interesting—and that's certainly a good start. But people whom we don't find loyal, nice, or interesting have friends, too, because loyalty, niceness, and interestingness are emergent traits. They arise out of the interaction of two people, and all we really mean when we say that someone is interesting or nice is that they are interesting or nice to us.
All these difficulties do not mean that we should give up on the task of trying to understand and categorize one another. We could certainly send Sandy Nininger to an Assessment Center, and find out whether, in a make-believe battle, he plays the role of commando with verve and discipline. We could talk to his friends and discover his love of music and theatre. We could find out how he responded to the picture of the man on a rope. We could sit him down and have him do the Myers-Briggs and dutifully note that he is an Introverted, Intuitive, Thinking Judger, and, for good measure, take an extra minute to run him through my own favorite personality inventory and type him as a Canine, Different, Insider Gobbler. We will know all kinds of things about him then. His personnel file will be as thick as a phone book, and we can consult our findings whenever we make decisions about his future. We just have to acknowledge that his file will tell us little about the thing we're most interested in. For that, we have to join him in the jungles of Bataan.
Getting Over It
November 8, 2004
ANNALS OF PSYCHOLOGY
The Man in the Gray Flannel Suit put
the war behind him. Why can't we?
1.
When Tom Rath, the hero of Sloan Wilson's 1955 novel "The Man in the Gray Flannel Suit," comes home to Connecticut each day from his job in Manhattan, his wife mixes him a Martini. If he misses the train, he'll duck into the bar at Grand Central Terminal and have a highball, or perhaps a Scotch. On Sunday mornings, Rath and his wife lie around drinking Martinis. Once, Rath takes a tumbler of Martinis to bed, and after finishing it drifts off to sleep. Then his wife wakes him up in the middle of the night, wanting to talk. "I will if you get me a drink," he says. She comes back with a glass half full of ice and gin. "On Greentree Avenue cocktail parties started at seven-thirty, when the men came home from New York, and they usually continued without any dinner until three or four o'clock in the morning," Wilson writes of the tidy neighborhood in Westport where Rath and countless other young, middle-class families live. "Somewhere around nine-thirty in the evening, Martinis and Manhattans would give way to highballs, but the formality of eating anything but hors d'oeuvres in-between had been entirely omitted."
"The Man in the Gray Flannel Suit" is about a public-relations specialist who lives in the suburbs, works for a media company in midtown, and worries about money, job security, and educating his children. It was an enormous best-seller. Gregory Peck played Tom Rath in the Hollywood version, and today, on the eve of the fiftieth anniversary of the book's publication, many of the themes the novel addresses seem strikingly contemporary. But in other ways "The Man in the Gray Flannel Suit" is utterly dated. The details are all wrong. Tom Rath, despite an introspective streak, is supposed to be a figure of middle-class normalcy. But by our standards he and almost everyone else in the novel look like alcoholics. The book is supposed to be an argument for the importance of family over career. But Rath's three children—the objects of his sacrifice—are so absent from the narrative and from Rath's consciousness that these days he'd be called an absentee father.
The most discordant note, though, is struck by the account of Rath's experience in the Second World War. He had, it becomes clear, a terrible war. As a paratrooper in Europe, he and his close friend Hank Mahoney find themselves trapped—starving and freezing—behind enemy lines, and end up killing two German sentries in order to take their sheepskin coats. But Rath doesn't quite kill one of them, and Mahoney urges him to finish the job:
Tom had knelt beside the sentry. He had not thought it would be difficult, but the tendons of the boy's neck had proved tough, and suddenly the sentry had started to sit up. In a rage Tom had plunged the knife repeatedly into his throat, ramming it home with all his strength until he had almost severed the head from the body.
At the end of the war, Rath and Mahoney are transferred to the Pacific theatre for the invasion of the island of Karkow. There Rath throws a hand grenade and inadvertently kills his friend. He crawls over to Hank's body, calling out his name. "Tom had put his hand under Mahoney's arm and turned him over," Wilson writes. "Mahoney's entire chest had been torn away, leaving the naked lungs and splintered ribs exposed."
Rath picks up the body and runs back toward his own men, dodging enemy fire. Coming upon a group of Japanese firing from a cave, he props the body up, crawls within fifteen feet of the machine gun, tosses in two grenades, and then finishes off the lone survivor with a knife. He takes Hank's body into a bombed-out pillbox and tries to resuscitate his friend's corpse. The medics tell him that Hank has been dead for hours. He won't listen. In a daze, he runs with the body toward the sea.
Wilson's description of Mahoney's death is as brutal and moving a description of the madness of combat as can be found in postwar fiction. But what happens to Rath as a result of that day in Karkow? Not much. It does not destroy him, or leave him permanently traumatized. The part of Rath's war experience that leaves him truly guilt-ridden is the adulterous affair that he has with a woman named Maria while waiting for redeployment orders in Rome. In the elevator of his midtown office, he runs into a friend who knew Maria, and learns that he fathered a son. He obsessively goes over and over the affair in his mind, trying to square his feeling toward Maria with his love for his wife, and his marriage is fully restored only when he confesses to the existence of his Italian child. Killing his best friend, by contrast, is something that comes up and then gets tucked away. As Rath sat on the beach, and Mahoney's body was finally taken away, Wilson writes:
A major, coming to squat beside him, said, "Some of these goddamn sailors got heads. They went ashore and got Jap heads, and they tried to boil them in the galley to get the skulls for souvenirs."
Tom had shrugged and said nothing. The fact that he had been too quick to throw a hand grenade and had killed Mahoney, the fact that some young sailors had wanted skulls for souvenirs, and the fact that a few hundred men had lost their lives to take the island of Karkow—all these facts were simply incomprehensible and had to be forgotten. That, he had decided, was the final truth of the war, and he had greeted it with relief, greeted it eagerly, the simple fact that it was incomprehensible and had to be forgotten. Things just happen, he had decided; they happen and they happen again, and anybody who tries to make sense out of it goes out of his mind.
You couldn't write that scene today, at least not without irony. No soldier, according to our contemporary understanding, could ever shrug off an experience like that. Today, it is Rath's affair with Maria that would be rationalized and explained away. He was a soldier, after all, in the midst of war. Who knew if he would ever see his wife again? Tim O'Brien's best-selling 1994 novel "In the Lake of the Woods" has a narrative structure almost identical to that of "The Man in the Gray Flannel Suit." O'Brien's hero, John Wade, is present at a massacre of civilians in the Vietnamese village of Thuan Yen. He kills a fellow-soldier—a man he loved like a brother. And, just like Rath, Wade sits down at the end of the long afternoon of the worst day of his war and tries to wish the memory away:
And then later still, snagged in the sunlight, he gave himself over to forgetfulness. "Go away," he murmured. He waited a moment, then said it again, firmly, much louder, and the little village began to vanish inside its own rosy glow. Here, he reasoned, was the most majestic trick of all. In the months and years ahead, John Wade would remember Thuan Yen the way chemical nightmares are remembered, impossible combinations, impossible events, and over time the impossibility itself would become the richest and deepest and most profound memory.
This could not have happened. Therefore it did not.
Already he felt better.
But John Wade cannot forget. That's the point of O'Brien's book. "The Man in the Gray Flannel Suit" ends with Tom Rath stronger, and his marriage renewed. Wade falls apart, and when he returns home to the woman he left behind he wakes up screaming in his sleep. By the end of the novel, the past has come back and destroyed Wade, and one reason for the book's power is the inevitability of that disaster. This is the difference between a novel written in the middle of the last century and a novel written at the end of the century. Somehow in the intervening decades our understanding of what it means to experience a traumatic event has changed. We believe in John Wade now, not Tom Rath, and half a century after the publication of "The Man in the Gray Flannel Suit" it's worth wondering whether we've got it right.
2.
Several years ago, three psychologists—Bruce Rind, Robert Bauserman, and Philip Tromovitch—published an article on childhood sexual abuse in Psychological Bulletin, one of academic psychology's most prestigious journals. It was what psychologists call a meta-analysis. The three researchers collected fifty-nine studies that had been conducted over the years on the long-term psychological effects of childhood sexual abuse (C.S.A.), and combined the data, in order to get the most definitive and statistically powerful result possible.
What most studies of sexual abuse show is that if you gauge the psychological health of young adults—typically college students—using various measures of mental health (alcohol problems, depression, anxiety, eating disorders, obsessive-compulsive symptoms, social adjustment, sleeping problems, suicidal thoughts and behavior, and so on), those with a history of childhood sexual abuse will have more problems across the board than those who weren't abused. That makes intuitive sense. But Rind and his colleagues wanted to answer that question more specifically: how much worse off were the sexually abused? The fifty-nine studies were run through a series of sophisticated statistical tests. Studies from different times and places were put on the same scale. The results were surprising. The difference between the psychological health of those who had been abused and those who hadn't, they found, was marginal. It was two-tenths of a standard deviation. "That's like the difference between someone with an I.Q. of 100 and someone with an I.Q. of 97," Rind says. "Ninety-seven is statistically different from 100. But it's a trivial difference."
Then Rind and his colleagues went one step further. A significant percentage of people who were sexually abused as children grew up in families with a host of other problems, like violence, neglect, and verbal abuse. So, to the extent that the sexually abused were damaged, what caused the damage—the sexual abuse, or the violence and neglect that so often accompanied the abuse? The data suggested that it was the latter, and, if you account for such factors, that two-tenths of a standard deviation shrinks even more. "The real gap is probably smaller than 100 and 97," Rind says. "It might be 98, or maybe it's 99." The studies analyzed by Rind and his colleagues show that some victims of sexual abuse don't even regard themselves, in retrospect, as victims. Among the male college students surveyed, for instance, Rind and his colleagues found that "37 percent viewed their C.S.A. experiences as positive at the time they occurred," while forty-two per cent viewed them as positive when reflecting back on them.
The Rind article was published in the summer of 1998, and almost immediately it was denounced by conservative groups and lambasted in the media. Laura Schlessinger—a popular radio talk-show host known as Dr. Laura—called it "junk science." In Washington, Representative Matt Salmon called it "the Emancipation Proclamation for pedophiles," while Representative Tom DeLay accused it of "normalizing pedophilia." They held a press conference at which they demanded that the American Psychological Association censure the paper. In July of 1999, a year after its publication, both the House and the Senate overwhelmingly passed resolutions condemning the analysis. Few articles in the history of academic psychology have created such a stir.
But why? It's not as if the authors said that C.S.A. was a good thing. They just suggested that it didn't cause as many problems as we'd thought—and the question of whether C.S.A. is morally wrong doesn't hinge on its long-term consequences. Nor did the study say that sexual abuse was harmless. On average, the researchers concluded, the long-term damage is small. But that average is made up of cases where the damage is hard to find (like C.S.A. involving adolescent boys) and cases where the damage is quite significant (like father-daughter incest). Rind was trying to help psychologists focus on what was truly harmful. And, when it came to the effects of things like physical abuse and neglect, he and his colleagues sounded the alarm. "What happens in physical abuse is that it doesn't happen once," Rind says. "It happens time and time again. And, when it comes to neglect, the research shows that is the most noxious factor of all—worse than physical abuse. Why? Because it's not practiced for one week. It's a persistent thing. It's a permanent feature of the parent-child relationship. These are the kinds of things that cause problems in adulthood."
All Rind and his colleagues were saying was that sexual abuse is often something that people eventually can get over, and one of the reasons that the Rind study was so unacceptable is that we no longer think that traumatic experiences are things we can get over. We believe that the child who is molested by an uncle or a priest, on two or three furtive occasions, has to be permanently scarred by the experience—just as the soldier who accidentally kills his best friend must do more than sit down on the beach and decide that sometimes things just "happen."
In a recent history of the Rind controversy, the psychologist Scott Lilienfeld pointed out that when we find out that something we thought was very dangerous actually isn't that dangerous after all we usually regard what we've learned as good news. To him, the controversy was a paradox, and he is quite right. This attachment we have to John Wade over Tom Rath is not merely a preference for one kind of war narrative over another. It is a shift in perception so profound that the United States Congress could be presented with evidence of the unexpected strength and resilience of the human spirit and reject it without a single dissenting vote.
3.
In "The Man in the Gray Flannel Suit," Tom Rath works for Ralph Hopkins, who is the president of the United Broadcasting Company. Hopkins has decided that he wants to play a civic role in the issue of mental health, and Rath's job is to write his speeches and handle public relations connected to the project. "It all started when a group of doctors called on me a few months ago," Hopkins tells Rath, when he hires him for the job. "They apparently felt that there is too little public understanding of the whole question of mental illness, and that a campaign like the fight against cancer or polio is needed." Again and again, in the novel, the topic of mental health surfaces. Rath's father, we learn, suffered a nervous breakdown after serving in the trenches of the First World War, and died in what may well have been a suicide. His grandmother, whose death sets the book's plot in motion, wanders in and out of lucidity at the end of her life. Hopkins, in a hilarious scene, recalls his unsatisfactory experience with a psychiatrist. To Wilson's readers, this preoccupation would not have seemed out of place. In 1955, the population of New York State's twenty-seven psychiatric hospitals was nearly ninety-four thousand. (Today, largely because of anti-psychotic drugs, it is less than six thousand.) It was impossible to drive any distance from Manhattan and not be confronted with ominous, hulking reminders of psychiatric distress: the enormous complex across the Triborough Bridge, on Wards Island; Sagamore and Pilgrim Hospitals, on Long Island; Creedmoor, in Queens. Mental health mattered to the reader of the nineteen-fifties, in a way that, say, aids mattered in the novels of the late nineteen-eighties.
But Wilson draws a very clear line between the struggles of the Raths and the plight of those suffering from actual mental illness. At one point, for example, Rath's wife, Betsy, wonders why nothing is fun anymore:
It probably would take a psychiatrist to answer that. Maybe Tom and I both ought to visit one, she thought. What's the matter? the psychiatrist would say, and I would reply, I don't know—nothing seems to be much fun any more. All of a sudden the music stopped, and it didn't start again. Is that strange, or does it happen to everyone about the time when youth starts to go?
The psychiatrist would have an explanation, Betsy thought, but I don't want to hear it. People rely too much on explanations these days, and not enough on courage and action. . . . Tom has a good job, and he'll get his enthusiasm back, be a success at it. Everything's going to be fine. It does no good to wallow in night thoughts. In God we trust, and that's that.
This is not denial, much as it may sound like it. Betsy Rath is not saying that her husband doesn't have problems. She's just saying that, in all likelihood, Tom will get over his problems. This is precisely the idea that lies at the heart of the Rind meta-analysis. Once you've separated out the small number of seriously damaged people—the victims of father-daughter incest, or of prolonged neglect and physical abuse—the balance of C.S.A. survivors are pretty much going to be fine. The same is true, it turns out, of other kinds of trauma. The Columbia University psychologist George Bonanno, for instance, followed a large number of men and women who had recently lost a spouse. "In the bereavement area, the assumption has been that when people lose a loved one there is a kind of unitary process that everybody must go through," Bonanno says. "That process has been called grief work. The grief must be processed. It must be examined. It must be fully understood, then finished. It was the same kind of assumption that dominated the trauma world. The idea was that everybody exposed to these kinds of events will have to go through the same kind of process if they are to recover. And if you don't do this, if you have somehow inhibited or buried the experience, the assumption was that you would pay in the long run."
Instead, Bonanno found a wide range of responses. Some people went through a long and painful grieving process; others a period of debilitating depression. But by far the most common response was resilience: the majority of those who had just suffered from one of the most painful experiences of their lives never lapsed into serious depression, experienced a relatively brief period of grief symptoms, and soon returned to normal functioning. These people were not necessarily the hardiest or the healthiest. They just managed, by one means or another, to muddle through.
"Most people just plain cope well," Bonanno says. "The vast majority of people get over traumatic events, and get over them remarkably well. Only a small subset—five to fifteen per cent—struggle in a way that says they need help."
What these patterns of resilience suggest is that human beings are naturally endowed with a kind of psychological immune system, which keeps us in balance and overcomes wild swings to either end of the emotional spectrum. Most of us aren't resilient just in the wake of bad experiences, after all. We're also resilient in the wake of wonderful experiences; the joy of a really good meal, or winning a tennis match, or getting praised by a boss doesn't last that long, either. "One function of emotions is to signal to people quickly which things in their environments are dangerous and should be avoided and which are positive and should be approached," Timothy Wilson, a psychologist at the University of Virginia, has said. "People have very fast emotional reactions to events that serve as signals, informing them what to do. A problem with prolonged emotional reactions to past events is that it might be more difficult for these signals to get through. If people are still in a state of bliss over yesterday's success, today's dangers and hazards might be more difficult to recognize." (Wilson, incidentally, is Sloan Wilson's nephew.)
Wilson and his longtime collaborator, Daniel T. Gilbert, argue that a distinctive feature of this resilience is that people don't realize that they possess it. People are bad at forecasting their emotions—at appreciating how well, under most circumstances, they will recover. Not long ago, for instance, Gilbert, Wilson, and two other researchers—Carey Morewedge and Jane Risen—asked passengers at a subway station in Cambridge, Massachusetts, how much regret they thought they would feel if they arrived on the platform just as a train was pulling away. Then they approached passengers who really had arrived just as their train was leaving, and asked them how they felt. They found that the predictions of how bad it would feel to have just barely missed a train were on average greater than reports of how it actually felt to watch the train pull away. We suffer from what Wilson and Gilbert call an impact bias: we always assume that our emotional states will last much longer than they do. We forget that other experiences will compete for our attention and emotions. We forget that our psychological immune system will kick in and take away the sting of adversity. "When I talk about our research, I say to people, 'I'm not telling you that bad things don't hurt,'" Gilbert says. "Of course they do. It would be perverse to say that having a child or a spouse die is not a big deal. All I'm saying is that the reality doesn't meet the expectation."
This is the difference between our own era and the one of half a century ago—between "The Man in the Gray Flannel Suit" and "In the Lake of the Woods." Sloan Wilson's book came from a time and a culture that had the confidence and wisdom to understand this truth. "I love you more than I can tell," Rath says to his wife at the end of the novel. It's an ending that no one would write today, but only because we have become blind to the fact that the past—in all but the worst of cases—sooner or later fades away. Betsy turns back to her husband:
"I want you to be able to talk to me about the war. It might help us to understand each other. Did you really kill seventeen men?"
"Yes."
"Do you want to talk about it now?"
"No. It's not that I want to and can't—it's just that I'd rather think about the future. About getting a new car and driving up to Vermont with you tomorrow."
"That will be fun. It's not an insane world. At least, our part of it doesn't have to be."
Something Borrowed
November 22, 2004
ANNALS OF CULTURE
Should a charge of plagiarism ruin your life?
1.
One day this spring, a psychiatrist named Dorothy Lewis got a call from her friend Betty, who works in New York City. Betty had just seen a Broadway play called "Frozen," written by the British playwright Bryony Lavery. "She said, 'Somehow it reminded me of you. You really ought to see it,'" Lewis recalled. Lewis asked Betty what the play was about, and Betty said that one of the characters was a psychiatrist who studied serial killers. "And I told her, 'I need to see that as much as I need to go to the moon.'"
Lewis has studied serial killers for the past twenty-five years. With her collaborator, the neurologist Jonathan Pincus, she has published a great many research papers, showing that serial killers tend to suffer from predictable patterns of psychological, physical, and neurological dysfunction: that they were almost all the victims of harrowing physical and sexual abuse as children, and that almost all of them have suffered some kind of brain injury or mental illness. In 1998, she published a memoir of her life and work entitled "Guilty by Reason of Insanity." She was the last person to visit Ted Bundy before he went to the electric chair. Few people in the world have spent as much time thinking about serial killers as Dorothy Lewis, so when her friend Betty told her that she needed to see "Frozen" it struck her as a busman's holiday.
But the calls kept coming. "Frozen" was winning raves on Broadway, and it had been nominated for a Tony. Whenever someone who knew Dorothy Lewis saw it, they would tell her that she really ought to see it, too. In June, she got a call from a woman at the theatre where "Frozen" was playing. "She said she'd heard that I work in this field, and that I see murderers, and she was wondering if I would do a talk-back after the show," Lewis said. "I had done that once before, and it was a delight, so I said sure. And I said, would you please send me the script, because I wanted to read the play."
The script came, and Lewis sat down to read it. Early in the play, something caught her eye, a phrase: "it was one of those days." One of the murderers Lewis had written about in her book had used that same expression. But she thought it was just a coincidence. "Then, there's a scene of a woman on an airplane, typing away to her friend. Her name is Agnetha Gottmundsdottir. I read that she's writing to her colleague, a neurologist called David Nabkus. And with that I realized that more was going on, and I realized as well why all these people had been telling me to see the play."
Lewis began underlining line after line. She had worked at New York University School of Medicine. The psychiatrist in "Frozen" worked at New York School of Medicine. Lewis and Pincus did a study of brain injuries among fifteen death-row inmates. Gottmundsdottir and Nabkus did a study of brain injuries among fifteen death-row inmates. Once, while Lewis was examining the serial killer Joseph Franklin, he sniffed her, in a grotesque, sexual way. Gottmundsdottir is sniffed by the play's serial killer, Ralph. Once, while Lewis was examining Ted Bundy, she kissed him on the cheek. Gottmundsdottir, in some productions of "Frozen," kisses Ralph. "The whole thing was right there," Lewis went on. "I was sitting at home reading the play, and I realized that it was I. I felt robbed and violated in some peculiar way. It was as if someone had stolen—I don't believe in the soul, but, if there was such a thing, it was as if someone had stolen my essence."
Lewis never did the talk-back. She hired a lawyer. And she came down from New Haven to see "Frozen." "In my book," she said, "I talk about where I rush out of the house with my black carry-on, and I have two black pocketbooks, and the play opens with her"—Agnetha—"with one big black bag and a carry-on, rushing out to do a lecture." Lewis had written about biting her sister on the stomach as a child. Onstage, Agnetha fantasized out loud about attacking a stewardess on an airplane and "biting out her throat." After the play was over, the cast came onstage and took questions from the audience. "Somebody in the audience said, 'Where did Bryony Lavery get the idea for the psychiatrist?'" Lewis recounted. "And one of the cast members, the male lead, said, 'Oh, she said that she read it in an English medical magazine.'" Lewis is a tiny woman, with enormous, childlike eyes, and they were wide open now with the memory. "I wouldn't have cared if she did a play about a shrink who's interested in the frontal lobe and the limbic system. That's out there to do. I see things week after week on television, on 'Law & Order' or 'C.S.I.,' and I see that they are using material that Jonathan and I brought to light. And it's wonderful. That would have been acceptable. But she did more than that. She took things about my own life, and that is the part that made me feel violated."
At the request of her lawyer, Lewis sat down and made up a chart detailing what she felt were the questionable parts of Lavery's play. The chart was fifteen pages long. The first part was devoted to thematic similarities between "Frozen" and Lewis's book "Guilty by Reason of Insanity." The other, more damning section listed twelve instances of almost verbatim similarities—totalling perhaps six hundred and seventy-five words—between passages from "Frozen" and passages from a 1997 magazine profile of Lewis. The profile was called "Damaged." It appeared in the February 24, 1997, issue of The New Yorker. It was written by me.
2.
Words belong to the person who wrote them. There are few simpler ethical notions than this one, particularly as society directs more and more energy and resources toward the creation of intellectual property. In the past thirty years, copyright laws have been strengthened. Courts have become more willing to grant intellectual-property protections. Fighting piracy has become an obsession with Hollywood and the recording industry, and, in the worlds of academia and publishing, plagiarism has gone from being bad literary manners to something much closer to a crime. When, two years ago, Doris Kearns Goodwin was found to have lifted passages from several other historians, she was asked to resign from the board of the Pulitzer Prize committee. And why not? If she had robbed a bank, she would have been fired the next day.
I'd worked on "Damaged" through the fall of 1996. I would visit Dorothy Lewis in her office at Bellevue Hospital, and watch the videotapes of her interviews with serial killers. At one point, I met up with her in Missouri. Lewis was testifying at the trial of Joseph Franklin, who claims responsibility for shooting, among others, the civil-rights leader Vernon Jordan and the pornographer Larry Flynt. In the trial, a videotape was shown of an interview that Franklin once gave to a television station. He was asked whether he felt any remorse. I wrote:
"I can't say that I do," he said. He paused again, then added, "The only thing I'm sorry about is that it's not legal."
"What's not legal?"
Franklin answered as if he'd been asked the time of day: "Killing Jews."
That exchange, almost to the word, was reproduced in "Frozen."
Lewis, the article continued, didn't feel that Franklin was fully responsible for his actions. She viewed him as a victim of neurological dysfunction and childhood physical abuse. "The difference between a crime of evil and a crime of illness," I wrote, "is the difference between a sin and a symptom." That line was in "Frozen," too—not once but twice. I faxed Bryony Lavery a letter:
I am happy to be the source of inspiration for other writers, and had you asked for my permission to quote—even liberally—from my piece, I would have been delighted to oblige. But to lift material, without my approval, is theft.
Almost as soon as I'd sent the letter, though, I began to have second thoughts. The truth was that, although I said I'd been robbed, I didn't feel that way. Nor did I feel particularly angry. One of the first things I had said to a friend after hearing about the echoes of my article in "Frozen" was that this was the only way I was ever going to get to Broadway—and I was only half joking. On some level, I considered Lavery's borrowing to be a compliment. A savvier writer would have changed all those references to Lewis, and rewritten the quotes from me, so that their origin was no longer recognizable. But how would I have been better off if Lavery had disguised the source of her inspiration?
Dorothy Lewis, for her part, was understandably upset. She was considering a lawsuit. And, to increase her odds of success, she asked me to assign her the copyright to my article. I agreed, but then I changed my mind. Lewis had told me that she "wanted her life back." Yet in order to get her life back, it appeared, she first had to acquire it from me. That seemed a little strange.
Then I got a copy of the script for "Frozen." I found it breathtaking. I realize that this isn't supposed to be a relevant consideration. And yet it was: instead of feeling that my words had been taken from me, I felt that they had become part of some grander cause. In late September, the story broke. The Times, the Observer in England, and the Associated Press all ran stories about Lavery's alleged plagiarism, and the articles were picked up by newspapers around the world. Bryony Lavery had seen one of my articles, responded to what she read, and used it as she constructed a work of art. And now her reputation was in tatters. Something about that didn't seem right.
3.
In 1992, the Beastie Boys released a song called "Pass the Mic," which begins with a six-second sample taken from the 1976 composition "Choir," by the jazz flutist James Newton. The sample was an exercise in what is called multiphonics, where the flutist "overblows" into the instrument while simultaneously singing in a falsetto. In the case of "Choir," Newton played a C on the flute, then sang C, D-flat, C—and the distortion of the overblown C, combined with his vocalizing, created a surprisingly complex and haunting sound. In "Pass the Mic," the Beastie Boys repeated the Newton sample more than forty times. The effect was riveting.
In the world of music, copyrighted works fall into two categories—the recorded performance and the composition underlying that performance. If you write a rap song, and want to sample the chorus from Billy Joel's "Piano Man," you first have to get permission from the record label to use the "Piano Man" recording, and then get permission from Billy Joel (or whoever owns his music) to use the underlying composition. In the case of "Pass the Mic," the Beastie Boys got the first kind of permission—the rights to use the recording of "Choir"—but not the second. Newton sued, and he lost—and the reason he lost serves as a useful introduction to how to think about intellectual property.
At issue in the case wasn't the distinctiveness of Newton's performance. The Beastie Boys, everyone agreed, had properly licensed Newton's performance when they paid the copyright recording fee. And there was no question about whether they had copied the underlying music to the sample. At issue was simply whether the Beastie Boys were required to ask for that secondary permission: was the composition underneath those six seconds so distinctive and original that Newton could be said to own it? The court said that it wasn't.
The chief expert witness for the Beastie Boys in the "Choir" case was Lawrence Ferrara, who is a professor of music at New York University, and when I asked him to explain the court's ruling he walked over to the piano in the corner of his office and played those three notes: C, D-flat, C. "That's it!" he shouted. "There ain't nothing else! That's what was used. You know what this is? It's no more than a mordent, a turn. It's been done thousands upon thousands of times. No one can say they own that."
Ferrara then played the most famous four-note sequence in classical music, the opening of Beethoven's Fifth: G, G, G, E-flat. This was unmistakably Beethoven. But was it original? "That's a harder case," Ferrara said. "Actually, though, other composers wrote that. Beethoven himself wrote that in a piano sonata, and you can find figures like that in composers who predate Beethoven. It's one thing if you're talking about da-da-da dummm, da-da-da dummm—those notes, with those durations. But just the four pitches, G, G, G, E-flat? Nobody owns those."
Ferrara once served as an expert witness for Andrew Lloyd Webber, who was being sued by Ray Repp, a composer of Catholic folk music. Repp said that the opening few bars of Lloyd Webber's 1984 "Phantom Song," from "The Phantom of the Opera," bore an overwhelming resemblance to his composition "Till You," written six years earlier, in 1978. As Ferrara told the story, he sat down at the piano again and played the beginning of both songs, one after the other; sure enough, they sounded strikingly similar. "Here's Lloyd Webber," he said, calling out each note as he played it. "Here's Repp. Same sequence. The only difference is that Andrew writes a perfect fourth and Repp writes a sixth."
But Ferrara wasn't quite finished. "I said, let me have everything Andrew Lloyd Webber wrote prior to 1978—'Jesus Christ Superstar,' 'Joseph,' 'Evita.'" He combed through every score, and in "Joseph and the Amazing Technicolor Dreamcoat" he found what he was looking for. "It's the song 'Benjamin Calypso.'" Ferrara started playing it. It was immediately familiar. "It's the first phrase of 'Phantom Song.' It's even using the same notes. But wait—it gets better. Here's 'Close Every Door,' from a 1969 concert performance of 'Joseph.'" Ferrara is a dapper, animated man, with a thin, well-manicured mustache, and thinking about the Lloyd Webber case was almost enough to make him jump up and down. He began to play again. It was the second phrase of "Phantom Song." "The first half of 'Phantom' is in 'Benjamin Calypso.' The second half is in 'Close Every Door.' They are identical. On the button. In the case of the first theme, in fact, 'Benjamin Calypso' is closer to the first half of the theme at issue than the plaintiff's song. Lloyd Webber writes something in 1984, and he borrows from himself."
In the "Choir" case, the Beastie Boys' copying didn't amount to theft because it was too trivial. In the "Phantom" case, what Lloyd Webber was alleged to have copied didn't amount to theft because the material in question wasn't original to his accuser. Under copyright law, what matters is not that you copied someone else's work. What matters is what you copied, and how much you copied. Intellectual-property doctrine isn't a straightforward application of the ethical principle "Thou shalt not steal." At its core is the notion that there are certain situations where you can steal. The protections of copyright, for instance, are time-limited; once something passes into the public domain, anyone can copy it without restriction. Or suppose that you invented a cure for breast cancer in your basement lab. Any patent you received would protect your intellectual property for twenty years, but after that anyone could take your invention. You get an initial monopoly on your creation because we want to provide economic incentives for people to invent things like cancer drugs. But everyone gets to steal your breast-cancer cure—after a decent interval—because it is also in society's interest to let as many people as possible copy your invention; only then can others learn from it, and build on it, and come up with better and cheaper alternatives. This balance between the protecting and the limiting of intellectual property is, in fact, enshrined in the Constitution: "Congress shall have the power to promote the Progress of Science and useful Arts, by securing for limited"—note that specification, limited—"Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."
4.
So is it true that words belong to the person who wrote them, just as other kinds of property belong to their owners? Actually, no. As the Stanford law professor Lawrence Lessig argues in his new book "Free Culture":
In ordinary language, to call a copyright a "property" right is a bit misleading, for the property of copyright is an odd kind of property. . . . I understand what I am taking when I take the picnic table you put in your backyard. I am taking a thing, the picnic table, and after I take it, you don't have it. But what am I taking when I take the good idea you had to put a picnic table in the backyard—by, for example, going to Sears, buying a table, and putting it in my backyard? What is the thing that I am taking then?
The point is not just about the thingness of picnic tables versus ideas, though that is an important difference. The point instead is that in the ordinary case—indeed, in practically every case except for a narrow range of exceptions—ideas released to the world are free. I don't take anything from you when I copy the way you dress—though I might seem weird if I do it every day. . . . Instead, as Thomas Jefferson said (and this is especially true when I copy the way someone dresses), "He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me."
Lessig argues that, when it comes to drawing this line between private interests and public interests in intellectual property, the courts and Congress have, in recent years, swung much too far in the direction of private interests. He writes, for instance, about the fight by some developing countries to get access to inexpensive versions of Western drugs through what is called "parallel importation"—buying drugs from another developing country that has been licensed to produce patented medicines. The move would save countless lives. But it has been opposed by the United States not on the ground that it would cut into the profits of Western pharmaceutical companies (they don't sell that many patented drugs in developing countries anyway) but on the ground that it violates the sanctity of intellectual property. "We as a culture have lost this sense of balance," Lessig writes. "A certain property fundamentalism, having no connection to our tradition, now reigns in this culture."
Even what Lessig decries as intellectual-property extremism, however, acknowledges that intellectual property has its limits. The United States didn't say that developing countries could never get access to cheap versions of American drugs. It said only that they would have to wait until the patents on those drugs expired. The arguments that Lessig has with the hard-core proponents of intellectual property are almost all arguments about where and when the line should be drawn between the right to copy and the right to protection from copying, not whether a line should be drawn.
But plagiarism is different, and that's what's so strange about it. The ethical rules that govern when it's acceptable for one writer to copy another are even more extreme than the most extreme position of the intellectual-property crowd: when it comes to literature, we have somehow decided that copying is never acceptable. Not long ago, the Harvard law professor Laurence Tribe was accused of lifting material from the historian Henry Abraham for his 1985 book, "God Save This Honorable Court." What did the charge amount to? In an exposé that appeared in the conservative publication The Weekly Standard, Joseph Bottum produced a number of examples of close paraphrasing, but his smoking gun was this one borrowed sentence: "Taft publicly pronounced Pitney to be a 'weak member' of the Court to whom he could not assign cases." That's it. Nineteen words.
Not long after I learned about "Frozen," I went to see a friend of mine who works in the music industry. We sat in his living room on the Upper East Side, facing each other in easy chairs, as he worked his way through a mountain of CDs. He played "Angel," by the reggae singer Shaggy, and then "The Joker," by the Steve Miller Band, and told me to listen very carefully to the similarity in bass lines. He played Led Zeppelin's "Whole Lotta Love" and then Muddy Waters's "You Need Love," to show the extent to which Led Zeppelin had mined the blues for inspiration. He played "Twice My Age," by Shabba Ranks and Krystal, and then the saccharine seventies pop standard "Seasons in the Sun," until I could hear the echoes of the second song in the first. He played "Last Christmas," by Wham!, followed by Barry Manilow's "Can't Smile Without You" to explain why Manilow might have been startled when he first heard that song, and then "Joanna," by Kool and the Gang, because, in a different way, "Last Christmas" was an homage to Kool and the Gang as well. "That sound you hear in Nirvana," my friend said at one point, "that soft and then loud, kind of exploding thing, a lot of that was inspired by the Pixies. Yet Kurt Cobain"—Nirvana's lead singer and songwriter—"was such a genius that he managed to make it his own. And 'Smells Like Teen Spirit'?"—here he was referring to perhaps the best-known Nirvana song. "That's Boston's 'More Than a Feeling.'" He began to hum the riff of the Boston hit, and said, "The first time I heard 'Teen Spirit,' I said, 'That guitar lick is from "More Than a Feeling."' But it was different—it was urgent and brilliant and new."
He played another CD. It was Rod Stewart's "Do Ya Think I'm Sexy," a huge hit from the nineteen-seventies. The chorus has a distinctive, catchy hook—the kind of tune that millions of Americans probably hummed in the shower the year it came out. Then he put on "Taj Mahal," by the Brazilian artist Jorge Ben Jor, which was recorded several years before the Rod Stewart song. In his twenties, my friend was a d.j. at various downtown clubs, and at some point he'd become interested in world music. "I caught it back then," he said. A small, sly smile spread across his face. The opening bars of "Taj Mahal" were very South American, a world away from what we had just listened to. And then I heard it. It was so obvious and unambiguous that I laughed out loud; virtually note for note, it was the hook from "Do Ya Think I'm Sexy." It was possible that Rod Stewart had independently come up with that riff, because resemblance is not proof of influence. It was also possible that he'd been in Brazil, listened to some local music, and liked what he heard.
My friend had hundreds of these examples. We could have sat in his living room playing at musical genealogy for hours. Did the examples upset him? Of course not, because he knew enough about music to know that these patterns of influence—cribbing, tweaking, transforming—were at the very heart of the creative process. True, copying could go too far. There were times when one artist was simply replicating the work of another, and to let that pass inhibited true creativity. But it was equally dangerous to be overly vigilant in policing creative expression, because if Led Zeppelin hadn't been free to mine the blues for inspiration we wouldn't have got "Whole Lotta Love," and if Kurt Cobain couldn't listen to "More Than a Feeling" and pick out and transform the part he really liked we wouldn't have "Smells Like Teen Spirit"—and, in the evolution of rock, "Smells Like Teen Spirit" was a real step forward from "More Than a Feeling." A successful music executive has to understand the distinction between borrowing that is transformative and borrowing that is merely derivative, and that distinction, I realized, was what was missing from the discussion of Bryony Lavery's borrowings. Yes, she had copied my work. But no one was asking why she had copied it, or what she had copied, or whether her copying served some larger purpose.
5.
Bryony Lavery came to see me in early October. It was a beautiful Saturday afternoon, and we met at my apartment. She is in her fifties, with short tousled blond hair and pale-blue eyes, and was wearing jeans and a loose green shirt and clogs. There was something rugged and raw about her. In the Times the previous day, the theatre critic Ben Brantley had not been kind to her new play, "Last Easter." This was supposed to be her moment of triumph. "Frozen" had been nominated for a Tony. "Last Easter" had opened Off Broadway. And now? She sat down heavily at my kitchen table. "I've had the absolute gamut of emotions," she said, playing nervously with her hands as she spoke, as if she needed a cigarette. "I think when one's working, one works between absolute confidence and absolute doubt, and I got a huge dollop of each. I was terribly confident that I could write well after 'Frozen,' and then this opened a chasm of doubt." She looked up at me. "I'm terribly sorry," she said.
Lavery began to explain: "What happens when I write is that I find that I'm somehow zoning on a number of things. I find that I've cut things out of newspapers because the story or something in them is interesting to me, and seems to me to have a place onstage. Then it starts coagulating. It's like the soup starts thickening. And then a story, which is also a structure, starts emerging. I'd been reading thrillers like 'The Silence of the Lambs,' about fiendishly clever serial killers. I'd also seen a documentary of the victims of the Yorkshire killers, Myra Hindley and Ian Brady, who were called the Moors Murderers. They spirited away several children. It seemed to me that killing somehow wasn't fiendishly clever. It was the opposite of clever. It was as banal and stupid and destructive as it could be. There are these interviews with the survivors, and what struck me was that they appeared to be frozen in time. And one of them said, 'If that man was out now, I'm a forgiving man but I couldn't forgive him. I'd kill him.' That's in 'Frozen.' I was thinking about that. Then my mother went into hospital for a very simple operation, and the surgeon punctured her womb, and therefore her intestine, and she got peritonitis and died."
When Lavery started talking about her mother, she stopped, and had to collect herself. "She was seventy-four, and what occurred to me is that I utterly forgave him. I thought it was an honest mistake. I'm very sorry it happened to my mother, but it's an honest mistake." Lavery's feelings confused her, though, because she could think of people in her own life whom she had held grudges against for years, for the most trivial of reasons. "In a lot of ways, 'Frozen' was an attempt to understand the nature of forgiveness," she said.
Lavery settled, in the end, on a play with three characters. The first is a serial killer named Ralph, who kidnaps and murders a young girl. The second is the murdered girl's mother, Nancy. The third is a psychiatrist from New York, Agnetha, who goes to England to examine Ralph. In the course of the play, the three lives slowly intersect—and the characters gradually change and become "unfrozen" as they come to terms with the idea of forgiveness. For the character of Ralph, Lavery says that she drew on a book about a serial killer titled "The Murder of Childhood," by Ray Wyre and Tim Tate. For the character of Nancy, she drew on an article written in the Guardian by a woman named Marian Partington, whose sister had been murdered by the serial killers Frederick and Rosemary West. And, for the character of Agnetha, Lavery drew on a reprint of my article that she had read in a British publication. "I wanted a scientist who would understand," Lavery said—a scientist who could explain how it was possible to forgive a man who had killed your daughter, who could explain that a serial killing was not a crime of evil but a crime of illness. "I wanted it to be accurate," she added.
So why didn't she credit me and Lewis? How could she have been so meticulous about accuracy but not about attribution? Lavery didn't have an answer. "I thought it was O.K. to use it," she said with an embarrassed shrug. "It never occurred to me to ask you. I thought it was news."
She was aware of how hopelessly inadequate that sounded, and when she went on to say that my article had been in a big folder of source material that she had used in the writing of the play, and that the folder had got lost during the play's initial run, in Birmingham, she was aware of how inadequate that sounded, too.
But then Lavery began to talk about Marian Partington, her other important inspiration, and her story became more complicated. While she was writing "Frozen," Lavery said, she wrote to Partington to inform her of how much she was relying on Partington's experiences. And when "Frozen" opened in London she and Partington met and talked. In reading through articles on Lavery in the British press, I found this, from the Guardian two years ago, long before the accusations of plagiarism surfaced:
Lavery is aware of the debt she owes to Partington's writing and is eager to acknowledge it.
"I always mention it, because I am aware of the enormous debt that I owe to the generosity of Marian Partington's piece . . . . You have to be hugely careful when writing something like this, because it touches on people's shattered lives and you wouldn't want them to come across it unawares."
Lavery wasn't indifferent to other people's intellectual property, then; she was just indifferent to my intellectual property. That's because, in her eyes, what she took from me was different. It was, as she put it, "news." She copied my description of Dorothy Lewis's collaborator, Jonathan Pincus, conducting a neurological examination. She copied the description of the disruptive neurological effects of prolonged periods of high stress. She copied my transcription of the television interview with Franklin. She reproduced a quote that I had taken from a study of abused children, and she copied a quotation from Lewis on the nature of evil. She didn't copy my musings, or conclusions, or structure. She lifted sentences like "It is the function of the cortex—and, in particular, those parts of the cortex beneath the forehead, known as the frontal lobes—to modify the impulses that surge up from within the brain, to provide judgment, to organize behavior and decision-making, to learn and adhere to rules of everyday life." It is difficult to have pride of authorship in a sentence like that. My guess is that it's a reworked version of something I read in a textbook. Lavery knew that failing to credit Partington would have been wrong. Borrowing the personal story of a woman whose sister was murdered by a serial killer matters because that story has real emotional value to its owner. As Lavery put it, it touches on someone's shattered life. Are boilerplate descriptions of physiological functions in the same league?
It also matters how Lavery chose to use my words. Borrowing crosses the line when it is used for a derivative work. It's one thing if you're writing a history of the Kennedys, like Doris Kearns Goodwin, and borrow, without attribution, from another history of the Kennedys. But Lavery wasn't writing another profile of Dorothy Lewis. She was writing a play about something entirely new—about what would happen if a mother met the man who killed her daughter. And she used my descriptions of Lewis's work and the outline of Lewis's life as a building block in making that confrontation plausible. Isn't that the way creativity is supposed to work? Old words in the service of a new idea aren't the problem. What inhibits creativity is new words in the service of an old idea.
And this is the second problem with plagiarism. It is not merely extremist. It has also become disconnected from the broader question of what does and does not inhibit creativity. We accept the right of one writer to engage in a full-scale knockoff of another—think how many serial-killer novels have been cloned from "The Silence of the Lambs." Yet, when Kathy Acker incorporated parts of a Harold Robbins sex scene verbatim in a satiric novel, she was denounced as a plagiarist (and threatened with a lawsuit). When I worked at a newspaper, we were routinely dispatched to "match" a story from the Times: to do a new version of someone else's idea. But had we "matched" any of the Times' words—even the most banal of phrases—it could have been a firing offense. The ethics of plagiarism have turned into the narcissism of small differences: because journalism cannot own up to its heavily derivative nature, it must enforce originality on the level of the sentence.
Dorothy Lewis says that one of the things that hurt her most about "Frozen" was that Agnetha turns out to have had an affair with her collaborator, David Nabkus. Lewis feared that people would think she had had an affair with her collaborator, Jonathan Pincus. "That's slander," Lewis told me. "I'm recognizable in that. Enough people have called me and said, 'Dorothy, it's about you,' and if everything up to that point is true, then the affair becomes true in the mind. So that is another reason that I feel violated. If you are going to take the life of somebody, and make them absolutely identifiable, you don't create an affair, and you certainly don't have that as a climax of the play."
It is easy to understand how shocking it must have been for Lewis to sit in the audience and see her "character" admit to that indiscretion. But the truth is that Lavery has every right to create an affair for Agnetha, because Agnetha is not Dorothy Lewis. She is a fictional character, drawn from Lewis's life but endowed with a completely imaginary set of circumstances and actions. In real life, Lewis kissed Ted Bundy on the cheek, and in some versions of "Frozen" Agnetha kisses Ralph. But Lewis kissed Bundy only because he kissed her first, and there's a big difference between responding to a kiss from a killer and initiating one. When we first see Agnetha, she's rushing out of the house and thinking murderous thoughts on the airplane. Dorothy Lewis also charges out of her house and thinks murderous thoughts. But the dramatic function of that scene is to make us think, in that moment, that Agnetha is crazy. And the one inescapable fact about Lewis is that she is not crazy: she has helped get people to rethink their notions of criminality because of her unshakable command of herself and her work. Lewis is upset not just about how Lavery copied her life story, in other words, but about how Lavery changed her life story. She's not merely upset about plagiarism. She's upset about art—about the use of old words in the service of a new idea—and her feelings are perfectly understandable, because the alterations of art can be every bit as unsettling and hurtful as the thievery of plagiarism. It's just that art is not a breach of ethics.
When I read the original reviews of "Frozen," I noticed that time and again critics would use, without attribution, some version of the sentence "The difference between a crime of evil and a crime of illness is the difference between a sin and a symptom." That's my phrase, of course. I wrote it. Lavery borrowed it from me, and now the critics were borrowing it from her. The plagiarist was being plagiarized. In this case, there is no "art" defense: nothing new was being done with that line. And this was not "news." Yet do I really own "sins and symptoms"? There is a quote by Gandhi, it turns out, using the same two words, and I'm sure that if I were to plow through the body of English literature I would find the path littered with crimes of evil and crimes of illness. The central fact about the "Phantom" case is that Ray Repp, if he was borrowing from Andrew Lloyd Webber, certainly didn't realize it, and Andrew Lloyd Webber didn't realize that he was borrowing from himself. Creative property, Lessig reminds us, has many lives—the newspaper arrives at our door, it becomes part of the archive of human knowledge, then it wraps fish. And, by the time ideas pass into their third and fourth lives, we lose track of where they came from, and we lose control of where they are going. The final dishonesty of the plagiarism fundamentalists is to encourage us to pretend that these chains of influence and evolution do not exist, and that a writer's words have a virgin birth and an eternal life. I suppose that I could get upset about what happened to my words. I could also simply acknowledge that I had a good, long ride with that line—and let it go.
"It's been absolutely bloody, really, because it attacks my own notion of my character," Lavery said, sitting at my kitchen table. A bouquet of flowers she had brought were on the counter behind her. "It feels absolutely terrible. I've had to go through the pain for being careless. I'd like to repair what happened, and I don't know how to do that. I just didn't think I was doing the wrong thing . . . and then the article comes out in the New York Times and every continent in the world." There was a long silence. She was heartbroken. But, more than that, she was confused, because she didn't understand how six hundred and seventy-five rather ordinary words could bring the walls tumbling down. "It's been horrible and bloody." She began to cry. "I'm still composting what happened. It will be for a purpose . . . whatever that purpose is.
December 13, 2004
ANNALS OF TECHNOLOGY
Mammography, air power, and the limits of looking.
1.
At the beginning of the first Gulf War, the United States Air Force dispatched two squadrons of F-15E Strike Eagle fighter jets to find and destroy the Scud missiles that Iraq was firing at Israel. The rockets were being launched, mostly at night, from the backs of modified flatbed tractor-trailers, moving stealthily around a four-hundred-square-mile "Scud box" in the western desert. The plan was for the fighter jets to patrol the box from sunset to sunrise. When a Scud was launched, it would light up the night sky. An F-15E pilot would fly toward the launch point, follow the roads that crisscrossed the desert, and then locate the target using a state-of-the-art, $4.6-million device called a LANTIRN navigation and targeting pod, capable of taking a high-resolution infrared photograph of a four-and-a-half-mile swath below the plane. How hard could it be to pick up a hulking tractor-trailer in the middle of an empty desert?
Almost immediately, reports of Scud kills began to come back from the field. The Desert Storm commanders were elated. "I remember going out to Nellis Air Force Base after the war," Barry Watts, a former Air Force colonel, says. "They did a big static display, and they had all the Air Force jets that flew in Desert Storm, and they had little placards in front of them, with a box score, explaining what this plane did and that plane did in the war. And, when you added up how many Scud launchers they claimed each got, the total was about a hundred." Air Force officials were not guessing at the number of Scud launchers hit; as far as they were concerned, they knew. They had a four-million-dollar camera, which took a nearly perfect picture, and there are few cultural reflexes more deeply ingrained than the idea that a picture has the weight of truth. "That photography not only does not, but cannot lie, is a matter of belief, an article of faith," Charles Rosen and Henri Zerner have written. "We tend to trust the camera more than our own eyes." Thus was victory declared in the Scud hunt--until hostilities ended and the Air Force appointed a team to determine the effectiveness of the air campaigns in Desert Storm. The actual number of definite Scud kills, the team said, was zero.
The problem was that the pilots were operating at night, when depth perception is impaired. LANTIRN could see in the dark, but the camera worked only when it was pointed in the right place, and the right place wasn't obvious. Meanwhile, the pilot had only about five minutes to find his quarry, because after launch the Iraqis would immediately hide in one of the many culverts underneath the highway between Baghdad and Jordan, and the screen the pilot was using to scan all that desert measured just six inches by six inches. "It was like driving down an interstate looking through a soda straw," Major General Mike DeCuir, who flew numerous Scud-hunt missions throughout the war, recalled. Nor was it clear what a Scud launcher looked like on that screen. "We had an intelligence photo of one on the ground. But you had to imagine what it would look like on a black-and-white screen from twenty thousand feet up and five or more miles away," DeCuir went on. "With the resolution we had at the time, you could tell something was a big truck and that it had wheels, but at that altitude it was hard to tell much more than that." The postwar analysis indicated that a number of the targets the pilots had hit were actually decoys, constructed by the Iraqis from old trucks and spare missile parts. Others were tanker trucks transporting oil on the highway to Jordan. A tanker truck, after all, is a tractor-trailer hauling a long, shiny cylindrical object, and, from twenty thousand feet up at four hundred miles an hour on a six-by-six-inch screen, a long, shiny cylindrical object can look a lot like a missile. "It's a problem we've always had," Watts, who served on the team that did the Gulf War analysis, said. "It's night out. You think you've got something on the sensor. You roll out your weapons. Bombs go off. It's really hard to tell what you did."
You can build a high-tech camera, capable of taking pictures in the middle of the night, in other words, but the system works only if the camera is pointed in the right place, and even then the pictures are not self-explanatory. They need to be interpreted, and the human task of interpretation is often a bigger obstacle than the technical task of picture-taking. This was the lesson of the Scud hunt: pictures promise to clarify but often confuse. The Zapruder film intensified rather than dispelled the controversy surrounding John F. Kennedy's assassination. The videotape of the beating of Rodney King led to widespread uproar about police brutality; it also served as the basis for a jury's decision to acquit the officers charged with the assault. Perhaps nowhere have these issues been so apparent, however, as in the arena of mammography. Radiologists developed state-of-the-art X-ray cameras and used them to scan women's breasts for tumors, reasoning that, if you can take a nearly perfect picture, you can find and destroy tumors before they go on to do serious damage. Yet there remains a great deal of confusion about the benefits of mammography. Is it possible that we place too much faith in pictures?
2.
The head of breast imaging at Memorial Sloan-Kettering Cancer Center, in New York City, is a physician named David Dershaw, a youthful man in his fifties, who bears a striking resemblance to the actor Kevin Spacey. One morning not long ago, he sat down in his office at the back of the Sloan-Kettering Building and tried to explain how to read a mammogram.
Dershaw began by putting an X-ray on a light box behind his desk. "Cancer shows up as one of two patterns," he said. "You look for lumps and bumps, and you look for calcium. And, if you find it, you have to make a determination: is it acceptable, or is it a pattern that might be due to cancer?" He pointed at the X-ray. "This woman has cancer. She has these tiny little calcifications. Can you see them? Can you see how small they are?" He took out a magnifying glass and placed it over a series of white flecks; as a cancer grows, it produces calcium deposits. "That's the stuff we are looking for," he said.
Then Dershaw added a series of slides to the light box and began to explain all the varieties that those white flecks came in. Some calcium deposits are oval and lucent. "They're called eggshell calcifications," Dershaw said. "And they're basically benign." Another kind of calcium runs like a railway track on either side of the breast's many blood vessels--that's benign, too. "Then there's calcium that's thick and heavy and looks like popcorn," Dershaw went on. "That's just dead tissue. That's benign. There's another calcification that's little sacs of calcium floating in liquid. It's called 'milk of calcium.' That's another kind of calcium that's always benign." He put a new set of slides against the light. "Then we have calcium that looks like this--irregular. All of these are of different density and different sizes and different configurations. Those are usually benign, but sometimes they are due to cancer. Remember you saw those railway tracks? This is calcium laid down inside a tube as well, but you can see that the outside of the tube is irregular. That's cancer." Dershaw's explanations were beginning to be confusing. "There are certain calcifications in benign tissues that are always benign," he said. "There are certain kinds that are always associated with cancer. But those are the ends of the spectrum, and the vast amount of calcium is somewhere in the middle. And making that differentiation, between whether the calcium is acceptable or not, is not clear-cut."
The same is true of lumps. Some lumps are simply benign clumps of cells. You can tell they are benign because the walls of the mass look round and smooth; in a cancer, cells proliferate so wildly that the walls of the tumor tend to be ragged and to intrude into the surrounding tissue. But sometimes benign lumps resemble tumors, and sometimes tumors look a lot like benign lumps. And sometimes you have lots of masses that, taken individually, would be suspicious but are so pervasive that the reasonable conclusion is that this is just how the woman's breast looks. "If you have a CAT scan of the chest, the heart always looks like the heart, the aorta always looks like the aorta," Dershaw said. "So when there's a lump in the middle of that, it's clearly abnormal. Looking at a mammogram is conceptually different from looking at images elsewhere in the body. Everything else has anatomy--anatomy that essentially looks the same from one person to the next. But we don't have that kind of standardized information on the breast. The most difficult decision I think anybody needs to make when we're confronted with a patient is: Is this person normal? And we have to decide that without a pattern that is reasonably stable from individual to individual, and sometimes even without a pattern that is the same from the left side to the right."
Dershaw was saying that mammography doesn't fit our normal expectations of pictures. In the days before the invention of photography, for instance, a horse in motion was represented in drawings and paintings according to the convention of ventre à terre, or "belly to the ground." Horses were drawn with their front legs extended beyond their heads, and their hind legs stretched straight back, because that was the way, in the blur of movement, a horse seemed to gallop. Then, in the eighteen-seventies, came Eadweard Muybridge, with his famous sequential photographs of a galloping horse, and that was the end of ventre à terre. Now we knew how a horse galloped. The photograph promised that we would now be able to capture reality itself.
The situation with mammography is different. The way in which we ordinarily speak about calcium and lumps is clear and unambiguous. But the picture demonstrates how blurry those seemingly distinct categories actually are. Joann Elmore, a physician and epidemiologist at the University of Washington Harborview Medical Center, once asked ten board-certified radiologists to look at a hundred and fifty mammograms--of which twenty-seven had come from women who developed breast cancer, and a hundred and twenty-three from women who were known to have been healthy. One radiologist caught eighty-five per cent of the cancers the first time around. Another caught only thirty-seven per cent. One looked at the same X-rays and saw suspicious masses in seventy-eight per cent of the cases. Another doctor saw "focal asymmetric density" in half of the cancer cases; yet another saw no "focal asymmetric density" at all. There was one particularly perplexing mammogram that three radiologists thought was normal, two thought was abnormal but probably benign, four couldn't make up their minds about, and one was convinced was cancer. (The patient was fine.) Some of these differences are a matter of skill, and there is good evidence that with more rigorous training and experience radiologists can become better at reading breast X-rays. But so much of what can be seen on an X-ray falls into a gray area that interpreting a mammogram is also, in part, a matter of temperament. Some radiologists see something ambiguous and are comfortable calling it normal. Others see something ambiguous and get suspicious.
Does that mean radiologists ought to be as suspicious as possible? You might think so, but caution simply creates another kind of problem. The radiologist in the Elmore study who caught the most cancers also recommended immediate workups--a biopsy, an ultrasound, or additional X-rays--on sixty-four per cent of the women who didn't have cancer. In the real world, a radiologist who needlessly subjected such an extraordinary percentage of healthy patients to the time, expense, anxiety, and discomfort of biopsies and further testing would find himself seriously out of step with his profession. Mammography is not a form of medical treatment, where doctors are justified in going to heroic lengths on behalf of their patients. Mammography is a form of medical screening: it is supposed to exclude the healthy, so that more time and attention can be given to the sick. If screening doesn't screen, it ceases to be useful.
Gilbert Welch, a medical-outcomes expert at Dartmouth, has pointed out that, given current breast-cancer mortality rates, nine out of every thousand sixty-year-old women will die of breast cancer in the next ten years. If every one of those women had a mammogram every year, that number would fall to six. The radiologist seeing those thousand women, in other words, would read ten thousand X-rays over a decade in order to save three lives--and that's using the most generous possible estimate of mammography's effectiveness. The reason a radiologist is required to assume that the overwhelming number of ambiguous things are normal, in other words, is that the overwhelming number of ambiguous things really are normal. Radiologists are, in this sense, a lot like baggage screeners at airports. The chances are that the dark mass in the middle of the suitcase isn't a bomb, because you've seen a thousand dark masses like it in suitcases before, and none of those were bombs--and if you flagged every suitcase with something ambiguous in it no one would ever make his flight. But that doesn't mean, of course, that it isn't a bomb. All you have to go on is what it looks like on the X-ray screen--and the screen seldom gives you quite enough information.
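Welch's arithmetic is worth making explicit. Here is a minimal back-of-the-envelope sketch of it, using only the figures quoted above (nine versus six deaths per thousand sixty-year-old women over ten years, one mammogram per woman per year); the variable names and the script itself are mine, not anything from the sources cited here.

# A sketch of Gilbert Welch's estimate, using only the numbers quoted above.
women = 1000                  # sixty-year-old women followed for a decade
deaths_without_screening = 9  # expected breast-cancer deaths over ten years
deaths_with_screening = 6     # expected deaths if every woman has annual mammograms
years = 10

xrays_read = women * years                                      # 10,000 mammograms
lives_saved = deaths_without_screening - deaths_with_screening  # 3
print(f"{xrays_read:,} X-rays read to save {lives_saved} lives")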
3.
Dershaw picked up a new X-ray and put it on the light box. It belonged to a forty-eight-year-old woman. Mammograms indicate density in the breast: the denser the tissue is, the more the X-rays are absorbed, creating the variations in black and white that make up the picture. Fat hardly absorbs the beam at all, so it shows up as black. Breast tissue, particularly the thick breast tissue of younger women, shows up on an X-ray as shades of light gray or white. This woman's breasts consisted of fat at the back of the breast and more dense, glandular tissue toward the front, so the X-ray was entirely black, with what looked like a large white, dense cloud behind the nipple. Clearly visible, in the black, fatty portion of the left breast, was a white spot. "Now, that looks like a cancer, that little smudgy, irregular, infiltrative thing," Dershaw said. "It's about five millimetres across." He looked at the X-ray for a moment. This was mammography at its best: a clear picture of a problem that needed to be fixed. Then he took a pen and pointed to the thick cloud just to the right of the tumor. The cloud and the tumor were exactly the same color. "That cancer only shows up because it's in the fatty part of the breast," he said. "If you take that cancer and put it in the dense part of the breast, you'd never see it, because the whiteness of the mass is the same as the whiteness of normal tissue. If the tumor was over there, it could be four times as big and we still wouldn't see it."
What's more, mammography is especially likely to miss the tumors that do the most harm. A team led by the research pathologist Peggy Porter analyzed four hundred and twenty-nine breast cancers that had been diagnosed over five years at the Group Health Cooperative of Puget Sound. Of those, two hundred and seventy-nine were picked up by mammography, and the bulk of them were detected very early, at what is called Stage One. (Cancer is classified into four stages, according to how far the tumor has spread from its original position.) Most of the tumors were small, less than two centimetres. Pathologists grade a tumor's aggression according to such measures as the "mitotic count"--the rate at which the cells are dividing--and the screen-detected tumors were graded "low" in almost seventy per cent of the cases. These were the kinds of cancers that could probably be treated successfully. "Most tumors develop very, very slowly, and those tend to lay down calcium deposits--and what mammograms are doing is picking up those calcifications," Leslie Laufman, a hematologist-oncologist in Ohio, who served on a recent National Institutes of Health breast-cancer advisory panel, said. "Almost by definition, mammograms are picking up slow-growing tumors."
A hundred and fifty cancers in Porter's study, however, were missed by mammography. Some of these were tumors the mammogram couldn't see--that were, for instance, hiding in the dense part of the breast. The majority, though, simply didn't exist at the time of the mammogram. These cancers were found in women who had had regular mammograms, and who were legitimately told that they showed no sign of cancer on their last visit. In the interval between X-rays, however, either they or their doctor had manually discovered a lump in their breast, and these "interval" cancers were twice as likely to be in Stage Three and three times as likely to have high mitotic counts; twenty-eight per cent had spread to the lymph nodes, as opposed to eighteen per cent of the screen-detected cancers. These tumors were so aggressive that they had gone from undetectable to detectable in the interval between two mammograms.
The problem of interval tumors explains why the overwhelming majority of breast-cancer experts insist that women in the critical fifty-to-sixty-nine age group get regular mammograms. In Porter's study, the women were X-rayed at intervals as great as every three years, and that created a window large enough for interval cancers to emerge. Interval cancers also explain why many breast-cancer experts believe that mammograms must be supplemented by regular and thorough clinical breast exams. ("Thorough" is defined as palpation of the area from the collarbone to the bottom of the rib cage, one dime-size area at a time, at three levels of pressure--just below the skin, the mid-breast, and up against the chest wall--by a specially trained practitioner for a period not less than five minutes per breast.) In a major study of mammography's effectiveness--one of a pair of Canadian trials conducted in the nineteen-eighties--women who were given regular, thorough breast exams but no mammograms were compared with those who had thorough breast exams and regular mammograms, and no difference was found in the death rates from breast cancer between the two groups. The Canadian studies are controversial, and some breast-cancer experts are convinced that they may have understated the benefits of mammography. But there is no denying the basic lessons of the Canadian trials: that a skilled pair of fingertips can find out an extraordinary amount about the health of a breast, and that we should not automatically value what we see in a picture over what we learn from our other senses.
"The finger has hundreds of sensors per square centimetre," says Mark Goldstein, a sensory psychophysicist who co-founded MammaCare, a company devoted to training nurses and physicians in the art of the clinical exam. "There is nothing in science or technology that has even come close to the sensitivity of the human finger with respect to the range of stimuli it can pick up. It's a brilliant instrument. But we simply don't trust our tactile sense as much as our visual sense."
4.
On August 17, 1943, two hundred B-17 bombers from the United States Eighth Air Force set out from England for the German city of Schweinfurt. Two months later, two hundred and twenty-eight B-17s set out to strike Schweinfurt a second time. The two daylight raids were among the heaviest the Eighth Air Force mounted in the war, and the Allied experience at Schweinfurt is an example of a more subtle--but in some cases more serious--problem with the picture paradigm.
The Schweinfurt raids grew out of the United States military's commitment to bombing accuracy. As Stephen Budiansky writes in his wonderful recent book "Air Power," the chief lesson of aerial bombardment in the First World War was that hitting a target from eight or ten thousand feet was a prohibitively difficult task. In the thick of battle, the bombardier had to adjust for the speed of the plane, the speed and direction of the prevailing winds, and the pitching and rolling of the plane, all while keeping the bombsight level with the ground. It was an impossible task, requiring complex trigonometric calculations. For a variety of reasons, including the technical challenges, the British simply abandoned the quest for precision: in both the First World War and the Second, the British military pursued a strategy of "morale" or "area" bombing, in which bombs were simply dropped, indiscriminately, on urban areas, with the intention of killing, dispossessing, and dispiriting the German civilian population.
But the American military believed that the problem of bombing accuracy was solvable, and a big part of the solution was something called the Norden bombsight. This breakthrough was the work of a solitary, cantankerous genius named Carl Norden, who operated out of a factory in New York City. Norden built a fifty-pound mechanical computer called the Mark XV, which used gears and wheels and gyroscopes to calculate airspeed, altitude, and crosswinds in order to determine the correct bomb-release point. The Mark XV, Norden's business partner boasted, could put a bomb in a pickle barrel from twenty thousand feet. The United States spent $1.5 billion developing it, which, as Budiansky points out, was more than half the amount that was spent building the atomic bomb. "At air bases, the Nordens were kept under lock and key in secure vaults, escorted to their planes by armed guards, and shrouded in a canvas cover until after takeoff," Budiansky recounts. The American military, convinced that its bombers could now hit whatever they could see, developed a strategic approach to bombing, identifying and selectively destroying targets that were critical to the Nazi war effort. In early 1943, General Henry (Hap) Arnold--the head of the Army Air Forces--assembled a group of prominent civilians to analyze the German economy and recommend critical targets. The Advisory Committee on Bombardment, as it was called, determined that the United States should target Germany's ball-bearing factories, since ball bearings were critical to the manufacture of airplanes. And the center of the German ball-bearing industry was Schweinfurt. Allied losses from the two raids were staggering. Thirty-six B-17s were shot down in the August attack, sixty-two bombers were shot down in the October raid, and between the two operations a further hundred and thirty-eight planes were badly damaged. Yet, with the war in the balance, this was considered worth the price. When the damage reports came in, Arnold exulted, "Now we have got Schweinfurt!" He was wrong.
The problem was not, as in the case of the Scud hunt, that the target could not be found, or that what was thought to be the target was actually something else. The B-17s, aided by their Norden Mark XVs, hit the ball-bearing factories hard. The problem was that the picture Air Force officers had of their target didn't tell them what they really needed to know. The Germans, it emerged, had ample stockpiles of ball bearings. They also had no difficulty increasing their imports from Sweden and Switzerland, and, through a few simple design changes, they were able to greatly reduce their need for ball bearings in aircraft production. What's more, although the factory buildings were badly damaged by the bombing, the machinery inside wasn't. Ball-bearing equipment turned out to be surprisingly hardy. "As it was, not a tank, plane, or other piece of weaponry failed to be produced because of lack of ball bearings," Albert Speer, the Nazi production chief, wrote after the war. Seeing a problem and understanding it, then, are two different things.
In recent years, with the rise of highly accurate long-distance weaponry, the Schweinfurt problem has become even more acute. If you can aim at and hit the kitchen at the back of a house, after all, you don't have to bomb the whole building. So your bomb can be two hundred pounds rather than a thousand. That means, in turn, that you can fit five times as many bombs on a single plane and hit five times as many targets in a single sortie, which sounds good--except that now you need to get intelligence on five times as many targets. And that intelligence has to be five times more specific, because if the target is in the bedroom and not the kitchen, you've missed him.
This is the issue that the United States command faced in the most recent Iraq war. Early in the campaign, the military mounted a series of air strikes against specific targets, where Saddam Hussein or other senior Baathist officials were thought to be hiding. There were fifty of these so-called "decapitation" attempts, each taking advantage of the fact that modern-day G.P.S.-guided bombs can be delivered from a fighter to within thirteen metres of their intended target. The strikes were dazzling in their precision. In one case, a restaurant was levelled. In another, a bomb burrowed down into a basement. But, in the end, every single strike failed. "The issue isn't accuracy," Watts, who has written extensively on the limitations of high-tech weaponry, says. "The issue is the quality of targeting information. The amount of information we need has gone up an order of magnitude or two in the last decade."
5.
Mammography has a Schweinfurt problem as well. Nowhere is that more evident than in the case of the breast lesion known as ductal carcinoma in situ, or DCIS, which shows up as a cluster of calcifications inside the ducts that carry milk to the nipple. It's a tumor that hasn't spread beyond those ducts, and it is so tiny that without mammography few women with DCIS would ever know they had it. In the past couple of decades, as more and more people have received regular breast X-rays and the resolution of mammography has increased, diagnoses of DCIS have soared. About fifty thousand new cases are now found every year in the United States, and virtually every DCIS lesion detected by mammography is promptly removed. But what has the targeting and destruction of DCIS meant for the battle against breast cancer? You'd expect that if we've been catching fifty thousand early-stage cancers every year, we should be seeing a corresponding decrease in the number of late-stage invasive cancers. It's not clear whether we have. During the past twenty years, the incidence of invasive breast cancer has continued to rise by the same small, steady increment every year.
In 1987, pathologists in Denmark performed a series of autopsies of women in their forties who had not been known to have breast cancer when they died of other causes. The pathologists looked at an average of two hundred and seventy-five samples of breast tissue in each case, and found some evidence of cancer--usually DCIS--in nearly forty per cent of the women. Since breast cancer accounts for less than four per cent of female deaths, clearly the overwhelming majority of these women, had they lived longer, would never have died of breast cancer. "To me, that indicates that these kinds of genetic changes happen really frequently, and that they can happen without having an impact on women's health," Karla Kerlikowske, a breast-cancer expert at the University of California at San Francisco, says. "The body has this whole mechanism to repair things, and maybe that's what happened with these tumors." Gilbert Welch, the medical-outcomes expert, thinks that we fail to understand the hit-or-miss nature of cancerous growth, and assume it to be a process that, in the absence of intervention, will eventually kill us. "A pathologist from the International Agency for Research on Cancer once told me that the biggest mistake we ever made was attaching the word 'carcinoma' to DCIS," Welch says. "The minute carcinoma got linked to it, it all of a sudden drove doctors to recommend therapy, because what was implied was that this was a lesion that would inexorably progress to invasive cancer. But we know that that's not always the case."
In some percentage of cases, however, DCIS does progress to something more serious. Some studies suggest that this happens very infrequently. Others suggest that it happens frequently enough to be of major concern. There is no definitive answer, and it's all but impossible to tell, simply by looking at a mammogram, whether a given DCIS tumor is among those lesions which will grow out from the duct or part of the majority that will never amount to anything. That's why some doctors feel that we have no choice but to treat every DCIS as life-threatening, and in thirty per cent of cases that means a mastectomy, and in another thirty-five per cent it means a lumpectomy and radiation. Would taking a better picture solve the problem? Not really, because the problem is that you don't know for sure what you're seeing, and as pictures have become better we have put ourselves in a position where we see more and more things that we don't know how to interpret. When it comes to DCIS, the mammogram delivers information without true understanding. "Almost half a million women have been diagnosed and treated for DCIS since the early nineteen-eighties--a diagnosis virtually unknown before then," Welch writes in his new book, "Should I Be Tested for Cancer?," a brilliant account of the statistical and medical uncertainties surrounding cancer screening. "This increase is the direct result of looking harder--in this case with 'better' mammography equipment. But I think you can see why it is a diagnosis that some women might reasonably prefer not to know about."
6.
The disturbing thing about DCIS, of course, is that our approach to this tumor seems like a textbook example of how the battle against cancer is supposed to work. Use a powerful camera. Take a detailed picture. Spot the tumor as early as possible. Treat it immediately and aggressively. The campaign to promote regular mammograms has used this early-detection argument with great success, because it makes intuitive sense. The danger posed by a tumor is represented visually. Large is bad; small is better--less likely to have metastasized. But here, too, tumors defy our visual intuitions.
According to Donald Berry, who is the chairman of the Department of Biostatistics and Applied Mathematics at M. D. Anderson Cancer Center, in Houston, a woman's risk of death increases only by about ten per cent for every additional centimetre in tumor length. "Suppose there is a tumor size above which the tumor is lethal, and below which it's not," Berry says. "The problem is that the threshold varies. When we find a tumor, we don't know whether it has metastasized already. And we don't know whether it's tumor size that drives the metastatic process or whether all you need is a few million cells to start sloughing off to other parts of the body. We do observe that it's worse to have a bigger tumor. But not amazingly worse. The relationship is not as great as you'd think."
In a recent genetic analysis of breast-cancer tumors, scientists selected women with breast cancer who had been followed for many years, and divided them into two groups--those whose cancer had gone into remission, and those whose cancer spread to the rest of their body. Then the scientists went back to the earliest moment that each cancer became apparent, and analyzed thousands of genes in order to determine whether it was possible to predict, at that moment, who was going to do well and who wasn't. Early detection presumes that it isn't possible to make that prediction: a tumor is removed before it becomes truly dangerous. But scientists discovered that even with tumors in the one-centimetre range--the range in which cancer is first picked up by a mammogram--the fate of the cancer seems already to have been set. "What we found is that there is biology that you can glean from the tumor, at the time you take it out, that is strongly predictive of whether or not it will go on to metastasize," Stephen Friend, a member of the gene-expression team at Merck, says. "We like to think of a small tumor as an innocent. The reality is that in that innocent lump are a lot of behaviors that spell a potential poor or good prognosis."
The good news here is that it might eventually be possible to screen breast cancers on a genetic level, using other kinds of tests--even blood tests--to look for the biological traces of those genes. This might also help with the chronic problem of overtreatment in breast cancer. If we can single out that small percentage of women whose tumors will metastasize, we can spare the rest the usual regimen of surgery, radiation, and chemotherapy. Gene-signature research is one of a number of reasons that many scientists are optimistic about the fight against breast cancer. But it is an advance that has nothing to do with taking more pictures, or taking better pictures. It has to do with going beyond the picture.
Under the circumstances, it is not hard to understand why mammography draws so much controversy. The picture promises certainty, and it cannot deliver on that promise. Even after forty years of research, there remains widespread disagreement over how much benefit women in the critical fifty-to-sixty-nine age bracket receive from breast X-rays, and further disagreement about whether there is enough evidence to justify regular mammography in women under fifty and over seventy. Is there any way to resolve the disagreement? Donald Berry says that there probably isn't--that a clinical trial that could definitively answer the question of mammography's precise benefits would have to be so large (involving more than five hundred thousand women) and so expensive (costing billions of dollars) as to be impractical. The resulting confusion has turned radiologists who do mammograms into one of the chief targets of malpractice litigation. "The problem is that mammographers--radiology groups--do hundreds of thousands of these mammograms, giving women the illusion that these things work and they are good, and if a lump is found and in most cases if it is found early, they tell women they have the probability of a higher survival rate," says E. Clay Parker, a Florida plaintiff's attorney, who recently won a $5.1 million judgment against an Orlando radiologist. "But then, when it comes to defending themselves, they tell you that the reality is that it doesn't make a difference when you find it. So you scratch your head and say, 'Well, why do you do mammography, then?'"
The answer is that mammograms do not have to be infallible to save lives. A modest estimate of mammography's benefit is that it reduces the risk of dying from breast cancer by about ten per cent--which works out, for the average woman in her fifties, to be about three extra days of life, or, to put it another way, a health benefit on a par with wearing a helmet on a ten-hour bicycle trip. That is not a trivial benefit. Multiplied across the millions of adult women in the United States, it amounts to thousands of lives saved every year, and, in combination with a medical regimen that includes radiation, surgery, and new and promising drugs, it has helped brighten the prognosis for women with breast cancer. Mammography isn't as good as we'd like it to be. But we are still better off than we would be without it.
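The "three extra days" figure can be roughly reproduced with the same kind of back-of-the-envelope arithmetic, though the article does not show its work; in the sketch below, the ten-year baseline risk is borrowed from the Welch passage earlier, and the assumption that a breast-cancer death costs about ten years of life is mine, so treat the result as an illustration rather than a derivation.

# Rough reproduction of the "about three extra days of life" estimate.
# ASSUMPTIONS (mine, not the article's): a ten-year breast-cancer mortality
# risk of about 9 in 1,000, and roughly ten years of life lost per death.
baseline_risk = 9 / 1000        # assumed ten-year risk of dying of breast cancer
relative_risk_reduction = 0.10  # the roughly ten-per-cent benefit quoted above
years_lost_per_death = 10       # assumed average years of life lost to the disease

days_gained = baseline_risk * relative_risk_reduction * years_lost_per_death * 365
print(f"Expected benefit: about {days_gained:.1f} days of life")  # roughly 3.3 days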
"There is increasingly an understanding among those of us who do this a lot that our efforts to sell mammography may have been over-vigorous," Dershaw said, "and that although we didn't intend to, the perception may have been that mammography accomplishes even more than it does." He was looking, as he spoke, at the mammogram of the woman whose tumor would have been invisible had it been a few centimetres to the right. Did looking at an X-ray like that make him nervous? Dershaw shook his head. "You have to respect the limitations of the technology," he said. "My job with the mammogram isn't to find what I can't find with a mammogram. It's to find what I can find with a mammogram. If I'm not going to accept that, then I shouldn't be reading mammograms."
7.
In February of last year, just before the start of the Iraq war, Secretary of State Colin Powell went before the United Nations to declare that Iraq was in defiance of international law. He presented transcripts of telephone conversations between senior Iraqi military officials, purportedly discussing attempts to conceal weapons of mass destruction. He told of eyewitness accounts of mobile biological-weapons facilities. And, most persuasively, he presented a series of images--carefully annotated, high-resolution satellite photographs of what he said was an Iraqi chemical-munitions facility at Taji.
"Let me say a word about satellite images before I show a couple," Powell began. "The photos that I am about to show you are sometimes hard for the average person to interpret, hard for me. The painstaking work of photo analysis takes experts with years and years of experience, poring for hours and hours over light tables. But as I show you these images, I will try to capture and explain what they mean, what they indicate, to our imagery specialists." The first photograph was dated November 10, 2002, just three months earlier, and years after the Iraqis were supposed to have rid themselves of all weapons of mass destruction. "Let me give you a closer look," Powell said as he flipped to a closeup of the first photograph. It showed a rectangular building, with a vehicle parked next to it. "Look at the image on the left. On the left is a closeup of one of the four chemical bunkers. The two arrows indicate the presence of sure signs that the bunkers are storing chemical munitions. The arrow at the top that says 'Security' points to a facility that is a signature item for this kind of bunker. Inside that facility are special guards and special equipment to monitor any leakage that might come out of the bunker." Then he moved to the vehicle next to the building. It was, he said, another signature item. "It's a decontamination vehicle in case something goes wrong. . . . It is moving around those four and it moves as needed to move as people are working in the different bunkers."
Powell's analysis assumed, of course, that you could tell from the picture what kind of truck it was. But pictures of trucks, taken from above, are not always as clear as we would like; sometimes trucks hauling oil tanks look just like trucks hauling Scud launchers, and, while a picture is a good start, if you really want to know what you're looking at you probably need more than a picture. I looked at the photographs with Patrick Eddington, who for many years was an imagery analyst with the C.I.A. Eddington examined them closely. "They're trying to say that those are decontamination vehicles," he told me. He had a photo up on his laptop, and he peered closer to get a better look. "But the resolution is sufficient for me to say that I don't think it is--and I don't see any other decontamination vehicles down there that I would recognize." The standard decontamination vehicle was a Soviet-made box-body van, Eddington said. This truck was too long. For a second opinion, Eddington recommended Ray McGovern, a twenty-seven-year C.I.A. analyst, who had been one of George H. W. Bush's personal intelligence briefers when he was Vice-President. "If you're an expert, you can tell one hell of a lot from pictures like this," McGovern said. He'd heard another interpretation. "I think," he said, "that it's a fire truck."
The Vanishing
January 15, 2005
BOOKS
In "Collapse," Jared Diamond
shows how societies destroy themselves.
1.
A thousand years ago, a group of Vikings led by Erik the Red set sail from Norway for the vast Arctic landmass west of Scandinavia which came to be known as Greenland. It was largely uninhabitable—a forbidding expanse of snow and ice. But along the southwestern coast there were two deep fjords protected from the harsh winds and saltwater spray of the North Atlantic Ocean, and as the Norse sailed upriver they saw grassy slopes flowering with buttercups, dandelions, and bluebells, and thick forests of willow and birch and alder. Two colonies were formed, three hundred miles apart, known as the Eastern and Western Settlements. The Norse raised sheep, goats, and cattle. They turned the grassy slopes into pastureland. They hunted seal and caribou. They built a string of parish churches and a magnificent cathedral, the remains of which are still standing. They traded actively with mainland Europe, and tithed regularly to the Roman Catholic Church. The Norse colonies in Greenland were law-abiding, economically viable, fully integrated communities, numbering at their peak five thousand people. They lasted for four hundred and fifty years—and then they vanished.
The story of the Eastern and Western Settlements of Greenland is told in Jared Diamond's "Collapse: How Societies Choose to Fail or Succeed" (Viking; $29.95). Diamond teaches geography at U.C.L.A. and is well known for his best-seller "Guns, Germs, and Steel," which won a Pulitzer Prize. In "Guns, Germs, and Steel," Diamond looked at environmental and structural factors to explain why Western societies came to dominate the world. In "Collapse," he continues that approach, only this time he looks at history's losers—like the Easter Islanders, the Anasazi of the American Southwest, the Mayans, and the modern-day Rwandans. We live in an era preoccupied with the way that ideology and culture and politics and economics help shape the course of history. But Diamond isn't particularly interested in any of those things—or, at least, he's interested in them only insofar as they bear on what to him is the far more important question, which is a society's relationship to its climate and geography and resources and neighbors. "Collapse" is a book about the most prosaic elements of the earth's ecosystem—soil, trees, and water—because societies fail, in Diamond's view, when they mismanage those environmental factors.
There was nothing wrong with the social organization of the Greenland settlements. The Norse built a functioning reproduction of the predominant northern-European civic model of the time—devout, structured, and reasonably orderly. In 1408, right before the end, records from the Eastern Settlement dutifully report that Thorstein Olafsson married Sigrid Bjornsdotter in Hvalsey Church on September 14th of that year, with Brand Halldorstson, Thord Jorundarson, Thorbjorn Bardarson, and Jon Jonsson as witnesses, following the proclamation of the wedding banns on three consecutive Sundays.
The problem with the settlements, Diamond argues, was that the Norse thought that Greenland really was green; they treated it as if it were the verdant farmland of southern Norway. They cleared the land to create meadows for their cows, and to grow hay to feed their livestock through the long winter. They chopped down the forests for fuel, and for the construction of wooden objects. To make houses warm enough for the winter, they built their homes out of six-foot-thick slabs of turf, which meant that a typical home consumed about ten acres of grassland.
But Greenland's ecosystem was too fragile to withstand that kind of pressure. The short, cool growing season meant that plants developed slowly, which in turn meant that topsoil layers were shallow and lacking in soil constituents, like organic humus and clay, that hold moisture and keep soil resilient in the face of strong winds. "The sequence of soil erosion in Greenland begins with cutting or burning the cover of trees and shrubs, which are more effective at holding soil than is grass," he writes. "With the trees and shrubs gone, livestock, especially sheep and goats, graze down the grass, which regenerates only slowly in Greenland's climate. Once the grass cover is broken and the soil is exposed, soil is carried away especially by the strong winds, and also by pounding from occasionally heavy rains, to the point where the topsoil can be removed for a distance of miles from an entire valley." Without adequate pastureland, the summer hay yields shrank; without adequate supplies of hay, keeping livestock through the long winter got harder. And, without adequate supplies of wood, getting fuel for the winter became increasingly difficult.
The Norse needed to reduce their reliance on livestock—particularly cows, which consumed an enormous amount of agricultural resources. But cows were a sign of high status; to northern Europeans, beef was a prized food. They needed to copy the Inuit practice of burning seal blubber for heat and light in the winter, and to learn from the Inuit the difficult art of hunting ringed seals, which were the most reliably plentiful source of food available in the winter. But the Norse had contempt for the Inuit—they called them skraelings, "wretches"—and preferred to practice their own brand of European agriculture. In the summer, when the Norse should have been sending ships on lumber-gathering missions to Labrador, in order to relieve the pressure on their own forestlands, they instead sent boats and men to the coast to hunt for walrus. Walrus tusks, after all, had great trade value. In return for those tusks, the Norse were able to acquire, among other things, church bells, stained-glass windows, bronze candlesticks, Communion wine, linen, silk, silver, churchmen's robes, and jewelry to adorn their massive cathedral at Gardar, with its three-ton sandstone building blocks and eighty-foot bell tower. In the end, the Norse starved to death.
2.
Diamond's argument stands in sharp contrast to the conventional explanations for a society's collapse. Usually, we look for some kind of cataclysmic event. The aboriginal civilization of the Americas was decimated by the sudden arrival of smallpox. European Jewry was destroyed by Nazism. Similarly, the disappearance of the Norse settlements is usually blamed on the Little Ice Age, which descended on Greenland in the early fourteen-hundreds, ending several centuries of relative warmth. (One archeologist refers to this as the "It got too cold, and they died" argument.) What all these explanations have in common is the idea that civilizations are destroyed by forces outside their control, by acts of God.
But look, Diamond says, at Easter Island. Once, it was home to a thriving culture that produced the enormous stone statues that continue to inspire awe. It was home to dozens of species of trees, which created and protected an ecosystem fertile enough to support as many as thirty thousand people. Today, it's a barren and largely empty outcropping of volcanic rock. What happened? Did a rare plant virus wipe out the island's forest cover? Not at all. The Easter Islanders chopped their trees down, one by one, until they were all gone. "I have often asked myself, 'What did the Easter Islander who cut down the last palm tree say while he was doing it?'" Diamond writes, and that, of course, is what is so troubling about the conclusions of "Collapse." Those trees were felled by rational actors—who must have suspected that the destruction of this resource would result in the destruction of their civilization. The lesson of "Collapse" is that societies, as often as not, aren't murdered. They commit suicide: they slit their wrists and then, in the course of many decades, stand by passively and watch themselves bleed to death.
This doesn't mean that acts of God don't play a role. It did get colder in Greenland in the early fourteen-hundreds. But it didn't get so cold that the island became uninhabitable. The Inuit survived long after the Norse died out, and the Norse had all kinds of advantages, including a more diverse food supply, iron tools, and ready access to Europe. The problem was that the Norse simply couldn't adapt to the country's changing environmental conditions. Diamond writes, for instance, of the fact that nobody can find fish remains in Norse archeological sites. One scientist sifted through tons of debris from the Vatnahverfi farm and found only three fish bones; another researcher analyzed thirty-five thousand bones from the garbage of another Norse farm and found two fish bones. How can this be? Greenland is a fisherman's dream: Diamond describes running into a Danish tourist in Greenland who had just caught two Arctic char in a shallow pool with her bare hands. "Every archaeologist who comes to excavate in Greenland . . . starts out with his or her own idea about where all those missing fish bones might be hiding," he writes. "Could the Norse have strictly confined their munching on fish to within a few feet of the shoreline, at sites now underwater because of land subsidence? Could they have faithfully saved all their fish bones for fertilizer, fuel, or feeding to cows?" It seems unlikely. There are no fish bones in Norse archeological remains, Diamond concludes, for the simple reason that the Norse didn't eat fish. For one reason or another, they had a cultural taboo against it.
Given the difficulty that the Norse had in putting food on the table, this was insane. Eating fish would have substantially reduced the ecological demands of the Norse settlements. The Norse would have needed fewer livestock and less pastureland. Fishing is not nearly as labor-intensive as raising cattle or hunting caribou, so eating fish would have freed time and energy for other activities. It would have diversified their diet.
Why did the Norse choose not to eat fish? Because they weren't thinking about their biological survival. They were thinking about their cultural survival. Food taboos are one of the idiosyncrasies that define a community. Not eating fish served the same function as building lavish churches, and doggedly replicating the untenable agricultural practices of their land of origin. It was part of what it meant to be Norse, and if you are going to establish a community in a harsh and forbidding environment all those little idiosyncrasies which define and cement a culture are of paramount importance. "The Norse were undone by the same social glue that had enabled them to master Greenland's difficulties," Diamond writes. "The values to which people cling most stubbornly under inappropriate conditions are those values that were previously the source of their greatest triumphs over adversity." He goes on:
To us in our secular modern society, the predicament in which the Greenlanders found themselves is difficult to fathom. To them, however, concerned with their social survival as much as their biological survival, it was out of the question to invest less in churches, to imitate or intermarry with the Inuit, and thereby to face an eternity in Hell just in order to survive another winter on Earth.
Diamond's distinction between social and biological survival is a critical one, because too often we blur the two, or assume that biological survival is contingent on the strength of our civilizational values. That was the lesson taken from the two world wars and the nuclear age that followed: we would survive as a species only if we learned to get along and resolve our disputes peacefully. The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.
Diamond points out that the Easter Islanders did not practice, so far as we know, a uniquely pathological version of South Pacific culture. Other societies, on other islands in the Hawaiian archipelago, chopped down trees and farmed and raised livestock just as the Easter Islanders did. What doomed the Easter Islanders was the interaction between what they did and where they were. Diamond and a colleague, Barry Rollet, identified nine physical factors that contributed to the likelihood of deforestation—including latitude, average rainfall, aerial-ash fallout, proximity to Central Asia's dust plume, size, and so on—and Easter Island ranked at the high-risk end of nearly every variable. "The reason for Easter's unusually severe degree of deforestation isn't that those seemingly nice people really were unusually bad or improvident," he concludes. "Instead, they had the misfortune to be living in one of the most fragile environments, at the highest risk for deforestation, of any Pacific people." The problem wasn't the Easter Islanders. It was Easter Island.
In the second half of "Collapse," Diamond turns his attention to modern examples, and one of his case studies is the recent genocide in Rwanda. What happened in Rwanda is commonly described as an ethnic struggle between the majority Hutu and the historically dominant, wealthier Tutsi, and it is understood in those terms because that is how we have come to explain much of modern conflict: Serb and Croat, Jew and Arab, Muslim and Christian. The world is a cauldron of cultural antagonism. It's an explanation that clearly exasperates Diamond. The Hutu didn't just kill the Tutsi, he points out. The Hutu also killed other Hutu. Why? Look at the land: steep hills farmed right up to the crests, without any protective terracing; rivers thick with mud from erosion; extreme deforestation leading to irregular rainfall and famine; staggeringly high population densities; the exhaustion of the topsoil; falling per-capita food production. This was a society on the brink of ecological disaster, and if there is anything that is clear from the study of such societies it is that they inevitably descend into genocidal chaos. In "Collapse," Diamond quite convincingly defends himself against the charge of environmental determinism. His discussions are always nuanced, and he gives political and ideological factors their due. The real issue is how, in coming to terms with the uncertainties and hostilities of the world, the rest of us have turned ourselves into cultural determinists.
3.
For the past thirty years, Oregon has had one of the strictest sets of land-use regulations in the nation, requiring new development to be clustered in and around existing urban development. The laws meant that Oregon has done perhaps the best job in the nation in limiting suburban sprawl, and protecting coastal lands and estuaries. But this November Oregon's voters passed a ballot referendum, known as Measure 37, that rolled back many of those protections. Specifically, Measure 37 said that anyone who could show that the value of his land was affected by regulations implemented since its purchase was entitled to compensation from the state. If the state declined to pay, the property owner would be exempted from the regulations.
To call Measure 37—and similar referendums that have been passed recently in other states—intellectually incoherent is to put it mildly. It might be that the reason your hundred-acre farm on a pristine hillside is worth millions to a developer is that it's on a pristine hillside: if everyone on that hillside could subdivide, and sell out to Target and Wal-Mart, then nobody's plot would be worth millions anymore. Will the voters of Oregon then pass Measure 38, allowing them to sue the state for compensation over damage to property values caused by Measure 37?
It is hard to read "Collapse," though, and not have an additional reaction to Measure 37. Supporters of the law spoke entirely in the language of political ideology. To them, the measure was a defense of property rights, preventing the state from unconstitutional "takings." If you replaced the term "property rights" with "First Amendment rights," this would have been indistinguishable from an argument over, say, whether charitable groups ought to be able to canvass in malls, or whether cities can control the advertising they sell on the sides of public buses. As a society, we do a very good job with these kinds of debates: we give everyone a hearing, and pass laws, and make compromises, and square our conclusions with our constitutional heritage—and in the Oregon debate the quality of the theoretical argument was impressively high.
The thing that got lost in the debate, however, was the land. In a rapidly growing state like Oregon, what, precisely, are the state's ecological strengths and vulnerabilities? What impact will changed land-use priorities have on water and soil and cropland and forest? One can imagine Diamond writing about the Measure 37 debate, and he wouldn't be very impressed by how seriously Oregonians wrestled with the problem of squaring their land-use rules with their values, because to him a society's environmental birthright is not best discussed in those terms. Rivers and streams and forests and soil are a biological resource. They are a tangible, finite thing, and societies collapse when they get so consumed with addressing the fine points of their history and culture and deeply held beliefs—with making sure that Thorstein Olafsson and Sigrid Bjornsdotter are married before the right number of witnesses following the announcement of wedding banns on the right number of Sundays—that they forget that the pastureland is shrinking and the forest cover is gone.
When archeologists looked through the ruins of the Western Settlement, they found plenty of the big wooden objects that were so valuable in Greenland—crucifixes, bowls, furniture, doors, roof timbers—which meant that the end came too quickly for anyone to do any scavenging. And, when the archeologists looked at the animal bones left in the debris, they found the bones of newborn calves, meaning that the Norse, in that final winter, had given up on the future. They found toe bones from cows, equal to the number of cow spaces in the barn, meaning that the Norse ate their cattle down to the hoofs, and they found the bones of dogs covered with knife marks, meaning that, in the end, they had to eat their pets. But not fish bones, of course. Right up until they starved to death, the Norse never lost sight of what they stood for.
Brain Candy
May 16, 2005
The Critics: Books
Is pop culture dumbing us down or smartening us up?
1.
Twenty years ago, a political philosopher named James Flynn uncovered a curious fact. Americans—at least, as measured by I.Q. tests—were getting smarter. This fact had been obscured for years, because the people who give I.Q. tests continually recalibrate the scoring system to keep the average at 100. But if you took out the recalibration, Flynn found, I.Q. scores showed a steady upward trajectory, rising by about three points per decade, which means that a person whose I.Q. placed him in the top ten per cent of the American population in 1920 would today fall in the bottom third. Some of that effect, no doubt, is a simple by-product of economic progress: in the surge of prosperity during the middle part of the last century, people in the West became better fed, better educated, and more familiar with things like I.Q. tests. But, even as that wave of change has subsided, test scores have continued to rise—not just in America but all over the developed world. What's more, the increases have not been confined to children who go to enriched day-care centers and private schools. The middle part of the curve—the people who have supposedly been suffering from a deteriorating public-school system and a steady diet of lowest-common-denominator television and mindless pop music—has increased just as much. What on earth is happening? In the wonderfully entertaining "Everything Bad Is Good for You" (Riverhead; $23.95), Steven Johnson proposes that what is making us smarter is precisely what we thought was making us dumber: popular culture.
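The arithmetic behind that claim is easy to check. What follows is a minimal sketch of the calculation, not anything from Flynn or Johnson: it assumes I.Q. scores follow a normal distribution with mean 100 and standard deviation 15, treats the three-points-per-decade gain as exact, and takes 1920 to 2005 as the interval.

import statistics  # NormalDist is in the Python standard library (3.8+)

iq = statistics.NormalDist(mu=100, sigma=15)

cutoff_1920 = iq.inv_cdf(0.90)        # top ten per cent in 1920: about 119
flynn_gain = 3 * (2005 - 1920) / 10   # roughly 25 points of raw improvement

# Rescored against today's recalibrated norms, that 1920 performance
# corresponds to roughly 94 points, near the 34th percentile.
rescored = cutoff_1920 - flynn_gain
print(round(rescored), round(iq.cdf(rescored), 2))  # -> 94 0.34

Under those assumptions, the 1920 top-decile scorer lands in the bottom third of today's distribution, which is the comparison the paragraph above makes.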
Johnson is the former editor of the online magazine Feed and the author of a number of books on science and technology. There is a pleasing eclecticism to his thinking. He is as happy analyzing "Finding Nemo" as he is dissecting the intricacies of a piece of software, and he's perfectly capable of using Nietzsche's notion of eternal recurrence to discuss the new creative rules of television shows. Johnson wants to understand popular culture—not in the postmodern, academic sense of wondering what "The Dukes of Hazzard" tells us about Southern male alienation but in the very practical sense of wondering what watching something like "The Dukes of Hazzard" does to the way our minds work.
As Johnson points out, television is very different now from what it was thirty years ago. It's harder. A typical episode of "Starsky and Hutch," in the nineteen-seventies, followed an essentially linear path: two characters, engaged in a single story line, moving toward a decisive conclusion. To watch an episode of "Dallas" today is to be stunned by its glacial pace—by the arduous attempts to establish social relationships, by the excruciating simplicity of the plotline, by how obvious it was. A single episode of "The Sopranos," by contrast, might follow five narrative threads, involving a dozen characters who weave in and out of the plot. Modern television also requires the viewer to do a lot of what Johnson calls "filling in," as in a "Seinfeld" episode that subtly parodies the Kennedy assassination conspiracists, or a typical "Simpsons" episode, which may contain numerous allusions to politics or cinema or pop culture. The extraordinary amount of money now being made in the television aftermarket—DVD sales and syndication—means that the creators of television shows now have an incentive to make programming that can sustain two or three or four viewings. Even reality shows like "Survivor," Johnson argues, engage the viewer in a way that television rarely has in the past:
When we watch these shows, the part of our brain that monitors the emotional lives of the people around us—the part that tracks subtle shifts in intonation and gesture and facial expression—scrutinizes the action on the screen, looking for clues. . . . The phrase "Monday-morning quarterbacking" was coined to describe the engaged feeling spectators have in relation to games as opposed to stories. We absorb stories, but we second-guess games. Reality programming has brought that second-guessing to prime time, only the game in question revolves around social dexterity rather than the physical kind.
How can the greater cognitive demands that television makes on us now, he wonders, not matter?
Johnson develops the same argument about video games. Most of the people who denounce video games, he says, haven't actually played them—at least, not recently. Twenty years ago, games like Tetris or Pac-Man were simple exercises in motor coördination and pattern recognition. Today's games belong to another realm. Johnson points out that one of the "walk-throughs" for "Grand Theft Auto III"—that is, the informal guides that break down the games and help players navigate their complexities—is fifty-three thousand words long, about the length of his book. The contemporary video game involves a fully realized imaginary world, dense with detail and levels of complexity.
Indeed, video games are not games in the sense of those pastimes—like Monopoly or gin rummy or chess—which most of us grew up with. They don't have a set of unambiguous rules that have to be learned and then followed during the course of play. This is why many of us find modern video games baffling: we're not used to being in a situation where we have to figure out what to do. We think we only have to learn how to press the buttons faster. But these games withhold critical information from the player. Players have to explore and sort through hypotheses in order to make sense of the game's environment, which is why a modern video game can take forty hours to complete. Far from being engines of instant gratification, as they are often described, video games are actually, Johnson writes, "all about delayed gratification—sometimes so long delayed that you wonder if the gratification is ever going to show."
At the same time, players are required to manage a dizzying array of information and options. The game presents the player with a series of puzzles, and you can't succeed at the game simply by solving the puzzles one at a time. You have to craft a longer-term strategy, in order to juggle and coördinate competing interests. In denigrating the video game, Johnson argues, we have confused it with other phenomena in teen-age life, like multitasking—simultaneously e-mailing and listening to music and talking on the telephone and surfing the Internet. Playing a video game is, in fact, an exercise in "constructing the proper hierarchy of tasks and moving through the tasks in the correct sequence," he writes. "It's about finding order and meaning in the world, and making decisions that help create that order."
2.
It doesn't seem right, of course, that watching "24" or playing a video game could be as important cognitively as reading a book. Isn't the extraordinary success of the "Harry Potter" novels better news for the culture than the equivalent success of "Grand Theft Auto III"? Johnson's response is to imagine what cultural critics might have said had video games been invented hundreds of years ago, and only recently had something called the book been marketed aggressively to children:
Reading books chronically understimulates the senses. Unlike the longstanding tradition of gameplaying—which engages the child in a vivid, three-dimensional world filled with moving images and musical sound-scapes, navigated and controlled with complex muscular movements—books are simply a barren string of words on the page. . . .
Books are also tragically isolating. While games have for many years engaged the young in complex social relationships with their peers, building and exploring worlds together, books force the child to sequester him or herself in a quiet space, shut off from interaction with other children. . . .
But perhaps the most dangerous property of these books is the fact that they follow a fixed linear path. You can't control their narratives in any fashion—you simply sit back and have the story dictated to you. . . . This risks instilling a general passivity in our children, making them feel as though they're powerless to change their circumstances. Reading is not an active, participatory process; it's a submissive one.
He's joking, of course, but only in part. The point is that books and video games represent two very different kinds of learning. When you read a biology textbook, the content of what you read is what matters. Reading is a form of explicit learning. When you play a video game, the value is in how it makes you think. Video games are an example of collateral learning, which is no less important.
Being "smart" involves facility in both kinds of thinking—the kind of fluid problem solving that matters in things like video games and I.Q. tests, but also the kind of crystallized knowledge that comes from explicit learning. If Johnson's book has a flaw, it is that he sometimes speaks of our culture being "smarter" when he's really referring just to that fluid problem-solving facility. When it comes to the other kind of intelligence, it is not clear at all what kind of progress we are making, as anyone who has read, say, the Gettysburg Address alongside any Presidential speech from the past twenty years can attest. The real question is what the right balance of these two forms of intelligence might look like. "Everything Bad Is Good for You" doesn't answer that question. But Johnson does something nearly as important, which is to remind us that we shouldn't fall into the trap of thinking that explicit learning is the only kind of learning that matters.
In recent years, for example, a number of elementary schools have phased out or reduced recess and replaced it with extra math or English instruction. This is the triumph of the explicit over the collateral. After all, recess is "play" for a ten-year-old in precisely the sense that Johnson describes video games as play for an adolescent: an unstructured environment that requires the child actively to intervene, to look for the hidden logic, to find order and meaning in chaos.
One of the ongoing debates in the educational community, similarly, is over the value of homework. Meta-analysis of hundreds of studies done on the effects of homework shows that the evidence supporting the practice is, at best, modest. Homework seems to be most useful in high school and for subjects like math. At the elementary-school level, homework seems to be of marginal or no academic value. Its effect on discipline and personal responsibility is unproved. And the causal relation between high-school homework and achievement is unclear: it hasn't been firmly established whether spending more time on homework in high school makes you a better student or whether better students, finding homework more pleasurable, spend more time doing it. So why, as a society, are we so enamored of homework? Perhaps because we have so little faith in the value of the things that children would otherwise be doing with their time. They could go out for a walk, and get some exercise; they could spend time with their peers, and reap the rewards of friendship. Or, Johnson suggests, they could be playing a video game, and giving their minds a rigorous workout.
The Moral Hazard Myth
August 29, 2005
Dept. of Public Policy
The bad idea behind our failed health-care system.
1.
Tooth decay begins, typically, when debris becomes trapped between the teeth and along the ridges and in the grooves of the molars. The food rots. It becomes colonized with bacteria. The bacteria feeds off sugars in the mouth and forms an acid that begins to eat away at the enamel of the teeth. Slowly, the bacteria works its way through to the dentin, the inner structure, and from there the cavity begins to blossom three-dimensionally, spreading inward and sideways. When the decay reaches the pulp tissue, the blood vessels, and the nerves that serve the tooth, the pain starts—an insistent throbbing. The tooth turns brown. It begins to lose its hard structure, to the point where a dentist can reach into a cavity with a hand instrument and scoop out the decay. At the base of the tooth, the bacteria mineralizes into tartar, which begins to irritate the gums. They become puffy and bright red and start to recede, leaving more and more of the tooth's root exposed. When the infection works its way down to the bone, the structure holding the tooth in begins to collapse altogether.
Several years ago, two Harvard researchers, Susan Starr Sered and Rushika Fernandopulle, set out to interview people without health-care coverage for a book they were writing, "Uninsured in America." They talked to as many kinds of people as they could find, collecting stories of untreated depression and struggling single mothers and chronically injured laborers—and the most common complaint they heard was about teeth. Gina, a hairdresser in Idaho, whose husband worked as a freight manager at a chain store, had "a peculiar mannerism of keeping her mouth closed even when speaking." It turned out that she hadn't been able to afford dental care for three years, and one of her front teeth was rotting. Daniel, a construction worker, pulled out his bad teeth with pliers. Then, there was Loretta, who worked nights at a university research center in Mississippi, and was missing most of her teeth. "They'll break off after a while, and then you just grab a hold of them, and they work their way out," she explained to Sered and Fernandopulle. "It hurts so bad, because the tooth aches. Then it's a relief just to get it out of there. The hole closes up itself anyway. So it's so much better."
People without health insurance have bad teeth because, if you're paying for everything out of your own pocket, going to the dentist for a checkup seems like a luxury. It isn't, of course. The loss of teeth makes eating fresh fruits and vegetables difficult, and a diet heavy in soft, processed foods exacerbates more serious health problems, like diabetes. The pain of tooth decay leads many people to use alcohol as a salve. And those struggling to get ahead in the job market quickly find that the unsightliness of bad teeth, and the self-consciousness that results, can become a major barrier. If your teeth are bad, you're not going to get a job as a receptionist, say, or a cashier. You're going to be put in the back somewhere, far from the public eye. What Loretta, Gina, and Daniel understand, the two authors tell us, is that bad teeth have come to be seen as a marker of "poor parenting, low educational achievement and slow or faulty intellectual development." They are an outward marker of caste. "Almost every time we asked interviewees what their first priority would be if the president established universal health coverage tomorrow," Sered and Fernandopulle write, "the immediate answer was 'my teeth.' "
The U. S. health-care system, according to "Uninsured in America," has created a group of people who increasingly look different from others and suffer in ways that others do not. The leading cause of personal bankruptcy in the United States is unpaid medical bills. Half of the uninsured owe money to hospitals, and a third are being pursued by collection agencies. Children without health insurance are less likely to receive medical attention for serious injuries, for recurrent ear infections, or for asthma. Lung-cancer patients without insurance are less likely to receive surgery, chemotherapy, or radiation treatment. Heart-attack victims without health insurance are less likely to receive angioplasty. People with pneumonia who don't have health insurance are less likely to receive X rays or consultations. The death rate in any given year for someone without health insurance is twenty-five per cent higher than for someone with insurance. Because the uninsured are sicker than the rest of us, they can't get better jobs, and because they can't get better jobs they can't afford health insurance, and because they can't afford health insurance they get even sicker. John, the manager of a bar in Idaho, tells Sered and Fernandopulle that as a result of various workplace injuries over the years he takes eight ibuprofen, waits two hours, then takes eight more—and tries to cadge as much prescription pain medication as he can from friends. "There are times when I should've gone to the doctor, but I couldn't afford to go because I don't have insurance," he says. "Like when my back messed up, I should've gone. If I had insurance, I would've went, because I know I could get treatment, but when you can't afford it you don't go. Because the harder the hole you get into in terms of bills, then you'll never get out. So you just say, 'I can deal with the pain.' "
2.
One of the great mysteries of political life in the United States is why Americans are so devoted to their health-care system. Six times in the past century—during the First World War, during the Depression, during the Truman and Johnson Administrations, in the Senate in the nineteen-seventies, and during the Clinton years—efforts have been made to introduce some kind of universal health insurance, and each time the efforts have been rejected. Instead, the United States has opted for a makeshift system of increasing complexity and dysfunction. Americans spend $5,267 per capita on health care every year, almost two and a half times the industrialized world's median of $2,193; the extra spending comes to hundreds of billions of dollars a year. What does that extra spending buy us? Americans have fewer doctors per capita than most Western countries. We go to the doctor less than people in other Western countries. We get admitted to the hospital less frequently than people in other Western countries. We are less satisfied with our health care than our counterparts in other countries. American life expectancy is lower than the Western average. Childhood-immunization rates in the United States are lower than average. Infant-mortality rates are in the nineteenth percentile of industrialized nations. Doctors here perform more high-end medical procedures, such as coronary angioplasties, than in other countries, but most of the wealthier Western countries have more CT scanners than the United States does, and Switzerland, Japan, Austria, and Finland all have more MRI machines per capita. Nor is our system more efficient. The United States spends more than a thousand dollars per capita per year—or close to four hundred billion dollars—on health-care-related paperwork and administration, whereas Canada, for example, spends only about three hundred dollars per capita. And, of course, every other country in the industrialized world insures all its citizens; despite those extra hundreds of billions of dollars we spend each year, we leave forty-five million people without any insurance. A country that displays an almost ruthless commitment to efficiency and performance in every aspect of its economy—a country that switched to Japanese cars the moment they were more reliable, and to Chinese T-shirts the moment they were five cents cheaper—has loyally stuck with a health-care system that leaves its citizenry pulling out their teeth with pliers.
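The per-capita figures above imply the size of the aggregate gap directly. Here is a back-of-the-envelope sketch using the two quoted numbers and an assumed mid-2000s U.S. population of roughly 295 million; the population figure is my assumption, not the article's.

us_per_capita = 5267       # dollars per person per year, as quoted
oecd_median = 2193         # industrialized-world median, as quoted
population = 295_000_000   # assumed U.S. population, mid-2000s

ratio = us_per_capita / oecd_median
excess = (us_per_capita - oecd_median) * population
print(f"{ratio:.1f}x the median; excess of about ${excess / 1e9:.0f} billion a year")

That works out to roughly 2.4 times the median and an annual excess on the order of nine hundred billion dollars, consistent with "hundreds of billions."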
America's health-care mess is, in part, simply an accident of history. The fact that there have been six attempts at universal health coverage in the last century suggests that there has long been support for the idea. But politics has always got in the way. In both Europe and the United States, for example, the push for health insurance was led, in large part, by organized labor. But in Europe the unions worked through the political system, fighting for coverage for all citizens. From the start, health insurance in Europe was public and universal, and that created powerful political support for any attempt to expand benefits. In the United States, by contrast, the unions worked through the collective-bargaining system and, as a result, could win health benefits only for their own members. Health insurance here has always been private and selective, and every attempt to expand benefits has resulted in a paralyzing political battle over who would be added to insurance rolls and who ought to pay for those additions.
Policy is driven by more than politics, however. It is equally driven by ideas, and in the past few decades a particular idea has taken hold among prominent American economists which has also been a powerful impediment to the expansion of health insurance. The idea is known as "moral hazard." Health economists in other Western nations do not share this obsession. Nor do most Americans. But moral hazard has profoundly shaped the way think tanks formulate policy and the way experts argue and the way health insurers structure their plans and the way legislation and regulations have been written. The health-care mess isn't merely the unintentional result of political dysfunction, in other words. It is also the deliberate consequence of the way in which American policymakers have come to think about insurance.
"Moral hazard" is the term economists use to describe the fact that insurance can change the behavior of the person being insured. If your office gives you and your co-workers all the free Pepsi you want—if your employer, in effect, offers universal Pepsi insurance—you'll drink more Pepsi than you would have otherwise. If you have a no-deductible fire-insurance policy, you may be a little less diligent in clearing the brush away from your house. The savings-and-loan crisis of the nineteen-eighties was created, in large part, by the fact that the federal government insured savings deposits of up to a hundred thousand dollars, and so the newly deregulated S. & L.s made far riskier investments than they would have otherwise. Insurance can have the paradoxical effect of producing risky and wasteful behavior. Economists spend a great deal of time thinking about such moral hazard for good reason. Insurance is an attempt to make human life safer and more secure. But, if those efforts can backfire and produce riskier behavior, providing insurance becomes a much more complicated and problematic endeavor.
In 1968, the economist Mark Pauly argued that moral hazard played an enormous role in medicine, and, as John Nyman writes in his book "The Theory of the Demand for Health Insurance," Pauly's paper has become the "single most influential article in the health economics literature." Nyman, an economist at the University of Minnesota, says that the fear of moral hazard lies behind the thicket of co-payments and deductibles and utilization reviews which characterizes the American health-insurance system. Fear of moral hazard, Nyman writes, also explains "the general lack of enthusiasm by U.S. health economists for the expansion of health insurance coverage (for example, national health insurance or expanded Medicare benefits) in the U.S."
What Nyman is saying is that when your insurance company requires that you make a twenty-dollar co-payment for a visit to the doctor, or when your plan includes an annual five-hundred-dollar or thousand-dollar deductible, it's not simply an attempt to get you to pick up a larger share of your health costs. It is an attempt to make your use of the health-care system more efficient. Making you responsible for a share of the costs, the argument runs, will reduce moral hazard: you'll no longer grab one of those free Pepsis when you aren't really thirsty. That's also why Nyman says that the notion of moral hazard is behind the "lack of enthusiasm" for expansion of health insurance. If you think of insurance as producing wasteful consumption of medical services, then the fact that there are forty-five million Americans without health insurance is no longer an immediate cause for alarm. After all, it's not as if the uninsured never go to the doctor. They spend, on average, $934 a year on medical care. A moral-hazard theorist would say that they go to the doctor when they really have to. Those of us with private insurance, by contrast, consume $2,347 worth of health care a year. If a lot of that extra $1,413 is waste, then maybe the uninsured person is the truly efficient consumer of health care.
The moral-hazard argument makes sense, however, only if we consume health care in the same way that we consume other consumer goods, and to economists like Nyman this assumption is plainly absurd. We go to the doctor grudgingly, only because we're sick. "Moral hazard is overblown," the Princeton economist Uwe Reinhardt says. "You always hear that the demand for health care is unlimited. This is just not true. People who are very well insured, who are very rich, do you see them check into the hospital because it's free? Do people really like to go to the doctor? Do they check into the hospital instead of playing golf?"
For that matter, when you have to pay for your own health care, does your consumption really become more efficient? In the late nineteen-seventies, the RAND Corporation did an extensive study on the question, randomly assigning families to health plans with co-payment levels at zero per cent, twenty-five per cent, fifty per cent, or ninety-five per cent, up to six thousand dollars. As you might expect, the more that people were asked to chip in for their health care the less care they used. The problem was that they cut back equally on both frivolous care and useful care. Poor people in the high-deductible group with hypertension, for instance, didn't do nearly as good a job of controlling their blood pressure as those in other groups, resulting in a ten-per-cent increase in the likelihood of death. As a recent Commonwealth Fund study concluded, cost sharing is "a blunt instrument." Of course it is: how should the average consumer be expected to know beforehand what care is frivolous and what care is useful? I just went to the dermatologist to get moles checked for skin cancer. If I had had to pay a hundred per cent, or even fifty per cent, of the cost of the visit, I might not have gone. Would that have been a wise decision? I have no idea. But if one of those moles really is cancerous, that simple, inexpensive visit could save the health-care system tens of thousands of dollars (not to mention saving me a great deal of heartbreak). The focus on moral hazard suggests that the changes we make in our behavior when we have insurance are nearly always wasteful. Yet, when it comes to health care, many of the things we do only because we have insurance—like getting our moles checked, or getting our teeth cleaned regularly, or getting a mammogram or engaging in other routine preventive care—are anything but wasteful and inefficient. In fact, they are behaviors that could end up saving the health-care system a good deal of money.
Sered and Fernandopulle tell the story of Steve, a factory worker from northern Idaho, with a "grotesque-looking left hand—what looks like a bone sticks out the side." When he was younger, he broke his hand. "The doctor wanted to operate on it," he recalls. "And because I didn't have insurance, well, I was like 'I ain't gonna have it operated on.' The doctor said, 'Well, I can wrap it for you with an Ace bandage.' I said, 'Ahh, let's do that, then.' " Steve uses less health care than he would if he had insurance, but that's not because he has defeated the scourge of moral hazard. It's because instead of getting a broken bone fixed he put a bandage on it.
3.
At the center of the Bush Administration's plan to address the health-insurance mess are Health Savings Accounts, and Health Savings Accounts are exactly what you would come up with if you were concerned, above all else, with minimizing moral hazard. The logic behind them was laid out in the 2004 Economic Report of the President. Americans, the report argues, have too much health insurance: typical plans cover things that they shouldn't, creating the problem of overconsumption. Several paragraphs are then devoted to explaining the theory of moral hazard. The report turns to the subject of the uninsured, concluding that they fall into several groups. Some are foreigners who may be covered by their countries of origin. Some are people who could be covered by Medicaid but aren't, or aren't admitting that they are. Finally, a large number "remain uninsured as a matter of choice." The report continues, "Researchers believe that as many as one-quarter of those without health insurance had coverage available through an employer but declined the coverage.... Still others may remain uninsured because they are young and healthy and do not see the need for insurance." In other words, those with health insurance are overinsured and their behavior is distorted by moral hazard. Those without health insurance use their own money to make decisions about insurance based on an assessment of their needs. The insured are wasteful. The uninsured are prudent. So what's the solution? Make the insured a little bit more like the uninsured.
Under the Health Savings Accounts system, consumers are asked to pay for routine health care with their own money—several thousand dollars of which can be put into a tax-free account. To handle their catastrophic expenses, they then purchase a basic health-insurance package with, say, a thousand-dollar annual deductible. As President Bush explained recently, "Health Savings Accounts all aim at empowering people to make decisions for themselves, owning their own health-care plan, and at the same time bringing some demand control into the cost of health care."
The country described in the President's report is a very different place from the country described in "Uninsured in America." Sered and Fernandopulle look at the billions we spend on medical care and wonder why Americans have so little insurance. The President's report considers the same situation and worries that we have too much. Sered and Fernandopulle see the lack of insurance as a problem of poverty; a third of the uninsured, after all, have incomes below the federal poverty line. In the section on the uninsured in the President's report, the word "poverty" is never used. In the Administration's view, people are offered insurance but "decline the coverage" as "a matter of choice." The uninsured in Sered and Fernandopulle's book decline coverage, but only because they can't afford it. Gina, for instance, works for a beauty salon that offers her a bare-bones health-insurance plan with a thousand-dollar deductible for two hundred dollars a month. What's her total income? Nine hundred dollars a month. She could "choose" to accept health insurance, but only if she chose to stop buying food or paying the rent.
The biggest difference between the two accounts, though, has to do with how each views the function of insurance. Gina, Steve, and Loretta are ill, and need insurance to cover the costs of getting better. In their eyes, insurance is meant to help equalize financial risk between the healthy and the sick. In the insurance business, this model of coverage is known as "social insurance," and historically it was the way health coverage was conceived. If you were sixty and had heart disease and diabetes, you didn't pay substantially more for coverage than a perfectly healthy twenty-five-year-old. Under social insurance, the twenty-five-year-old agrees to pay thousands of dollars in premiums even though he didn't go to the doctor at all in the previous year, because he wants to make sure that someone else will subsidize his health care if he ever comes down with heart disease or diabetes. Canada and Germany and Japan and all the other industrialized nations with universal health care follow the social-insurance model. Medicare, too, is based on the social-insurance model, and, when Americans with Medicare report themselves to be happier with virtually every aspect of their insurance coverage than people with private insurance (as they do, repeatedly and overwhelmingly), they are referring to the social aspect of their insurance. They aren't getting better care. But they are getting something just as valuable: the security of being insulated against the financial shock of serious illness.
There is another way to organize insurance, however, and that is to make it actuarial. Car insurance, for instance, is actuarial. How much you pay is in large part a function of your individual situation and history: someone who drives a sports car and has received twenty speeding tickets in the past two years pays a much higher annual premium than a soccer mom with a minivan. In recent years, the private insurance industry in the United States has been moving toward the actuarial model, with profound consequences. The triumph of the actuarial model over the social-insurance model is the reason that companies unlucky enough to employ older, high-cost employees—like United Airlines—have run into such financial difficulty. It's the reason that automakers are increasingly moving their operations to Canada. It's the reason that small businesses that have one or two employees with serious illnesses suddenly face unmanageably high health-insurance premiums, and it's the reason that, in many states, people suffering from a potentially high-cost medical condition can't get anyone to insure them at all.
Health Savings Accounts represent the final, irrevocable step in the actuarial direction. If you are preoccupied with moral hazard, then you want people to pay for care with their own money, and, when you do that, the sick inevitably end up paying more than the healthy. And when you make people choose an insurance plan that fits their individual needs, those with significant medical problems will choose expensive health plans that cover lots of things, while those with few health problems will choose cheaper, bare-bones plans. The more expensive the comprehensive plans become, and the less expensive the bare-bones plans become, the more the very sick will cluster together at one end of the insurance spectrum, and the more the well will cluster together at the low-cost end. The days when the healthy twenty-five-year-old subsidizes the sixty-year-old with heart disease or diabetes are coming to an end. "The main effect of putting more of it on the consumer is to reduce the social redistributive element of insurance," the Stanford economist Victor Fuchs says. Health Savings Accounts are not a variant of universal health care. In their governing assumptions, they are the antithesis of universal health care.
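A toy illustration may make the contrast concrete. The numbers below are invented for the sketch, not drawn from Gladwell, Fuchs, or any insurer: a small pool of mostly healthy people and a few sick ones, priced first the social-insurance way and then the actuarial way.

healthy, sick = 95, 5                     # people in the pool (assumed)
cost_healthy, cost_sick = 1_000, 40_000   # expected annual costs (assumed)

total = healthy * cost_healthy + sick * cost_sick

# Social insurance: one community-rated premium for everyone.
social_premium = total / (healthy + sick)

# Actuarial insurance: each premium tracks individual expected cost.
print(f"social premium, everyone: ${social_premium:,.0f}")   # $2,950
print(f"actuarial premium, healthy: ${cost_healthy:,}")      # $1,000
print(f"actuarial premium, sick: ${cost_sick:,}")            # $40,000

Under the social model the healthy subsidize the sick by a couple of thousand dollars a year; under the actuarial model that subsidy disappears and the sick carry their full expected cost themselves, which is the redistribution the next paragraph asks the reader to weigh.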
The issue about what to do with the health-care system is sometimes presented as a technical argument about the merits of one kind of coverage over another or as an ideological argument about socialized versus private medicine. It is, instead, about a few very simple questions. Do you think that this kind of redistribution of risk is a good idea? Do you think that people whose genes predispose them to depression or cancer, or whose poverty complicates asthma or diabetes, or who get hit by a drunk driver, or who have to keep their mouths closed because their teeth are rotting ought to bear a greater share of the costs of their health care than those of us who are lucky enough to escape such misfortunes? In the rest of the industrialized world, it is assumed that the more equally and widely the burdens of illness are shared, the better off the population as a whole is likely to be. The reason the United States has forty-five million people without coverage is that its health-care policy is in the hands of people who disagree, and who regard health insurance not as the solution but as the problem.
The Bakeoff
September 5, 2005
Annals of Technology
Project Delta aims to create the perfect cookie.
1.
Steve Gundrum launched Project Delta at a small dinner last fall at Il Fornaio, in Burlingame, just down the road from the San Francisco Airport. It wasn't the first time he'd been to Il Fornaio, and he made his selection quickly, with just a glance at the menu; he is the sort of person who might have thought about his choice in advance — maybe even that morning, while shaving. He would have posed it to himself as a question — Ravioli alla Lucana?—and turned it over in his mind, assembling and disassembling the dish, ingredient by ingredient, as if it were a model airplane. Did the Pecorino pepato really belong? What if you dropped the basil? What would the ravioli taste like if you froze it, along with the ricotta and the Parmesan, and tried to sell it in the supermarket? And then what would you do about the fennel?
Gundrum is short and round. He has dark hair and a mustache and speaks with the flattened vowels of the upper Midwest. He is voluble and excitable and doggedly unpretentious, to the point that your best chance of seeing him in a suit is probably Halloween. He runs Mattson, one of the country's foremost food research-and-development firms, which is situated in a low-slung concrete-and-glass building in a nondescript office park in Silicon Valley. Gundrum's office is a spare, windowless room near the rear, and all day long white-coated technicians come to him with prototypes in little bowls, or on skewers, or in Tupperware containers. His job is to taste and advise, and the most common words out of his mouth are "I have an idea." Just that afternoon, Gundrum had ruled on the reformulation of a popular spinach dip (which had an unfortunate tendency to smell like lawn clippings) and examined the latest iteration of a low-carb kettle corn for evidence of rhythmic munching (the metronomic hand-to-mouth cycle that lies at the heart of any successful snack experience). Mattson created the shelf-stable Mrs. Fields Chocolate Chip Cookie, the new Boca Burger products for Kraft Foods, Orville Redenbacher's Butter Toffee Popcorn Clusters, and so many other products that it is impossible to walk down the aisle of a supermarket and not be surrounded by evidence of the company's handiwork.
That evening, Gundrum had invited two of his senior colleagues at Mattson — Samson Hsia and Carol Borba — to dinner, along with Steven Addis, who runs a prominent branding firm in the Bay Area. They sat around an oblong table off to one side of the dining room, with the sun streaming in the window, and Gundrum informed them that he intended to reinvent the cookie, to make something both nutritious and as "indulgent" as the premium cookies on the supermarket shelf. "We want to delight people," he said. "We don't want some ultra-high-nutrition power bar, where you have to rationalize your consumption." He said it again: "We want to delight people."
As everyone at the table knew, a healthful, good-tasting cookie is something of a contradiction. A cookie represents the combination of three unhealthful ingredients—sugar, white flour, and shortening. The sugar adds sweetness, bulk, and texture: along with baking powder, it produces the tiny cell structures that make baked goods light and fluffy. The fat helps carry the flavor. If you want a big hit of vanilla, or that chocolate taste that really blooms in the nasal cavities, you need fat. It also keeps the strands of gluten in the flour from getting too tightly bound together, so that the cookie stays chewable. The flour, of course, gives the batter its structure, and, with the sugar, provides the base for the browning reaction that occurs during baking. You could replace the standard white flour with wheat flour, which is higher in fibre, but fibre adds grittiness. Over the years, there have been many attempts to resolve these contradictions — from Snackwells and diet Oreos to the dry, grainy hockey pucks that pass for cookies in health-food stores — but in every case flavor or fluffiness or tenderness has been compromised. Steve Gundrum was undeterred. He told his colleagues that he wanted Project Delta to create the world's greatest cookie. He wanted to do it in six months. He wanted to enlist the biggest players in the American food industry. And how would he come up with this wonder cookie? The old-fashioned way. He wanted to hold a bakeoff.
2.
The standard protocol for inventing something in the food industry is called the matrix model. There is a department for product development, which comes up with a new idea, and a department for process development, which figures out how to realize it, and then, down the line, departments for packing, quality assurance, regulatory affairs, chemistry, microbiology, and so on. In a conventional bakeoff, Gundrum would have pitted three identical matrixes against one another and compared the results. But he wasn't satisfied with the unexamined assumption behind the conventional bakeoff — that there was just one way of inventing something new.
Gundrum had a particular interest, as it happened, in software. He had read widely about it, and once, when he ran into Steve Jobs at an Apple store in the Valley, chatted with him for forty-five minutes on technical matters relating to the Apple operating system. He saw little difference between what he did for a living and what the software engineers in the surrounding hills of Silicon Valley did. "Lines of code are no different from a recipe," he explains. "It's the same thing. You add a little salt, and it tastes better. You write a little piece of code, and it makes the software work faster." But in the software world, Gundrum knew, there were ongoing debates about the best way to come up with new code.
On the one hand, there was the "open source" movement. Its patron saint was Linus Torvalds, the Finnish hacker who decided to build a free version of Unix, the hugely complicated operating system that runs many of the world's large computers. Torvalds created the basic implementation of his version, which he called Linux, posted it online, and invited people to contribute to its development. Over the years, thousands of programmers had helped, and Linux was now considered as good as proprietary versions of Unix. "Given enough eyeballs all bugs are shallow" was the Linux mantra: a thousand people working for an hour each can do a better job writing and fixing code than a single person working for a thousand hours, because the chances are that among those thousand people you can find precisely the right expert for every problem that comes up.
On the other hand, there was the "extreme programming" movement, known as XP, which was led by a legendary programmer named Kent Beck. He called for breaking a problem into the smallest possible increments, and proceeding as simply and modestly as possible. He thought that programmers should work in pairs, two to a computer, passing the keyboard back and forth. Between Beck and Torvalds were countless other people, arguing for slightly different variations. But everyone in the software world agreed that trying to get people to be as creative as possible was, as often as not, a social problem: it depended not just on who was on the team but on how the team was organized.
"I remember once I was working with a printing company in Chicago," Beck says. "The people there were having a terrible problem with their technology. I got there, and I saw that the senior people had these corner offices, and they were working separately and doing things separately that they had trouble integrating later on. So I said, 'Find a space where you can work together.' So they found a corner of the machine room. It was a raised floor, ice cold. They just loved it. They would go there five hours a day, making lots of progress. I flew home. They hired me for my technical expertise. And I told them to rearrange the office furniture, and that was the most valuable thing I could offer them."
It seemed to Gundrum that people in the food world had a great deal to learn from all this. They had become adept at solving what he called "science projects" — problems that required straightforward, linear applications of expensive German machinery and armies of white-coated people with advanced degrees in engineering. Cool Whip was a good example: a product processed so exquisitely — with air bubbles of such fantastic uniformity and stability — that it remains structurally sound for months, at high elevation and at low elevation, frozen and thawed and then refrozen. But coming up with a healthy cookie, which required finessing the inherent contradictions posed by sugar, flour, and shortening, was the kind of problem that the food industry had more trouble with. Gundrum recalled one brainstorming session that a client of his, a major food company, had convened. "This is no joke," he said. "They played a tape where it sounded like the wind was blowing and the birds were chirping. And they posed us out on a dance floor, and we had to hold our arms out like we were trees and close our eyes, and the ideas were supposed to grow like fruits off the limbs of the trees. Next to me was the head of R. & D., and he looked at me and said: 'What the hell are we doing here?'"
For Project Delta, Gundrum decreed that there would be three teams, each representing a different methodology of invention. He had read Kent Beck's writings, and decided that the first would be the XP team. He enlisted two of Mattson's brightest young associates — Peter Dea and Dan Howell. Dea is a food scientist, who worked as a confectioner before coming to Mattson. He is tall and spare, with short dark hair. "Peter is really good at hitting the high note," Gundrum said. "If a product needs to have a particular flavor profile, he's really good at getting that one dimension and getting it right." Howell is a culinarian, goateed and talkative, a man of enthusiasms who uses high-end Mattson equipment to make an exceptional cup of espresso every afternoon. He started his career as a barista at Starbucks, and then realized that his vocation lay elsewhere. "A customer said to me, 'What do you want to be doing? Because you clearly don't want to be here,'" Howell said. "I told him, 'I want to be sitting in a room working on a better non-fat pudding.' "
The second team was headed by Barb Stuckey, an executive vice-president of marketing at Mattson and one of the firm's stars. She is slender and sleek, with short blond hair. She tends to think out loud, and, because she thinks quickly, she ends up talking quickly, too, in nervous, brilliant bursts. Stuckey, Gundrum decided, would represent "managed" research and development—a traditional hierarchical team, as opposed to a partnership like Dea and Howell's. She would work with Doug Berg, who runs one of Mattson's product-development teams. Stuckey would draw the big picture. Berg would serve as sounding board and project director. His team would execute their conceptions.
Then Gundrum was at a technology conference in California and heard the software pioneer Mitch Kapor talking about the open-source revolution. Afterward, Gundrum approached Kapor. "I said to Mitch, 'What do you think? Can I apply this—some of the same principles—outside of software and bring it to the food industry?'" Gundrum recounted. "He stopped and said, 'Why the hell not!'" So Gundrum invited an élite group of food-industry bakers and scientists to collaborate online. They would be the third team. He signed up a senior person from Mars, Inc., someone from R. & D. at Kraft, the marketing manager for Nestlé Toll House refrigerated/frozen cookie dough, a senior director of R. & D. at Birds Eye Foods, the head of the innovation program for Kellogg's Morning Foods, the director of seasoning at McCormick, a cookie maven formerly at Keebler, and six more high-level specialists. Mattson's innovation manager, Carol Borba, who began her career as a line cook at Bouley, in Manhattan, was given the role of project manager. Two Mattson staffers were assigned to carry out the group's recommendations. This was the Dream Team. It is quite possible that this was the most talented group of people ever to work together in the history of the food industry.
Soon after the launch of Project Delta, Steve Gundrum and his colleague Samson Hsia were standing around, talking about the current products in the supermarket that they particularly admired. "I like the Uncrustables line from Smucker's," Hsia said. "It's a frozen sandwich without any crust. It eats very well. You can put it in a lunchbox frozen, and it will be unfrozen by lunchtime." Hsia is a trim, silver-haired man who is said to know as much about emulsions as anyone in the business. "There's something else," he said, suddenly. "We just saw it last week. It's made by Jennie-O. It's turkey in a bag." This was a turkey that was seasoned, plumped with brine, and sold in a heat-resistant plastic bag: the customer simply has to place it in the oven. Hsia began to stride toward the Mattson kitchens, because he realized they actually had a Jennie-O turkey in the back. Gundrum followed, the two men weaving their way through the maze of corridors that make up the Mattson offices. They came to a large freezer. Gundrum pulled out a bright-colored bag. Inside was a second, clear bag, and inside that bag was a twelve-pound turkey. "This is one of my favorite innovations of the last year," Gundrum said, as Hsia nodded happily. "There is material science involved. There is food science involved. There is positioning involved. You can take this thing, throw it in your oven, and people will be blown away. It's that good. If I was Butterball, I'd be terrified."
Jennie-O had taken something old and made it new. But where had that idea come from? Was it a team? A committee? A lone turkey genius? Those of us whose only interaction with such innovations is at the point of sale have a naïve faith in human creativity; we suppose that a world capable of coming up with turkey in a bag is capable of coming up with the next big thing as well—a healthy cookie, a faster computer chip, an automobile engine that gets a hundred miles to the gallon. But if you're the one responsible for those bright new ideas there is no such certainty. You come up with one great idea, and the process is so miraculous that all you do is puzzle over how on earth you ever did it, and worry whether you'll ever be able to do it again.
3.
The Mattson kitchens are a series of large, connecting rooms, running along the back of the building. There is a pilot plant in one corner — containing a mini version of the equipment that, say, Heinz would use to make canned soup, a soft-serve ice-cream machine, an industrial-strength pasta-maker, a colloid mill for making oil-and-water emulsions, a flash pasteurizer, and an eighty-five-thousand-dollar Japanese-made coextruder for, among other things, pastry-and-filling combinations. At any given time, the firm may have as many as fifty or sixty projects under way, so the kitchens are a hive of activity, with pressure cookers filled with baked beans bubbling in one corner, and someone rushing from one room to another carrying a tray of pizza slices with experimental toppings.
Dea and Howell, the XP team, took over part of one of the kitchens, setting up at a long stainless-steel lab bench. The countertop was crowded with tins of flour, a big white plastic container of wheat dextrin, a dozen bottles of liquid sweeteners, two plastic bottles of Kirkland olive oil, and, somewhat puzzlingly, three varieties of single-malt Scotch. The Project Delta brief was simple. All cookies had to have fewer than a hundred and thirty calories per serving. Carbohydrates had to be under 17.5 grams, saturated fat under two grams, fibre more than one gram, protein more than two grams, and so on; in other words, the cookie was to be at least fifteen per cent superior to the supermarket average in the major nutritional categories. To Dea and Howell, that suggested oatmeal, and crispy, as opposed to soft. "I've tried lots of cookies that are sold as soft and I never like them, because they're trying to be something that they're not," Dea explained. "A soft cookie is a fresh cookie, and what you are trying to do with soft is be a fresh cookie that's a month old. And that means you need to fake the freshness, to engineer the cookie."
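For readers who want the brief reduced to its bare logic, here is a minimal sketch, not from the article, of the Project Delta spec treated as a simple pass/fail check in Python. The per-serving limits are the ones stated above; the nutrient labels, the helper name meets_brief, and the prototype numbers are hypothetical, invented purely for illustration.

# A minimal sketch of the Project Delta brief as a pass/fail check.
# Limits come from the brief above; the prototype values are made up.
DELTA_BRIEF = {
    "calories":  ("max", 130.0),   # fewer than 130 calories per serving
    "carbs_g":   ("max", 17.5),    # carbohydrates under 17.5 g
    "sat_fat_g": ("max", 2.0),     # saturated fat under 2 g
    "fibre_g":   ("min", 1.0),     # fibre more than 1 g
    "protein_g": ("min", 2.0),     # protein more than 2 g
}

def meets_brief(per_serving):
    """Return True only if every per-serving value clears its limit."""
    for nutrient, (kind, limit) in DELTA_BRIEF.items():
        value = per_serving[nutrient]
        if kind == "max" and not value < limit:
            return False
        if kind == "min" and not value > limit:
            return False
    return True

# Hypothetical prototype, for illustration only.
prototype = {"calories": 120, "carbs_g": 16.0, "sat_fat_g": 1.5,
             "fibre_g": 1.8, "protein_g": 2.4}
print(meets_brief(prototype))  # True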
The two decided to focus on a kind of oatmeal-chocolate-chip hybrid, with liberal applications of roasted soy nuts, toffee, and caramel. A straight oatmeal-raisin cookie or a straight low-cal chocolate-chip cookie was out of the question. This was a reflection of what might be called the Hidden Valley Ranch principle, in honor of a story that Samson Hsia often told about his years working on salad dressing when he was at Clorox. The couple who owned Hidden Valley Ranch, near Santa Barbara, had come up with a seasoning blend of salt, pepper, onion, garlic, and parsley flakes that was mixed with equal parts mayonnaise and buttermilk to make what was, by all accounts, an extraordinary dressing. Clorox tried to bottle it, but found that the buttermilk could not coexist, over any period of time, with the mayonnaise. The way to fix the problem, and preserve the texture, was to make the combination more acidic. But when you increased the acidity you ruined the flavor. Clorox's food engineers worked on Hidden Valley Ranch dressing for close to a decade. They tried different kinds of processing and stability control and endless cycles of consumer testing before they gave up and simply came out with a high-acid Hidden Valley Ranch dressing — which promptly became a runaway best-seller. Why? Because consumers had never tasted real Hidden Valley Ranch dressing, and as a result had no way of knowing that what they were eating was inferior to the original. For those in the food business, the lesson was unforgettable: if something was new, it didn't have to be perfect. And, since healthful, indulgent cookies couldn't be perfect, they had to be new: hence oatmeal, chocolate chips, toffee, and caramel.
Cookie development, at the Mattson level, is a matter of endless iteration, and Dea and Howell began by baking version after version in quick succession — establishing the cookie size, the optimal baking time, the desired variety of chocolate chips, the cut of oats (bulk oats? rolled oats? groats?), the varieties of flour, and the toffee dosage, while testing a variety of high-tech supplements, notably inulin, a fibre source derived from chicory root. As they worked, they made notes on tablet P.C.s, which gave them a running electronic record of each version. "With food, there's a large circle of pretty good, and we're solidly in pretty good," Dea announced, after several intensive days of baking. A tray of cookies was cooling in front of him on the counter. "Typically, that's when you take it to the customers."
In this case, the customer was Gundrum, and the next week Howell marched over to Gundrum's office with two Ziploc bags of cookies in his hand. There was a package of Chips Ahoy! on the table, and Howell took one out. "We've been eating these versus Chips Ahoy!," he said.
The two cookies looked remarkably alike. Gundrum tried one of each. "The Chips Ahoy!, it's tasty," he said. "When you eat it, the starch hydrates in your mouth. The XP doesn't have that same granulated-sugar kind of mouth feel."
"It's got more fat than us, though, and subsequently it's shorter in texture," Howell said. "And so, when you break it, it breaks more nicely. Ours is a little harder to break."
By "shorter in texture," he meant that the cookie "popped" when you bit into it. Saturated fats are solid fats, and give a cookie crispness. Parmesan cheese is short-textured. Brie is long. A shortbread like a Lorna Doone is a classic short-textured cookie. But the XP cookie had, for health reasons, substituted unsaturated fats for saturated fats, and unsaturated fats are liquid. They make the dough stickier, and inevitably compromise a little of that satisfying pop.
"The whole-wheat flour makes us a little grittier, too," Howell went on. "It has larger particulates." He broke open one of the Chips Ahoy!. "See how fine the grain is? Now look at one of our cookies. The particulates are larger. It is part of what we lose by going with a healthy profile. If it was just sugar and ¦our, for instance, the carbohydrate chains are going to be shorter, and so they will dissolve more quickly in your mouth. Whereas with more fibre you get longer carbohydrate chains and they don't dissolve as quickly, and you get that slightly tooth-packing feel."
"It looks very wholesome, like something you would want to feed your kids," Gundrum said, finally. They were still only in the realm of pretty good.
4.
Team Stuckey, meanwhile, was having problems of its own. Barb Stuckey's first thought had been a tea cookie, or, more specifically, a chai cookie — something with cardamom and cinnamon and vanilla and cloves and a soft dairy note. Doug Berg was dispatched to run the experiment. He and his team did three or four rounds of prototypes. The result was a cookie that tasted, astonishingly, like a cup of chai, which was, of course, its problem. Who wanted a cookie that tasted like a cup of chai? Stuckey called a meeting in the Mattson trophy room, where samples of every Mattson product that has made it to market are displayed. After everyone was done tasting the cookies, a bag of them sat in the middle of the table for forty-five minutes—and no one reached to take a second bite. It was a bad sign.
"You know, before the election Good Housekeeping had this cookie bakeoff," Stuckey said, as the meeting ended. "Laura Bush's entry was full of chocolate chips and had familiar ingredients. And Teresa Heinz went with pumpkin-spice cookies. I remember thinking, That's just like the Democrats! So not mainstream! I wanted her to win. But she's chosen this cookie that's funky and weird and out of the box. And I kind of feel the same way about the tea cookie. It's too far out, and will lose to something that's more comfortable for consumers."
Stuckey's next thought involved strawberries and a shortbread base. But shortbread was virtually impossible under the nutritional guidelines: there was no way to get that smooth butter-flour-sugar combination. So Team Stuckey switched to something closer to a strawberry-cobbler cookie, which had the Hidden Valley Ranch advantage that no one knew what a strawberry-cobbler cookie was supposed to taste like. Getting the carbohydrates down to the required 17.5 grams, though, was a struggle, because of how much flour and fruit a cobbler requires. The obvious choice to replace the flour was almonds. But nuts have high levels of both saturated and unsaturated fat. "It became a balancing act," Anne Cristofano, who was doing the bench work for Team Stuckey, said. She baked batch after batch, playing the carbohydrates (first the flour, and then granulated sugar, and finally various kinds of what are called sugar alcohols, low-calorie sweeteners derived from hydrogenating starch) against the almonds. Cristofano took a version to Stuckey. It didn't go well.
"We're not getting enough strawberry impact from the fruit alone," Stuckey said. "We have to find some way to boost the strawberry." She nibbled some more. "And, because of the low fat and all that stuff, I don't feel like we're getting that pop."
The Dream Team, by any measure, was the overwhelming Project Delta favorite. This was, after all, the Dream Team, and if any idea is ingrained in our thinking it is that the best way to solve a difficult problem is to bring the maximum amount of expertise to bear on it. Sure enough, in the early going the Dream Team was on fire. The members of the Dream Team did not doggedly fix on a single idea, like Dea and Howell, or move in fits and starts from chai sugar cookies to strawberry shortbread to strawberry cobbler, like Team Stuckey. The team came up with thirty-four ideas, representing an astonishing range of cookie philosophies: a chocolate cookie with gourmet cocoa, high-end chocolate chips, pecans, raisins, Irish steel-cut oats, and the new Ultragrain White Whole Wheat flour; a bite-size oatmeal cookie with a Ceylon cinnamon filling, or chili and tamarind, or pieces of dried peaches with a cinnamon-and-ginger dusting; the classic seven-layer bar with oatmeal instead of graham crackers, coated in chocolate with a choice of coffee flavors; a "wellness" cookie, with an oatmeal base, soy and whey proteins, inulin and oat beta glucan and a combination of erythritol and sugar and sterol esters—and so on.
In the course of spewing out all those new ideas, however, the Dream Team took a difficult turn. A man named J. Hugh McEvoy (a.k.a. Chef J.), out of Chicago, tried to take control of the discussion. He wanted something exotic — not a health-food version of something already out there. But in the e-mail discussions with others on the team his sense of what constituted exotic began to get really exotic — "Chinese star anise plus fennel plus Pastis plus dark chocolate." Others, emboldened by his example, began talking about a possible role for zucchini or wasabi peas. Meanwhile, a more conservative faction, mindful of the Project Delta mandate to appeal to the whole family, started talking up peanut butter. Within a few days, the tensions were obvious:
From: Chef J.
Subject: <no subject>
Please keep in mind that less than 10 years ago, espresso, latte and dulce de leche were EXOTIC flavors / products that were considered unsuitable for the mainstream. And let's not even mention CHIPOTLE.
From: Andy Smith
Subject: Bought any Ben and Jerry's recently?
While we may not want to invent another Oreo or Chips Ahoy!, last I looked, World's Best Vanilla was B&J's # 2 selling flavor and Haagen Dazs' Vanilla (their top seller) outsold Dulce 3 to 1.
From: Chef J.
Subject: <no subject>Yes. Gourmet Vanilla does outsell any new flavor. But we must remember that DIET vanilla does not and never has. It is the high end, gourmet segment of ice cream that is growing. Diet Oreos were vastly outsold by new entries like Snackwells. Diet Snickers were vastly outsold by new entries like balance bars. New Coke failed miserably, while Red Bull is still growing.
What flavor IS Red Bull, anyway?
Eventually, Carol Borba, the Dream Team project leader, asked Gundrum whether she should try to calm things down. He told her no; the group had to find its "own kind of natural rhythm." He wanted to know what fifteen high-powered bakers thrown together on a project felt like, and the answer was that they felt like chaos. They took twice as long as the XP team. They created ten times the headache.
Worse, no one in the open-source group seemed to be having any fun. "Quite honestly, I was expecting a bit more involvement in this," Howard Plein, of Edlong Dairy Flavors, confessed afterward. "They said, expect to spend half an hour a day. But without doing actual bench work — all we were asked to do was to come up with ideas." He wanted to bake: he didn't enjoy being one of fifteen cogs in a machine. To Dan Fletcher, of Kellogg's, "the whole thing spun in place for a long time. I got frustrated with that. The number of people involved seemed unwieldy. You want some diversity of youth and experience, but you want to keep it close-knit as well. You get some depth in the process versus breadth. We were a mile wide and an inch deep." Chef J., meanwhile, felt thwarted by Carol Borba; he felt that she was pushing her favorite, a caramel turtle, to the detriment of better ideas. "We had the best people in the country involved," he says. "We were irrelevant. That's the weakness of it. Fifteen is too many. How much true input can any one person have when you are lost in the crowd?" In the end, the Dream Team whittled down its thirty-four possibilities to one: a chewy oatmeal cookie, with a pecan "thumbprint" in the middle, and ribbons of caramel-and-chocolate glaze. When Gundrum tasted it, he had nothing but praise for its "cookie hedonics." But a number of the team members were plainly unhappy with the choice. "It is not bad," Chef J. said. "But not bad doesn't win in the food business. There was nothing there that you couldn't walk into a supermarket and see on the shelf. Any Pepperidge Farm product is better than that. Any one."
It may have been a fine cookie. But, since no single person played a central role in its creation, it didn't seem to anyone to be a fine cookie.
The strength of the Dream Team — the fact that it had so many smart people on it — was also its weakness: it had too many smart people on it. Size provides expertise. But it also creates friction, and one of the truths Project Delta exposed is that we tend to overestimate the importance of expertise and underestimate the problem of friction. Gary Klein, a decision-making consultant, once examined this issue in depth at a nuclear power plant in North Carolina. In the nineteen-nineties, the power supply used to keep the reactor cool malfunctioned. The plant had to shut down in a hurry, and the shutdown went badly. So the managers brought in Klein's consulting group to observe as they ran through one of the crisis rehearsals mandated by federal regulators. "The drill lasted four hours," David Klinger, the lead consultant on the project, recalled. "It was in this big operations room, and there were between eighty and eighty-five people involved. We roamed around, and we set up a video camera, because we wanted to make sense of what was happening."
When the consultants asked people what was going on, though, they couldn't get any satisfactory answers. "Each person only knew a little piece of the puzzle, like the radiation person knew where the radiation was, or the maintenance person would say, 'I'm trying to get this valve closed,' " Klinger said. "No one had the big picture. We started to ask questions. We said, 'What is your mission?' And if the person didn't have one, we said, 'Get out.' There were just too many people. We ended up getting that team down from eighty-five to thirty-five people, and the first thing that happened was that the noise in the room was dramatically reduced." The room was quiet and calm enough so that people could easily find those they needed to talk to. "At the very end, they had a big drill that the N.R.C. was going to regulate. The regulators said it was one of their hardest drills. And you know what? They aced it." Was the plant's management team smarter with thirty-five people on it than it was with eighty-five? Of course not, but the expertise of those additional fifty people was more than cancelled out by the extra confusion and noise they created.
The open-source movement has had the same problem. The number of people involved can result in enormous friction. The software theorist Joel Spolsky points out that open-source software tends to have user interfaces that are difficult for ordinary people to use: "With Microsoft Windows, you right-click on a folder, and you're given the option to share that folder over the Web. To do the same thing with Apache, the open-source Web server, you've got to track down a file that has a different name and is stored in a different place on every system. Then you have to edit it, and it has its own syntax and its own little programming language, and there are lots of different comments, and you edit it the first time and it doesn't work and then you edit it the second time and it doesn't work."
Because there are so many individual voices involved in an open-source project, no one can agree on the right way to do things. And, because no one can agree, every possible option is built into the software, thereby frustrating the central goal of good design, which is, after all, to understand what to leave out. Spolsky notes that almost all the successful open-source products have been attempts to clone some preexisting software program, like Microsoft's Internet Explorer, or Unix. "One of the reasons open source works well for Linux is that there isn't any real design work to be undertaken," he says. "They were doing what we would call chasing tail-lights." Open source was great for a science project, in which the goals were clearly defined and the technical hurdles easily identifiable. Had Project Delta been a Cool Whip bakeoff, an exercise in chasing tail-lights, the Dream Team would easily have won. But if you want to design a truly innovative software program — or a truly innovative cookie — the costs of bigness can become overwhelming.
In the frantic final weeks before the bakeoff, while the Dream Team was trying to fix a problem with crumbling, and hit on the idea of glazing the pecan on the face of the cookie, Dea and Howell continued to make steady, incremental improvements.
"These cookies were baked five days ago," Howell told Gundrum, as he handed him a Ziploc bag. Dea was off somewhere in the Midwest, meeting with clients, and Howell looked apprehensive, stroking his goatee nervously as he stood by Gundrum's desk. "We used wheat dextrin, which I think gives us some crispiness advantages and some shelf-stability advantages. We have a little more vanilla in this round, which gives you that brown, rounding background note."
Gundrum nodded. "The vanilla is almost like a surrogate for sugar," he said. "It potentiates the sweetness."
"Last time, the leavening system was baking soda and baking powder," Howell went on. "I switched that to baking soda and monocalcium phosphate. That helps them rise a little bit better. And we baked them at a slightly higher temperature for slightly longer, so that we drove off a little bit more moisture."
"How close are you?" Gundrum asked.
"Very close," Howell replied.
Gundrum was lost in thought for a moment. "It looks very wholesome. It looks like something you'd want to feed your kids. It has very good aroma. I really like the texture. My guess is that it eats very well with milk." He turned back to Howell, suddenly solicitous. "Do you want some milk?"
Meanwhile, Barb Stuckey had a revelation. She was working on a tortilla-chip project, and had bags of tortilla chips all over her desk. "You have no idea how much engineering goes into those things," she said, holding up a tortilla chip. "It's greater than what it takes to build a bridge. It's crazy." And one of the clever things about cheese tortilla chips—particularly the low-fat versions—is how they go about distracting the palate. "You know how you put a chip in your mouth and the minute it hits your tongue it explodes with flavor?" Stuckey said. "It's because it's got this topical seasoning. It's got dried cheese powders and sugar and probably M.S.G. and all that other stuff on the outside of the chip."
Her idea was to apply that technique to strawberry cobbler—to take large crystals of sugar, plate them with citric acid, and dust the cookies with them. "The minute they reach your tongue, you get this sweet-and-sour hit, and then you crunch into the cookie and get the rest—the strawberry and the oats," she said. The crystals threw off your taste buds. You weren't focussed on the fact that there was half as much fat in the cookie as there should be. Plus, the citric acid brought a tangy flavor to the dried strawberries: suddenly they felt fresh.
Batches of the new strawberry-cobbler prototype were ordered up, with different formulations of the citric acid and the crystals. A meeting was called in the trophy room. Anne Cristofano brought two plastic bags filled with cookies. Stuckey was there, as was a senior Mattson food technologist named Karen Smithson, an outsider brought to the meeting in an advisory role. Smithson, a former pastry chef, was a little older than Stuckey and Cristofano, with an air of self-possession. She broke the seal on the first bag, and took a bite with her eyes half closed. The other two watched intently.
"Umm," Smithson said, after the briefest of pauses. "That is pretty darn good. And this is one of the healthy cookies? I would not say, 'This is healthy.' I can't taste the trade-off." She looked up at Stuckey. "How old are they?"
"Today," Stuckey replied.
"O.K. . . ." This was a complicating fact. Any cookie tastes good on the day it's baked. The question was how it tasted after baking and packaging and shipping and sitting in a warehouse and on a supermarket shelf and finally in someone's cupboard.
"What we're trying to do here is a shelf-stable cookie that will last six months," Stuckey said. "I think we're better off if we can make it crispy."
Smithson thought for a moment. "You can have either a crispy, low-moisture cookie or a soft and chewy cookie," she said. "But you can't get the outside crisp and the inside chewy. We know that. The moisture will migrate. It will equilibrate over time, so you end up with a cookie that's consistent all the way through. Remember we did all that work on Mrs. Fields? That's what we learned."
They talked for a bit, in technical terms, about various kinds of sugars and starches. Smithson didn't think that the stability issue was going to be a problem.
"Isn't it compelling, visually?" Stuckey blurted out, after a lull in the conversation. And it was: the dried-strawberry chunks broke though the surface of the cookie, and the tiny citric-sugar crystals glinted in the light. "I just think you get so much more bang for the buck when you put the seasoning on the outside."
"Yet it's not weird," Smithson said, nodding. She picked up another cookie. "The mouth feel is a combination of chewy and crunchy. With the flavors, you have the caramelized sugar, the brown-sugar notes. You have a little bit of a chew from the oats. You have a flavor from the strawberry, and it helps to have a combination of the sugar alcohol and the brown sugar. You know, sugars have different deliveries, and sometimes you get some of the sweetness right off and some of it continues on. You notice that a lot with the artificial sweeteners. You get the sweetness that doesn't go away, long after the other flavors are gone. With this one, the sweetness is nice. The flavors come together at the same time and fade at the same time, and then you have the little bright after-hits from the fruit and the citric crunchies, which are" — she paused, looking for the right word — "brilliant."
5.
The bakeoff took place in April. Mattson selected a representative sample of nearly three hundred households from around the country. Each was mailed bubble-wrapped packages containing all three entrants. The vote was close but unequivocal. Fourteen per cent of the households voted for the XP oatmeal-chocolate-chip cookie. Forty-one per cent voted for the Dream Team's oatmeal-caramel cookie. Forty-four per cent voted for Team Stuckey's strawberry cobbler.
The Project Delta postmortem was held at Chaya Brasserie, a French-Asian fusion restaurant on the Embarcadero, in San Francisco. It was just Gundrum and Steven Addis, from the first Project Delta dinner, and their wives. Dan Howell was immersed in a confidential project for a big food conglomerate back East. Peter Dea was working with Cargill on a wellness product. Carol Borba was in Chicago, at a meeting of the Food Marketing Institute. Barb Stuckey was helping Ringling Brothers rethink the food at its concessions. "We've learned a lot about the circus," Gundrum said. Meanwhile, Addis's firm had created a logo and a brand name for Project Delta. Mattson has offered to license the winning cookie at no cost, as long as a percentage of its sales goes to a charitable foundation that Mattson has set up to feed the hungry. Someday soon, you should be able to go into a supermarket and buy Team Stuckey's strawberry-cobbler cookie.
"Which one would you have voted for?" Addis asked Gundrum.
"I have to say, they were all good in their own way," Gundrum replied. It was like asking a mother which of her children she liked best. "I thought Barb's cookie was a little too sweet, and I wish the open-source cookie was a little tighter, less crumbly. With XP, I think we would have done better, but we had a wardrobe malfunction. They used too much batter, overbaked it, and the cookie came out too hard and thick.
"In the end, it was not so much which cookie won that interested him. It was who won—and why. Three people from his own shop had beaten a Dream Team, and the decisive edge had come not from the collective wisdom of a large group but from one person's ability to make a lateral connection between two previously unconnected objects — a tortilla chip and a cookie. Was that just Barb being Barb? In large part, yes. But it was hard to believe that one of the Dream Team members would not have made the same kind of leap had they been in an environment quiet enough to allow them to think.
"Do you know what else we learned?" Gundrum said. He was talking about a questionnaire given to the voters. "We were looking at the open-ended questions — where all the families who voted could tell us what they were thinking. They all said the same thing — all of them." His eyes grew wide. "They wanted better granola bars and breakfast bars. I would not have expected that." He fell silent for a moment, turning a granola bar over and around in his mind, assembling and disassembling it piece by piece, as if it were a model airplane. "I thought that they were pretty good," he said. "I mean, there are so many of them out there. But apparently people want them better."
The Cellular Church
September 12, 2005
Letter From Saddleback
How Rick Warren built his ministry.
1.
On the occasion of the twenty-fifth anniversary of Saddleback Church, Rick Warren hired the Anaheim Angels' baseball stadium. He wanted to address his entire congregation at once, and there was no way to fit everyone in at Saddleback, where the crowds are spread across services held over the course of an entire weekend. So Warren booked the stadium and printed large, silver-black-and-white tickets, and, on a sunny Sunday morning last April, the tens of thousands of congregants of one of America's largest churches began to file into the stands. They were wearing shorts and T-shirts and buying Cokes and hamburgers from the concession stands, if they had not already tailgated in the parking lot. On the field, a rock band played loudly and enthusiastically. Just after one o'clock, a voice came over the public-address system—"RIIIICK WARRRREN"—and Warren bounded onto the stage, wearing black slacks, a red linen guayabera shirt, and wraparound NASCAR sunglasses. The congregants leaped to their feet. "You know," Warren said, grabbing the microphone, "there are two things I've always wanted to do in a stadium." He turned his body sideways, playing an imaginary guitar, and belted out the first few lines of Jimi Hendrix's "Purple Haze." His image was up on the Jumbotrons in right and left fields, just below the Verizon and Pepsi and Budweiser logos. He stopped and grinned. "The other thing is, I want to do a wave!" He pointed to the bleachers, and then to the right-field seats, and around and around the stadium the congregation rose and fell, in four full circuits. "You are the most amazing church in America!" Warren shouted out, when they had finally finished. "AND I LOVE YOU!"
2.
Rick Warren is a large man, with a generous stomach. He has short, spiky hair and a goatee. He looks like an ex-athlete, or someone who might have many tattoos. He is a hugger, enfolding those he meets in his long arms and saying things like "Hey, man." According to Warren, from sixth grade through college there wasn't a day in his life that he wasn't president of something, and that makes sense, because he's always the one at the center of the room talking or laughing, with his head tilted way back, or crying, which he does freely. In the evangelical tradition, preachers are hard or soft. Billy Graham, with his piercing eyes and protruding chin and Bible clenched close to his chest, is hard. So was Martin Luther King, Jr., who overwhelmed his audience with his sonorous, forcefully enunciated cadences. Warren is soft. His sermons are conversational, delivered in a folksy, raspy voice. He talks about how he loves Krispy Kreme doughnuts, drives a four-year-old Ford, and favors loud Hawaiian shirts, even at the pulpit, because, he says, "they do not itch."
In December of 1979, when Warren was twenty-five years old, he and his wife, Kay, took their four-month-old baby and drove in a U-Haul from Texas to Saddleback Valley, in Orange County, because Warren had read that it was one of the fastest-growing counties in the country. He walked into the first real-estate office he found and introduced himself to the first agent he saw, a man named Don Dale. He was looking for somewhere to live, he said.
"Do you have any money to rent a house?" Dale asked.
"Not much, but we can borrow some," Warren replied.
"Do you have a job?"
"No. I don't have a job."
"What do you do for a living?"
"I'm a minister."
"So you have a church?"
"Not yet."
Dale found him an apartment that very day, of course: Warren is one of those people whose lives have an irresistible forward momentum. In the car on the way over, he recruited Dale as the first member of his still nonexistent church, of course. And when he held his first public service, three months later, he stood up in front of two hundred and five people he barely knew in a high-school gymnasium—this shiny-faced preacher fresh out of seminary—and told them that one day soon their new church would number twenty thousand people and occupy a campus of fifty acres. Today, Saddleback Church has twenty thousand members and occupies a campus of a hundred and twenty acres. Once, Warren wanted to increase the number of small groups at Saddleback—the groups of six or seven that meet for prayer and fellowship during the week—by three hundred. He went home and prayed and, as he tells it, God said to him that what he really needed to do was increase the number of small groups by three thousand, which is just what he did. Then, a few years ago, he wrote a book called "The Purpose-Driven Life," a genre of book that is known in the religious-publishing business as "Christian Living," and that typically sells thirty or forty thousand copies a year. Warren's publishers came to see him at Saddleback, and sat on the long leather couch in his office, and talked about their ideas for the book. "You guys don't understand," Warren told them. "This is a hundred-million-copy book." Warren remembers stunned silence: "Their jaws dropped." But now, nearly three years after its publication, "The Purpose-Driven Life" has sold twenty-three million copies. It is among the best-selling nonfiction hardcover books in American history. Neither the New York Times, the Los Angeles Times, nor the Washington Post has reviewed it. Warren's own publisher didn't see it coming. Only Warren had faith. "The best of the evangelical tradition is that you don't plan your way forward—you prophesy your way forward," the theologian Leonard Sweet says. "Rick's prophesying his way forward."
Not long after the Anaheim service, Warren went back to his office on the Saddleback campus. He put his feet up on the coffee table. On the wall in front of him were framed originals of the sermons of the nineteenth-century preacher Charles Spurgeon, and on the bookshelf next to him was his collection of hot sauces. "I had dinner with Jack Welch last Sunday night," he said. "He came to church, and we had dinner. I've been kind of mentoring him on his spiritual journey. And he said to me, 'Rick, you are the biggest thinker I have ever met in my life. The only other person I know who thinks globally like you is Rupert Murdoch.' And I said, 'That's interesting. I'm Rupert's pastor! Rupert published my book!'" Then he tilted back his head and gave one of those big Rick Warren laughs.
3.
Churches, like any large voluntary organization, have at their core a contradiction. In order to attract newcomers, they must have low barriers to entry. They must be unintimidating, friendly, and compatible with the culture they are a part of. In order to retain their membership, however, they need to have an identity distinct from that culture. They need to give their followers a sense of community—and community, exclusivity, a distinct identity are all, inevitably, casualties of growth. As an economist would say, the bigger an organization becomes, the greater a free-rider problem it has. If I go to a church with five hundred members, in a magnificent cathedral, with spectacular services and music, why should I volunteer or donate any substantial share of my money? What kind of peer pressure is there in a congregation that large? If the barriers to entry become too low—and the ties among members become increasingly tenuous—then a church as it grows bigger becomes weaker.
One solution to the problem is simply not to grow, and, historically, churches have sacrificed size for community. But there is another approach: to create a church out of a network of lots of little church cells—exclusive, tightly knit groups of six or seven who meet in one another's homes during the week to worship and pray. The small group as an instrument of community is initially how Communism spread, and in the postwar years Alcoholics Anonymous and its twelve-step progeny perfected the small-group technique. The small group did not have a designated leader who stood at the front of the room. Members sat in a circle. The focus was on discussion and interaction—not one person teaching and the others listening—and the remarkable thing about these groups was their power. An alcoholic could lose his job and his family, he could be hospitalized, he could be warned by half a dozen doctors—and go on drinking. But put him in a room of his peers once a week—make him share the burdens of others and have his burdens shared by others—and he could do something that once seemed impossible.
When churches—in particular, the megachurches that became the engine of the evangelical movement, in the nineteen-seventies and eighties—began to adopt the cellular model, they found out the same thing. The small group was an extraordinary vehicle of commitment. It was personal and flexible. It cost nothing. It was convenient, and every worshipper was able to find a small group that precisely matched his or her interests. Today, at least forty million Americans are in a religiously based small group, and the growing ranks of small-group membership have caused a profound shift in the nature of the American religious experience.
"As I see it, one of the most unfortunate misunderstandings of our time has been to think of small intentional communities as groups 'within' the church," the philosopher Dick Westley writes in one of the many books celebrating the rise of small-group power. "When are we going to have the courage to publicly proclaim what everyone with any experience with small groups has known all along: they are not organizations 'within' the church; they are church."
Ram Cnaan, a professor of social work at the University of Pennsylvania, recently estimated the replacement value of the charitable work done by the average American church—that is, the amount of money it would take to equal the time, money, and resources donated to the community by a typical congregation—and found that it came to about a hundred and forty thousand dollars a year. In the city of Philadelphia, for example, that works out to an annual total of two hundred and fifty million dollars' worth of community "good"; on a national scale, the contribution of religious groups to the public welfare is, as Cnaan puts it, "staggering." In the past twenty years, as the enthusiasm for publicly supported welfare has waned, churches have quietly and steadily stepped in to fill the gaps. And who are the churchgoers donating all that time and money? People in small groups. Membership in a small group is a better predictor of whether people volunteer or give money than how often they attend church, whether they pray, whether they've had a deep religious experience, or whether they were raised in a Christian home. Social action is not a consequence of belief, in other words. I don't give because I believe in religious charity. I give because I belong to a social structure that enforces an ethic of giving. "Small groups are networks," the Princeton sociologist Robert Wuthnow, who has studied the phenomenon closely, says. "They create bonds among people. Expose people to needs, provide opportunities for volunteering, and put people in harm's way of being asked to volunteer. That's not to say that being there for worship is not important. But, even in earlier research, I was finding that if people say all the right things about being a believer but aren't involved in some kind of physical social setting that generates interaction, they are just not as likely to volunteer."
Rick Warren came to the Saddleback Valley just as the small-group movement was taking off. He was the son of a preacher—a man who started seven churches in and around Northern California and was enough of a carpenter to have built a few dozen more with his own hands—and he wanted to do what his father had done: start a church from scratch.
For the first three months, he went from door to door in the neighborhood around his house, asking people why they didn't attend church. Churches were boring and irrelevant to everyday life, he was told. They were unfriendly to visitors. They were too interested in money. They had inadequate children's programs. So Warren decided that in his new church people would play and sing contemporary music, not hymns. (He could find no one, Warren likes to say, who listened to organ music in the car.) He would wear the casual clothes of his community. The sermons would be practical and funny and plainspoken, and he would use video and drama to illustrate his message. And when an actual church was finally built—Saddleback used seventy-nine different locations in its first thirteen years, from high-school auditoriums to movie theatres and then tents before building a permanent home—the church would not look churchy: no pews, or stained glass, or lofty spires. Saddleback looks like a college campus, and the main sanctuary looks like the school gymnasium. Parking is plentiful. The chairs are comfortable. There are loudspeakers and television screens everywhere broadcasting the worship service, and all the doors are open, so anyone can slip in or out, at any time, in the anonymity of the enormous crowds. Saddleback is a church with very low barriers to entry.
But beneath the surface is a network of thousands of committed small groups. "Orange County is virtually a desert in social-capital terms," the Harvard political scientist Robert Putnam, who has taken a close look at the Saddleback success story, says. "The rate of mobility is really high. It has long and anonymous commutes. It's a very friendless place, and this church offers serious heavy friendship. It's a very interesting experience to talk to some of those groups. There were these eight people and they were all mountain bikers—mountain bikers for God. They go biking together, and they are one another's best friends. If one person's wife gets breast cancer, he can go to the others for support. If someone loses a job, the others are there for him. They are deeply best friends, in a larger social context where it is hard to find a best friend."
Putnam goes on, "Warren didn't invent the cellular church. But he's brought it to an amazing level of effectiveness. The real job of running Saddleback is the recruitment and training and retention of the thousands of volunteer leaders for all the small groups it has. That's the surprising thing to me—that they are able to manage that. Those small groups are incredibly vulnerable, and complicated to manage. How to keep all those little dinghies moving in the same direction is, organizationally, a major accomplishment."
At Saddleback, members are expected to tithe, and to volunteer. Sunday-school teachers receive special training and a police background check. Recently, Warren decided that Saddleback would feed every homeless person in Orange County three meals a day for forty days. Ninety-two hundred people volunteered. Two million pounds of food were collected, sorted, and distributed.
It may be easy to start going to Saddleback. But it is not easy to stay at Saddleback. "Last Sunday, we took a special offering called Extend the Vision, for people to give over and above their normal offering," Warren said. "We decided we would not use any financial consultants, no high-powered gimmicks, no thermometer on the wall. It was just 'Folks, you know you need to give.' Sunday's offering was seven million dollars in cash and fifty-three million dollars in commitments. That's one Sunday. The average commitment was fifteen thousand dollars a family. That's in addition to their tithe. When people say megachurches are shallow, I say you have no idea. These people are committed."
Warren's great talent is organizational. He's not a theological innovator. When he went from door to door, twenty-five years ago, he wasn't testing variants on the Christian message. As far as he was concerned, the content of his message was non-negotiable. Theologically, Warren is a straight-down-the-middle evangelical. What he wanted to learn was how to construct an effective religious institution. His interest was sociological. Putnam compares Warren to entrepreneurs like Ray Kroc and Sam Walton, pioneers not in what they sold but in how they sold. The contemporary thinker Warren cites most often in conversation is the management guru Peter Drucker, who has been a close friend of his for years. Before Warren wrote "The Purpose-Driven Life," he wrote a book called "The Purpose-Driven Church," which was essentially a how-to guide for church builders. He's run hundreds of training seminars around the world for ministers of small-to-medium-sized churches. At the beginning of the Internet boom, he created a Web site called pastors.com, on which he posted his sermons for sale for four dollars each. There were many pastors in the world, he reasoned, who were part time. They had a second, nine-to-five job and families of their own, and what little free time they had was spent ministering to their congregation. Why not help them out with Sunday morning? The Web site now gets nearly four hundred thousand hits a day.
"I went to South Africa two years ago," Warren said. "We did the purpose-driven-church training, and we simulcast it to ninety thousand pastors across Africa. After it was over, I said, 'Take me out to a village and show me some churches.'"
In the first village they went to, the local pastor came out, saw Warren, and said, "I know who you are. You're Pastor Rick."
"And I said, 'How do you know who I am?' " Warren recalled. "He said, 'I get your sermons every week.' And I said, 'How? You don't even have electricity here.' And he said, 'We're putting the Internet in every post office in South Africa. Once a week, I walk an hour and a half down to the post office. I download it. Then I teach it. You are the only training I have ever received.'"
A typical evangelist, of course, would tell stories about reaching ordinary people, the unsaved laity. But a typical evangelist is someone who goes from town to town, giving sermons to large crowds, or preaching to a broad audience on television. Warren has never pastored any congregation but Saddleback, and he refuses to preach on television, because that would put him in direct competition with the local pastors he has spent the past twenty years cultivating. In the argot of the New Economy, most evangelists follow a business-to-consumer model: b-to-c. Warren follows a business-to-business model: b-to-b. He reaches the people who reach people. He's a builder of religious networks. "I once heard Drucker say this," Warren said. "'Warren is not building a tent revival ministry, like the old-style evangelists. He's building an army, like the Jesuits.'"
4.
To write "The Purpose-Driven Life," Warren holed up in an office in a corner of the Saddleback campus, twelve hours a day for seven months. "I would get up at four-thirty, arrive at my special office at five, and I would write from five to five," he said. "I'm a people person, and it about killed me to be alone by my-self. By eleven-thirty, my A.D.D. would kick in. I would do anything not to be there. It was like birthing a baby." The book didn't tell any stories. It wasn't based on any groundbreaking new research or theory or theological insight. "I'm just not that good a writer," Warren said. "I'm a pastor. There's nothing new in this book. But sometimes as I was writing it I would break down in tears. I would be weeping, and I would feel like God was using me."
The book begins with an inscription: "This book is dedicated to you. Before you were born, God planned this moment in your life. It is no accident that you are holding this book. God longs for you to discover the life he created you to live—here on earth, and forever in eternity." Five sections follow, each detailing one of God's purposes in our lives—"You Were Planned for God's Pleasure"; "You Were Formed for God's Family"; "You Were Created to Become Like Christ"; "You Were Shaped for Serving God"; "You Were Made for a Mission"—and each of the sections, in turn, is divided into short chapters ("Understanding Your Shape" or "Using What God Gave You" or "How Real Servants Act"). The writing is simple and unadorned. The scriptural interpretation is literal: "Noah had never seen rain, because prior to the Flood, God irrigated the earth from the ground up." The religious vision is uncomplicated and accepting: "God wants to be your best friend." Warren's Christianity, like his church, has low barriers to entry: "Wherever you are reading this, I invite you to bow your head and quietly whisper the prayer that will change your eternity. Jesus, I believe in you and I receive you. Go ahead. If you sincerely meant that prayer, congratulations! Welcome to the family of God! You are now ready to discover and start living God's purpose for your life."
It is tempting to interpret the book's message as a kind of New Age self-help theology. Warren's God is not awesome or angry and does not stand in judgment of human sin. He's genial and mellow. "Warren's God 'wants to be your best friend,' and this means, in turn, that God's most daunting property, the exercise of eternal judgment, is strategically downsized," the critic Chris Lehmann writes, echoing a common complaint:
"When Warren turns his utility-minded feel-speak upon the symbolic iconography of the faith, the results are offensively bathetic: "When Jesus stretched his arms wide on the cross, he was saying, 'I love you this much.' " But God needs to be at a greater remove than a group hug."
The self-help genre, however, is fundamentally inward-focussed. M. Scott Peck's "The Road Less Traveled"—the only spiritual work that, in terms of sales, can even come close to "The Purpose-Driven Life"—begins with the sentence "Life is difficult." That's a self-help book: it focusses the reader on his own experience. Warren's first sentence, by contrast, is "It's not about you," which puts it in the spirit of traditional Christian devotional literature, which focusses the reader outward, toward God. In look and feel, in fact, "The Purpose-Driven Life" is less twenty-first-century Orange County than it is the nineteenth century of Warren's hero, the English evangelist Charles Spurgeon. Spurgeon was the Warren of his day: the pastor of a large church in London, and the author of best-selling devotional books. On Sunday, good Christians could go and hear Spurgeon preach at the Metropolitan Tabernacle. But during the week they needed something to replace the preacher, and so Spurgeon, in one of his best-known books, "Morning and Evening," wrote seven hundred and thirty-two short homilies, to be read in the morning and the evening of each day of the year. The homilies are not complex investigations of theology. They are opportunities for spiritual reflection. (Sample Spurgeonism: "Every child of God is where God has placed him for some purpose, and the practical use of this first point is to lead you to inquire for what practical purpose has God placed each one of you where you now are." Sound familiar?) The Oxford Times described one of Spurgeon's books as "a rich store of topics treated daintily, with broad humour, with quaint good sense, yet always with a subdued tone and high moral aim," and that describes "The Purpose-Driven Life" as well. It's a spiritual companion. And, like "Morning and Evening," it is less a book than a program. It's divided into forty chapters, to be read during "Forty Days of Purpose." The first page of the book is called "My Covenant." It reads, "With God's help, I commit the next 40 days of my life to discovering God's purpose for my life."
Warren departs from Spurgeon, though, in his emphasis on the purpose-driven life as a collective experience. Below the boxed covenant is a space for not one signature but three: "Your name," "Partner's name," and then Rick Warren's signature, already printed, followed by a quotation from Ecclesiastes 4:9:
"Two are better off than one, because together they can work more effectively. If one of them falls down, the other can help him up. . . . Two people can resist an attack that would defeat one person alone. A rope made of three cords is hard to break."
"The Purpose-Driven Life" is meant to be read in groups. If the vision of faith sometimes seems skimpy, that's because the book is supposed to be supplemented by a layer of discussion and reflection and debate. It is a testament to Warren's intuitive understanding of how small groups work that this is precisely how "The Purpose-Driven Life" has been used. It spread along the network that he has spent his career putting together, not from person to person but from group to group. It presold five hundred thousand copies. It averaged more than half a million copies in sales a month in its first two years, which is possible only when a book is being bought in lots of fifty or a hundred or two hundred. Of those who bought the book as individuals, nearly half have bought more than one copy, sixteen per cent have bought four to six copies, and seven per cent have bought ten or more. Twenty-five thousand churches have now participated in the congregation-wide "40 Days of Purpose" campaign, as have hundreds of small groups within companies and organizations, from the N.B.A. to the United States Postal Service.
"I remember the first time I met Rick," says Scott Bolinder, the head of Zondervan, the Christian publishing division of HarperCollins and the publisher of "The Purpose-Driven Life." "He was telling me about pastors.com. This is during the height of the dot-com boom. I was thinking, What's your angle? He had no angle. He said, 'I love pastors. I know what they go through.' I said, 'What do you put on there?' He said, 'I put my sermons with a little disclaimer on there: "You are welcome to preach it any way you can. I only ask one thing—I ask that you do it better than I did."' So then fast-forward seven years: he's got hundreds of thousands of pastors who come to this Web site. And he goes, 'By the way, my church and I are getting ready to do forty days of purpose. If you want to join us, I'm going to preach through this and put my sermons up. And I've arranged with my publisher that if you do join us with this campaign they will sell the book to you for a low price.' That became the tipping point—being able to launch that book with eleven hundred churches, right from the get-go. They became the evangelists for the book."
The book's high-water mark came earlier this year, when a fugitive named Brian Nichols, who had shot and killed four people in an Atlanta courthouse, accosted a young single mother, Ashley Smith, outside her apartment, and held her captive in her home for seven hours.
"I asked him if I could read," Smith said at the press conference after her ordeal was over, and so she went and got her copy of "The Purpose-Driven Life" and turned to the chapter she was reading that day. It was Chapter 33, "How Real Servants Act." It begins:
"We serve God by serving others.
The world defines greatness in terms of power, possessions, prestige, and position. If you can demand service from others, you've arrived. In our self-serving culture with its me-first mentality, acting like a servant is not a popular concept.
Jesus, however, measured greatness in terms of service, not status. God determines your greatness by how many people you serve, not how many people serve you."
Nichols listened and said, "Stop. Will you read it again?"
Smith read it to him again. They talked throughout the night. She made him pancakes. "I said, 'Do you believe in miracles? Because if you don't believe in miracles — you are here for a reason. You're here in my apartment for some reason.' " She might as well have been quoting from "The Purpose-Driven Life." She went on, "You don't think you're supposed to be sitting here right in front of me listening to me tell you, you know, your reason for being here?" When morning came, Nichols let her go.
Hollywood could not have scripted a better testimonial for "The Purpose-Driven Life." Warren's sales soared further. But the real lesson of that improbable story is that it wasn't improbable at all. What are the odds that a young Christian—a woman who, it turns out, sends her daughter to Hebron Church, in Dacula, Georgia—isn't reading "The Purpose-Driven Life"? And is it surprising that Ashley Smith would feel compelled to read aloud from the book to her captor, and that, in the discussion that followed, Nichols would come to some larger perspective on his situation? She and Nichols were in a small group, and reading aloud from "The Purpose-Driven Life" is what small groups do.
5.
Not long ago, the sociologist Christian Smith decided to find out what American evangelicals mean when they say that they believe in a "Christian America." The phrase seems to suggest that evangelicals intend to erode the separation of church and state. But when Smith asked a representative sample of evangelicals to explain the meaning of the phrase, the most frequent explanation was that America was founded by people who sought religious liberty and worked to establish religious freedom. The second most frequent explanation offered was that a majority of Americans of earlier generations were sincere Christians, which, as Smith points out, is empirically true. Others said what they meant by a Christian nation was that the basic laws of American government reflected Christian principles—which sounds potentially theocratic, except that when Smith asked his respondents to specify what they meant by basic laws they came up with representative government and the balance of powers.
"In other words," Smith writes, "the belief that America was once a Christian nation does not necessarily mean a commitment to making it a 'Christian' nation today, whatever that might mean. Some evangelicals do make this connection explicitly. But many discuss America's Christian heritage as a simple fact of history that they are not particularly interested in or optimistic about reclaiming. Further, some evangelicals think America never was a Christian nation; some think it still is; and others think it should not be a Christian nation, whether or not it was so in the past or is now."
As Smith explored one issue after another with the evangelicals—gender equality, education, pluralism, and politics—he found the same scattershot pattern. The Republican Party may have been adept at winning the support of evangelical voters, but that affinity appears to be as much cultural as anything; the Party has learned to speak the evangelical language. Scratch the surface, and the appearance of homogeneity and ideological consistency disappears. Evangelicals want children to have the right to pray in school, for example, and they vote for conservative Republicans who support that right. But what do they mean by prayer? The New Testament's most left-liberal text, the Lord's Prayer—which, it should be pointed out, begins with a call for utopian social restructuring ("Thy will be done, On earth as it is in Heaven"), then welfare relief ("Give us this day our daily bread"), and then income redistribution ("Forgive us our debts as we also have forgiven our debtors"). The evangelical movement isn't a movement, if you take movements to be characterized by a coherent philosophy, and that's hardly surprising when you think of the role that small groups have come to play in the evangelical religious experience. The answers that Smith got to his questions are the kind of answers you would expect from people who think most deeply about their faith and its implications on Tuesday night, or Wednesday, with five or six of their closest friends, and not Sunday morning, in the controlling hands of a pastor.
"Small groups cultivate spirituality, but it is a particular kind of spirituality," Robert Wuthnow writes. "They cannot be expected to nurture faith in the same way that years of theological study, meditation and reflection might." He says, "They provide ways of putting faith in practice. For the most part, their focus is on practical applications, not on abstract knowledge, or even on ideas for the sake of ideas themselves."
We are so accustomed to judging a social movement by its ideological coherence that the vagueness at the heart of evangelicalism sounds like a shortcoming. Peter Drucker calls Warren's network an army, like the Jesuits. But the Jesuits marched in lockstep and held to an all-encompassing and centrally controlled creed. The members of Warren's network don't all dress the same, and they march to the tune only of their own small group, and they agree, fundamentally, only on who the enemy is. It's not an army. It's an insurgency.
In the wake of the extraordinary success of "The Purpose-Driven Life," Warren says, he underwent a period of soul-searching. He had suddenly been given enormous wealth and influence and he did not know what he was supposed to do with it. "God led me to Psalm 72, which is Solomon's prayer for more influence," Warren says. "It sounds pretty selfish. Solomon is already the wisest and wealthiest man in the world. He's the King of Israel at the apex of its glory. And in that psalm he says, 'God, I want you to make me more powerful and influential.' It looks selfish until he says, 'So that the King may support the widow and orphan, care for the poor, defend the defenseless, speak up for the immigrant, the foreigner, be a friend to those in prison.' Out of that psalm, God said to me that the purpose of influence is to speak up for those who have no influence. That changed my life. I had to repent. I said, I'm sorry, widows and orphans have not been on my radar. I live in Orange County. I live in the Saddleback Valley, which is all gated communities. There aren't any homeless people around. They are thirteen miles away, in Santa Ana, not here." He gestured toward the rolling green hills outside. "I started reading through Scripture. I said, How did I miss the two thousand verses on the poor in the Bible? So I said, I will use whatever affluence and influence that you give me to help those who are marginalized."
He and his wife, Kay, decided to reverse tithe, giving away ninety per cent of the tens of millions of dollars they earned from "The Purpose-Driven Life." They sat down with gay community leaders to talk about fighting AIDS. Warren has made repeated trips to Africa. He has sent out volunteers to forty-seven countries around the world, test-piloting experiments in microfinance and H.I.V. prevention and medical education. He decided to take the same networks he had built to train pastors and spread the purpose-driven life and put them to work on social problems.
"There is only one thing big enough to handle the world's problems, and that is the millions and millions of churches spread out around the world," he says. "I can take you to thousands of villages where they don't have a school. They don't have a grocery store, don't have a fire department. But they have a church. They have a pastor. They have volunteers. The problem today is distribution. In the tsunami, millions of dollars of foodstuffs piled up on the shores and people couldn't get it into the places that needed it, because they didn't have a network. Well, the biggest distribution network in the world is local churches. There are millions of them, far more than all the franchises in the world. Put together, they could be a force for good."
That is, in one sense, a typical Warren pronouncement—bold to the point of audacity, like telling his publisher that his book will sell a hundred million copies. In another sense, it is profoundly modest. When Warren's nineteenth-century evangelical predecessors took on the fight against slavery, they brought to bear every legal, political, and economic lever they could get their hands on. But that was a different time, and that was a different church. Today's evangelicalism is a network, and networks, for better or worse, are informal and personal.
At the Anaheim stadium service, Warren laid out his plan for attacking poverty and disease. He didn't talk about governments, though, or the United Nations, or structures, or laws. He talked about the pastors he had met in his travels around the world. He brought out the President of Rwanda, who stood up at the microphone—a short, slender man in an immaculate black suit—and spoke in halting English about how Warren was helping him rebuild his country. When he was finished, the crowd erupted in applause, and Rick Warren walked across the stage and enfolded him in his long arms.
Getting In
October 10, 2005
A Critic At Large
The social logic of Ivy League admissions.
1.
I applied to college one evening, after dinner, in the fall of my senior year in high school. College applicants in Ontario, in those days, were given a single sheet of paper which listed all the universities in the province. It was my job to rank them in order of preference. Then I had to mail the sheet of paper to a central college-admissions office. The whole process probably took ten minutes. My school sent in my grades separately. I vaguely remember filling out a supplementary two-page form listing my interests and activities. There were no S.A.T. scores to worry about, because in Canada we didn't have to take the S.A.T.s. I don't know whether anyone wrote me a recommendation. I certainly never asked anyone to. Why would I? It wasn't as if I were applying to a private club.
I put the University of Toronto first on my list, the University of Western Ontario second, and Queen's University third. I was working off a set of brochures that I'd sent away for. My parents' contribution consisted of my father's agreeing to drive me one afternoon to the University of Toronto campus, where we visited the residential college I was most interested in. I walked around. My father poked his head into the admissions office, chatted with the admissions director, and—I imagine—either said a few short words about the talents of his son or (knowing my father) remarked on the loveliness of the delphiniums in the college flower beds. Then we had ice cream. I got in.
Am I a better or more successful person for having been accepted at the University of Toronto, as opposed to my second or third choice? It strikes me as a curious question. In Ontario, there wasn't a strict hierarchy of colleges. There were several good ones and several better ones and a number of programs—like computer science at the University of Waterloo—that were world-class. But since all colleges were part of the same public system and tuition everywhere was the same (about a thousand dollars a year, in those days), and a B average in high school pretty much guaranteed you a spot in college, there wasn't a sense that anything great was at stake in the choice of which college we attended. The issue was whether we attended college, and—most important—how seriously we took the experience once we got there. I thought everyone felt this way. You can imagine my confusion, then, when I first met someone who had gone to Harvard.
There was, first of all, that strange initial reluctance to talk about the matter of college at all—a glance downward, a shuffling of the feet, a mumbled mention of Cambridge. "Did you go to Harvard?" I would ask. I had just moved to the United States. I didn't know the rules. An uncomfortable nod would follow. Don't define me by my school, they seemed to be saying, which implied that their school actually could define them. And, of course, it did. Wherever there was one Harvard graduate, another lurked not far behind, ready to swap tales of late nights at the Hasty Pudding, or recount the intricacies of the college-application essay, or wonder out loud about the whereabouts of Prince So-and-So, who lived down the hall and whose family had a place in the South of France that you would not believe. In the novels they were writing, the precocious and sensitive protagonist always went to Harvard; if he was troubled, he dropped out of Harvard; in the end, he returned to Harvard to complete his senior thesis. Once, I attended a wedding of a Harvard alum in his fifties, at which the best man spoke of his college days with the groom as if neither could have accomplished anything of greater importance in the intervening thirty years. By the end, I half expected him to take off his shirt and proudly display the large crimson "H" tattooed on his chest. What is this "Harvard" of which you Americans speak so reverently?
2.
In 1905, Harvard College adopted the College Entrance Examination Board tests as the principal basis for admission, which meant that virtually any academically gifted high-school senior who could afford a private college had a straightforward shot at attending. By 1908, the freshman class was seven per cent Jewish, nine per cent Catholic, and forty-five per cent from public schools, an astonishing transformation for a school that historically had been the preserve of the New England boarding-school complex known in the admissions world as St. Grottlesex.
As the sociologist Jerome Karabel writes in "The Chosen" (Houghton Mifflin; $28), his remarkable history of the admissions process at Harvard, Yale, and Princeton, that meritocratic spirit soon led to a crisis. The enrollment of Jews began to rise dramatically. By 1922, they made up more than a fifth of Harvard's freshman class. The administration and alumni were up in arms. Jews were thought to be sickly and grasping, grade-grubbing and insular. They displaced the sons of wealthy Wasp alumni, which did not bode well for fund-raising. A. Lawrence Lowell, Harvard's president in the nineteen-twenties, stated flatly that too many Jews would destroy the school: "The summer hotel that is ruined by admitting Jews meets its fate . . . because they drive away the Gentiles, and then after the Gentiles have left, they leave also."
The difficult part, however, was coming up with a way of keeping Jews out, because as a group they were academically superior to everyone else. Lowell's first idea—a quota limiting Jews to fifteen per cent of the student body—was roundly criticized. Lowell tried restricting the number of scholarships given to Jewish students, and made an effort to bring in students from public schools in the West, where there were fewer Jews. Neither strategy worked. Finally, Lowell—and his counterparts at Yale and Princeton—realized that if a definition of merit based on academic prowess was leading to the wrong kind of student, the solution was to change the definition of merit. Karabel argues that it was at this moment that the history and nature of the Ivy League took a significant turn.
The admissions office at Harvard became much more interested in the details of an applicant's personal life. Lowell told his admissions officers to elicit information about the "character" of candidates from "persons who know the applicants well," and so the letter of reference became mandatory. Harvard started asking applicants to provide a photograph. Candidates had to write personal essays, demonstrating their aptitude for leadership, and list their extracurricular activities. "Starting in the fall of 1922," Karabel writes, "applicants were required to answer questions on 'Race and Color,' 'Religious Preference,' 'Maiden Name of Mother,' 'Birthplace of Father,' and 'What change, if any, has been made since birth in your own name or that of your father? (Explain fully).'"
At Princeton, emissaries were sent to the major boarding schools, with instructions to rate potential candidates on a scale of 1 to 4, where 1 was "very desirable and apparently exceptional material from every point of view" and 4 was "undesirable from the point of view of character, and, therefore, to be excluded no matter what the results of the entrance examinations might be." The personal interview became a key component of admissions in order, Karabel writes, "to ensure that 'undesirables' were identified and to assess important but subtle indicators of background and breeding such as speech, dress, deportment and physical appearance." By 1933, the end of Lowell's term, the percentage of Jews at Harvard was back down to fifteen per cent.
If this new admissions system seems familiar, that's because it is essentially the same system that the Ivy League uses to this day. According to Karabel, Harvard, Yale, and Princeton didn't abandon the elevation of character once the Jewish crisis passed. They institutionalized it.
Starting in 1953, Arthur Howe, Jr., spent a decade as the chair of admissions at Yale, and Karabel describes what happened under his guidance:
The admissions committee viewed evidence of "manliness" with particular enthusiasm. One boy gained admission despite an academic prediction of 70 because "there was apparently something manly and distinctive about him that had won over both his alumni and staff interviewers." Another candidate, admitted despite his schoolwork being "mediocre in comparison with many others," was accepted over an applicant with a much better record and higher exam scores because, as Howe put it, "we just thought he was more of a guy." So preoccupied was Yale with the appearance of its students that the form used by alumni interviewers actually had a physical characteristics checklist through 1965. Each year, Yale carefully measured the height of entering freshmen, noting with pride the proportion of the class at six feet or more.
At Harvard, the key figure in that same period was Wilbur Bender, who, as the dean of admissions, had a preference for "the boy with some athletic interests and abilities, the boy with physical vigor and coordination and grace." Bender, Karabel tells us, believed that if Harvard continued to suffer on the football field it would contribute to the school's reputation as a place with "no college spirit, few good fellows, and no vigorous, healthy social life," not to mention a "surfeit of 'pansies,' 'decadent esthetes' and 'precious sophisticates.'" Bender concentrated on improving Harvard's techniques for evaluating "intangibles" and, in particular, its "ability to detect homosexual tendencies and serious psychiatric problems."
By the nineteen-sixties, Harvard's admissions system had evolved into a series of complex algorithms. The school began by lumping all applicants into one of twenty-two dockets, according to their geographical origin. (There was one docket for Exeter and Andover, another for the eight Rocky Mountain states.) Information from interviews, references, and student essays was then used to grade each applicant on a scale of 1 to 6, along four dimensions: personal, academic, extracurricular, and athletic. Competition, critically, was within each docket, not between dockets, so there was no way for, say, the graduates of Bronx Science and Stuyvesant to shut out the graduates of Andover and Exeter. More important, academic achievement was just one of four dimensions, further diluting the value of pure intellectual accomplishment. Athletic ability, rather than falling under "extracurriculars," got a category all to itself, which explains why, even now, recruited athletes have an acceptance rate to the Ivies at well over twice the rate of other students, despite S.A.T. scores that are on average more than a hundred points lower. And the most important category? That mysterious index of "personal" qualities. According to Harvard's own analysis, the personal rating was a better predictor of admission than the academic rating. Those with a rank of 4 or worse on the personal scale had, in the nineteen-sixties, a rejection rate of ninety-eight per cent. Those with a personal rating of 1 had a rejection rate of 2.5 per cent. When the Office of Civil Rights at the federal education department investigated Harvard in the nineteen-eighties, they found handwritten notes scribbled in the margins of various candidates' files. "This young woman could be one of the brightest applicants in the pool but there are several references to shyness," read one. Another comment reads, "Seems a tad frothy." One application—and at this point you can almost hear it going to the bottom of the pile—was notated, "Short with big ears."
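Karabel's description of the docket system boils down to a small ranking procedure: grade every applicant on the four dimensions, then compare applicants only against others in the same geographic docket. Here is a rough sketch of that logic; the docket names, candidates, and ratings are invented for illustration, and only the structure, not the weighting, comes from Karabel's account.

from collections import defaultdict

# Hypothetical applicants: (name, docket, personal, academic, extracurricular, athletic),
# each rated on the 1-to-6 scale described above, where 1 is best.
applicants = [
    ("A", "Exeter/Andover", 2, 1, 3, 2),
    ("B", "Exeter/Andover", 1, 3, 2, 1),
    ("C", "Rocky Mountain states", 3, 2, 2, 4),
    ("D", "Rocky Mountain states", 3, 4, 3, 4),
]

# Group by docket: candidates compete only with others from the same docket,
# so a strong docket can never crowd out a weak one.
dockets = defaultdict(list)
for name, docket, *ratings in applicants:
    dockets[docket].append((sum(ratings), name))  # lower total = stronger candidate

# Admit the top-ranked candidate within each docket.
for docket, pool in dockets.items():
    pool.sort()
    print(docket, "->", pool[0][1])

In this toy run, applicant C is admitted with a worse overall score than the rejected applicant A, because the two never compete against each other; that is precisely the docket property described above.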
3.
Social scientists distinguish between what are known as treatment effects and selection effects. The Marine Corps, for instance, is largely a treatment-effect institution. It doesn't have an enormous admissions office grading applicants along four separate dimensions of toughness and intelligence. It's confident that the experience of undergoing Marine Corps basic training will turn you into a formidable soldier. A modelling agency, by contrast, is a selection-effect institution. You don't become beautiful by signing up with an agency. You get signed up by an agency because you're beautiful.
At the heart of the American obsession with the Ivy League is the belief that schools like Harvard provide the social and intellectual equivalent of Marine Corps basic training—that being taught by all those brilliant professors and meeting all those other motivated students and getting a degree with that powerful name on it will confer advantages that no local state university can provide. Fuelling the treatment-effect idea are studies showing that if you take two students with the same S.A.T. scores and grades, one of whom goes to a school like Harvard and one of whom goes to a less selective college, the Ivy Leaguer will make far more money ten or twenty years down the road.
The extraordinary emphasis the Ivy League places on admissions policies, though, makes it seem more like a modeling agency than like the Marine Corps, and, sure enough, the studies based on those two apparently equivalent students turn out to be flawed. How do we know that two students who have the same S.A.T. scores and grades really are equivalent? It's quite possible that the student who goes to Harvard is more ambitious and energetic and personable than the student who wasn't let in, and that those same intangibles are what account for his better career success. To assess the effect of the Ivies, it makes more sense to compare the student who got into a top school with the student who got into that same school but chose to go to a less selective one. Three years ago, the economists Alan Krueger and Stacy Dale published just such a study. And they found that when you compare apples and apples the income bonus from selective schools disappears.
"As a hypothetical example, take the University of Pennsylvania and Penn State, which are two schools a lot of students choose between," Krueger said. "One is Ivy, one is a state school. Penn is much more highly selective. If you compare the students who go to those two schools, the ones who go to Penn have higher incomes. But let's look at those who got into both types of schools, some of whom chose Penn and some of whom chose Penn State. Within that set it doesn't seem to matter whether you go to the more selective school. Now, you would think that the more ambitious student is the one who would choose to go to Penn, and the ones choosing to go to Penn State might be a little less confident in their abilities or have a little lower family income, and both of those factors would point to people doing worse later on. But they don't."
Krueger says that there is one exception to this. Students from the very lowest economic strata do seem to benefit from going to an Ivy. For most students, though, the general rule seems to be that if you are a hardworking and intelligent person you'll end up doing well regardless of where you went to school. You'll make good contacts at Penn. But Penn State is big enough and diverse enough that you can make good contacts there, too. Having Penn on your résumé opens doors. But if you were good enough to get into Penn you're good enough that those doors will open for you anyway. "I can see why families are really concerned about this," Krueger went on. "The average graduate from a top school is making nearly a hundred and twenty thousand dollars a year, the average graduate from a moderately selective school is making ninety thousand dollars. That's an enormous difference, and I can see why parents would fight to get their kids into the better school. But I think they are just assigning to the school a lot of what the student is bringing with him to the school."
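The Krueger-Dale design is easy to see in a toy simulation. In the sketch below, all numbers are invented for illustration: an unobserved trait such as drive raises both the odds of getting into the selective school and later earnings, while attending the school adds nothing. A naive comparison shows a big earnings premium for the selective school; restricting the comparison to students who were admitted to it, some of whom enrolled elsewhere, makes the premium disappear.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Unobserved "drive" affects both admission and later income; test scores are observed.
drive = rng.normal(0, 1, n)
scores = rng.normal(0, 1, n)

# Hypothetical admission rule for the selective school: it weights drive heavily.
admitted = (0.8 * drive + 0.4 * scores + rng.normal(0, 1, n)) > 1.0

# Among the admitted, roughly half choose to enroll; the rest go to the state school.
enrolled = admitted & (rng.random(n) < 0.5)

# Income depends on drive and scores, and NOT on which school was attended.
income = 60_000 + 15_000 * drive + 5_000 * scores + rng.normal(0, 10_000, n)

# Naive comparison: selective-school students appear to earn far more.
print(round(income[enrolled].mean() - income[~enrolled].mean()))

# Apples to apples: among the admitted, enrolling elsewhere makes no difference.
print(round(income[admitted & enrolled].mean() - income[admitted & ~enrolled].mean()))

Matching on admission removes the hidden trait from the comparison, which is why the within-admitted gap collapses toward zero while the naive gap stays large.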
Bender was succeeded as the dean of admissions at Harvard by Fred Glimp, who, Karabel tells us, had a particular concern with academic underperformers. "Any class, no matter how able, will always have a bottom quarter," Glimp once wrote. "What are the effects of the psychology of feeling average, even in a very able group? Are there identifiable types with the psychological or what-not tolerance to be 'happy' or to make the most of education while in the bottom quarter?" Glimp thought it was critical that the students who populated the lower rungs of every Harvard class weren't so driven and ambitious that they would be disturbed by their status. "Thus the renowned (some would say notorious) Harvard admission practice known as the 'happy-bottom-quarter' policy was born," Karabel writes.
It's unclear whether or not Glimp found any students who fit that particular description. (He wondered, in a marvellously honest moment, whether the answer was "Harvard sons.") But Glimp had the realism of the modelling scout. Glimp believed implicitly what Krueger and Dale later confirmed: that the character and performance of an academic class is determined, to a significant extent, at the point of admission; that if you want to graduate winners you have to admit winners; that if you want the bottom quarter of your class to succeed you have to find people capable of succeeding in the bottom quarter. Karabel is quite right, then, to see the events of the nineteen-twenties as the defining moment of the modern Ivy League. You are whom you admit in the élite-education business, and when Harvard changed whom it admitted, it changed Harvard. Was that change for the better or for the worse?
4.
In the wake of the Jewish crisis, Harvard, Yale, and Princeton chose to adopt what might be called the "best graduates" approach to admissions. France's École Normale Supérieure, Japan's University of Tokyo, and most of the world's other élite schools define their task as looking for the best students—that is, the applicants who will have the greatest academic success during their time in college. The Ivy League schools justified their emphasis on character and personality, however, by arguing that they were searching for the students who would have the greatest success after college. They were looking for leaders, and leadership, the officials of the Ivy League believed, was not a simple matter of academic brilliance. "Should our goal be to select a student body with the highest possible proportions of high-ranking students, or should it be to select, within a reasonably high range of academic ability, a student body with a certain variety of talents, qualities, attitudes, and backgrounds?" Wilbur Bender asked. To him, the answer was obvious. If you let in only the brilliant, then you produced bookworms and bench scientists: you ended up as socially irrelevant as the University of Chicago (an institution Harvard officials looked upon and shuddered). "Above a reasonably good level of mental ability, above that indicated by a 550-600 level of S.A.T. score," Bender went on, "the only thing that matters in terms of future impact on, or contribution to, society is the degree of personal inner force an individual has."
It's easy to find fault with the best-graduates approach. We tend to think that intellectual achievement is the fairest and highest standard of merit. The Ivy League process, quite apart from its dubious origins, seems subjective and opaque. Why should personality and athletic ability matter so much? The notion that "the ability to throw, kick, or hit a ball is a legitimate criterion in determining who should be admitted to our greatest research universities," Karabel writes, is "a proposition that would be considered laughable in most of the world's countries." At the same time that Harvard was constructing its byzantine admissions system, Hunter College Elementary School, in New York, required simply that applicants take an exam, and if they scored in the top fifty they got in. It's hard to imagine a more objective and transparent procedure.
But what did Hunter achieve with that best-students model? In the nineteen-eighties, a handful of educational researchers surveyed the students who attended the elementary school between 1948 and 1960. This was a group with an average I.Q. of 157—three and a half standard deviations above the mean—who had been given what, by any measure, was one of the finest classroom experiences in the world. As graduates, though, they weren't nearly as distinguished as they were expected to be. "Although most of our study participants are successful and fairly content with their lives and accomplishments," the authors conclude, "there are no superstars . . . and only one or two familiar names." The researchers spend a great deal of time trying to figure out why Hunter graduates are so disappointing, and end up sounding very much like Wilbur Bender. Being a smart child isn't a terribly good predictor of success in later life, they conclude. "Non-intellective" factors—like motivation and social skills—probably matter more. Perhaps, the study suggests, "after noting the sacrifices involved in trying for national or world-class leadership in a field, H.C.E.S. graduates decided that the intelligent thing to do was to choose relatively happy and successful lives." It is a wonderful thing, of course, for a school to turn out lots of relatively happy and successful graduates. But Harvard didn't want lots of relatively happy and successful graduates. It wanted superstars, and Bender and his colleagues recognized that if this is your goal a best-students model isn't enough.
Most élite law schools, to cite another example, follow a best-students model. That's why they rely so heavily on the L.S.A.T. Yet there's no reason to believe that a person's L.S.A.T. scores have much relation to how good a lawyer he will be. In a recent research project funded by the Law School Admission Council, the Berkeley researchers Sheldon Zedeck and Marjorie Shultz identified twenty-six "competencies" that they think effective lawyering demands—among them practical judgment, passion and engagement, legal-research skills, questioning and interviewing skills, negotiation skills, stress management, and so on—and the L.S.A.T. picks up only a handful of them. A law school that wants to select the best possible lawyers has to use a very different admissions process from a law school that wants to select the best possible law students. And wouldn't we prefer that at least some law schools try to select good lawyers instead of good law students?
This search for good lawyers, furthermore, is necessarily going to be subjective, because things like passion and engagement can't be measured as precisely as academic proficiency. Subjectivity in the admissions process is not just an occasion for discrimination; it is also, in better times, the only means available for giving us the social outcome we want. The first black captain of the Yale football team was a man named Levi Jackson, who graduated in 1950. Jackson was a hugely popular figure on campus. He went on to be a top executive at Ford, and is credited with persuading the company to hire thousands of African-Americans after the 1967 riots. When Jackson was tapped for the exclusive secret society Skull and Bones, he joked, "If my name had been reversed, I never would have made it." He had a point. The strategy of discretion that Yale had once used to exclude Jews was soon being used to include people like Levi Jackson.
In the 2001 book "The Game of Life," James L. Shulman and William Bowen (a former president of Princeton) conducted an enormous statistical analysis on an issue that has become one of the most contentious in admissions: the special preferences given to recruited athletes at selective universities. Athletes, Shulman and Bowen demonstrate, have a large and growing advantage in admission over everyone else. At the same time, they have markedly lower G.P.A.s and S.A.T. scores than their peers. Over the past twenty years, their class rankings have steadily dropped, and they tend to segregate themselves in an "athletic culture" different from the culture of the rest of the college. Shulman and Bowen think the preference given to athletes by the Ivy League is shameful.
Halfway through the book, however, Shulman and Bowen present a surprising finding. Male athletes, despite their lower S.A.T. scores and grades, and despite the fact that many of them are members of minorities and come from lower socioeconomic backgrounds than other students, turn out to earn a lot more than their peers. Apparently, athletes are far more likely to go into the high-paying financial-services sector, where they succeed because of their personality and psychological makeup. In what can only be described as a textbook example of burying the lead, Bowen and Shulman write:
One of these characteristics can be thought of as drive—a strong desire to succeed and unswerving determination to reach a goal, whether it be winning the next game or closing a sale. Similarly, athletes tend to be more energetic than the average person, which translates into an ability to work hard over long periods of time—to meet, for example, the workload demands placed on young people by an investment bank in the throes of analyzing a transaction. In addition, athletes are more likely than others to be highly competitive, gregarious and confident of their ability to work well in groups (on teams).
Shulman and Bowen would like to argue that the attitudes of selective colleges toward athletes are a perversion of the ideals of American élite education, but that's because they misrepresent the actual ideals of American élite education. The Ivy League is perfectly happy to accept, among others, the kind of student who makes a lot of money after graduation. As the old saying goes, the definition of a well-rounded Yale graduate is someone who can roll all the way from New Haven to Wall Street.
5.
I once had a conversation with someone who worked for an advertising agency that represented one of the big luxury automobile brands. He said that he was worried that his client's new lower-priced line was being bought disproportionately by black women. He insisted that he did not mean this in a racist way. It was just a fact, he said. Black women would destroy the brand's cachet. It was his job to protect his client from the attentions of the socially undesirable.
This is, in no small part, what Ivy League admissions directors do. They are in the luxury-brand-management business, and "The Chosen," in the end, is a testament to just how well the brand managers in Cambridge, New Haven, and Princeton have done their job in the past seventy-five years. In the nineteen-twenties, when Harvard tried to figure out how many Jews it had on campus, the admissions office scoured student records and assigned each suspected Jew the designation j1 (for someone who was "conclusively Jewish"), j2 (where the "preponderance of evidence" pointed to Jewishness), or j3 (where Jewishness was a "possibility"). In the branding world, this is called customer segmentation. In the Second World War, as Yale faced plummeting enrollment and revenues, it continued to turn down qualified Jewish applicants. As Karabel writes, "In the language of sociology, Yale judged its symbolic capital to be even more precious than its economic capital." No good brand manager would sacrifice reputation for short-term gain.

The admissions directors at Harvard have always, similarly, been diligent about rewarding the children of graduates, or, as they are quaintly called, "legacies." In the 1985-92 period, for instance, Harvard admitted children of alumni at a rate more than twice that of non-athlete, non-legacy applicants, despite the fact that, on virtually every one of the school's magical ratings scales, legacies significantly lagged behind their peers. Karabel calls the practice "unmeritocratic at best and profoundly corrupt at worst," but rewarding customer loyalty is what luxury brands do. Harvard wants good graduates, and part of its definition of a good graduate is someone who is a generous and loyal alumnus. And if you want generous and loyal alumni you have to reward them. Aren't the tremendous resources provided to Harvard by its alumni part of the reason so many people want to go to Harvard in the first place?

The endless battle over admissions in the United States proceeds on the assumption that some great moral principle is at stake in the matter of whom schools like Harvard choose to let in—that those who are denied admission by the whims of the admissions office have somehow been harmed. If you are sick and a hospital shuts its doors to you, you are harmed. But a selective school is not a hospital, and those it turns away are not sick. Élite schools, like any luxury brand, are an aesthetic experience—an exquisitely constructed fantasy of what it means to belong to an élite—and they have always been mindful of what must be done to maintain that experience.
In the nineteen-eighties, when Harvard was accused of enforcing a secret quota on Asian admissions, its defense was that once you adjusted for the preferences given to the children of alumni and for the preferences given to athletes, Asians really weren't being discriminated against. But you could sense Harvard's exasperation that the issue was being raised at all. If Harvard had too many Asians, it wouldn't be Harvard, just as Harvard wouldn't be Harvard with too many Jews or pansies or parlor pinks or shy types or short people with big ears.
Million-Dollar Murray
February 13, 2006
Dept. of Social Services
Why problems like homelessness may be easier to solve than to manage.
1.
Murray Barr was a bear of a man, an ex-marine, six feet tall and heavyset, and when he fell down—which he did nearly every day—it could take two or three grown men to pick him up. He had straight black hair and olive skin. On the street, they called him Smokey. He was missing most of his teeth. He had a wonderful smile. People loved Murray.
His chosen drink was vodka. Beer he called "horse piss." On the streets of downtown Reno, where he lived, he could buy a two-hundred-and-fifty-millilitre bottle of cheap vodka for a dollar-fifty. If he was flush, he could go for the seven-hundred-and-fifty-millilitre bottle, and if he was broke he could always do what many of the other homeless people of Reno did, which is to walk through the casinos and finish off the half-empty glasses of liquor left at the gaming tables.
"If he was on a runner, we could pick him up several times a day," Patrick O'Bryan, who is a bicycle cop in downtown Reno, said. "And he's gone on some amazing runners. He would get picked up, get detoxed, then get back out a couple of hours later and start up again. A lot of the guys on the streets who've been drinking, they get so angry. They are so incredibly abrasive, so violent, so abusive. Murray was such a character and had such a great sense of humor that we somehow got past that. Even when he was abusive, we'd say, 'Murray, you know you love us,' and he'd say, 'I know—and go back to swearing at us."
"I've been a police officer for fifteen years," O'Bryan's partner, Steve Johns, said. "I picked up Murray my whole career. Literally."
Johns and O'Bryan pleaded with Murray to quit drinking. A few years ago, he was assigned to a treatment program in which he was under the equivalent of house arrest, and he thrived. He got a job and worked hard. But then the program ended. "Once he graduated out, he had no one to report to, and he needed that," O'Bryan said. "I don't know whether it was his military background. I suspect that it was. He was a good cook. One time, he accumulated savings of over six thousand dollars. Showed up for work religiously. Did everything he was supposed to do. They said, 'Congratulations,' and put him back on the street. He spent that six thousand in a week or so."
Often, he was too intoxicated for the drunk tank at the jail, and he'd get sent to the emergency room at either Saint Mary's or Washoe Medical Center. Marla Johns, who was a social worker in the emergency room at Saint Mary's, saw him several times a week. "The ambulance would bring him in. We would sober him up, so he would be sober enough to go to jail. And we would call the police to pick him up. In fact, that's how I met my husband." Marla Johns is married to Steve Johns.
"He was like the one constant in an environment that was ever changing," she went on. "In he would come. He would grin that half-toothless grin. He called me 'my angel.' I would walk in the room, and he would smile and say, 'Oh, my angel, I'm so happy to see you.' We would joke back and forth, and I would beg him to quit drinking and he would laugh it off. And when time went by and he didn't come in I would get worried and call the coroner's office. When he was sober, we would find out, oh, he's working someplace, and my husband and I would go and have dinner where he was working. When my husband and I were dating, and we were going to get married, he said, 'Can I come to the wedding?' And I almost felt like he should. My joke was 'If you are sober you can come, because I can't afford your bar bill.' When we started a family, he would lay a hand on my pregnant belly and bless the child. He really was this kind of light."
In the fall of 2003, the Reno Police Department started an initiative designed to limit panhandling in the downtown core. There were articles in the newspapers, and the police department came under harsh criticism on local talk radio. The crackdown on panhandling amounted to harassment, the critics said. The homeless weren't an imposition on the city; they were just trying to get by. "One morning, I'm listening to one of the talk shows, and they're just trashing the police department and going on about how unfair it is," O'Bryan said. "And I thought, Wow, I've never seen any of these critics in one of the alleyways in the middle of the winter looking for bodies." O'Bryan was angry. In downtown Reno, food for the homeless was plentiful: there was a Gospel kitchen and Catholic Services, and even the local McDonald's fed the hungry. The panhandling was for liquor, and the liquor was anything but harmless. He and Johns spent at least half their time dealing with people like Murray; they were as much caseworkers as police officers. And they knew they weren't the only ones involved. When someone passed out on the street, there was a "One down" call to the paramedics. There were four people in an ambulance, and the patient sometimes stayed at the hospital for days, because living on the streets in a state of almost constant intoxication was a reliable way of getting sick. None of that, surely, could be cheap.
O'Bryan and Johns called someone they knew at an ambulance service and then contacted the local hospitals. "We came up with three names that were some of our chronic inebriates in the downtown area, that got arrested the most often," O'Bryan said. "We tracked those three individuals through just one of our two hospitals. One of the guys had been in jail previously, so he'd only been on the streets for six months. In those six months, he had accumulated a bill of a hundred thousand dollars—and that's at the smaller of the two hospitals near downtown Reno. It's pretty reasonable to assume that the other hospital had an even larger bill. Another individual came from Portland and had been in Reno for three months. In those three months, he had accumulated a bill for sixty-five thousand dollars. The third individual actually had some periods of being sober, and had accumulated a bill of fifty thousand."
The first of those people was Murray Barr, and Johns and O'Bryan realized that if you totted up all his hospital bills for the ten years that he had been on the streets—as well as substance-abuse-treatment costs, doctors' fees, and other expenses—Murray Barr probably ran up a medical bill as large as anyone in the state of Nevada.
"It cost us one million dollars not to do something about Murray," O'Bryan said.
2.
Fifteen years ago, after the Rodney King beating, the Los Angeles Police Department was in crisis. It was accused of racial insensitivity and ill discipline and violence, and the assumption was that those problems had spread broadly throughout the rank and file. In the language of statisticians, it was thought that L.A.P.D.'s troubles had a "normal" distribution—that if you graphed them the result would look like a bell curve, with a small number of officers at one end of the curve, a small number at the other end, and the bulk of the problem situated in the middle. The bell-curve assumption has become so much a part of our mental architecture that we tend to use it to organize experience automatically.
But when the L.A.P.D. was investigated by a special commission headed by Warren Christopher, a very different picture emerged. Between 1986 and 1990, allegations of excessive force or improper tactics were made against eighteen hundred of the eighty-five hundred officers in the L.A.P.D. The broad middle had scarcely been accused of anything. Furthermore, more than fourteen hundred officers had only one or two allegations made against them—and bear in mind that these were not proven charges, that they happened in a four-year period, and that allegations of excessive force are an inevitable feature of urban police work. (The N.Y.P.D. receives about three thousand such complaints a year.) A hundred and eighty-three officers, however, had four or more complaints against them, forty-four officers had six or more complaints, sixteen had eight or more, and one had sixteen complaints. If you were to graph the troubles of the L.A.P.D., it wouldn't look like a bell curve. It would look more like a hockey stick. It would follow what statisticians call a "power law" distribution—where all the activity is not in the middle but at one extreme.
The Christopher Commission's report repeatedly comes back to what it describes as the extreme concentration of problematic officers. One officer had been the subject of thirteen allegations of excessive use of force, five other complaints, twenty-eight "use of force reports" (that is, documented, internal accounts of inappropriate behavior), and one shooting. Another had six excessive-force complaints, nineteen other complaints, ten use-of-force reports, and three shootings. A third had twenty-seven use-of-force reports, and a fourth had thirty-five. Another had a file full of complaints for doing things like "striking an arrestee on the back of the neck with the butt of a shotgun for no apparent reason while the arrestee was kneeling and handcuffed," beating up a thirteen-year-old juvenile, and throwing an arrestee from his chair and kicking him in the back and side of the head while he was handcuffed and lying on his stomach.
The report gives the strong impression that if you fired those forty-four cops the L.A.P.D. would suddenly become a pretty well-functioning police department. But the report also suggests that the problem is tougher than it seems, because those forty-four bad cops were so bad that the institutional mechanisms in place to get rid of bad apples clearly weren't working. If you made the mistake of assuming that the department's troubles fell into a normal distribution, you'd propose solutions that would raise the performance of the middle—like better training or better hiring—when the middle didn't need help. For those hard-core few who did need help, meanwhile, the medicine that helped the middle wouldn't be nearly strong enough.
In the nineteen-eighties, when homelessness first surfaced as a national issue, the assumption was that the problem fit a normal distribution: that the vast majority of the homeless were in the same state of semi-permanent distress. It was an assumption that bred despair: if there were so many homeless, with so many problems, what could be done to help them? Then, fifteen years ago, a young Boston College graduate student named Dennis Culhane lived in a shelter in Philadelphia for seven weeks as part of the research for his dissertation. A few months later he went back, and was surprised to discover that he couldn't find any of the people he had recently spent so much time with. "It made me realize that most of these people were getting on with their own lives," he said.
Culhane then put together a database—the first of its kind—to track who was coming in and out of the shelter system. What he discovered profoundly changed the way homelessness is understood. Homelessness doesn't have a normal distribution, it turned out. It has a power-law distribution. "We found that eighty per cent of the homeless were in and out really quickly," he said. "In Philadelphia, the most common length of time that someone is homeless is one day. And the second most common length is two days. And they never come back. Anyone who ever has to stay in a shelter involuntarily knows that all you think about is how to make sure you never come back."
The next ten per cent were what Culhane calls episodic users. They would come for three weeks at a time, and return periodically, particularly in the winter. They were quite young, and they were often heavy drug users. It was the last ten per cent—the group at the farthest edge of the curve—that interested Culhane the most. They were the chronically homeless, who lived in the shelters, sometimes for years at a time. They were older. Many were mentally ill or physically disabled, and when we think about homelessness as a social problem—the people sleeping on the sidewalk, aggressively panhandling, lying drunk in doorways, huddled on subway grates and under bridges—it's this group that we have in mind. In the early nineteen-nineties, Culhane's database suggested that New York City had a quarter of a million people who were homeless at some point in the previous half decade—which was a surprisingly high number. But only about twenty-five hundred were chronically homeless.
It turns out, furthermore, that this group costs the health-care and social-services systems far more than anyone had ever anticipated. Culhane estimates that in New York at least sixty-two million dollars was being spent annually to shelter just those twenty-five hundred hard-core homeless. "It costs twenty-four thousand dollars a year for one of these shelter beds," Culhane said. "We're talking about a cot eighteen inches away from the next cot." Boston Health Care for the Homeless Program, a leading service group for the homeless in Boston, recently tracked the medical expenses of a hundred and nineteen chronically homeless people. In the course of five years, thirty-three people died and seven more were sent to nursing homes, and the group still accounted for 18,834 emergency-room visits—at a minimum cost of a thousand dollars a visit. The University of California, San Diego Medical Center followed fifteen chronically homeless inebriates and found that over eighteen months those fifteen people were treated at the hospital's emergency room four hundred and seventeen times, and ran up bills that averaged a hundred thousand dollars each. One person—San Diego's counterpart to Murray Barr—came to the emergency room eighty-seven times.
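The figures quoted above hang together arithmetically, and a quick back-of-the-envelope check, using only numbers from the preceding paragraphs, shows how lopsided the spending is.

# Rough consistency check of the numbers quoted above.
chronic_share = 2_500 / 250_000
print(chronic_share)             # the chronically homeless were about 1% of everyone Culhane tracked in New York

per_person = 62_000_000 / 2_500
print(per_person)                # roughly $24,800 a year, in line with the $24,000-per-bed figure

boston_er_minimum = 18_834 * 1_000
print(boston_er_minimum)         # at least $18.8 million in ER charges for 119 people over five years

san_diego_bills = 15 * 100_000
print(san_diego_bills)           # about $1.5 million in hospital bills for fifteen people over eighteen months

Even at the stated minimums, the chronic one per cent accounts for tens of millions of dollars a year, which is the arithmetic behind Culhane's argument.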
"If it's a medical admission, it's likely to be the guys with the really complex pneumonia," James Dunford, the city of San Diego's emergency medical director and the author of the observational study, said. "They are drunk and they aspirate and get vomit in their lungs and develop a lung abscess, and they get hypothermia on top of that, because they're out in the rain. They end up in the intensive-care unit with these very complicated medical infections. These are the guys who typically get hit by cars and buses and trucks. They often have a neurosurgical catastrophe as well. So they are very prone to just falling down and cracking their head and getting a subdural hematoma, which, if not drained, could kill them, and it's the guy who falls down and hits his head who ends up costing you at least fifty thousand dollars. Meanwhile, they are going through alcoholic withdrawal and have devastating liver disease that only adds to their inability to fight infections. There is no end to the issues. We do this huge drill. We run up big lab fees, and the nurses want to quit, because they see the same guys come in over and over, and all we're doing is making them capable of walking down the block."
The homelessness problem is like the L.A.P.D.'s bad-cop problem. It's a matter of a few hard cases, and that's good news, because when a problem is that concentrated you can wrap your arms around it and think about solving it. The bad news is that those few hard cases are hard. They are falling-down drunks with liver disease and complex infections and mental illness. They need time and attention and lots of money. But enormous sums of money are already being spent on the chronically homeless, and Culhane saw that the kind of money it would take to solve the homeless problem could well be less than the kind of money it took to ignore it. Murray Barr used more health-care dollars, after all, than almost anyone in the state of Nevada. It would probably have been cheaper to give him a full-time nurse and his own apartment.
The leading exponent of the power-law theory of homelessness is Philip Mangano, who, since he was appointed by President Bush in 2002, has been the executive director of the U.S. Interagency Council on Homelessness, a group that oversees the programs of twenty federal agencies. Mangano is a slender man, with a mane of white hair and a magnetic presence, who got his start as an advocate for the homeless in Massachusetts. In the past two years, he has crisscrossed the United States, educating local mayors and city councils about the real shape of the homelessness curve. Simply running soup kitchens and shelters, he argues, allows the chronically homeless to remain chronically homeless. You build a shelter and a soup kitchen if you think that homelessness is a problem with a broad and unmanageable middle. But if it's a problem at the fringe it can be solved. So far, Mangano has convinced more than two hundred cities to radically reëvaluate their policy for dealing with the homeless.
"I was in St. Louis recently," Mangano said, back in June, when he dropped by New York on his way to Boise, Idaho. "I spoke with people doing services there. They had a very difficult group of people they couldn't reach no matter what they offered. So I said, Take some of your money and rent some apartments and go out to those people, and literally go out there with the key and say to them, 'This is the key to an apartment. If you come with me right now I am going to give it to you, and you are going to have that apartment.' And so they did. And one by one those people were coming in. Our intent is to take homeless policy from the old idea of funding programs that serve homeless people endlessly and invest in results that actually end homelessness."
Mangano is a history buff, a man who sometimes falls asleep listening to old Malcolm X speeches, and who peppers his remarks with references to the civil-rights movement and the Berlin Wall and, most of all, the fight against slavery. "I am an abolitionist," he says. "My office in Boston was opposite the monument to the 54th Regiment on the Boston Common, up the street from the Park Street Church, where William Lloyd Garrison called for immediate abolition, and around the corner from where Frederick Douglass gave that famous speech at the Tremont Temple. It is very much ingrained in me that you do not manage a social wrong. You should be ending it."
3.
The old Y.M.C.A. in downtown Denver is on Sixteenth Street, just east of the central business district. The main building is a handsome six-story stone structure that was erected in 1906, and next door is an annex that was added in the nineteen-fifties. On the ground floor there is a gym and exercise rooms. On the upper floors there are several hundred apartments—brightly painted one-bedrooms, efficiencies, and S.R.O.-style rooms with microwaves and refrigerators and central air-conditioning—and for the past several years those apartments have been owned and managed by the Colorado Coalition for the Homeless.
Even by big-city standards, Denver has a serious homelessness problem. The winters are relatively mild, and the summers aren't nearly as hot as those of neighboring New Mexico or Utah, which has made the city a magnet for the indigent. By the city's estimates, it has roughly a thousand chronically homeless people, of whom three hundred spend their time downtown, along the central Sixteenth Street shopping corridor or in nearby Civic Center Park. Many of the merchants downtown worry that the presence of the homeless is scaring away customers. A few blocks north, near the hospital, a modest, low-slung detox center handles twenty-eight thousand admissions a year, many of them homeless people who have passed out on the streets, either from liquor or—as is increasingly the case—from mouthwash. "Dr. Tichenor—Dr. Tich, they call it—is the brand of mouthwash they use," says Roxane White, the manager of the city's social services. "You can imagine what that does to your gut."
Eighteen months ago, the city signed up with Mangano. With a mixture of federal and local funds, the C.C.H. inaugurated a new program that has so far enrolled a hundred and six people. It is aimed at the Murray Barrs of Denver, the people costing the system the most. C.C.H. went after the people who had been on the streets the longest, who had a criminal record, who had a problem with substance abuse or mental illness. "We have one individual in her early sixties, but looking at her you'd think she's eighty," Rachel Post, the director of substance treatment at the C.C.H., said. (Post changed some details about her clients in order to protect their identity.) "She's a chronic alcoholic. A typical day for her is she gets up and tries to find whatever she's going to drink that day. She falls down a lot. There's another person who came in during the first week. He was on methadone maintenance. He'd had psychiatric treatment. He was incarcerated for eleven years, and lived on the streets for three years after that, and, if that's not enough, he had a hole in his heart."
The recruitment strategy was as simple as the one that Mangano had laid out in St. Louis: Would you like a free apartment? The enrollees got either an efficiency at the Y.M.C.A. or an apartment rented for them in a building somewhere else in the city, provided they agreed to work within the rules of the program. In the basement of the Y, where the racquetball courts used to be, the coalition built a command center, staffed with ten caseworkers. Five days a week, between eight-thirty and ten in the morning, the caseworkers meet and painstakingly review the status of everyone in the program. On the wall around the conference table are several large white boards, with lists of doctor's appointments and court dates and medication schedules. "We need a staffing ratio of one to ten to make it work," Post said. "You go out there and you find people and assess how they're doing in their residence. Sometimes we're in contact with someone every day. Ideally, we want to be in contact every couple of days. We've got about fifteen people we're really worried about now."
The cost of services comes to about ten thousand dollars per homeless client per year. An efficiency apartment in Denver averages $376 a month, or just over forty-five hundred a year, which means that you can house and care for a chronically homeless person for at most fifteen thousand dollars, or about a third of what he or she would cost on the street. The idea is that once the people in the program get stabilized they will find jobs, and start to pick up more and more of their own rent, which would bring someone's annual cost to the program closer to six thousand dollars. As of today, seventy-five supportive housing slots have already been added, and the city's homeless plan calls for eight hundred more over the next ten years.
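For readers who want to check the arithmetic, here is a minimal sketch using only the figures quoted above; the street-side comparison simply restates the article's own "about a third" framing, not an independent estimate:

```python
# A minimal sketch of the Denver cost comparison, using only figures quoted above.
services_per_year = 10_000            # caseworker and support services per client
rent_per_year = 376 * 12              # efficiency apartment at $376 a month
housed_cost = services_per_year + rent_per_year

print(f"Annual cost, housed and supported: ${housed_cost:,}")   # $14,512, i.e. under $15,000
# The article's framing: roughly a third of what the same person costs on the street,
# and well below the $24,000 a year New York spends on a single shelter cot.
```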
The reality, of course, is hardly that neat and tidy. The idea that the very sickest and most troubled of the homeless can be stabilized and eventually employed is only a hope. Some of them plainly won't be able to get there: these are, after all, hard cases. "We've got one man, he's in his twenties," Post said. "Already, he has cirrhosis of the liver. One time he blew a blood alcohol of .49, which is enough to kill most people. The first place we had he brought over all his friends, and they partied and trashed the place and broke a window. Then we gave him another apartment, and he did the same thing."
Post said that the man had been sober for several months. But he could relapse at some point and perhaps trash another apartment, and they'd have to figure out what to do with him next. Post had just been on a conference call with some people in New York City who run a similar program, and they talked about whether giving clients so many chances simply encourages them to behave irresponsibly. For some people, it probably does. But what was the alternative? If this young man was put back on the streets, he would cost the system even more money. The current philosophy of welfare holds that government assistance should be temporary and conditional, to avoid creating dependency. But someone who blows .49 on a Breathalyzer and has cirrhosis of the liver at the age of twenty-seven doesn't respond to incentives and sanctions in the usual way. "The most complicated people to work with are those who have been homeless for so long that going back to the streets just isn't scary to them," Post said. "The summer comes along and they say, 'I don't need to follow your rules.' " Power-law homelessness policy has to do the opposite of normal-distribution social policy. It should create dependency: you want people who have been outside the system to come inside and rebuild their lives under the supervision of those ten caseworkers in the basement of the Y.M.C.A.
That is what is so perplexing about power-law homeless policy. From an economic perspective the approach makes perfect sense. But from a moral perspective it doesn't seem fair. Thousands of people in the Denver area no doubt live day to day, work two or three jobs, and are eminently deserving of a helping hand—and no one offers them the key to a new apartment. Yet that's just what the guy screaming obscenities and swigging Dr. Tich gets. When the welfare mom's time on public assistance runs out, we cut her off. Yet when the homeless man trashes his apartment we give him another. Social benefits are supposed to have some kind of moral justification. We give them to widows and disabled veterans and poor mothers with small children. Giving the homeless guy passed out on the sidewalk an apartment has a different rationale. It's simply about efficiency.
We also believe that the distribution of social benefits should not be arbitrary. We don't give only to some poor mothers, or to a random handful of disabled veterans. We give to everyone who meets a formal criterion, and the moral credibility of government assistance derives, in part, from this universality. But the Denver homelessness program doesn't help every chronically homeless person in Denver. There is a waiting list of six hundred for the supportive-housing program; it will be years before all those people get apartments, and some may never get one. There isn't enough money to go around, and to try to help everyone a little bit—to observe the principle of universality—isn't as cost-effective as helping a few people a lot. Being fair, in this case, means providing shelters and soup kitchens, and shelters and soup kitchens don't solve the problem of homelessness. Our usual moral intuitions are little use, then, when it comes to a few hard cases. Power-law problems leave us with an unpleasant choice. We can be true to our principles or we can fix the problem. We cannot do both.
4.
A few miles northwest of the old Y.M.C.A. in downtown Denver, on the Speer Boulevard off-ramp from I-25, there is a big electronic sign by the side of the road, connected to a device that remotely measures the emissions of the vehicles driving past. When a car with properly functioning pollution-control equipment passes, the sign flashes "Good." When a car passes that is well over the acceptable limits, the sign flashes "Poor." If you stand at the Speer Boulevard exit and watch the sign for any length of time, you'll find that virtually every car scores "Good." An Audi A4 —"Good." A Buick Century—"Good." A Toyota Corolla—"Good." A Ford Taurus—"Good." A Saab 9-5—"Good," and on and on, until after twenty minutes or so, some beat-up old Ford Escort or tricked-out Porsche drives by and the sign flashes "Poor." The picture of the smog problem you get from watching the Speer Boulevard sign and the picture of the homelessness problem you get from listening in on the morning staff meetings at the Y.M.C.A. are pretty much the same. Auto emissions follow a power-law distribution, and the air-pollution example offers another look at why we struggle so much with problems centered on a few hard cases.
Most cars, especially new ones, are extraordinarily clean. A 2004 Subaru in good working order has an exhaust stream that's just .06 per cent carbon monoxide, which is negligible. But on almost any highway, for whatever reason—age, ill repair, deliberate tampering by the owner—a small number of cars can have carbon-monoxide levels in excess of ten per cent, which is almost two hundred times higher. In Denver, five per cent of the vehicles on the road produce fifty-five per cent of the automobile pollution.
"Let's say a car is fifteen years old," Donald Stedman says. Stedman is a chemist and automobile-emissions specialist at the University of Denver. His laboratory put up the sign on Speer Avenue. "Obviously, the older a car is the more likely it is to become broken. It's the same as human beings. And by broken we mean any number of mechanical malfunctions—the computer's not working anymore, fuel injection is stuck open, the catalyst 's not unusual that these failure modes result in high emissions. We have at least one car in our database which was emitting seventy grams of hydrocarbon per mile, which means that you could almost drive a Honda Civic on the exhaust fumes from that car. It's not just old cars. It's new cars with high mileage, like taxis. One of the most successful and least publicized control measures was done by a district attorney in L.A. back in the nineties. He went to LAX and discovered that all of the Bell Cabs were gross emitters. One of those cabs emitted more than its own weight of pollution every year."
In Stedman's view, the current system of smog checks makes little sense. A million motorists in Denver have to go to an emissions center every year—take time from work, wait in line, pay fifteen or twenty-five dollars—for a test that more than ninety per cent of them don't need. "Not everybody gets tested for breast cancer," Stedman says. "Not everybody takes an AIDS test." On-site smog checks, furthermore, do a pretty bad job of finding and fixing the few outliers. Car enthusiasts—with high-powered, high-polluting sports cars—have been known to drop a clean engine into their car on the day they get it tested. Others register their car in a faraway town without emissions testing or arrive at the test site "hot"—having just come off hard driving on the freeway—which is a good way to make a dirty engine appear to be clean. Still others randomly pass the test when they shouldn't, because dirty engines are highly variable and sometimes burn cleanly for short durations. There is little evidence, Stedman says, that the city's regime of inspections makes any difference in air quality.
He proposes mobile testing instead. Twenty years ago, he invented a device the size of a suitcase that uses infrared light to instantly measure and then analyze the emissions of cars as they drive by on the highway. The Speer Avenue sign is attached to one of Stedman's devices. He says that cities should put half a dozen or so of his devices in vans, park them on freeway off-ramps around the city, and have a police car poised to pull over anyone who fails the test. A half-dozen vans could test thirty thousand cars a day. For the same twenty-five million dollars that Denver's motorists now spend on on-site testing, Stedman estimates, the city could identify and fix twenty-five thousand truly dirty vehicles every year, and within a few years cut automobile emissions in the Denver metropolitan area by somewhere between thirty-five and forty per cent. The city could stop managing its smog problem and start ending it.
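The cost-effectiveness claim reduces to simple division; here is a minimal sketch of that arithmetic, using only the figures Stedman cites:

```python
# A minimal sketch of the arithmetic behind Stedman's proposal, using the figures above.
current_spend = 25_000_000        # what Denver motorists now pay for annual on-site testing
dirty_fixed_per_year = 25_000     # vehicles Stedman estimates could be found and repaired
vans_throughput_per_day = 30_000  # vehicle passes a half-dozen roadside vans could screen

cost_per_repair = current_spend / dirty_fixed_per_year
print(f"Cost per dirty vehicle found and fixed: ${cost_per_repair:,.0f}")       # $1,000
print(f"Vehicle passes screened per year: {vans_throughput_per_day * 365:,}")   # 10,950,000
```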
Why don't we all adopt the Stedman method? There's no moral impediment here. We're used to the police pulling people over for having a blown headlight or a broken side mirror, and it wouldn't be difficult to have them add pollution-control devices to their list. Yet it does run counter to an instinctive social preference for thinking of pollution as a problem to which we all contribute equally. We have developed institutions that move reassuringly quickly and forcefully on collective problems. Congress passes a law. The Environmental Protection Agency promulgates a regulation. The auto industry makes its cars a little cleaner, and—presto—the air gets better. But Stedman doesn't much care about what happens in Washington and Detroit. The challenge of controlling air pollution isn't so much about the laws as it is about compliance with them. It's a policing problem, rather than a policy problem, and there is something ultimately unsatisfying about his proposed solution. He wants to end air pollution in Denver with a half-dozen vans outfitted with a contraption about the size of a suitcase. Can such a big problem have such a small-bore solution?
That's what made the findings of the Christopher Commission so unsatisfying. We put together blue-ribbon panels when we're faced with problems that seem too large for the normal mechanisms of bureaucratic repair. We want sweeping reforms. But what was the commission's most memorable observation? It was the story of an officer with a known history of doing things like beating up handcuffed suspects who nonetheless received a performance review from his superior stating that he "usually conducts himself in a manner that inspires respect for the law and instills public confidence." This is what you say about an officer when you haven't actually read his file, and the implication of the Christopher Commission's report was that the L.A.P.D. might help solve its problem simply by getting its police captains to read the files of their officers. The L.A.P.D.'s problem was a matter not of policy but of compliance. The department needed to adhere to the rules it already had in place, and that's not what a public hungry for institutional transformation wants to hear. Solving problems that have power-law distributions doesn't just violate our moral intuitions; it violates our political intuitions as well. It's hard not to conclude, in the end, that the reason we treated the homeless as one hopeless undifferentiated group for so long is not simply that we didn't know better. It's that we didn't want to know better. It was easier the old way.
Power-law solutions have little appeal to the right, because they involve special treatment for people who do not deserve special treatment; and they have little appeal to the left, because their emphasis on efficiency over fairness suggests the cold number-crunching of Chicago-school cost-benefit analysis. Even the promise of millions of dollars in savings or cleaner air or better police departments cannot entirely compensate for such discomfort. In Denver, John Hickenlooper, the city's enormously popular mayor, has worked on the homelessness issue tirelessly during the past couple of years. He spent more time on the subject in his annual State of the City address this past summer than on any other topic. He gave the speech, with deliberate symbolism, in the city's downtown Civic Center Park, where homeless people gather every day with their shopping carts and garbage bags. He has gone on local talk radio on many occasions to discuss what the city is doing about the issue. He has commissioned studies to show what a drain on the city's resources the homeless population has become. But, he says, "there are still people who stop me going into the supermarket and say, 'I can't believe you're going to help those homeless people, those bums.'"
5.
Early one morning a year ago, Marla Johns got a call from her husband, Steve. He was at work. "He called and woke me up," Johns remembers. "He was choked up and crying on the phone. And I thought that something had happened with another police officer. I said, 'Oh, my gosh, what happened?' He said, 'Murray died last night.' " He died of intestinal bleeding. At the police department that morning, some of the officers gave Murray a moment of silence.
"There are not many days that go by that I don't have a thought of him," she went on. "Christmas comes— and I used to buy him a Christmas present. Make sure he had warm gloves and a blanket and a coat. There was this mutual respect. There was a time when another intoxicated patient jumped off the gurney and was coming at me, and Murray jumped off his gurney and shook his fist and said, 'Don't you touch my angel.' You know, when he was monitored by the system he did fabulously. He would be on house arrest and he would get a job and he would save money and go to work every day, and he wouldn't drink. He would do all the things he was supposed to do. There are some people who can be very successful members of society if someone monitors them. Murray needed someone to be in charge of him."
But, of course, Reno didn't have a place where Murray could be given the structure he needed. Someone must have decided that it cost too much.
"I told my husband that I would claim his body if no one else did," she said. "I would not have him in an unmarked grave."
Here's Why
April 10, 2006
The Critics: Books
A sociologist offers an anatomy of explanations.
1.
Little Timothy is playing with his older brother Geoffrey, when he comes running to his mother.
"Mommy, Mommy," he starts in. "I was playing with my truck, and then Geoffrey came and he said it was his turn to play with the truck even though it's my truck and then he pushed me."
"Timothy!" his mother says, silencing him. "Don't be a tattletale."
Timothy has heard that phrase—"Don't be a tattletale"—countless times, and it always stops him short. He has offered his mother an eyewitness account of a crime. His mother, furthermore, in no way disputes the truth of his story. Yet what does she do? She rejects it in favor of a simplistic social formula: Don't be a tattletale. It makes no sense. Timothy's mother would never use such a formula to trump a story if she were talking to his father. On the contrary, his mother and father tattle to each other about Geoffrey all the time. And, if Timothy were to tattle on Geoffrey to his best friend, Bruce, Bruce wouldn't reject the story in favor of a formula, either. Narratives are the basis of Timothy's friendship with Bruce. They explain not just effects but causes. They matter—except in this instance, of a story told by Timothy to Mommy about Geoffrey, in which Mommy is suddenly indifferent to stories altogether. What is this don't-be-a-tattletale business about?
In "Why?" (Princeton; $24.95), the Columbia University scholar Charles Tilly sets out to make sense of our reasons for giving reasons. In the tradition of the legendary sociologist Erving Goffman, Tilly seeks to decode the structure of everyday social interaction, and the result is a book that forces readers to reëxamine everything from the way they talk to their children to the way they argue about politics.
In Tilly's view, we rely on four general categories of reasons. The first is what he calls conventions—conventionally accepted explanations. Tilly would call "Don't be a tattletale" a convention. The second is stories, and what distinguishes a story ("I was playing with my truck, and then Geoffrey came in . . .") is a very specific account of cause and effect. Tilly cites the sociologist Francesca Polletta's interviews with people who were active in the civil-rights sit-ins of the nineteen-sixties. Polletta repeatedly heard stories that stressed the spontaneity of the protests, leaving out the role of civil-rights organizations, teachers, and churches. That's what stories do. As Tilly writes, they circumscribe time and space, limit the number of actors and actions, situate all causes "in the consciousness of the actors," and elevate the personal over the institutional.
Then there are codes, which are high-level conventions, formulas that invoke sometimes recondite procedural rules and categories. If a loan officer turns you down for a mortgage, the reason he gives has to do with your inability to conform to a prescribed standard of creditworthiness. Finally, there are technical accounts: stories informed by specialized knowledge and authority. An academic history of civil-rights sit-ins wouldn't leave out the role of institutions, and it probably wouldn't focus on a few actors and actions; it would aim at giving patient and expert attention to every sort of nuance and detail.
Tilly argues that we make two common errors when it comes to understanding reasons. The first is to assume that some kinds of reasons are always better than others—that there is a hierarchy of reasons, with conventions (the least sophisticated) at the bottom and technical accounts at the top. That's wrong, Tilly says: each type of reason has its own role.
Tilly's second point flows from the first, and it's that the reasons people give aren't a function of their character—that is, there aren't people who always favor technical accounts and people who always favor stories. Rather, reasons arise out of situations and roles. Imagine, he says, the following possible responses to one person's knocking some books off the desk of another:
1. Sorry, buddy. I'm just plain awkward.
2. I'm sorry. I didn't see your book.
3. Nuts! I did it again.
4. Why did you put that book there?
5. I told you to stack up your books neatly.
The lesson is not that the kind of person who uses reason No. 1 or No. 2 is polite and the kind of person who uses reason No. 4 or No. 5 is a jerk. The point is that any of us might use any of those five reasons depending on our relation to the person whose books we knocked over. Reason-giving, Tilly says, reflects, establishes, repairs, and negotiates relationships. The husband who uses a story to explain his unhappiness to his wife—"Ever since I got my new job, I feel like I've just been so busy that I haven't had time for us"—is attempting to salvage the relationship. But when he wants out of the marriage, he'll say, "It's not you—it's me." He switches to a convention. As his wife realizes, it's not the content of what he has said that matters. It's his shift from the kind of reason-giving that signals commitment to the kind that signals disengagement. Marriages thrive on stories. They die on conventions.
Consider the orgy of reason-giving that followed Vice-President Dick Cheney's quail-hunting accident involving his friend Harry Whittington. Allies of the Vice-President insisted that the media were making way too much of it. "Accidents happen," they said, relying on a convention. Cheney, in a subsequent interview, looked penitently into the camera and said, "The image of him falling is something I'll never be able to get out of my mind. I fired, and there's Harry falling. And it was, I'd have to say, one of the worst days of my life." Cheney told a story. Some of Cheney's critics, meanwhile, focussed on whether he conformed to legal and ethical standards. Did he have a valid license? Was he too slow to notify the White House? They were interested in codes. Then came the response of hunting experts. They retold the narrative of Cheney's accident, using their specialized knowledge of hunting procedure. The Cheney party had three guns, and on a quail shoot, some of them said, you should never have more than two. Why did Whittington retrieve the downed bird? A dog should have done that. Had Cheney's shotgun been aimed more than thirty degrees from the ground, as it should have been? And what were they doing in the bush at five-thirty in the afternoon, when the light isn't nearly good enough for safe hunting? The experts gave a technical account.
Here are four kinds of reasons, all relational in nature. If you like Cheney and are eager to relieve him of responsibility, you want the disengagement offered by a convention. For a beleaguered P.R. agent, the first line of defense in any burgeoning scandal is, inevitably, There is no story here. When, in Cheney's case, this failed, the Vice-President had to convey his concern and regret while not admitting that he had done anything procedurally wrong. Only a story can accomplish that. Anything else—to shrug and say that accidents happen, for instance—would have been perceived as unpardonably callous. Cheney's critics, for their part, wanted the finality and precision of a code: he acted improperly. And hunting experts wanted to display their authority and educate the public about how to hunt safely, so they retold the story of Cheney's accident with the benefit of their specialized knowledge.
Effective reason-giving, then, involves matching the kind of reason we give to the particular role that we happen to be playing at the time a reason is necessary. The fact that Timothy's mother accepts tattling from his father but rejects it from Timothy is not evidence of capriciousness; it just means that a husband's relationship to his wife gives him access to a reason-giving category that a son's role does not. The lesson "Don't be a tattletale"—which may well be one of the hardest childhood lessons to learn—is that in the adult world it is sometimes more important to be appropriate than it is to be truthful.
2.
Two years ago, a young man named Anthony mugged a woman named Anne on a London street. Anthony was caught and convicted, and a few days before he was sentenced he sat down with Anne for a face-to-face meeting, as an exercise in what is known as "restorative justice." The meeting was videotaped by a criminal-justice research group, and to watch the video is to get an even deeper sense of the usefulness of Tilly's thinking.
"We're going to talk about what's happened," the policeman moderating the meeting begins. "Who's been affected, and how they've been affected, and see what we can do to make things better."
Anthony starts. He has a shaved head, a tattoo on his neck, and multiple piercings in his eyebrows and ears. Beside him is his partner, Christy, holding their baby boy. "What happened is I had a bad week. Been out of work for a couple of weeks. Had my kneecap broken. . . . I only had my dad in this country, who I don't get on with. We had no gas in our flat. Me and Christy were arguing all that morning. The baby had been screaming. We were hungry." His story comes out painfully and haltingly. "It was a bit too much. All my friends I was asking to loan me a couple of pounds. They just couldn't afford to give it to me. . . . I don't know what got into me. I just reached over and took your bag. And I'm really sorry for it. And if there is anything I can do to make up for it, I'm willing to do it. I know you probably don't want me anywhere near you."
Anne has been listening closely, her husband, Terry, next to her. Now she tells her side of the story. She heard a sound like male laughter. She turned, and felt her purse being pulled away. She saw a man pulling up his hood. She ran after him, feeling like a "complete idiot." In the struggle over her bag, her arm was injured. She is a journalist and has since had difficulty typing. "The mugging was very small," she says. "But the effect is not going away as fast as I expected. . . . It makes life one notch less bearable."
It was Christy's turn. She got the call at home. She didn't know exactly what had happened. She took the baby and walked to the police station, angry and frightened. "We got ourselves in a situation where we were relying on the state, and we just can't live off the money," Christy says. "And that's not your problem." She starts to cry. "He's not a drug addict," she continues, looking at her husband. Anthony takes the baby from her and holds him. "If we go to court on Monday, and he does get three years for what he's done, or six years, that's his problem. He done it. And he's got to pay for what he's done. I wake up and hear him cry"—she looks at the baby—"and it kills me. I'm in a situation where I can't do anything to make this better. . . . I just want you to know. The first thing he said to me when he walked in was 'I apologized.' And I said, 'That makes what difference?' "
Watching the conference is a strange experience, because it is utterly foreign to the criminal process of which it is ostensibly a part. There is none of the oppressive legalese of the courtroom. Nothing is "alleged"; there are no "perpetrators." The formal back-and-forth between questioner and answerer, the emotionally protective structure of courtroom procedure, is absent. Anne and Terry sit on comfortable chairs facing Christy and Anthony. They have a conversation, not a confrontation. They are telling stories, in Tilly's sense of that word: repairing their relationship by crafting a cause-and-effect account of what happened on the street.
3.
Why is such storytelling, in the wake of a crime, so important? Because, Tilly would argue, some social situations don't lend themselves to the easy reconciliation of reason and role. In Jonathan Franzen's novel "The Corrections," for example, one of the characters, Gary, is in the midst of a frosty conversation with his wife, Caroline. Gary had the sense, Franzen writes, "that Caroline was on the verge of accusing him of being 'depressed,' and he was afraid that if the idea that he was depressed gained currency, he would forfeit his right to his opinions. . . . Every word he spoke would become a symptom of disease; he would never again win an argument." Gary was afraid, in other words, that a technical account of his behavior—the explanation that he was clinically depressed—would trump his efforts to use the stories and conventions that permitted him to be human. But what was his wife to do? She wanted him to change.
When we say that two parties in a conflict are "talking past each other," this is what we mean: that both sides have a legitimate attachment to mutually exclusive reasons. Proponents of abortion often rely on a convention (choice) and a technical account (concerning the viability of a fetus in the first trimester). Opponents of abortion turn the fate of each individual fetus into a story: a life created and then abruptly terminated. Is it any surprise that the issue has proved to be so intractable? If you believe that stories are the most appropriate form of reason-giving, then those who use conventions and technical accounts will seem morally indifferent—regardless of whether you agree with them. And, if you believe that a problem is best adjudicated through conventions or technical accounts, it is hard not to look upon storytellers as sensationalistic and intellectually unserious. By Tilly's logic, abortion proponents who want to engage their critics will have to become better storytellers—and that, according to the relational principles of such reason-giving, may require them to acknowledge an emotional connection between a mother and a fetus. (Ironically, many of the same members of the religious right who have so emphatically demonstrated the emotional superiority of stories when it comes to abortion insist, when it comes to Genesis, on a reading of the Bible as a technical account. Thus do creationists, in the service of reason-giving exigency, force the Holy Scripture to do double duty as a high-school biology textbook.)
Tilly argues that these conflicts are endemic to the legal system. Laws are established in opposition to stories. In a criminal trial, we take a complicated narrative of cause and effect and match it to a simple, impersonal code: first-degree murder, or second-degree murder, or manslaughter. The impersonality of codes is what makes the law fair. But it is also what can make the legal system so painful for victims, who find no room for their voices and their anger and their experiences. Codes punish, but they cannot heal.
So what do you do? You put Anne and her husband in a room with Anthony and Christy and their baby boy and you let them talk. In a series of such experiments, conducted in Britain and Australia by the criminologists Lawrence Sherman and Heather Strang, restorative-justice programs have shown encouraging results in reducing recidivism rates among offenders and psychological trauma among victims. If you view the tape of the Anthony-Anne exchange, it's not hard to see why. Sherman said that when the Lord Chief Justice of England and Wales watched it at home one night he wept.
"If there is anything I can do, please say it," Anthony says.
"I think most of what you can do is between the two of you, actually," Anne says to Anthony and Christy. "I think if you can put your lives back together again, then that's what needs to be done."
The moderator tells them all to take a break and help themselves to "Metropolitan Police tea and coffee and chocolate biscuits."
Anne asks Christy how old the baby is, and where they are living. It turns out that their apartment has been condemned. Terry stands up and offers the baby a chocolate biscuit, and the adults experience the kind of moment that adults have in the company of babies, where nothing matters except the child in front of them.
"He's a good baby," Christy says. A convention. One kind of reason is never really enough.
Game Theory
May 29, 2006
Books
When it comes to athletic prowess, don't believe your eyes.
1.
The first player picked in the 1996 National Basketball Association draft was a slender, six-foot guard from Georgetown University named Allen Iverson. Iverson was thrilling. He was lightning quick, and could stop and start on a dime. He would charge toward the basket, twist and turn and writhe through the arms and legs of much taller and heavier men, and somehow find a way to score. In his first season with the Philadelphia 76ers, Iverson was voted the N.B.A.'s Rookie of the Year. In every year since 2000, he has been named to the N.B.A.'s All-Star team. In the 2000-01 season, he finished first in the league in scoring and steals, led his team to the second-best record in the league, and was named, by the country's sportswriters and broadcasters, basketball's Most Valuable Player. He is currently in the midst of a four-year, seventy-seven-million-dollar contract. Almost everyone who knows basketball and who watches Iverson play thinks that he's one of the best players in the game.
But how do we know that we're watching a great player? That's an easier question to answer when it comes to, say, golf or tennis, where players compete against one another, under similar circumstances, week after week. Nobody would dispute that Roger Federer is the world's best tennis player. Baseball is a little more complicated, since it's a team sport. Still, because the game consists of a sequence of discrete, ritualized encounters between pitcher and hitter, it lends itself to statistical rankings and analysis. Most tasks that professionals perform, though, are surprisingly hard to evaluate. Suppose that we wanted to measure something in the real world, like the relative skill of New York City's heart surgeons. One obvious way would be to compare the mortality rates of the patients on whom they operate—except that substandard care isn't necessarily fatal, so a more accurate measure might be how quickly patients get better or how few complications they have after surgery. But recovery time is a function as well of how a patient is treated in the intensive-care unit, which reflects the capabilities not just of the doctor but of the nurses in the I.C.U. So now we have to adjust for nurse quality in our assessment of surgeon quality. We'd also better adjust for how sick the patients were in the first place, and since well-regarded surgeons often treat the most difficult cases, the best surgeons might well have the poorest patient recovery rates. In order to measure something you thought was fairly straightforward, you really have to take into account a series of things that aren't so straightforward.
Basketball presents many of the same kinds of problems. The fact that Allen Iverson has been one of the league's most prolific scorers over the past decade, for instance, could mean that he is a brilliant player. It could mean that he's selfish and takes shots rather than passing the ball to his teammates. It could mean that he plays for a team that races up and down the court and plays so quickly that he has the opportunity to take many more shots than he would on a team that plays more deliberately. Or he might be the equivalent of an average surgeon with a first-rate I.C.U.: maybe his success reflects the fact that everyone else on his team excels at getting rebounds and forcing the other team to turn over the ball. Nor does the number of points that Iverson scores tell us anything about his tendency to do other things that contribute to winning and losing games; it doesn't tell us how often he makes a mistake and loses the ball to the other team, or commits a foul, or blocks a shot, or rebounds the ball. Figuring whether one basketball player is better than another is a challenge similar to figuring out whether one heart surgeon is better than another: you have to find a way to interpret someone's individual statistics in the context of the team that they're on and the task that they are performing.
In "The Wages of Wins" (Stanford; $29.95), the economists David J. Berri, Martin B. Schmidt, and Stacey L. Brook set out to solve the Iverson problem. Weighing the relative value of fouls, rebounds, shots taken, turnovers, and the like, they've created an algorithm that, they argue, comes closer than any previous statistical measure to capturing the true value of a basketball player. The algorithm yields what they call a Win Score, because it expresses a player's worth as the number of wins that his contributions bring to his team. According to their analysis, Iverson's finest season was in 2004-05, when he was worth ten wins, which made him the thirty-sixth-best player in the league. In the season in which he won the Most Valuable Player award, he was the ninety-first-best player in the league. In his worst season (2003-04), he was the two-hundred-and-twenty-seventh-best player in the league. On average, for his career, he has ranked a hundred and sixteenth. In some years, Iverson has not even been the best player on his own team. Looking at the findings that Berri, Schmidt, and Brook present is enough to make one wonder what exactly basketball experts—coaches, managers, sportswriters—know about basketball.
2.
Basketball experts clearly appreciate basketball. They understand the gestalt of the game, in the way that someone who has spent a lifetime thinking about and watching, say, modern dance develops an understanding of that art form. They're able to teach and coach and motivate; to make judgments and predictions about a player's character and resolve and stage of development. But the argument of "The Wages of Wins" is that this kind of expertise has real limitations when it comes to making precise evaluations of individual performance, whether you're interested in the consistency of football quarterbacks or in testing claims that N.B.A. stars "turn it on" during playoffs. The baseball legend Ty Cobb, the authors point out, had a lifetime batting average of .366, almost thirty points higher than the former San Diego Padres outfielder Tony Gwynn, who had a lifetime batting average of .338:
So Cobb hit safely 37 percent of the time while Gwynn hit safely on 34 percent of his at bats. If all you did was watch these players, could you say who was a better hitter? Can one really tell the difference between 37 percent and 34 percent just staring at the players play? To see the problem with the non-numbers approach to player evaluation, consider that out of every 100 at bats, Cobb got three more hits than Gwynn. That's it, three hits.
Michael Lewis made a similar argument in his 2003 best-seller, "Moneyball," about how the so-called sabermetricians have changed the evaluation of talent in baseball. Baseball is sufficiently transparent, though, that the size of the discrepancies between intuitive and statistically aided judgment tends to be relatively modest. If you mistakenly thought that Gwynn was better than Cobb, you were still backing a terrific hitter. But "The Wages of Wins" suggests that when you move into more complex situations, like basketball, the limitations of "seeing" become enormous. Jermaine O'Neal, a center for the Indiana Pacers, finished third in the Most Valuable Player voting in 2004. His Win Score that year put him forty-fourth in the league. In 2004-05, the forward Antoine Walker made as much money as the point guard Jason Kidd, even though Walker produced 0.6 wins for Atlanta and Boston and Kidd produced nearly twenty wins for New Jersey. The Win Score algorithm suggests that Ray Allen has had nearly as good a career as Kobe Bryant, whom many consider the top player in the game, and that the journeyman forward Jerome Williams was actually among the strongest players of his generation.
Most egregious is the story of a young guard for the Chicago Bulls named Ben Gordon. Last season, Gordon finished second in the Rookie of the Year voting and was named the league's top "sixth man"—that is, the best non-starter—because he averaged an impressive 15.1 points per game in limited playing time. But Gordon rebounds less than he should, turns over the ball frequently, and makes such a low percentage of his shots that, of the league's top thirty-three scorers—that is, players who score at least one point for every two minutes on the floor—Gordon's Win Score ranked him dead last.
The problem for basketball experts is that, in a situation with many variables, it's difficult to know how much weight to assign to each variable. Buying a house is agonizing because we look at the size, the location, the back yard, the proximity to local schools, the price, and so on, and we're unsure which of those things matters most. Assessing heart-attack risk is a notoriously difficult task for similar reasons. A doctor can analyze a dozen different factors. But how much weight should be given to a patient's cholesterol level relative to his blood pressure? In the face of such complexity, people construct their own arbitrary algorithms—they assume that every factor is of equal importance, or randomly elevate one or two factors for the sake of simplifying matters—and we make mistakes because those arbitrary algorithms are, well, arbitrary.
Berri, Schmidt, and Brook argue that the arbitrary algorithms of basketball experts elevate the number of points a player scores above all other considerations. In one clever piece of research, they analyze the relationship between the statistics of rookies and the number of votes they receive in the All-Rookie Team balloting. If a rookie increases his scoring by ten per cent—regardless of how efficiently he scores those points—the number of votes he'll get will increase by twenty-three per cent. If he increases his rebounds by ten per cent, the number of votes he'll get will increase by six per cent. Every other factor, like turnovers, steals, assists, blocked shots, and personal fouls—factors that can have a significant influence on the outcome of a game—seemed to bear no statistical relationship to judgments of merit at all. It's not even the case that high scorers help their team by drawing more fans. As the authors point out, that's only true on the road. At home, attendance is primarily a function of games won. Basketball's decision-makers, it seems, are simply irrational.
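The voting result can be read as an elasticity; here is a minimal sketch that simply restates the two figures reported above (the authors' underlying regression is not reproduced here):

```python
# A minimal sketch restating the reported relationship between rookie stats and
# All-Rookie votes; the elasticities are the article's figures, not a re-estimation.
vote_elasticity = {"scoring": 2.3, "rebounds": 0.6}  # 10% stat increase -> 23% / 6% more votes

def projected_vote_change(stat, pct_increase):
    """Approximate percent change in All-Rookie votes for a given percent stat increase."""
    return vote_elasticity[stat] * pct_increase

print(projected_vote_change("scoring", 10))   # 23.0
print(projected_vote_change("rebounds", 10))  # 6.0
```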
It's hard not to wonder, after reading "The Wages of Wins," about the other instances in which we defer to the evaluations of experts. Boards of directors vote to pay C.E.O.s tens of millions of dollars, ostensibly because they believe—on the basis of what they have learned over the years by watching other C.E.O.s—that they are worth it. But so what? We see Allen Iverson, over and over again, charge toward the basket, twisting and turning and writhing through a thicket of arms and legs of much taller and heavier men—and all we learn is to appreciate twisting and turning and writhing. We become dance critics, blind to Iverson's dismal shooting percentage and his excessive turnovers, blind to the reality that the Philadelphia 76ers would be better off without him. "One can play basketball," the authors conclude. "One can watch basketball. One can both play and watch basketball for a thousand years. If you do not systematically track what the players do, and then uncover the statistical relationship between these actions and wins, you will never know why teams win and why they lose."
No Mercy
September 4, 2006
Comment
In 1925, a young American physicist was doing graduate work at Cambridge University, in England. He was depressed. He was fighting with his mother and had just broken up with his girlfriend. His strength was in theoretical physics, but he was being forced to sit in a laboratory making thin films of beryllium. In the fall of that year, he dosed an apple with noxious chemicals from the lab and put it on the desk of his tutor, Patrick Blackett. Blackett, luckily, didn't eat the apple. But school officials found out what happened, and arrived at a punishment: the student was to be put on probation and ordered to go to London for regular sessions with a psychiatrist.
Probation? These days, we routinely suspend or expel high-school students for doing infinitely less harmful things, like fighting or drinking or taking drugs—that is, for doing the kinds of things that teen-agers do. This past summer, Rhett Bomar, the starting quarterback for the University of Oklahoma Sooners, was cut from the team when he was found to have been "overpaid" (receiving wages for more hours than he worked, with the apparent complicity of his boss) at his job at a car dealership. Even in Oklahoma, people seemed to think that kicking someone off a football team for having cut a few corners on his job made perfect sense. This is the age of zero tolerance. Rules are rules. Students have to be held accountable for their actions. Institutions must signal their expectations firmly and unambiguously: every school principal and every college president, these days, reads from exactly the same script. What, then, of a student who gives his teacher a poisoned apple? Surely he ought to be expelled from school and sent before a judge.
Suppose you cared about the student, though, and had some idea of his situation and his potential. Would you feel the same way? You might. Trying to poison your tutor is no small infraction. Then again, you might decide, as the dons at Cambridge clearly did, that what had happened called for a measure of leniency. They knew that the student had never done anything like this before, and that he wasn't well. And they knew that to file charges would almost certainly ruin his career. Cambridge wasn't sure that the benefits of enforcing the law, in this case, were greater than the benefits of allowing the offender an unimpeded future.
Schools, historically, have been home to this kind of discretionary justice. You let the principal or the teacher decide what to do about cheating because you know that every case of cheating is different—and, more to the point, that every cheater is different. Jimmy is incorrigible, and needs the shock of expulsion. But Bobby just needs a talking to, because he's a decent kid, and Mary and Jane cheated because the teacher foolishly stepped out of the classroom in the middle of the test, and the temptation was simply too much. A Tennessee study found that after zero-tolerance programs were adopted by the state's public schools the frequency of targeted offenses soared: the firm and unambiguous punishments weren't deterring bad behavior at all. Is that really a surprise? If you're a teen-ager, the announcement that an act will be sternly punished doesn't always sink in, and it isn't always obvious when you're doing the thing you aren't supposed to be doing. Why? Because you're a teen-ager.
Somewhere along the way—perhaps in response to Columbine—we forgot the value of discretion in disciplining the young. "Ultimately, they have to make right decisions," the Oklahoma football coach, Bob Stoops, said of his players, after jettisoning his quarterback. "When they do not, the consequences are serious." Open and shut: he sounded as if he were talking about a senior executive of Enron, rather than a college sophomore whose primary obligation at Oklahoma was to throw a football in the direction of young men in helmets. You might think that if the University of Oklahoma was so touchy about its quarterback being "overpaid" it ought to have kept closer track of his work habits with an on-campus job. But making a fetish of personal accountability conveniently removes the need for institutional accountability. (We court-martial the grunts who abuse prisoners, not the commanding officers who let the abuse happen.) To acknowledge that the causes of our actions are complex and muddy seems permissive, and permissiveness is the hallmark of an ideology now firmly in disgrace. That conservative patron saint Whittaker Chambers once defined liberalism as Christ without the Crucifixion. But punishment without the possibility of redemption is worse: it is the Crucifixion without Christ.
As for the student whose career Cambridge saved? He left at the end of the academic year and went to study at the University of Göttingen, where he made important contributions to quantum theory. Later, after a brilliant academic career, he was entrusted with leading one of the most critical and morally charged projects in the history of science. His name was Robert Oppenheimer.
The Formula
October 10, 2006
Annals of Entertainment
What if you built a machine to predict hit movies?
1.
One sunny afternoon not long ago, Dick Copaken sat in a booth at Daniel, one of those hushed, exclusive restaurants on Manhattan's Upper East Side where the waiters glide spectrally from table to table. He was wearing a starched button-down shirt and a blue blazer. Every strand of his thinning hair was in place, and he spoke calmly and slowly, his large pink Charlie Brown head bobbing along evenly as he did. Copaken spent many years as a partner at the white-shoe Washington, D.C., firm Covington & Burling, and he has a lawyer's gravitas. One of his best friends calls him, admiringly, "relentless." He likes to tell stories. Yet he is not, strictly, a storyteller, because storytellers are people who know when to leave things out, and Copaken never leaves anything out: each detail is adduced, considered, and laid on the table—and then adjusted and readjusted so that the corners of the new fact are flush with the corners of the fact that preceded it. This is especially true when Copaken is talking about things that he really cares about, such as questions of international law or his grandchildren or, most of all, the movies.
Dick Copaken loves the movies. His friend Richard Light, a statistician at Harvard, remembers summer vacations on Cape Cod with the Copakens, when Copaken would take his children and the Light children to the movies every day. "Fourteen nights out of fourteen," Light said. "Dick would say at seven o'clock, 'Hey, who's up for the movies?' And, all by himself, he would take the six kids to the movies. The kids had the time of their lives. And Dick would come back and give, with a completely straight face, a rigorous analysis of how each movie was put together, and the direction and the special effects and the animation." This is a man who has seen two or three movies a week for the past fifty years, who has filed hundreds of plots and characters and scenes away in his mind, and at Daniel he was talking about a movie that touched him as much as any he'd ever seen.
"Nobody's heard of it," he said, and he clearly regarded this fact as a minor tragedy. "It's called 'Dear Frankie.' I watched it on a Virgin Atlantic flight because it was the only movie they had that I hadn't already seen. I had very low expectations. But I was blown away." He began, in his lawyer-like manner, to lay out the plot. It takes place in Scotland. A woman has fled an abusive relationship with her infant son and is living in a port town. The boy, now nine, is deaf, and misses the father he has never known. His mother has told him that his father is a sailor on a ship that rarely comes to shore, and has suggested that he write his father letters. These she intercepts, and replies to, writing as if she were the father. One day, the boy finds out that what he thinks is his father's ship is coming to shore. The mother has to find a man to stand in for the father. She does. The two fall in love. Unexpectedly, the real father reëmerges. He's dying, and demands to see his son. The mother panics. Then the little boy reveals his secret: he knew about his mother's ruse all along.
"I was in tears over this movie," Copaken said. "You know, sometimes when you see a movie in the air you're in such an out-of-body mood that things get exaggerated. So when I got home I sat down and saw it another time. I was bawling again, even though I knew what was coming." Copaken shook his head, and then looked away. His cheeks were flushed. His voice was suddenly thick. There he was, a buttoned-down corporate lawyer, in a hushed restaurant where there is practically a sign on the wall forbidding displays of human emotion—and he was crying, a third time. "That absolutely hits me," he said, his face still turned away. "He knew all along what the mother was doing." He stopped to collect himself. "I can't even retell the damn story without getting emotional."
He tried to explain why he was crying. There was the little boy, first of all. He was just about the same age as Copaken's grandson Jacob. So maybe that was part of it. Perhaps, as well, he was reacting to the idea of an absent parent. His own parents, Albert and Silvia, ran a modest community-law practice in Kansas City, and would shut down their office whenever Copaken or his brother had any kind of school activity or performance. In the Copaken world, it was an iron law that parents had to be present. He told a story about representing the Marshall Islands in negotiations with the U.S. government during the Cold War. A missile-testing range on the island was considered to be strategically critical. The case was enormously complex—involving something like fifty federal agencies and five countries—and, just as the negotiations were scheduled to begin, Copaken learned of a conflict: his eldest daughter was performing the lead role in a sixth-grade production of "The Wiz." "I made an instant decision," Copaken said. He told the President of the Marshall Islands that his daughter had to come first. Half an hour passed. "I get a frantic call from the State Department, very high levels: 'Dick, I got a call from the President of the Marshall Islands. What's going on?' I told him. He said, 'Dick, are you putting in jeopardy the national security of the United States for a sixth-grade production?' " In the end, the negotiations were suspended while Copaken flew home from Hawaii. "The point is," Copaken said, "that absence at crucial moments has been a worry to me, and maybe this movie just grabbed at that issue."
He stopped, seemingly dissatisfied. Was that really why he'd cried? Hollywood is awash in stories of bad fathers and abandoned children, and Copaken doesn't cry in fancy restaurants every time he thinks of one of them. When he tried to remember the last time he cried at the movies, he was stumped. So he must have been responding to something else, too—some detail, some unconscious emotional trigger in the combination of the mother and the boy and the Scottish seaside town and the ship and the hired surrogate and the dying father. To say that he cried at "Dear Frankie" because of that lonely fatherless boy was as inadequate as saying that people cried at the death of Princess Diana because she was a beautiful princess. Surely it mattered as well that she was killed in the company of her lover, a man distrusted by the Royal Family. Wasn't this "Romeo and Juliet"? And surely it mattered that she died in a tunnel, and that the tunnel was in Paris, and that she was chased by motorbikes, and that she was blond and her lover was dark—because each one of those additional narrative details has complicated emotional associations, and it is the subtle combination of all these associations that makes us laugh or choke up when we remember a certain movie, every single time, even when we're sitting in a fancy restaurant.
Of course, the optimal combination of all those elements is a mystery. That's why it's so hard to make a really memorable movie, and why we reward so richly the few people who can. But suppose you really, really loved the movies, and suppose you were a relentless type, and suppose you used all of the skills you'd learned during the course of your career at the highest rungs of the law to put together an international team of story experts. Do you think you could figure it out?
2.
The most famous dictum about Hollywood belongs to the screenwriter William Goldman. "Nobody knows anything," Goldman wrote in "Adventures in the Screen Trade" a couple of decades ago. "Not one person in the entire motion picture field knows for a certainty what's going to work. Every time out it's a guess." One of the highest-grossing movies in history, "Raiders of the Lost Ark," was offered to every studio in Hollywood, Goldman writes, and every one of them turned it down except Paramount: "Why did Paramount say yes? Because nobody knows anything. And why did all the other studios say no? Because nobody knows anything. And why did Universal, the mightiest studio of all, pass on Star Wars? . . . Because nobody, nobody—not now, not ever—knows the least goddamn thing about what is or isn't going to work at the box office."
What Goldman was saying was a version of something that has long been argued about art: that there is no way of getting beyond one's own impressions to arrive at some larger, objective truth. There are no rules to art, only the infinite variety of subjective experience. "Beauty is no quality in things themselves," the eighteenth-century Scottish philosopher David Hume wrote. "It exists merely in the mind which contemplates them; and each mind perceives a different beauty." Hume might as well have said that nobody knows anything.
But Hume had a Scottish counterpart, Lord Kames, and Lord Kames was equally convinced that traits like beauty, sublimity, and grandeur were indeed reducible to a rational system of rules and precepts. He devised principles of congruity, propriety, and perspicuity: an elevated subject, for instance, must be expressed in elevated language; sound and signification should be in concordance; a woman was most attractive when in distress; depicted misfortunes must never occur by chance. He genuinely thought that the superiority of Virgil's hexameters to Horace's could be demonstrated with Euclidean precision, and for every Hume, it seems, there has always been a Kames—someone arguing that if nobody knows anything it is only because nobody's looking hard enough.
In a small New York loft, just below Union Square, for example, there is a tech startup called Platinum Blue that consults for companies in the music business. Record executives have tended to be Humean: though they can tell you how they feel when they listen to a song, they don't believe anyone can know with confidence whether a song is going to be a hit, and, historically, fewer than twenty per cent of the songs picked as hits by music executives have fulfilled those expectations. Platinum Blue thinks it can do better. It has a proprietary computer program that uses "spectral deconvolution software" to measure the mathematical relationships among all of a song's structural components: melody, harmony, beat, tempo, rhythm, octave, pitch, chord progression, cadence, sonic brilliance, frequency, and so on. On the basis of that analysis, the firm believes it can predict whether a song is likely to become a hit with eighty-per-cent accuracy. Platinum Blue is staunchly Kamesian, and, if you have a field dominated by those who say there are no rules, it is almost inevitable that someone will come along and say that there are. The head of Platinum Blue is a man named Mike McCready, and the service he is providing for the music business is an exact model of what Dick Copaken would like to do for the movie business.
McCready is in his thirties, baldish and laconic, with rectangular hipster glasses. His offices are in a large, open room, with a row of windows looking east, across the rooftops of downtown Manhattan. In the middle of the room is a conference table, and one morning recently McCready sat down and opened his laptop to demonstrate the Platinum Blue technology. On his screen was a cluster of thousands of white dots, resembling a cloud. This was a "map" of the songs his group had run through its software: each dot represented a single song, and each song was positioned in the cloud according to its particular mathematical signature. "You could have one piano sonata by Beethoven at this end and another one here," McCready said, pointing at the opposite end, "as long as they have completely different chord progressions and completely different melodic structures."
McCready then hit a button on his computer, which had the effect of eliminating all the songs that had not made the Billboard Top 30 in the past five years. The screen went from an undifferentiated cloud to sixty discrete clusters. This is what the universe of hit songs from the past five years looks like structurally; hits come out of a small, predictable, and highly conserved set of mathematical patterns. "We take a new CD far in advance of its release date," McCready said. "We analyze all twelve tracks. Then we overlay them on top of the already existing hit clusters, and what we can tell a record company is which of those songs conform to the mathematical pattern of past hits. Now, that doesn't mean that they will be hits. But what we are saying is that, almost certainly, songs that fall outside these clusters will not be —regardless of how much they sound and feel like hit songs, and regardless of how positive your call-out research or focus-group research is." Four years ago, when McCready was working with a similar version of the program at a firm in Barcelona, he ran thirty just-released albums, chosen at random, through his system. One stood out. The computer said that nine of the fourteen songs on the album had clear hit potential—which was unheard of. Nobody in his group knew much about the artist or had even listened to the record before, but the numbers said the album was going to be big, and McCready and his crew were of the belief that numbers do not lie. "Right around that time, a local newspaper came by and asked us what we were doing," McCready said. "We explained the hit-prediction thing, and that we were really turned on to a record by this artist called Norah Jones." The record was "Come Away with Me." It went on to sell twenty million copies and win eight Grammy awards.
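Platinum Blue's software is proprietary, and the article only gestures at how it works, so the sketch below is just a guess at the general shape of the idea: reduce each song to a vector of structural measurements, cluster the vectors of past hits, and grade a new song by how close it sits to the nearest hit cluster. The feature counts, cluster counts, and scoring scale are invented for illustration.

```python
# A minimal sketch of the clustering idea McCready describes, not
# Platinum Blue's actual system. All numbers here are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Pretend feature vectors for past Top 30 hits: each row is one song,
# each column a structural measurement (tempo, pitch, chord movement, ...).
past_hits = rng.normal(size=(600, 12))

# Group the hits into a fixed number of clusters (the article mentions
# roughly sixty; a smaller number keeps this toy example readable).
kmeans = KMeans(n_clusters=10, n_init=10, random_state=0).fit(past_hits)

def hit_grade(song_features: np.ndarray) -> float:
    """Score a new song by its distance to the nearest hit-cluster
    center: the closer it sits, the higher the (invented) grade."""
    distances = np.linalg.norm(kmeans.cluster_centers_ - song_features, axis=1)
    return float(1000.0 / (1.0 + distances.min()))

new_song = rng.normal(size=12)
print(f"Hit grade for the new song: {hit_grade(new_song):.1f}")
```

On this reading, a song that "falls outside the clusters" is simply one whose feature vector is far from every center, whatever the call-out research says about how it sounds.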
3.
The strength of McCready's analysis is its precision. This past spring, for instance, he analyzed "Crazy," by Gnarls Barkley. The computer calculated, first of all, the song's Hit Grade—that is, how close it was to the center of any of those sixty hit clusters. Its Hit Grade was 755, on a scale where anything above 700 is exceptional. The computer also found that "Crazy" belonged to the same hit cluster as Dido's "Thank You," James Blunt's "You're Beautiful," and Ashanti's "Baby," as well as older hits like "Let Me Be There," by Olivia Newton-John, and "One Sweet Day," by Mariah Carey, so that listeners who liked any of those songs would probably like "Crazy," too. Finally, the computer gave "Crazy" a Periodicity Grade—which refers to the fact that, at any given time, only twelve to fifteen hit clusters are "active," because from month to month the particular mathematical patterns that excite music listeners will shift around. "Crazy" 's periodicity score was 658—which suggested a very good fit with current tastes. The data said, in other words, that "Crazy" was almost certainly going to be huge—and, sure enough, it was.
If "Crazy" hadn't scored so high, though, the Platinum Blue people would have given the song's producers broad suggestions for fixing it. McCready said, "We can tell a producer, 'These are the elements that seem to be pushing your song into the hit cluster. These are the variables that are pulling your song away from the hit cluster. The problem seems to be in your bass line.' And the producer will make a bunch of mixes, where they do something different with the bass lines—increase the decibel level, or muddy it up. Then they come back to us. And we say, 'Whatever you were doing with mix No. 3, do a little bit more of that and you'll be back inside the hit cluster.'"
McCready stressed that his system didn't take the art out of hit-making. Someone still had to figure out what to do with mix No. 3, and it was entirely possible that whatever needed to be done to put the song in the hit cluster wouldn't work, because it would make the song sound wrong—and in order to be a hit a song had to sound right. Still, for the first time you wouldn't be guessing about what needed to be done. You would know. And what you needed to know in order to fix the song was much simpler than anyone would have thought. McCready didn't care about who the artist was, or the cleverness of the lyrics. He didn't even have a way of feeding lyrics into his computer. He cared only about a song's underlying mathematical structure. "If you go back to the popular melodies written by Beethoven and Mozart three hundred years ago," he went on, "they conform to the same mathematical patterns that we are looking at today. What sounded like a beautiful melody to them sounds like a beautiful melody to us. What has changed is simply that we have come up with new styles and new instruments. Our brains are wired in a way—we assume—that keeps us coming back, again and again, to the same answers, the same pleasure centers." He had sales data and Top 30 lists and deconvolution software, and it seemed to him that if you put them together you had an objective way of measuring something like beauty. "We think we've figured out how the brain works regarding musical taste," McCready said.
It requires a very particular kind of person, of course, to see the world as a code waiting to be broken. Hume once called Kames "the most arrogant man in the world," and to take this side of the argument you have to be. Kames was also a brilliant lawyer, and no doubt that matters as well, because to be a good lawyer is to be invested with a reverence for rules. (Hume defied his family's efforts to make him a lawyer.) And to think like Kames you probably have to be an outsider. Kames was born Henry Home, to a farming family, and grew up in the sparsely populated cropping-and-fishing county of Berwickshire; he became Lord Kames late in life, after he was elevated to the bench. (Hume was born and reared in Edinburgh.) His early published work was about law and its history, but he soon wandered into morality, religion, anthropology, soil chemistry, plant nutrition, and the physical sciences, and once asked his friend Benjamin Franklin to explain the movement of smoke in chimneys. Those who believe in the power of broad patterns and rules, rather than the authority of individuals or institutions, are not intimidated by the boundaries and hierarchies of knowledge. They don't defer to the superior expertise of insiders; they set up shop in a small loft somewhere downtown and take on the whole music industry at once. The difference between Hume and Kames is, finally, a difference in kind, not degree. You're either a Kamesian or you're not. And if you were to create an archetypal Kamesian—to combine lawyerliness, outsiderness, and supreme self-confidence in one dapper, Charlie Brown-headed combination? You'd end up with Dick Copaken.
"I remember when I was a sophomore in high school and I went into the bathroom once to wash my hands," Copaken said. "I noticed the bubbles on the sink, and it fascinated me the way these bubbles would form and move around and float and reform, and I sat there totally transfixed. My father called me, and I didn't hear him. Finally, he comes in. 'Son. What the . . . are you all right?' I said, 'Bubbles, Dad, look what they do.' He said, 'Son, if you're going to waste your time, waste it on something that may have some future consequence.' Well, I kind of rose to the challenge. That summer, I bicycled a couple of miles to a library in Kansas City and I spent every day reading every book and article I could find on bubbles."
Bubbles looked completely random, but young Copaken wasn't convinced. He built a bubble-making device involving an aerator from a fish tank, and at school he pleaded with the math department to teach him the quadratic equations he needed to show why the bubbles formed the way they did. Then he devised an experiment, and ended up with a bronze medal at the International Science Fair. His interest in bubbles was genuine, but the truth is that almost anything could have caught Copaken's eye: pop songs, movies, the movement of chimney smoke. What drew him was not so much solving this particular problem as the general principle that problems were solvable—that he, little Dick Copaken from Kansas City, could climb on his bicycle and ride to the library and figure out something that his father thought wasn't worth figuring out.
Copaken has written a memoir of his experience defending the tiny Puerto Rican islands of Culebra and Vieques against the U.S. Navy, which had been using their beaches for target practice. It is a riveting story. Copaken takes on the vast Navy bureaucracy, armed only with arcane provisions of environmental law. He investigates the nesting grounds of the endangered hawksbill turtle, and the mating habits of a tiny yet extremely loud tree frog known as the coqui, and at one point he transports four frozen whale heads from the Bahamas to Harvard Medical School. Copaken wins. The Navy loses.
The memoir reads like a David-and-Goliath story. It isn't. David changed the rules on Goliath. He brought a slingshot to a sword fight. People like Copaken, though, don't change the rules; they believe in rules. Copaken would have agreed to sword-on-sword combat. But then he would have asked the referee for a stay, deposed Goliath and his team at great length, and papered him with brief after brief until he conceded that his weapon did not qualify as a sword under §48(B)(6)(e) of the Samaria Convention of 321 B.C. (The Philistines would have settled.) And whereas David knew that he couldn't win a conventional fight with Goliath, the conviction that sustained Copaken's long battle with the Navy was, to the contrary, that so long as the battle remained conventional—so long as it followed the familiar pathways of the law and of due process—he really could win. Dick Copaken didn't think he was an underdog at all. If you believe in rules, Goliath is just another Philistine, and the Navy is just another plaintiff. As for the ineffable mystery of the Hollywood blockbuster? Well, Mr. Goldman, you may not know anything. But I do.
4.
Dick Copaken has a friend named Nick Meaney. They met on a case years ago. Meaney has thick dark hair. He is younger and much taller than Copaken, and seems to regard his friend with affectionate amusement. Meaney's background is in risk management, and for years he'd been wanting to bring the principles of that world to the movie business. In 2003, Meaney and Copaken were driving through the English countryside to Durham when Meaney told Copaken about a friend of his from college. The friend and his business partner were students of popular narrative: the sort who write essays for obscure journals serving the small band of people who think deeply about, say, the evolution of the pilot episode in transnational TV crime dramas. And, for some time, they had been developing a system for evaluating the commercial potential of stories. The two men, Meaney told Copaken, had broken down the elements of screenplay narrative into multiple categories, and then drawn on their encyclopedic knowledge of television and film to assign scripts a score in each of those categories—creating a giant screenplay report card. The system was extraordinarily elaborate. It was under constant refinement. It was also top secret. Henceforth, Copaken and Meaney would refer to the two men publicly only as "Mr. Pink" and "Mr. Brown," an homage to "Reservoir Dogs."
"The guy had a big wall, and he started putting up little Post-its covering everything you can think of," Copaken said. It was unclear whether he was talking about Mr. Pink or Mr. Brown or possibly some Obi-Wan Kenobi figure from whom Mr. Pink and Mr. Brown first learned their trade. "You know, the star wears a blue shirt. The star doesn't zip up his pants. Whatever. So he put all these factors up and began moving them around as the scripts were either successful or unsuccessful, and he began grouping them and eventually this evolved to a kind of ad-hoc analytical system. He had no theory as to what would work, he just wanted to know what did work."
Copaken and Meaney also shared a fascination with a powerful kind of computerized learning system called an artificial neural network. Neural networks are used for data mining—to look for patterns in very large amounts of data. In recent years, they have become a critical tool in many industries, and what Copaken and Meaney realized, when they thought about Mr. Pink and Mr. Brown, was that it might now be possible to bring neural networks to Hollywood. They could treat screenplays as mathematical propositions, using Mr. Pink and Mr. Brown's categories and scores as the motion-picture equivalents of melody, harmony, beat, tempo, rhythm, octave, pitch, chord progression, cadence, sonic brilliance, and frequency.
Copaken and Meaney brought in a former colleague of Meaney's named Sean Verity, and the three of them signed up Mr. Pink and Mr. Brown. They called their company Epagogix—a reference to Aristotle's discussion of epagogic, or inductive, learning—and they started with a "training set" of screenplays that Mr. Pink and Mr. Brown had graded. Copaken and Meaney won't disclose how many scripts were in the training set. But let's say it was two hundred. Those scores—along with the U.S. box-office receipts for each of the films made from those screenplays—were fed into a neural network built by a computer scientist of Meaney's acquaintance. "I can't tell you his name," Meaney said, "but he's English to his bootstraps." Mr. Bootstraps then went to work, trying to use Mr. Pink and Mr. Brown's scoring data to predict the box-office receipts of every movie in the training set. He started with the first film and had the neural network make a guess: maybe it said that the hero's moral crisis in act one, which rated a 7 on the 10-point moral-crisis scale, was worth $7 million, and having a gorgeous red-headed eighteen-year-old female lead whose characterization came in at 6.5 was worth $3 million and a 9-point bonding moment between the male lead and a four-year-old boy in act three was worth $2 million, and so on, putting a dollar figure on every grade on Mr. Pink and Mr. Brown's report card until the system came up with a prediction. Then it compared its guess with how that movie actually did. Was it close? Of course not. The neural network then went back and tried again. If it had guessed $20 million and the movie actually made $110 million, it would reweight the movie's Pink/Brown scores and run the numbers a second time. And then it would take the formula that worked best on Movie One and apply it to Movie Two, and tweak that until it had a formula that worked on Movies One and Two, and take that formula to Movie Three, and then to four and five, and on through all two hundred movies, whereupon it would go back through all the movies again, through hundreds of thousands of iterations, until it had worked out a formula that did the best possible job of predicting the financial success of every one of the movies in its database.
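Epagogix will not disclose its categories, its scores, or the network Mr. Bootstraps built, so what follows is only a toy version of the training loop described above, with made-up screenplay grades and box-office figures: a small neural network is fitted, through repeated passes over the training set, to predict receipts from a report card of script scores.

```python
# A hedged, toy sketch of the Epagogix-style training described above,
# using invented data; the real report-card categories and network are secret.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

n_scripts, n_categories = 200, 25        # e.g. moral crisis, locale, sidekick...
grades = rng.uniform(0, 10, size=(n_scripts, n_categories))

# A fake "true" relationship, so the demo has something to learn:
# box office in millions of dollars as a noisy function of the grades.
hidden_weights = rng.uniform(0.5, 8.0, size=n_categories)
box_office = grades @ hidden_weights + rng.normal(0, 10, size=n_scripts)

# The network repeatedly adjusts its internal weights until its
# predictions for the training set stop improving: the hundreds of
# thousands of iterations Meaney describes, in miniature.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(grades, box_office)

new_script = rng.uniform(0, 10, size=(1, n_categories))
print(f"Predicted U.S. box office: ${model.predict(new_script)[0]:.1f} million")
```

The point of the exercise is the same one the article makes: once the weights are fixed, any new script scored on the same report card gets a dollar figure, and nothing else about the project enters into it.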
That formula, the theory goes, can then be applied to new scripts. If you were developing a $75-million buddy picture for Bruce Willis and Colin Farrell, Epagogix says, it can tell you, based on past experience, what that script's particular combination of narrative elements can be expected to make at the box office. If the formula says it's a $50-million script, you pull the plug. "We shoot turkeys," Meaney said. He had seen Mr. Bootstraps and the neural network in action: "It can sometimes go on for hours. If you look at the computer, you see lots of flashing numbers in a gigantic grid. It's like 'The Matrix.' There are a lot of computations. The guy is there, the whole time, looking at it. It eventually stops flashing, and it tells us what it thinks the American box-office will be. A number comes out."
The way the neural network thinks is not that different from the way a Hollywood executive thinks: if you pitch a movie to a studio, the executive uses an ad-hoc algorithm—perfected through years of trial and error—to put a value on all the components in the story. Neural networks, though, can handle problems that have a great many variables, and they never play favorites—which means (at least in theory) that as long as you can give the neural network the same range of information that a human decision-maker has, it ought to come out ahead. That's what the University of Arizona computer scientist Hsinchun Chen demonstrated ten years ago, when he built a neural network to predict winners at the dog track. Chen used the ten variables that greyhound experts told him they used in making their bets—like fastest time and winning percentage and results for the past seven races—and trained his system with the results of two hundred races. Then he went to the greyhound track in Tucson and challenged three dog-racing handicappers to a contest. Everyone picked winners in a hundred races, at a modest two dollars a bet. The experts lost $71.40, $61.20, and $70.20, respectively. Chen won $124.80. It wasn't close, and one of the main reasons was the special interest the neural network showed in something called "race grade": greyhounds are moved up and down through a number of divisions, according to their ability, and dogs have a big edge when they've just been bumped down a level and a big handicap when they've just been bumped up. "The experts know race grade exists, but they don't weight it sufficiently," Chen said. "They are all looking at win percentage, place percentage, or thinking about the dogs' times."
Copaken and Meaney figured that Hollywood's experts also had biases and skipped over things that really mattered. If a neural network won at the track, why not Hollywood? "One of the most powerful aspects of what we do is the ruthless objectivity of our system," Copaken said. "It doesn't care about maintaining relationships with stars or agents or getting invited to someone's party. It doesn't care about climbing the corporate ladder. It has one master and one master only: how do you get to bigger box-office? Nobody else in Hollywood is like that."
In the summer of 2003, Copaken approached Josh Berger, a senior executive at Warner Bros. in Europe. Meaney was opposed to the idea: in his mind, it was too early. "I just screamed at Dick," he said. But Copaken was adamant. He had Mr. Bootstraps, Mr. Pink, and Mr. Brown run sixteen television pilots through the neural network, and try to predict the size of each show's eventual audience. "I told Josh, 'Stick this in a drawer, and I'll come back at the end of the season and we can check to see how we did,' " Copaken said. In January of 2004, Copaken tabulated the results. In six cases, Epagogix guessed the number of American homes that would tune in to a show to within .06 per cent. In thirteen of the sixteen cases, its predictions were within two per cent. Berger was floored. "It was incredible," he recalls. "It was like someone saying to you, 'We're going to show you how to count cards in Vegas.' It had that sort of quality."
Copaken then approached another Hollywood studio. He was given nine unreleased movies to analyze. Mr. Pink, Mr. Brown, and Mr. Bootstraps worked only from the script—without reference to the stars or the director or the marketing budget or the producer. On three of the films—two of which were low-budget—the Epagogix estimates were way off. On the remaining six—including two of the studio's biggest-budget productions—they correctly identified whether the film would make or lose money. On one film, the studio thought it had a picture that would make a good deal more than $100 million. Epagogix said $49 million. The movie made less than $40 million. On another, a big-budget picture, the team's estimate came within $1.2 million of the final gross. On a number of films, they were surprisingly close. "They were basically within a few million," a senior executive at the studio said. "It was shocking. It was kind of weird." Had the studio used Epagogix on those nine scripts before filming started, it could have saved tens of millions of dollars. "I was impressed by a couple of things," another executive at the same studio said. "I was impressed by the things they thought mattered to a movie. They weren't the things that we typically give credit to. They cared about the venue, and whether it was a love story, and very specific things about the plot that they were convinced determined the outcome more than anything else. It felt very objective. And they could care less about whether the lead was Tom Cruise or Tom Jones."
The Epagogix team knocked on other doors that weren't quite so welcoming. This was the problem with being a Kamesian. Your belief in a rule-bound universe was what gave you, an outsider, a claim to real expertise. But you were still an outsider. You were still Dick Copaken, the blue-blazered corporate lawyer who majored in bubbles as a little boy in Kansas City, and a couple of guys from the risk-management business, and three men called Pink, Brown, and Bootstraps—and none of you had ever made a movie in your life. And what were you saying? That stars didn't matter, that the director didn't matter, and that all that mattered was story—and, by the way, that you understood story the way the people on the inside, people who had spent a lifetime in the motion-picture business, didn't. "They called, and they said they had a way of predicting box-office success or failure, which is everyone's fantasy," one former studio chief recalled. "I said to them, 'I hope you're right.' " The executive seemed to think of the Epagogix team as a small band of Martians who had somehow slipped their U.F.O. past security. "In reality, there are so many circumstances that can affect a movie's success," the executive went on. "Maybe the actor or actress has an external problem. Or this great actor, for whatever reason, just fails. You have to fire a director. Or September 11th or some other thing happens. There are many people who have come forward saying they have a way of predicting box-office success, but so far nobody has been able to do it. I think we know something. We just don't know enough. I still believe in something called that magical thing—talent, the unexpected. The movie god has to shine on you." You were either a Kamesian or you weren't, and this person wasn't: "My first reaction to those guys? Bullshit."
5.
A few months ago, Dick Copaken agreed to lift the cloud of unknowing surrounding Epagogix, at least in part. He laid down three conditions: the meeting was to be in London, Mr. Pink and Mr. Brown would continue to be known only as Mr. Pink and Mr. Brown, and no mention was to be made of the team's current projects. After much discussion, an agreement was reached. Epagogix would analyze the 2005 movie "The Interpreter," which was directed by Sydney Pollack and starred Sean Penn and Nicole Kidman. "The Interpreter" had a complicated history, having gone through countless revisions, and there was a feeling that it could have done much better at the box office. If ever there was an ideal case study for the alleged wizardry of Epagogix, this was it.
The first draft of the movie was written by Charles Randolph, a philosophy professor turned screenwriter. It opened in the fictional African country of Matobo. Two men in a Land Rover pull up to a soccer stadium. A group of children lead them to a room inside the building. On the ground is a row of corpses.
Cut to the United Nations, where we meet Silvia Broome, a young woman who works as an interpreter. She goes to the U.N. Security Service and relates a terrifying story. The previous night, while working late in the interpreter's booth, she overheard two people plotting the assassination of Matobo's murderous dictator, Edmund Zuwanie, who is coming to New York to address the General Assembly. She says that the plotters saw her, and that her life may be in danger. The officer assigned to her case, Tobin Keller, is skeptical, particularly when he learns that she, too, is from Matobo, and that her parents were killed in the country's civil war. But after Broome suffers a series of threatening incidents Keller starts to believe her. His job is to protect Zuwanie, but he now feels moved to act as Broome's bodyguard as well. A quiet, slightly ambiguous romantic attraction begins to develop between them. Zuwanie's visit draws closer. Broome's job is to be his interpreter. On the day of the speech, Broome ends up in the greenroom with Zuwanie. Keller suddenly realizes the truth: that she has made up the whole story as a way of bringing Zuwanie to justice. He rushes to the greenroom. Broome, it seems, has poisoned Zuwanie and is withholding the antidote unless he goes onstage and confesses to the murder of his countrymen. He does. Broome escapes. A doctor takes a look at the poison. It's harmless. The doctor turns to the dictator, who has just been tricked into writing his own prison sentence: "You were never in danger, Mr. Zuwanie."
Randolph says that the film he was thinking of while he was writing "The Interpreter" was Francis Ford Coppola's classic "The Conversation." He wanted to make a spare, stark movie about an isolated figure. "She's a terrorist," Randolph said of Silvia Broome. "She comes to this country to do a very specific task, and when that task is done she's gone again. I wanted to write about this idea of a noble terrorist, who tried to achieve her ends with a character assassination, not a real assassination." Randolph realized that most moviegoers—and most Hollywood executives—prefer characters who have psychological motivations. But he wasn't trying to make "Die Hard." "Look, I'm the son of a preacher," he said. "I believe that ideology motivates people."
In 2004, Sydney Pollack signed on to direct the project. He loved the idea of an interpreter at the United Nations and the conceit of an overheard conversation. But he wanted to make a commercial movie, and parts of the script didn't feel right to him. He didn't like the twist at the end, for instance. "I felt like I had been tricked, because in fact there was no threat," Pollack said. "As much as I liked the original script, I felt like an audience would somehow, at the end, feel cheated." Pollack also felt that audiences would want much more from Silvia Broome's relationship with Tobin Keller. "I've never been able to do a movie without a love story in it," he said. "For me, the heart of it is always the man and the woman and who they are and what they are going through." Pollack brought Randolph back for rewrites. He then hired Scott Frank and Steven Zaillian, two of the most highly sought-after screenwriters in Hollywood—and after several months the story was turned inside out. Now Broome didn't tell the story of overhearing that conversation. It actually happened. She wasn't a terrorist anymore. She was a victim. She wasn't an isolated figure. She was given a social life. She wasn't manipulating Keller. Their relationship was more prominent. A series of new characters—political allies and opponents of Zuwanie's—were added, as was a scene in Brooklyn where a bus explodes, almost killing Broome. "I remember when I came on 'Minority Report,' and started over," said Frank, who wrote many of the new scenes for "The Interpreter." "There weren't many characters. When I finished, there were two mysteries and a hundred characters. I have diarrhea of the plot. This movie cried out for that. There are never enough suspects and red herrings."
The lingering problem, though, was the ending. If Broome wasn't after Zuwanie, who was? "We struggled," Pollack said. "It was a long process, to the point where we almost gave up." In the end, Zuwanie was made the engineer of the plot: he fakes the attempt on his life in order to justify his attacks on his enemies back home. Zuwanie hires a man to shoot him, and then another of Zuwanie's men shoots the assassin before he can do the job—and in the chaos Broome ends up with a gun in her hand, training it on Zuwanie. "The end was the hardest part," Frank said. "All these balls were in the air. But I couldn't find a satisfying way to resolve it. We had to put a gun in the hand of a pacifist. I couldn't quite sew it up in the right way. Sydney kept saying, 'You're so close.' But I kept saying, 'Yeah, but I don't believe what I'm writing.' I wonder if I did a disservice to 'The Interpreter.' I don't know that I made it better. I may have just made it different."
This, then, was the question for Epagogix: If Pollack's goal was to make "The Interpreter" a more commercial movie, how well did he succeed? And could he have done better?
6.
The debriefing took place in central London, behind the glass walls of the private dining room of a Mayfair restaurant. The waiters came in waves, murmuring their announcements of the latest arrival from the kitchen. The table was round. Copaken, dapper as always in his navy blazer, sat next to Sean Verity, followed by Meaney, Mr. Brown, and Mr. Pink. Mr. Brown was very tall, and seemed to have a northern English accent. Mr. Pink was slender and graying, and had an air of authority about him. His academic training was in biochemistry. He said he thought that, in the highly emotional business of Hollywood, having a scientific background was quite useful. There was no sign of Mr. Bootstraps.
Mr. Pink began by explaining the origins of their system. "There were certain historical events that allowed us to go back and test how appealing one film was against another," he said. "The very simple one is that in the English market, in the sixties on Sunday night, religious programming aired on the major networks. Nobody watched it. And, as soon as that finished, movies came on. There were no lead-ins, and only two competing channels. Plus, across the country you had a situation where the commercial sector was playing a whole variety of movies against the standard, the BBC. It might be a John Wayne movie in Yorkshire, and a musical in Somerset, and the BBC would be the same movie everywhere. So you had a control. It was very pure and very simple. That was a unique opportunity to try and make some guesstimates as to why movies were doing what they were doing."
Brown nodded. "We built a body of evidence until we had something systematic," he said.
Pink estimated that they had analyzed thousands of movies. "The thing is that not everything comes to you as a script. For a long period, we worked for a broadcaster who used to send us a couple of paragraphs. We made our predictions based on that much. Having the script is actually too much information sometimes. You're trying to replicate what the audience is doing. They're trying to make a choice between three movies, and all they have at that point is whatever they've seen in TV Guide or on any trailer they've seen. We have to take a piece here and a piece here. Take a couple of reference points. When I look at a story, there are certain things I'm looking for—certain themes, and characters you immediately focus on." He thought for a moment. "That's not to deny that it matters whether the lead character wears a hat," he added, in a way that suggested he and Mr. Brown had actually thought long and hard about leads and hats.
"There's always a pattern," he went on. "There are certain stories that come back, time and time again, and that always work. You know, whenever we go into a market—and we work in fifty markets—the initial thing people say is 'What do you know about our market?' The assumption is that, say, Japan is different from us—that there has to be something else going on there. But, basically, they're just like us. It's the consistency of these reappearing things that I find amazing."
"Biblical stories are a classic case," Mr. Brown put in. "There is something about what they're telling and the message that's coming out that seems to be so universal. With Mel Gibson's 'The Passion,' people always say, 'Who could have predicted that?' And the answer is, we could have."
They had looked at "The Interpreter" scripts a few weeks earlier. The process typically takes them a day. They read, they graded, and then they compared notes, because Mr. Pink was the sort who went for "Yojimbo" and Mr. Brown's favorite movie was "Alien" (the first one), so they didn't always agree. Mr. Brown couldn't remember a single script he'd read where he thought there wasn't room for improvement, and Mr. Pink, when asked the same question, could come up with just one: "Lethal Weapon." "A friend of mine gave me the shooting script before it came out, and I remember reading it and thinking, It's all there. It was all on the page." Once Mr. Pink and Mr. Brown had scored "The Interpreter," they gave their analyses to Mr. Bootstraps, who did fifteen runs through the neural network: the original Randolph script, the shooting script, and certain variants of the plot that Epagogix devised. Mr. Bootstraps then passed his results to Copaken, who wrote them up. The Epagogix reports are always written by Copaken, and they are models of lawyerly thoroughness. This one ran to thirty-eight pages. He had finished the final draft the night before, very late. He looked fresh as a daisy.
Mr. Pink started with the original script. "My pure reaction? I found it very difficult to read. I got confused. I had to reread bits. We do this a lot. If a project takes more than an hour to read, then there's something going on that I'm not terribly keen on."
"It didn't feel to me like a mass-appeal movie," Mr. Brown added. "It seemed more niche."
When Mr. Bootstraps ran Randolph's original draft through the neural network, the computer called it a $33-million movie—an "intelligent" thriller, in the same commercial range as "The Constant Gardener" or "Out of Sight." According to the formula, the final shooting script was a $69-million picture (an estimate that came within $4 million of the actual box-office). Mr. Brown wasn't surprised. The shooting script, he said, "felt more like an American movie, where the first one seemed European in style."
Everyone agreed, though, that Pollack could have done much better. There was, first of all, the matter of the United Nations. "They had a unique opportunity to get inside the building," Mr. Pink said. "But I came away thinking that it could have been set in any boxy office tower in Manhattan. An opportunity was missed. That's when we get irritated—when there are opportunities that could very easily be turned into something that would actually have had an impact."
"Locale is an extra character," Mr. Brown said. "But in this case it's a very bland character that didn't really help."
In the Epagogix secret formula, it seemed, locale matters a great deal. "You know, there's a big difference between city and countryside," Mr. Pink said. "It can have a huge effect on a movie's ability to draw in viewers. And writers just do not take advantage of it. We have a certain set of values that we attach to certain places."
Mr. Pink and Mr. Brown ticked off the movies and television shows that they thought understood the importance of locale: "Crimson Tide," "Lawrence of Arabia," "Lost," "Survivor," "Castaway," "Deliverance." Mr. Pink said, "The desert island is something that we have always recognized as a pungent backdrop, but it's not used that often. In the same way, prisons can be a powerful environment, because they are so well defined." The U.N. could have been like that, but it wasn't. Then there was the problem of starting, as both scripts did, in Africa—and not just Africa but a fictional country in Africa. The whole team found that crazy. "Audiences are pretty parochial, by and large," Mr. Pink said. "If you start off by telling them, 'We're going to begin this movie in Africa,' you're going to lose them. They've bought their tickets. But when they come out they're going to say, 'It was all right. But it was Africa.' " The whole thing seemed to leave Mr. Pink quite distressed. He looked at Mr. Brown beseechingly.
Mr. Brown changed the subject. "It's amazing how often quite little things, quite small aspects, can spoil everything," he said. "I remember seeing the trailer for 'V for Vendetta' and deciding against it right there, for one very simple reason: there was a ridiculous mask on the main character. If you can't see the face of the character, you can't tell what that person is thinking. You can't tell who they are. With 'Spider-Man' and 'Superman,' though, you do see the face, so you respond to them."
The team once gave a studio a script analysis in which almost everything they suggested was, in Hollywood terms, small. They wanted the lead to jump off the page a little more. They wanted the lead to have a young sidekick—a relatively minor character—to connect with a younger demographic, and they wanted the city where the film was set to be much more of a presence. The neural network put the potential value of better characterization at an extra $2.46 million in U.S. box-office revenue; the value of locale adjustment at $4.92 million; the value of a sidekick at $12.3 million—and the value of all three together (given the resulting synergies) at $24.6 million. That's another $25 million for a few weeks of rewrites and maybe a day or two of extra filming. Mr. Bootstraps, incidentally, ran the numbers and concluded that the script would make $47 million if the suggested changes were not made. The changes were not made. The movie made $50 million.
Mr. Pink and Mr. Brown went on to discuss the second "Interpreter" screenplay, the shooting script. They thought the ending was implausible. Charles Randolph had originally suggested that the Tobin Keller character be black, not white, in order to create the frisson of bringing together a white African and a black American. Mr. Pink and Mr. Brown independently came to the same conclusion. Apparently, the neural network ran the numbers on movies that paired black and white leads—"Lethal Weapon," "The Crying Game," "Independence Day," "Men in Black," "Die Another Day," "The Pelican Brief"—and found that the black-white combination could increase box-office revenue. The computer did the same kind of analysis on Scott Frank's "diarrhea of the plot," and found that there were too many villains. And if Silvia Broome was going to be in danger, Mr. Bootstraps made clear, she really had to be in danger.
"Our feeling—and Dick, you may have to jump in here—is that the notion of a woman in peril is a very powerful narrative element," Mr. Pink said. He glanced apprehensively at Copaken, evidently concerned that what he was about to say might fall in the sensitive category of the proprietary. "How powerful?" He chose his words carefully. "Well above average. And the problem is that we lack a sense of how much danger she is in, so an opportunity is missed. There were times when you were thinking, Is this something she has created herself? Is someone actually after her? You are confused. There is an element of doubt, and that ambiguity makes it possible to doubt the danger of the situation." Of course, all that ambiguity was there because in the Randolph script she was making it all up, and we were supposed to doubt the danger of the situation. But Mr. Pink and Mr. Brown believed that, once you decided you weren't going to make a European-style niche movie, you had to abandon ambiguity altogether.
"You've got to make the peril real," Mr. Pink said.
The Epagogix revise of "The Interpreter" starts with an upbeat Silvia Broome walking into the United Nations, flirting with the security guard. The two men plotting the assassination later see her and chase her through the labyrinthine corridors of what could only be the U.N. building. The ambiguous threats to Broome's life are now explicit. At one point in the Epagogix version, a villain pushes Broome's Vespa off one of Manhattan's iconic East River bridges. She hangs on to her motorbike for dear life, as it swings precariously over the edge of the parapet. Tobin Keller, in a police helicopter, swoops into view: "As she clings to Tobin's muscular body while the two of them are hoisted up into the hovering helicopter, we sense that she is feeling more than relief." In the Epagogix ending, Broome stabs one of Zuwanie's security men with a knife. Zuwanie storms off the stage, holds a press conference, and is shot dead by a friend of Broome's brother. Broome cradles the dying man in her arms. He "dies peacefully," with "a smile on his blood-spattered face." Then she gets appointed Matobo's U.N. ambassador. She turns to Keller. "'This time,' she notes with a wry smile . . . 'you will have to protect me.'" Bootstraps's verdict was that this version would result in a U.S. box-office of $111 million.
"It's funny," Mr. Pink said. "This past weekend, 'The Bodyguard' was on TV. Remember that piece of"—he winced—"entertainment? Which is about a bodyguard and a woman. The final scene is that they are right back together. It is very clearly and deliberately sown. That is the commercial way, if you want more bodies in the seats."
"You have to either consummate it or allow for the possibility of that," Copaken agreed.
They were thinking now of what would happen if they abandoned all fealty to the original, and simply pushed the movie's premise as far as they could possibly go.
Mr. Pink went on, "If Dick had said, 'You can take this project wherever you want,' we probably would have ended up with something a lot closer to 'The Bodyguard'—where you have a much more romantic film, a much more powerful focus to the two characters—without all the political stuff going on in the background. You go for the emotions on a very basic level. What would be the upper limit on that? You know, the upper limit of anything these days is probably still 'Titanic.' I'm not saying we could do six hundred million dollars. But it could be two hundred million."
7.
It was clear that the whole conversation was beginning to make Mr. Pink uncomfortable. He didn't like "The Bodyguard." Even the title made him wince. He was the sort who liked "Yojimbo," after all. The question went around the room: What would you do with "The Interpreter"? Sean Verity wanted to juice up the action-adventure elements and push it to the $150- to $160-million range. Meaney wanted to do without expensive stars: he didn't think they were worth the money. Copaken wanted more violence, and he also favored making Keller black. But he didn't want to go all the way to "The Bodyguard," either. This was a man who loved "Dear Frankie" as much as any film he'd seen in recent memory, and "Dear Frankie" had a domestic box-office gross of $1.3 million. If you followed the rules of Epagogix, there wouldn't be any movies like "Dear Frankie." The neural network had one master, the market, and answered one question: how do you get to bigger box-office? But once a movie had made you vulnerable—once you couldn't even retell the damn story without getting emotional—you couldn't be content with just one master anymore.
That was the thing about the formula: it didn't make the task of filmmaking easier. It made it harder. So long as nobody knows anything, you've got license to do whatever you want. You can start a movie in Africa. You can have male and female leads not go off together—all in the name of making something new. Once you came to think that you knew something, though, you had to decide just how much money you were willing to risk for your vision. Did the Epagogix team know what the answer to that question was? Of course not. That question required imagination, and they weren't in the imagination business. They were technicians with tools: computer programs and analytical systems and proprietary software that calculated mathematical relationships among a laundry list of structural variables. At Platinum Blue, Mike McCready could tell you that the bass line was pushing your song out of the center of hit cluster 31. But he couldn't tell you exactly how to fix the bass line, and he couldn't guarantee that the redone version would still sound like a hit, and you didn't see him releasing his own album of computer-validated pop music. A Kamesian had only to read Lord Kames to appreciate the distinction. The most arrogant man in the world was a terrible writer: clunky, dense, prolix. He knew the rules of art. But that didn't make him an artist.
Mr. Brown spoke last. "I don't think it needs to be a big-budget picture," he said. "I think we do what we can with the original script to make it a strong story, with an ending that is memorable, and then do a slow release. A low-budget picture. One that builds through word of mouth—something like that." He was confident that he had the means to turn a $69-million script into a $111-million movie, and then again into a $150- to $200-million blockbuster. But it had been a long afternoon, and part of him had a stubborn attachment to "The Interpreter" in something like its original form. Mr. Bootstraps might have disagreed. But Mr. Bootstraps was nowhere to be seen.
Dangerous Minds
November 12, 2007
Dept. of Criminology
Criminal profiling made easy.
1.
On November 16, 1940, workers at the Consolidated Edison building on West Sixty-fourth Street in Manhattan found a homemade pipe bomb on a windowsill. Attached was a note: "Con Edison crooks, this is for you." In September of 1941, a second bomb was found, on Nineteenth Street, just a few blocks from Con Edison's headquarters, near Union Square. It had been left in the street, wrapped in a sock. A few months later, the New York police received a letter promising to "bring the Con Edison to justice—they will pay for their dastardly deeds." Sixteen other letters followed, between 1941 and 1946, all written in block letters, many repeating the phrase "dastardly deeds" and all signed with the initials "F.P." In March of 1950, a third bomb—larger and more powerful than the others—was found on the lower level of Grand Central Terminal. The next was left in a phone booth at the New York Public Library. It exploded, as did one placed in a phone booth in Grand Central. In 1954, the Mad Bomber—as he came to be known—struck four times, once in Radio City Music Hall, sending shrapnel throughout the audience. In 1955, he struck six times. The city was in an uproar. The police were getting nowhere. Late in 1956, in desperation, Inspector Howard Finney, of the New York City Police Department's crime laboratory, and two plainclothesmen paid a visit to a psychiatrist by the name of James Brussel.
Brussel was a Freudian. He lived on Twelfth Street, in the West Village, and smoked a pipe. In Mexico, early in his career, he had done counter-espionage work for the F.B.I. He wrote many books, including "Instant Shrink: How to Become an Expert Psychiatrist in Ten Easy Lessons." Finney put a stack of documents on Brussel's desk: photographs of unexploded bombs, pictures of devastation, photostats of F.P.'s neatly lettered missives. "I didn't miss the look in the two plainclothesmen's eyes," Brussel writes in his memoir, "Casebook of a Crime Psychiatrist." "I'd seen that look before, most often in the Army, on the faces of hard, old-line, field-grade officers who were sure this newfangled psychiatry business was all nonsense."
He began to leaf through the case materials. For sixteen years, F.P. had been fixated on the notion that Con Ed had done him some terrible injustice. Clearly, he was clinically paranoid. But paranoia takes some time to develop. F.P. had been bombing since 1940, which suggested that he was now middle-aged. Brussel looked closely at the precise lettering of F.P.'s notes to the police. This was an orderly man. He would be cautious. His work record would be exemplary. Further, the language suggested some degree of education. But there was a stilted quality to the word choice and the phrasing. Con Edison was often referred to as "the Con Edison." And who still used the expression "dastardly deeds"? F.P. seemed to be foreign-born. Brussel looked closer at the letters, and noticed that all the letters were perfect block capitals, except the "W"s. They were misshapen, like two "U"s. To Brussel's eye, those "W"s looked like a pair of breasts. He flipped to the crime-scene descriptions. When F.P. planted his bombs in movie theatres, he would slit the underside of the seat with a knife and stuff his explosives into the upholstery. Didn't that seem like a symbolic act of penetrating a woman, or castrating a man—or perhaps both? F.P. had probably never progressed beyond the Oedipal stage. He was unmarried, a loner. Living with a mother figure. Brussel made another leap. F.P. was a Slav. Just as the use of a garrote would have suggested someone of Mediterranean extraction, the bomb-knife combination struck him as Eastern European. Some of the letters had been posted from Westchester County, but F.P. wouldn't have mailed the letters from his home town. Still, a number of cities in southeastern Connecticut had a large Slavic population. And didn't you have to pass through Westchester to get to the city from Connecticut?
Brussel waited a moment, and then, in a scene that has become legendary among criminal profilers, he made a prediction:
"One more thing." I closed my eyes because I didn't want to see their reaction. I saw the Bomber: impeccably neat, absolutely proper. A man who would avoid the newer styles of clothing until long custom had made them conservative. I saw him clearly—much more clearly than the facts really warranted. I knew I was letting my imagination get the better of me, but I couldn't help it.
"One more " I said, my eyes closed tight. "When you catch him—and I have no doubt you will—he'll be wearing a double-breasted suit."
"Jesus!" one of the detectives whispered.
"And it will be buttoned," I said. I opened my eyes. Finney and his men were looking at each other.
"A double-breasted suit," said the Inspector.
"Yes."
"Buttoned."
"Yes."
He nodded. Without another word, they left.
A month later, George Metesky was arrested by police in connection with the New York City bombings. His name had been changed from Milauskas. He lived in Waterbury, Connecticut, with his two older sisters. He was unmarried. He was unfailingly neat. He attended Mass regularly. He had been employed by Con Edison from 1929 to 1931, and claimed to have been injured on the job. When he opened the door to the police officers, he said, "I know why you fellows are here. You think I'm the Mad Bomber." It was midnight, and he was in his pajamas. The police asked that he get dressed. When he returned, his hair was combed into a pompadour and his shoes were newly shined. He was also wearing a double-breasted suit—buttoned.
2.
In a new book, "Inside the Mind of BTK," the eminent F.B.I. criminal profiler John Douglas tells the story of a serial killer who stalked the streets of Wichita, Kansas, in the nineteen-seventies and eighties. Douglas was the model for Agent Jack Crawford in "The Silence of the Lambs." He was the protégé of the pioneering F.B.I. profiler Howard Teten, who helped establish the bureau's Behavioral Science Unit, at Quantico, in 1972, and who was a protégé of Brussel—which, in the close-knit fraternity of profilers, is like being analyzed by the analyst who was analyzed by Freud. To Douglas, Brussel was the father of criminal profiling, and, in both style and logic, "Inside the Mind of BTK" pays homage to "Casebook of a Crime Psychiatrist" at every turn.
"BTK" stood for "Bind, Torture, Kill"—the three words that the killer used to identify himself in his taunting notes to the Wichita police. He had struck first in January, 1974, when he killed thirty-eight-year-old Joseph Otero in his home, along with his wife, Julie, their son, Joey, and their eleven-year-old daughter, who was found hanging from a water pipe in the basement with semen on her leg. The following April, he stabbed a twenty-four-year-old woman. In March, 1977, he bound and strangled another young woman, and over the next few years he committed at least four more murders. The city of Wichita was in an uproar. The police were getting nowhere. In 1984, in desperation, two police detectives from Wichita paid a visit to Quantico.
The meeting, Douglas writes, was held in a first-floor conference room of the F.B.I.'s forensic-science building. He was then nearly a decade into his career at the Behavioral Science Unit. His first two best-sellers, "Mindhunter: Inside the FBI's Elite Serial Crime Unit," and "Obsession: The FBI's Legendary Profiler Probes the Psyches of Killers, Rapists, and Stalkers and Their Victims and Tells How to Fight Back," were still in the future. Working a hundred and fifty cases a year, he was on the road constantly, but BTK was never far from his thoughts. "Some nights I'd lie awake asking myself, 'Who the hell is this BTK?' " he writes. "What makes a guy like this do what he does? What makes him tick?"
Roy Hazelwood sat next to Douglas. A lean chain-smoker, Hazelwood specialized in sex crimes, and went on to write the best-sellers "Dark Dreams" and "The Evil That Men Do." Beside Hazelwood was an ex-Air Force pilot named Ron Walker. Walker, Douglas writes, was "whip smart" and an "exceptionally quick study." The three bureau men and the two detectives sat around a massive oak table. "The objective of our session was to keep moving forward until we ran out of juice," Douglas writes. They would rely on the typology developed by their colleague Robert Ressler, himself the author of the true-crime best-sellers "Whoever Fights Monsters" and "I Have Lived in the Monster." The goal was to paint a picture of the killer—of what sort of man BTK was, and what he did, and where he worked, and what he was like—and with that scene "Inside the Mind of BTK" begins.
We are now so familiar with crime stories told through the eyes of the profiler that it is easy to lose sight of how audacious the genre is. The traditional detective story begins with the body and centers on the detective's search for the culprit. Leads are pursued. A net is cast, widening to encompass a bewilderingly diverse pool of suspects: the butler, the spurned lover, the embittered nephew, the shadowy European. That's a Whodunit. In the profiling genre, the net is narrowed. The crime scene doesn't initiate our search for the killer. It defines the killer for us. The profiler sifts through the case materials, looks off into the distance, and knows. "Generally, a psychiatrist can study a man and make a few reasonable predictions about what the man may do in the future—how he will react to such-and-such a stimulus, how he will behave in such-and-such a situation," Brussel writes. "What I have done is reverse the terms of the prophecy. By studying a man's deeds, I have deduced what kind of man he might be." Look for a middle-aged Slav in a double-breasted suit. Profiling stories aren't Whodunits; they're Hedunits.
In the Hedunit, the profiler does not catch the criminal. That's for local law enforcement. He takes the meeting. Often, he doesn't write down his predictions. It's up to the visiting police officers to take notes. He does not feel the need to involve himself in the subsequent investigation, or even, it turns out, to justify his predictions. Once, Douglas tells us, he drove down to the local police station and offered his services in the case of an elderly woman who had been savagely beaten and sexually assaulted. The detectives working the crime were regular cops, and Douglas was a bureau guy, so you can imagine him perched on the edge of a desk, the others pulling up chairs around him.
" 'Okay,' I said to the detectives. . . . 'Here's what I think,' " Douglas begins. "It's a sixteen- or seventeen-year-old high school kid. . . . He'll be disheveled-looking, he'll have scruffy hair, generally poorly groomed." He went on: a loner, kind of weird, no girlfriend, lots of bottled-up anger. He comes to the old lady's house. He knows she's alone. Maybe he's done odd jobs for her in the past. Douglas continues:
I pause in my narrative and tell them there's someone who meets this description out there. If they can find him, they've got their offender.
One detective looks at another. One of them starts to smile. "Are you a psychic, Douglas?"
"No," I say, "but my job would be a lot easier if I were."
"Because we had a psychic, Beverly Newton, in here a couple of weeks ago, and she said just about the same things."
You might think that Douglas would bridle at that comparison. He is, after all, an agent of the Federal Bureau of Investigation, who studied with Teten, who studied with Brussel. He is an ace profiler, part of a team that restored the F.B.I.'s reputation for crime-fighting, inspired countless movies, television shows, and best-selling thrillers, and brought the modern tools of psychology to bear on the savagery of the criminal mind—and some cop is calling him a psychic. But Douglas doesn't object. Instead, he begins to muse on the ineffable origins of his insights, at which point the question arises of what exactly this mysterious art called profiling is, and whether it can be trusted. Douglas writes,
What I try to do with a case is to take in all the evidence I have to work with . . . and then put myself mentally and emotionally in the head of the offender. I try to think as he does. Exactly how this happens, I'm not sure, any more than the novelists such as Tom Harris who've consulted me over the years can say exactly how their characters come to life. If there's a psychic component to this, I won't run from it.
3.
In the late nineteen-seventies, John Douglas and his F.B.I. colleague Robert Ressler set out to interview the most notorious serial killers in the country. They started in California, since, as Douglas says, "California has always had more than its share of weird and spectacular crimes." On weekends and days off, over the next months, they stopped by one federal prison after another, until they had interviewed thirty-six murderers.
Douglas and Ressler wanted to know whether there was a pattern that connected a killer's life and personality with the nature of his crimes. They were looking for what psychologists would call a homology, an agreement between character and action, and, after comparing what they learned from the killers with what they already knew about the characteristics of their murders, they became convinced that they'd found one.
Serial killers, they concluded, fall into one of two categories. Some crime scenes show evidence of logic and planning. The victim has been hunted and selected, in order to fulfill a specific fantasy. The recruitment of the victim might involve a ruse or a con. The perpetrator maintains control throughout the offense. He takes his time with the victim, carefully enacting his fantasies. He is adaptable and mobile. He almost never leaves a weapon behind. He meticulously conceals the body. Douglas and Ressler, in their respective books, call that kind of crime "organized."
In a "disorganized" crime, the victim isn't chosen logically. She's seemingly picked at random and "blitz-attacked," not stalked and coerced. The killer might grab a steak knife from the kitchen and leave the knife behind. The crime is so sloppily executed that the victim often has a chance to fight back. The crime might take place in a high-risk environment. "Moreover, the disorganized killer has no idea of, or interest in, the personalities of his victims," Ressler writes in "Whoever Fights Monsters." "He does not want to know who they are, and many times takes steps to obliterate their personalities by quickly knocking them unconscious or covering their faces or otherwise disfiguring them."
Each of these styles, the argument goes, corresponds to a personality type. The organized killer is intelligent and articulate. He feels superior to those around him. The disorganized killer is unattractive and has a poor self-image. He often has some kind of disability. He's too strange and withdrawn to be married or have a girlfriend. If he doesn't live alone, he lives with his parents. He has pornography stashed in his closet. If he drives at all, his car is a wreck.
"The crime scene is presumed to reflect the murderer's behavior and personality in much the same way as furnishings reveal the homeowner's character," we're told in a crime manual that Douglas and Ressler helped write. The more they learned, the more precise the associations became. If the victim was white, the killer would be white. If the victim was old, the killer would be sexually immature.
"In our research, we discovered that . . . frequently serial offenders had failed in their efforts to join police departments and had taken jobs in related fields, such as security guard or night watchman," Douglas writes. Given that organized rapists were preoccupied with control, it made sense that they would be fascinated by the social institution that symbolizes control. Out of that insight came another prediction: "One of the things we began saying in some of our profiles was that the UNSUB"—the unknown subject—"would drive a policelike vehicle, say a Ford Crown Victoria or Chevrolet Caprice."
4.
On the surface, the F.B.I.'s system seems extraordinarily useful. Consider a case study widely used in the profiling literature. The body of a twenty-six-year-old special-education teacher was found on the roof of her Bronx apartment building. She was apparently abducted just after she left her house for work, at six-thirty in the morning. She had been beaten beyond recognition, and tied up with her stockings and belt. The killer had mutilated her sexual organs, chopped off her nipples, covered her body with bites, written obscenities across her abdomen, masturbated, and then defecated next to the body.
Let's pretend that we're an F.B.I. profiler. First question: race. The victim is white, so let's call the offender white. Let's say he's in his mid-twenties to early thirties, which is when the thirty-six men in the F.B.I.'s sample started killing. Is the crime organized or disorganized? Disorganized, clearly. It's on a rooftop, in the Bronx, in broad daylight—high risk. So what is the killer doing in the building at six-thirty in the morning? He could be some kind of serviceman, or he could live in the neighborhood. Either way, he appears to be familiar with the building. He's disorganized, though, so he's not stable. If he is employed, it's blue-collar work, at best. He probably has a prior offense, having to do with violence or sex. His relationships with women will be either nonexistent or deeply troubled. And the mutilation and the defecation are so strange that he's probably mentally ill or has some kind of substance-abuse problem. How does that sound? As it turns out, it's spot-on. The killer was Carmine Calabro, age thirty, a single, unemployed, deeply troubled actor who, when he was not in a mental institution, lived with his widowed father on the fourth floor of the building where the murder took place.
But how useful is that profile, really? The police already had Calabro on their list of suspects: if you're looking for the person who killed and mutilated someone on the roof, you don't really need a profiler to tell you to check out the dishevelled, mentally ill guy living with his father on the fourth floor.
That's why the F.B.I.'s profilers have always tried to supplement the basic outlines of the organized/disorganized system with telling details—something that lets the police zero in on a suspect. In the early eighties, Douglas gave a presentation to a roomful of police officers and F.B.I. agents in Marin County about the Trailside Killer, who was murdering female hikers in the hills north of San Francisco. In Douglas's view, the killer was a classic "disorganized" offender—a blitz attacker, white, early to mid-thirties, blue collar, probably with "a history of bed-wetting, fire-starting, and cruelty to animals." Then he went back to how asocial the killer seemed. Why did all the killings take place in heavily wooded areas, miles from the road? Douglas reasoned that the killer required such seclusion because he had some condition that he was deeply self-conscious about. Was it something physical, like a missing limb? But then how could he hike miles into the woods and physically overpower his victims? Finally, it came to him: " 'Another thing,' I added after a pregnant pause, 'the killer will have a speech impediment.' "
And so he did. Now, that's a useful detail. Or is it? Douglas then tells us that he pegged the killer's age as early thirties, and he turned out to be fifty. Detectives use profiles to narrow down the range of suspects. It doesn't do any good to get a specific detail right if you get general details wrong.
In the case of Derrick Todd Lee, the Baton Rouge serial killer, the F.B.I. profile described the offender as a white male blue-collar worker, between twenty-five and thirty-five years old, who "wants to be seen as someone who is attractive and appealing to women." The profile went on, "However, his level of sophistication in interacting with women, especially women who are above him in the social strata, is low. Any contact he has had with women he has found attractive would be described by these women as 'awkward.' " The F.B.I. was right about the killer being a blue-collar male between twenty-five and thirty-five. But Lee turned out to be charming and outgoing, the sort to put on a cowboy hat and snakeskin boots and head for the bars. He was an extrovert with a number of girlfriends and a reputation as a ladies' man. And he wasn't white. He was black.
A profile isn't a test, where you pass if you get most of the answers right. It's a portrait, and all the details have to cohere in some way if the image is to be helpful. In the mid-nineties, the British Home Office analyzed a hundred and eighty-four crimes, to see how many times profiles led to the arrest of a criminal. The profile worked in five of those cases. That's just 2.7 per cent, which makes sense if you consider the position of the detective on the receiving end of a profiler's list of conjectures. Do you believe the stuttering part? Or do you believe the thirty-year-old part? Or do you throw up your hands in frustration?
5.
There is a deeper problem with F.B.I. profiling. Douglas and Ressler didn't interview a representative sample of serial killers to come up with their typology. They talked to whoever happened to be in the neighborhood. Nor did they interview their subjects according to a standardized protocol. They just sat down and chatted, which isn't a particularly firm foundation for a psychological system. So you might wonder whether serial killers can really be categorized by their level of organization.
Not long ago, a group of psychologists at the University of Liverpool decided to test the F.B.I.'s assumptions. First, they made a list of crime-scene characteristics generally considered to show organization: perhaps the victim was alive during the sex acts, or the body was posed in a certain way, or the murder weapon was missing, or the body was concealed, or torture and restraints were involved. Then they made a list of characteristics showing disorganization: perhaps the victim was beaten, the body was left in an isolated spot, the victim's belongings were scattered, or the murder weapon was improvised.
If the F.B.I. was right, they reasoned, the crime-scene details on each of those two lists should "co-occur"—that is, if you see one or more organized traits in a crime, there should be a reasonably high probability of seeing other organized traits. When they looked at a sample of a hundred serial crimes, however, they couldn't find any support for the F.B.I.'s distinction. Crimes don't fall into one camp or the other. It turns out that they're almost always a mixture of a few key organized traits and a random array of disorganized traits. Laurence Alison, one of the leaders of the Liverpool group and the author of "The Forensic Psychologist's Casebook," told me, "The whole business is a lot more complicated than the F.B.I. imagines."
Alison and another of his colleagues also looked at homology. If Douglas was right, then a certain kind of crime should correspond to a certain kind of criminal. So the Liverpool group selected a hundred stranger rapes in the United Kingdom, classifying them according to twenty-eight variables, such as whether a disguise was worn, whether compliments were given, whether there was binding, gagging, or blindfolding, whether there was apologizing or the theft of personal property, and so on. They then looked at whether the patterns in the crimes corresponded to attributes of the criminals—like age, type of employment, ethnicity, level of education, marital status, number of prior convictions, type of prior convictions, and drug use. Were rapists who bind, gag, and blindfold more like one another than they were like rapists who, say, compliment and apologize? The answer is no—not even slightly.
"The fact is that different offenders can exhibit the same behaviors for completely different reasons," Brent Turvey, a forensic scientist who has been highly critical of the F.B.I.'s approach, says. "You've got a rapist who attacks a woman in the park and pulls her shirt up over her face. Why? What does that mean? There are ten different things it could mean. It could mean he ''t want to see her. It could mean he doesn't want her to see him. It could mean he wants to see her breasts, he wants to imagine someone else, he wants to incapacitate her arms—all of those are possibilities. You can't just look at one behavior in isolation."
A few years ago, Alison went back to the case of the teacher who was murdered on the roof of her building in the Bronx. He wanted to know why, if the F.B.I.'s approach to criminal profiling was based on such simplistic psychology, it continues to have such a sterling reputation. The answer, he suspected, lay in the way the profiles were written, and, sure enough, when he broke down the rooftop-killer analysis, sentence by sentence, he found that it was so full of unverifiable and contradictory and ambiguous language that it could support virtually any interpretation.
Astrologers and psychics have known these tricks for years. The magician Ian Rowland, in his classic "The Full Facts Book of Cold Reading," itemizes them one by one, in what could easily serve as a manual for the beginner profiler. First is the Rainbow Ruse—the "statement which credits the client with both a personality trait and its opposite." ("I would say that on the whole you can be rather a quiet, self effacing type, but when the circumstances are right, you can be quite the life and soul of the party if the mood strikes you.") The Jacques Statement, named for the character in "As You Like It" who gives the Seven Ages of Man speech, tailors the prediction to the age of the subject. To someone in his late thirties or early forties, for example, the psychic says, "If you are honest about it, you often get to wondering what happened to all those dreams you had when you were younger." There is the Barnum Statement, the assertion so general that anyone would agree, and the Fuzzy Fact, the seemingly factual statement couched in a way that "leaves plenty of scope to be developed into something more specific." ("I can see a connection with Europe, possibly Britain, or it could be the warmer, Mediterranean part?") And that's only the start: there is the Greener Grass technique, the Diverted Question, the Russian Doll, Sugar Lumps, not to mention Forking and the Good Chance Guess—all of which, when put together in skillful combination, can convince even the most skeptical observer that he or she is in the presence of real insight.
"Moving on to career matters, you don't work with children, do you?" Rowland will ask his subjects, in an example of what he dubs the "Vanishing Negative."
No, I don't.
"No, I thought not. That's not really your role."
Of course, if the subject answers differently, there's another way to play the question: "Moving on to career matters, you don't work with children, do you?"
I do, actually, part time.
"Yes, I thought so."
After Alison had analyzed the rooftop-killer profile, he decided to play a version of the cold-reading game. He gave the details of the crime, the profile prepared by the F.B.I., and a description of the offender to a group of senior police officers and forensic professionals in England. How did they find the profile? Highly accurate. Then Alison gave the same packet of case materials to another group of police officers, but this time he invented an imaginary offender, one who was altogether different from Calabro. The new killer was thirty-seven years old. He was an alcoholic. He had recently been laid off from his job with the water board, and had met the victim before on one of his rounds. What's more, Alison claimed, he had a history of violent relationships with women, and prior convictions for assault and burglary. How accurate did a group of experienced police officers find the F.B.I.'s profile when it was matched with the phony offender? Every bit as accurate as when it was matched to the real offender.
James Brussel didn't really see the Mad Bomber in that pile of pictures and photostats, then. That was an illusion. As the literary scholar Donald Foster pointed out in his 2000 book "Author Unknown," Brussel cleaned up his predictions for his memoirs. He actually told the police to look for the bomber in White Plains, sending the N.Y.P.D.'s bomb unit on a wild goose chase in Westchester County, sifting through local records. Brussel also told the police to look for a man with a facial scar, which Metesky didn't have. He told them to look for a man with a night job, and Metesky had been largely unemployed since leaving Con Edison in 1931. He told them to look for someone between forty and fifty, and Metesky was over fifty. He told them to look for someone who was an "expert in civil or military ordnance" and the closest Metesky came to that was a brief stint in a machine shop. And Brussel, despite what he wrote in his memoir, never said that the Bomber would be a Slav. He actually told the police to look for a man "born and educated in Germany," a prediction so far off the mark that the Mad Bomber himself was moved to object. At the height of the police investigation, when the New York Journal American offered to print any communications from the Mad Bomber, Metesky wrote in huffily to say that "the nearest to my being 'Teutonic' is that my father boarded a liner in Hamburg for passage to this country—about sixty-five years ago."
The true hero of the case wasn't Brussel; it was a woman named Alice Kelly, who had been assigned to go through Con Edison's personnel files. In January, 1957, she ran across an employee complaint from the early nineteen-thirties: a generator wiper at the Hell Gate plant had been knocked down by a backdraft of hot gases. The worker said that he was injured. The company said that he wasn't. And in the flood of angry letters from the ex-employee Kelly spotted a threat—to "take justice in my own hands"—that had appeared in one of the Mad Bomber's letters. The name on the file was George Metesky.
Brussel did not really understand the mind of the Mad Bomber. He seems to have understood only that, if you make a great number of predictions, the ones that were wrong will soon be forgotten, and the ones that turn out to be true will make you famous. The Hedunit is not a triumph of forensic analysis. It's a party trick.
6.
"Here's where I'm at with this guy," Douglas said, kicking off the profiling session with which "Inside the Mind of BTK" begins. It was 1984. The killer was still at large. Douglas, Hazelwood, and Walker and the two detectives from Wichita were all seated around the oak table. Douglas took off his suit jacket and draped it over his chair. "Back when he started in 1974, he was in his mid-to-late twenties," Douglas began. "It's now ten years later, so that would put him in his mid-to-late thirties."
It was Walker's turn: BTK had never engaged in any sexual penetration. That suggested to him someone with an "inadequate, immature sexual history." He would have a "lone-wolf type of personality. But he's not alone because he's shunned by others—it's because he chooses to be alone. . . . He can function in social settings, but only on the surface. He may have women friends he can talk to, but he'd feel very inadequate with a peer-group female." Hazelwood was next. BTK would be "heavily into masturbation." He went on, "Women who have had sex with this guy would describe him as aloof, uninvolved, the type who is more interested in her servicing him than the other way around."
Douglas followed his lead. "The women he's been with are either many years younger, very naïve, or much older and depend on him as their meal ticket," he ventured. What's more, the profilers determined, BTK would drive a "decent" automobile, but it would be "nondescript."
At this point, the insights began piling on. Douglas said he'd been thinking that BTK was married. But now maybe he was thinking he was divorced. He speculated that BTK was lower middle class, probably living in a rental. Walker felt BTK was in a "lower-paying white collar job, as opposed to blue collar." Hazelwood saw him as "middle class" and "articulate." The consensus was that his I.Q. was somewhere between 105 and 145. Douglas wondered whether he was connected with the military. Hazelwood called him a "now" person, who needed "instant gratification."
Walker said that those who knew him "might say they remember him, but didn't really know much about him." Douglas then had a flash—"It was a sense, almost a knowing"—and said, "I wouldn't be surprised if, in the job he's in today, that he's wearing some sort of uniform. . . . This guy isn't mental. But he is crazy like a fox."
They had been at it for almost six hours. The best minds in the F.B.I. had given the Wichita detectives a blueprint for their investigation. Look for an American male with a possible connection to the military. His I.Q. will be above 105. He will like to masturbate, and will be aloof and selfish in bed. He will drive a decent car. He will be a "now" person. He won't be comfortable with women. But he may have women friends. He will be a lone wolf. But he will be able to function in social settings. He won't be unmemorable. But he will be unknowable. He will be either never married, divorced, or married, and if he was or is married his wife will be younger or older. He may or may not live in a rental, and might be lower class, upper lower class, lower middle class or middle class. And he will be crazy like a fox, as opposed to being mental. If you're keeping score, that's a Jacques Statement, two Barnum Statements, four Rainbow Ruses, a Good Chance Guess, two predictions that aren't really predictions because they could never be verified—and nothing even close to the salient fact that BTK was a pillar of his community, the president of his church and the married father of two.
"This thing is solvable," Douglas told the detectives, as he stood up and put on his jacket. "Feel free to pick up the phone and call us if we can be of any further assistance." You can imagine him taking the time for an encouraging smile and a slap on the back. "You're gonna nail this guy."
None of the Above
December 17, 2007
Books
What I.Q. doesn't tell you about race.
If what I.Q. tests measure is immutable and innate, what explains the Flynn effect—the steady rise in scores across generations?
1.
One Saturday in November of 1984, James Flynn, a social scientist at the University of Otago, in New Zealand, received a large package in the mail. It was from a colleague in Utrecht, and it contained the results of I.Q. tests given to two generations of Dutch eighteen-year-olds. When Flynn looked through the data, he found something puzzling. The Dutch eighteen-year-olds from the nineteen-eighties scored better than those who took the same tests in the nineteen-fifties—and not just slightly better, much better.
Curious, Flynn sent out some letters. He collected intelligence-test results from Europe, from North America, from Asia, and from the developing world, until he had data for almost thirty countries. In every case, the story was pretty much the same. I.Q.s around the world appeared to be rising by 0.3 points per year, or three points per decade, for as far back as the tests had been administered. For some reason, human beings seemed to be getting smarter.
Flynn has been writing about the implications of his findings—now known as the Flynn effect—for almost twenty-five years. His books consist of a series of plainly stated statistical observations, in support of deceptively modest conclusions, and the evidence in support of his original observation is now so overwhelming that the Flynn effect has moved from theory to fact. What remains uncertain is how to make sense of the Flynn effect. If an American born in the nineteen-thirties has an I.Q. of 100, the Flynn effect says that his children will have I.Q.s of 108, and his grandchildren I.Q.s of close to 120—more than a standard deviation higher. If we work in the opposite direction, the typical teen-ager of today, with an I.Q. of 100, would have had grandparents with average I.Q.s of 82—seemingly below the threshold necessary to graduate from high school. And, if we go back even farther, the Flynn effect puts the average I.Q.s of the schoolchildren of 1900 at around 70, which is to suggest, bizarrely, that a century ago the United States was populated largely by people who today would be considered mentally retarded.
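The arithmetic behind these figures is simple compounding. Here is a minimal sketch of it in Python, assuming a constant gain of 0.3 points per year against a fixed test and generations of roughly twenty-seven years; both assumptions are mine, chosen only to reproduce the article's round numbers, not taken from Flynn's data.

# A rough sketch of the Flynn-effect arithmetic described above.
# Assumptions (mine, not Flynn's data): a constant 0.3-point-per-year gain
# measured against a fixed test, and generations of roughly 27 years.
GAIN_PER_YEAR = 0.3
YEARS_PER_GENERATION = 27

def projected_iq(iq_now, generations):
    # Positive generations project forward in time, negative ones backward,
    # always relative to the norms of the starting year.
    return iq_now + GAIN_PER_YEAR * YEARS_PER_GENERATION * generations

print(projected_iq(100, 1))        # children of a 1930s 100: about 108
print(projected_iq(100, 2))        # grandchildren: about 116, approaching 120
print(projected_iq(100, -2))       # grandparents of a 100 today: about 84, near the article's 82
print(100 - GAIN_PER_YEAR * 100)   # schoolchildren of 1900, a century back: about 70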
2.
For almost as long as there have been I.Q. tests, there have been I.Q. fundamentalists. H. H. Goddard, in the early years of the past century, established the idea that intelligence could be measured along a single, linear scale. One of his particular contributions was to coin the word "moron." "The people who are doing the drudgery are, as a rule, in their proper places," he wrote. Goddard was followed by Lewis Terman, in the nineteen-twenties, who rounded up the California children with the highest I.Q.s, and confidently predicted that they would sit at the top of every profession. In 1969, the psychometrician Arthur Jensen argued that programs like Head Start, which tried to boost the academic performance of minority children, were doomed to failure, because I.Q. was so heavily genetic; and in 1994, Richard Herrnstein and Charles Murray published their best-selling hereditarian primer "The Bell Curve," which argued that blacks were innately inferior in intelligence to whites. To the I.Q. fundamentalist, two things are beyond dispute: first, that I.Q. tests measure some hard and identifiable trait that predicts the quality of our thinking; and, second, that this trait is stable—that is, it is determined by our genes and largely impervious to environmental influences.
This is what James Watson, the co-discoverer of DNA, meant when he told an English newspaper recently that he was "inherently gloomy" about the prospects for Africa. From the perspective of an I.Q. fundamentalist, the fact that Africans score lower than Europeans on I.Q. tests suggests an ineradicable cognitive disability. In the controversy that followed, Watson was defended by the journalist William Saletan, in a three-part series for the online magazine Slate. Drawing heavily on the work of J. Philippe Rushton—a psychologist who specializes in comparing the circumference of what he calls the Negroid brain with the length of the Negroid penis—Saletan took the fundamentalist position to its logical conclusion. To erase the difference between blacks and whites, Saletan wrote, would probably require vigorous interbreeding between the races, or some kind of corrective genetic engineering aimed at upgrading African stock. "Economic and cultural theories have failed to explain most of the pattern," Saletan declared, claiming to have been "soaking [his] head in each side's computations and arguments." One argument that Saletan never soaked his head in, however, was Flynn's, because what Flynn discovered in his mailbox upsets the certainties upon which I.Q. fundamentalism rests. If whatever the thing is that I.Q. tests measure can jump so much in a generation, it can't be all that immutable and it doesn't look all that innate.
The very fact that average I.Q.s shift over time ought to create a "crisis of confidence," Flynn writes in "What Is Intelligence?" (Cambridge; $22), his latest attempt to puzzle through the implications of his discovery. "How could such huge gains be intelligence gains? Either the children of today were far brighter than their parents or, at least in some circumstances, I.Q. tests were not good measures of intelligence."
3.
The best way to understand why I.Q.s rise, Flynn argues, is to look at one of the most widely used I.Q. tests, the so-called WISC (for Wechsler Intelligence Scale for Children). The WISC is composed of ten subtests, each of which measures a different aspect of I.Q. Flynn points out that scores in some of the categories—those measuring general knowledge, say, or vocabulary or the ability to do basic arithmetic—have risen only modestly over time. The big gains on the WISC are largely in the category known as "similarities," where you get questions such as "In what way are 'dogs' and 'rabbits' alike?" Today, we tend to give what, for the purposes of I.Q. tests, is the right answer: dogs and rabbits are both mammals. A nineteenth-century American would have said that "you use dogs to hunt rabbits."
"If the everyday world is your cognitive home, it is not natural to detach abstractions and logic and the hypothetical from their concrete referents," Flynn writes. Our great-grandparents may have been perfectly intelligent. But they would have done poorly on I.Q. tests because they did not participate in the twentieth century's great cognitive revolution, in which we learned to sort experience according to a new set of abstract categories. In Flynn's phrase, we have now had to put on "scientific spectacles," which enable us to make sense of the WISC questions about similarities. To say that Dutch I.Q. scores rose substantially between 1952 and 1982 was another way of saying that the Netherlands in 1982 was, in at least certain respects, much more cognitively demanding than the Netherlands in 1952. An I.Q., in other words, measures not so much how smart we are as how modern we are.
This is a critical distinction. When the children of Southern Italian immigrants were given I.Q. tests in the early part of the past century, for example, they recorded median scores in the high seventies and low eighties, a full standard deviation below their American and Western European counterparts. Southern Italians did as poorly on I.Q. tests as Hispanics and blacks did. As you can imagine, there was much concerned talk at the time about the genetic inferiority of Italian stock, of the inadvisability of letting so many second-class immigrants into the United States, and of the squalor that seemed endemic to Italian urban neighborhoods. Sound familiar? These days, when talk turns to the supposed genetic differences in the intelligence of certain races, Southern Italians have disappeared from the discussion. "Did their genes begin to mutate somewhere in the 1930s?" the psychologists Seymour Sarason and John Doris ask, in their account of the Italian experience. "Or is it possible that somewhere in the 1920s, if not earlier, the sociocultural history of Italo-Americans took a turn from the blacks and the Spanish Americans which permitted their assimilation into the general undifferentiated mass of Americans?"
The psychologist Michael Cole and some colleagues once gave members of the Kpelle tribe, in Liberia, a version of the WISC similarities test: they took a basket of food, tools, containers, and clothing and asked the tribesmen to sort them into appropriate categories. To the frustration of the researchers, the Kpelle chose functional pairings. They put a potato and a knife together because a knife is used to cut a potato. "A wise man could only do such-and-such," they explained. Finally, the researchers asked, "How would a fool do it?" The tribesmen immediately re-sorted the items into the "right" categories. It can be argued that taxonomical categories are a developmental improvement—that is, that the Kpelle would be more likely to advance, technologically and scientifically, if they started to see the world that way. But to label them less intelligent than Westerners, on the basis of their performance on that test, is merely to state that they have different cognitive preferences and habits. And if I.Q. varies with habits of mind, which can be adopted or discarded in a generation, what, exactly, is all the fuss about?
When I was growing up, my family would sometimes play Twenty Questions on long car trips. My father was one of those people who insist that the standard categories of animal, vegetable, and mineral be supplemented with a fourth category: "abstract." Abstract could mean something like "whatever it was that was going through my mind when we drove past the water tower fifty miles back." That abstract category sounds absurdly difficult, but it wasn't: it merely required that we ask a slightly different set of questions and grasp a slightly different set of conventions, and, after two or three rounds of practice, guessing the contents of someone's mind fifty miles ago becomes as easy as guessing Winston Churchill. (There is one exception. That was the trip on which my old roommate Tom Connell chose, as an abstraction, "the Unknown Soldier"—which allowed him legitimately and gleefully to answer "I have no idea" to almost every question. There were four of us playing. We gave up after an hour.) Flynn would say that my father was teaching his three sons how to put on scientific spectacles, and that extra practice probably bumped up all of our I.Q.s a few notches. But let's be clear about what this means. There's a world of difference between an I.Q. advantage that's genetic and one that depends on extended car time with Graham Gladwell.
4.
Flynn is a cautious and careful writer. Unlike many others in the I.Q. debates, he resists grand philosophizing. He comes back again and again to the fact that I.Q. scores are generated by paper-and-pencil tests—and making sense of those scores, he tells us, is a messy and complicated business that requires something closer to the skills of an accountant than to those of a philosopher.
For instance, Flynn shows what happens when we recognize that I.Q. is not a freestanding number but a value attached to a specific time and a specific test. When an I.Q. test is created, he reminds us, it is calibrated or "normed" so that the test-takers in the fiftieth percentile—those exactly at the median—are assigned a score of 100. But since I.Q.s are always rising, the only way to keep that hundred-point benchmark is periodically to make the tests more difficult—to "renorm" them. The original WISC was normed in the late nineteen-forties. It was then renormed in the early nineteen-seventies, as the WISC-R; renormed a third time in the late eighties, as the WISC III; and renormed again a few years ago, as the WISC IV—with each version just a little harder than its predecessor. The notion that anyone "has" an I.Q. of a certain number, then, is meaningless unless you know which WISC he took, and when he took it, since there's a substantial difference between getting a 130 on the WISC IV and getting a 130 on the much easier WISC.
This is not a trivial issue. I.Q. tests are used to diagnose people as mentally retarded, with a score of 70 generally taken to be the cutoff. You can imagine how the Flynn effect plays havoc with that system. In the nineteen-seventies and eighties, most states used the WISC-R to make their mental-retardation diagnoses. But since kids—even kids with disabilities—score a little higher every year, the number of children whose scores fell below 70 declined steadily through the end of the eighties. Then, in 1991, the WISC III was introduced, and suddenly the percentage of kids labelled retarded went up. The psychologists Tomoe Kanaya, Matthew Scullin, and Stephen Ceci estimated that, if every state had switched to the WISC III right away, the number of Americans labelled mentally retarded should have doubled.
That is an extraordinary number. The diagnosis of mental disability is one of the most stigmatizing of all educational and occupational classifications—and yet, apparently, the chances of being burdened with that label are in no small degree a function of the point, in the life cycle of the WISC, at which a child happens to sit for his evaluation. "As far as I can determine, no clinical or school psychologists using the WISC over the relevant 25 years noticed that its criterion of mental retardation became more lenient over time," Flynn wrote, in a 2000 paper. "Yet no one drew the obvious moral about psychologists in the field: They simply were not making any systematic assessment of the I.Q. criterion for mental retardation."
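A toy simulation makes the mechanism plain. The sketch below is my own illustration, not Kanaya, Scullin, and Ceci's analysis: against an aging norm the population mean drifts upward, so fewer children fall below the fixed cutoff of 70; a fresh renorm pulls the mean back to 100, and the share below 70 jumps back up.

# Toy illustration of how the Flynn effect interacts with a fixed cutoff of 70.
# All numbers here are illustrative assumptions, not the published estimates.
import random

random.seed(0)
CUTOFF = 70            # conventional threshold for the diagnosis
DRIFT_PER_YEAR = 0.3   # assumed gain relative to a fixed, aging norm

def share_below_cutoff(years_since_norming, n=100_000):
    # Fraction of a simulated population scoring below the cutoff on a test
    # normed `years_since_norming` years ago (mean 100, s.d. 15 at norming).
    mean = 100 + DRIFT_PER_YEAR * years_since_norming
    return sum(random.gauss(mean, 15) < CUTOFF for _ in range(n)) / n

print(share_below_cutoff(20))  # twenty-year-old norms: well under one per cent fall below 70
print(share_below_cutoff(0))   # freshly renormed test: roughly two per cent do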
Flynn brings a similar precision to the question of whether Asians have a genetic advantage in I.Q., a possibility that has led to great excitement among I.Q. fundamentalists in recent years. Data showing that the Japanese had higher I.Q.s than people of European descent, for example, prompted the British psychometrician and eugenicist Richard Lynn to concoct an elaborate evolutionary explanation involving the Himalayas, really cold weather, premodern hunting practices, brain size, and specialized vowel sounds. The fact that the I.Q.s of Chinese-Americans also seemed to be elevated has led I.Q. fundamentalists to posit the existence of an international I.Q. pyramid, with Asians at the top, European whites next, and Hispanics and blacks at the bottom.
Here was a question tailor-made for James Flynn's accounting skills. He looked first at Lynn's data, and realized that the comparison was skewed. Lynn was comparing American I.Q. estimates based on a representative sample of schoolchildren with Japanese estimates based on an upper-income, heavily urban sample. Recalculated, the Japanese average came in not at 106.6 but at 99.2. Then Flynn turned his attention to the Chinese-American estimates. They turned out to be based on a 1975 study in San Francisco's Chinatown using something called the Lorge-Thorndike Intelligence Test. But the Lorge-Thorndike test was normed in the nineteen-fifties. For children in the nineteen-seventies, it would have been a piece of cake. When the Chinese-American scores were reassessed using up-to-date intelligence metrics, Flynn found, they came in at 97 verbal and 100 nonverbal. Chinese-Americans had slightly lower I.Q.s than white Americans.
The Asian-American success story had suddenly been turned on its head. The numbers now suggested, Flynn said, that they had succeeded not because of their higher I.Q.s but despite their lower I.Q.s. Asians were overachievers. In a nifty piece of statistical analysis, Flynn then worked out just how great that overachievement was. Among whites, virtually everyone who joins the ranks of the managerial, professional, and technical occupations has an I.Q. of 97 or above. Among Chinese-Americans, that threshold is 90. A Chinese-American with an I.Q. of 90, it would appear, does as much with it as a white American with an I.Q. of 97.
There should be no great mystery about Asian achievement. It has to do with hard work and dedication to higher education, and belonging to a culture that stresses professional success. But Flynn makes one more observation. The children of that first successful wave of Asian-Americans really did have I.Q.s that were higher than everyone else's—coming in somewhere around 103. Having worked their way into the upper reaches of the occupational scale, and taken note of how much the professions value abstract thinking, Asian-American parents have evidently made sure that their own children wore scientific spectacles. "Chinese Americans are an ethnic group for whom high achievement preceded high I.Q. rather than the reverse," Flynn concludes, reminding us that in our discussions of the relationship between I.Q. and success we often confuse causes and effects. "It is not easy to view the history of their achievements without emotion," he writes. That is exactly right. To ascribe Asian success to some abstract number is to trivialize it.
5.
Two weeks ago, Flynn came to Manhattan to debate Charles Murray at a forum sponsored by the Manhattan Institute. Their subject was the black-white I.Q. gap in America. During the twenty-five years after the Second World War, that gap closed considerably. The I.Q.s of white Americans rose, as part of the general worldwide Flynn effect, but the I.Q.s of black Americans rose faster. Then, for a period of about twenty-five years, that trend stalled—and the question was why.
Murray showed a series of PowerPoint slides, each representing different statistical formulations of the I.Q. gap. He appeared to be pessimistic that the racial difference would narrow in the future. "By the nineteen-seventies, you had gotten most of the juice out of the environment that you were going to get," he said. That gap, he seemed to think, reflected some inherent difference between the races. "Starting in the nineteen-seventies, to put it very crudely, you had a higher proportion of black kids being born to really dumb mothers," he said. When the debate's moderator, Jane Waldfogel, informed him that the most recent data showed that the race gap had begun to close again, Murray seemed unimpressed, as if the possibility that blacks could ever make further progress was inconceivable.
Flynn took a different approach. The black-white gap, he pointed out, differs dramatically by age. He noted that the tests we have for measuring the cognitive functioning of infants, though admittedly crude, show the races to be almost the same. By age four, the average black I.Q. is 95.4—only four and a half points behind the average white I.Q. Then the real gap emerges: from age four through twenty-four, blacks lose six-tenths of a point a year, until their scores settle at 83.4.
That steady decline, Flynn said, did not resemble the usual pattern of genetic influence. Instead, it was exactly what you would expect, given the disparate cognitive environments that whites and blacks encounter as they grow older. Black children are more likely to be raised in single-parent homes than are white children—and single-parent homes are less cognitively complex than two-parent homes. The average I.Q. of first-grade students in schools that blacks attend is 95, which means that "kids who want to be above average don't have to aim as high." There were possibly adverse differences between black teen-age culture and white teen-age culture, and an enormous number of young black men are in jail—which is hardly the kind of environment in which someone would learn to put on scientific spectacles.
Flynn then talked about what we've learned from studies of adoption and mixed-race children—and that evidence didn't fit a genetic model, either. If I.Q. is innate, it shouldn't make a difference whether it's a mixed-race child's mother or father who is black. But it does: children with a white mother and a black father have an eight-point I.Q. advantage over those with a black mother and a white father. And it shouldn't make much of a difference where a mixed-race child is born. But, again, it does: the children fathered by black American G.I.s in postwar Germany and brought up by their German mothers have the same I.Q.s as the children of white American G.I.s and German mothers. The difference, in that case, was not the fact of the children's blackness, as a fundamentalist would say. It was the fact of their Germanness—of their being brought up in a different culture, under different circumstances. "The mind is much more like a muscle than we've ever realized," Flynn said. "It needs to get cognitive exercise. It's not some piece of clay on which you put an indelible mark." The lesson to be drawn from black and white differences was the same as the lesson from the Netherlands years ago: I.Q. measures not just the quality of a person's mind but the quality of the world that person lives in.
In the Air
May 12, 2008
Annals of Innovation
Who says big ideas are rare?
1.
Nathan Myhrvold met Jack Horner on the set of the "Jurassic Park" sequel in 1996. Horner is an eminent paleontologist, and was a consultant on the movie. Myhrvold was there because he really likes dinosaurs. Between takes, the two men got to talking, and Horner asked Myhrvold if he was interested in funding dinosaur expeditions.
Myhrvold is of Nordic extraction, and he looks every bit the bearded, fair-haired Viking—not so much the tall, ferocious kind who raped and pillaged as the impish, roly-poly kind who stayed home by the fjords trying to turn lead into gold. He is gregarious, enthusiastic, and nerdy on an epic scale. He graduated from high school at fourteen. He started Microsoft's research division, leaving, in 1999, with hundreds of millions. He is obsessed with aperiodic tile patterns. (Imagine a floor tiled in a pattern that never repeats.) When Myhrvold built his own house, on the shores of Lake Washington, outside Seattle—a vast, silvery hypermodernist structure described by his wife as the place in the sci-fi movie where the aliens live—he embedded some sixty aperiodic patterns in the walls, floors, and ceilings. His front garden is planted entirely with vegetation from the Mesozoic era. ("If the 'Jurassic Park' thing happens," he says, "this is where the dinosaurs will come to eat.") One of the scholarly achievements he is proudest of is a paper he co-wrote proving that it was theoretically possible for sauropods—his favorite kind of dinosaur—to have snapped their tails back and forth faster than the speed of sound. How could he say no to the great Jack Horner?
"What you do on a dinosaur expedition is you hike and look at the ground," Myhrvold explains. "You find bones sticking out of the dirt and, once you see something, you dig." In Montana, which is prime dinosaur country, people had been hiking around and looking for bones for at least a hundred years. But Horner wanted to keep trying. So he and Myhrvold put together a number of teams, totalling as many as fifty people. They crossed the Fort Peck reservoir in boats, and began to explore the Montana badlands in earnest. They went out for weeks at a time, several times a year. They flew equipment in on helicopters. They mapped the full dinosaur ecology—bringing in specialists from other disciplines. And they found dinosaur bones by the truckload.
Once, a team member came across a bone sticking out from the bottom of a recently eroded cliff. It took Horner's field crew three summers to dig it out, and when they broke the bone open a black, gooey substance trickled out—a discovery that led Myhrvold and his friend Lowell Wood on a twenty-minute digression at dinner one night about how, given enough goo and a sufficient number of chicken embryos, they could "make another one."
There was also Myhrvold's own find: a line of vertebrae, as big as apples, just lying on the ground in front of him. "It was seven years ago. It was a bunch of bones from a fairly rare dinosaur called a thescelosaurus. I said, 'Oh, my God!' I was walking with Jack and my son. Then Jack said, 'Look, there's a bone in the side of the hill.' And we look at it, and it's a piece of a jawbone with a tooth the size of a banana. It was a T. rex skull. There was nothing else it could possibly be."
People weren't finding dinosaur bones, and they assumed that it was because they were rare. But—and almost everything that Myhrvold has been up to during the past half decade follows from this fact—it was our fault. We didn't look hard enough.
Myhrvold gave the skeleton to the Smithsonian. It's called the N. rex. "Our expeditions have found more T. rex than anyone else in the world," Myhrvold said. "From 1909 to 1999, the world found eighteen T. rex specimens. From 1999 until now, we've found nine more." Myhrvold has the kind of laugh that scatters pigeons. "We have dominant T. rex market share."
2.
In 1874, Alexander Graham Bell spent the summer with his parents in Brantford, Ontario. He was twenty-seven years old, and employed as a speech therapist in Boston. But his real interest was solving the puzzle of what he then called the "harmonic telegraph." In Boston, he had tinkered obsessively with tuning forks and electromagnetic coils, often staying up all night when he was in the grip of an idea. When he went to Brantford, he brought with him an actual human ear, taken from a cadaver and preserved, to which he attached a pen, so that he could record the vibration of the ear's bones when he spoke into it.
One day, Bell went for a walk on a bluff overlooking the Grand River, near his parents' house. In a recent biography of Bell, "Reluctant Genius," Charlotte Gray writes:
A large tree had blown down here, creating a natural and completely private belvedere, which [he] had dubbed his "dreaming place." Slouched on a wicker chair, his hands in his pockets, he stared unseeing at the swiftly flowing river below him. Far from the bustle of Boston and the pressure of competition from other eager inventors, he mulled over everything he had discovered about sound.
In that moment, Bell knew the answer to the puzzle of the harmonic telegraph. Electric currents could convey sound along a wire if they undulated in accordance with the sound waves. Back in Boston, he hired a research assistant, Thomas Watson. He turned his attic into a laboratory, and redoubled his efforts. Then, on March 10, 1876, he set up one end of his crude prototype in his bedroom, and had Watson take the other end to the room next door. Bell, always prone to clumsiness, spilled acid on his clothes. "Mr. Watson, come here," he cried out. Watson came—but only because he had heard Bell on the receiver, plain as day. The telephone was born.
In 1999, when Nathan Myhrvold left Microsoft and struck out on his own, he set himself an unusual goal. He wanted to see whether the kind of insight that leads to invention could be engineered. He formed a company called Intellectual Ventures. He raised hundreds of millions of dollars. He hired the smartest people he knew. It was not a venture-capital firm. Venture capitalists fund insights—that is, they let the magical process that generates new ideas take its course, and then they jump in. Myhrvold wanted to make insights—to come up with ideas, patent them, and then license them to interested companies. He thought that if he brought lots of very clever people together he could reconstruct that moment by the Grand River.
One rainy day last November, Myhrvold held an "invention session," as he calls such meetings, on the technology of self-assembly. What if it was possible to break a complex piece of machinery into a thousand pieces and then, at some predetermined moment, have the machine put itself back together again? That had to be useful. But for what?
The meeting, like many of Myhrvold's sessions, was held in a conference room in the Intellectual Ventures laboratory, a big warehouse in an industrial park across Lake Washington from Seattle: plasma TV screens on the walls, a long table furnished with bottles of Diet Pepsi and big bowls of cashews.
Chairing the meeting was Casey Tegreene, an electrical engineer with a law degree, who is the chief patent counsel for I.V. He stood at one end of the table. Myhrvold was at the opposite end. Next to him was Edward Jung, whom Myhrvold met at Microsoft. Jung is lean and sleek, with closely cropped fine black hair. Once, he spent twenty-two days walking across Texas with nothing but a bedroll, a flashlight, and a rifle, from Big Bend, in the west, to Houston, where he was going to deliver a paper at a biology conference. On the other side of the table from Jung was Lowell Wood, an imposing man with graying red hair and an enormous head. Three or four pens were crammed into his shirt pocket. The screen saver on his laptop was a picture of Stonehenge.
"You know how musicians will say, 'My teacher was So-and-So, and his teacher was So-and-So,' right back to Beethoven?" Myhrvold says. "So Lowell was the great protégé of Edward Teller. He was at Lawrence Livermore. He was the technical director of Star Wars." Myhrvold and Wood have known each other since Myhrvold was a teen-ager and Wood interviewed him for a graduate fellowship called the Hertz. "If you want to know what Nathan was like at that age," Wood said, "look at that ball of fire now and scale that up by eight or ten decibels." Wood bent the rules for Myhrvold; the Hertz was supposed to be for research in real-world problems. Myhrvold's field at that point, quantum cosmology, involved the application of quantum mechanics to the period just after the big bang, which means, as Myhrvold likes to say, that he had no interest in the universe a microsecond after its creation.
The chairman of the chemistry department at Stanford, Richard Zare, had flown in for the day, as had Eric Leuthardt, a young neurosurgeon from Washington University, in St. Louis, who is a regular at I.V. sessions. At the back was a sombre, bearded man named Rod Hyde, who had been Wood's protégé at Lawrence Livermore.
Tegreene began. "There really aren't any rules," he told everyone. "We may start out talking about refined plastics and end up talking about shoes, and that's O.K."
He started in on the "prep." In the previous weeks, he and his staff had reviewed the relevant scientific literature and recent patent filings in order to come up with a short briefing on what was and wasn't known about self-assembly. A short BBC documentary was shown, on the early work of the scientist Lionel Penrose. Richard Zare passed around a set of what looked like ceramic dice. Leuthardt drew elaborate diagrams of the spine on the blackboard. Self-assembly was very useful in eye-of-the-needle problems—in cases where you had to get something very large through a very small hole—and Leuthardt wondered if it might be helpful in minimally invasive surgery.
The conversation went in fits and starts. "I'm asking a simple question and getting a long-winded answer," Jung said at one point, quietly. Wood played the role of devil's advocate. During a break, Myhrvold announced that he had just bought a CAT scanner, on an Internet auction site.
"I put in a minimum bid of twenty-nine hundred dollars," he said. There was much murmuring and nodding around the room. Myhrvold's friends, like Myhrvold, seemed to be of the opinion that there is no downside to having a CAT scanner, especially if you can get it for twenty-nine hundred dollars.
Before long, self-assembly was put aside and the talk swung to how to improve X-rays, and then to the puzzling phenomenon of soldiers in Iraq who survive a bomb blast only to die a few days later of a stroke. Wood thought it was a shock wave, penetrating the soldiers' helmets and surging through their brains, tearing blood vessels away from tissue. "Lowell is the living example of something better than the Internet," Jung said after the meeting was over. "On the Internet, you can search for whatever you want, but you have to know the right terms. With Lowell, you just give him a concept, and this stuff pops out."
Leuthardt, the neurosurgeon, thought that Wood's argument was unconvincing. The two went back and forth, arguing about how you could make a helmet that would better protect soldiers.
"We should be careful how much mental energy we spend on this," Leuthardt said, after a few minutes.
Wood started talking about the particular properties of bullets with tungsten cores.
"Shouldn't someone tell the Pentagon?" a voice said, only half jokingly, from the back of the room.
3.
How useful is it to have a group of really smart people brainstorm for a day? When Myhrvold started out, his expectations were modest. Although he wanted insights like Alexander Graham Bell's, Bell was clearly one in a million, a genius who went on to have ideas in an extraordinary number of areas—sound recording, flight, lasers, tetrahedral construction, and hydrofoil boats, to name a few. The telephone was his obsession. He approached it from a unique perspective, that of a speech therapist. He had put in years of preparation before that moment by the Grand River, and it was impossible to know what unconscious associations triggered his great insight. Invention has its own algorithm: genius, obsession, serendipity, and epiphany in some unknowable combination. How can you put that in a bottle?
But then, in August of 2003, I.V. held its first invention session, and it was a revelation. "Afterward, Nathan kept saying, 'There are so many inventions,' " Wood recalled. "He thought if we came up with a half-dozen good ideas it would be great, and we came up with somewhere between fifty and a hundred. I said to him, 'But you had eight people in that room who are seasoned inventors. Weren't you expecting a multiplier effect?' And he said, 'Yeah, but it was more than multiplicity.' Not even Nathan had any idea of what it was going to be like."
The original expectation was that I.V. would file a hundred patents a year. Currently, it's filing five hundred a year. It has a backlog of three thousand ideas. Wood said that he once attended a two-day invention session presided over by Jung, and after the first day the group went out to dinner. "So Edward took his people out, plus me," Wood said. "And the eight of us sat down at a table and the attorney said, 'Do you mind if I record the evening?' And we all said no, of course not. We sat there. It was a long dinner. I thought we were lightly chewing the rag. But the next day the attorney comes up with eight single-spaced pages flagging thirty-six different inventions from dinner. Dinner."
And the kinds of ideas the group came up with weren't trivial. Intellectual Ventures just had a patent issued on automatic, battery-powered glasses, with a tiny video camera that reads the image off the retina and adjusts the fluid-filled lenses accordingly, up to ten times a second. It just licensed off a cluster of its patents, for eighty million dollars. It has invented new kinds of techniques for making microchips and improving jet engines; it has proposed a way to custom-tailor the mesh "sleeve" that neurosurgeons can use to repair aneurysms.
Bill Gates, whose company, Microsoft, is one of the major investors in Intellectual Ventures, says, "I can give you fifty examples of ideas they've had where, if you take just one of them, you'd have a startup company right there." Gates has participated in a number of invention sessions, and, with other members of the Gates Foundation, meets every few months with Myhrvold to brainstorm about things like malaria or H.I.V. "Nathan sent over a hundred scientific papers beforehand," Gates said of the last such meeting. "The amount of reading was huge. But it was fantastic. There's this idea they have where you can track moving things by counting wing beats. So you could build a mosquito fence and clear an entire area. They had some ideas about super-thermoses, so you wouldn't need refrigerators for certain things. They also came up with this idea to stop hurricanes. Basically, the waves in the ocean have energy, and you use that to lower the temperature differential. I'm not saying it necessarily is going to work. But it's just an example of something where you go, Wow."
One of the sessions that Gates participated in was on the possibility of resuscitating nuclear energy. "Teller had this idea way back when that you could make a very safe, passive nuclear reactor," Myhrvold explained. "No moving parts. Proliferation-resistant. Dead simple. Every serious nuclear accident involves operator error, so you want to eliminate the operator altogether. Lowell and Rod and others wrote a paper on it once. So we did several sessions on it."
The plant, as they conceived it, would produce something like one to three gigawatts of power, which is enough to serve a medium-sized city. The reactor core would be no more than several metres wide and about ten metres long. It would be enclosed in a sealed, armored box. The box would work for thirty years, without need for refuelling. Wood's idea was that the box would run on thorium, which is a very common, mildly radioactive metal. (The world has roughly a hundred-thousand-year supply, he figures.) Myhrvold's idea was that it should run on spent fuel from existing power plants. "Waste has negative cost," Myhrvold said. "This is how we make this idea politically and regulatorily attractive. Lowell and I had a monthlong no-holds-barred nuclear-physics battle. He didn't believe waste would work. It turns out it does." Myhrvold grinned. "He concedes it now."
It was a long-shot idea, easily fifteen years from reality, if it became a reality at all. It was just a tantalizing idea at this point, but who wasn't interested in seeing where it would lead? "We have thirty guys working on it," he went on. "I have more people doing cutting-edge nuclear work than General Electric. We're looking for someone to partner with us, because this is a huge undertaking. We took out an ad in Nuclear News, which is the big trade journal. It looks like something from The Onion: 'Intellectual Ventures interested in nuclear-core designer and fission specialist.' And, no, the F.B.I. hasn't come knocking." He lowered his voice to a stage whisper. "Lowell is known to them."
It was the dinosaur-bone story all over again. You sent a proper search team into territory where people had been looking for a hundred years, and, lo and behold, there's a T. rex tooth the size of a banana. Ideas weren't precious. They were everywhere, which suggested that maybe the extraordinary process that we thought was necessary for invention—genius, obsession, serendipity, epiphany—wasn't necessary at all.
4.
In June of 1876, a few months after he shouted out, "Mr. Watson, come here," Alexander Graham Bell took his device to the World's Fair in Philadelphia. There, before an audience that included the emperor of Brazil, he gave his most famous public performance. The emperor accompanied Bell's assistant, Willie Hubbard, to an upper gallery, where the receiver had been placed, leaving Bell with his transmitter. Below them, and out of sight, Bell began to talk. "A storm of emotions crossed the Brazilian emperor's face—uncertainty, amazement, elation," Charlotte Gray writes. "Lifting his head from the receiver . . . he gave Willie a huge grin and said, 'This thing speaks!' " Gray continues:
Soon a steady stream of portly, middle-aged men were clambering into the gallery, stripping off their jackets, and bending their ears to the receiver. "For an hour or more," Willie remembered, "all took turns in talking and listening, testing the line in every possible way, evidently looking for some trickery, or thinking that the sound was carried through the air. . . . It seemed to be nearly all too wonderful for belief."
Bell was not the only one to give a presentation on the telephone at the Philadelphia Exhibition, however. Someone else spoke first. His name was Elisha Gray. Gray never had an epiphany overlooking the Grand River. Few have claimed that Gray was a genius. He does not seem to have been obsessive, or to have routinely stayed up all night while in the grip of an idea—although we don't really know, because, unlike Bell, he has never been the subject of a full-length biography. Gray was simply a very adept inventor. He was the author of a number of discoveries relating to the telegraph industry, including a self-adjusting relay that solved the problem of circuits sticking open or shut, and a telegraph printer—a precursor of what was later called the Teletype machine. He worked closely with Western Union. He had a very capable partner named Enos Barton, with whom he formed a company that later became the Western Electric Company and its offshoot Graybar (of Graybar Building fame). And Gray was working on the telephone at the same time that Bell was. In fact, the two filed notice with the Patent Office in Washington, D.C., on the same day—February 14, 1876. Bell went on to make telephones with the company that later became A. T. & T. Gray went on to make telephones in partnership with Western Union and Thomas Edison, and—until Gray's team was forced to settle a lawsuit with Bell's company—the general consensus was that Gray and Edison's telephone was better than Bell's telephone.
In order to get one of the greatest inventions of the modern age, in other words, we thought we needed the solitary genius. But if Alexander Graham Bell had fallen into the Grand River and drowned that day back in Brantford, the world would still have had the telephone, the only difference being that the telephone company would have been nicknamed Ma Gray, not Ma Bell.
5.
This phenomenon of simultaneous discovery—what science historians call "multiples"—turns out to be extremely common. One of the first comprehensive lists of multiples was put together by William Ogburn and Dorothy Thomas, in 1922, and they found a hundred and forty-eight major scientific discoveries that fit the multiple pattern. Newton and Leibniz both discovered calculus. Charles Darwin and Alfred Russel Wallace both discovered evolution. Three mathematicians "invented" decimal fractions. Oxygen was discovered by Joseph Priestley, in Wiltshire, in 1774, and by Carl Wilhelm Scheele, in Uppsala, a year earlier. Color photography was invented at the same time by Charles Cros and by Louis Ducos du Hauron, in France. Logarithms were invented by John Napier and Henry Briggs in Britain, and by Joost Bürgi in Switzerland.
"There were four independent discoveries of sunspots, all in 1611; namely, by Galileo in Italy, Scheiner in Germany, Fabricius in Holland and Harriott in England," Ogburn and Thomas note, and they continue:
The law of the conservation of energy, so significant in science and philosophy, was formulated four times independently in 1847, by Joule, Thomson, Colding and Helmholz. They had been anticipated by Robert Mayer in 1842. There seem to have been at least six different inventors of the thermometer and no less than nine claimants of the invention of the telescope. Typewriting machines were invented simultaneously in England and in America by several individuals in these countries. The steamboat is claimed as the "exclusive" discovery of Fulton, Jouffroy, Rumsey, Stevens and Symmington.
For Ogburn and Thomas, the sheer number of multiples could mean only one thing: scientific discoveries must, in some sense, be inevitable. They must be in the air, products of the intellectual climate of a specific time and place. It should not surprise us, then, that calculus was invented by two people at the same moment in history. Pascal and Descartes had already laid the foundations. The Englishman John Wallis had pushed the state of knowledge still further. Newton's teacher was Isaac Barrow, who had studied in Italy, and knew the critical work of Torricelli and Cavalieri. Leibniz knew Pascal's and Descartes's work from his time in Paris. He was close to a German named Henry Oldenburg, who, now living in London, had taken it upon himself to catalogue the latest findings of the English mathematicians. Leibniz and Newton may never have actually sat down together and shared their work in detail. But they occupied a common intellectual milieu. "All the basic work was done—someone just needed to take the next step and put it together," Jason Bardi writes in "The Calculus Wars," a history of the idea's development. "If Newton and Leibniz had not discovered it, someone else would have." Calculus was in the air.
Of course, that is not the way Newton saw it. He had done his calculus work in the mid-sixteen-sixties, but never published it. And after Leibniz came out with his calculus, in the sixteen-eighties, people in Newton's circle accused Leibniz of stealing his work, setting off one of the great scientific scandals of the seventeenth century. That is the inevitable human response. We're reluctant to believe that great discoveries are in the air. We want to believe that great discoveries are in our heads—and to each party in the multiple the presence of the other party is invariably cause for suspicion.
Thus the biographer Robert Bruce, in "Bell: Alexander Graham Bell and the Conquest of Solitude," casts a skeptical eye on Elisha Gray. Was it entirely coincidence, he asks, that the two filed on exactly the same day? "If Gray had prevailed in the end," he goes on,
Bell and his partners, along with fanciers of the underdog, would have suspected chicanery. After all, Gray did not put his concept on paper nor even mention it to anyone until he had spent nearly a month in Washington making frequent visits to the Patent Office, and until Bell's notarized specifications had for several days been the admiration of at least some of "the people in the Patent Office." . . . It is easier to believe that a conception already forming in Gray's mind was precipitated by rumors of what Bell was about to patent, than to believe that chance alone brought Gray to inspiration and action at that precise moment.
In "The Telephone Gambit," Seth Shulman makes the opposite case. Just before Bell had his famous conversation with Watson, Shulman points out, he visited the Patent Office in Washington. And the transmitter design that Bell immediately sketched in his notebook upon his return to Boston was identical to the sketch of the transmitter that Gray had submitted to the Patent Office. This could not be coincidence, Shulman concludes, and thereupon constructs an ingenious (and, it should be said, highly entertaining) revisionist account of Bell's invention, complete with allegations of corruption and romantic turmoil. Bell's telephone, he writes, is "one of the most consequential thefts in history."
But surely Gray and Bell occupied their scientific moment in the same way that Leibniz and Newton did. They arrived at electric speech by more or less the same pathway. They were trying to find a way to send more than one message at a time along a telegraph wire—which was then one of the central technological problems of the day. They had read the same essential sources—particularly the work of Philipp Reis, the German physicist who had come startlingly close to building a working telephone back in the early eighteen-sixties. The arguments of Bruce and Shulman suppose that great ideas are precious. It is too much for them to imagine that a discovery as remarkable as the telephone could arise in two places at once. But five people came up with the steamboat, and nine people came up with the telescope, and, if Gray had fallen into the Grand River along with Bell, some Joe Smith somewhere would likely have come up with the telephone instead and Ma Smith would have run the show. Good ideas are out there for anyone with the wit and the will to find them, which is how a group of people can sit down to dinner, put their minds to it, and end up with eight single-spaced pages of ideas.
6.
Last March, Myhrvold decided to do an invention session with Eric Leuthardt and several other physicians in St. Louis. Rod Hyde came, along with a scientist from M.I.T. named Ed Boyden. Wood was there as well.
"Lowell came in looking like the Cheshire Cat," Myhrvold recalled. "He said, 'I have a question for everyone. You have a tumor, and the tumor becomes metastatic, and it sheds metastatic cancer cells. How long do those circulate in the bloodstream before they land?' And we all said, 'We don't know. Ten times?' 'No,' he said. 'As many as a million times.' Isn't that amazing? If you had no time, you'd be screwed. But it turns out that these cells are in your blood for as long as a year before they land somewhere. What that says is that you've got a chance to intercept them."
How did Wood come to this conclusion? He had run across a stray fact in a recent issue of The New England Journal of Medicine. "It was an article that talked about, at one point, the number of cancer cells per millilitre of blood," he said. "And I looked at that figure and said, 'Something's wrong here. That can't possibly be true.' The number was incredibly high. Too high. It has to be one cell in a hundred litres, not what they were saying—one cell in a millilitre. Yet they spoke of it so confidently. I clicked through to the references. It was a commonplace. There really were that many cancer cells."
Wood did some arithmetic. He knew that human beings have only about five litres of blood. He knew that the heart pumps close to a hundred millilitres of blood per beat, which means that all of our blood circulates through our bloodstream in a matter of minutes. The New England Journal article was about metastatic breast cancer, and it seemed to Wood that when women die of metastatic breast cancer they don't die with thousands of tumors. The vast majority of circulating cancer cells don't do anything.
"It turns out that some small per cent of tumor cells are actually the deadly " "; he went on. " Tumor stem cells are what really initiate metastases. And isn't it astonishing that they have to turn over at least ten thousand times before they can find a happy home? You naĂŻvely think it's once or twice or three times. Maybe five times at most. It isn't. In other words, metastatic cancer—the brand of cancer that kills us—is an amazingly hard thing to initiate. Which strongly suggests that if you tip things just a little bit you essentially turn off the process."
That was the idea that Wood presented to the room in St. Louis. From there, the discussion raced ahead. Myhrvold and his inventors had already done a lot of thinking about using tiny optical filters capable of identifying and zapping microscopic particles. They also knew that finding cancer cells in blood is not hard. They're often the wrong size or the wrong shape. So what if you slid a tiny filter into a blood vessel of a cancer patient? "You don't have to intercept very much of the blood for it to work," Wood went on. "Maybe one ten-thousandth of it. The filter could be put in a little tiny vein in the back of the hand, because that's all you need. Or maybe I intercept all of the blood, but then it doesn't have to be a particularly efficient filter."
Wood was a physicist, not a doctor, but that wasn't necessarily a liability, at this stage. "People in biology and medicine don't do arithmetic," he said. He wasn't being critical of biologists and physicians: this was, after all, a man who read medical journals for fun. He meant that the traditions of medicine encouraged qualitative observation and interpretation. But what physicists do—out of sheer force of habit and training—is measure things and compare measurements, and do the math to put measurements in context. At that moment, while reading The New England Journal, Wood had the advantages of someone looking at a familiar fact with a fresh perspective.
That was also why Myhrvold had wanted to take his crew to St. Louis to meet with the surgeons. He likes to say that the only time a physicist and a brain surgeon meet is when the physicist is about to be cut open—and to his mind that made no sense. Surgeons had all kinds of problems that they didn't realize had solutions, and physicists had all kinds of solutions to things that they didn't realize were problems. At one point, Myhrvold asked the surgeons what, in a perfect world, would make their lives easier, and they said that they wanted an X-ray that went only skin deep. They wanted to know, before they made their first incision, what was just below the surface. When the Intellectual Ventures crew heard that, their response was amazement. "That's your dream? A subcutaneous X-ray? We can do that."
Insight could be orchestrated: that was the lesson. If someone who knew how to make a filter had a conversation with someone who knew a lot about cancer and with someone who read the medical literature like a physicist, then maybe you could come up with a cancer treatment. It helped as well that Casey Tegreene had a law degree, Lowell Wood had spent his career dreaming up weapons for the government, Nathan Myhrvold was a ball of fire, Edward Jung had walked across Texas. They had different backgrounds and temperaments and perspectives, and if you gave them something to think about that they did not ordinarily think about—like hurricanes, or jet engines, or metastatic cancer—you were guaranteed a fresh set of eyes.
There were drawbacks to this approach, of course. The outsider, not knowing what the insider knew, would make a lot of mistakes and chase down a lot of rabbit holes. Myhrvold admits that many of the ideas that come out of the invention sessions come to naught. After a session, the Ph.D.s on the I.V. staff examine each proposal closely and decide which ones are worth pursuing. They talk to outside experts; they reread the literature. Myhrvold isn't even willing to guess what his company's most promising inventions are. "That's a fool's game," he says. If ideas are cheap, there is no point in making predictions, or worrying about failures, or obsessing, like Newton and Leibniz, or Bell and Gray, over who was first. After I.V. came up with its cancer-filter idea, it discovered that there was a company, based in Rochester, that was already developing a cancer filter. Filters were a multiple. But so what? If I.V.'s design wasn't the best, Myhrvold had two thousand nine hundred and ninety-nine other ideas to pursue.
In his living room, Myhrvold has a life-size T. rex skeleton, surrounded by all manner of other dinosaur artifacts. One of those is a cast of a nest of oviraptor eggs, each the size of an eggplant. You'd think a bird that big would have one egg, or maybe two. That's the general rule: the larger the animal, the lower the fecundity. But it didn't. For Myhrvold, it was one of the many ways in which dinosaurs could teach us about ourselves. "You know how many eggs were in that nest?" Myhrvold asked. "Thirty-two."
7.
In the nineteen-sixties, the sociologist Robert K. Merton wrote a famous essay on scientific discovery in which he raised the question of what the existence of multiples tells us about genius. No one is a partner to more multiples, he pointed out, than a genius, and he came to the conclusion that our romantic notion of the genius must be wrong. A scientific genius is not a person who does what no one else can do; he or she is someone who does what it takes many others to do. The genius is not a unique source of insight; he is merely an efficient source of insight. "Consider the case of Kelvin, by way of illustration," Merton writes, summarizing work he had done with his Columbia colleague Elinor Barber:
After examining some 400 of his 661 scientific communications and addresses . . . Dr. Elinor Barber and I find him testifying to at least 32 multiple discoveries in which he eventually found that his independent discoveries had also been made by others. These 32 multiples involved an aggregate of 30 other scientists, some, like Stokes, Green, Helmholtz, Cavendish, Clausius, Poincaré, Rayleigh, themselves men of undeniable genius, others, like Hankel, Pfaff, Homer Lane, Varley and Lamé, being men of talent, no doubt, but still not of the highest order. . . . For the hypothesis that each of these discoveries was destined to find expression, even if the genius of Kelvin had not obtained, there is the best of traditional proof: each was in fact made by others. Yet Kelvin's stature as a genius remains undiminished. For it required a considerable number of others to duplicate these 32 discoveries which Kelvin himself made.
This is, surely, what an invention session is: it is Hankel, Pfaff, Homer Lane, Varley, and Lamé in a room together, and if you have them on your staff you can get a big chunk of Kelvin's discoveries, without ever needing to have Kelvin—which is fortunate, because, although there are plenty of Homer Lanes, Varleys, and Pfaffs in the world, there are very few Kelvins.
Merton's observation about scientific geniuses is clearly not true of artistic geniuses, however. You can't pool the talents of a dozen Salieris and get Mozart's Requiem. You can't put together a committee of really talented art students and get Matisse's "La Danse." A work of artistic genius is singular, and all the arguments over calculus, the accusations back and forth between the Bell and the Gray camps, and our persistent inability to come to terms with the existence of multiples are the result of our misplaced desire to impose the paradigm of artistic invention on a world where it doesn't belong. Shakespeare owned Hamlet because he created him, as none other before or since could. Alexander Graham Bell owned the telephone only because his patent application landed on the examiner's desk a few hours before Gray's. The first kind of creation was sui generis; the second could be re-created in a warehouse outside Seattle.
This is a confusing distinction, because we use the same words to describe both kinds of inventors, and the brilliant scientist is every bit as dazzling in person as the brilliant playwright. The unavoidable first response to Myhrvold and his crew is to think of them as a kind of dream team, but, of course, the fact that they invent as prodigiously and effortlessly as they do is evidence that they are not a dream team at all. You could put together an Intellectual Ventures in Los Angeles, if you wanted to, and Chicago, and New York and Baltimore, and anywhere you could find enough imagination, a fresh set of eyes, and a room full of Varleys and Pfaffs.
The statistician Stephen Stigler once wrote an elegant essay about the futility of the practice of eponymy in science—that is, the practice of naming a scientific discovery after its inventor. That's another idea inappropriately borrowed from the cultural realm. As Stigler pointed out, "It can be found that Laplace employed Fourier Transforms in print before Fourier published on the topic, that Lagrange presented Laplace Transforms before Laplace began his scientific career, that Poisson published the Cauchy distribution in 1824, twenty-nine years before Cauchy touched on it in an incidental manner, and that BienaymĂ© stated and proved the Chebychev Inequality a decade before and in greater generality than Chebychev's first work on the topic." For that matter, the Pythagorean theorem was known before Pythagoras; Gaussian distributions were not discovered by Gauss. The examples were so legion that Stigler declared the existence of Stigler's Law: "No scientific discovery is named after its original discoverer." There are just too many people with an equal shot at those ideas floating out there in the ether. We think we're pinning medals on heroes. In fact, we're pinning tails on donkeys.
Stigler's Law was true, Stigler gleefully pointed out, even of Stigler's Law itself. The idea that credit does not align with discovery, he reveals at the very end of his essay, was in fact first put forth by Merton. "We may expect," Stigler concluded, "that in years to come, Robert K. Merton, and his colleagues and students, will provide us with answers to these and other questions regarding eponymy, completing what, but for the Law, would be called the Merton Theory of the reward system of science."
8.
In April, Lowell Wood was on the East Coast for a meeting of the Hertz Foundation fellows in Woods Hole. Afterward, he came to New York to make a pilgrimage to the American Museum of Natural History. He had just half a day, so he began right away in the Dinosaur Halls. He spent what he later described as a "ridiculously prolonged" period of time at the first station in the Ornithischian Hall—the ankylosaurus shrine. He knew it by heart. His next stop was the dimetrodon, the progenitor of Mammalia. This was a family tradition. When Wood first took his daughter to the museum, she dubbed the fossil "Great Grand-Uncle Dimetrodon," and they always paid their respects to it. Next, he visited a glyptodont; this creature was the only truly armored mammal, a fact of great significance to a former weaponeer.
He then wandered into the Vertebrate Origins gallery and, for the hundredth time, wondered about the strange openings that Archosauria had in front of their eyes and behind their nostrils. They had to be for breathing, didn't they? He tried to come up with an alternate hypothesis, and couldn't—but then he couldn't come up with a way to confirm his own hunch, either. It was a puzzle. Perhaps someday he would figure it out. Perhaps someone else would. Or perhaps someone would find another skeleton that shed light on the mystery. Nathan Myhrvold and Jack Horner had branched out from Montana, and at the end of the summer were going to Mongolia, to hunt in the Gobi desert. There were a lot more bones where these came from.
Late Bloomers
October 20, 2008
Annals of Culture
Why do we equate genius with precocity?
1.
Ben Fountain was an associate in the real-estate practice at the Dallas offices of Akin, Gump, Strauss, Hauer & Feld, just a few years out of law school, when he decided he wanted to write fiction. The only thing Fountain had ever published was a law-review article. His literary training consisted of a handful of creative-writing classes in college. He had tried to write when he came home at night from work, but usually he was too tired to do much. He decided to quit his job.
"I was tremendously apprehensive," Fountain recalls. "I felt like I'd stepped off a cliff and I didn't know if the parachute was going to open. Nobody wants to waste their life, and I was doing well at the practice of law. I could have had a good career. And my parents were very proud of me—my dad was so proud of me. . . . It was crazy."
He began his new life on a February morning—a Monday. He sat down at his kitchen table at 7:30 A.M. He made a plan. Every day, he would write until lunchtime. Then he would lie down on the floor for twenty minutes to rest his mind. Then he would return to work for a few more hours. He was a lawyer. He had discipline. "I figured out very early on that if I didn't get my writing done I felt terrible. So I always got my writing done. I treated it like a job. I did not procrastinate." His first story was about a stockbroker who uses inside information and crosses a moral line. It was sixty pages long and took him three months to write. When he finished that story, he went back to work and wrote another—and then another.
In his first year, Fountain sold two stories. He gained confidence. He wrote a novel. He decided it wasn't very good, and he ended up putting it in a drawer. Then came what he describes as his dark period, when he adjusted his expectations and started again. He got a short story published in Harper's. A New York literary agent saw it and signed him up. He put together a collection of short stories titled "Brief Encounters with Che Guevara," and Ecco, a HarperCollins imprint, published it. The reviews were sensational. The Times Book Review called it "heartbreaking." It won the Hemingway Foundation/PEN award. It was named a No. 1 Book Sense Pick. It made major regional best-seller lists, was named one of the best books of the year by the San Francisco Chronicle, the Chicago Tribune, and Kirkus Reviews, and drew comparisons to Graham Greene, Evelyn Waugh, Robert Stone, and John le Carré.
Ben Fountain's rise sounds like a familiar story: the young man from the provinces suddenly takes the literary world by storm. But Ben Fountain's success was far from sudden. He quit his job at Akin, Gump in 1988. For every story he published in those early years, he had at least thirty rejections. The novel that he put away in a drawer took him four years. The dark period lasted for the entire second half of the nineteen-nineties. His breakthrough with "Brief Encounters" came in 2006, eighteen years after he first sat down to write at his kitchen table. The "young" writer from the provinces took the literary world by storm at the age of forty-eight.
2.
Genius, in the popular conception, is inextricably tied up with precocity—doing something truly creative, we're inclined to think, requires the freshness and exuberance and energy of youth. Orson Welles made his masterpiece, "Citizen Kane," at twenty-five. Herman Melville wrote a book a year through his late twenties, culminating, at age thirty-two, with "Moby-Dick." Mozart wrote his breakthrough Piano Concerto No. 9 in E-Flat Major at the age of twenty-one. In some creative forms, like lyric poetry, the importance of precocity has hardened into an iron law. How old was T. S. Eliot when he wrote "The Love Song of J. Alfred Prufrock" ("I grow old . . . I grow old")? Twenty-three. "Poets peak young," the creativity researcher James Kaufman maintains. Mihály Csíkszentmihályi, the author of "Flow," agrees: "The most creative lyric verse is believed to be that written by the young." According to the Harvard psychologist Howard Gardner, a leading authority on creativity, "Lyric poetry is a domain where talent is discovered early, burns brightly, and then peters out at an early age."
A few years ago, an economist at the University of Chicago named David Galenson decided to find out whether this assumption about creativity was true. He looked through forty-seven major poetry anthologies published since 1980 and counted the poems that appear most frequently. Some people, of course, would quarrel with the notion that literary merit can be quantified. But Galenson simply wanted to poll a broad cross-section of literary scholars about which poems they felt were the most important in the American canon. The top eleven are, in order, T. S. Eliot's "Prufrock," Robert Lowell's "Skunk Hour," Robert Frost's "Stopping by Woods on a Snowy Evening," William Carlos Williams's "Red Wheelbarrow," Elizabeth Bishop's "The Fish," Ezra Pound's "The River Merchant's Wife," Sylvia Plath's "Daddy," Pound's "In a Station of the Metro," Frost's "Mending Wall," Wallace Stevens's "The Snow Man," and Williams's "The Dance." Those eleven were composed at the ages of twenty-three, forty-one, forty-eight, forty, twenty-nine, thirty, thirty, twenty-eight, thirty-eight, forty-two, and fifty-nine, respectively. There is no evidence, Galenson concluded, for the notion that lyric poetry is a young person's game. Some poets do their best work at the beginning of their careers. Others do their best work decades later. Forty-two per cent of Frost's anthologized poems were written after the age of fifty. For Williams, it's forty-four per cent. For Stevens, it's forty-nine per cent.
The same was true of film, Galenson points out in his study "Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity." Yes, there was Orson Welles, peaking as a director at twenty-five. But then there was Alfred Hitchcock, who made "Dial M for Murder," "Rear Window," "To Catch a Thief," "The Trouble with Harry," "Vertigo," "North by Northwest," and "Psycho"—one of the greatest runs by a director in history—between his fifty-fourth and sixty-first birthdays. Mark Twain published "Adventures of Huckleberry Finn" at forty-nine. Daniel Defoe wrote "Robinson Crusoe" at fifty-eight.
The examples that Galenson could not get out of his head, however, were Picasso and Cézanne. He was an art lover, and he knew their stories well. Picasso was the incandescent prodigy. His career as a serious artist began with a masterpiece, "Evocation: The Burial of Casagemas," produced at age twenty. In short order, he painted many of the greatest works of his career—including "Les Demoiselles d'Avignon," at the age of twenty-six. Picasso fit our usual ideas about genius perfectly.
Cézanne didn't. If you go to the Cézanne room at the Musée d'Orsay, in Paris—the finest collection of Cézannes in the world—the masterpieces you'll find along the back wall were all painted at the end of his career. Galenson did a simple economic analysis, tabulating the prices paid at auction for paintings by Picasso and Cézanne with the ages at which they created those works. A painting done by Picasso in his mid-twenties was worth, he found, an average of four times as much as a painting done in his sixties. For Cézanne, the opposite was true. The paintings he created in his mid-sixties were valued fifteen times as highly as the paintings he created as a young man. The freshness, exuberance, and energy of youth did little for Cézanne. He was a late bloomer—and for some reason in our accounting of genius and creativity we have forgotten to make sense of the Cézannes of the world.
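Galenson's method is simple enough to sketch in a few lines. The snippet below, in Python, groups a painter's auction results by the age at which each work was made and compares average prices across age bands; the sale records are invented placeholders for illustration, not Galenson's data, and the ten-year band size is an arbitrary choice.

```python
# A minimal sketch of a Galenson-style tabulation: average auction price by the
# artist's age when the work was created. The records below are invented
# placeholders, not Galenson's actual data.
from collections import defaultdict

# (age_when_painted, price_paid_at_auction) -- hypothetical rows
sales = [
    (24, 4_000_000), (26, 5_500_000), (27, 4_800_000),   # early-career works
    (61, 1_200_000), (63, 900_000),   (65, 1_100_000),   # late-career works
]

def average_price_by_age_band(records, band_size=10):
    """Group sales into age bands (e.g. 20-29, 60-69) and average the prices."""
    totals = defaultdict(lambda: [0, 0])  # band -> [sum_of_prices, count]
    for age, price in records:
        band = (age // band_size) * band_size
        totals[band][0] += price
        totals[band][1] += 1
    return {band: total / count for band, (total, count) in sorted(totals.items())}

for band, avg in average_price_by_age_band(sales).items():
    print(f"ages {band}-{band + 9}: average price ${avg:,.0f}")
# Comparing the bands gives the kind of ratio Galenson reports: for a
# "conceptual" artist the early band dominates; for an "experimental" one,
# the late band does.
```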
3.
The first day that Ben Fountain sat down to write at his kitchen table went well. He knew how the story about the stockbroker was supposed to start. But the second day, he says, he "completely freaked out." He didn't know how to describe things. He felt as if he were back in first grade. He didn't have a fully formed vision, waiting to be emptied onto the page. "I had to create a mental image of a building, a room, a façade, haircut, clothes—just really basic things," he says. "I realized I didn't have the facility to put those into words. I started going out and buying visual dictionaries, architectural dictionaries, and going to school on those."
He began to collect articles about things he was interested in, and before long he realized that he had developed a fascination with Haiti. "The Haiti file just kept getting bigger and bigger," Fountain says. "And I thought, O.K., here's my novel. For a month or two I said I really don't need to go there, I can imagine everything. But after a couple of months I thought, Yeah, you've got to go there, and so I went, in April or May of '91."
He spoke little French, let alone Haitian Creole. He had never been abroad. Nor did he know anyone in Haiti. "I got to the hotel, walked up the stairs, and there was this guy standing at the top of the stairs," Fountain recalls. "He said, 'My name is Pierre. You need a guide.' I said, 'You're sure as hell right, I do.' He was a very genuine person, and he realized pretty quickly I didn't want to go see the girls, I didn't want drugs, I didn't want any of that other stuff," Fountain went on. "And then it was, boom, 'I can take you there. I can take you to this person.' "
Fountain was riveted by Haiti. "It's like a laboratory, almost," he says. "Everything that's gone on in the last five hundred years—colonialism, race, power, politics, ecological disasters—it's all there in very concentrated form. And also I just felt, viscerally, pretty comfortable there." He made more trips to Haiti, sometimes for a week, sometimes for two weeks. He made friends. He invited them to visit him in Dallas. ("You haven't lived until you've had Haitians stay in your house," Fountain says.) "I mean, I was involved. I couldn't just walk away. There's this very nonrational, nonlinear part of the whole process. I had a pretty specific time era that I was writing about, and certain things that I needed to know. But there were other things I didn't really need to know. I met a fellow who was with Save the Children, and he was on the Central Plateau, which takes about twelve hours to get to on a bus, and I had no reason to go there. But I went up there. Suffered on that bus, and ate dust. It was a hard trip, but it was a glorious trip. It had nothing to do with the book, but it wasn't wasted knowledge."
In "Brief Encounters with Che Guevara," four of the stories are about Haiti, and they are the strongest in the collection. They feel like Haiti; they feel as if they've been written from the inside looking out, not the outside looking in. "After the novel was done, I don't know, I just felt like there was more for me, and I could keep going, keep going deeper there," Fountain recalls. "Always there's something—always something—here for me. How many times have I been? At least thirty times."
Prodigies like Picasso, Galenson argues, rarely engage in that kind of open-ended exploration. They tend to be "conceptual," Galenson says, in the sense that they start with a clear idea of where they want to go, and then they execute it. "I can hardly understand the importance given to the word 'research,' " Picasso once said in an interview with the artist Marius de Zayas. "In my opinion, to search means nothing in painting. To find is the thing." He continued, "The several manners I have used in my art must not be considered as an evolution or as steps toward an unknown ideal of painting. . . . I have never made trials or experiments."
But late bloomers, Galenson says, tend to work the other way around. Their approach is experimental. "Their goals are imprecise, so their procedure is tentative and incremental," Galenson writes in "Old Masters and Young Geniuses," and he goes on:
The imprecision of their goals means that these artists rarely feel they have succeeded, and their careers are consequently often dominated by the pursuit of a single objective. These artists repeat themselves, painting the same subject many times, and gradually changing its treatment in an experimental process of trial and error. Each work leads to the next, and none is generally privileged over others, so experimental painters rarely make specific preparatory sketches or plans for a painting. They consider the production of a painting as a process of searching, in which they aim to discover the image in the course of making it; they typically believe that learning is a more important goal than making finished paintings. Experimental artists build their skills gradually over the course of their careers, improving their work slowly over long periods. These artists are perfectionists and are typically plagued by frustration at their inability to achieve their goal.
Where Picasso wanted to find, not search, Cézanne said the opposite: "I seek in painting."
An experimental innovator would go back to Haiti thirty times. That's how that kind of mind figures out what it wants to do. When Cézanne was painting a portrait of the critic Gustave Geffroy, he made him endure eighty sittings, over three months, before announcing the project a failure. (The result is one of that string of masterpieces in the Musée d'Orsay.) When Cézanne painted his dealer, Ambroise Vollard, he made Vollard arrive at eight in the morning and sit on a rickety platform until eleven-thirty, without a break, on a hundred and fifty occasions—before abandoning the portrait. He would paint a scene, then repaint it, then paint it again. He was notorious for slashing his canvases to pieces in fits of frustration.
Mark Twain was the same way. Galenson quotes the literary critic Franklin Rogers on Twain's trial-and-error method: "His routine procedure seems to have been to start a novel with some structural plan which ordinarily soon proved defective, whereupon he would cast about for a new plot which would overcome the difficulty, rewrite what he had already written, and then push on until some new defect forced him to repeat the process once again." Twain fiddled and despaired and revised and gave up on "Huckleberry Finn" so many times that the book took him nearly a decade to complete. The Cézannes of the world bloom late not as a result of some defect in character, or distraction, or lack of ambition, but because the kind of creativity that proceeds through trial and error necessarily takes a long time to come to fruition.
One of the best stories in "Brief Encounters" is called "Near-Extinct Birds of the Central Cordillera." It's about an ornithologist taken hostage by the FARC guerrillas of Colombia. Like so much of Fountain's work, it reads with an easy grace. But there was nothing easy or graceful about its creation. "I struggled with that story," Fountain says. "I always try to do too much. I mean, I probably wrote five hundred pages of it in various incarnations." Fountain is at work right now on a novel. It was supposed to come out this year. It's late.
4.
Galenson's idea that creativity can be divided into these types—conceptual and experimental—has a number of important implications. For example, we sometimes think of late bloomers as late starters. They don't realize they're good at something until they're fifty, so of course they achieve late in life. But that's not quite right. Cézanne was painting almost as early as Picasso was. We also sometimes think of them as artists who are discovered late; the world is just slow to appreciate their gifts. In both cases, the assumption is that the prodigy and the late bloomer are fundamentally the same, and that late blooming is simply genius under conditions of market failure. What Galenson's argument suggests is something else—that late bloomers bloom late because they simply aren't much good until late in their careers.
"All these qualities of his inner vision were continually hampered and obstructed by CĂ©zanne's incapacity to give sufficient verisimilitude to the personae of his drama," the great English art critic Roger Fry wrote of the early CĂ©zanne. "With all his rare endowments, he happened to lack the comparatively common gift of illustration, the gift that any draughtsman for the illustrated papers learns in a school of commercial art; whereas, to realize such visions as CĂ©zanne's required this gift in high degree." In other words, the young CĂ©zanne couldn't draw. Of "The Banquet," which CĂ©zanne painted at thirty-one, Fry writes, "It is no use to deny that CĂ©zanne has made a very poor job of it." Fry goes on, "More happily endowed and more integral personalities have been able to express themselves harmoniously from the very first. But such rich, complex, and conflicting natures as CĂ©zanne's require a long period of fermentation." CĂ©zanne was trying something so elusive that he couldn't master it until he'd spent decades practicing.
This is the vexing lesson of Fountain's long attempt to get noticed by the literary world. On the road to great achievement, the late bloomer will resemble a failure: while the late bloomer is revising and despairing and changing course and slashing canvases to ribbons after months or years, what he or she produces will look like the kind of thing produced by the artist who will never bloom at all. Prodigies are easy. They advertise their genius from the get-go. Late bloomers are hard. They require forbearance and blind faith. (Let's just be thankful that Cézanne didn't have a guidance counsellor in high school who looked at his primitive sketches and told him to try accounting.) Whenever we find a late bloomer, we can't but wonder how many others like him or her we have thwarted because we prematurely judged their talents. But we also have to accept that there's nothing we can do about it. How can we ever know which of the failures will end up blooming?
Not long after meeting Ben Fountain, I went to see the novelist Jonathan Safran Foer, the author of the 2002 best-seller "Everything Is Illuminated." Fountain is a graying man, slight and modest, who looks, in the words of a friend of his, like a "golf pro from Augusta, Georgia." Foer is in his early thirties and looks barely old enough to drink. Fountain has a softness to him, as if years of struggle have worn away whatever sharp edges he once had. Foer gives the impression that if you touched him while he was in full conversational flight you would get an electric shock.
"I came to writing really by the back door," Foer said. "My wife is a writer, and she grew up keeping journals—you know, parents said, 'Lights out, time for bed,' and she had a little flashlight under the covers, reading books. I don't think I read a book until much later than other people. I just wasn't interested in it."
Foer went to Princeton and took a creative-writing class in his freshman year with Joyce Carol Oates. It was, he explains, "sort of on a whim, maybe out of a sense that I should have a diverse course load." He'd never written a story before. "I didn't really think anything of it, to be honest, but halfway through the semester I arrived to class early one day, and she said, 'Oh, I'm glad I have this chance to talk to you. I'm a fan of your writing.' And it was a real revelation for me."
Oates told him that he had the most important of writerly qualities, which was energy. He had been writing fifteen pages a week for that class, an entire story for each seminar. "Why does a dam with a crack in it leak so much?" he said, with a laugh. "There was just something in me, there was like a pressure."
As a sophomore, he took another creative-writing class. During the following summer, he went to Europe. He wanted to find the village in Ukraine where his grandfather had come from. After the trip, he went to Prague. There he read Kafka, as any literary undergraduate would, and sat down at his computer.
"I was just writing," he said. "I didn't know that I was writing until it was happening. I didn't go with the intention of writing a book. I wrote three hundred pages in ten weeks. I really wrote. I'd never done it like that."
It was a novel about a boy named Jonathan Safran Foer who visits the village in Ukraine where his grandfather had come from. Those three hundred pages were the first draft of "Everything Is Illuminated"—the exquisite and extraordinary novel that established Foer as one of the most distinctive literary voices of his generation. He was nineteen years old.
Foer began to talk about the other way of writing books, where you painstakingly honed your craft, over years and years. "I couldn't do that," he said. He seemed puzzled by it. It was clear that he had no understanding of how being an experimental innovator would work. "I mean, imagine if the craft you're trying to learn is to be an original. How could you learn the craft of being an original?"
He began to describe his visit to Ukraine. "I went to the shtetl where my family came from. It's called Trachimbrod, the name I use in the book. It's a real place. But you know what's funny? It's the single piece of research that made its way into the book." He wrote the first sentence, and he was proud of it, and then he went back and forth in his mind about where to go next. "I spent the first week just having this debate with myself about what to do with this first sentence. And once I made the decision, I felt liberated to just create—and it was very explosive after that."
If you read "Everything Is Illuminated," you end up with the same feeling you get when you read "Brief Encounters with Che Guevara"—the sense of transport you experience when a work of literature draws you into its own world. Both are works of art. It's just that, as artists, Fountain and Foer could not be less alike. Fountain went to Haiti thirty times. Foer went to Trachimbrod just once. "I mean, it was nothing," Foer said. "I had absolutely no experience there at all. It was just a springboard for my book. It was like an empty swimming pool that had to be filled up." Total time spent getting inspiration for his novel: three days.
5.
Ben Fountain did not make the decision to quit the law and become a writer all by himself. He is married and has a family. He met his wife, Sharon, when they were both in law school at Duke. When he was doing real-estate work at Akin, Gump, she was on the partner track in the tax practice at Thompson & Knight. The two actually worked in the same building in downtown Dallas. They got married in 1985, and had a son in April of 1987. Sharie, as Fountain calls her, took four months of maternity leave before returning to work. She made partner by the end of that year.
"We had our son in a day care downtown," she recalls. "We would drive in together, one of us would take him to day care, the other one would go to work. One of us would pick him up, and then, somewhere around eight o'clock at night, we would have him bathed, in bed, and then we hadn't even eaten yet, and we'd be looking at each other, going, 'This is just the beginning.' " She made a face. "That went on for maybe a month or two, and Ben's like, 'I don't know how people do this.' We both agreed that continuing at that pace was probably going to make us all miserable. Ben said to me, 'Do you want to stay home?' Well, I was pretty happy in my job, and he wasn't, so as far as I was concerned it didn't make any sense for me to stay home. And I didn't have anything besides practicing law that I really wanted to do, and he did. So I said, 'Look, can we do this in a way that we can still have some day care and so you can write?' And so we did that."
Ben could start writing at seven-thirty in the morning because Sharie took their son to day care. He stopped working in the afternoon because that was when he had to pick him up, and then he did the shopping and the household chores. In 1989, they had a second child, a daughter. Fountain was a full-fledged North Dallas stay-at-home dad.
"When Ben first did this, we talked about the fact that it might not work, and we talked about, generally, 'When will we know that it really isn't working?' and I'd say, 'Well, give it ten years,' " Sharie recalled. To her, ten years didn't seem unreasonable. "It takes a while to decide whether you like something or not," she says. And when ten years became twelve and then fourteen and then sixteen, and the kids were off in high school, she stood by him, because, even during that long stretch when Ben had nothing published at all, she was confident that he was getting better. She was fine with the trips to Haiti, too. "I can't imagine writing a novel about a place you haven't at least tried to visit," she says. She even went with him once, and on the way into town from the airport there were people burning tires in the middle of the road.
"I was making pretty decent money, and we didn't need two incomes," Sharie went on. She has a calm, unflappable quality about her. "I mean, it would have been nice, but we could live on one."
Sharie was Ben's wife. But she was also—to borrow a term from long ago—his patron. That word has a condescending edge to it today, because we think it far more appropriate for artists (and everyone else for that matter) to be supported by the marketplace. But the marketplace works only for people like Jonathan Safran Foer, whose art emerges, fully realized, at the beginning of their career, or Picasso, whose talent was so blindingly obvious that an art dealer offered him a hundred-and-fifty-franc-a-month stipend the minute he got to Paris, at age twenty. If you are the type of creative mind that starts without a plan, and has to experiment and learn by doing, you need someone to see you through the long and difficult time it takes for your art to reach its true level.
This is what is so instructive about any biography of Cézanne. Accounts of his life start out being about Cézanne, and then quickly turn into the story of Cézanne's circle. First and foremost is always his best friend from childhood, the writer Émile Zola, who convinces the awkward misfit from the provinces to come to Paris, and who serves as his guardian and protector and coach through the long, lean years.
Here is Zola, already in Paris, in a letter to the young Cézanne back in Provence. Note the tone, more paternal than fraternal:
You ask me an odd question. Of course one can work here, as anywhere else, if one has the will. Paris offers, further, an advantage you can't find elsewhere: the museums in which you can study the old masters from 11 to 4. This is how you must divide your time. From 6 to 11 you go to a studio to paint from a live model; you have lunch, then from 12 to 4 you copy, in the Louvre or the Luxembourg, whatever masterpiece you like. That will make up nine hours of work. I think that ought to be enough.
Zola goes on, detailing exactly how Cézanne could manage financially on a monthly stipend of a hundred and twenty-five francs:
I'll reckon out for you what you should spend. A room at 20 francs a month; lunch at 18 sous and dinner at 22, which makes two francs a day, or 60 francs a month. . . . Then you have the studio to pay for: the Atelier Suisse, one of the least expensive, charges, I think, 10 francs. Add 10 francs for canvas, brushes, colors; that makes 100. So you'll have 25 francs left for laundry, light, the thousand little needs that turn up.
Camille Pissarro was the next critical figure in Cézanne's life. It was Pissarro who took Cézanne under his wing and taught him how to be a painter. For years, there would be periods in which they went off into the country and worked side by side.
Then there was Ambroise Vollard, the sponsor of Cézanne's first one-man show, at the age of fifty-six. At the urging of Pissarro, Renoir, Degas, and Monet, Vollard hunted down Cézanne in Aix. He spotted a still-life in a tree, where it had been flung by Cézanne in disgust. He poked around the town, putting the word out that he was in the market for Cézanne's canvases. In "Lost Earth: A Life of Cézanne," the biographer Philip Callow writes about what happened next:
Before long someone appeared at his hotel with an object wrapped in a cloth. He sold the picture for 150 francs, which inspired him to trot back to his house with the dealer to inspect several more magnificent Cézannes. Vollard paid a thousand francs for the job lot, then on the way out was nearly hit on the head by a canvas that had been overlooked, dropped out the window by the man's wife. All the pictures had been gathering dust, half buried in a pile of junk in the attic.
All this came before Vollard agreed to sit a hundred and fifty times, from eight in the morning to eleven-thirty, without a break, for a picture that Cézanne disgustedly abandoned. Once, Vollard recounted in his memoir, he fell asleep, and toppled off the makeshift platform. Cézanne berated him, incensed: "Does an apple move?" This is called friendship.
Finally, there was Cézanne's father, the banker Louis-Auguste. From the time Cézanne first left Aix, at the age of twenty-two, Louis-Auguste paid his bills, even when Cézanne gave every indication of being nothing more than a failed dilettante. But for Zola, Cézanne would have remained an unhappy banker's son in Provence; but for Pissarro, he would never have learned how to paint; but for Vollard (at the urging of Pissarro, Renoir, Degas, and Monet), his canvases would have rotted away in some attic; and, but for his father, Cézanne's long apprenticeship would have been a financial impossibility. That is an extraordinary list of patrons. The first three—Zola, Pissarro, and Vollard—would have been famous even if Cézanne never existed, and the fourth was an unusually gifted entrepreneur who left Cézanne four hundred thousand francs when he died. Cézanne didn't just have help. He had a dream team in his corner.
This is the final lesson of the late bloomer: his or her success is highly contingent on the efforts of others. In biographies of Cézanne, Louis-Auguste invariably comes across as a kind of grumpy philistine, who didn't appreciate his son's genius. But Louis-Auguste didn't have to support Cézanne all those years. He would have been within his rights to make his son get a real job, just as Sharie might well have said no to her husband's repeated trips to the chaos of Haiti. She could have argued that she had some right to the life style of her profession and status—that she deserved to drive a BMW, which is what power couples in North Dallas drive, instead of a Honda Accord, which is what she settled for.
But she believed in her husband's art, or perhaps, more simply, she believed in her husband, the same way Zola and Pissarro and Vollard and—in his own, querulous way—Louis-Auguste must have believed in Cézanne. Late bloomers' stories are invariably love stories, and this may be why we have such difficulty with them. We'd like to think that mundane matters like loyalty, steadfastness, and the willingness to keep writing checks to support what looks like failure have nothing to do with something as rarefied as genius. But sometimes genius is anything but rarefied; sometimes it's just the thing that emerges after twenty years of working at your kitchen table.
"Sharie never once brought up money, not once—never," Fountain said. She was sitting next to him, and he looked at her in a way that made it plain that he understood how much of the credit for "Brief Encounters" belonged to his wife. His eyes welled up with tears. "I never felt any pressure from her," he said. "Not even covert, not even implied."
The Uses of Adversity
November 10, 2008
Annals of Business
Can underprivileged outsiders have an advantage?
1.
Sidney Weinberg was born in 1891, one of eleven children of Pincus Weinberg, a struggling Polish-born liquor wholesaler and bootlegger in Brooklyn. Sidney was short, a "Kewpie doll," as the New Yorker writer E. J. Kahn, Jr., described him, "in constant danger of being swallowed whole by executive-size chairs." He pronounced his name "Wine-boig." He left school at fifteen. He had scars on his back from knife fights in his preteen days, when he sold evening newspapers at the Hamilton Avenue terminus of the Manhattan-Brooklyn ferry.
At sixteen, he made a visit to Wall Street, keeping an eye out for a "nice-looking, tall building," as he later recalled. He picked 43 Exchange Place, where he started at the top floor and worked his way down, asking at every office, "Want a boy?" By the end of the day, he had reached the third-floor offices of a small brokerage house. There were no openings. He returned to the brokerage house the next morning. He lied that he was told to come back, and bluffed himself into a job assisting the janitor, for three dollars a week. The small brokerage house was Goldman Sachs.
From that point, Charles Ellis tells us in a new book, "The Partnership: The Making of Goldman Sachs," Weinberg's rise was inexorable. Early on, he was asked to carry a flagpole on the trolley uptown to the Sachs family's town house. The door was opened by Paul Sachs, the grandson of the firm's founder, and Sachs took a shine to him. Weinberg was soon promoted to the mailroom, which he promptly reorganized. Sachs sent him to Browne's Business College, in Brooklyn, to learn penmanship. By 1925, the firm had bought him a seat on the New York Stock Exchange. By 1927, he had made partner. By 1930, he was a senior partner, and for the next thirty-nine years—until his death, in 1969—Weinberg was Goldman Sachs, turning it from a floundering, mid-tier partnership into the premier investment bank in the world.
2.
The rags-to-riches story—that staple of American biography—has over the years been given two very different interpretations. The nineteenth-century version stressed the value of compensating for disadvantage. If you wanted to end up on top, the thinking went, it was better to start at the bottom, because it was there that you learned the discipline and motivation essential for success. "New York merchants preferred to hire country boys, on the theory that they worked harder, and were more resolute, obedient, and cheerful than native New Yorkers," Irvin G. Wyllie wrote in his 1954 study "The Self-Made Man in America." Andrew Carnegie, whose personal history was the defining self-made-man narrative of the nineteenth century, insisted that there was an advantage to being "cradled, nursed and reared in the stimulating school of poverty." According to Carnegie, "It is not from the sons of the millionaire or the noble that the world receives its teachers, its martyrs, its inventors, its statesmen, its poets, or even its men of affairs. It is from the cottage of the poor that all these spring."
Today, that interpretation has been reversed. Success is seen as a matter of capitalizing on socioeconomic advantage, not compensating for disadvantage. The mechanisms of social mobility—scholarships, affirmative action, housing vouchers, Head Start—all involve attempts to convert the poor from chronic outsiders to insiders, to rescue them from what is assumed to be a hopeless state. Nowadays, we don't learn from poverty, we escape from poverty, and a book like Ellis's history of Goldman Sachs is an almost perfect case study of how we have come to believe social mobility operates. Six hundred pages of Ellis's book are devoted to the modern-day Goldman, the firm that symbolized the golden era of Wall Street. From the boom years of the nineteen-eighties through the great banking bubble of the past decade, Goldman brought impeccably credentialled members of the cognitive and socioeconomic élite to Wall Street, where they conjured up fantastically complex deals and made enormous fortunes. The opening seventy-two pages of the book, however, the chapters covering the Sidney Weinberg years, seem as though they belong to a different era. The man who created what we know as Goldman Sachs was a poor, uneducated member of a despised minority—and his story is so remarkable that perhaps only Andrew Carnegie could make sense of it.
3.
Weinberg was not a financial wizard. His gifts were social. In his heyday, Weinberg served as a director on thirty-one corporate boards. He averaged two hundred and fifty committee or board meetings a year, and when he was not in meetings he would often take a steam at the Hotel Biltmore's Turkish baths with the likes of Robert Woodruff, of Coca-Cola, and Bernard Gimbel, of Gimbels. During the Depression, Weinberg served on Franklin Roosevelt's Business Advisory and Planning Council, and F.D.R. dubbed him the Politician, for his skill at mediating among contentious parties. He spent the war years as the vice-chairman of the War Production Board, where he was known as the Body Snatcher, because of the way he persuaded promising young business executives to join the war effort. (Weinberg seems to have been the first to realize that signing up promising young executives for public service during the war was the surest way to sign them up as clients after the war.)
When Ford Motor Company decided to go public, in the mid-nineteen-fifties, in what remains one of the world's biggest initial public offerings, both major parties in the hugely complicated transaction—the Ford family and the Ford Foundation—wanted Weinberg to represent them. He was Mr. Wall Street. "In his role as the power behind the throne," E. J. Kahn wrote in a New Yorker Profile of Weinberg, fifty years ago, "he probably comes as close as Bernard Baruch to embodying the popular conception of Bernard Baruch." Kahn went on:
There is hardly a prominent corporation executive of whom he cannot—and, indeed, does not—say, "He's an intimate close personal friend of mine." . . . Industrialists who want information about other industrialists automatically turn to Weinberg, much as merchants consult credit-rating agencies. His end of many telephone conversations consists of fragments like "Who? . . . Of course I know him. Intimately. . . . Used to be Under-Secretary of the Treasury. . . . O.K., I'll have him call you."
This gregariousness is what we expect of the head of an investment bank. Wall Street—particularly the clubby Wall Street of the early and middle part of the twentieth century—was a relationship business: you got to do the stock offering for Continental Can because you knew the head of Continental Can. We further assume that businesses based on social ties reward cultural insiders. That's one of the reasons we no longer think of poverty as being useful in the nineteenth-century sense; no matter how hard you work, or how disciplined you are, it is difficult to overcome the socially marginalizing effects of an impoverished background. In order to do the stock offering for Continental Can, you need to know the head of Continental Can, and in order to know the head of Continental Can it really helps to have been his classmate at Yale.
But Weinberg wasn't Yale. He was P.S. 13. Nor did he try to pretend that he was an insider. He did the opposite. "You'll have to make that plainer," he would say. "I'm just a dumb, uneducated kid from Brooklyn." He bought a modest house in Scarsdale in the nineteen-twenties, and lived there the rest of his life. He took the subway. He may have worked closely with the White House, but this was the Roosevelt White House, in the nineteen-thirties, at a time when none of the Old Guard on Wall Street were New Dealers. Weinberg would talk about his public school as if it were Princeton, and as a joke he would buy up Phi Beta Kappa keys from pawnshops and hand them out to visitors like party favors. His savvy was such that Roosevelt wanted to make him Ambassador to the Soviet Union, and his grasp of the intricacies of Wall Street was so shrewd that his phone never stopped ringing. But as often as he could he reminded his peers that he was from the other side of the tracks.
At one board meeting, Ellis writes, "a long presentation was being made that was overloaded with dull, detailed statistics. Number after number was read off. When the droning presenter finally paused for breath, Weinberg jumped up, waving his papers in mock triumph, to call out 'Bingo!' " The immigrant's best strategy, in the famous adage, is to think Yiddish and dress British. Weinberg thought British and dressed Yiddish.
Why did that strategy work? This is the great mystery of Weinberg's career, and it's hard to escape the conclusion that Carnegie was on to something: there are times when being an outsider is precisely what makes you a good insider. It's not difficult to imagine, for example, that the head of Continental Can liked the fact that Weinberg was from nothing, in the same way that New York City employers preferred country boys to city boys. That C.E.O. dwelled in a world with lots of people who went to Yale and then to Wall Street; he knew that some of them were good at what they did and some of them were just well connected, and separating the able from the incompetent wasn't always easy. Weinberg made it out of Brooklyn; how could he not be good?
Weinberg's outsiderness also allowed him to play the classic "middleman minority" role. One of the reasons that the Parsis in India, the East Asians in Africa, the Chinese in Southeast Asia, and the Lebanese in the Caribbean, among others, have been so successful, sociologists argue, is that they are decoupled from the communities in which they operate. If you are a Malaysian in Malaysia, or a Kenyan in Kenya, or an African-American in Watts, and you want to run a grocery store, you start with a handicap: you have friends and relatives who want jobs, or discounts. You can't deny credit or collect a debt from your neighbor, because he's your neighbor, and your social and business lives are tied up together. As the anthropologist Brian Foster writes of commerce in Thailand:
A trader who was subject to the traditional social obligations and constraints would find it very difficult to run a viable business. If, for example, he were fully part of the village society and subject to the constraints of the society, he would be expected to be generous in the traditional way to those in need. It would be difficult for him to refuse credit, and it would not be possible to collect debts. . . . The inherent conflict of interest in a face-to-face market transaction would make proper etiquette impossible or would at least strain it severely, which is an important factor in Thai social relations.
The minority has none of those constraints. He's free to keep social and financial considerations separate. He can call a bad debt a bad debt, or a bad customer a bad customer, without worrying about the social implications of his honesty.
Weinberg was decoupled from the business establishment in the same way, and that seems to have been a big part of what drew executives to him. The chairman of General Foods avowed, "Sidney is the only man I know who could ever say to me in the middle of a board meeting, as he did once, 'I don't think you're very bright,' and somehow give me the feeling that I'd been paid a compliment." That Weinberg could make a rebuke seem like a compliment is testament to his charm. That he felt free to deliver the rebuke in the first place is testament to his sociological position. You can't tell the chairman of General Foods that he's an idiot if you were his classmate at Yale. But you can if you're Pincus Weinberg's son from Brooklyn. Truthtelling is easier from a position of cultural distance.
Here is Ellis on Weinberg, again:
Shortly after he was elected a director of General Electric, he was called upon by Philip D. Reed, GE's chairman of the board, to address a group of company officials at a banquet at the Waldorf Astoria. In presenting Weinberg, Reed said . . . that he hoped Mr. Weinberg felt, as he felt, that GE was the greatest outfit in the greatest industry in the greatest country in the world. Weinberg got to his feet. "I'll string along with your chairman about this being the greatest country," he began. "And I guess I'll even buy that part about the electrical industry. But as to GE's being the greatest business in the field, why, I'm damned if I'll commit myself until I've had a look-see." Then he sat down to vigorous applause.
At G.E., Weinberg's irreverence was cherished. During the Second World War, a top Vichy official, Admiral Jean-François Darlan, visited the White House. Darlan was classic French military, imperious and entitled, and was thought to have Nazi sympathies. Protocol dictated that the Allies treat Darlan with civility, and everyone did—save for Weinberg. The outsider felt perfectly free to say what everyone else wanted to but could not, and in so doing surely endeared himself to the whole room. "When it was time to leave," Ellis writes, "Weinberg reached into his pocket as he came to the front door, pulled out a quarter, and handed it to the resplendently uniformed admiral, saying, 'Here, boy, get me a cab.'"
The idea that outsiders can profit by virtue of their outsiderness runs contrary to our understanding of minorities. "Think Yiddish, dress British" presumes that the outsider is better off cloaking his differences. But there are clearly also times and places where minorities benefit by asserting and even exaggerating their otherness. The Berkeley historian Yuri Slezkine argues, in "The Jewish Century" (2004), that Yiddish did not evolve typically: if you study its form and structure, you discover its deliberate and fundamental artificiality—it is the language of people who are interested, in Slezkine's words, in "the maintenance of difference, the conscious preservation of the self and thus of strangeness."
Similarly, in field work in a Malaysian village, the anthropologist L. A. Peter Gosling observed a Chinese shopkeeper who
appeared to be considerably acculturated to Malay culture, and was scrupulously sensitive to Malays in every way, including the normal wearing of a sarong, quiet and polite Malay speech, and a humble and affable manner. However, at harvest time when he would go to the field to collect crops on which he had advanced credit, he would put on his Chinese costume of shorts and undershirt, and speak in a much more abrupt fashion, acting, as one Malay farmer put it, "just like a Chinese." This behavior was to insure that he would not be treated like a fellow Malay who might be expected to be more generous on price or credit terms.
Is this what Weinberg was up to with his constant references to P.S. 13? Ellis's book repeats stories about Weinberg from Lisa Endlich's 1999 history, "Goldman Sachs: The Culture of Success," which in turn repeats stories about Weinberg from Kahn's Profile, which in turn—one imagines—repeats stories honed by Weinberg and his friends over the years. And what is clear when you read those stories is how obviously they are stories: anecdotes clearly constructed for strategic effect. According to Ellis:
A friend told of Weinberg's being the guest of honor at J. P. Morgan's luncheon table, where the following exchange occurred: "Mr. Weinberg, I presume you served in the last war?"
"Yes, sir, I was in the war—in the navy."
"What were you in the navy?"
"Cook, Second Class."
Morgan was delighted.
Of course, J. P. Morgan wasn't actually delighted. He died in 1913, before the First World War started. So he wasn't the mogul at the table. But you can understand why Weinberg would want to pretend that he was. And although Weinberg did a stint as a cook (on account of poor eyesight), he quickly got himself transferred to the Office of Naval Intelligence, and then spent most of the war heading up the inspection of all vessels using the port of Norfolk. But you can understand why that little bit of additional history doesn't fit, either.
Here's another one:
The heir to a large retailing fortune once spent a night in Scarsdale with the Weinbergs and retired early. After Weinberg and his wife, whose only servant was a cook, had emptied the ashtrays and picked up the glasses, they noticed that their guest had put his suit and shoes outside his bedroom door. Amused, Weinberg took the suit and shoes down to the kitchen, cleaned the shoes, brushed the suit, and put them back. The following day, as the guest was leaving, he handed Weinberg a five-dollar bill and asked him to pass it along to the butler who had taken such excellent care of things. Weinberg thanked him gravely and pocketed the money.
Let's see: we're supposed to believe that the retailing heir has dinner at the modest Weinberg residence in Scarsdale and never once sees a butler, and doesn't see a butler in the morning, either, and yet somehow remains convinced that there's a butler around. Did he imagine the butler was hiding in a closet? No matter. This is another of those stories which Weinberg needed to tell, and his audience needed to hear.
4.
It's one thing to argue that being an outsider can be strategically useful. But Andrew Carnegie went farther. He believed that poverty provided a better preparation for success than wealth did; that, at root, compensating for disadvantage was more useful, developmentally, than capitalizing on advantage.
This idea is both familiar and perplexing. Consider the curious fact that many successful entrepreneurs suffer from serious learning disabilities. Paul Orfalea, the founder of the Kinko's chain, was a D student who failed two grades, was expelled from four schools, and graduated at the bottom of his high-school class. "In third grade, the only word I could read was 'the,' " he says. "I used to keep track of where the group was reading by following from one 'the' to the next." Richard Branson, the British billionaire who started the Virgin empire, dropped out of school at fifteen after struggling with reading and writing. "I was always bottom of the class," he has said. John Chambers, who built the Silicon Valley firm Cisco into a hundred-billion-dollar corporation, has trouble reading e-mail. One of the pioneers of the cellular-phone industry, Craig McCaw, is dyslexic, as is Charles Schwab, the founder of the discount brokerage house that bears his name. When the business-school professor Julie Logan surveyed a group of American small-business owners recently, she found that thirty-five per cent of them self-identified as dyslexic.
That is a remarkable statistic. Dyslexia affects the very skills that lie at the center of an individual's ability to manage the modern world. Yet Schwab and Orfalea and Chambers and Branson seem to have made up for their disabilities, in the same way that the poor, in Carnegie's view, can make up for their poverty. Because of their difficulties with reading and writing, they were forced to develop superior oral-communication and problem-solving skills. Because they had to rely on others to help them navigate the written word, they became adept at delegating authority. In one study, conducted in Britain, eighty per cent of dyslexic entrepreneurs were found to have held the position of captain in a high-school sport, versus twenty-seven per cent of non-dyslexic entrepreneurs. They compensated for their academic shortcomings by developing superior social skills, and, when they reached the workplace, those compensatory skills gave them an enormous head start. "I didn't have a lot of self-confidence as a kid," Orfalea said once, in an interview. "And that is for the good. If you have a healthy dose of rejection in your life, you are going to have to figure out how to do it your way."
There's no question that we are less than comfortable with the claims that people like Schwab and Orfalea make on behalf of their disabilities. As impressive as their success has been, none of us would go so far as to wish dyslexia on our own children. If a disproportionately high number of entrepreneurs are dyslexic, so are a disproportionately high number of prisoners. Systems in which people compensate for disadvantage seem to us unacceptably Darwinian. The stronger get stronger, and the weaker get even weaker. The man who boasts of walking seven miles to school, barefoot, every morning, happily drives his own grandchildren ten blocks in an S.U.V. We have become convinced that the surest path to success for our children involves providing them with a carefully optimized educational experience: the "best" schools, the most highly educated teachers, the smallest classrooms, the shiniest facilities, the greatest variety of colors in the art-room paint box. But one need only look at countries where schoolchildren outperform their American counterparts—despite larger classes, shabbier schools, and smaller budgets—to wonder if our wholesale embrace of the advantages of advantages isn't as simplistic as Carnegie's wholesale embrace of the advantages of disadvantages.
In his Profile, E. J. Kahn tells the story of a C.E.O. retreat that Weinberg attended, organized by Averell Harriman. It was at Sun Valley, Harriman's ski resort, where, Kahn writes, it emerged that Weinberg had never skied before:
Several corporation presidents pooled their cash resources to bet him twenty-five dollars that he could not ski down the steepest and longest slope in the area. Weinberg was approaching fifty but game. "I got hold of an instructor named Franz Something or Fritz Something and had a thirty-minute lesson," he says. "Then I rode up to the top of the mountain. It took me half a day to come down, and I finished with only one ski, and for two weeks I was black and blue all over, but I won the bet."
Here you have the Waspy élite of corporate America, off in their mountain idyll, subjecting the little Jew from Brooklyn to a bit of boarding-school hazing. (In a reminder of the anti-Semitism permeating Weinberg's world, Ellis tells us that, in the Depression, Manufacturers Trust, a predominantly Jewish company, had to agree to install a Gentile as C.E.O. as a condition of being rescued by a coalition of banks.) It is also possible, though, to read that story as highlighting the determination of the Brooklyn kid who'll be damned if he's going to let himself lose a bet to those smirking C.E.O.s. One imagines that Weinberg told that tale the first way to his wife, and the second way to his buddies in the Biltmore steam room. And when he tried to get out of bed the next morning it probably occurred to him that sometimes being humiliated provides a pretty good opportunity to show a lodge full of potential clients that you would ski down a mountain for them.
Twenty years later, Weinberg had his greatest score, handling the initial public offering for Ford Motor Company, which was founded, of course, by that odious anti-Semite Henry Ford. Did taking the business prick Weinberg's conscience? Maybe so. But he probably realized that the unstated premise behind the idea that the Jews control all the banks is that Jews are really good bankers. The first was a stereotype that oppressed; the second was a stereotype that, if you were smart about it, you could use to win a few clients. If you're trying to build an empire, you work with what you have.
5.
In 1918, Henry Goldman, one of the senior partners of Goldman Sachs, quit the firm in a dispute over Liberty Bonds. Goldman was a Germanophile, who objected to aiding the Allied war effort. (This is the same Henry Goldman who later bought the twelve-year-old Yehudi Menuhin a Stradivarius and Albert Einstein a yacht.) The Sachs brothers—Walter and Arthur—were desperate for a replacement, and they settled, finally, on a young man named Waddill Catchings, a close friend of Arthur Sachs from Harvard. He had worked at Sullivan & Cromwell, Wall Street's great patrician law firm. He had industrial experience, having reorganized several companies, and "on top of all that," Ellis tells us, "Catchings was one of the most talented, charming, handsome, well-educated, and upwardly mobile people in Wall Street."
Catchings's bold idea was to create a huge investment trust, called the Goldman Sachs Trading Corporation. It was a precursor to today's hedge funds; it borrowed heavily to buy controlling stakes in groups of corporations. The fund was originally intended to be twenty-five million dollars, but then Catchings, swept up in the boom market of the nineteen-twenties, doubled it to fifty million, doubled it again to a hundred million, then merged the Goldman fund with another fund and added two subsidiary trusts, until G.S.T.C. controlled half a billion dollars in assets.
"Walter and Arthur Sachs were travelling in Europe during the summer of 1929," Ellis writes. "In Italy they learned of the deals Catchings was doing on his own, and Walter Sachs got worried. On his return to New York, he went straight to Catchings' apartment in the Plaza Hotel to urge greater caution. But Catchings, still caught up in the bull-market euphoria, was unmoved. "The trouble with you, Walter, is that you've no imagination," he said.
Then came the stock-market crash. G.S.T.C. stock, which had traded as high as three hundred and twenty-six dollars a share, fell to $1.75. Goldman's capital was wiped out. The firm was besieged with lawsuits, the last of which was not settled until 1968. Eddie Cantor—one of the most popular comedians of the day and a disgruntled G.S.T.C. investor—turned the respected Goldman name into a punch line: "They told me to buy the stock for my old age . . . and it worked perfectly. Within six months I felt like a very old man!" Catchings was ousted. "Very few men can stand success," Walter Sachs concluded. "He was not one of them." Privilege did not prepare Catchings for crisis. The Sachs brothers then replaced Catchings with a man who was not from privilege at all, and perhaps now we can appreciate the wisdom of that decision. Wall Street needs fewer Waddill Catchingses and a few more Sidney Weinbergs.
Most Likely to Succeed
December 15, 2008
Annals of Education
How do we hire when we can't tell who's right for the job?
1.
On the day of the big football game between the University of Missouri Tigers and the Cowboys of Oklahoma State, a football scout named Dan Shonka sat in his hotel, in Columbia, Missouri, with a portable DVD player. Shonka has worked for three National Football League teams. Before that, he was a football coach, and before that he played linebacker—although, he says, "that was three knee operations and a hundred pounds ago." Every year, he evaluates somewhere between eight hundred and twelve hundred players around the country, helping professional teams decide whom to choose in the college draft, which means that over the last thirty years he has probably seen as many football games as anyone else in America. In his DVD player was his homework for the evening's big game—an edited video of the Tigers' previous contest, against the University of Nebraska Cornhuskers.
Shonka methodically made his way through the video, stopping and rewinding whenever he saw something that caught his eye. He liked Jeremy Maclin and Chase Coffman, two of the Mizzou receivers. He loved William Moore, the team's bruising strong safety. But, most of all, he was interested in the Tigers' quarterback and star, a stocky, strong-armed senior named Chase Daniel.
"I like to see that the quarterback can hit a receiver in stride, so he doesn't have to slow for the ball," Shonka began. He had a stack of evaluation forms next to him and, as he watched the game, he was charting and grading every throw that Daniel made. "Then judgment. Hey, if it's not there, throw it away and play another day. Will he stand in there and take a hit, with a guy breathing down his face? Will he be able to step right in there, throw, and still take that hit? Does the guy throw better when he's in the pocket, or does he throw equally well when he's on the move? You want a great competitor. Durability. Can they hold up, their strength, toughness? Can they make big plays? Can they lead a team down the field and score late in the game? Can they see the field? When your team's way ahead, that's fine. But when you're getting your ass kicked I want to see what you're going to do."
He pointed to his screen. Daniel had thrown a dart, and, just as he did, a defensive player had hit him squarely. "See how he popped up?" Shonka said. "He stood right there and threw the ball in the face of that rush. This kid has got a lot of courage." Daniel was six feet tall and weighed two hundred and twenty-five pounds: thick through the chest and trunk. He carried himself with a self-assurance that bordered on cockiness. He threw quickly and in rhythm. He nimbly evaded defenders. He made short throws with touch and longer throws with accuracy. By the game's end, he had completed an astonishing seventy-eight per cent of his passes, and handed Nebraska its worst home defeat in fifty-three years. "He can zip it," Shonka said. "He can really gun, when he has to." Shonka had seen all the promising college quarterbacks, charted and graded their throws, and to his mind Daniel was special: "He might be one of the best college quarterbacks in the country."
But then Shonka began to talk about when he was on the staff of the Philadelphia Eagles, in 1999. Five quarterbacks were taken in the first round of the college draft that year, and each looked as promising as Chase Daniel did now. But only one of them, Donovan McNabb, ended up fulfilling that promise. Of the rest, one descended into mediocrity after a decent start. Two were complete busts, and the last was so awful that after failing out of the N.F.L. he ended up failing out of the Canadian Football League as well.
The year before, the same thing happened with Ryan Leaf, who was the Chase Daniel of 1998. The San Diego Chargers made him the second player taken over all in the draft, and gave him an eleven-million-dollar signing bonus. Leaf turned out to be terrible. In 2002, it was Joey Harrington's turn. Harrington was a golden boy out of the University of Oregon, and the third player taken in the draft. Shonka still can't get over what happened to him.
"I tell you, I saw Joey live," he said. "This guy threw lasers, he could throw under tight spots, he had the arm strength, he had the size, he had the intelligence." Shonka got as misty as a two-hundred-and-eighty-pound ex-linebacker in a black tracksuit can get. "He's a concert pianist, you know? I really—I mean, I really—liked Joey." And yet Harrington's career consisted of a failed stint with the Detroit Lions and a slide into obscurity. Shonka looked back at the screen, where the young man he felt might be the best quarterback in the country was marching his team up and down the field. "How will that ability translate to the National Football League?" He shook his head slowly. "Shoot."
This is the quarterback problem. There are certain jobs where almost nothing you can learn about candidates before they start predicts how they'll do once they're hired. So how do we know whom to choose in cases like that? In recent years, a number of fields have begun to wrestle with this problem, but none with such profound social consequences as the profession of teaching.
2.
One of the most important tools in contemporary educational research is "value added" analysis. It uses standardized test scores to look at how much the academic performance of students in a given teacher's classroom changes between the beginning and the end of the school year. Suppose that Mrs. Brown and Mr. Smith both teach a classroom of third graders who score at the fiftieth percentile on math and reading tests on the first day of school, in September. When the students are retested, in June, Mrs. Brown's class scores at the seventieth percentile, while Mr. Smith's students have fallen to the fortieth percentile. That change in the students' rankings, value-added theory says, is a meaningful indicator of how much more effective Mrs. Brown is as a teacher than Mr. Smith.
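To make the mechanics concrete, here is a minimal sketch, in Python, of the value-added calculation described above. The teacher names, the sample scores, and the simple averaging are illustrative assumptions only; real value-added models adjust for far more than a raw change in percentile rank.

# A rough sketch of value-added as described above: the change in a
# classroom's average percentile rank between September and June.
# All numbers below are invented for illustration.

def value_added(september_percentiles, june_percentiles):
    """Average change in percentile rank over one school year."""
    gains = [june - sept for sept, june in zip(september_percentiles, june_percentiles)]
    return sum(gains) / len(gains)

# Two hypothetical third-grade classrooms, both starting near the 50th percentile.
brown_september = [48, 52, 50, 51, 49]
brown_june = [68, 72, 70, 71, 69]   # class rises toward the 70th percentile
smith_september = [49, 51, 50, 52, 48]
smith_june = [39, 41, 40, 42, 38]   # class falls toward the 40th percentile

print("Mrs. Brown:", value_added(brown_september, brown_june))   # roughly +20
print("Mr. Smith:", value_added(smith_september, smith_june))    # roughly -10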
It's only a crude measure, of course. A teacher is not solely responsible for how much is learned in a classroom, and not everything of value that a teacher imparts to his or her students can be captured on a standardized test. Nonetheless, if you follow Brown and Smith for three or four years, their effect on their students' test scores starts to become predictable: with enough data, it is possible to identify who the very good teachers are and who the very poor teachers are. What's more—and this is the finding that has galvanized the educational world—the difference between good teachers and poor teachers turns out to be vast.
Eric Hanushek, an economist at Stanford, estimates that the students of a very bad teacher will learn, on average, half a year's worth of material in one school year. The students in the class of a very good teacher will learn a year and a half's worth of material. That difference amounts to a year's worth of learning in a single year. Teacher effects dwarf school effects: your child is actually better off in a "bad" school with an excellent teacher than in an excellent school with a bad teacher. Teacher effects are also much stronger than class-size effects. You'd have to cut the average class almost in half to get the same boost that you'd get if you switched from an average teacher to a teacher in the eighty-fifth percentile. And remember that a good teacher costs as much as an average one, whereas halving class size would require that you build twice as many classrooms and hire twice as many teachers.
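Here is a similarly hedged sketch of how those per-year figures compound over a child's school career. The half-year and year-and-a-half rates come from the estimates quoted above; treating an average teacher as exactly one year of material per year is a simplifying assumption of mine.

# Illustration only: cumulative material learned after several years with
# teachers of different quality, using the figures cited in the text.
YEARS = 4
rates = {
    "very bad teacher": 0.5,   # half a year's worth of material per year
    "average teacher": 1.0,    # assumed baseline, not a figure from the text
    "very good teacher": 1.5,  # a year and a half's worth per year
}
for label, per_year in rates.items():
    print(f"{label}: {per_year * YEARS:.1f} years of material after {YEARS} school years")
# The gap between the best and worst case grows by a full year of
# learning every year -- four years' worth of material after four years.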
Hanushek recently did a back-of-the-envelope calculation about what even a rudimentary focus on teacher quality could mean for the United States. If you rank the countries of the world in terms of the academic performance of their schoolchildren, the U.S. is just below average, half a standard deviation below a clump of relatively high-performing countries like Canada and Belgium. According to Hanushek, the U.S. could close that gap simply by replacing the bottom six per cent to ten per cent of public-school teachers with teachers of average quality. After years of worrying about issues like school funding levels, class size, and curriculum design, many reformers have come to the conclusion that nothing matters more than finding people with the potential to be great teachers. But there's a hitch: no one knows what a person with the potential to be a great teacher looks like. The school system has a quarterback problem.
3.
Kickoff time for Missouri's game against Oklahoma State was seven o'clock. It was a perfect evening for football: cloudless skies and a light fall breeze. For hours, fans had been tailgating in the parking lots around the stadium. Cars lined the roads leading to the university, many with fuzzy yellow-and-black Tiger tails hanging from their trunks. It was one of Mizzou's biggest games in years. The Tigers were undefeated, and had a chance to become the No. 1 college football team in the country. Shonka made his way through the milling crowds and took a seat in the press box. Below him, the players on the field looked like pieces on a chessboard.
The Tigers held the ball first. Chase Daniel stood a good seven yards behind his offensive line. He had five receivers, two to his left and three to his right, spaced from one side of the field to the other. His linemen were widely spaced as well. In play after play, Daniel caught the snap from his center, planted his feet, and threw the ball in quick seven- and eight-yard diagonal passes to one of his five receivers.
The style of offense that the Tigers run is called the "spread," and most of the top quarterbacks in college football—the players who will be drafted into the pros—are spread quarterbacks. By spacing out the offensive linemen and wide receivers, the system makes it easy for the quarterback to figure out the intentions of the opposing defense before the ball is snapped: he can look up and down the line, "read" the defense, and decide where to throw the ball before anyone has moved a muscle. Daniel had been playing in the spread since high school; he was its master. "Look how quickly he gets the ball out," Shonka said. "You can hardly go a thousand and one, a thousand and two, and it's out of his hand. He knows right where he's going. When everyone is spread out like that, the defense can't disguise its coverage. Chase knows right away what they are going to do. The system simplifies the quarterback's decisions."
But for Shonka this didn't help matters. It had always been hard to predict how a college quarterback would fare in the pros. The professional game was, simply, faster and more complicated. With the advent of the spread, though, the correspondence between the two levels of play had broken down almost entirely. N.F.L. teams don't run the spread. They can't. The defenders in the pros are so much faster than their college counterparts that they would shoot through those big gaps in the offensive line and flatten the quarterback. In the N.F.L., the offensive line is bunched closely together. Daniel wouldn't have five receivers. Most of the time, he'd have just three or four. He wouldn't have the luxury of standing seven yards behind the center, planting his feet, and knowing instantly where to throw. He'd have to crouch right behind the center, take the snap directly, and run backward before planting his feet to throw. The onrushing defenders wouldn't be seven yards away. They would be all around him, from the start. The defense would no longer have to show its hand, because the field would not be so spread out. It could now disguise its intentions. Daniel wouldn't be able to read the defense before the snap was taken. He'd have to read it in the seconds after the play began.
"In the spread, you see a lot of guys wide open," Shonka said. "But when a guy like Chase goes to the N.F.L. he's never going to see his receivers that open—only in some rare case, like someone slips or there's a bust in the coverage. When that ball's leaving your hands in the pros, if you don't use your eyes to move the defender a little bit, they'll break on the ball and intercept it. The athletic ability that they're playing against in the league is unbelievable."
As Shonka talked, Daniel was moving his team down the field. But he was almost always throwing those quick, diagonal passes. In the N.F.L., he would have to do much more than that—he would have to throw long, vertical passes over the top of the defense. Could he make that kind of throw? Shonka didn't know. There was also the matter of his height. Six feet was fine in a spread system, where the big gaps in the offensive line gave Daniel plenty of opportunity to throw the ball and see downfield. But in the N.F.L. there wouldn't be gaps, and the linemen rushing at him would be six-five, not six-one.
"I wonder," Shonka went on. "Can he see? Can he be productive in a new kind of offense? How will he handle that? I'd like to see him set up quickly from center. I'd like to see his ability to read coverages that are not in the spread. I'd like to see him in the pocket. I'd like to see him move his feet. I'd like to see him do a deep dig, or deep comeback. You know, like a throw twenty to twenty-five yards down the field."
It was clear that Shonka didn't feel the same hesitancy in evaluating the other Mizzou stars—the safety Moore, the receivers Maclin and Coffman. The game that they would play in the pros would also be different from the game they were playing in college, but the difference was merely one of degree. They had succeeded at Missouri because they were strong and fast and skilled, and these traits translate in kind to professional football.
A college quarterback joining the N.F.L., by contrast, has to learn to play an entirely new game. Shonka began to talk about Tim Couch, the quarterback taken first in that legendary draft of 1999. Couch set every record imaginable in his years at the University of Kentucky. "They used to put five garbage cans on the field," Shonka recalled, shaking his head, "and Couch would stand there and throw and just drop the ball into every one." But Couch was a flop in the pros. It wasn't that professional quarterbacks didn't need to be accurate. It was that the kind of accuracy required to do the job well could be measured only in a real N.F.L. game.
Similarly, all quarterbacks drafted into the pros are required to take an I.Q. test—the Wonderlic Personnel Test. The theory behind the test is that the pro game is so much more cognitively demanding than the college game that high intelligence should be a good predictor of success. But when the economists David Berri and Rob Simmons analyzed the scores—which are routinely leaked to the press—they found that Wonderlic scores are all but useless as predictors. Of the five quarterbacks taken in round one of the 1999 draft, Donovan McNabb, the only one of the five with a shot at the Hall of Fame, had the lowest Wonderlic score. And who else had I.Q. scores in the same range as McNabb? Dan Marino and Terry Bradshaw, two of the greatest quarterbacks ever to play the game.
We're used to dealing with prediction problems by going back and looking for better predictors. We now realize that being a good doctor requires the ability to communicate, listen, and empathize—and so there is increasing pressure on medical schools to pay attention to interpersonal skills as well as to test scores. We can have better physicians if we're just smarter about how we choose medical-school students. But no one is saying that Dan Shonka is somehow missing some key ingredient in his analysis; that if he were only more perceptive he could predict Chase Daniel's career trajectory. The problem with picking quarterbacks is that Chase Daniel's performance can't be predicted. The job he's being groomed for is so particular and specialized that there is no way to know who will succeed at it and who won't. In fact, Berri and Simmons found no connection between where a quarterback was taken in the draft—that is, how highly he was rated on the basis of his college performance—and how well he played in the pros.
The entire time that Chase Daniel was on the field against Oklahoma State, his backup, Chase Patton, stood on the sidelines, watching. Patton didn't play a single down. In his four years at Missouri, up to that point, he had thrown a total of twenty-six passes. And yet there were people in Shonka's world who thought that Patton would end up as a better professional quarterback than Daniel. The week of the Oklahoma State game, the national sports magazine ESPN even put the two players on its cover, with the title "CHASE DANIEL MIGHT WIN THE HEISMAN"—referring to the trophy given to college football's best player. "HIS BACKUP COULD WIN THE SUPER BOWL." Why did everyone like Patton so much? It wasn't clear. Maybe he looked good in practice. Maybe it was because this season in the N.F.L. a quarterback who had also never started a single college game is playing superbly for the New England Patriots. It sounds absurd to put an athlete on the cover of a magazine for no particular reason. But perhaps that's just the quarterback problem taken to an extreme. If college performance doesn't tell us anything, why shouldn't we value someone who hasn't had the chance to play as highly as someone who plays as well as anyone in the land?
4.
Picture a young preschool teacher, sitting on a classroom floor surrounded by seven children. She is holding an alphabet book, and working through the letters with the children, one by one: " 'A' is for apple. . . . 'C' is for cow." The session was taped, and the videotape is being watched by a group of experts, who are charting and grading each of the teacher's moves.
After thirty seconds, the leader of the group—Bob Pianta, the dean of the University of Virginia's Curry School of Education—stops the tape. He points to two little girls on the right side of the circle. They are unusually active, leaning into the circle and reaching out to touch the book.
"What I'm struck by is how lively the affect is in this room," Pianta said. "One of the things the teacher is doing is creating a holding space for that. And what distinguishes her from other teachers is that she flexibly allows the kids to move and point to the book. She's not rigidly forcing the kids to sit back."
Pianta's team has developed a system for evaluating various competencies relating to student-teacher interaction. Among them is "regard for student perspective"; that is, a teacher's knack for allowing students some flexibility in how they become engaged in the classroom. Pianta stopped and rewound the tape twice, until what the teacher had managed to achieve became plain: the children were active, but somehow the class hadn't become a free-for-all.
"A lesser teacher would have responded to the kids' leaning over as misbehavior," Pianta went on. " 'We can't do this right now. You need to be sitting still.' She would have turned this off."
Bridget Hamre, one of Pianta's colleagues, chimed in: "These are three- and four-year-olds. At this age, when kids show their engagement it's not like the way we show our engagement, where we look alert. They're leaning forward and wriggling. That's their way of doing it. And a good teacher doesn't interpret that as bad behavior. You can see how hard it is to teach new teachers this idea, because the minute you teach them to have regard for the student's perspective, they think you have to give up control of the classroom."
The lesson continued. Pianta pointed out how the teacher managed to personalize the material. " 'C' is for cow" turned into a short discussion of which of the kids had ever visited a farm. "Almost every time a child says something, she responds to it, which is what we describe as teacher sensitivity," Hamre said.
The teacher then asked the children if anyone's name began with that letter. "Calvin," a boy named Calvin says. The teacher nods, and says, "Calvin starts with 'C.' " A little girl in the middle says, "Me!" The teacher turns to her. "Your name's Venisha. Letter 'V.' Venisha."
It was a key moment. Of all the teacher elements analyzed by the Virginia group, feedback—a direct, personal response by a teacher to a specific statement by a student—seems to be most closely linked to academic success. Not only did the teacher catch the "Me!" amid the wiggling and tumult; she addressed it directly.
"Mind you, that's not great feedback," Hamre said. "High-quality feedback is where there is a back-and-forth exchange to get a deeper understanding." The perfect way to handle that moment would have been for the teacher to pause and pull out Venisha's name card, point to the letter "V," show her how different it is from "C," and make the class sound out both letters. But the teacher didn't do that—either because it didn't occur to her or because she was distracted by the wiggling of the girls to her right.
"On the other hand, she could have completely ignored the girl, which happens a lot," Hamre went on. "The other thing that happens a lot is the teacher will just say, 'You're wrong.' Yes-no feedback is probably the predominant kind of feedback, which provides almost no information for the kid in terms of learning."
Pianta showed another tape, of a nearly identical situation: a circle of pre-schoolers around a teacher. The lesson was about how we can tell when someone is happy or sad. The teacher began by acting out a short conversation between two hand puppets, Henrietta and Twiggle: Twiggle is sad until Henrietta shares some watermelon with him.
"The idea that the teacher is trying to get across is that you can tell by looking at somebody's face how they're feeling, whether they're feeling sad or happy," Hamre said. "What kids of this age tend to say is you can tell how they're feeling because of something that happened to them. They lost their puppy and that's why they're sad. They don't really get this idea. So she's been challenged, and she's struggling."
The teacher begins, "Remember when we did something and we drew our face?" She touches her face, pointing out her eyes and mouth. "When somebody is happy, their face tells us that they're happy. And their eyes tell us." The children look on blankly. The teacher plunges on: "Watch, watch." She smiles broadly. "This is happy! How can you tell that I'm happy? Look at my face. Tell me what changes about my face when I'm happy. No, no, look at my face. . . . No. . . ."
A little girl next to her says, "Eyes," providing the teacher with an opportunity to use one of her students to draw the lesson out. But the teacher doesn't hear her. Again, she asks, "What's changed about my face?" She smiles and she frowns, as if she can reach the children by sheer force of repetition. Pianta stopped the tape. One problem, he pointed out, was that Henrietta made Twiggle happy by sharing watermelon with him, which doesn't illustrate what the lesson is about.
"You know, a better way to handle this would be to anchor something around the kids," Pianta said. "She should ask, 'What makes you feel happy?' The kids could answer. Then she could say, 'Show me your face when you have that feeling? O.K., what does So-and-So's face look like? Now tell me what makes you sad. Show me your face when you're sad. Oh, look, her face changed!' You've basically made the point. And then you could have the kids practice, or something. But this is going to go nowhere."
"What's changed about my face?" the teacher repeated, for what seemed like the hundredth time. One boy leaned forward into the circle, trying to engage himself in the lesson, in the way that little children do. His eyes were on the teacher. "Sit up!" she snapped at him.
As Pianta played one tape after another, the patterns started to become clear. Here was a teacher who read out sentences, in a spelling test, and every sentence came from her own life—"I went to a wedding last week"—which meant she was missing an opportunity to say something that engaged her students. Another teacher walked over to a computer to do a PowerPoint presentation, only to realize that she hadn't turned it on. As she waited for it to boot up, the classroom slid into chaos.
Then there was the superstar—a young high-school math teacher, in jeans and a green polo shirt. "So let's see," he began, standing up at the blackboard. "Special right triangles. We're going to do practice with this, just throwing out ideas." He drew two triangles. "Label the length of the side, if you can. If you can't, we'll all do it." He was talking and moving quickly, which Pianta said might be interpreted as a bad thing, because this was trigonometry. It wasn't easy material. But his energy seemed to infect the class. And all the time he offered the promise of help. If you can't, we'll all do it. In a corner of the room was a student named Ben, who'd evidently missed a few classes. "See what you can remember, Ben," the teacher said. Ben was lost. The teacher quickly went to his side: "I'm going to give you a way to get to it." He made a quick suggestion: "How about that?" Ben went back to work. The teacher slipped over to the student next to Ben, and glanced at her work. "That's all right!" He went to a third student, then a fourth. Two and a half minutes into the lesson—the length of time it took that subpar teacher to turn on the computer—he had already laid out the problem, checked in with nearly every student in the class, and was back at the blackboard, to take the lesson a step further.
"In a group like this, the standard m.o. would be: he's at the board, broadcasting to the kids, and has no idea who knows what he's doing and who doesn't know," Pianta said. "But he's giving individualized feedback. He's off the charts on feedback." Pianta and his team watched in awe.
5.
Educational-reform efforts typically start with a push for higher standards for teachers—that is, for the academic and cognitive requirements for entering the profession to be as stiff as possible. But after you've watched Pianta's tapes, and seen how complex the elements of effective teaching are, this emphasis on book smarts suddenly seems peculiar. The preschool teacher with the alphabet book was sensitive to her students' needs and knew how to let the two girls on the right wiggle and squirm without disrupting the rest of the students; the trigonometry teacher knew how to complete a circuit of his classroom in two and a half minutes and make everyone feel as if he or she were getting his personal attention. But these aren't cognitive skills.
A group of researchers—Thomas J. Kane, an economist at Harvard's school of education; Douglas Staiger, an economist at Dartmouth; and Robert Gordon, a policy analyst at the Center for American Progress—have investigated whether it helps to have a teacher who has earned a teaching certification or a master's degree. Both are expensive, time-consuming credentials that almost every district expects teachers to acquire; neither makes a difference in the classroom. Test scores, graduate degrees, and certifications—as much as they appear related to teaching prowess—turn out to be about as useful in predicting success as having a quarterback throw footballs into a bunch of garbage cans.
Another educational researcher, Jacob Kounin, once did an analysis of "desist" events, in which a teacher has to stop some kind of misbehavior. In one instance, "Mary leans toward the table to her right and whispers to Jane. Both she and Jane giggle. The teacher says, 'Mary and Jane, stop that!' " That's a desist event. But how a teacher desists—her tone of voice, her attitudes, her choice of words—appears to make no difference at all in maintaining an orderly classroom. How can that be? Kounin went back over the videotape and noticed that forty-five seconds before Mary whispered to Jane, Lucy and John had started whispering. Then Robert had noticed and joined in, making Jane giggle, whereupon Jane said something to John. Then Mary whispered to Jane. It was a contagious chain of misbehavior, and what really was significant was not how a teacher stopped the deviancy at the end of the chain but whether she was able to stop the chain before it started. Kounin called that ability "withitness," which he defined as "a teacher's communicating to the children by her actual behavior (rather than by verbally announcing: 'I know what's going on') that she knows what the children are doing, or has the proverbial 'eyes in the back of her head.' " It stands to reason that to be a great teacher you have to have withitness. But how do you know whether someone has withitness until she stands up in front of a classroom of twenty-five wiggly Janes, Lucys, Johns, and Roberts and tries to impose order?
6.
Perhaps no profession has taken the implications of the quarterback problem more seriously than the financial-advice field, and the experience of financial advisers is a useful guide to what could happen in teaching as well. There are no formal qualifications for entering the field except a college degree. Financial-services firms don't look for only the best students, or require graduate degrees or specify a list of prerequisites. No one knows beforehand what makes a high-performing financial adviser different from a low-performing one, so the field throws the door wide open.
"A question I ask is, 'Give me a typical day,' " Ed Deutschlander, the co-president of North Star Resource Group, in Minneapolis, says. "If that person says, 'I get up at five-thirty, hit the gym, go to the library, go to class, go to my job, do homework until eleven,' that person has a chance." Deutschlander, in other words, begins by looking for the same general traits that every corporate recruiter looks for.
Deutschlander says that last year his firm interviewed about a thousand people, and found forty-nine it liked, a ratio of twenty interviewees to one candidate. Those candidates were put through a four-month "training camp," in which they tried to act like real financial advisers. "They should be able to obtain in that four-month period a minimum of ten official clients," Deutschlander said. "If someone can obtain ten clients, and is able to maintain a minimum of ten meetings a week, that means that person has gathered over a hundred introductions in that four-month period. Then we know that person is at least fast enough to play this game."
Of the forty-nine people invited to the training camp, twenty-three made the cut and were hired as apprentice advisers. Then the real sorting began. "Even with the top performers, it really takes three to four years to see whether someone can make it," Deutschlander says. "You're just scratching the surface at the beginning. Four years from now, I expect to hang on to at least thirty to forty per cent of that twenty-three."
People like Deutschlander are referred to as gatekeepers, a title that suggests that those at the door of a profession are expected to discriminate—to select who gets through the gate and who doesn't. But Deutschlander sees his role as keeping the gate as wide open as possible: to find ten new financial advisers, he's willing to interview a thousand people. The equivalent of that approach, in the N.F.L., would be for a team to give up trying to figure out who the "best" college quarterback is, and, instead, try out three or four "good" candidates.
In teaching, the implications are even more profound. They suggest that we shouldn't be raising standards. We should be lowering them, because there is no point in raising standards if standards don't track with what we care about. Teaching should be open to anyone with a pulse and a college degree—and teachers should be judged after they have started their jobs, not before. That means that the profession needs to start the equivalent of Ed Deutschlander's training camp. It needs an apprenticeship system that allows candidates to be rigorously evaluated. Kane and Staiger have calculated that, given the enormous differences between the top and the bottom of the profession, you'd probably have to try out four candidates to find one good teacher. That means tenure can't be routinely awarded, the way it is now. Currently, the salary structure of the teaching profession is highly rigid, and that would also have to change in a world where we want to rate teachers on their actual performance. An apprentice should get apprentice wages. But if we find eighty-fifth-percentile teachers who can teach a year and a half's material in one year, we're going to have to pay them a lot—both because we want them to stay and because the only way to get people to try out for what will suddenly be a high-risk profession is to offer those who survive the winnowing a healthy reward.
Is this solution to teaching's quarterback problem politically possible? Taxpayers might well balk at the costs of trying out four teachers to find one good one. Teachers' unions have been resistant to even the slightest move away from the current tenure arrangement. But all the reformers want is for the teaching profession to copy what firms like North Star have been doing for years. Deutschlander interviews a thousand people to find ten advisers. He spends large amounts of money to figure out who has the particular mixture of abilities to do the job. "Between hard and soft costs," he says, "most firms sink between a hundred thousand dollars and two hundred and fifty thousand dollars on someone in their first three or four years," and in most cases, of course, that investment comes to naught. But, if you were willing to make that kind of investment and show that kind of patience, you wound up with a truly high-performing financial adviser. "We have a hundred and twenty-five full-time advisers," Deutschlander says. "Last year, we had seventy-one of them qualify for the Million Dollar Round Table"—the industry's association of its most successful practitioners. "We're seventy-one out of a hundred and twenty-five in that élite group." What does it say about a society that it devotes more care and patience to the selection of those who handle its money than of those who handle its children?
7.
Midway through the fourth quarter of the Oklahoma State–Missouri game, the Tigers were in trouble. For the first time all year, they were behind late in the game. They needed to score, or they'd lose any chance of a national championship. Daniel took the snap from his center, and planted his feet to pass. His receivers were covered. He began to run. The Oklahoma State defenders closed in on him. He was under pressure, something that rarely happened to him in the spread. Desperate, he heaved the ball downfield, right into the arms of a Cowboy defender.
Shonka jumped up. "That's not like him!" he cried out. "He doesn't throw stuff up like that."
Next to Shonka, a scout for the Kansas City Chiefs looked crestfallen. "Chase never throws something up for grabs!"
It was tempting to see Daniel's mistake as definitive. The spread had broken down. He was finally under pressure. This was what it would be like to be an N.F.L. quarterback, wasn't it? But there is nothing like being an N.F.L. quarterback except being an N.F.L. quarterback. A prediction, in a field where prediction is not possible, is no more than a prejudice. Maybe that interception means that Daniel won't be a good professional quarterback, or maybe he made a mistake that he'll learn from. "In a great big piece of pie," Shonka said, "that was just a little slice."
How David Beats Goliath
May 11, 2009
Annals of Innovation
When underdogs break the rules.
1.
When Vivek Ranadivé decided to coach his daughter Anjali's basketball team, he settled on two principles. The first was that he would never raise his voice. This was National Junior Basketball—the Little League of basketball. The team was made up mostly of twelve-year-olds, and twelve-year-olds, he knew from experience, did not respond well to shouting. He would conduct business on the basketball court, he decided, the same way he conducted business at his software firm. He would speak calmly and softly, and convince the girls of the wisdom of his approach with appeals to reason and common sense.
The second principle was more important. Ranadivé was puzzled by the way Americans played basketball. He is from Mumbai. He grew up with cricket and soccer. He would never forget the first time he saw a basketball game. He thought it was mindless. Team A would score and then immediately retreat to its own end of the court. Team B would inbound the ball and dribble it into Team A's end, where Team A was patiently waiting. Then the process would reverse itself. A basketball court was ninety-four feet long. But most of the time a team defended only about twenty-four feet of that, conceding the other seventy feet. Occasionally, teams would play a full-court press—that is, they would contest their opponent's attempt to advance the ball up the court. But they would do it for only a few minutes at a time. It was as if there were a kind of conspiracy in the basketball world about the way the game ought to be played, and Ranadivé thought that that conspiracy had the effect of widening the gap between good teams and weak teams. Good teams, after all, had players who were tall and could dribble and shoot well; they could crisply execute their carefully prepared plays in their opponent's end. Why, then, did weak teams play in a way that made it easy for good teams to do the very things that made them so good?
Ranadivé looked at his girls. Morgan and Julia were serious basketball players. But Nicky, Angela, Dani, Holly, Annika, and his own daughter, Anjali, had never played the game before. They weren't all that tall. They couldn't shoot. They weren't particularly adept at dribbling. They were not the sort who played pickup games at the playground every evening. Most of them were, as Ranadivé says, "little blond girls" from Menlo Park and Redwood City, the heart of Silicon Valley. These were the daughters of computer programmers and people with graduate degrees. They worked on science projects, and read books, and went on ski vacations with their parents, and dreamed about growing up to be marine biologists. Ranadivé knew that if they played the conventional way—if they let their opponents dribble the ball up the court without opposition—they would almost certainly lose to the girls for whom basketball was a passion. Ranadivé came to America as a seventeen-year-old, with fifty dollars in his pocket. He was not one to accept losing easily. His second principle, then, was that his team would play a real full-court press, every game, all the time. The team ended up at the national championships. "It was really random," Anjali Ranadivé said. "I mean, my father had never played basketball before."
2.
David's victory over Goliath, in the Biblical account, is held to be an anomaly. It was not. Davids win all the time. The political scientist Ivan Arreguín-Toft recently looked at every war fought in the past two hundred years between strong and weak combatants. The Goliaths, he found, won in 71.5 per cent of the cases. That is a remarkable fact. Arreguín-Toft was analyzing conflicts in which one side was at least ten times as powerful—in terms of armed might and population—as its opponent, and even in those lopsided contests the underdog won almost a third of the time.
In the Biblical story of David and Goliath, David initially put on a coat of mail and a brass helmet and girded himself with a sword: he prepared to wage a conventional battle of swords against Goliath. But then he stopped. "I cannot walk in these, for I am unused to it," he said (in Robert Alter's translation), and picked up those five smooth stones. What happened, Arreguín-Toft wondered, when the underdogs likewise acknowledged their weakness and chose an unconventional strategy? He went back and re-analyzed his data. In those cases, David's winning percentage went from 28.5 to 63.6. When underdogs choose not to play by Goliath's rules, they win, Arreguín-Toft concluded, "even when everything we think we know about power says they shouldn't."
Consider the way T. E. Lawrence (or, as he is better known, Lawrence of Arabia) led the revolt against the Ottoman Army occupying Arabia near the end of the First World War. The British were helping the Arabs in their uprising, and the initial focus was Medina, the city at the end of a long railroad that the Turks had built, running south from Damascus and down through the Hejaz desert. The Turks had amassed a large force in Medina, and the British leadership wanted Lawrence to gather the Arabs and destroy the Turkish garrison there, before the Turks could threaten the entire region.
But when Lawrence looked at his ragtag band of Bedouin fighters he realized that a direct attack on Medina would never succeed. And why did taking the city matter, anyway? The Turks sat in Medina "on the defensive, immobile." There were so many of them, consuming so much food and fuel and water, that they could hardly make a major move across the desert. Instead of attacking the Turks at their point of strength, Lawrence reasoned, he ought to attack them where they were weak—along the vast, largely unguarded length of railway line that was their connection to Damascus. Instead of focussing his attention on Medina, he should wage war over the broadest territory possible.
The Bedouins under Lawrence's command were not, in conventional terms, skilled troops. They were nomads. Sir Reginald Wingate, one of the British commanders in the region, called them "an untrained rabble, most of whom have never fired a rifle." But they were tough and they were mobile. The typical Bedouin soldier carried no more than a rifle, a hundred rounds of ammunition, forty-five pounds of flour, and a pint of drinking water, which meant that he could travel as much as a hundred and ten miles a day across the desert, even in summer. "Our cards were speed and time, not hitting power," Lawrence wrote. "Our largest available resources were the tribesmen, men quite unused to formal warfare, whose assets were movement, endurance, individual intelligence, knowledge of the country, courage." The eighteenth-century general Maurice de Saxe famously said that the art of war was about legs, not arms, and Lawrence's troops were all legs. In one typical stretch, in the spring of 1917, his men dynamited sixty rails and cut a telegraph line at Buair on March 24th, sabotaged a train and twenty-five rails at Abu al-Naam on March 25th, dynamited fifteen rails and cut a telegraph line at Istabl Antar on March 27th, raided a Turkish garrison and derailed a train on March 29th, returned to Buair and sabotaged the railway line again on March 31st, dynamited eleven rails at Hediah on April 3rd, raided the train line in the area of Wadi Dhaiji on April 4th and 5th, and attacked twice on April 6th.
Lawrence's masterstroke was an assault on the port town of Aqaba. The Turks expected an attack from British ships patrolling the waters of the Gulf of Aqaba to the west. Lawrence decided to attack from the east instead, coming at the city from the unprotected desert, and to do that he led his men on an audacious, six-hundred-mile loop—up from the Hejaz, north into the Syrian desert, and then back down toward Aqaba. This was in summer, through some of the most inhospitable land in the Middle East, and Lawrence tacked on a side trip to the outskirts of Damascus, in order to mislead the Turks about his intentions. "This year the valley seemed creeping with horned vipers and puff-adders, cobras and black snakes," Lawrence writes in "The Seven Pillars of Wisdom" of one stage in the journey:
We could not lightly draw water after dark, for there were snakes swimming in the pools or clustering in knots around their brinks. Twice puff-adders came twisting into the alert ring of our debating coffee-circle. Three of our men died of bites; four recovered after great fear and pain, and a swelling of the poisoned limb. Howeitat treatment was to bind up the part with snake-skin plaster and read chapters of the Koran to the sufferer until he died.
When they finally arrived at Aqaba, Lawrence's band of several hundred warriors killed or captured twelve hundred Turks, and lost only two men. The Turks simply did not think that their opponent would be mad enough to come at them from the desert. This was Lawrence's great insight. David can beat Goliath by substituting effort for ability—and substituting effort for ability turns out to be a winning formula for underdogs in all walks of life, including little blond-haired girls on the basketball court.
3.
Vivek Ranadivé is an elegant man, slender and fine-boned, with impeccable manners and a languorous walk. His father was a pilot who was jailed by Indira Gandhi, he says, because he wouldn't stop challenging the safety of India's planes. Ranadivé went to M.I.T., because he saw a documentary on the school and decided that it was perfect for him. This was in the nineteen-seventies, when going abroad for undergraduate study required the Indian government to authorize the release of foreign currency, and Ranadivé camped outside the office of the governor of the Reserve Bank of India until he got his way. The Ranadivés are relentless.
In 1985, Ranadivé founded a software company in Silicon Valley devoted to what in the computer world is known as "real time" processing. If a businessman waits until the end of the month to collect and count his receipts, he's "batch processing." There is a gap between the events in the company—sales—and his understanding of those events. Wall Street used to be the same way. The information on which a trader based his decisions was scattered across a number of databases. The trader would collect information from here and there, collate and analyze it, and then make a trade. What Ranadivé's company, TIBCO, did was to consolidate those databases into one stream, so that the trader could collect all the data he wanted instantaneously. Batch processing was replaced by real-time processing. Today, TIBCO's software powers most of the trading floors on Wall Street.
Ranadivé views this move from batch to real time as a sort of holy mission. The shift, to his mind, is one of kind, not just of degree. "We've been working with some airlines," he said. "You know, when you get on a plane and your bag doesn't, they actually know right away that it's not there. But no one tells you, and a big part of that is that they don't have all their information in one place. There are passenger systems that know where the passenger is. There are aircraft and maintenance systems that track where the plane is and what kind of shape it's in. Then, there are baggage systems and ticketing systems—and they're all separate. So you land, you wait at the baggage terminal, and it doesn't show up." Everything bad that happens in that scenario, Ranadivé maintains, happens because of the lag between the event (the luggage doesn't make it onto the plane) and the response (the airline tells you that your luggage didn't make the plane). The lag is why you're angry. The lag is why you had to wait, fruitlessly, at baggage claim. The lag is why you vow never to fly that airline again. Put all the databases together, and there's no lag. "What we can do is send you a text message the moment we know your bag didn't make it," Ranadivé said, "telling you we'll ship it to your house."
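To make the batch-versus-real-time distinction concrete, here is a minimal Python sketch of the lost-bag scenario. The names (BagEvent, notify_passenger) and the structure are assumptions for illustration, not TIBCO's actual software: the batch version reconciles the day's records after the fact, while the real-time version reacts to each record the moment it arrives.

from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class BagEvent:
    passenger: str
    bag_loaded: bool  # True if the bag made it onto the plane

def batch_process(day_of_events: List[BagEvent]) -> List[str]:
    # End-of-day reconciliation: the passenger learns about the bag
    # only after standing at the carousel. This is the "lag."
    return [e.passenger for e in day_of_events if not e.bag_loaded]

def real_time_process(events: Iterable[BagEvent],
                      notify_passenger: Callable[[str, str], None]) -> None:
    # Event-driven handling: act on each record as soon as it exists.
    for event in events:
        if not event.bag_loaded:
            notify_passenger(event.passenger,
                             "Your bag missed the flight; we'll ship it to your house.")

# Illustrative use: the text message goes out the moment the event is known.
real_time_process([BagEvent("A. Rao", bag_loaded=False)],
                  lambda who, msg: print(who + ": " + msg))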
A few years ago, Ranadivé wrote a paper arguing that even the Federal Reserve ought to make its decisions in real time—not once every month or two. "Everything in the world is now real time," he said. "So when a certain type of shoe isn't selling at your corner shop, it's not six months before the guy in China finds out. It's almost instantaneous, thanks to my software. The world runs in real time, but government runs in batch. Every few months, it adjusts. Its mission is to keep the temperature comfortable in the economy, and, if you were to do things the government's way in your house, then every few months you'd turn the heater either on or off, overheating or underheating your house." Ranadivé argued that we ought to put the economic data that the Fed uses into a big stream, and write a computer program that sifts through those data, the moment they are collected, and makes immediate, incremental adjustments to interest rates and the money supply. "It can all be automated," he said. "Look, we've had only one soft landing since the Second World War. Basically, we've got it wrong every single time."
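Ranadivé's paper describes a goal rather than an algorithm, so the loop below is only a hedged sketch of what "immediate, incremental adjustments" might look like: each new inflation reading nudges the rate by a small amount instead of waiting weeks for a scheduled decision. The target and step size are invented for illustration and carry no policy meaning.

# A toy proportional controller reacting to a data stream, not a monetary-policy model.
TARGET_INFLATION = 0.02   # assumed 2 per cent target
STEP = 0.05               # how strongly each reading moves the rate

def adjust_rate(rate, inflation_stream):
    for reading in inflation_stream:      # each new data point, as it arrives
        error = reading - TARGET_INFLATION
        rate += STEP * error              # a tiny nudge, not a quarter-point decision
        yield rate

# With readings drifting above target, the rate creeps up reading by reading.
for r in adjust_rate(0.03, [0.021, 0.024, 0.028]):
    print(round(r, 5))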
You can imagine what someone like Alan Greenspan or Ben Bernanke might say about that idea. Such people are powerfully invested in the notion of the Fed as a Solomonic body: that pause of five or eight weeks between economic adjustments seems central to the process of deliberation. To Ranadivé, though, "deliberation" just prettifies the difficulties created by lag. The Fed has to deliberate because it's several weeks behind, the same way the airline has to bow and scrape and apologize because it waited forty-five minutes to tell you something that it could have told you the instant you stepped off the plane.
Is it any wonder that Ranadivé looked at the way basketball was played and found it mindless? A professional basketball game was forty-eight minutes long, divided up into alternating possessions of roughly twenty seconds: back and forth, back and forth. But a good half of each twenty-second increment was typically taken up with preliminaries and formalities. The point guard dribbled the ball up the court. He stood above the top of the key, about twenty-four feet from the opposing team's basket. He called out a play that the team had choreographed a hundred times in practice. It was only then that the defending team sprang into action, actively contesting each pass and shot. Actual basketball took up only half of that twenty-second interval, so that a game's real length was not forty-eight minutes but something closer to twenty-four minutes—and that twenty-four minutes of activity took place within a narrowly circumscribed area. It was as formal and as convention-bound as an eighteenth-century quadrille. The supporters of that dance said that the defensive players had to run back to their own end, in order to compose themselves for the arrival of the other team. But the reason they had to compose themselves, surely, was that by retreating they allowed the offense to execute a play that it had practiced to perfection. Basketball was batch!
Insurgents, though, operate in real time. Lawrence hit the Turks, in that stretch in the spring of 1917, nearly every day, because he knew that the more he accelerated the pace of combat the more the war became a battle of endurance—and endurance battles favor the insurgent. "And it happened as the Philistine arose and was drawing near David that David hastened and ran out from the lines toward the Philistine," the Bible says. "And he reached his hand into the pouch and took from there a stone and slung it and struck the Philistine in his forehead." The second sentence—the slingshot part—is what made David famous. But the first sentence matters just as much. David broke the rhythm of the encounter. He speeded it up. "The sudden astonishment when David sprints forward must have frozen Goliath, making him a better target," the poet and critic Robert Pinsky writes in "The Life of David." Pinsky calls David a "point guard ready to flick the basketball here or there." David pressed. That's what Davids do when they want to beat Goliaths.
4.
Ranadivé's basketball team played in the National Junior Basketball seventh-and-eighth-grade division, representing Redwood City. The girls practiced at Paye's Place, a gym in nearby San Carlos. Because Ranadivé had never played basketball, he recruited a series of experts to help him. The first was Roger Craig, the former all-pro running back for the San Francisco 49ers, who is also TIBCO's director of business development. As a football player, Craig was legendary for the off-season hill workouts he put himself through. Most of his N.F.L. teammates are now hobbling around golf courses. He has run seven marathons. After Craig signed on, he recruited his daughter Rometra, who played Division I basketball at Duke and U.S.C. Rometra was the kind of person you assigned to guard your opponent's best player in order to shut her down. The girls loved Rometra. "She has always been like my big sister," Anjali Ranadivé said. "It was so awesome to have her along."
Redwood City's strategy was built around the two deadlines that all basketball teams must meet in order to advance the ball. The first is the inbounds pass. When one team scores, a player from the other team takes the ball out of bounds and has five seconds to pass it to a teammate on the court. If that deadline is missed, the ball goes to the other team. Usually, that's not an issue, because teams don't contest the inbounds pass. They run back to their own end. Redwood City did not. Each girl on the team closely shadowed her counterpart. When some teams play the press, the defender plays behind the offensive player she's guarding, to impede her once she catches the ball. The Redwood City girls, by contrast, played in front of their opponents, to prevent them from catching the inbounds pass in the first place. And they didn't guard the player throwing the ball in. Why bother? Ranadivé used that extra player as a floater, who could serve as a second defender against the other team's best player. "Think about football," Ranadivé said. "The quarterback can run with the ball. He has the whole field to throw to, and it's still damned difficult to complete a pass." Basketball was harder. A smaller court. A five-second deadline. A heavier, bigger ball. As often as not, the teams Redwood City was playing against simply couldn't make the inbounds pass within the five-second limit. Or the inbounding player, panicked by the thought that her five seconds were about to be up, would throw the ball away. Or her pass would be intercepted by one of the Redwood City players. Ranadivé's girls were maniacal.
The second deadline requires a team to advance the ball across mid-court, into its opponent's end, within ten seconds, and if Redwood City's opponents met the first deadline the girls would turn their attention to the second. They would descend on the girl who caught the inbounds pass and "trap" her. Anjali was the designated trapper. She'd sprint over and double-team the dribbler, stretching her long arms high and wide. Maybe she'd steal the ball. Maybe the other player would throw it away in a panic—or get bottled up and stalled, so that the ref would end up blowing the whistle. "When we first started out, no one knew how to play defense or anything," Anjali said. "So my dad said the whole game long, 'Your job is to guard someone and make sure they never get the ball on inbounds plays.' It's the best feeling in the world to steal the ball from someone. We would press and steal, and do that over and over again. It made people so nervous. There were teams that were a lot better than us, that had been playing a long time, and we would beat them."
The Redwood City players would jump ahead 4–0, 6–0, 8–0, 12–0. One time, they led 25–0. Because they typically got the ball underneath their opponent's basket, they rarely had to take low-percentage, long-range shots that required skill and practice. They shot layups. In one of the few games that Redwood City lost that year, only four of the team's players showed up. They pressed anyway. Why not? They lost by three points.
"What that defense did for us is that we could hide our weaknesses," Rometra Craig said. She helped out once Redwood City advanced to the regional championships. "We could hide the fact that we didn't have good outside shooters. We could hide the fact that we didn't have the tallest lineup, because as long as we played hard on defense we were getting steals and getting easy layups. I was honest with the girls. I told them, 'We're not the best basketball team out there.' But they understood their roles." A twelve-year-old girl would go to war for Rometra. "They were awesome," she said.
Lawrence attacked the Turks where they were weak—the railroad—and not where they were strong, Medina. Redwood City attacked the inbounds pass, the point in a game where a great team is as vulnerable as a weak one. Lawrence extended the battlefield over as large an area as possible. So did the girls of Redwood City. They defended all ninety-four feet. The full-court press is legs, not arms. It supplants ability with effort. It is basketball for those "quite unused to formal warfare, whose assets were movement, endurance, individual intelligence . . . courage."
"It's an exhausting strategy," Roger Craig said. He and Ranadivé were in a TIBCO conference room, reminiscing about their dream season. Ranadivé was at the whiteboard, diagramming the intricacies of the Redwood City press. Craig was sitting at the table.
"My girls had to be more fit than the others," Ranadivé said.
"He used to make them run," Craig said, nodding approvingly.
"We followed soccer strategy in practice," Ranadivé said. "I would make them run and run and run. I couldn't teach them skills in that short period of time, and so all we did was make sure they were fit and had some basic understanding of the game. That's why attitude plays such a big role in this, because you're going to get tired." He turned to Craig. "What was our cheer again?"
The two men thought for a moment, then shouted out happily, in unison, "One, two, three, ATTITUDE!"
That was it! The whole Redwood City philosophy was based on a willingness to try harder than anyone else.
"One time, some new girls joined the team," RanadivĂ© said, "and so in the first practice I had I was telling them, 'Look, this is what we're going to do,' and I showed them. I said, 'It's all about attitude.' And there was this one new girl on the team, and I was worried that she wouldn't get the whole attitude thing. Then we did the cheer and she said, 'No, no, it's not One, two three, ATTITUDE. It's One, two, three, attitude HAH ' "—at which point RanadivĂ© and Craig burst out laughing.
5.
In January of 1971, the Fordham University Rams played a basketball game against the University of Massachusetts Redmen. The game was in Amherst, at the legendary arena known as the Cage, where the Redmen hadn't lost since December of 1969. Their record was –1. The Redmen's star was none other than Julius Erving—Dr. J. The UMass team was very, very good. Fordham, by contrast, was a team of scrappy kids from the Bronx and Brooklyn. Their center had torn up his knee the first week of the season, which meant that their tallest player was six feet five. Their starting forward—and forwards are typically almost as tall as centers—was Charlie Yelverton, who was six feet two. But from the opening buzzer the Rams launched a full-court press, and never let up. "We jumped out to a thirteen-to-six lead, and it was a war the rest of the way," Digger Phelps, the Fordham coach at the time, recalls. "These were tough city kids. We played you ninety-four feet. We knew that sooner or later we were going to make you crack." Phelps sent in one indefatigable Irish or Italian kid from the Bronx after another to guard Erving, and, one by one, the indefatigable Irish and Italian kids fouled out. None of them were as good as Erving. It didn't matter. Fordham won, 87–79.
In the world of basketball, there is one story after another like this about legendary games where David used the full-court press to beat Goliath. Yet the puzzle of the press is that it has never become popular. People look at upsets like Fordham over UMass and call them flukes. Basketball sages point out that the press can be beaten by a well-coached team with adept ball handlers and astute passers—and that is true. Ranadivé readily admitted that all an opposing team had to do to beat Redwood City was press back: the girls were not good enough to handle their own medicine. Playing insurgent basketball did not guarantee victory. It was simply the best chance an underdog had of beating Goliath. If Fordham had played UMass the conventional way, it would have lost by thirty points. And yet somehow that lesson has escaped the basketball establishment.
What did Digger Phelps do, the season after his stunning upset of UMass? He never used the full-court press the same way again. The UMass coach, Jack Leaman, was humbled in his own gym by a bunch of street kids. Did he learn from his defeat and use the press himself the next time he had a team of underdogs? He did not.
The only person who seemed to have absorbed the lessons of that game was a skinny little guard on the UMass freshman team named Rick Pitino. He didn't play that day. He watched, and his eyes grew wide. Even now, thirty-eight years later, he can name, from memory, nearly every player on the Fordham team: Yelverton, Sullivan, Mainor, Charles, Zambetti. "They came in with the most unbelievable pressing team I'd ever seen," Pitino said. "Five guys between six feet five and six feet. It was unbelievable how they covered ground. I studied it. There is no way they should have beaten us. Nobody beat us at the Cage."
Pitino became the head coach at Boston University in 1978, when he was twenty-five years old, and used the press to take the school to its first N.C.A.A. tournament appearance in twenty-four years. At his next head-coaching stop, Providence College, Pitino took over a team that had gone 11–20 the year before. The players were short and almost entirely devoid of talent—a carbon copy of the Fordham Rams. They pressed, and ended up one game away from playing for the national championship. At the University of Kentucky, in the mid-nineteen-nineties, Pitino took his team to the Final Four three times—and won a national championship—with full-court pressure, and then rode the full-court press back to the Final Four in 2005, as the coach at the University of Louisville. This year, his Louisville team entered the N.C.A.A. tournament ranked No. 1 in the land. College coaches of Pitino's calibre typically have had numerous players who have gone on to be bona-fide all-stars at the professional level. In his many years of coaching, Pitino has had one, Antoine Walker. It doesn't matter. Every year, he racks up more and more victories.
"The greatest example of the press I've ever coached was my Kentucky team in '96, when we played L.S.U.," Pitino said. He was at the athletic building at the University of Louisville, in a small room filled with television screens, where he watches tapes of opponents' games. "Do we have that tape?" Pitino called out to an assistant. He pulled a chair up close to one of the monitors. The game began with Kentucky stealing the ball from L.S.U., deep in L.S.U.'s end. Immediately, the ball was passed to Antoine Walker, who cut to the basket for a layup. L.S.U. got the ball back. Kentucky stole it again. Another easy basket by Walker. "Walker had almost thirty points at halftime," Pitino said. "He dunked it almost every time. When we steal, he just runs to the basket." The Kentucky players were lightning quick and long-armed, and swarmed around the L.S.U. players, arms flailing. It was mayhem. Five minutes in, it was clear that L.S.U. was panicking.
Pitino trains his players to look for what he calls the "rush state" in their opponents—that moment when the player with the ball is shaken out of his tempo—and L.S.U. could not find a way to get out of the rush state. "See if you find one play that L.S.U. managed to run," Pitino said. You couldn't. The L.S.U. players struggled to get the ball inbounds, and, if they did that, they struggled to get the ball over mid-court, and on those occasions when they managed both those things they were too overwhelmed and exhausted to execute their offense the way they had been trained to. "We had eighty-six points at halftime," Pitino went on—eighty-six points being, of course, what college basketball teams typically score in an entire game. "And I think we'd forced twenty-three turnovers at halftime," twenty-three turnovers being what college basketball teams might force in two games. "I love watching this," Pitino said. He had a faraway look in his eyes. "Every day, you dream about getting a team like this again." So why are there no more than a handful of college teams who use the full-court press the way Pitino does?
Arreguín-Toft found the same puzzling pattern. When an underdog fought like David, he usually won. But most of the time underdogs didn't fight like David. Of the two hundred and two lopsided conflicts in Arreguín-Toft's database, the underdog chose to go toe to toe with Goliath the conventional way a hundred and fifty-two times—and lost a hundred and nineteen times. In 1809, the Peruvians fought the Spanish straight up and lost; in 1816, the Georgians fought the Russians straight up and lost; in 1817, the Pindaris fought the British straight up and lost; in the Kandyan rebellion of 1817, the Sri Lankans fought the British straight up and lost; in 1823, the Burmese chose to fight the British straight up and lost. The list of failures was endless. In the nineteen-forties, the Communist insurgency in Vietnam bedevilled the French until, in 1951, the Viet Minh strategist Vo Nguyen Giap switched to conventional warfare—and promptly suffered a series of defeats. George Washington did the same in the American Revolution, abandoning the guerrilla tactics that had served the colonists so well in the conflict's early stages. "As quickly as he could," William Polk writes in "Violent Politics," a history of unconventional warfare, Washington "devoted his energies to creating a British-type army, the Continental Line. As a result, he was defeated time after time and almost lost the war."
It makes no sense, unless you think back to that Kentucky-L.S.U. game and to Lawrence's long march across the desert to Aqaba. It is easier to dress soldiers in bright uniforms and have them march to the sound of a fife-and-drum corps than it is to have them ride six hundred miles through the desert on the back of a camel. It is easier to retreat and compose yourself after every score than swarm about, arms flailing. We tell ourselves that skill is the precious resource and effort is the commodity. It's the other way around. Effort can trump ability—legs, in Saxe's formulation, can overpower arms—because relentless effort is in fact something rarer than the ability to engage in some finely tuned act of motor coördination.
"I have so many coaches come in every year to learn the press," Pitino said. Louisville was the Mecca for all those Davids trying to learn how to beat Goliaths. "Then they e-mail me. They tell me they can't do it. They don't know if they have the bench. They don't know if the players can last." Pitino shook his head. "We practice every day for two hours " " he went on. "The players are moving almost ninety-eight per cent of the practice. We spend very little time talking. When we make our corrections"—that is, when Pitino and his coaches stop play to give instruction—"they are seven-second corrections, so that our heart rate never rests. We are always working." Seven seconds! The coaches who came to Louisville sat in the stands and watched that ceaseless activity and despaired. The prospect of playing by David's rules was too daunting. They would rather lose.
6.
In 1981, a computer scientist from Stanford University named Doug Lenat entered the Traveller Trillion Credit Squadron tournament, in San Mateo, California. It was a war game. The contestants had been given several volumes of rules, well beforehand, and had been asked to design their own fleet of warships with a mythical budget of a trillion dollars. The fleets then squared off against one another in the course of a weekend. "Imagine this enormous auditorium area with tables, and at each table people are paired off," Lenat said. "The winners go on and advance. The losers get eliminated, and the field gets smaller and smaller, and the audience gets larger and larger."
Lenat had developed an artificial-intelligence program that he called Eurisko, and he decided to feed his program the rules of the tournament. Lenat did not give Eurisko any advice or steer the program in any particular strategic direction. He was not a war-gamer. He simply let Eurisko figure things out for itself. For about a month, for ten hours every night on a hundred computers at Xerox PARC, in Palo Alto, Eurisko ground away at the problem, until it came out with an answer. Most teams fielded some version of a traditional naval fleet—an array of ships of various sizes, each well defended against enemy attack. Eurisko thought differently. "The program came up with a strategy of spending the trillion on an astronomical number of small ships like P.T. boats, with powerful weapons but absolutely no defense and no mobility," Lenat said. "They just sat there. Basically, if they were hit once they would sink. And what happened is that the enemy would take its shots, and every one of those shots would sink our ships. But it didn't matter, because we had so many." Lenat won the tournament in a runaway.
The next year, Lenat entered once more, only this time the rules had changed. Fleets could no longer just sit there. Now one of the criteria of success in battle was fleet "agility." Eurisko went back to work. "What Eurisko did was say that if any of our ships got damaged it would sink itself—and that would raise fleet agility back up again," Lenat said. Eurisko won again.
Eurisko was an underdog. The other gamers were people steeped in military strategy and history. They were the sort who could tell you how Wellington had outfoxed Napoleon at Waterloo, or what exactly happened at Antietam. They had been raised on Dungeons and Dragons. They were insiders. Eurisko, on the other hand, knew nothing but the rule book. It had no common sense. As Lenat points out, a human being understands the meaning of the sentences "Johnny robbed a bank. He is now serving twenty years in prison," but Eurisko could not, because as a computer it was perfectly literal; it could not fill in the missing step—"Johnny was caught, tried, and convicted." Eurisko was an outsider. But it was precisely that outsiderness that led to Eurisko's victory: not knowing the conventions of the game turned out to be an advantage.
"Eurisko was exposing the fact that any finite set of rules is going to be a very incomplete approximation of reality," Lenat explained. "What the other entrants were doing was filling in the holes in the rules with real-world, realistic answers. But Eurisko didn't have that kind of preconception, partly because it didn't know enough about the world." So it found solutions that were, as Lenat freely admits, "socially horrifying": send a thousand defenseless and immobile ships into battle; sink your own ships the moment they get damaged.
This is the second half of the insurgent's creed. Insurgents work harder than Goliath. But their other advantage is that they will do what is "socially horrifying"—they will challenge the conventions about how battles are supposed to be fought. All the things that distinguish the ideal basketball player are acts of skill and coördination. When the game becomes about effort over ability, it becomes unrecognizable—a shocking mixture of broken plays and flailing limbs and usually competent players panicking and throwing the ball out of bounds. You have to be outside the establishment—a foreigner new to the game or a skinny kid from New York at the end of the bench—to have the audacity to play it that way. George Washington couldn't do it. His dream, before the war, was to be a British Army officer, finely turned out in a red coat and brass buttons. He found the guerrillas who had served the American Revolution so well to be "an exceeding dirty and nasty people." He couldn't fight the establishment, because he was the establishment.
T. E. Lawrence, by contrast, was the farthest thing from a proper British Army officer. He did not graduate with honors from Sandhurst. He was an archeologist by trade, a dreamy poet. He wore sandals and full Bedouin dress when he went to see his military superiors. He spoke Arabic like a native, and handled a camel as if he had been riding one all his life. And David, let's not forget, was a shepherd. He came at Goliath with a slingshot and staff because those were the tools of his trade. He didn't know that duels with Philistines were supposed to proceed formally, with the crossing of swords. "When the lion or the bear would come and carry off a sheep from the herd, I would go out after him and strike him down and rescue it from his clutches," David explained to Saul. He brought a shepherd's rules to the battlefield.
The price that the outsider pays for being so heedless of custom is, of course, the disapproval of the insider. Why did the Ivy League schools of the nineteen-twenties limit the admission of Jewish immigrants? Because they were the establishment and the Jews were the insurgents, scrambling and pressing and playing by immigrant rules that must have seemed to the Wasp élite of the time to be socially horrifying. "Their accomplishment is well over a hundred per cent of their ability on account of their tremendous energy and ambition," the dean of Columbia College said of the insurgents from Brooklyn, the Bronx, and the Lower East Side. He wasn't being complimentary. Goliath does not simply dwarf David. He brings the full force of social convention against him; he has contempt for David.
"In the beginning, everyone laughed at our fleet," Lenat said. "It was really embarrassing. People felt sorry for us. But somewhere around the third round they stopped laughing, and some time around the fourth round they started complaining to the judges. When we won again, some people got very angry, and the tournament directors basically said that it was not really in the spirit of the tournament to have these weird computer-designed fleets winning. They said that if we entered again they would stop having the tournament. I decided the best thing to do was to graciously bow out."
It isn't surprising that the tournament directors found Eurisko's strategies beyond the pale. It's wrong to sink your own ships, they believed. And they were right. But let's remember who made that rule: Goliath. And let's remember why Goliath made that rule: when the world has to play on Goliath's terms, Goliath wins.
7.
The trouble for Redwood City started early in the regular season. The opposing coaches began to get angry. There was a sense that Redwood City wasn't playing fair—that it wasn't right to use the full-court press against twelve-year-old girls, who were just beginning to grasp the rudiments of the game. The point of basketball, the dissenting chorus said, was to learn basketball skills. Of course, you could as easily argue that in playing the press a twelve-year-old girl learned something much more valuable—that effort can trump ability and that conventions are made to be challenged. But the coaches on the other side of Redwood City's lopsided scores were disinclined to be so philosophical.
"There was one guy who wanted to have a fight with me in the parking lot," Ranadivé said. "He was this big guy. He obviously played football and basketball himself, and he saw that skinny, foreign guy beating him at his own game. He wanted to beat me up."
Roger Craig says that he was sometimes startled by what he saw. "The other coaches would be screaming at their girls, humiliating them, shouting at them. They would say to the refs—'That's a foul! That's a foul!' But we weren't fouling. We were just playing aggressive defense."
"My girls were all blond-haired white girls," Ranadivé said. "My daughter is the closest we have to a black girl, because she's half-Indian. One time, we were playing this all-black team from East San Jose. They had been playing for years. These were born-with-a-basketball girls. We were just crushing them. We were up something like twenty to zero. We wouldn't even let them inbound the ball, and the coach got so mad that he took a chair and threw it. He started screaming at his girls, and of course the more you scream at girls that age the more nervous they get." Ranadivé shook his head: never, ever raise your voice. "Finally, the ref physically threw him out of the building. I was afraid. I think he couldn't stand it because here were all these blond-haired girls who were clearly inferior players, and we were killing them."
At the nationals, the Redwood City girls won their first two games. In the third round, their opponents were from somewhere deep in Orange County. Redwood City had to play them on their own court, and the opponents supplied their own referee as well. The game was at eight o'clock in the morning. The Redwood City players left their hotel at six, to beat the traffic. It was downhill from there. The referee did not believe in "One, two, three, attitude HAH." He didn't think that playing to deny the inbounds pass was basketball. He began calling one foul after another.
"They were touch fouls," Craig said. Ticky-tacky stuff. The memory was painful.
"My girls didn't understand," Ranadivé said. "The ref called something like four times as many fouls on us as on the other team."
"People were booing," Craig said. "It was bad."
"A two-to-one ratio is understandable, but a ratio of four to one?" Ranadivé shook his head.
"One girl fouled out."
"We didn't get blown out. There was still a chance to win. But . . ."
RanadivĂ© called the press off. He had to. The Redwood City players retreated to their own end, and passively watched as their opponents advanced down the court. They did not run. They paused and deliberated between each possession. They played basketball the way basketball is supposed to be played, and they lost—but not before making Goliath wonder whether he was a giant, after all.
Priced to Sell
July 6, 2009
Books
Is free the future?
1.
At a hearing on Capitol Hill in May, James Moroney, the publisher of the Dallas Morning News, told Congress about negotiations he'd just had with the online retailer Amazon. The idea was to license his newspaper's content to the Kindle, Amazon's new electronic reader. "They want seventy per cent of the subscription revenue," Moroney testified. "I get thirty per cent, they get seventy per cent. On top of that, they have said we get the right to republish your intellectual property to any portable device." The idea was that if a Kindle subscription to the Dallas Morning News cost ten dollars a month, seven dollars of that belonged to Amazon, the provider of the gadget on which the news was read, and just three dollars belonged to the newspaper, the provider of an expensive and ever-changing variety of editorial content. The people at Amazon valued the newspaper's contribution so little, in fact, that they felt they ought then to be able to license it to anyone else they wanted. Another witness at the hearing, Arianna Huffington, of the Huffington Post, said that she thought the Kindle could provide a business model to save the beleaguered newspaper industry. Moroney disagreed. "I get thirty per cent and they get the right to license my content to any portable device—not just ones made by Amazon?" He was incredulous. "That, to me, is not a model."
Had James Moroney read Chris Anderson's new book, "Free: The Future of a Radical Price" (Hyperion; $26.99), Amazon's offer might not have seemed quite so surprising. Anderson is the editor of Wired and the author of the 2006 best-seller "The Long Tail," and "Free" is essentially an extended elaboration of Stewart Brand's famous declaration that "information wants to be free." The digital age, Anderson argues, is exerting an inexorable downward pressure on the prices of all things "made of ideas." Anderson does not consider this a passing trend. Rather, he seems to think of it as an iron law: "In the digital realm you can try to keep Free at bay with laws and locks, but eventually the force of economic gravity will win." To musicians who believe that their music is being pirated, Anderson is blunt. They should stop complaining, and capitalize on the added exposure that piracy provides by making money through touring, merchandise sales, and "yes, the sale of some of [their] music to people who still want CDs or prefer to buy their music online." To the Dallas Morning News, he would say the same thing. Newspapers need to accept that content is never again going to be worth what they want it to be worth, and reinvent their business. "Out of the bloodbath will come a new role for professional journalists," he predicts, and he goes on:
There may be more of them, not fewer, as the ability to participate in journalism extends beyond the credentialed halls of traditional media. But they may be paid far less, and for many it won't be a full time job at all. Journalism as a profession will share the stage with journalism as an avocation. Meanwhile, others may use their skills to teach and organize amateurs to do a better job covering their own communities, becoming more editor/coach than writer. If so, leveraging the Free—paying people to get other people to write for non-monetary rewards—may not be the enemy of professional journalists. Instead, it may be their salvation.
Anderson is very good at paragraphs like this—with its reassuring arc from "bloodbath" to "salvation." His advice is pithy, his tone uncompromising, and his subject matter perfectly timed for a moment when old-line content providers are desperate for answers. That said, it is not entirely clear what distinction is being marked between "paying people to get other people to write" and paying people to write. If you can afford to pay someone to get other people to write, why can't you pay people to write? It would be nice to know, as well, just how a business goes about reorganizing itself around getting people to work for "non-monetary rewards." Does he mean that the New York Times should be staffed by volunteers, like Meals on Wheels? Anderson's reference to people who "prefer to buy their music online" carries the faint suggestion that refraining from theft should be considered a mere preference. And then there is his insistence that the relentless downward pressure on prices represents an iron law of the digital economy. Why is it a law? Free is just another price, and prices are set by individual actors, in accordance with the aggregated particulars of marketplace power. "Information wants to be free," Anderson tells us, "in the same way that life wants to spread and water wants to run downhill." But information can't actually want anything, can it? Amazon wants the information in the Dallas paper to be free, because that way Amazon makes more money. Why are the self-interested motives of powerful companies being elevated to a philosophical principle? But we are getting ahead of ourselves.
2.
Anderson's argument begins with a technological trend. The cost of the building blocks of all electronic activity—storage, processing, and bandwidth—has fallen so far that it is now approaching zero. In 1961, Anderson says, a single transistor was ten dollars. In 1963, it was five dollars. By 1968, it was one dollar. Today, Intel will sell you two billion transistors for eleven hundred dollars—meaning that the cost of a single transistor is now about .000055 cents.
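Anderson's per-transistor figure checks out. Here, for the record, is the arithmetic as a minimal Python sketch; the inputs are the two numbers quoted above, and the variable names are ours, for illustration only.

    # Back-of-the-envelope check of the per-transistor cost cited above.
    bulk_price_dollars = 1100          # Intel's quoted price for the bulk order
    transistor_count = 2_000_000_000   # two billion transistors

    cost_in_dollars = bulk_price_dollars / transistor_count
    cost_in_cents = cost_in_dollars * 100

    print(f"{cost_in_cents:.6f} cents per transistor")  # prints "0.000055 cents per transistor"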
Anderson's second point is that when prices hit zero extraordinary things happen. Anderson describes an experiment conducted by the M.I.T. behavioral economist Dan Ariely, the author of "Predictably Irrational." Ariely offered a group of subjects a choice between two kinds of chocolate—Hershey's Kisses, for one cent, and Lindt truffles, for fifteen cents. Three-quarters of the subjects chose the truffles. Then he redid the experiment, reducing the price of both chocolates by one cent. The Kisses were now free. What happened? The order of preference was reversed. Sixty-nine per cent of the subjects chose the Kisses. The price difference between the two chocolates was exactly the same, but that magic word "free" has the power to create a consumer stampede. Amazon has had the same experience with its offer of free shipping for orders over twenty-five dollars. The idea is to induce you to buy a second book, if your first book comes in at less than the twenty-five-dollar threshold. And that's exactly what it does. In France, however, the offer was mistakenly set at the equivalent of twenty cents—and consumers didn't buy the second book. "From the consumer's perspective, there is a huge difference between cheap and free," Anderson writes. "Give a product away, and it can go viral. Charge a single cent for it and you're in an entirely different business. . . . The truth is that zero is one market and any other price is another."
Since the falling costs of digital technology let you make as much stuff as you want, Anderson argues, and the magic of the word "free" creates instant demand among consumers, then Free (Anderson honors it with a capital) represents an enormous business opportunity. Companies ought to be able to make huge amounts of money "around" the thing being given away—as Google gives away its search and e-mail and makes its money on advertising.
Anderson cautions that this philosophy of embracing the Free involves moving from a "scarcity" mind-set to an "abundance" mind-set. Giving something away means that a lot of it will be wasted. But because it costs almost nothing to make things, digitally, we can afford to be wasteful. The elaborate mechanisms we set up to monitor and judge the quality of content are, Anderson thinks, artifacts of an era of scarcity: we had to worry about how to allocate scarce resources like newsprint and shelf space and broadcast time. Not anymore. Look at YouTube, he says, the free video archive owned by Google. YouTube lets anyone post a video to its site free, and lets anyone watch a video on its site free, and it doesn't have to pass judgment on the quality of the videos it archives. "Nobody is deciding whether a video is good enough to justify the scarce channel space it takes, because there is no scarce channel space," he writes, and goes on:
Distribution is now close enough to free to round down. Today, it costs about $0.25 to stream one hour of video to one person. Next year, it will be $0.15. A year later it will be less than a dime. Which is why YouTube's founders decided to give it away. . . . The result is both messy and runs counter to every instinct of a television professional, but this is what abundance both requires and demands.
There are four strands of argument here: a technological claim (digital infrastructure is effectively Free), a psychological claim (consumers love Free), a procedural claim (Free means never having to make a judgment), and a commercial claim (the market created by the technological Free and the psychological Free can make you a lot of money). The only problem is that in the middle of laying out what he sees as the new business model of the digital age Anderson is forced to admit that one of his main case studies, YouTube, "has so far failed to make any money for Google."
Why is that? Because of the very principles of Free that Anderson so energetically celebrates. When you let people upload and download as many videos as they want, lots of them will take you up on the offer. That's the magic of Free psychology: an estimated seventy-five billion videos will be served up by YouTube this year. Although the magic of Free technology means that the cost of serving up each video is "close enough to free to round down," "close enough to free" multiplied by seventy-five billion is still a very large number. A recent report by Credit Suisse estimates that YouTube's bandwidth costs in 2009 will be three hundred and sixty million dollars. In the case of YouTube, the effects of technological Free and psychological Free work against each other.
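The collision between the two kinds of Free is, at bottom, a multiplication problem, and it can be sketched in a few lines of Python. The inputs are the Credit Suisse estimates quoted above; the implied per-video cost is our own derivation, not a figure from the report.

    # "Close enough to free," multiplied by seventy-five billion.
    videos_served = 75_000_000_000        # estimated videos served in 2009
    bandwidth_cost_dollars = 360_000_000  # Credit Suisse's estimated 2009 bandwidth bill

    cost_per_video = bandwidth_cost_dollars / videos_served
    print(f"${cost_per_video:.4f} per video served")  # about half a cent
    print(f"${bandwidth_cost_dollars:,} in total")    # $360,000,000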
So how does YouTube bring in revenue? Well, it tries to sell advertisements alongside its videos. The problem is that the videos attracted by psychological Free—pirated material, cat videos, and other forms of user-generated content—are not the sort of thing that advertisers want to be associated with. In order to sell advertising, YouTube has had to buy the rights to professionally produced content, such as television shows and movies. Credit Suisse put the cost of those licenses in 2009 at roughly two hundred and sixty million dollars. For Anderson, YouTube illustrates the principle that Free removes the necessity of aesthetic judgment. (As he puts it, YouTube proves that "crap is in the eye of the beholder.") But, in order to make money, YouTube has been obliged to pay for programs that aren't crap. To recap: YouTube is a great example of Free, except that Free technology ends up not being Free because of the way consumers respond to Free, fatally compromising YouTube's ability to make money around Free, and forcing it to retreat from the "abundance thinking" that lies at the heart of Free. Credit Suisse estimates that YouTube will lose close to half a billion dollars this year. If it were a bank, it would be eligible for TARP funds.
3.
Anderson begins the second part of his book by quoting Lewis Strauss, the former head of the Atomic Energy Commission, who famously predicted in the mid-nineteen-fifties that "our children will enjoy in their homes electrical energy too cheap to meter."
"What if Strauss had been right?" Anderson wonders, and then diligently sorts through the implications: as much fresh water as you could want, no reliance on fossil fuels, no global warming, abundant agricultural production. Anderson wants to take "too cheap to meter" seriously, because he believes that we are on the cusp of our own "too cheap to meter" revolution with computer processing, storage, and bandwidth. But here is the second and broader problem with Anderson's argument: he is asking the wrong question. It is pointless to wonder what would have happened if Strauss's prediction had come true while rushing past the reasons that it could not have come true.
Strauss's optimism was driven by the fuel cost of nuclear energy—which was so low compared with its fossil-fuel counterparts that he considered it (to borrow Anderson's phrase) close enough to free to round down. Generating and distributing electricity, however, requires a vast and expensive infrastructure of transmission lines and power plants—and it is this infrastructure that accounts for most of the cost of electricity. Fuel prices are only a small part of that. As Gordon Dean, Strauss's predecessor at the A.E.C., wrote, "Even if coal were mined and distributed free to electric generating plants today, the reduction in your monthly electricity bill would amount to but twenty per cent, so great is the cost of the plant itself and the distribution system."
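Dean's point is about cost structure, and it is easy to make concrete. In the sketch below, the twenty-per-cent fuel share is his figure; the monthly bill is a hypothetical number, chosen only to make the arithmetic visible.

    # Sketch of Dean's argument: if fuel is a small share of the cost of
    # delivered electricity, then free fuel cannot make electricity free.
    monthly_bill = 100.00   # hypothetical bill, in dollars
    fuel_share = 0.20       # fuel's share of the total cost, per Dean

    bill_with_free_fuel = monthly_bill * (1 - fuel_share)
    print(bill_with_free_fuel)  # 80.0: the bill falls by only twenty per cent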
This is the kind of error that technological utopians make. They assume that their particular scientific revolution will wipe away all traces of its predecessors—that if you change the fuel you change the whole system. Strauss went on to forecast "an age of peace," jumping from atoms to human hearts. "As the world of chips and glass fibers and wireless waves goes, so goes the rest of the world," Kevin Kelly, another Wired visionary, proclaimed at the start of his 1998 digital manifesto, "New Rules for the New Economy," offering up the same non sequitur. And now comes Anderson. "The more products are made of ideas, rather than stuff, the faster they can get cheap," he writes, and we know what's coming next: "However, this is not limited to digital products." Just look at the pharmaceutical industry, he says. Genetic engineering means that drug development is poised to follow the same learning curve of the digital world, to "accelerate in performance while it drops in price."
But, like Strauss, he's forgotten about the plants and the power lines. The expensive part of making drugs has never been what happens in the laboratory. It's what happens after the laboratory, like the clinical testing, which can take years and cost hundreds of millions of dollars. In the pharmaceutical world, what's more, companies have chosen to use the potential of new technology to do something very different from their counterparts in Silicon Valley. They've been trying to find a way to serve smaller and smaller markets—to create medicines tailored to very specific subpopulations and strains of diseases—and smaller markets often mean higher prices. The biotechnology company Genzyme spent five hundred million dollars developing the drug Myozyme, which is intended for a condition, Pompe disease, that afflicts fewer than ten thousand people worldwide. That's the quintessential modern drug: a high-tech, targeted remedy that took a very long and costly path to market. Myozyme is priced at three hundred thousand dollars a year. Genzyme isn't a mining company: its real assets are intellectual property—information, not stuff. But, in this case, information does not want to be free. It wants to be really, really expensive.
And there's plenty of other information out there that has chosen to run in the opposite direction from Free. The Times gives away its content on its Web site. But the Wall Street Journal has found that more than a million subscribers are quite happy to pay for the privilege of reading online. Broadcast television—the original practitioner of Free—is struggling. But premium cable, with its stiff monthly charges for specialty content, is doing just fine. Apple may soon make more money selling iPhone downloads (ideas) than it does from the iPhone itself (stuff). The company could one day give away the iPhone to boost downloads; it could give away the downloads to boost iPhone sales; or it could continue to do what it does now, and charge for both. Who knows? The only iron law here is the one too obvious to write a book about, which is that the digital age has so transformed the ways in which things are made and sold that there are no iron laws.
Cocksure
July 27, 2009
Dept. of Finance
Banks, battles, and the psychology of overconfidence.
1.
In 1996, an investor named Henry de Kwiatkowski sued Bear Stearns for negligence and breach of fiduciary duty. De Kwiatkowski had made—and then lost—hundreds of millions of dollars by betting on the direction of the dollar, and he blamed his bankers for his reversals. The district court ruled in de Kwiatkowski's favor, ultimately awarding him $164.5 million in damages. But Bear Stearns appealed—successfully—and in William D. Cohan's engrossing account of the fall of Bear Stearns, "House of Cards," the firm's former chairman and C.E.O. Jimmy Cayne tells the story of what happened on the day of the hearing:
Their lead lawyer turned out to be about a 300-pound fag from Long Island . . . a really irritating guy who had cross-examined me and tried to kick the shit out of me in the lower court trial. Now when we walk into the courtroom for the appeal, they're arguing another case and we have to wait until they're finished. And I stopped this guy. I had to take a piss. I went into the bathroom to take a piss and came back and sat down. Then I see my blood enemy stand up and he's going to the bathroom. So I wait till he passes and then I follow him in and it's just he and I in the bathroom. And I said to him, "Today you're going to get your ass kicked, big." He ran out of the room. He thought I might have wanted to start it right there and then.
At the time Cayne said this, Bear Stearns had spectacularly collapsed. The eighty-five-year-old investment bank, with its shiny new billion-dollar headquarters and its storied history, was swallowed whole by J. P. Morgan Chase. Cayne himself had lost close to a billion dollars. His reputation—forty years in the making—was in ruins, especially when it came out that, during Bear's final, critical months, he'd spent an inordinate amount of time on the golf course.
Did Cayne think long and hard about how he wanted to make his case to Cohan? He must have. Cayne understood selling; he started out as a photocopier salesman, working the nine-hundred-mile stretch between Boise and Salt Lake City, and ended up among the highest-paid executives in banking. He was known as one of the savviest men on the Street, a master tactician, a brilliant gamesman. "Jimmy had it all," Bill Bamber, a former Bear senior managing director, writes in "Bear Trap: The Fall of Bear Stearns and the Panic of 2008" (a book co-written by Andrew Spencer). "The ability to read an opponent. The ability to objectively analyze his own strengths and weaknesses. . . . He knew how to exploit others' weaknesses—and their strengths, for that matter—as a way to further his own gain. He knew when to take his losses and live to fight another day."
Cohan asked Cayne about the last days of Bear Stearns, in the spring of 2008. Wall Street had become so spooked by rumors about the firm's financial status that investors withdrew their capital, and no one would lend Bear the money required for its day-to-day operations. The bank received some government money, via J. P. Morgan. But Timothy Geithner, then the head of the New York Federal Reserve Bank, didn't open the Fed's so-called "discount window" to investment banks until J. P. Morgan's acquisition of Bear was under way. What did Cayne think of Geithner? Picture the scene. The journalist in one chair, Cayne in another. Between them, a tape recorder. And the savviest man on Wall Street sets out to salvage his good name:
The audacity of that prick in front of the American people announcing he was deciding whether or not a firm of this stature and this whatever was good enough to get a loan. Like he was the determining factor, and it's like a flea on his back, floating down underneath the Golden Gate Bridge, getting a hard-on, saying, "Raise the bridge." This guy thinks he's got a big dick. He's got nothing, except maybe a boyfriend.
2.
Since the beginning of the financial crisis, there have been two principal explanations for why so many banks made such disastrous decisions. The first is structural. Regulators did not regulate. Institutions failed to function as they should. Rules and guidelines were either inadequate or ignored. The second explanation is that Wall Street was incompetent, that the traders and investors didn't know enough, that they made extravagant bets without understanding the consequences. But the first wave of postmortems on the crash suggests a third possibility: that the roots of Wall Street's crisis were not structural or cognitive so much as they were psychological.
In "Military Misfortunes," the historians Eliot Cohen and John Gooch offer, as a textbook example of this kind of failure, the British-led invasion of Gallipoli, in 1915. Gallipoli is a peninsula in southern Turkey, jutting out into the Aegean. The British hoped that by landing an army there they could make an end run around the stalemate on the Western Front, and give themselves a clear shot at the soft underbelly of Germany. It was a brilliant and daring strategy. "In my judgment, it would have produced a far greater effect upon the whole conduct of the war than anything [else]," the British Prime Minister H. H. Asquith later concluded. But the invasion ended in disaster, and Cohen and Gooch find the roots of that disaster in the curious complacency displayed by the British.
The invasion required a large-scale amphibious landing, something the British had little experience with. It then required combat against a foe dug into ravines and rocky outcroppings and hills and thickly vegetated landscapes that Cohen and Gooch call "one of the finest natural fortresses in the world." Yet the British never bothered to draw up a formal plan of operations. The British military leadership had originally estimated that the Allies would need a hundred and fifty thousand troops to take Gallipoli. Only seventy thousand were sent. The British troops should have had artillery—more than three hundred guns. They took a hundred and eighteen, and, for the most part, neglected to bring howitzers, trench mortars, or grenades. Command of the landing at Suvla Bay—the most critical element of the attack—was given to Frederick Stopford, a retired officer whose experience was largely administrative. Stopford had two days during which he had a ten-to-one advantage over the Turks and could easily have seized the highlands overlooking the bay. Instead, his troops lingered on the beach, while Stopford lounged offshore, aboard a command ship. Winston Churchill later described the scene as "the placid, prudent, elderly English gentleman with his 20,000 men spread around the beaches, the front lines sitting on the tops of shallow trenches, smoking and cooking, with here and there an occasional rifle shot, others bathing by hundreds in the bright blue bay where, disturbed hardly by a single shell, floated the great ships of war." When word of Stopford's ineptitude reached the British commander, Sir Ian Hamilton, he rushed to Suvla Bay to intercede—although "rushed" may not be quite the right word here, since Hamilton had chosen to set up his command post on an island an hour away and it took him a good while to find a boat to take him to the scene.
Cohen and Gooch ascribe the disaster at Gallipoli to a failure to adapt—a failure to take into account how reality did not conform to their expectations. And behind that failure to adapt was a deeply psychological problem: the British simply couldn't wrap their heads around the fact that they might have to adapt. "Let me bring my lads face to face with Turks in the open field," Hamilton wrote in his diary before the attack. "We must beat them every time because British volunteer soldiers are superior individuals to Anatolians, Syrians or Arabs and are animated with a superior ideal and an equal joy in battle."
Hamilton was not a fool. Cohen and Gooch call him an experienced and "brilliant commander who was also a first-rate trainer of men and a good organizer." Nor was he entirely wrong in his assessments. The British probably were a superior fighting force. Certainly they were more numerous, especially when they held that ten-to-one advantage at Suvla Bay. Hamilton, it seems clear, was simply overconfident—and one of the things that happen to us when we become overconfident is that we start to blur the line between the kinds of things that we can control and the kinds of things that we can't. The psychologist Ellen Langer once had subjects engage in a betting game against either a self-assured, well-dressed opponent or a shy and badly dressed opponent (in Langer's delightful phrasing, the "dapper" or the "schnook" condition), and she found that her subjects bet far more aggressively when they played against the schnook. They looked at their awkward opponent and thought, I'm better than he is. Yet the game was pure chance: all the players did was draw cards at random from a deck, and see who had the high hand. This is called the "illusion of control": confidence spills over from areas where it may be warranted ("I'm savvier than that schnook") to areas where it isn't warranted at all ("and that means I'm going to draw higher cards").
At Gallipoli, the British acted as if their avowed superiority over the Turks gave them superiority over all aspects of the contest. They neglected to take into account the fact that the morning sun would be directly in the eyes of the troops as they stormed ashore. They didn't bring enough water. They didn't factor in the harsh terrain. "The attack was based on two assumptions," Cohen and Gooch write, "both of which turned out to be unwise: that the only really difficult part of the operation would be getting ashore, after which the Turks could easily be pushed off the peninsula; and that the main obstacles to a happy landing would be provided by the enemy."
Most people are inclined to use moral terms to describe overconfidence—terms like "arrogance" or "hubris." But psychologists tend to regard overconfidence as a state as much as a trait. The British at Gallipoli were victims of a situation that promoted overconfidence. Langer didn't say that it was only arrogant gamblers who upped their bets in the presence of the schnook. She argues that this is what competition does to all of us; because ability makes a difference in competitions of skill, we make the mistake of thinking that it must also make a difference in competitions of pure chance. Other studies have reached similar conclusions. As novices, we don't trust our judgment. Then we have some success, and begin to feel a little surer of ourselves. Finally, we get to the top of our game and succumb to the trap of thinking that there's nothing we can't master. As we get older and more experienced, we overestimate the accuracy of our judgments, especially when the task before us is difficult and when we're involved with something of great personal importance. The British were overconfident at Gallipoli not because Gallipoli didn't matter but, paradoxically, because it did; it was a high-stakes contest, of daunting complexity, and it is often in those circumstances that overconfidence takes root.
Several years ago, a team headed by the psychologist Mark Fenton-O'Creevy created a computer program that mimicked the ups and downs of an index like the Dow, and recruited, as subjects, members of a highly paid profession. As the line moved across the screen, Fenton-O'Creevy asked his subjects to press a series of buttons, which, they were told, might or might not affect the course of the line. At the end of the session, they were asked to rate their effectiveness in moving the line upward. The buttons had no effect at all on the line. But many of the players were convinced that their manipulation of the buttons made the index go up and up. The world these people inhabited was competitive and stressful and complex. They had been given every reason to be confident in their own judgments. If they sat down next to you, with a tape recorder, it wouldn't take much for them to believe that they had you in the palm of their hand. They were traders at an investment bank.
3.
The high-water mark for Bear Stearns was 2003. The dollar was falling. A wave of scandals had just swept through the financial industry. The stock market was in a swoon. But Bear Stearns was an exception. In the first quarter of that year, its earnings jumped fifty-five per cent. Its return on equity was the highest on Wall Street. The firm's mortgage business was booming. Since Bear Stearns's founding, in 1923, it had always been a kind of also-ran to its more blue-chip counterparts, like Goldman Sachs and Morgan Stanley. But that year Fortune named it the best financial company to work for. "We are hitting on all 99 cylinders," Jimmy Cayne told a reporter for the Times, in the spring of that year, "so you have to ask yourself, What can we do better? And I just can't decide what that might be." He went on, "Everyone says that when the markets turn around, we will suffer. But let me tell you, we are going to surprise some people this time around. Bear Stearns is a great place to be."
With the benefit of hindsight, Cayne's words read like the purest hubris. But in 2003 they would have seemed banal. These are the kinds of things that bankers say. More precisely—and here is where psychological failure becomes more problematic still—these are the kinds of things that bankers are expected to say. Investment banks are able to borrow billions of dollars and make huge trades because, at the end of the day, their counterparties believe they are capable of making good on their promises. Wall Street is a confidence game, in the strictest sense of that phrase.
This is what social scientists mean when they say that human overconfidence can be an "adaptive trait." "In conflicts involving mutual assessment, an exaggerated assessment of the probability of winning increases the probability of winning," Richard Wrangham, a biological anthropologist at Harvard, writes. "Selection therefore favors this form of overconfidence." Winners know how to bluff. And who bluffs the best? The person who, instead of pretending to be stronger than he is, actually believes himself to be stronger than he is. According to Wrangham, self-deception reduces the chances of "behavioral leakage"; that is, of "inadvertently revealing the truth through an inappropriate behavior." This much is in keeping with what some psychologists have been telling us for years—that it can be useful to be especially optimistic about how attractive our spouse is, or how marketable our new idea is. In the words of the social psychologist Roy Baumeister, humans have an "optimal margin of illusion."
If you were a Wall Street C.E.O., there were two potential lessons to be drawn from the collapse of Bear Stearns. The first was that Jimmy Cayne was overconfident. The second was that Jimmy Cayne wasn't overconfident enough. Bear Stearns did not collapse, after all, simply because it had made bad bets. Until very close to the end, the firm had a capital cushion of more than seventeen billion dollars. The problem was that when, in early 2008, Cayne and his colleagues stood up and said that Bear was a great place to be, the rest of Wall Street no longer believed them. Clients withdrew their money, and lenders withheld funding. As the run on Bear Stearns worsened, J. P. Morgan and the Fed threw the bank a lifeline—a multibillion-dollar line of credit. But confidence matters so much on Wall Street that the lifeline had the opposite of its intended effect. As Bamber writes:
This line-of-credit, the stop-gap measure that was supposed to solve the problem that hadn't really existed in the first place had done nothing but worsen it. When we started the week, we had no liquidity issues. But because people had said that we did have problems with our capital, it became true, even though it wasn't true when people started saying it. . . . So we were forced to find capital to offset the losses we'd sustained because somebody decided we didn't have capital when we really did. So when we finally got more capital to replace the capital we'd lost, people took that as a bad sign and pointed to the fact that we'd had no capital and had to get a loan to cover it, even when we did have the capital they said we didn't have.
Of course, one reason that overconfidence is so difficult to eradicate from expert fields like finance is that, at least some of the time, it's useful to be overconfident—or, more precisely, sometimes the only way to get out of the problems caused by overconfidence is to be even more overconfident.
From an individual perspective, it is hard to distinguish between the times when excessive optimism is good and the times when it isn't. All that we can say unequivocally is that overconfidence is, as Wrangham puts it, "globally maladaptive." When one opponent bluffs, he can score an easy victory. But when everyone bluffs, Wrangham writes, rivals end up "escalating conflicts that only one can win and suffering higher costs than they should if assessment were accurate." The British didn't just think the Turks would lose in Gallipoli; they thought that Belgium would prove to be an obstacle to Germany's advance, and that the Russians would crush the Germans in the east. The French, for their part, planned to be at the Rhine within six weeks of the start of the war, while the Germans predicted that by that point they would be on the outskirts of Paris. Every side in the First World War was bluffing, with the resolve and skill that only the deluded are capable of, and the results, of course, were catastrophic.
4.
Jimmy Cayne grew up in Chicago, the son of a patent lawyer. He wanted to be a bookie, but he realized that it wasn't quite respectable enough. He went to Purdue University to study mechanical engineering—and became hooked on bridge. His grades suffered, and he never graduated. He got married in 1956 and was divorced within four years. "At this time, he was one of the best bridge players in Chicago," his ex-brother-in-law told Cohan. "In fact, that's the reason for the divorce. There was no other woman or anything like that. The co-respondent in their divorce was bridge. He spent all of his time playing bridge—every night. He wasn't home." He was selling scrap metal in those days, and, Cohan says, he would fall asleep on the job, exhausted from playing cards. In 1964, he moved to New York to become a professional bridge player. It was bridge that led him to his second wife, and to a job interview with Alan (Ace) Greenberg, then a senior executive at Bear Stearns. When Cayne told Greenberg that he was a bridge player, Cayne tells Cohan, "you could see the electric light bulb." Cayne goes on:
[Greenberg] says, "How well do you play?" I said, "I play well." He said, "Like how well?" I said, "I play quite well." He says, "You don't understand." I said, "Yeah, I do. I understand. Mr. Greenberg, if you study bridge the rest of your life, if you play with the best partners and you achieve your potential, you will never play bridge like I play bridge."
Right then and there, Cayne says, Greenberg offered him a job.
Twenty years later, the scene was repeated with Warren Spector, who went on to become a co-president of the firm. Spector had been a bridge champion as a student, and Cayne somehow heard about it. "Suddenly, out of nowhere there's a bridge player at Bear Stearns on the bond desk," Cayne recalls. Spector tells Cohan, "He called me up and said, 'Are you a bridge player?' I said, 'I used to be.' So bridge was something that he, Ace, and I all shared and talked about." As reports circulated that two of Bear Stearns's hedge funds were going under—a failure that started the bank on its long, downward spiral into collapse—Spector and Cayne were attending the Spingold K.O. bridge tournament, in Nashville. The Wall Street Journal reported that, of the twenty-one workdays that month, Cayne was out of the office for nearly half of them.
It makes sense that there should be an affinity between bridge and the business of Wall Street. Bridge is a contest between teams, each of which competes over a contract—how many tricks they think they can win in a given hand. Winning requires knowledge of the cards, an accurate sense of probabilities, steely nerves, and the ability to assess an opponent's psychology. Bridge is Wall Street in miniature, and the reason the light bulb went on when Greenberg looked at Cayne, and Cayne looked at Spector, is surely that they assumed that bridge skills could be transferred to the trading floor—that being good at the game version of Wall Street was a reasonable proxy for being good at the real-life version of Wall Street.
It isn't, however. In bridge, there is such a thing as expertise unencumbered by bias. That's because, as the psychologist Gideon Keren points out, bridge involves "related items with continuous feedback." It has rules and boundaries and situations that repeat themselves and clear patterns that develop—and when a player makes a mistake of overconfidence he or she learns of the consequences of that mistake almost immediately. In other words, it's a game. But running an investment bank is not, in this sense, a game: it is not a closed world with a limited set of possibilities. It is an open world where one day a calamity can happen that no one had dreamed could happen, and where you can make a mistake of overconfidence and not personally feel the consequences for years and years—if at all. Perhaps this is part of why we play games: there is something intoxicating about pure expertise, and the real mastery we can attain around a card table or behind the wheel of a racecar emboldens us when we move into the more complex realms. "I'm good at that. I must be good at this, too," we tell ourselves, forgetting that in wars and on Wall Street there is no such thing as absolute expertise, that every step taken toward mastery brings with it an increased risk of mastery's curse. Cayne must have come back from the Spingold bridge tournament fortified in his belief in his own infallibility. And the striking thing about his conversations with Cohan is that nothing that had happened since seemed to have shaken that belief.
"When I left," Cayne told Cohan, speaking of his final day at Bear Stearns, "I had three different meetings. The first was with the president's advisory group, which was about eighty people. There wasn't a dry eye. Standing ovation. I was crying." Until the very end, he evidently saw the world that he wanted to see. "The second meeting was with the retail sales force on the Web," he goes on. "Standing ovation. And the third was a partners' meeting that night for me to tell them that I was stepping down. Standing ovation, of the whole auditorium."
January 22, 1996
DEPT. OF DISPUTATION
Who can be blamed for a disaster like the
Challenger explosion, a decade ago? No one,
according to the new risk theorists, and
we'd better get used to it.
1.
In the technological age, there is a ritual to disaster. When planes crash or chemical plants explode, each piece of physical evidence-of twisted metal or fractured concrete- becomes a kind of fetish object, painstakingly located, mapped, tagged, and analyzed, with findings submitted to boards of inquiry that then probe and interview and soberly draw conclusions. It is a ritual of reassurance, based on the principle that what we learn from one accident can help us prevent another, and a measure of its effectiveness is that Americans did not shut down the nuclear industry after Three Mile Island and do not abandon the skies after each new plane crash. But the rituals of disaster have rarely been played out so dramatically as they were in the case of the Challenger space shuttle, which blew up over southern Florida on January 28th ten years ago.
Fifty-five minutes after the explosion, when the last of the debris had fallen into the ocean, recovery ships were on the scene. They remained there for the next three months, as part of what turned into the largest maritime salvage operation in history, combing a hundred and fifty thousand square nautical miles for floating debris, while the ocean floor surrounding the crash site was inspected by submarines. In mid-April of 1986, the salvage team found several chunks of charred metal that confirmed what had previously been only suspected: the explosion was caused by a faulty seal in one of the shuttle's rocket boosters, which had allowed a stream of flame to escape and ignite an external fuel tank.
Armed with this confirmation, a special Presidential investigative commission concluded the following June that the deficient seal reflected shoddy engineering and lax management at NASA and its prime contractor, Morton Thiokol. Properly chastised, NASA returned to the drawing board, to emerge thirty-two months later with a new shuttle-Discovery-redesigned according to the lessons learned from the disaster. During that first post- Challenger flight, as America watched breathlessly, the crew of the Discovery held a short commemorative service. "Dear friends," the mission commander, Captain Frederick H. Hauck, said, addressing the seven dead Challenger astronauts, "your loss has meant that we could confidently begin anew." The ritual was complete. NASA was back.
But what if the assumptions that underlie our disaster rituals aren't true? What if these public post mortems don't help us avoid future accidents? Over the past few years, a group of scholars has begun making the unsettling argument that the rituals that follow things like plane crashes or the Three Mile Island crisis are as much exercises in self-deception as they are genuine opportunities for reassurance. For these revisionists, high-technology accidents may not have clear causes at all. They may be inherent in the complexity of the technological systems we have created.
This month, on the tenth anniversary of the Challenger disaster, such revisionism has been extended to the space shuttle with the publication, by the Boston College sociologist Diane Vaughan, of "The Challenger Launch Decision" (Chicago), which is the first truly definitive analysis of the events leading up to January 28, 1986. The conventional view is that the Challenger accident was an anomaly, that it happened because people at NASA had not done their job. But the study's conclusion is the opposite: it says that the accident happened because people at NASA had done exactly what they were supposed to do. "No fundamental decision was made at NASA to do evil," Vaughan writes. "Rather, a series of seemingly harmless decisions were made that incrementally moved the space agency toward a catastrophic outcome."
No doubt Vaughan's analysis will be hotly disputed in the coming months, but even if she is only partly right the implications of this kind of argument are enormous. We have surrounded ourselves in the modern age with things like power plants and nuclear-weapons systems and airports that handle hundreds of planes an hour, on the understanding that the risks they represent are, at the very least, manageable. But if the potential for catastrophe is actually found in the normal functioning of complex systems, this assumption is false. Risks are not easily manageable, accidents are not easily preventable, and the rituals of disaster have no meaning. The first time around, the story of the Challenger was tragic. In its retelling, a decade later, it is merely banal.
2.
Perhaps the best way to understand the argument over the Challenger explosion is to start with an accident that preceded it—the near-disaster at the Three Mile Island (T.M.I.) nuclear-power plant in March of 1979. The conclusion of the President's commission that investigated the T.M.I. accident was that it was the result of human error, particularly on the part of the plant's operators. But the truth of what happened there, the revisionists maintain, is a good deal more complicated than that, and their arguments are worth examining in detail.
The trouble at T.M.I. started with a blockage in what is called the plant's polisher—a kind of giant water filter. Polisher problems were not unusual at T.M.I., or particularly serious. But in this case the blockage caused moisture to leak into the plant's air system, inadvertently tripping two valves and shutting down the flow of cold water into the plant's steam generator.
As it happens, T.M.I. had a backup cooling system for precisely this situation. But on that particular day, for reasons that no one really knows, the valves for the backup system weren't open. They had been closed, and an indicator in the control room showing they were closed was blocked by a repair tag hanging from a switch above it. That left the reactor dependent on another backup system, a special sort of relief valve. But, as luck would have it, the relief valve wasn't working properly that day, either. It stuck open when it was supposed to close, and, to make matters even worse, a gauge in the control room which should have told the operators that the relief valve wasn't working was itself not working. By the time T.M.I.'s engineers realized what was happening, the reactor had come dangerously close to a meltdown.
Here, in other words, was a major accident caused by five discrete events. There is no way the engineers in the control room could have known about any of them. No glaring errors or spectacularly bad decisions were made that exacerbated those events. And all the malfunctions—the blocked polisher, the shut valves, the obscured indicator, the faulty relief valve, and the broken gauge—were in themselves so trivial that individually they would have created no more than a nuisance. What caused the accident was the way minor events unexpectedly interacted to create a major problem.
This kind of disaster is what the Yale University sociologist Charles Perrow has famously called a "normal accident." By "normal" Perrow does not mean that it is frequent; he means that it is the kind of accident one can expect in the normal functioning of a technologically complex operation. Modern systems, Perrow argues, are made up of thousands of parts, all of which interrelate in ways that are impossible to anticipate. Given that complexity, he says, it is almost inevitable that some combinations of minor failures will eventually amount to something catastrophic. In a classic 1984 treatise on accidents, Perrow takes examples of well-known plane crashes, oil spills, chemical-plant explosions, and nuclear-weapons mishaps and shows how many of them are best understood as "normal." If you saw last year's hit movie "Apollo 13," in fact, you have seen a perfect illustration of one of the most famous of all normal accidents: the Apollo flight went awry because of the interaction of failures of the spacecraft's oxygen and hydrogen tanks, and an indicator light that diverted the astronauts' attention from the real problem.
Had this been a "real" accident—if the mission had run into trouble because of one massive or venal error—the story would have made for a much inferior movie. In real accidents, people rant and rave and hunt down the culprit. They do, in short, what people in Hollywood thrillers always do. But what made Apollo 13 unusual was that the dominant emotion was not anger but bafflement—bafflement that so much could go wrong for so little apparent reason. There was no one to blame, no dark secret to unearth, no recourse but to re-create an entire system in place of one that had inexplicably failed. In the end, the normal accident was the more terrifying one.
3.
Was the Challenger explosion a "normal accident"? In a narrow sense, the answer is no. Unlike what happened at T.M.I., its explosion was caused by a single, catastrophic malfunction: the so-called O-rings that were supposed to prevent hot gases from leaking out of the rocket boosters didn't do their job. But Vaughan argues that the O-ring problem was really just a symptom. The cause of the accident was the culture of NASA, she says, and that culture led to a series of decisions about the Challenger which very much followed the contours of a normal accident.
The heart of the question is how NASA chose to evaluate the problems it had been having with the rocket boosters' O-rings. These are the thin rubber bands that run around the lips of each of the rocket's four segments, and each O-ring was meant to work like the rubber seal on the top of a bottle of preserves, making the fit between each part of the rocket snug and airtight. But from as far back as 1981, on one shuttle flight after another, the O-rings had shown increasing problems. In a number of instances, the rubber seal had been dangerously eroded—a condition suggesting that hot gases had almost escaped. What's more, O-rings were strongly suspected to be less effective in cold weather, when the rubber would harden and not give as tight a seal. On the morning of January 28, 1986, the shuttle launchpad was encased in ice, and the temperature at liftoff was just above freezing. Anticipating these low temperatures, engineers at Morton Thiokol, the manufacturer of the shuttle's rockets, had recommended that the launch be delayed. Morton Thiokol brass and NASA, however, overruled the recommendation, and that decision led both the President's commission and numerous critics since to accuse NASA of egregious—if not criminal—misjudgment.
Vaughan doesn't dispute that the decision was fatally flawed. But, after reviewing thousands of pages of transcripts and internal NASA documents, she can't find any evidence of people acting negligently, or nakedly sacrificing safety in the name of politics or expediency. The mistakes that NASA made, she says, were made in the normal course of operation. For example, in retrospect it may seem obvious that cold weather impaired O-ring performance. But it wasn't obvious at the time. A previous shuttle flight that had suffered worse O-ring damage had been launched in seventy-five-degree heat. And on a series of previous occasions when NASA had proposed—but eventually scrubbed for other reasons—shuttle launches in weather as cold as forty-one degrees, Morton Thiokol had not said a word about the potential threat posed by the cold, so its pre-Challenger objection had seemed to NASA not reasonable but arbitrary. Vaughan confirms that there was a dispute between managers and engineers on the eve of the launch but points out that in the shuttle program disputes of this sort were commonplace. And, while the President's commission was astonished by NASA's repeated use of the phrases "acceptable risk" and "acceptable erosion" in internal discussion of the rocket-booster joints, Vaughan shows that flying with acceptable risks was a standard part of NASA culture. The lists of "acceptable risks" on the space shuttle, in fact, filled six volumes. "Although [O-ring] erosion itself had not been predicted, its occurrence conformed to engineering expectations about large-scale technical systems," she writes. "At NASA, problems were the norm. The word 'anomaly' was part of everyday talk. . . . The whole shuttle system operated on the assumption that deviation could be controlled but not eliminated."
What NASA had created was a closed culture that, in her words, "normalized deviance" so that to the outside world decisions that were obviously questionable were seen by NASA's management as prudent and reasonable. It is her depiction of this internal world that makes her book so disquieting: when she lays out the sequence of decisions which led to the launch—each decision as trivial as the string of failures that led to T.M.I.—it is difficult to find any precise point where things went wrong or where things might be improved next time. "It can truly be said that the Challenger launch decision was a rule-based decision," she concludes. "But the cultural understandings, rules, procedures, and norms that always had worked in the past did not work this time. It was not amorally calculating managers violating rules that were responsible for the tragedy. It was conformity."
4.
There is another way to look at this problem, and that is from the standpoint of how human beings handle risk. One of the assumptions behind the modern disaster ritual is that when a risk can be identified and eliminated a system can be made safer. The new booster joints on the shuttle, for example, are so much better than the old ones that the over-all chances of a Challenger-style accident's ever happening again must be lower—right? This is such a straightforward idea that questioning it seems almost impossible. But that is just what another group of scholars has done, under what is called the theory of "risk homeostasis." It should be said that within the academic community there are huge debates over how widely the theory of risk homeostasis can and should be applied. But the basic idea, which has been laid out brilliantly by the Canadian psychologist Gerald Wilde in his book "Target Risk," is quite simple: under certain circumstances, changes that appear to make a system or an organization safer in fact don't. Why? Because human beings have a seemingly fundamental tendency to compensate for lower risks in one area by taking greater risks in another.
Consider, for example, the results of a famous experiment conducted several years ago in Germany. Part of a fleet of taxicabs in Munich was equipped with antilock brake systems (A.B.S.), the recent technological innovation that vastly improves braking, particularly on slippery surfaces. The rest of the fleet was left alone, and the two groups-which were otherwise perfectly matched-were placed under careful and secret observation for three years. You would expect the better brakes to make for safer driving. But that is exactly the opposite of what happened. Giving some drivers A.B.S. made no difference at all in their accident rate; in fact, it turned them into markedly inferior drivers. They drove faster. They made sharper turns. They showed poorer lane discipline. They braked harder. They were more likely to tailgate. They didn't merge as well, and they were involved in more near-misses. In other words, the A.B.S. systems were not used to reduce accidents; instead, the drivers used the additional element of safety to enable them to drive faster and more recklessly without increasing their risk of getting into an accident. As economists would say, they "consumed" the risk reduction, they didn't save it.
Risk homeostasis doesn't happen all the time. Often-as in the case of seat belts, say-compensatory behavior only partly offsets the risk-reduction of a safety measure. But it happens often enough that it must be given serious consideration. Why are more pedestrians killed crossing the street at marked crosswalks than at unmarked crosswalks? Because they compensate for the "safe" environment of a marked crossing by being less vigilant about oncoming traffic. Why did the introduction of childproof lids on medicine bottles lead, according to one study, to a substantial increase in fatal child poisonings? Because adults became less careful in keeping pill bottles out of the reach of children.
Risk homeostasis also works in the opposite direction. In the late nineteen-sixties, Sweden changed over from driving on the left-hand side of the road to driving on the right, a switch that one would think would create an epidemic of accidents. But, in fact, the opposite was true. People compensated for their unfamiliarity with the new traffic patterns by driving more carefully. During the next twelve months, traffic fatalities dropped seventeen per cent-before returning slowly to their previous levels. As Wilde only half-facetiously argues, countries truly interested in making their streets and highways safer should think about switching over from one side of the road to the other on a regular basis.
It doesn't take much imagination to see how risk homeostasis applies to NASA and the space shuttle. In one frequently quoted phrase, Richard Feynman, the Nobel Prize-winning physicist who served on the Challenger commission, said that at NASA decision-making was "a kind of Russian roulette." When the O-rings began to have problems and nothing happened, the agency began to believe that "the risk is no longer so high for the next flights," Feynman said, and that "we can lower our standards a little bit because we got away with it last time." But fixing the O-rings doesn't mean that this kind of risk-taking stops. There are six whole volumes of shuttle components that are deemed by NASA to be as risky as O-rings. It is entirely possible that better O-rings just give NASA the confidence to play Russian roulette with something else.
This is a depressing conclusion, but it shouldn't come as a surprise. The truth is that our stated commitment to safety, our faithful enactment of the rituals of disaster, has always masked a certain hypocrisy. We don't really want the safest of all possible worlds. The national fifty-five-mile-per-hour speed limit probably saved more lives than any other single government intervention of the past twenty-five years. But the fact that Congress lifted it last month with a minimum of argument proves that we would rather consume the recent safety advances of things like seat belts and air bags than save them. The same is true of the dramatic improvements that have been made in recent years in the design of aircraft and flight-navigation systems. Presumably, these innovations could be used to bring down the airline-accident rate as low as possible. But that is not what consumers want. They want air travel to be cheaper, more reliable, or more convenient, and so those safety advances have been at least partly consumed by flying and landing planes in worse weather and heavier traffic conditions.
What accidents like the Challenger should teach us is that we have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life. At some point in the future-for the most mundane of reasons, and with the very best of intentions-a NASA spacecraft will again go down in flames. We should at least admit this to ourselves now. And if we cannot-if the possibility is too much to bear-then our only option is to start thinking about getting rid of things like space shuttles altogether.
Loopholes for Living
April 15, 1996
BOOKS
When the means justify the ends.
1.
Leo Katz begins "Ill-Gotten Gains: Evasion, Blackmail, Fraud, and Kindred Puzzles of the Law" (Chicago; $29.95), his elegant defense of circumvention and subterfuge, with a fable for tax day. There was once, he writes, a wealthy shoemaker who was looking for a way to lessen the burden of supporting his son, to whom he was paying, year in and year out, an annual allowance of a thousand dollars. Cutting him off wasn't an option, because the shoemaker loved his son dearly. Nor was writing the thousand dollars off his taxes, because the I.R.S., understandably, doesn't allow family gifts to serve as tax deductions. But the shoemaker had a brainstorm. He gave his son ten thousand dollars, and then he asked for that same amount back in the form of a business loan, promising in return to pay interest on the loan at the rate of ten per cent a year, which amounts, of course, to a thousand dollars. Voilà! With a minor sleight of hand, the shoemaker turns his family obligation into a seemingly legitimate business deduction.
This is what Katz, who teaches law at the University of Pennsylvania, calls "avoision"--behavior a little too fishy to seem like simple avoidance of illegality but not so obviously illegal as to constitute clear-cut evasion. Avoision covers those acts which lie in the awkward middle, and Katz sees the potential for avoision everywhere in the modern world. Imagine, for example, a tourist from a third-world country who comes to America and decides, at the last minute, that she wants to stay here. She then makes a series of provocative statements about her country which render her unwelcome at home and thereby qualify her for political asylum. Or what about a pornographer who, worried about running afoul of decency laws with his collection of highly explicit photographs, decides to put them in a book entitled "Sex in Marriage," together with long, windy essays on the future of marriage. The shoemaker, the tourist, and the pornographer all adhere to the form of the law, but they violate its spirit: they have exploited a loophole. Is what they are doing right? Should they be allowed to get away with it?
I think it's fair to say that most of us, intuitively, have a problem with avoision. Few would raise much of a fuss if the opportunistic tourist was deported, and even fewer would be fooled by the pornographer's cynical repackaging. And if the wealthy shoemaker managed to slip his ruse past the I.R.S. we would expect him at the very least to have the decency to be ashamed of what he had done. Even the authors of the self-help tax books that proliferate at this time of year rarely present their various tax-dodging schemes without some kind of moral justification. ("What is most important is not what a tax law says, but how the I.R.S. interprets and acts on it," Martin Kaplan and Naomi Weiss write in the best-selling "What the I.R.S. Doesn't Want You to Know," after reeling off a handful of anecdotes of capricious and vindictive government audits.) In fact, if the brief rise of Steve Forbes teaches us anything, it is that Americans have come to associate the paperwork, the complexity, and the game-playing surrounding the tax code with its corruption. What is the flat tax, after all, but a secular version of the tithe, an attempt to imbue what has become essentially a commercial transaction between citizen and state with the purity and simplicity of religious obligation?
This is the attitude that "Ill-Gotten Gains" sets out to confront. Katz likes loopholes. He thinks that the wealthy shoemaker has a point. And if, in the end, Katz is not entirely convincing it does not really matter. This is a heroically counterintuitive book that will make it difficult to think about tax day in quite the same way again.
2.
The problem with the way we feel about loopholes, according to Katz, is that we don't give them enough credit. We think of them in narrow, legal terms, as the unintended result of badly drafted laws. If the wealthy shoemaker can get away with masking his son's allowance as a business deduction, it's assumed that there is something amiss with the law, or with the vigilance of the I.R.S. But avoision is something that runs much deeper than that.
Katz produces one example after another from history and literature--from the confrontation between Neil Klugman and Brenda Patimkin over her diaphragm in Philip Roth's "Goodbye Columbus" to the way Freud phrased his exit statement to the Gestapo upon leaving Vienna--to prove that avoision is a kind of basic human strategy. Consider this, for example, from Bob Woodward and Carl Bernstein's Watergate memoir, "All the President's Men." Katz quotes the passage where the two Washington Post reporters are trying to get a senior Justice Department official to confirm off the record a rumor that Nixon's chief of staff, H. R. Haldeman, was about to be indicted:
"I'd like to help you, I really would," said the lawyer. "But I just can't say anything."
Bernstein thought for a moment and told the man they understood why he couldn't say anything. So they would do it another way: Bernstein would count to 10. If there was any reason for the reporters to hold back on the story, the lawyer should hang up before 10. If he was on the line after 10, it would mean the story was okay.
"Hang up, right?" the lawyer asked.
That was right, Bernstein instructed, and he started counting. He counted to ten, and the lawyer stayed on the line. Okay, Bernstein said, and thanked him effusively.
"You've got it straight now?" the lawyer asked.
This is classic avoision, a perfectly transparent piece of self-justification. Failing to deny the story has exactly the same consequence as confirming it. Nonetheless, in the eyes of the lawyer the difference between those alternatives was quite real. Using the loophole allowed him to live with his own conscience, to convince himself that he had not actively violated the confidentiality requirements of his position.
It is Katz's argument that we play these avoision games all the time, and that, far from being trivial or contemptible ruses, they embody real moral distinctions. Here is another of his many examples, involving a trolley driver whose brakes are shot. As the driver hurtles along, he comes to a fork in the track. Ahead are five people who cannot get out of the way in time. To his right is one person stranded on the track. We would all agree, I think, that the trolley driver should steer right, choosing to kill one person instead of five. But now consider an analogous situation: A physician has in his hospital five people who will die unless they receive immediate organ transplants. Two need kidneys. Two need lungs. One needs a heart. At that moment, a perfectly healthy person walks into the doctor's office. The doctor realizes that if he sacrifices that patient he can save five lives for the price of one. But this time, it's safe to say, no one would maintain that the physician should act as the trolley driver did. It's not good enough to want to save lives. You have to save lives in the right way.
This, at least, is what Katz believes. He describes himself as a "deontologist," which is to say that he thinks the morality of any outcome depends very much on how that outcome is achieved. It is in the illustration of this point that "Ill-Gotten Gains" truly takes flight. In one brilliant riff in the middle of the book's first section, for example, Katz gleefully plunges into Jesuitical theology, since he believes that the Jesuits were the ones who raised hairsplitting and loopholes to an art. Let's say that one wants to guiltlessly communicate an untruth. All one need do is, in the words of a Jesuit theologian quoted by Katz, "swear . . . that one has not done something, though one really has done it, by inwardly understanding that one did not do it on a certain day, or before one was born, or by implying some other similar circumstance."
Ridiculous? Not really, says Katz. For a man to disguise himself as a woman's boyfriend, creep into her bedroom in the middle of the night, and have sex with her is rape. But if another man met the same woman at a bar and by pretending to be a famous C.E.O. successfully seduced her his falsehood in that instance would not invalidate her consent. In other words, here are two lies, identical in their intent and in their result. Yet one is a crime and the other, however deplorable, is not. The Jesuits had a point. The circumstances under which a lie is told can make a big difference. Or consider the case of a woman standing in line for a movie who sees a man pointing a pistol right at her. If she grabs the person behind her and uses that person as a shield, we would say she was guilty, at least, of manslaughter. If she simply ducks, and the bullet hits and kills the person behind her, we would call her lucky--even if she was fully aware that if she ducked the person behind her would die.
This is how Katz resolves the question of whether the wealthy shoemaker is in the right. Here we have two identical actions--the gift of a thousand dollars from father to son. But in the first case the gift is direct, and in the second case it is not. The father gives the son an asset, and that asset, in turn, generates the income. How important is this distinction? Well, imagine that the son took his father's ten thousand dollars, put it in the bank, and lived off the interest. And suppose the shoemaker borrows ten thousand dollars not from his son but from the same bank at an identical interest rate. This is essentially the same transaction as before, just a bit more roundabout. But now no one would deny the shoemaker his tax deduction.
According to Katz, there is an important ethical principle involved here. Suppose I had designed the world's most powerful telescope, the only machine capable of glimpsing far- off planets. If I discovered a new galaxy and published my results under my son's name, we would all agree that my son would not deserve the ensuing fame. It would be like John F. Kennedy's accepting the Pulitzer Prize for "Profiles in Courage," a book that he is often said not to have written. You can't assign your fame to someone else. But suppose I gave the telescope to my son, and, armed with this unique instrument, he stumbled upon the same discovery. Now we would all concede that at least some of the fame due to this discovery should accrue to my son. Putting a little distance between the father and the son changes everything.
3.
How far should we go in accepting Katz's deontological fixation? Does he go overboard in his adherence to form? This is the question raised, indirectly, by a Yale University law professor, Stephen L. Carter, in his new book, "Integrity," an essay-length exploration of the consequences of the decline of public morality. Carter argues that integrity requires three things: "(1) discerning what is right and what is wrong; (2) acting on what you have discerned, even at personal cost; and (3) saying openly that you are acting on your understanding of right from wrong." Like Katz, Carter believes that an action should be judged by how it came about, by its adherence to rights and rules, by its form. But Carter's idea of form is far more restrictive than Katz's. Carter's precepts don't seem to make much of an ethical distinction, for example, between the man who posed as a woman's boyfriend in order to seduce her and the man who posed as a C.E.O. Neither had discerned right from wrong. Neither was acting on what he had discerned and certainly neither was "saying openly" that he was doing what he thought was right. Carter locates the morality of an act in its intention: Did the man deliberately mislead in aid of the seduction? Katz is much more sensitive to the particulars of the act's execution.
A good example of this difference is found in an anecdote Carter tells at the beginning of his book about an incident he once saw while watching a football game on television. A player who had failed to catch a pass thrown his way rolled on the field, scooped up the ball, and jumped up, exultantly, as if he had caught the ball after all. The referee, shielded partially from the play, was so misled by the player's acting that he ruled the pass complete. The player, Carter concludes, lied, and he presents this incident as a telling example of the lack of integrity in American public life.
For the sake of argument, however, let's add two new wrinkles to the story. Suppose that the player, after scooping up the ball, didn't go through the pantomime of exultation. He simply ran over to the referee and loudly and hotly began insisting that he had caught the ball, even though he knew that he hadn't. Or suppose that the player, after attempting the catch, made no attempt to convince the referee that he had caught the ball at all. He was tired, and sick of playing football, and no longer interested in winning, so he shrugged and walked away, indifferent to the outcome of the game. Carter's rules, I think, end up lumping the faker, the arguer, and the quitter together: in one way or another, they all fail his integrity test.
Now, let's imagine how Katz would think about this incident. In the first instance, I think he might make the case that the faker was practicing avoision. Football, after all, deliberately does not use instant replay to review close calls. It relies on the judgment of referees, even though that judgment will occasionally be flawed, or there will be plays (like this one) that the referees cannot see. That's the loophole the player was exploiting--the inherent subjectivity of the way the rules are enforced. Notice as well how he chose to exploit this loophole. Carter says that the faker lied. But that's not quite right. It was the arguer who lied. He purposefully and directly misrepresented what happened on the play to the referee, putting himself clearly outside the realm of good sportsmanship. By contrast, the faker didn't say anything at all. What he did was bluff, and if Carter doesn't see a difference between lying and bluffing then I hereby extend to him a permanent invitation to my poker game.
That leaves us with the quitter, who is the only player who does not attempt to mislead. But isn't he really the worst of the three? Sports--organized games--can continue to function if players attempt to mislead one another, because there are referees who (most of the time) will catch and punish that conduct. But sports can't survive if players no longer try. The quitter, whose actions make him appear to be the most honest of the players, actually threatens the integrity of the entire game.
The point of all of this is that Carter's rules, for all their superficial appeal, turn out to be somewhat unsatisfying. Because he won't go as far as Katz in scrutinizing the form of actions, he ends up papering over some fairly important distinctions. Yes, in some broad moral sense all three of the players lack a certain integrity. But there isn't a football player in the world who wouldn't rather play with fakers than with arguers, or with arguers than with quitters.
This is not to say that Katz prefers those who play avoision games to those who act with perfect integrity, although it is sometimes tempting to read his book this way, since he spends so much time and enthusiasm talking about the people searching for loopholes and not a great deal of time talking about people who play fair. What Katz is trying to do is show that the loophole is not an arbitrary creation, that the ambiguities of our law reflect deep ethical conundrums that cannot be wished away. There is, in other words, a certain deontological dignity to our tortuous circumventions of the I.R.S. If the Jesuit theologians of the seventeenth century were here today, Katz believes, they would probably all be accountants, which is, when you think about it, probably the nicest thing anyone has ever said about the tax system.
Black Like Them
April 29, 1996
PERSONAL HISTORY
Through the lens of his own family's experience,
the author explores why West Indians and American
blacks are perceived differently.
1.
My cousin Rosie and her husband, Noel, live in a two-bedroom bungalow on Argyle Avenue, in Uniondale, on the west end of Long Island. When they came to America, twelve years ago, they lived in a basement apartment a dozen or so blocks away, next to their church. At the time, they were both taking classes at the New York Institute of Technology, which was right nearby. But after they graduated, and Rosie got a job managing a fast-food place and Noel got a job in asbestos removal, they managed to save a little money and bought the house on Argyle Avenue.
From the outside, their home looks fairly plain. It's in a part of Uniondale that has a lot of tract housing from just after the war, and most of the houses are alike--squat and square, with aluminum siding, maybe a dormer window in the attic, and a small patch of lawn out front. But there is a beautiful park down the street, the public schools are supposed to be good, and Rosie and Noel have built a new garage and renovated the basement. Now that Noel has started his own business, as an environmental engineer, he has his office down there--Suite 2B, it says on his stationery--and every morning he puts on his tie and goes down the stairs to make calls and work on the computer. If Noel's business takes off, Rosie says, she would like to move to a bigger house, in Garden City, which is one town over. She says this even though Garden City is mostly white. In fact, when she told one of her girlfriends, a black American, about this idea, her friend said that she was crazy--that Garden City was no place for a black person. But that is just the point. Rosie and Noel are from Jamaica. They don't consider themselves black at all.
This doesn't mean that my cousins haven't sometimes been lumped together with American blacks. Noel had a job once removing asbestos at Kennedy Airport, and his boss there called him "nigger" and cut his hours. But Noel didn't take it personally. That boss, he says, didn't like women or Jews, either, or people with college degrees--or even himself, for that matter. Another time, Noel found out that a white guy working next to him in the same job and with the same qualifications was making ten thousand dollars a year more than he was. He quit the next day. Noel knows that racism is out there. It's just that he doesn't quite understand--or accept--the categories on which it depends.
To a West Indian, black is a literal description: you are black if your skin is black. Noel's father, for example, is black. But his mother had a white father, and she herself was fair-skinned and could pass. As for Rosie, her mother and my mother, who are twins, thought of themselves while they were growing up as "middle-class brown," which is to say that they are about the same shade as Colin Powell. That's because our maternal grandfather was part Jewish, in addition to all kinds of other things, and Grandma, though she was a good deal darker than he was, had enough Scottish blood in her to have been born with straight hair. Rosie's mother married another brown Jamaican, and that makes Rosie a light chocolate. As for my mother, she married an Englishman, making everything that much more complicated, since by the racial categories of my own heritage I am one thing and by the racial categories of America I am another. Once, when Rosie and Noel came to visit me while I was living in Washington, D.C., Noel asked me to show him "where the black people lived," and I was confused for a moment until I realized that he was using "black" in the American sense, and so was asking in the same way that someone visiting Manhattan might ask where Chinatown was. That the people he wanted to see were in many cases racially indistinguishable from him didn't matter. The facts of his genealogy, of his nationality, of his status as an immigrant made him, in his own eyes, different.
This question of who West Indians are and how they define themselves may seem trivial, like racial hairsplitting. But it is not trivial. In the past twenty years, the number of West Indians in America has exploded. There are now half a million in the New York area alone and, despite their recent arrival, they make substantially more money than American blacks. They live in better neighborhoods. Their families are stronger. In the New York area, in fact, West Indians fare about as well as Chinese and Korean immigrants. That is why the Caribbean invasion and the issue of West Indian identity have become such controversial issues. What does it say about the nature of racism that another group of blacks, who have the same legacy of slavery as their American counterparts and are physically indistinguishable from them, can come here and succeed as well as the Chinese and the Koreans do? Is overcoming racism as simple as doing what Noel does, which is to dismiss it, to hold himself above it, to brave it and move on?
These are difficult questions, not merely for what they imply about American blacks but for the ways in which they appear to contradict conventional views of what prejudice is. Racism, after all, is supposed to be indiscriminate. For example, sociologists have observed that the more blacks there are in a community the more negative the whites' attitudes will be. Blacks in Denver have a far easier time than blacks in, say, Cleveland. Lynchings in the South at the turn of this century, to give another example, were far more common in counties where there was a large black population than in areas where whites were in the majority. Prejudice is the crudest of weapons, a reaction against blacks in the aggregate that grows as the perception of black threat grows. If that is the case, however, the addition of hundreds of thousands of new black immigrants to the New York area should have made things worse for people like Rosie and Noel, not better. And, if racism is so indiscriminate in its application, why is one group of blacks flourishing and the other not?
The implication of West Indian success is that racism does not really exist at all--at least, not in the form that we have assumed it does. The implication is that the key factor in understanding racial prejudice is not the behavior and attitudes of whites but the behavior and attitudes of blacks--not white discrimination but black culture. It implies that when the conservatives in Congress say the responsibility for ending urban poverty lies not with collective action but with the poor themselves they are right.
I think of this sometimes when I go with Rosie and Noel to their church, which is in Hempstead, just a mile away. It was once a white church, but in the past decade or so it has been taken over by immigrants from the Caribbean. They have so swelled its membership that the church has bought much of the surrounding property and is about to add a hundred seats to its sanctuary. The pastor, though, is white, and when the band up front is playing and the congregation is in full West Indian form the pastor sometimes seems out of place, as if he cannot move in time with the music. I always wonder how long the white minister at Rosie and Noel's church will last--whether there won't be some kind of groundswell among the congregation to replace him with one of their own. But Noel tells me the issue has never really come up. Noel says, in fact, that he's happier with a white minister, for the same reasons that he's happy with his neighborhood, where the people across the way are Polish and another neighbor is Hispanic and still another is a black American. He doesn't want to be shut off from everyone else, isolated within the narrow confines of his race. He wants to be part of the world, and when he says these things it is awfully tempting to credit that attitude with what he and Rosie have accomplished.
Is this confidence, this optimism, this equanimity all that separates the poorest of American blacks from a house on Argyle Avenue?
2.
In 1994, Philip Kasinitz, a sociologist at Manhattan's Hunter College, and Jan Rosenberg, who teaches at Long Island University, conducted a study of the Red Hook area of Brooklyn, a neighborhood of around thirteen or fourteen thousand which lies between the waterfront and the Gowanus Expressway. Red Hook has a large public-housing project at its center, and around the project, in the streets that line the waterfront, are several hundred thriving blue-collar businesses--warehouses, shipping companies, small manufacturers, and contractors. The object of the study was to resolve what Kasinitz and Rosenberg saw as the paradox of Red Hook: despite Red Hook's seemingly fortuitous conjunction of unskilled labor and blue-collar jobs, very few of the Puerto Ricans and African-Americans from the neighborhood ever found work in the bustling economy of their own back yard.
After dozens of interviews with local employers, the two researchers uncovered a persistent pattern of what they call positive discrimination. It was not that the employers did not like blacks and Hispanics. It was that they had developed an elaborate mechanism for distinguishing between those they felt were "good" blacks and those they felt were "bad" blacks, between those they judged to be "good" Hispanics and those they considered "bad" Hispanics. "Good" meant that you came from outside the neighborhood, because employers identified locals with the crime and dissipation they saw on the streets around them. "Good" also meant that you were an immigrant, because employers felt that being an immigrant implied a loyalty and a willingness to work and learn not found among the native-born. In Red Hook, the good Hispanics are Mexican and South American, not Puerto Rican. And the good blacks are West Indian.
The Harvard sociologist Mary C. Waters conducted a similar study, in 1993, which looked at a food-service company in Manhattan where West Indian workers have steadily displaced African-Americans in the past few years. The transcripts of her interviews with the company managers make fascinating reading, providing an intimate view of the perceptions that govern the urban workplace. Listen to one forty-year-old white male manager on the subject of West Indians:
They tend more to shy away from doing all of the illegal things because they have such strict rules down in their countries and jails. And they're nothing like here. So like, they're like really paranoid to do something wrong. They seem to be very, very self-conscious of it. No matter what they have to do, if they have to try and work three jobs, they do. They won't go into drugs or anything like that.
Or listen to this, from a fifty-three-year-old white female manager:
I work closely with this one girl who's from Trinidad. And she told me when she first came here to live with her sister and cousin, she had two children. And she said I'm here four years and we've reached our goals. And what was your goal? For her two children to each have their own bedroom. Now she has a three bedroom apartment and she said that's one of the goals she was shooting for. . . . If that was an American, they would say, I reached my goal. I bought a Cadillac.
This idea of the West Indian as a kind of superior black is not a new one. When the first wave of Caribbean immigrants came to New York and Boston, in the early nineteen-hundreds, other blacks dubbed them Jewmaicans, in derisive reference to the emphasis they placed on hard work and education. In the nineteen-eighties, the economist Thomas Sowell gave the idea a serious intellectual imprimatur by arguing that the West Indian advantage was a historical legacy of Caribbean slave culture. According to Sowell, in the American South slaveowners tended to hire managers who were married, in order to limit the problems created by sexual relations between overseers and slave women. But the West Indies were a hardship post, without a large and settled white population. There the overseers tended to be bachelors, and, with white women scarce, there was far more commingling of the races. The resulting large group of coloreds soon formed a kind of proto-middle class, performing various kinds of skilled and sophisticated tasks that there were not enough whites around to do, as there were in the American South. They were carpenters, masons, plumbers, and small businessmen, many years in advance of their American counterparts, developing skills that required education and initiative.
My mother and Rosie's mother came from this colored class. Their parents were schoolteachers in a tiny village buried in the hills of central Jamaica. My grandmother's and grandfather's salaries combined put them, at best, on the lower rungs of the middle class. But their expectations went well beyond that. In my grandfather's library were Dickens and Maupassant. My mother and her sister were pushed to win scholarships to a proper English-style boarding school at the other end of the island; and later, when my mother graduated, it was taken for granted that she would attend university in England, even though the cost of tuition and passage meant that my grandmother had to borrow a small fortune from the Chinese grocer down the road.
My grandparents had ambitions for their children, but it was a special kind of ambition, born of a certainty that American blacks did not have--that their values were the same as those of society as a whole, and that hard work and talent could actually be rewarded. In my mother's first year at boarding school, she looked up "Negro" in the eleventh edition of the Encyclopædia Britannica. "In certain . . . characteristics . . . the negro would appear to stand on a lower evolutionary plane than the white man," she read. And the entry continued:
The mental constitution of the negro is very similar to that of a child, normally good-natured and cheerful, but subject to sudden fits of emotion and passion during which he is capable of performing acts of singular atrocity, impressionable, vain, but often exhibiting in the capacity of servant a dog-like fidelity which has stood the supreme test.
All black people of my mother's generation--and of generations before and since--have necessarily faced a moment like this, when they are confronted for the first time with the allegation of their inferiority. But, at least in my mother's case, her school was integrated, and that meant she knew black girls who were more intelligent than white girls, and she knew how she measured against the world around her. At least she lived in a country that had blacks and browns in every position of authority, so her personal experience gave the lie to what she read in the encyclopedia. This, I think, is what Noel means when he says that he cannot quite appreciate what it is that weighs black Americans down, because he encountered the debilitating effects of racism late, when he was much stronger. He came of age in a country where he belonged to the majority.
When I was growing up, my mother sometimes read to my brothers and me from the work of Louise Bennett, the great Jamaican poet of my mother's generation. The poem I remember best is about two women--one black and one white--in a hair salon, the black woman getting her hair straightened and, next to her, the white woman getting her hair curled:
same time me mind start 'tink
'bout me and de white woman
how me tek out me natural perm
and she put in false one
There is no anger or resentment here, only irony and playfulness--the two races captured in a shared moment of absurdity. Then comes the twist. The black woman is paying less to look white than the white woman is to look black:
de two a we da tek a risk
what rain or shine will bring
but fe har risk is t're poun'
fi me onle five shillin'
In the nineteen-twenties, the garment trade in New York was first integrated by West Indian women, because, the legend goes, they would see the sign on the door saying "No blacks need apply" and simply walk on in. When I look back on Bennett's poem, I think I understand how they found the courage to do that.
3.
It is tempting to use the West Indian story as evidence that discrimination doesn't really exist--as proof that the only thing inner-city African-Americans have to do to be welcomed as warmly as West Indians in places like Red Hook is to make the necessary cultural adjustments. If West Indians are different, as they clearly are, then it is easy to imagine that those differences are the reason for their success--that their refusal to be bowed is what lets them walk on by the signs that prohibit them or move to neighborhoods that black Americans would shy away from. It also seems hard to see how the West Indian story is in any way consistent with the idea of racism as an indiscriminate, pernicious threat aimed at all black people.
But here is where things become more difficult, and where what seems obvious about West Indian achievement turns out not to be obvious at all. One of the striking things in the Red Hook study, for example, is the emphasis that the employers appeared to place on hiring outsiders--Irish or Russian or Mexican or West Indian immigrants from places far from Red Hook. The reason for this was not, the researchers argue, that the employers had any great familiarity with the cultures of those immigrants. They had none, and that was the point. They were drawn to the unfamiliar because what was familiar to them--the projects of Red Hook--was anathema. The Columbia University anthropologist Katherine Newman makes the same observation in a recent study of two fast-food restaurants in Harlem. She compared the hundreds of people who applied for jobs at those restaurants with the few people who were actually hired, and found, among other things, that how far an applicant lived from the job site made a huge difference. Of those applicants who lived less than two miles from the restaurant, ten per cent were hired. Of those who lived more than two miles from the restaurant, nearly forty per cent were hired. As Newman puts it, employers preferred the ghetto they didn't know to the ghetto they did.
Neither study describes a workplace where individual attitudes make a big difference, or where the clunky and impersonal prejudices that characterize traditional racism have been discarded. They sound like places where old-style racism and appreciation of immigrant values are somehow bound up together. Listen to another white manager who was interviewed by Mary Waters:
Island blacks who come over, they're immigrant. They may not have such a good life where they are so they gonna try to strive to better themselves and I think there's a lot of American blacks out there who feel we owe them. And enough is enough already. You know, this is something that happened to their ancestors, not now. I mean, we've done so much for the black people in America now that it's time that they got off their butts.
Here, then, are the two competing ideas about racism side by side: the manager issues a blanket condemnation of American blacks even as he holds West Indians up as a cultural ideal. The example of West Indians as "good" blacks makes the old blanket prejudice against American blacks all the easier to express. The manager can tell black Americans to get off their butts without fear of sounding, in his own ears, like a racist, because he has simultaneously celebrated island blacks for their work ethic. The success of West Indians is not proof that discrimination against American blacks does not exist. Rather, it is the means by which discrimination against American blacks is given one last, vicious twist: I am not so shallow as to despise you for the color of your skin, because I have found people your color that I like. Now I can despise you for who you are.
This is racism's newest mutation--multicultural racism, where one ethnic group can be played off against another. But it is wrong to call West Indians the victors in this competition, in anything but the narrowest sense. In American history, immigrants have always profited from assimilation: as they have adopted the language and customs of this country, they have sped their passage into the mainstream. The new racism means that West Indians are the first group of people for whom that has not been true. Their advantage depends on their remaining outsiders, on remaining unfamiliar, on being distinct by custom, culture, and language from the American blacks they would otherwise resemble. There is already some evidence that the considerable economic and social advantages that West Indians hold over American blacks begin to dissipate by the second generation, when the island accent has faded, and those in positions of power who draw distinctions between good blacks and bad blacks begin to lump West Indians with everyone else. For West Indians, assimilation is tantamount to suicide. This is a cruel fate for any immigrant group, but it is especially so for West Indians, whose history and literature are already redolent with the themes of dispossession and loss, with the long search for identity and belonging. In the nineteen-twenties, Marcus Garvey sought community in the idea of Africa. Bob Marley, the Jamaican reggae singer, yearned for Zion. In "Rites of Passage" the Barbadian poet Edward Kamau Brathwaite writes:
Where, then, is the nigger's
home?
In Paris Brixton Kingston
Rome?
Here?
Or in Heaven?
America might have been home. But it is not: not Red Hook, anyway; not Harlem; not even Argyle Avenue.
There is also no small measure of guilt here, for West Indians cannot escape the fact that their success has come, to some extent, at the expense of American blacks, and that as they have noisily differentiated themselves from African-Americans--promoting the stereotype of themselves as the good blacks--they have made it easier for whites to join in. It does not help matters that the same kinds of distinctions between good and bad blacks which govern the immigrant experience here have always lurked just below the surface of life in the West Indies as well. It was the infusion of white blood that gave the colored class its status in the Caribbean, and the members of this class have never forgotten that, nor have they failed, in a thousand subtle ways, to distance themselves from those around them who experienced a darker and less privileged past.
In my mother's house, in Harewood, the family often passed around a pencilled drawing of two of my great-grandparents; she was part Jewish, and he was part Scottish. The other side--the African side--was never mentioned. My grandmother was the ringleader in this. She prized my grandfather's light skin, but she also suffered as a result of this standard. "She's nice, you know, but she's too dark," her mother-in-law would say of her. The most telling story of all, though, is the story of one of my mother's relatives, whom I'll call Aunt Joan, who was as fair as my great-grandmother was. Aunt Joan married what in Jamaica is called an Injun--a man with a dark complexion that is redeemed from pure Africanness by straight, fine black hair. She had two daughters by him--handsome girls with dark complexions. But he died young, and one day, while she was travelling on a train to visit her daughter, she met and took an interest in a light-skinned man in the same railway car. What happened next is something that Aunt Joan told only my mother, years later, with the greatest of shame. When she got off the train, she walked right by her daughter, disowning her own flesh and blood, because she did not want a man so light-skinned and desirable to know that she had borne a daughter so dark.
My mother, in the nineteen-sixties, wrote a book about her experiences. It was entitled "Brown Face, Big Master," the brown face referring to her and the big master, in the Jamaican dialect, referring to God. Sons, of course, are hardly objective on the achievements of their mothers, but there is one passage in the book that I find unforgettable, because it is such an eloquent testimony to the moral precariousness of the Jamaican colored class--to the mixture of confusion and guilt that attends its position as beneficiary of racism's distinctions. The passage describes a time just after my mother and father were married, when they were living in London and my eldest brother was still a baby. They were looking for an apartment, and after a long search my father found one in a London suburb. On the day after they moved in, however, the landlady ordered them out. "You didn't tell me your wife was colored," she told my father, in a rage.
In her book my mother describes her long struggle to make sense of this humiliation, to reconcile her experience with her faith. In the end, she was forced to acknowledge that anger was not an option--that as a Jamaican "middle-class brown," and a descendant of Aunt Joan, she could hardly reproach another for the impulse to divide good black from bad black:
I complained to God in so many words: "Here I was, the wounded representative of the negro race in our struggle to be accounted free and equal with the dominating whites!" And God was amused; my prayer did not ring true with Him. I would try again. And then God said, "Have you not done the same thing? Remember this one and that one, people whom you have slighted or avoided or treated less considerately than others because they were different superficially, and you were ashamed to be identified with them. Have you not been glad that you are not more colored than you are? Grateful that you are not black?" My anger and hate against the landlady melted. I was no better than she was, nor worse for that matter. . . . We were both guilty of the sin of self-regard, the pride and the exclusiveness by which we cut some people off from ourselves.
4.
I grew up in Canada, in a little farming town an hour and a half outside of Toronto. My father teaches mathematics at a nearby university, and my mother is a therapist. For many years, she was the only black person in town, but I cannot remember wondering or worrying, or even thinking, about this fact. Back then, color meant only good things. It meant my cousins in Jamaica. It meant the graduate students from Africa and India my father would bring home from the university. My own color was not something I ever thought much about, either, because it seemed such a stray fact. Blacks knew what I was. They could discern the hint of Africa beneath my fair skin. But it was a kind of secret--something that they would ask me about quietly when no one else was around. ("Where you from?" an older black man once asked me. "Ontario," I said, not thinking. "No," he replied. "Where you from?" And then I understood and told him, and he nodded as if he had already known. "We was speculatin' about your heritage," he said.) But whites never guessed, and even after I informed them it never seemed to make a difference. Why would it? In a town that is ninety-nine per cent white, one modest alleged splash of color hardly amounts to a threat.
But things changed when I left for Toronto to attend college. This was during the early nineteen-eighties, when West Indians were immigrating to Canada in droves, and Toronto had become second only to New York as the Jamaican expatriates' capital in North America. At school, in the dining hall, I was served by Jamaicans. The infamous Jane-Finch projects, in northern Toronto, were considered the Jamaican projects. The drug trade then taking off was said to be the Jamaican drug trade. In the popular imagination, Jamaicans were--and are--welfare queens and gun-toting gangsters and dissolute youths. In Ontario, blacks accused of crimes are released by the police eighteen per cent of the time; whites are released twenty-nine per cent of the time. In drug-trafficking and importing cases, blacks are twenty-seven times as likely as whites to be jailed before their trial takes place, and twenty times as likely to be imprisoned on drug-possession charges.
After I had moved to the United States, I puzzled over this seeming contradiction--how West Indians celebrated in New York for their industry and drive could represent, just five hundred miles northwest, crime and dissipation. Didn't Torontonians see what was special and different in West Indian culture? But that was a naĂŻve question. The West Indians were the first significant brush with blackness that white, smug, comfortable Torontonians had ever had. They had no bad blacks to contrast with the newcomers, no African-Americans to serve as a safety valve for their prejudices, no way to perform America's crude racial triage.
Not long ago, I sat in a coffee shop with someone I knew vaguely from college, who, like me, had moved to New York from Toronto. He began to speak of the threat that he felt Toronto now faced. It was the Jamaicans, he said. They were a bad seed. He was, of course, oblivious of my background. I said nothing, though, and he launched into a long explanation of how, in slave times, Jamaica was the island where all the most troublesome and obstreperous slaves were sent, and how that accounted for their particularly nasty disposition today.
I have told that story many times since, usually as a joke, because it was funny in an appalling way--particularly when I informed him much, much later that my mother was Jamaican. I tell the story that way because otherwise it is too painful. There must be people in Toronto just like Rosie and Noel, with the same attitudes and aspirations, who want to live in a neighborhood as nice as Argyle Avenue, who want to build a new garage and renovate their basement and set up their own business downstairs. But it is not completely up to them, is it? What has happened to Jamaicans in Toronto is proof that what has happened to Jamaicans here is not the end of racism, or even the beginning of the end of racism, but an accident of history and geography. In America, there is someone else to despise. In Canada, there is not. In the new racism, as in the old, somebody always has to be the nigger.
The Tipping Point
June 3, 1996
DEPT. OF DISPUTATION
Why is the city suddenly so much safer--
could it be that crime really is an epidemic?
1.
As you drive east on Atlantic Avenue, through the part of New York City that the Police Department refers to as Brooklyn North, the neighborhoods slowly start to empty out: the genteel brownstones of the western part of Brooklyn give way to sprawling housing projects and vacant lots. Bedford-Stuyvesant is followed by Bushwick, then by Brownsville, and, finally, by East New York, home of the Seventy-fifth Precinct, a 5.6-square-mile tract where some of the poorest people in the city live. East New York is not a place of office buildings or parks and banks, just graffiti-covered bodegas and hair salons and auto shops. It is an economically desperate community destined, by most accounts, to get more desperate in the years ahead-which makes what has happened there over the past two and a half years all the more miraculous. In 1993, there were a hundred and twenty-six homicides in the Seven-Five, as the police call it. Last year, there were forty-four. There is probably no other place in the country where violent crime has declined so far, so fast.
Once the symbol of urban violence, New York City is in the midst of a strange and unprecedented transformation. According to the preliminary crime statistics released by the F.B.I. earlier this month, New York has a citywide violent-crime rate that now ranks it a hundred and thirty-sixth among major American cities, on a par with Boise, Idaho. Car thefts have fallen to seventy-one thousand, down from a hundred and fifty thousand as recently as six years ago. Burglaries have fallen from more than two hundred thousand in the early nineteen-eighties to just under seventy-five thousand in 1995. Homicides are now at the level of the early seventies, nearly half of what they were in 1990. Over the past two and a half years, every precinct in the city has recorded double-digit decreases in violent crime. Nowhere, however, have the decreases been sharper than Brooklyn North, in neighborhoods that not long ago were all but written off to drugs and violence. On the streets of the Seven-Five today, it is possible to see signs of everyday life that would have been unthinkable in the early nineties. There are now ordinary people on the streets at dusk-small children riding their bicycles, old people on benches and stoops, people coming out of the subways alone. "There was a time when it wasn't uncommon to hear rapid fire, like you would hear somewhere in the jungle in Vietnam," Inspector Edward A. Mezzadri, who commands the Seventy-fifth Precinct, told me. "You would hear that in Bed-Stuy and Brownsville and, particularly, East New York all the time. I don't hear the gunfire anymore. I've been at this job one year and twelve days. The other night when I was going to the garage to get my car, I heard my first volley. That was my first time."
But what accounts for the drop in crime rates? William J. Bratton-who as the New York City Police Commissioner presided over much of the decline from the fall of 1994 until his resignation, this spring-argues that his new policing strategies made the difference: he cites more coördination between divisions of the N.Y.P.D., more accountability from precinct commanders, more arrests for gun possession, more sophisticated computer-aided analysis of crime patterns, more aggressive crime prevention. In the Seven-Five, Mezzadri has a team of officers who go around and break up the groups of young men who congregate on street corners, drinking, getting high, and playing dice-and so remove what was once a frequent source of violent confrontations. He says that he has stepped up random "safety checks" on the streets, looking for drunk drivers or stolen cars. And he says that streamlined internal procedures mean that he can now move against drug-selling sites in a matter of days, where it used to take weeks. "It's aggressive policing," he says. "It's a no-nonsense attitude. Persistence is not just a word, it's a way of life."
All these changes make good sense. But how does breaking up dice games and streamlining bureaucracy cut murder rates by two-thirds? Many criminologists have taken a broader view, arguing that changes in crime reflect fundamental demographic and social trends-for example, the decline and stabilization of the crack trade, the aging of the population, and longer prison sentences, which have kept hard-core offenders off the streets. Yet these trends are neither particularly new nor unique to New York City; they don't account for why the crime rate has dropped so suddenly here and now. Furthermore, whatever good they have done is surely offset, at least in part, by the economic devastation visited on places like Brownsville and East New York in recent years by successive rounds of federal, state, and city social-spending cuts.
It's not that there is any shortage of explanations, then, for what has happened in New York City. It's that there is a puzzling gap between the scale of the demographic and policing changes that are supposed to have affected places like the Seven-Five and, on the other hand, the scale of the decrease in crime there. The size of that gap suggests that violent crime doesn't behave the way we expect it to behave. It suggests that we need a new way of thinking about crime, which is why it may be time to turn to an idea that has begun to attract serious attention in the social sciences: the idea that social problems behave like infectious agents. It may sound odd to talk about the things people do as analogous to the diseases they catch. And yet the idea has all kinds of fascinating implications. What if homicide, which we often casually refer to as an epidemic, actually is an epidemic, and moves through populations the way the flu bug does? Would that explain the rise and sudden decline of homicide in Brooklyn North?
2.
When social scientists talk about epidemics, they mean something very specific. Epidemics have their own set of rules. Suppose, for example, that one summer a thousand tourists come to Manhattan from Canada carrying an untreatable strain of twenty-four-hour flu. The virus has a two-per-cent infection rate, which is to say that one out of every fifty people who come into close contact with someone carrying it catches the bug himself. Let's say that fifty is also exactly the number of people the average Manhattanite-in the course of riding the subways and mingling with colleagues at work-comes into contact with every day. What we have, then, given the recovery rate, is a disease in equilibrium. Every day, each carrier passes on the virus to a new person. And the next day those thousand newly infected people pass on the virus to another thousand people, so that throughout the rest of the summer and the fall the flu chugs along at a steady but unspectacular clip.
But then comes the Christmas season. The subways and buses get more crowded with tourists and shoppers, and instead of running into an even fifty people a day, the average Manhattanite now has close contact with, say, fifty-five people a day. That may not sound like much of a difference, but for our flu bug it is critical. All of a sudden, one out of every ten people with the virus will pass it on not just to one new person but to two. The thousand carriers run into fifty-five thousand people now, and at a two-per-cent infection rate that translates into eleven hundred new cases the following day. Some of those eleven hundred will also pass on the virus to more than one person, so that by Day Three there are twelve hundred and ten Manhattanites with the flu and by Day Four thirteen hundred and thirty-one, and by the end of the week there are nearly two thousand, and so on up, the figure getting higher every day, until Manhattan has a full-blown flu epidemic on its hands by Christmas Day.
In the language of epidemiologists, fifty is the "tipping point" in this epidemic, the point at which an ordinary and stable phenomenon-a low-level flu outbreak- can turn into a public-health crisis. Every epidemic has its tipping point, and to fight an epidemic you need to understand what that point is. Take AIDS, for example. Since the late eighties, the number of people in the United States who die of AIDS every year has been steady at forty thousand, which is exactly the same as the number of people who are estimated to become infected with H.I.V. every year. In other words, AIDS is in the same self-perpetuating phase that our Canadian flu was in, early on; on the average, each person who dies of AIDS infects, in the course of his or her lifetime, one new person.
That puts us at a critical juncture. If the number of new infections increases just a bit-if the average H.I.V. carrier passes on the virus to slightly more than one person-then the epidemic can tip upward just as dramatically as our flu did when the number of exposed people went from fifty to fifty-five. On the other hand, even a small decrease in new infections can cause the epidemic to nosedive. It would be as if the number of people exposed to our flu were cut from fifty to forty-five a day-a change that within a week would push the number of flu victims down to four hundred and seventy-eight.
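The arithmetic here is easy to check. Below is a minimal sketch in Python, assuming the simple model described above: carriers recover after a day, each meets a fixed number of people daily, and two per cent of those contacts catch the bug. The function name and parameter defaults are mine, added only for illustration.

def flu_cases(contacts, days, carriers=1000, infection_rate=0.02):
    # Each carrier meets `contacts` people a day, infects two per cent of
    # them, and then recovers, so the pool of carriers is simply rescaled
    # by (contacts * infection_rate) every day.
    for _ in range(days):
        carriers = carriers * contacts * infection_rate
    return round(carriers)

print(flu_cases(50, 7))   # 1000 -- equilibrium: the outbreak chugs along
print(flu_cases(55, 7))   # 1949 -- "nearly two thousand" by the end of the week
print(flu_cases(45, 7))   # 478  -- the nosedive to four hundred and seventy-eight

Fed the article's numbers, the same five-person swing in daily contacts produces either the full-blown epidemic or the collapse.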
Nobody really knows what the tipping point for reducing AIDS may be. Donald Des Jarlais, an epidemiologist at Beth Israel Hospital, in Manhattan, estimates that halving new infections to twenty thousand a year would be ideal. Even cutting it to thirty thousand, he says, would probably be enough. The point is that it's not some completely unattainable number. "I think people think that to beat AIDS everybody has to either be sexually abstinent or use a clean needle or a condom all the time," Des Jarlais said. "But you don't really need to completely eliminate risk. If over time you can just cut the number of people capable of transmitting the virus, then our present behavior-change programs could potentially eradicate the disease in this country."
That's the surprising thing about epidemics. They don't behave the way we think they will behave. Suppose, for example, that the number of new H.I.V. infections each year was a hundred thousand, and by some heroic AIDS-education effort you managed to cut that in half. You would expect the size of the epidemic to also be cut in half, right? This is what scientists call a linear assumption-the expectation that every extra increment of effort will produce a corresponding improvement in result. But epidemics aren't linear. Improvement does not correspond directly to effort. All that matters is the tipping point, and because fifty thousand is still above that point, all these heroics will come to naught. The epidemic would still rise. This is the fundamental lesson of nonlinearity. When it comes to fighting epidemics, small changes-like bringing new infections down to thirty thousand from forty thousand-can have huge effects. And large changes-like reducing new infections to fifty thousand from a hundred thousand-can have small effects. It all depends on when and how the changes are made.
The reason this seems surprising is that human beings prefer to think in linear terms. Many expectant mothers, for example, stop drinking entirely, because they've heard that heavy alcohol use carries a high risk of damaging the fetus. They make the perfectly understandable linear assumption that if high doses of alcohol carry a high risk, then low doses must carry a low- but still unacceptable-risk. The problem is that fetal-alcohol syndrome isn't linear. According to one study, none of the sixteen problems associated with fetal-alcohol syndrome show up until a pregnant woman starts regularly consuming more than three drinks a day. But try telling that to a neurotic nineties couple.
I can remember struggling with these same theoretical questions as a child, when I tried to pour ketchup on my dinner. Like all children encountering this problem for the first time, I assumed that the solution was linear: that steadily increasing hits on the base of the bottle would yield steadily increasing amounts of ketchup out the other end. Not so, my father said, and he recited a ditty that, for me, remains the most concise statement of the fundamental nonlinearity of everyday life: "Tomato ketchup in a bottle-None will come and then the lot'll."
3.
What does this have to do with the murder rate in Brooklyn? Quite a bit, as it turns out, because in recent years social scientists have started to apply the theory of epidemics to human behavior. The foundational work in this field was done in the early seventies by the economist Thomas Schelling, then at Harvard University, who argued that "white flight" was a tipping-point phenomenon. Since that time, sociologists have actually gone to specific neighborhoods and figured out what the local tipping point is. A racist white neighborhood, for example, might empty out when blacks reach five per cent of the population. A liberal white neighborhood, on the other hand, might not tip until blacks make up forty or fifty per cent. George Galster, of the Urban Institute, in Washington, argues that the same patterns hold for attempts by governments or developers to turn a bad neighborhood around. "You get nothing until you reach the threshold," he says, "then you get boom."
Another researcher, David Rowe, a psychologist at the University of Arizona, uses epidemic theory to explain things like rates of sexual intercourse among teen-agers. If you take a group of thirteen-year-old virgins and follow them throughout their teen-age years, Rowe says, the pattern in which they first have sex will look like an epidemic curve. Non-virginity starts out at a low level, and then, at a certain point, it spreads from the precocious to the others as if it were a virus.
Some of the most fascinating work, however, comes from Jonathan Crane, a sociologist at the University of Illinois. In a 1991 study in the American Journal of Sociology, Crane looked at the effect the number of role models in a community-the professionals, managers, teachers whom the Census Bureau has defined as "high status"-has on the lives of teen-agers in the same neighborhood. His answer was surprising. He found little difference in teen-pregnancy rates or school-dropout rates in neighborhoods with between forty and five per cent of high-status workers. But when the number of professionals dropped below five per cent, the problems exploded. For black school kids, for example, as the percentage of high-status workers falls just 2.2 percentage points-from 5.6 per cent to 3.4 per cent-dropout rates more than double. At the same tipping point, the rates of childbearing for teen-age girls-which barely move at all up to that point-nearly double as well.
The point made by both Crane and Rowe is not simply that social problems are contagious-that non-virgins spread sex to virgins and that when neighborhoods decline good kids become infected by the attitudes of dropouts and teen-age mothers. Their point is that teen-age sex and dropping out of school are contagious in the same way that an infectious disease is contagious. Crane's study essentially means that at the five-per-cent tipping point neighborhoods go from relatively functional to wildly dysfunctional virtually overnight. There is no steady decline: a little change has a huge effect. The neighborhoods below the tipping point look like they've been hit by the Ebola virus.
It is possible to read in these case studies a lesson about the fate of modern liberalism. Liberals have been powerless in recent years to counter the argument that their policy prescriptions don't work. A program that spends, say, an extra thousand dollars to educate inner-city kids gets cut by Congress because it doesn't raise reading scores. But if reading problems are nonlinear the failure of the program doesn't mean-as conservatives might argue-that spending extra money on inner-city kids is wasted. It may mean that we need to spend even more money on these kids so that we can hit their tipping point. Hence liberalism's crisis. Can you imagine explaining the link between tipping points and big government to Newt Gingrich? Epidemic theory, George Galster says, "greatly complicates the execution of public policy. . . . You work, and you work, and you work, and if you haven't quite reached the threshold you don't seem to get any payoff. That's a very tough situation to sustain politically."
At the same time, tipping points give the lie to conservative policies of benign neglect. In New York City, for example, one round of cuts in, say, subway maintenance is justified with the observation that the previous round of cuts didn't seem to have any adverse consequences. But that's small comfort. With epidemic problems, as with ketchup, nothing comes and then the lot'll.
4.
Epidemic theory, in other words, should change the way we think about whether and why social programs work. Now for the critical question: Should it change the way we think about violent crime as well? This is what a few epidemiologists at the Centers for Disease Control, in Atlanta, suggested thirteen years ago, and at the time no one took them particularly seriously. "There was just a small group of us in an old converted bathroom in the sub-subbasement of Building Three at C.D.C.," Mark L. Rosenberg, who heads the Centers' violence group today, says. "Even within C.D.C., we were viewed as a fringe group. We had seven people and our budget was two hundred thousand dollars. People were very skeptical." But that was before Rosenberg's group began looking at things like suicide and gunshot wounds in ways that had never quite occurred to anyone else. Today, bringing epidemiological techniques to bear on violence is one of the hottest ideas in criminal research. "We've got a hundred and ten people and a budget of twenty-two million dollars," Rosenberg says. "There is interest in this all around the world now."
The public-health approach to crime doesn't hold that all crime acts like infectious disease. Clearly, there are neighborhoods where crime is simply endemic-where the appropriate medical analogy for homicide is not something as volatile as AIDS but cancer, a disease that singles out its victims steadily and implacably. There are, however, times and places where the epidemic model seems to make perfect sense. In the United States between the early sixties and the early seventies, the homicide rate doubled. In Stockholm between 1950 and 1970, rape went up three hundred per cent, murder and attempted murder went up six hundred per cent, and robberies a thousand per cent. That's not cancer; that's AIDS.
An even better example is the way that gangs spread guns and violence. "Once crime reaches a certain level, a lot of the gang violence we see is reciprocal," Robert Sampson, a sociologist at the University of Chicago, says. "Acts of violence lead to further acts of violence. You get defensive gun ownership. You get retaliation. There is a nonlinear phenomenon. With a gang shooting, you have a particular act, then a counter-response. It's sort of like an arms race. It can blow up very quickly."
How quickly? Between 1982 and 1992, the number of gang-related homicides in Los Angeles County handled by the L.A.P.D. and the County Sheriff's Department went from a hundred and fifty-eight to six hundred and eighteen. A more interesting number, however, is the proportion of those murders which resulted from drive-by shootings. Between 1979 and 1986, that number fluctuated, according to no particular pattern, between twenty-two and fifty-one: the phenomenon, an epidemiologist would say, was in equilibrium. Then, in 1987, the death toll from drive-bys climbed to fifty-seven, the next year to seventy-one, and the year after that to a hundred and ten; by 1992, it had reached two hundred and eleven. At somewhere between fifty and seventy homicides, the idea of drive-by shootings in L.A. had become epidemic. It tipped. When these results were published last fall in the Journal of the American Medical Association, the paper was entitled "The Epidemic of Gang-Related Homicides in Los Angeles County from 1979 Through 1994." The choice of the word "epidemic" was not metaphorical. "If this were a disease," H. Range Hutson, the physician who was the leading author on the study, says, "you would see the government rushing down here to assess what infectious organism is causing all these injuries and deaths."
Some of the best new ideas in preventing violence borrow heavily from the principles of epidemic theory. Take, for example, the so-called "broken window" hypothesis that has been used around the country as the justification for cracking down on "quality of life" crimes like public urination and drinking. In a famous experiment conducted twenty-seven years ago by the Stanford University psychologist Philip Zimbardo, a car was parked on a street in Palo Alto, where it sat untouched for a week. At the same time, Zimbardo had an identical car parked in a roughly comparable neighborhood in the Bronx, only in this case the license plates were removed and the hood was propped open. Within a day, it was stripped. Then, in a final twist, Zimbardo smashed one of the Palo Alto car's windows with a sledgehammer. Within a few hours, that car, too, was destroyed. Zimbardo's point was that disorder invites even more disorder-that a small deviation from the norm can set into motion a cascade of vandalism and criminality. The broken window was the tipping point.
The broken-window hypothesis was the inspiration for the cleanup of the subway system conducted by the New York City Transit Authority in the late eighties and early nineties. Why was the Transit Authority so intent on removing graffiti from every car and cracking down on the people who leaped over turnstiles without paying? Because those two "trivial" problems were thought to be tipping points-broken windows-that invited far more serious crimes. It is worth noting that not only did this strategy seem to work-since 1990, felonies have fallen more than fifty per cent-but one of its architects was the then chief of the Transit Police, William Bratton, who was later to take his ideas about preventing crime to the city as a whole when he became head of the New York Police Department.
Which brings us to North Brooklyn and the Seventy-fifth Precinct. In the Seven-Five, there are now slightly more officers than before. They stop more cars. They confiscate more guns. They chase away more street-corner loiterers. They shut down more drug markets. They have made a series of what seem, when measured against the extraordinary decline in murders, to be small changes. But it is the nature of nonlinear phenomena that sometimes the most modest of changes can bring about enormous effects. What happened to the murder rate may not be such a mystery in the end. Perhaps what William Bratton and Inspector Mezzadri have done is the equivalent of repairing the broken window or preventing that critical ten or fifteen thousand new H.I.V. infections. Perhaps Brooklyn-and with it New York City-has tipped.
Conquering the Coma
July 8, 1996
ANNALS OF MEDICINE
What does it take to save the life of a coma patient like the Central Park victim? Not a miracle, as her family and doctor explain.
1.
On the afternoon of Tuesday, June 4th, a young woman was taken by ambulance from Central Park to New York Hospital, on the Upper East Side. When she arrived in the emergency room, around four o'clock, she was in a coma, and she had no identification. Her head was, in the words of one physician, "the size of a pumpkin." She was bleeding from her nose and her left ear. Her right eye was swollen shut, and the bones above the eye were broken and covered by a black-and-blue bruise.
Within minutes, she was put on a ventilator and then given X-rays and a cat scan. A small hole was drilled in her skull and a slender silicone catheter inserted, to drain cerebral spinal fluid and relieve the pressure steadily building in her brain. At midnight, after that pressure had risen precipitously, a neurosurgeon removed a blood clot from her right frontal cortex. A few hours later, Urgent Four-as the trauma-unit staff named her, because she was the fourth unidentified trauma patient in the hospital at that time-was wheeled from the operating room to an intensive-care bed overlooking the East River. She had staples in her scalp from the operation, and her chest and arms and fingers were hooked up, via a maze of intravenous lines and cables, to monitors registering heart rate, arterial blood pressure, and blood-oxygen saturation. She had special inflatable cuffs on her legs to prevent the formation of blood clots, and splints on her ankles, since coma patients tend to point their toes. Four days later, after she was identified and her parents and her two sisters arrived at her bedside, one of the first things the family did was to put two large pictures of her on the wall above her bed-one of her holding her niece, and the other of her laughing and leaning through a doorway-just so people would know what she really looked like.
Urgent Four-or the Central Park victim, as she became known during the spate of media attention that surrounded her case-came close to dying on two occasions. Each time, she fought back. On Wednesday, June 12th, eight days after entering the hospital, she opened one blue eye. The mayor of New York, Rudolph Giuliani, was paying one of his daily visits to her room at the time, and she looked directly at him. Several days later, she began tracking people with her one good eye as they came in and out of her room. She began to frown and smile. On June 19th, the neurosurgeon supervising her care leaned over her bed, pinched her to get her attention, and asked, "Can you open your mouth?" She opened her mouth. He said, "Is your name --?" She nodded and mouthed her name.
"I'm at the foot of her bed, her cheering squad," her mother recalls. She is a striking woman, with thick black hair and luminous eyes, and her voice grows animated with the memory. " 'Go! Go! You're doing great! Go!' And the doctor says, 'Do you want me to pinch her again?' And I'm yelling at her. I'm telling her, 'Say no, get out of here, go!' "
And her daughter mouthed, "Go."
2.
There is something compelling about such stories of medical recovery, and something undeniably moving about a young woman fighting back from the most devastating of injuries. In the days following the Central Park beating, the case assumed national proportions as the police frantically worked to locate Urgent Four's family and identify her attacker. The victim turned out to be a talented musician, a piano teacher beloved by her students. Her alleged assailant turned out to be a strange and deeply disturbed unemployed salesclerk, who veered off into Eastern mysticism during his interrogation by the police. The story also had a hero, in Jam Ghajar, the man who saved her life: a young and handsome neurosurgeon with an M.D. and a Ph.D., a descendant of Iranian royalty who has an athlete's walk, strong, beautiful hands, and ten medical-device patents to his name. If this were the movies, Ghajar would be played by Andy Garcia.
But the story of Urgent Four is not the standard tale of the triumph of medicine and the human spirit. To think of this as an episode of "E.R." is to diminish it. The typical narratives of recovery are about exceptional people in exceptional circumstances, and that is why the narratives are both irresistible and, finally, less than consoling. Brilliant doctors and new technology can work miracles. But what if your doctor isn't brilliant and your hospital doesn't have the newest technology? If the principal failing of the American medical system is that it provides one standard of care for the fortunate and another for everyone else, the typical story of medical triumph ends up as a kind of indictment, a reminder that miracles are apportioned by privilege and position.
The case of Urgent Four is different. The profession that saved her life is in the midst of an ambitious transformation, an attempt to insure that you do not have to be ten minutes away from one of the best hospitals in the country in order to survive a vicious beating. One of the leaders of the movement, in fact, is the doctor who saved Urgent Four's life, and he has held up the care she received as an example of what ought to be routine in the treatment of brain injury. That makes the lesson of the Central Park victim and her remarkable recovery exactly the opposite of the lesson of the heroic medical dramas on television. Recovery need not be remarkable. The real medical miracle is the kind that can be repeated over and over again.
3.
When Urgent Four was attacked in Central Park, her assailant smashed her forehead on the smooth, hard surface of the sidewalk with such force that the bones above her right eye were shattered. Then, as if he didn't consider his task completed, he turned her over and began again, pounding the back of her head against the ground hard enough to fracture her skull behind her left ear. The ferocity-and the thoroughness-of the attack bruised the muscles between her scalp and her skull, causing her scalp to swell. What was more serious was that in response to the trauma her brain also began to swell, pressing up against the inside of her skull. In any trauma patient, this swelling, which increases what is known as intracranial pressure (ICP), is the neurosurgeon's chief concern, because the continuing rise in ICP makes it harder and harder for the body to supply the brain with an adequate amount of blood. Upon autopsy, ninety per cent of coma patients show clear signs of stroke: their brains quite literally starved to death.
This is why when Urgent Four was brought to the New York Hospital-Cornell Medical Center complex the neurosurgical resident on duty inserted a catheter through her skull to siphon off excess cerebral spinal fluid. In cases of trauma, this clear fluid, in which the brain floats, flows into a cavity that is called the ventricle, in the center of the brain, and the hope was to empty the ventricle, reducing the pressure inside the skull. This is also why the trauma staff kept a very close eye on the pressure gauge attached to that catheter during the first few hours after Urgent Four was admitted. According to the index used by neurologists, a healthy person's ICP is between zero and ten. Urgent Four's was at twenty, which is high but not disastrous. A further rise, however, would put her in the danger zone. At nine o'clock Tuesday night, that is exactly what happened: Urgent Four's ICP abruptly surged into the fifties.
The physician in charge of her case, Jam Ghajar, is, at forty-four, one of the country's leading neurotrauma specialists. On his father's side, he is descended from the family that ruled Persia from the late seventeen-hundreds until 1925, and his grandfather on his mother's side was the Shah of Iran's personal physician. Neurosurgeons, Ghajar says, with a smile, are "overachievers," and the description fits him perfectly. As a seventeen-year-old, he was a volunteer at U.C.L.A.'s Brain Research Institute. As a first-year resident at New York Hospital, he invented a device--a tiny tripod to guide the insertion of ventricular catheters--that made the cover of the Journal of Neurosurgery. Today, Ghajar is the chief of neurosurgery at Jamaica Hospital, in Queens. He is also the president of the Aitken Neuroscience Institute, in Manhattan, a research group that grew out of the double tragedy experienced by the children of Sunny von Bülow, who lost not only their mother to coma but also their father, Prince Alfred von Auersperg, after a car accident, thirteen years ago. Most days, Ghajar drives back and forth between the hospital and the institute, juggling his research at Aitken with a clinical schedule that keeps him on call two weeks out of every four. "Jam is completely committed--he's got a razor-sharp focus," Sunny von Bülow's daughter, Ala Isham, told me. "He's godfather to my son. I always joke that we should carry little cards in our wallets saying that if anything happens to us call Jam Ghajar."
Ghajar spent all day Tuesday, June 4th, at Jamaica Hospital. In the evening, he returned to the Aitken Neuroscience Institute, where a colleague, Michael Lavyne, told him of the young woman hovering near death across the street at New York Hospital. At seven o'clock, Ghajar left his office for the hospital. Two hours later, with Urgent Four's ICP at dangerous levels, he ordered a second cat scan, which immediately identified the culprit: the bruise on her right frontal cortex had given rise to a massive clot. At midnight, Ghajar drilled a small hole in her skull, cut out a chunk three inches in diameter with a zip saw, and, he said, "this big brain hemorrhage just came out-plop-like a big piece of black jelly."
But the task was only half finished. The rule of thumb for a trauma patient is that the blood pressure has to be kept at least seventy points higher than the ICP or the flow of oxygen and nutrients to the brain will be impaired. Even after Urgent Four's clot was removed, her differential was only fifty points. At the same time, however, her heart was racing at a hundred and eighty beats per minute. This made raising her blood pressure tricky. "We're standing around her bed," Ghajar recalls. "It's four in the morning. There's Dr. Fischer"-Eva Fischer, the group-care physician-"there's three surgical residents, there's myself, there's a chief resident from neurosurgery, and then two nurses, and we're all standing around her trying to figure out what the best drug would be to reduce her pulse and increase her blood pressure at the same time." It took three hours-and two different blood-pressure medicines-to get Urgent Four out of the danger zone. It was 7 a.m. when Ghajar left her bedside and began neurosurgery rounds.
4.
The identity of Urgent Four did not become known until the next day, Thursday. By a series of flukes, no one in her family had even suspected that she was missing. Her older sister, whom I will call Jane, had been with her the previous Saturday night, when she played in a concert. The two sisters, who share a birthday, spoke on the phone on Monday afternoon, and it wasn't unusual for several days to pass between conversations. Nor did the news of the attack, when it became public, make much of an impression on Jane: her car radio was broken, and, because she was busy with work, she had no time to read the newspaper. Her parents, meanwhile, were travelling in Utah, and were equally oblivious. "It was the first vacation we'd ever taken where we hadn't read a newspaper," her mother told me. "Or watched the news."
On Thursday, however, one of Urgent Four's piano students showed up for her weekly lesson, and when her teacher didn't arrive the student remembered seeing drawings of the Central Park victim that had been posted on buildings and mailboxes throughout Manhattan, and she began to wonder. She called the police. They searched the woman's apartment, on Fifty-seventh Street, and learned her parents' address, in New Jersey. Upon finding that they were away, the police telephoned Jane, at her home, also in New Jersey, using as a guide the return address on a letter Jane had written to her sister. It was one-thirty Friday morning.
"I got a call from the police, which I didn't believe, of course," Jane said. She is a graceful woman, with shoulder- length black hair and a hint of a Jersey accent. "I thought it was a prank call, and I thought I was being stalked. They asked me my name and if I had a sister with that name, and I was almost rude to them on the phone, because I thought it was someone playing a joke on me. Then they referred to this incident, and I had no idea what they were talking about. At that point, my husband ran and got the newspaper, because he had been following the story and had seen the sketch. I got off the phone and had to fight collapsing. The captain probably sensed that. He said, 'Can you come? I'll send you an escort.' And then he called back a little while later and said, 'Would you be willing to ride in a helicopter to get here?' "
At 3 a.m., she and her husband landed in Manhattan. They were taken immediately by police car to the hospital, and there they were greeted by Mayor Giuliani and Howard Safir, the police commissioner. "They probably spent twenty minutes trying to let me understand what had happened and prepare me, and I ended up saying, 'Don't bother trying to prepare me. It's not going to work.' The anticipation was awful. And when I saw her, of course, the effect was indescribable."
The next to arrive was the family's youngest daughter, who came by car with her husband later on Friday morning. At midnight Friday, the parents arrived. The police had tracked them down by tracing their rental-car registration and then sending the Utah police cruising through motel parking lots in and around Zion National Park to spot the corresponding license plate. "At one point after they found us," her father told me, "we drove through a town in Utah which had my mother's name. Both of us burst out crying. My mother was pretty close to her. So we took that as an omen that she would be looking over her." Jane said that when she first saw the patient at the hospital she knew immediately she was her sister. But her father said that if he had not been told who she was he would never have known her. "To me she was almost unrecognizable," he said.
I met with Urgent Four's family-her mother, father, and older sister-in Jam Ghajar's office, on East Seventy-second Street, two days after she first began to speak. Her parents have been together for thirty-eight years, and have the easy affection of the well-married. The father, trim and gray-haired, is an engineer by training, with the discipline and plainspokenness characteristic of that profession. His wife is a schoolteacher, intelligent and articulate. They spoke with me on the condition that the personal details of their lives be kept private, and they confined their conversation to details of the case which they considered germane: their religious faith, their admiration for Dr. Ghajar's medical team, their hopes for their daughter's recovery. It was an intense and moving conversation. Over the past three weeks, the family has fashioned a protective cocoon for themselves, refusing to read any of the press accounts of Urgent Four's assailant, and barely leaving her hospital room except to rest and eat. This was the first time they had talked to the outside world, and long-pent-up feelings and thoughts came out in a rush.
"We went for days on two hours' sleep," her mother said. "You don't feel as tired, because you're so wound up. You want it all to be over. You want to wake up and know it's over-and it's not." The mother seemed the most shaken and most exhausted of the three. At one point as we talked, she accidentally referred to her daughter in the past tense, saying, "She was-"
"Is," Jane interrupted. "Is."
5.
Had Urgent Four been taken to a smaller hospital, or to any of the thousands of trauma centers in America which do not specialize in brain injuries, the chances are that she would have been dead by the time any of her family arrived. This is what trauma experts who are familiar with the case believe, and, of the many lessons of the Central Park beating, it is the one that is hardest to understand. It's not, after all, as if Urgent Four were suffering from a rare and difficult brain tumor. Brain trauma is the leading cause of death due to injury for Americans under forty-five, and results in the death of some sixty thousand people every year. Nor is it as if Urgent Four had been given some kind of daring experimental therapy, available only at the most exclusive research hospitals. The insertion of the ventricular catheter is something that all neurosurgeons are taught to do in their first year of residency. CAT scanners are in every hospital. The removal of Urgent Four's blood clots was straightforward neurosurgery. The raising and monitoring of blood pressure are taught in Nursing 101. Urgent Four was treated according to standards and protocols that have been discussed in the medical literature, outlined at conferences, and backed by every expert in the field.
Yet the fact is that if she had been taken to a smaller hospital or to any one of the thousands of trauma centers in America which do not specialize in brain injuries she would have been treated very differently. When Ghajar and five other researchers surveyed the country's trauma centers five years ago, they found that seventy-nine per cent of the coma patients were routinely given steroids, despite the fact that steroids have been shown repeatedly to be of no use-and possibly of some harm-in reducing intracranial pressure. Ninety-five per cent of the centers surveyed were relying as well on hyperventilation, in which a patient is made to breathe more rapidly to reduce swelling-a technique that specialists like Ghajar will use only as a last resort. Prolonged hyperventilation does reduce ICP, but it can also end up reducing the flow of blood to the brain, causing irreversible brain damage. The most troubling finding, however, was that only a third of the trauma centers surveyed said that they routinely monitored ICP at all. In another hospital, the surge in Urgent Four's ICP on Tuesday night which signalled the formation of a blood clot might not have been caught.
Such dramatic variations in medical practice are hardly confined to neurosurgery. It is not unusual for doctors in one community to perform hysterectomies, say, at two or three times the rate of doctors in another town. Rates for some cardiac procedures differ around the country by as much as fifty per cent. Obstetrical specialists are almost twice as likely to deliver children by cesarean section as family physicians are. In one classic study published seven years ago, a team of researchers found that children in Boston were 3.8 times as likely to be hospitalized for asthma as children in Rochester, New York; 6.1 times as likely to be hospitalized for accidental poisoning; and 2.6 times as likely to be hospitalized for head injury.
In most cases, however, the concern about practice variation has focussed on the issue of cost. The point of the Boston-Rochester study was not that the children of Boston were receiving considerably better care than their counterparts in upstate New York but, rather, that health care for children in Boston might well be needlessly expensive. When it comes to brain injury, the stakes are a little higher. At the handful of centers around the country specializing in brain trauma, it is now not unusual for the mortality rates of coma patients to run in the range of twenty per cent or less. At trauma centers where brain injury is not a specialty, mortality rates for coma patients are often twice that. "If I break my leg, I don't care where I go," Randall Chesnut, a trauma specialist at San Francisco General Hospital, told me. "But, if I hit my head, I want to choose my hospital."
Part of the problem is that in the field of neurosurgery it has been difficult to reach hard, scientific conclusions about procedures and treatments. Physicians in the field have long assumed, for example, that blood clots in the brain should be removed as soon as possible. But how could that assumption ever be scientifically verified? Who would ever agree to let a comatose family member lie still with a mass of congealed blood in the brain while a team of curious researchers watched to see what happened? The complexity and mystery of the brain has, moreover, led to a culture that rewards intuition, and has thus convinced each neurosurgeon that his own experience is as valid as anyone else's. Worse, brain injury is an area that is of no more than passing interest to many neurosurgeons. Most neurosurgeons make their living doing disk surgery and removing brain tumors. Trauma is an afterthought. It doesn't pay particularly well, because many car-accident and shooting victims don't have insurance. (Urgent Four herself was without insurance, and a public collection has been made to help defray her medical expenses.) Nor does it pose the kind of surgical challenge that, say, an aneurysm or a tumor does. "It's something like-well, you've got mashed-up brains, and someone got hit by a car, and it's not really very interesting," Ghajar says. "But brain tumors are kind of interesting. What's happening with the DNA? Why does a tumor develop?"
Then, there are the hours, long and unpredictable, tied to the rhythms of street thugs and drunk drivers. Ghajar, for example, routinely works through the night. He practices primarily out of Jamaica Hospital, not the far more prestigious New York Hospital, because Jamaica gets serious brain-trauma cases every second day and New York might get one only every second week. "If I were operating and doing disks and brain tumors, I'd be making ten times as much," he says. In the entire country, there are probably no more than two dozen neurosurgeons who, like Ghajar, exclusively focus on researching and treating brain trauma.
Ghajar says that in talking to other neurosurgeons he sensed a certain resignation in treating brain injury-a feeling that the prognosis facing coma patients was so poor that the neurosurgeon's role was limited. "It wasn't that the neurosurgeons were lazy," Ghajar said. "It was just that there was so much information out there that it was confusing. When they got young people in comas, half of the patients would die. And the half that lived would be severely disabled, so the neurosurgeon is saying, 'What am I doing for these people? Am I saving vegetables?' And that was honestly the feeling that neurosurgeons had, because the methods they were trained in and were using would produce that kind of result."
Three years ago, after a neurosurgery meeting in Vancouver, Ghajar-along with Randall Chesnut and Donald W. Marion, a brain-trauma specialist at the University of Pittsburgh-decided to act. For help they turned to the Brain Trauma Foundation, which is the education arm of the brain-trauma institute started by Sunny von Bülow's children. The foundation gathered some of the world's top brain-injury specialists together for eleven meetings between the winter of 1994 and last summer. Four thousand scientific papers covering fourteen aspects of brain-injury management were reviewed. Peter C. Quinn, the executive director of the Brain Trauma Foundation, who coördinated the effort, says, "Sometimes I felt I was in a courtroom drama, because what they did was argue the evidence of the scientific documents, and as soon as someone said, 'It's been my experience,' everyone would say, 'Oh, no, that won't cut it. We want to know what the evidence is.' They would come in on a Friday and work all day Saturday and Sunday. They'd work a twenty-hour weekend. It was gruelling."
In March of this year, the group produced a book-a blue three-ring binder with fifteen bright-colored chapter tabs-laying out the scientific evidence and state-of-the-art treatment in every phase of brain-trauma care. The guidelines represent the first successful attempt by the neurosurgery community to come up with a standard treatment protocol, and if they are adopted by anything close to a majority of the country's trauma centers they could save more than ten thousand lives a year. A copy has now been sent to every neurosurgeon in the country. The Brain Trauma Foundation has mailed the guidelines to scientific journals, hospitals, managed-care groups, and insurance companies, and the neurosurgeons involved with the project have been promoting their work at medical meetings around the country. This is why the story of the Central Park victim does not end the way most medical dramas end, in empty celebration of heroics and exceptionalism, but instead has become a powerful symbol of the campaign to reform neurosurgery. For everything Jam Ghajar used to save Urgent Four's life is in that binder.
"What we are hoping is that if a woman gets hurt in the middle of rural Wyoming, and there is a neurosurgeon there and a hospital with an I.C.U., then she will have as good a chance to survive as she would in the middle of New York City," I was told by Jack Wilberger, Jr., who is an associate professor of neurosurgery at the University of Pittsburgh Medical Center and a member of the guidelines team. "That's what we're hoping for. To give everyone the same chance, to give a everyone a level playing field."
6.
Urgent Four had one more scare before she began her climb toward recovery. Late Sunday night, her ICP began to rise again, back up into the thirties. Ghajar, who was in Paris meeting with the World Health Organization about the brain-trauma guidelines and was calling in to the hospital residents for updates, began to get worried. He booked a flight home. While he was in the air, Urgent Four's condition worsened. A third cat scan was ordered, and it showed that she had developed a second clot-this time on her left temporal lobe, in the place behind her ear where her attacker had banged the back of her head. This clot was far more serious than the first, because the temporal lobe is the seat of comprehension, and to remove the clot might well risk damaging Urgent Four's ability both to speak and to understand. "At about twelve-thirty, quarter to one on Monday, there was a pounding on the door of our room," the patient's father said. "We were wanted back on the floor, and we had to make a decision within a very few minutes on whether they should operate. What we were given was: If you don't operate, she might die. The other side of it was that if they did operate it could save her life but with a decent likelihood that she might be very badly impaired. So we and our two daughters went back and thrashed it out and we unanimously decided to go forward."
It was by then one-thirty in the morning. For four hours, the family waited, sleepless and exhausted, terrified that they had made the wrong decision. At dawn, the surgeon filling in for Ghajar, Michael Lavyne, emerged from the operating room. A miracle had happened, he reported: as soon as an incision was made, the clot had just popped out, all on its own. "They got lucky," Ghajar says.
From that point, Urgent Four's progress was steady. Her eye opened. Then she began to talk. The swelling around her face receded. Her ICP became normal. Soon she was sitting up. By last week, she was working with a speech therapist, and Ghajar and her father had begun driving around the New York area looking for a good rehabilitation center.
"Yesterday, she was looking at me, and I said, 'You know, you had a bad accident, and your brain was bruised'-I'd told everyone not to tell her she was assaulted. 'Your brain was bruised, and you are recovering.' She looked at me and she frowned. Her eye went up with this 'Oh, really?' look. And I said, 'Do you remember your accident?' She shook her head. But it's too early. Sometimes they do." Ghajar went on, "We are very good at predicting outcome, in the sense of mortality, but we're not good at predicting functional outcome, which is the constant question for this patient. 'Is she going to be able to play the piano?' We still can't answer that question."
In his first week back on call after the Urgent Four case, Ghajar saw three new coma patients. The latest was a thirty-year-old man who had barely survived a serious car accident. He was in worse shape than Urgent Four had been, with a hemorrhage on top of his brain. He was admitted to Jamaica Hospital on Monday at 11 p.m., and Ghajar operated from midnight to 6 a.m. He inserted a catheter in the patient's skull to drain the spinal fluid and monitored his blood pressure, to make sure it was seventy points higher than his ICP. Then, that evening-fourteen hours later-the patient's condition worsened. "I had to go back in and take out the hemorrhages," Ghajar said, and there was a note of exhaustion in his voice. He left the hospital at one o'clock Wednesday morning.
"People want to personalize this," Ghajar said. He was on Seventy-second Street, outside his office, walking back to New York Hospital to visit Urgent Four. "I guess that's human nature. They want to say, 'It's Dr. Ghajar's protocol. He's a wonderful doctor.' But that's not it. These are standards developed according to the best available science. These are standards that everyone can use."
The Science of Shopping
November 4, 1996
A REPORTER AT LARGE
The American shopper has never been so fickle. What are stores, including the new flagship designer boutiques, doing about it? Applying science.
1.
Human beings walk the way they drive, which is to say that Americans tend to keep to the right when they stroll down shopping-mall concourses or city sidewalks. This is why in a well-designed airport travellers drifting toward their gate will always find the fast-food restaurants on their left and the gift shops on their right: people will readily cross a lane of pedestrian traffic to satisfy their hunger but rarely to make an impulse buy of a T-shirt or a magazine. This is also why Paco Underhill tells his retail clients to make sure that their window displays are canted, preferably to both sides but especially to the left, so that a potential shopper approaching the store on the inside of the sidewalk-the shopper, that is, with the least impeded view of the store window-can see the display from at least twenty-five feet away.
Of course, a lot depends on how fast the potential shopper is walking. Paco, in his previous life, as an urban geographer in Manhattan, spent a great deal of time thinking about walking speeds as he listened in on the great debates of the nineteen-seventies over whether the traffic lights in midtown should be timed to facilitate the movement of cars or to facilitate the movement of pedestrians and so break up the big platoons that move down Manhattan sidewalks. He knows that the faster you walk the more your peripheral vision narrows, so you become unable to pick up visual cues as quickly as someone who is just ambling along. He knows, too, that people who walk fast take a surprising amount of time to slow down-just as it takes a good stretch of road to change gears with a stick-shift automobile. On the basis of his research, Paco estimates the human downshift period to be anywhere from twelve to twenty-five feet, so if you own a store, he says, you never want to be next door to a bank: potential shoppers speed up when they walk past a bank (since there's nothing to look at), and by the time they slow down they've walked right past your business. The downshift factor also means that when potential shoppers enter a store it's going to take them from five to fifteen paces to adjust to the light and refocus and gear down from walking speed to shopping speed-particularly if they've just had to navigate a treacherous parking lot or hurry to make the light at Fifty-seventh and Fifth. Paco calls that area inside the door the Decompression Zone, and something he tells clients over and over again is never, ever put anything of value in that zone- not shopping baskets or tie racks or big promotional displays- because no one is going to see it. Paco believes that, as a rule of thumb, customer interaction with any product or promotional display in the Decompression Zone will increase at least thirty per cent once it's moved to the back edge of the zone, and even more if it's placed to the right, because another of the fundamental rules of how human beings shop is that upon entering a store-whether it's Nordstrom or K mart, Tiffany or the Gap-the shopper invariably and reflexively turns to the right. Paco believes in the existence of the Invariant Right because he has actually verified it. He has put cameras in stores trained directly on the doorway, and if you go to his office, just above Union Square, where videocassettes and boxes of Super-eight film from all his work over the years are stacked in plastic Tupperware containers practically up to the ceiling, he can show you reel upon reel of grainy entryway video-customers striding in the door, downshifting, refocussing, and then, again and again, making that little half turn.
Paco Underhill is a tall man in his mid-forties, partly bald, with a neatly trimmed beard and an engaging, almost goofy manner. He wears baggy khakis and shirts open at the collar, and generally looks like the academic he might have been if he hadn't been captivated, twenty years ago, by the ideas of the urban anthropologist William Whyte. It was Whyte who pioneered the use of time-lapse photography as a tool of urban planning, putting cameras in parks and the plazas in front of office buildings in midtown Manhattan, in order to determine what distinguished a public space that worked from one that didn't. As a Columbia undergraduate, in 1974, Paco heard a lecture on Whyte's work and, he recalls, left the room "walking on air." He immediately read everything Whyte had written. He emptied his bank account to buy cameras and film and make his own home movie, about a pedestrian mall in Poughkeepsie. He took his "little exercise" to Whyte's advocacy group, the Project for Public Spaces, and was offered a job. Soon, however, it dawned on Paco that Whyte's ideas could be taken a step further-that the same techniques he used to establish why a plaza worked or didn't work could also be used to determine why a store worked or didn't work. Thus was born the field of retail anthropology, and, not long afterward, Paco founded Envirosell, which in just over fifteen years has counselled some of the most familiar names in American retailing, from Levi Strauss to Kinney, Starbucks, McDonald's, Blockbuster, Apple Computer, A.T. & T., and a number of upscale retailers that Paco would rather not name. When Paco gets an assignment, he and his staff set up a series of video cameras throughout the test store and then back the cameras up with Envirosell staffers-trackers, as they're known-armed with clipboards. Where the cameras go and how many trackers Paco deploys depends on exactly what the store wants to know about its shoppers. Typically, though, he might use six cameras and two or three trackers, and let the study run for two or three days, so that at the end he would have pages and pages of carefully annotated tracking sheets and anywhere from a hundred to five hundred hours of film. These days, given the expansion of his business, he might tape fifteen thousand hours in a year, and, given that he has been in operation since the late seventies, he now has well over a hundred thousand hours of tape in his library. Even in the best of times, this would be a valuable archive. But today, with the retail business in crisis, it is a gold mine. The time per visit that the average American spends in a shopping mall was sixty-six minutes last year-down from seventy-two minutes in 1992-and is the lowest number ever recorded. The amount of selling space per American shopper is now more than double what it was in the mid-seventies, meaning that profit margins have never been narrower, and the costs of starting a retail business-and of failing-have never been higher. In the past few years, countless dazzling new retailing temples have been built along Fifth and Madison Avenues- Barneys, Calvin Klein, Armani, Valentino, Banana Republic, Prada, Chanel, Nike Town, and on and on-but it is an explosion of growth based on no more than a hunch, a hopeful multimillion-dollar gamble that the way to break through is to provide the shopper with spectacle and more spectacle. "The arrogance is gone," Millard Drexler, the president and CEO of the Gap, told me. "Arrogance makes failure. 
Once you think you know the answer, it's almost always over." In such a competitive environment, retailers don't just want to know how shoppers behave in their stores. They have to know. And who better to ask than Paco Underhill, who in the past decade and a half has analyzed tens of thousands of hours of shopping videotape and, as a result, probably knows more about the strange habits and quirks of the species Emptor americanus than anyone else alive?
2.
Paco is considered the originator, for example, of what is known in the trade as the butt-brush theory-or, as Paco calls it, more delicately, le facteur bousculade-which holds that the likelihood of a woman's being converted from a browser to a buyer is inversely proportional to the likelihood of her being brushed on her behind while she's examining merchandise. Touch-or brush or bump or jostle-a woman on the behind when she has stopped to look at an item, and she will bolt. Actually, calling this a theory is something of a misnomer, because Paco doesn't offer any explanation for why women react that way, aside from venturing that they are "more sensitive back there." It's really an observation, based on repeated and close analysis of his videotape library, that Paco has transformed into a retailing commandment: a women's product that requires extensive examination should never be placed in a narrow aisle.
Paco approaches the problem of the Invariant Right the same way. Some retail thinkers see this as a subject crying out for interpretation and speculation. The design guru Joseph Weishar, for example, argues, in his magisterial "Design for Effective Selling Space," that the Invariant Right is a function of the fact that we "absorb and digest information in the left part of the brain" and "assimilate and logically use this information in the right half," the result being that we scan the store from left to right and then fix on an object to the right "essentially at a 45 degree angle from the point that we enter." When I asked Paco about this interpretation, he shrugged, and said he thought the reason was simply that most people are right-handed. Uncovering the fundamentals of "why" is clearly not a pursuit that engages him much. He is not a theoretician but an empiricist, and for him the important thing is that in amassing his huge library of in- store time-lapse photography he has gained enough hard evidence to know how often and under what circumstances the Invariant Right is expressed and how to take advantage of it.
What Paco likes are facts. They come tumbling out when he talks, and, because he speaks with a slight hesitation-lingering over the first syllable in, for example, "re-tail" or "de-sign"-he draws you in, and you find yourself truly hanging on his words. "We have reached a historic point in American history," he told me in our very first conversation. "Men, for the first time, have begun to buy their own underwear." He then paused to let the comment sink in, so that I could absorb its implications, before he elaborated: "Which means that we have to totally rethink the way we sell that product." In the parlance of Hollywood scriptwriters, the best endings must be surprising and yet inevitable; and the best of Paco's pronouncements take the same shape. It would never have occurred to me to wonder about the increasingly critical role played by touching-or, as Paco calls it, petting-clothes in the course of making the decision to buy them. But then I went to the Gap and to Banana Republic and saw people touching and fondling and, one after another, buying shirts and sweaters laid out on big wooden tables, and what Paco told me-which was no doubt based on what he had seen on his videotapes-made perfect sense: that the reason the Gap and Banana Republic have tables is not merely that sweaters and shirts look better there, or that tables fit into the warm and relaxing residential feeling that the Gap and Banana Republic are trying to create in their stores, but that tables invite-indeed, symbolize-touching. "Where do we eat?" Paco asks. "We eat, we pick up food, on tables."
Paco produces for his clients a series of carefully detailed studies, totalling forty to a hundred and fifty pages, filled with product-by-product breakdowns and bright-colored charts and graphs. In one recent case, he was asked by a major clothing retailer to analyze the first of a new chain of stores that the firm planned to open. One of the things the client wanted to know was how successful the store was in drawing people into its depths, since the chances that shoppers will buy something are directly related to how long they spend shopping, and how long they spend shopping is directly related to how deep they get pulled into the store. For this reason, a supermarket will often put dairy products on one side, meat at the back, and fresh produce on the other side, so that the typical shopper can't just do a drive-by but has to make an entire circuit of the store, and be tempted by everything the supermarket has to offer. In the case of the new clothing store, Paco found that ninety-one per cent of all shoppers penetrated as deep as what he called Zone 4, meaning more than three-quarters of the way in, well past the accessories and shirt racks and belts in the front, and little short of the far wall, with the changing rooms and the pants stacked on shelves. Paco regarded this as an extraordinary figure, particularly for a long, narrow store like this one, where it is not unusual for the rate of penetration past, say, Zone 3 to be under fifty per cent. But that didn't mean the store was perfect-far from it. For Paco, all kinds of questions remained.
Purchasers, for example, spent an average of eleven minutes and twenty-seven seconds in the store, nonpurchasers two minutes and thirty-six seconds. It wasn't that the nonpurchasers just cruised in and out: in those two minutes and thirty-six seconds, they went deep into the store and examined an average of 3.42 items. So why didn't they buy? What, exactly, happened to cause some browsers to buy and other browsers to walk out the door?
Then, there was the issue of the number of products examined. The purchasers were looking at an average of 4.81 items but buying only 1.33 items. Paco found this statistic deeply disturbing. As the retail market grows more cutthroat, store owners have come to realize that it's all but impossible to increase the number of customers coming in, and have concentrated instead on getting the customers they do have to buy more. Paco thinks that if you can sell someone a pair of pants you must also be able to sell that person a belt, or a pair of socks, or a pair of underpants, or even do what the Gap does so well: sell a person a complete outfit. To Paco, the figure 1.33 suggested that the store was doing something very wrong, and one day when I visited him in his office he sat me down in front of one of his many VCRs to see how he looked for the 1.33 culprit.
It should be said that sitting next to Paco is a rather strange experience. "My mother says that I'm the best-paid spy in America," he told me. He laughed, but he wasn't entirely joking. As a child, Paco had a nearly debilitating stammer, and, he says, "since I was never that comfortable talking I always relied on my eyes to understand things." That much is obvious from the first moment you meet him: Paco is one of those people who look right at you, soaking up every nuance and detail. It isn't a hostile gaze, because Paco isn't hostile at all. He has a big smile, and he'll call you "chief" and use your first name a lot and generally act as if he knew you well. But that's the awkward thing: he has looked at you so closely that you're sure he does know you well, and you, meanwhile, hardly know him at all. This kind of asymmetry is even more pronounced when you watch his shopping videos with him, because every movement or gesture means something to Paco-he has spent his adult life deconstructing the shopping experience-but nothing to the outsider, or, at least, not at first. Paco had to keep stopping the video to get me to see things through his eyes before I began to understand. In one sequence, for example, a camera mounted high on the wall outside the changing rooms documented a man and a woman shopping for a pair of pants for what appeared to be their daughter, a girl in her mid-teens. The tapes are soundless, but the basic steps of the shopping dance are so familiar to Paco that, once I'd grasped the general idea, he was able to provide a running commentary on what was being said and thought. There is the girl emerging from the changing room wearing her first pair. There she is glancing at her reflection in the mirror, then turning to see herself from the back. There is the mother looking on. There is the father-or, as fathers are known in the trade, the "wallet carrier"-stepping forward and pulling up the jeans. There's the girl trying on another pair. There's the primp again. The twirl. The mother. The wallet carrier. And then again, with another pair. The full sequence lasted twenty minutes, and at the end came the take-home lesson, for which Paco called in one of his colleagues, Tom Moseman, who had supervised the project. "This is a very critical moment," Tom, a young, intense man wearing little round glasses, said, and he pulled up a chair next to mine. "She's saying, 'I don't know whether I should wear a belt.' Now here's the salesclerk. The girl says to him, 'I need a belt,' and he says, 'Take mine.' Now there he is taking her back to the full-length mirror." A moment later, the girl returns, clearly happy with the purchase. She wants the jeans. The wallet carrier turns to her, and then gestures to the salesclerk. The wallet carrier is telling his daughter to give back the belt. The girl gives back the belt. Tom stops the tape. He's leaning forward now, a finger jabbing at the screen. Beside me, Paco is shaking his head. I don't get it-at least, not at first-and so Tom replays that last segment. The wallet carrier tells the girl to give back the belt. She gives back the belt. And then, finally, it dawns on me why this store has an average purchase number of only 1.33. "Don't you see?" Tom said. "She wanted the belt. A great opportunity to make an add-on sale . . . lost!"
3.
Should we be afraid of Paco Underhill? One of the fundamental anxieties of the American consumer, after all, has always been that beneath the pleasure and the frivolity of the shopping experience runs an undercurrent of manipulation, and that anxiety has rarely seemed more justified than today. The practice of prying into the minds and habits of American consumers is now a multibillion-dollar business. Every time a product is pulled across a supermarket checkout scanner, information is recorded, assembled, and sold to a market-research firm for analysis. There are companies that put tiny cameras inside frozen-food cases in supermarket aisles; market-research firms that feed census data and behavioral statistics into algorithms and come out with complicated maps of the American consumer; anthropologists who sift through the garbage of carefully targeted households to analyze their true consumption patterns; and endless rounds of highly organized focus groups and questionnaire takers and phone surveyors. That some people are now tracking our every shopping move with video cameras seems in many respects the last straw: Paco's movies are, after all, creepy. They look like the surveillance videos taken during convenience-store holdups-hazy and soundless and slightly warped by the angle of the lens. When you watch them, you find yourself waiting for something bad to happen, for someone to shoplift or pull a gun on a cashier.
The more time you spend with Paco's videos, though, the less scary they seem. After an hour or so, it's no longer clear whether simply by watching people shop-and analyzing their every move-you can learn how to control them. The shopper that emerges from the videos is not pliable or manipulable. The screen shows people filtering in and out of stores, petting and moving on, abandoning their merchandise because checkout lines are too long, or leaving a store empty-handed because they couldn't fit their stroller into the aisle between two shirt racks. Paco's shoppers are fickle and headstrong, and are quite unwilling to buy anything unless conditions are perfect-unless the belt is presented at exactly the right moment. His theories of the butt-brush and petting and the Decompression Zone and the Invariant Right seek not to make shoppers conform to the desires of sellers but to make sellers conform to the desires of shoppers. What Paco is teaching his clients is a kind of slavish devotion to the shopper's every whim. He is teaching them humility.
Paco has worked with supermarket chains, and when you first see one of his videos of grocery aisles it looks as if he really had-at least in this instance-got one up on the shopper. The clip he showed me was of a father shopping with a small child, and it was an example of what is known in the trade as "advocacy," which basically means what happens when your four-year-old goes over and grabs a bag of cookies that the store has conveniently put on the bottom shelf, and demands that it be purchased. In the clip, the father takes what the child offers him. "Generally, dads are not as good as moms at saying no," Paco said as we watched the little boy approach his dad. "Men tend to be more impulse-driven than women in grocery stores. We know that they tend to shop less often with a list. We know that they tend to shop much less frequently with coupons, and we know, simply by watching them shop, that they can be marching down the aisle and something will catch their eye and they will stop and buy." This kind of weakness on the part of fathers might seem to give the supermarket an advantage in the cookie-selling wars, particularly since more and more men go grocery shopping with their children. But then Paco let drop a hint about a study he'd just done in which he discovered, to his and everyone else's amazement, that shoppers had already figured this out, that they were already one step ahead-that families were avoiding the cookie aisle.
This may seem like a small point. But it begins to explain why, even though retailers seem to know more than ever about how shoppers behave, even though their efforts at intelligence-gathering have rarely seemed more intrusive and more formidable, the retail business remains in crisis. The reason is that shoppers are a moving target. They are becoming more and more complicated, and retailers need to know more and more about them simply to keep pace. This fall, for example, Estée Lauder is testing in a Toronto shopping mall a new concept in cosmetics retailing. Gone is the enclosed rectangular counter, with the sales staff on one side, customers on the other, and the product under glass in the middle. In its place the company has provided an assortment of product-display, consultation, and testing kiosks arranged in a broken circle, with a service desk and a cashier in the middle.
One of the kiosks is a "makeup play area," which allows customers to experiment on their own with a hundred and thirty different shades of lipstick. There are four self-service displays-for perfumes, skin-care products, and makeup-which are easily accessible to customers who have already made up their minds. And, for those who haven't, there is a semiprivate booth for personal consultations with beauty advisers and makeup artists. The redesign was prompted by the realization that the modern working woman no longer had the time or the inclination to ask a salesclerk to assist her in every purchase, that choosing among shades of lipstick did not require the same level of service as, say, getting up to speed on new developments in skin care, that a shopper's needs were now too diverse to be adequately served by just one kind of counter. "I was going from store to store, and the traffic just wasn't there," Robin Burns, the president and C.E.O. of Estée Lauder U.S.A. and Canada, told me. "We had to get rid of the glass barricade." The most interesting thing about the new venture, though, is what it says about the shifting balance of power between buyer and seller. Around the old rectangular counter, the relationship of clerk to customer was formal and subtly paternalistic. If you wanted to look at a lipstick, you had to ask for it. "Twenty years ago, the sales staff would consult with you and tell you what you needed, as opposed to asking and recommending," Burns said. "And in those days people believed what the salesperson told them." Today, the old hierarchy has been inverted. "Women want to draw their own conclusions," Burns said. Even the architecture of the consultation kiosk speaks to the transformation: the beauty adviser now sits beside the customer, not across from her.
4.
This doesn't mean that marketers and retailers have stopped trying to figure out what goes on in the minds of shoppers. One of the hottest areas in market research, for example, is something called typing, which is a sophisticated attempt to predict the kinds of products that people will buy or the kind of promotional pitch they will be susceptible to on the basis of where they live or how they score on short standardized questionnaires. One market-research firm in Virginia, Claritas, has divided the entire country, neighborhood by neighborhood, into sixty-two different categories-Pools & Patios, Shotguns & Pickups, Bohemia Mix, and so on-using census data and results from behavioral surveys. On the basis of my address in Greenwich Village, Claritas classifies me as Urban Gold Coast, which means that I like Kellogg's Special K, spend more than two hundred and fifty dollars on sports coats, watch "Seinfeld," and buy metal polish. Such typing systems-and there are a number of them-can be scarily accurate. I actually do buy Kellogg's Special K, have spent more than two hundred and fifty dollars on a sports coat, and watch "Seinfeld." (I don't buy metal polish.) In fact, when I was typed by a company called Total Research, in Princeton, the results were so dead-on that I got the same kind of creepy feeling that I got when I first watched Paco's videos. On the basis of a seemingly innocuous multiple-choice test, I was scored as an eighty-nine-per-cent Intellect and a seven-per-cent Relief Seeker (which I thought was impressive until John Morton, who developed the system, told me that virtually everyone who reads The New Yorker is an Intellect). When I asked Morton to guess, on the basis of my score, what kind of razor I used, he riffed, brilliantly, and without a moment's hesitation. "If you used an electric razor, it would be a Braun," he began. "But, if not, you're probably shaving with Gillette, if only because there really isn't an Intellect safety-razor positioning out there. Schick and Bic are simply not logical choices for you, although I'm thinking, You're fairly young, and you've got that Relief Seeker side. It's possible you would use Bic because you don't like that all-American, overly confident masculine statement of Gillette. It's a very, very conventional positioning that Gillette uses. But then they've got the technological angle with the Gillette Sensor. . . . I'm thinking Gillette. It's Gillette."
He was right. I shave with Gillette-though I didn't even know that I do. I had to go home and check. But information about my own predilections may be of limited usefulness in predicting how I shop. In the past few years, market researchers have paid growing attention to the role in the shopping experience of a type of consumer known as a Market Maven. "This is a person you would go to for advice on a car or a new fashion," said Linda Price, a marketing professor at the University of South Florida, who first came up with the Market Maven concept, in the late eighties. "This is a person who has information on a lot of different products or prices or places to shop. This is a person who likes to initiate discussions with consumers and respond to requests. Market Mavens like to be helpers in the marketplace. They take you shopping. They go shopping for you, and it turns out they are a lot more prevalent than you would expect." Mavens watch more television than almost anyone else does, and they read more magazines and open their junk mail and look closely at advertisements and have an awful lot of influence on everyone else. According to Price, sixty per cent of Americans claim to know a Maven.
The key question, then, is not what I think but what my Mavens think. The challenge for retailers and marketers, in turn, is not so much to figure out and influence my preferences as to figure out and influence the preferences of my Mavens, and that is a much harder task. "What's really interesting is that the distribution of Mavens doesn't vary by ethnic category, by income, or by professional status," Price said. "A working woman is just as likely to be a Market Maven as a nonworking woman. You might say that Mavens are likely to be older, unemployed people, but that's wrong, too. There is simply not a clear demographic guide to how to find these people." More important, Mavens are better consumers than most of the rest of us. In another of the typing systems, developed by the California-based SRI International, Mavens are considered to be a subcategory of the consumer type known as Fulfilled, and Fulfilleds, one SRI official told me, are "the consumers from Hell-they are very feature oriented." He explained, "They are not pushed by promotions. You can reach them, but it's an intellectual argument." As the complexity of the marketplace grows, in other words, we have responded by appointing the most skeptical and the most savvy in our midst to mediate between us and sellers. The harder stores and manufacturers work to sharpen and refine their marketing strategies, and the harder they try to read the minds of shoppers, the more we hide behind Mavens.
5.
Imagine that you want to open a clothing store, men's and women's, in the upper-middle range-say, khakis at fifty dollars, dress shirts at forty dollars, sports coats and women's suits at two hundred dollars and up. The work of Paco Underhill would suggest that in order to succeed you need to pay complete and concentrated attention to the whims of your customers. What does that mean, in practical terms? Well, let's start with what's called the shopping gender gap. In the retail-store study that Paco showed me, for example, male buyers stayed an average of nine minutes and thirty-nine seconds in the store and female buyers stayed twelve minutes and fifty-seven seconds. This is not atypical. Women always shop longer than men, which is one of the major reasons that in the standard regional mall women account for seventy per cent of the dollar value of all purchases. "Women have more patience than men," Paco says. "Men are more distractible. Their tolerance level for confusion or time spent in a store is much shorter than women's."
If you wanted, then, you could build a store designed for men, to try to raise that thirty-per-cent sales figure to forty or forty-five per cent. You could make the look more masculine-more metal, darker woods. You could turn up the music. You could simplify the store, put less product on the floor. "I'd go narrow and deep," says James Adams, the design director for NBBJ Retail Concepts, a division of one of the country's largest retail-design firms. "You wouldn't have fifty different cuts of pants. You'd have your four basics with lots of color. You know the Garanimals they used to do to help kids pick out clothes, where you match the giraffe top with the giraffe bottom? I'm sure every guy is like 'I wish I could get those, too.' You'd want to stick with the basics. Making sure most of the color story goes together. That is a big deal with guys, because they are always screwing the colors up." When I asked Carrie Gennuso, the Gap's regional vice-president for New York, what she would do in an all-male store, she laughed and said, "I might do fewer displays and more signage. Big signs. Men! Smalls! Here!"
As a rule, though, you wouldn't want to cater to male customers at the expense of female ones. It's no accident that many clothing stores have a single look in both men's and women's sections, and that the quintessential nineties look-light woods, white walls-is more feminine than masculine. Women are still the shoppers in America, and the real money is to be made by making retailing styles more female-friendly, not less. Recently, for example, NBBJ did a project to try to increase sales of the Armstrong flooring chain. Its researchers found that the sales staff was selling the flooring based on its functional virtues-the fact that it didn't scuff, that it was long-lasting, that it didn't stain, that it was easy to clean. It was being sold by men to men, as if it were a car or a stereo. And that was the problem. "It's a wonder product technologically," Adams says. "But the woman is the decision-maker on flooring, and that's not what she's looking for. This product is about fashion, about color and design. You don't want to get too caught up in the man's way of thinking."
To appeal to men, then, retailers do subtler things. At the Banana Republic store on Fifth Avenue in midtown, the men's socks are displayed near the shoes and between men's pants and the cash register (or cash/wrap, as it is known in the trade), so that the man can grab them easily as he rushes to pay. Women's accessories are by the fitting rooms, because women are much more likely to try on pants first, and then choose an item like a belt or a bag. At the men's shirt table, the display shirts have matching ties on them-the tie table is next to it-in a grownup version of the Garanimals system. But Banana Republic would never match scarves with women's blouses or jackets. "You don't have to be that direct with women," Jeanne Jackson, the president of Banana Republic, told me. "In fact, the Banana woman is proud of her sense of style. She puts her own looks together." Jackson said she liked the Fifth Avenue store because it's on two floors, so she can separate men's and women's sections and give men what she calls "clarity of offer," which is the peace of mind that they won't inadvertently end up in, say, women's undergarments. In a one-floor store, most retailers would rather put the menswear up front and the women's wear at the back (that is, if they weren't going to split the sexes left and right), because women don't get spooked navigating through apparel of the opposite sex, whereas men most assuredly do. (Of course, in a store like the Gap at Thirty-ninth and Fifth, where, Carrie Gennuso says, "I don't know if I've ever seen a man," the issue is moot. There, it's safe to put the women's wear out front.)
The next thing retailers want to do is to encourage the shopper to walk deep into the store. The trick there is to put "destination items"-basics, staples, things that people know you have and buy a lot of-at the rear of the store. Gap stores, invariably, will have denim, which is a classic destination item for them, on the back wall. Many clothing stores also situate the cash/wrap and the fitting rooms in the rear of the store, to compel shoppers to walk back into Zone 3 or 4. In the store's prime real estate-which, given Paco's theory of the Decompression Zone and the Invariant Right, is to the right of the front entrance and five to fifteen paces in-you always put your hottest and newest merchandise, because that's where the maximum number of people will see it. Right now, in virtually every Gap in the country, the front of the store is devoted to the Gap fall look-casual combinations in black and gray, plaid shirts and jackets, sweaters, black wool and brushed-twill pants. At the Gap at Fifth Avenue and Seventeenth Street, for example, there is a fall ensemble of plaid jacket, plaid shirt, and black pants in the first prime spot, followed, three paces later, by an ensemble of gray sweater, plaid shirt, T-shirt, and black pants, followed, three paces after that, by an ensemble of plaid jacket, gray sweater, white T-shirt, and black pants. In all, three variations on the same theme, each placed so that the eye bounces naturally from the first to the second to the third, and then, inexorably, to a table deep inside Zone 1 where merchandise is arrayed and folded for petting. Every week or ten days, the combinations will change, the "look" highlighted at the front will be different, and the entryway will be transformed.
Through all of this, the store environment-the lighting, the colors, the fixtures-and the clothes have to work together. The point is not so much beauty as coherence. The clothes have to match the environment. "In the nineteen-seventies, you didn't have to have a complete wardrobe all the time," Gabriella Forte, the president and chief operating officer of Calvin Klein, says. "I think now the store has to have a complete point of view. It has to have all the options offered, so people have choices. It's the famous one-stop shopping. People want to come in, be serviced, and go out. They want to understand the clear statement the designer is making."
At the new Versace store on Fifth Avenue, in the restored neoclassical Vanderbilt mansion, Gianni Versace says that the "statement" he is making with the elaborate mosaic and parquet floors, the marble façade and the Corinthian columns is "quality-my message is always a scream for quality." At her two new stores in London, Donna Karan told me, she never wants "customers to think that they are walking into a clothing store." She said, "I want them to think that they are walking into an environment, that I am transforming them out of their lives and into an experience, that it's not about clothes, it's about who they are as people." The first thing the shopper sees in her stark, all-white DKNY store is a video monitor and café: "It's about energy," Karan said, "and nourishment." In her more sophisticated, "collection" store, where the walls are black and ivory and gold, the first thing that the customer notices is the scent of a candle: "I wanted a nurturing environment where you feel that you will be taken care of." And why, at a Giorgio Armani store, is there often only a single suit in each style on display? Not because the store has only the one suit in stock but because the way the merchandise is displayed has to be consistent with the message of the designers: that Armani suits are exclusive, that the Armani customer isn't going to run into another man wearing his suit every time he goes to an art opening at Gagosian.
The best stores all have an image-or what retailers like to call a "point of view." The flagship store for Ralph Lauren's Polo collection, for example, is in the restored Rhinelander mansion, on Madison Avenue and Seventy-second Street. The Polo Mansion, as it is known, is alive with color and artifacts that suggest a notional prewar English gentility. There are fireplaces and comfortable leather chairs and deep-red Oriental carpets and soft, thick drapes and vintage photographs and paintings of country squires and a color palette of warm crimsons and browns and greens-to the point that after you've picked out a double-breasted blazer or a cashmere sweater set or an antique silver snuffbox you feel as though you ought to venture over to Central Park for a vigorous morning of foxhunting. The Calvin Klein flagship store, twelve blocks down Madison Avenue, on the other hand, is a vast, achingly beautiful minimalist temple, with white walls, muted lighting, soaring ceilings, gray stone flooring, and, so it seems, less merchandise in the entire store than Lauren puts in a single room. The store's architect, John Pawson, says, "People who enter are given a sense of release. They are getting away from the hustle and bustle of the street and New York. They are in a calm space. It's a modern idea of luxury, to give people space."
The first thing you see when you enter the Polo Mansion is a display of two hundred and eight sweaters, in twenty-eight colors, stacked in a haberdasher's wooden fixture, behind an antique glass counter; the first thing you see at the Klein store is a white wall, and then, if you turn to the right, four clear-glass shelves, each adorned with three solitary-looking black handbags. The Polo Mansion is an English club. The Klein store, Pawson says, is the equivalent of an art gallery, a place where "neutral space and light make a work of art look the most potent." When I visited the Polo Mansion, the stereo was playing Bobby Short. At Klein, the stereo was playing what sounded like Brian Eno. At the Polo Mansion, I was taken around by Charles Fagan, a vice-president at Polo Ralph Lauren. He wore pale-yellow socks, black loafers, tight jeans, a pale-purple polo shirt, blue old-school tie, and a brown plaid jacket-which sounds less attractive on paper than it was in reality. He looked, in a very Ralph Lauren way, fabulous. He was funny and engaging and bounded through the store, keeping up a constant patter ("This room is sort of sportswear, Telluride-y, vintage"), all the while laughing and hugging people and having his freshly cut red hair tousled by the sales assistants in each section. At the Calvin Klein store, the idea that the staff-tall, austere, sombre-suited-might laugh and hug and tousle each other's hair is unthinkable. Lean over and whisper, perhaps. At the most, murmur discreetly into tiny black cellular phones. Visiting the Polo Mansion and the Calvin Klein flagship in quick succession is rather like seeing a "Howards End"-"The Seventh Seal" double feature.
Despite their differences, though, these stores are both about the same thing-communicating the point of view that shoppers are now thought to demand. At Polo, the "life style" message is so coherent and all-encompassing that the store never has the 1.33 items-per-purchase problem that Paco saw in the retailer he studied. "We have multiple purchases in excess-it's the cap, it's the tie, it's the sweater, it's the jacket, it's the pants," Fagan told me, plucking each item from its shelf and tossing it onto a tartan-covered bench seat. "People say, 'I have to have the belt.' It's a life-style decision."
As for the Klein store, it's really concerned with setting the tone for the Calvin Klein clothes and products sold outside the store-including the designer's phenomenally successful underwear line, the sales of which have grown nearly fivefold in the past two and a half years, making it one of the country's dominant brands. Calvin Klein underwear is partly a design triumph: lowering the waistband just a tad in order to elongate, and flatter, the torso. But it is also a triumph of image-transforming, as Gabriella Forte says, a "commodity good into something desirable," turning a forgotten necessity into fashion. In the case of women's underwear, Bob Mazzoli, president of Calvin Klein Underwear, told me that the company "obsessed about the box being a perfect square, about the symmetry of it all, how it would feel in a woman's hand." He added, "When you look at the boxes they are little works of art." And the underwear itself is without any of the usual busyness-without, in Mazzoli's words, "the excessive detail" of most women's undergarments. It's a clean look, selling primarily in white, heather gray, and black. It's a look, in other words, not unlike that of the Calvin Klein flagship store, and it exemplifies the brilliance of the merchandising of the Calvin Klein image: preposterous as it may seem, once you've seen the store and worn the underwear, it's difficult not to make a connection between the two.
All this imagemaking seeks to put the shopping experience in a different context, to give it a story line. "I wish that the customers who come to my stores feel the same comfort they would entering a friend's house-that is to say, that they feel at ease, without the impression of having to deal with the 'sanctum sanctorum' of a designer," Giorgio Armani told me. Armani has a house. Donna Karan has a kitchen and a womb. Ralph Lauren has a men's club. Calvin Klein has an art gallery. These are all very different points of view. What they have in common is that they have nothing to do with the actual act of shopping. (No one buys anything at a friend's house or a men's club.) Presumably, by engaging in this kind of misdirection designers aim to put us at ease, to create a kind of oasis. But perhaps they change the subject because they must, because they cannot offer an ultimate account of the shopping experience itself. After all, what do we really know, in the end, about why people buy? We know about the Invariant Right and the Decompression Zone. We know to put destination items at the back and fashion at the front, to treat male shoppers like small children, to respect the female derrière, and to put the socks between the cash/wrap and the men's pants. But this is grammar; it's not prose. It is enough. But it is not much.
6.
One of the best ways to understand the new humility in shopping theory is to go back to the work of William Whyte. Whyte put his cameras in parks and in the plazas in front of office buildings because he believed in the then radical notion that the design of public spaces had been turned inside out-that planners were thinking of their designs first and of people second, when they should have been thinking of people first and of design second. In his 1980 classic, "The Social Life of Small Urban Spaces," for example, Whyte trained his cameras on a dozen or so of the public spaces and small parks around Manhattan, like the plaza in front of the General Motors Building, on Fifth Avenue, and the small park at 77 Water Street, downtown, and Paley Park, on Fifty-third Street, in order to determine why some, like the tiny Water Street park, averaged well over a hundred and fifty people during a typical sunny lunch hour and others, like the much bigger plaza at 280 Park Avenue, were almost empty. He concluded that all the things used by designers to attempt to lure people into their spaces made little or no difference. It wasn't the size of the space, or its beauty, or the presence of waterfalls, or the amount of sun, or whether a park was a narrow strip along the sidewalk or a pleasing open space. What mattered, overwhelmingly, was that there were plenty of places to sit, that the space was in some way connected to the street, and-the mystical circularity-that it was already well frequented. "What attracts people most, it would appear, is other people," Whyte noted:
If I labor the point, it is because many urban spaces still are being designed as though the opposite were true-as though what people liked best were the places they stay away from. People often do talk along such lines, and therefore their responses to questionnaires can be entirely misleading. How many people would say they like to sit in the middle of a crowd? Instead, they speak of "getting away from it all," and use words like "escape," "oasis," "retreat." What people do, however, reveals a different priority.
Whyte's conclusions demystified the question of how to make public space work. Places to sit, streets to enjoy, and people to watch turned out to be the simple and powerful rules for park designers to follow, and these rules demolished the orthodoxies and theoretical principles of conventional urban design. But in a more important sense-and it is here that Whyte's connection with Paco Underhill and retail anthropology and the stores that line Fifth and Madison is most striking-what Whyte did was to remystify the art of urban planning. He said, emphatically, that people could not be manipulated, that they would enter a public space only on their own terms, that the goal of observers like him was to find out what people wanted, not why they wanted it. Whyte, like Paco, was armed with all kinds of facts and observations about what it took to build a successful public space. He had strict views on how wide ledges had to be to lure passersby (at least thirty inches, or two backsides deep), and what the carrying capacity of prime outdoor sitting space is (total number of square feet divided by three). But, fundamentally, he was awed by the infinite complexity and the ultimate mystery of human behavior. He took people too seriously to think that he could control them. Here is Whyte, in "The Social Life of Small Urban Spaces," analyzing hours of videotape and describing what he has observed about the way men stand in public. He's talking about feet. He could just as easily be talking about shopping:
Foot movements . . . seem to be a silent language. Often, in a schmoozing group, no one will be saying anything. Men stand bound in amiable silence, surveying the passing scene. Then, slowly, rhythmically, one of the men rocks up and down; first on the ball of the foot, then back on the heel. He stops. Another man starts the same movement. Sometimes there are reciprocal gestures. One man makes a half turn to the right. Then, after a rhythmic interval, another responds with a half turn to the left. Some kind of communication seems to be taking place here, but I've never broken the code.
Damaged
February 24, 1997
CRIME AND SCIENCE
Why do some people turn into violent criminals?
New evidence suggests that it may all be in the brain
1.
On the morning of November 18, 1996, Joseph Paul Franklin was led into Division 15 of the St. Louis County Courthouse, in Clayton, Missouri. He was wearing a pair of black high-top sneakers, an orange jumpsuit with short sleeves that showed off his prison biceps, and a pair of thick black-rimmed glasses. There were two guards behind him, two guards in front of him, and four more guards stationed around the courtroom, and as he walked into the room-or, rather, shuffled, since his feet were manacled-Franklin turned to one of them and said "Wassup?" in a loud, Southern-accented voice. Then he sat down between his attorneys and stared straight ahead at the judge, completely still except for his left leg, which bounced up and down in an unceasing nervous motion.
Joseph Franklin takes credit for shooting and paralyzing Larry Flynt, the publisher of Hustler, outside a Lawrenceville, Georgia, courthouse in March of 1978, apparently because Flynt had printed photographs of a racially mixed couple. Two years later, he says, he gunned down the civil-rights leader Vernon Jordan outside a Marriott in Fort Wayne, Indiana, tearing a hole in Jordan's back the size of a fist. In the same period in the late seventies, as part of what he later described as a "mission" to rid America of blacks and Jews and of whites who like blacks and Jews, Franklin says that he robbed several banks, bombed a synagogue in Tennessee, killed two black men jogging with white women in Utah, shot a black man and a white woman coming out of a Pizza Hut in a suburb of Chattanooga, Tennessee, and on and on-a violent spree that may have spanned ten states and claimed close to twenty lives, and, following Franklin's arrest, in 1980, earned him six consecutive life sentences.
Two years ago, while Franklin was imprisoned in Marion Federal Penitentiary, in Illinois, he confessed to another crime. He was the one, he said, who had hidden in the bushes outside a synagogue in suburban St. Louis in the fall of 1977 and opened fire on a group of worshippers, killing forty-two-year-old Gerald Gordon. After the confession, the State of Missouri indicted him on one count of capital murder and two counts of assault. He was moved from Marion to the St. Louis County jail, and from there, on a sunny November morning last year, he was brought before Judge Robert Campbell, of the St. Louis County Circuit Court, so that it could be determined whether he was fit to stand trial-whether, in other words, embarking on a campaign to rid America of Jews and blacks was an act of evil or an act of illness.
The prosecution went first. On a television set at one side of the courtroom, two videotapes were shown-one of an interview with Franklin by a local news crew and the other of Franklin's formal confession to the police. In both, he seems lucid and calm, patiently retracing how he planned and executed his attack on the synagogue. He explains that he bought the gun in a suburb of Dallas, answering a classified ad, so the purchase couldn't be traced. He drove to the St. Louis area and registered at a Holiday Inn. He looked through the Yellow Pages to find the names of synagogues. He filed the serial number off his rifle and bought a guitar case to carry the rifle in. He bought a bicycle. He scouted out a spot near his chosen synagogue from which he could shoot without being seen. He parked his car in a nearby parking lot and rode his bicycle to the synagogue. He lay in wait in the bushes for several hours, until congregants started to emerge. He fired five shots. He rode the bicycle back to the parking lot, climbed into his car, pulled out of the lot, checked his police scanner to see if he was being chased, then drove south, down I-55, back home toward Memphis.
In the interview with the news crew, Franklin answered every question, soberly and directly. He talked about his tattoos ("This one is the Grim Reaper. I got it in Dallas") and his heroes ("One person I like is Howard Stern. I like his honesty"), and he respectfully disagreed with the media's description of racially motivated crimes as "hate crimes," since, he said, "every murder is committed out of hate." In his confession to the police, after he detailed every step of the synagogue attack, Franklin was asked if there was anything he'd like to say. He stared thoughtfully over the top of his glasses. There was a long silence. "I can't think of anything," he answered. Then he was asked if he felt any remorse. There was another silence. "I can't say that I do," he said. He paused again, then added, "The only thing I'm sorry about is that it's not legal."
"What's not legal?"
Franklin answered as if he'd just been asked the time of day: "Killing Jews."
After a break for lunch, the defense called Dorothy Otnow Lewis, a psychiatrist at New York's Bellevue Hospital and a professor at New York University School of Medicine. Over the past twenty years, Lewis has examined, by her own rough estimate, somewhere between a hundred and fifty and two hundred murderers. She was the defense's only expert witness in the trial of Arthur Shawcross, the Rochester serial killer who strangled eleven prostitutes in the late eighties. She examined Joel Rifkin, the Long Island serial killer, and Mark David Chapman, who shot John Lennon-both for the defense. Once, in a Florida prison, she sat for hours talking with Ted Bundy. It was the day before his execution, and when they had finished Bundy bent down and kissed her cheek. "Bundy thought I was the only person who didn't want something from him," Lewis says. Frequently, Lewis works with Jonathan Pincus, a neurologist at Georgetown University. Lewis does the psychiatric examination; Pincus does the neurological examination. But Franklin put his foot down. He could tolerate being examined by a Jewish woman, evidently, but not by a Jewish man. Lewis testified alone.
Lewis is a petite woman in her late fifties, with short dark hair and large, liquid brown eyes. She was wearing a green blazer and a black skirt with a gold necklace, and she was so dwarfed by the witness stand that from the back of the courtroom only her head was visible. Under direct examination she said that she had spoken with Franklin twice-once for six hours and once for less than half an hour-and had concluded that he was a paranoid schizophrenic: a psychotic whose thinking was delusional and confused, a man wholly unfit to stand trial at this time. She talked of brutal physical abuse he had suffered as a child. She mentioned scars on his scalp from blows Franklin had told her were inflicted by his mother. She talked about his obsessive desire to be castrated, his grandiosity, his belief that he may have been Jewish in an earlier life, his other bizarre statements and beliefs. At times, Lewis seemed nervous, her voice barely audible, but perhaps that was because Franklin was staring at her unblinkingly, his leg bouncing faster and faster under the table. After an hour, Lewis stepped down. She paused in front of Franklin and, ever the psychiatrist, suggested that when everything was over they should talk. Then she walked slowly through the courtroom, past the defense table and the guards, and out the door.
Later that day, on the plane home to New York City, Lewis worried aloud that she hadn't got her point across. Franklin, at least as he sat there in the courtroom, didn't seem insane. The following day, Franklin took the stand himself for two hours, during which he did his own psychiatric diagnosis, confessing to a few "minor neuroses," but not to being "stark raving mad," as he put it. Of the insanity defense, he told the court, "I think it is hogwash, to tell you the truth. I knew exactly what I was doing." During his testimony, Franklin called Lewis "a well-intentioned lady" who "seems to embellish her statements somewhat." Lewis seemed to sense that that was the impression she'd left: that she was overreaching, that she was some kind of caricature-liberal Jewish New York psychiatrist comes to Middle America to tell the locals to feel sorry for a murderer. Sure enough, a week later the Judge rejected Lewis's arguments and held Franklin competent to stand trial. But, flying back to New York, Lewis insisted that she wasn't making an ideological point about Franklin; rather, she was saying that she didn't feel that Franklin's brain worked the way brains are supposed to work-that he had identifiable biological and psychiatric problems that diminished his responsibility for his actions. "I just don't believe people are born evil," she said. "To my mind, that is mindless. Forensic psychiatrists tend to buy into the notion of evil. I felt that that's no explanation. The deed itself is bizarre, grotesque. But it's not evil. To my mind, evil bespeaks conscious control over something. Serial murderers are not in that category. They are driven by forces beyond their control."
The plane was in the air now. By some happy set of circumstances, Lewis had been bumped up to first class. She was sipping champagne. Her shoes were off. "You know, when I was leaving our last interview, he sniffed me right here," she said, and she touched the back of her neck and flared her nostrils in mimicry of Franklin's gesture. "He'd said to his attorney, 'You know, if you weren't here, I'd make a play for her.' " She had talked for six hours to this guy who hated Jews so much that he hid in the bushes and shot at them with a rifle, and he had come on to her, just like that. She shivered at the memory: "He said he wanted some pussy."
2.
When Dorothy Lewis graduated from Yale School of Medicine, in 1963, neurology, the study of the brain and the rest of the nervous system, and psychiatry, the study of behavior and personality, were entirely separate fields. This was still the Freudian era. Little attempt was made to search for organic causes of criminality. When, after medical school, she began working with juvenile delinquents in New Haven, the theory was that these boys were robust, healthy. According to the prevailing wisdom, a delinquent was simply an ordinary kid who had been led astray by a troubled home life-by parents who were too irresponsible or too addled by drugs and alcohol to provide proper discipline. Lewis came from the archetypal do-gooding background-reared on Central Park West; schooled at Ethical Culture; a socialist mother who as a child had once introduced Eugene V. Debs at a political rally; father in the garment business; heated dinner-table conversations about the Rosenbergs-and she accepted this dogma. Criminals were just like us, only they had been given bad ideas about how to behave. The trouble was that when she began working with delinquents they didn't seem like that at all. They didn't lack for discipline. If anything, she felt, they were being disciplined too much. And these teenagers weren't robust and rowdy; on the contrary, they seemed to be damaged and impaired. "I was studying for my boards in psychiatry, and in order to do a good job you wanted to do a careful medical history and a careful mental-status exam," she says. "I discovered that many of these kids had had serious accidents, injuries, or illnesses that seemed to have affected the central nervous system and that hadn't been identified previously."
In 1976, she was given a grant by the State of Connecticut to study a group of nearly a hundred juvenile delinquents. She immediately went to see Pincus, then a young professor of neurology at Yale. They had worked together once before. "Dorothy came along and said she wanted to do this project with me," Pincus says. "She wanted to look at violence. She had this hunch that there was something physically wrong with these kids. I said, 'That's ridiculous. Everyone knows violence has nothing to do with neurology.' " At that point, Pincus recalls, he went to his bookshelf and began reading out loud from what was then the definitive work in the field: "Criminality and Psychiatric Disorders," by Samuel Guze, the chairman of the psychiatry department of Washington University, in St. Louis. "Sociopathy, alcoholism, and drug dependence are the psychiatric disorders characteristically associated with serious crime," he read. "Schizophrenia, primary affective disorders, anxiety neurosis, obsessional neurosis, phobic neurosis, and"-and there he paused- "brain syndromes are not." But Lewis would have none of it. "She said, 'We should do it anyway.' I said, 'I don't have the time.' She said, 'Jonathan, I can pay you.' So I would go up on Sunday, and I would examine three or four youths, just give them a standard neurological examination." But, after seeing the kids for himself, Pincus, too, became convinced that the prevailing wisdom about juvenile delinquents--and, by extension, about adult criminals--was wrong, and that Lewis was right. "Almost all the violent ones were damaged," Pincus recalls, shaking his head.
Over the past twenty years, Lewis and Pincus have testified for the defense in more than a dozen criminal cases, most of them death-penalty appeals. Together, they have published a series of groundbreaking studies on murderers and delinquents, painstakingly outlining the medical and psychiatric histories of the very violent; one of their studies has been cited twice in United States Supreme Court opinions. Of the two, Pincus is more conservative. He doesn't have doubts about evil the way Lewis does, and sharply disagrees with her on some of the implications of their work. On the core conclusions, however, they are in agreement. They believe that the most vicious criminals are, overwhelmingly, people with some combination of abusive childhoods, brain injuries, and psychotic symptoms (in particular, paranoia), and that while each of these problems individually has no connection to criminality (most people who have been abused or have brain injuries or psychotic symptoms never end up harming anyone else), somehow these factors together create such terrifying synergy as to impede these individuals' ability to play by the rules of society.
Trying to determine the causes of human behavior is, of course, a notoriously tricky process. Lewis and Pincus haven't done the kind of huge, population-wide studies that could definitively answer just how predictive of criminality these factors are. Their findings are, however, sufficiently tantalizing that their ideas have steadily gained ground in recent years. Other researchers have now done some larger studies supporting their ideas. Meanwhile, a wave of new findings in the fields of experimental psychiatry and neurology has begun to explain why it is that brain dysfunction and child abuse can have such dire effects. The virtue of this theory is that it sidesteps all the topics that so cripple contemporary discussions of violence-genetics, biological determinism, and, of course, race. In a sense, it's a return to the old liberal idea that environment counts, and that it is possible to do something significant about crime by changing the material conditions of people's lives. Only, this time the maddening imprecision of the old idea (what, exactly, was it about bad housing, say, that supposedly led to violent crime?) has been left behind. Lewis and Pincus and other neurologists and psychiatrists working in the field of criminal behavior think they are beginning to understand what it is that helps to turn some people into violent criminals-right down to which functions of the brain are damaged by abuse and injury. That's what Lewis means when she says she doesn't think that people are intrinsically evil. She thinks that some criminals simply suffer from a dysfunction of the brain, the way cardiac patients suffer from a dysfunction of the heart, and this is the central and in some ways disquieting thing about her. When she talks about criminals as victims, she doesn't use the word in the standard liberal metaphorical sense. She means it literally.
Lewis works out of a tiny set of windowless offices on the twenty-first floor of the new wing of Bellevue Hospital, in Manhattan's East Twenties. The offices are decorated in institutional colors-gray carpeting and bright-orange trim-and since they're next to the children's psychiatric ward you can sometimes hear children crying out. Lewis's desk is stacked high with boxes of medical and court records from cases she has worked on, and also with dozens of videotapes of interviews with murderers which she has conducted over the years. She talks about some of her old cases-especially some of her death-row patients-as if they had just happened, going over and over details, sometimes worrying about whether she made the absolutely correct diagnosis. The fact that everyone else has long since given up on these people seems to be just what continues to attract her. Years ago, when she was in college, Lewis found herself sitting next to the Harvard theologian Paul Tillich on the train from New York to Boston. "When you read about witches being burned at the stake," Tillich asked her, in the midst of a long and wide-ranging conversation, "do you identify with the witch or with the people looking on?" Tillich said he himself identified with the crowd. Not Lewis. She identified with the witch.
In her offices, Lewis has tapes of her interviews with Shawcross, the serial killer, and also tapes of Shawcross being interviewed by Park Elliott Dietz, the psychiatrist who testified for the prosecution in that case. Dietz is calm, in control, and has a slightly bored air, as if he had heard everything before. By contrast, Lewis, in her interviews, has a kind of innocence about her. She seems completely caught up in what is happening, and at one point, when Shawcross makes some particularly outrageous comment on what he did to one of the prostitutes he murdered, she looks back at the camera wide-eyed, as if to say "Wow!" When Dietz was on the stand, his notes were beside him in one of those rolling evidence carts, where everything is labelled and items are distinguished by color-coded dividers, so that he had the entire case at his fingertips. When Lewis testified, she kept a big stack of untidy notes on her lap and fussed through them after she was asked a question. She is like that in everyday life as well-a little distracted and spacey, wrapped up in the task at hand. It makes her so approachable and so unthreatening that it's no wonder she gets hardened criminals to tell her their secrets. It's another way of identifying with the witch. Once, while talking with Bundy, Lewis looked up after several hours and found that she had been so engrossed in their conversation that she hadn't noticed that everyone outside the soundproof glass of the interview booth-the guard, the prison officials at their desks-had left for lunch. She and Bundy were utterly alone. Terrified, Lewis stayed glued to her seat, her eyes never leaving his. "I didn't bat an eyelash," she recalls. Another time, after Lewis had interviewed a murderer in a Tennessee prison, she returned to her hotel room to find out that there had been a riot in the prison while she was there.
3.
The human brain comprises, in the simplest terms, four interrelated regions, stacked up in ascending order of complexity. At the bottom is the brain stem, which governs the most basic and primitive functions-breathing, blood pressure, and body temperature. Above that is the diencephalon, the seat of sleep and appetite. Then comes the limbic region, the seat of sexual behavior and instinctual emotions. And on top, covering the entire outside of the brain in a thick carpet of gray matter, is the cortex, the seat of both concrete and abstract thought. It is the function of the cortex-and, in particular, those parts of the cortex beneath the forehead, known as the frontal lobes-to modify the impulses that surge up from within the brain, to provide judgment, to organize behavior and decision-making, to learn and adhere to rules of everyday life. It is the dominance of the cortex and the frontal lobes, in other words, that is responsible for making us human; and the central argument of the school to which Lewis and Pincus belong is that what largely distinguishes many violent criminals from the rest of us is that something has happened inside their brains to throw the functioning of the cortex and the frontal lobes out of whack. "We are a highly socialized animal. We can sit in theatres with strangers and not fight with each other," Stuart Yudofsky, the chairman of psychiatry at Baylor College of Medicine, in Houston, told me. "Many other mammals could never crowd that closely together. Our cortex helps us figure out when we are and are not in danger. Our memory tells us what we should be frightened of and angry with and what we shouldn't. But if there are problems there-if it's impaired-one can understand how that would lead to confusion, to problems with disinhibition, to violence." One of the most important things that Lewis and Pincus have to do, then, when they evaluate a murderer is check for signs of frontal-lobe impairment. This, the neurological exam, is Pincus's task.
Pincus begins by taking a medical history: he asks about car accidents and falls from trees and sports injuries and physical abuse and problems at birth and any blows to the head of a kind that might have caused damage to the frontal lobes. He asks about headaches, tests for reflexes and sensorimotor functions, and compares people's right and left sides and observes gait. "I measure the head circumference-if it's more than two standard deviations below the normal brain circumference, there may be some degree of mental retardation, and, if it's more than two standard deviations above, there may be hydrocephalus," Pincus told me. "I also check gross motor coördination. I ask people to spread their fingers and hold their hands apart and look for choreiform movements-discontinuous little jerky movements of the fingers and arms." We were in Pincus's cluttered office at Georgetown University Medical Center, in Washington, D.C., and Pincus, properly professorial in a gray Glen-plaid suit, held out his hand to demonstrate. "Then I ask them to skip, to hop," he went on, and he hopped up and down in a small space on the floor between papers and books.
Pincus stands just over six feet, has the long-limbed grace of an athlete, and plays the part of neurologist to perfection: calm, in command, with a distinguished sprinkle of white hair. At the same time, he has a look of mischief in his eyes, a streak of irreverence that allows him to jump up and down in his office before a total stranger. It's an odd combination, like Walter Matthau playing Sigmund Freud.
"Then I check for mixed dominance, to see if the person is, say, right-eyed, left-footed," he said. "If he is, it might mean that his central nervous system hasn't differentiated the way it should." He was sitting back down now. "No one of these by itself means he is damaged. But they can tell us something in aggregate."
At this point, Pincus held up a finger forty-five degrees to my left and moved it slowly to the right. "Now we're checking for frontal functions," he said. "A person should be able to look at the examiner's finger and follow it smoothly with his eyes. If he can only follow it jerkily, the frontal eye fields are not working properly. Then there's upward gaze." He asked me to direct my eyes to the ceiling. "The eye should go up five millimetres and a person should also be able to direct his gaze laterally and maintain it for twenty seconds. If he can't, that's motor impersistence." Ideally, Pincus will attempt to amplify his results with neuropsychological testing, an EEG (an electroencephalogram, which measures electrical patterns in the brain), and an MRI scan (that's magnetic resonance imaging), to see if he can spot scarring or lesions in any of the frontal regions which might contribute to impairment.
Pincus is also interested in measuring judgment. But since there is no objective standard for judgment, he tries to pick up evidence of an inability to cope with complexity, a lack of connection between experience and decision-making which is characteristic of cortical dysfunction. Now he walked behind me, reached over the top of my head, and tapped the bridge of my nose in a steady rhythm. I blinked once, then stopped. That, he told me, was normal.
"When you tap somebody on the bridge of the nose, it's reasonable for a person to blink a couple of times, because there is a threat from the outside," Pincus said. "When it's clear there is no threat, the subject should be able to accommodate that. But, if the subject blinks more than three times, that's 'insufficiency of suppression,' which may reflect frontal-lobe dysfunction. The inability to accommodate means you can't adapt to a new situation. There's a certain rigidity there."
Arthur Shawcross, who had a cyst pressing on one temporal lobe and scarring in both frontal lobes (probably from, among other things, being hit on the head with a sledgehammer and with a discus, and falling on his head from the top of a forty-foot ladder), used to walk in absolutely straight lines, splashing through puddles instead of walking around them, and he would tear his pants on a barbed-wire fence instead of using a gate a few feet away. That's the kind of behavior Pincus tries to correlate with abnormalities on the neurological examination. "In the Wisconsin Card Sorting Test, the psychologist shows the subject four playing cards-three red ones, one black one-and asks which doesn't fit," Pincus said. "Then he shows the subject, say, the four of diamonds, the four of clubs, the four of hearts, and the three of diamonds. Somebody with frontal-lobe damage who correctly picked out the black one the first time-say, the four of clubs-is going to pick the four of clubs the second time. But the rules have changed. It's now a three we're after. We're going by numbers now, not color. It's that kind of change that people with frontal-lobe damage can't make. They can't change the rules. They get stuck in a pattern. They keep using rules that are demonstrably wrong. Then there's the word-fluency test. I ask them to name in one minute as many different words as they can think of which begin with the letter 'f.' Normal is fourteen, plus or minus five. Anyone who names fewer than nine is abnormal."
This is not an intelligence test. People with frontal-lobe damage might do just as well as anyone else if they were asked, say, to list the products they might buy in a supermarket. "Under those rules, most people can think of at least sixteen products in a minute and rattle them off," Pincus said. But that's a structured test, involving familiar objects, and it's a test with rules. The thing that people with frontal-lobe damage can't do is cope with situations where there are no rules, where they have to improvise, where they need to make unfamiliar associations. "Very often, they get stuck on one word-they'll say 'four,' 'fourteen,' 'forty-four,' " Pincus said. "They'll use the same word again and again-'farm' and then 'farming.' Or, as one fellow in a prison once said to me, 'fuck,' 'fucker,' 'fucking.' They don't have the ability to come up with something else."
What's at stake, fundamentally, with frontal-lobe damage is the question of inhibition. A normal person is able to ignore the tapping after one or two taps, the same way he can ignore being jostled in a crowded bar. A normal person can screen out and dismiss irrelevant aspects of the environment. But if you can't ignore the tapping, if you can't screen out every environmental annoyance and stimulus, then you probably can't ignore being jostled in a bar, either. It's living life with a hair trigger.
A recent study of two hundred and seventy-nine veterans who suffered penetrating head injuries in Vietnam showed that those with frontal-lobe damage were anywhere from two to six times as violent and aggressive as veterans who had not suffered such injuries. This kind of aggression is what is known as neurological, or organic, rage. Unlike normal anger, it's not calibrated by the magnitude of the original insult. It's explosive and uncontrollable, the anger of someone who no longer has the mental equipment to moderate primal feelings of fear and aggression.
"There is a reactivity to it, in which a modest amount of stimulation results in a severe overreaction," Stuart Yudofsky told me. "Notice that reactivity implies that, for the most part, this behavior is not premeditated. The person is rarely violent and frightening all the time. There are often brief episodes of violence punctuating stretches when the person does not behave violently at all. There is also not any gain associated with organic violence. The person isn't using the violence to manipulate someone else or get something for himself. The act of violence does just the opposite. It is usually something that causes loss for the individual. He feels that it is out of his control and unlike himself. He doesn't blame other people for it. He often says, 'I hate myself for acting this way.' The first person with organic aggression I ever treated was a man who had been inflating a truck tire when the tire literally exploded and the rim was driven into his prefrontal cortex. He became extraordinarily aggressive. It was totally uncharacteristic: he had been a religious person with strong values. But now he would not only be physically violent-he would curse. When he came to our unit, a nurse offered him some orange juice. He was calm at that moment. But then he realized that the orange juice was warm, and in one quick motion he threw it back at her, knocking her glasses off and injuring her cornea. When we asked him why, he said, 'The orange juice was warm.' But he also said, 'I don't know what got into me.' It wasn't premeditated. It was something that accelerated quickly. He went from zero to a hundred in a millisecond." At that point, I asked Yudofsky an obvious question. Suppose you had a person from a difficult and disadvantaged background, who had spent much of his life on the football field, getting his head pounded by the helmets of opposing players. Suppose he was involved in a tempestuous on-again, off-again relationship with his ex-wife. Could a vicious attack on her and another man fall into the category of neurological rage? "You're not the first person to ask that question," Yudofsky replied dryly, declining to comment further.
Pincus has found that when he examines murderers neurological problems of this kind come up with a frequency far above what would be expected in the general population. For example, Lewis and Pincus published a study of fifteen death-row inmates randomly referred to them for examination; they were able to verify forty-eight separate incidents of significant head injury. Here are the injuries suffered by just the first three murderers examined:
I.
three years: beaten almost to death by father
(multiple facial scars)
early childhood: thrown into sink onto head
(palpable scar)
late adolescence: one episode of loss of consciousness while boxing
II.
childhood: beaten in head with two-by-fours by parents
childhood: fell into pit, unconscious for several hours
seventeen years: car accident with injury to right eye
eighteen years: fell from roof apparently because of a blackout
III.
six years: glass bottle deliberately dropped onto head from tree (palpable scar on top of cranium)
eight years: hit by car
nine years: fell from platform, received head injury
fourteen years: jumped from moving car, hit head.
4.
Dorothy Lewis's task is harder than Jonathan Pincus's. He administers relatively straightforward tests of neurological function. But she is interested in the psychiatric picture, which means getting a murderer to talk about his family, his feelings and behavior, and, perhaps most important, his childhood. It is like a normal therapy session, except that Lewis doesn't have weeks in which to establish intimacy. She may have only a session or two. On one occasion, when she was visiting a notorious serial killer at San Quentin, she got lucky. "By chance, one of the lawyers had sent me some clippings from the newspaper, where I read that when he was caught he had been carrying some Wagner records," she told me. "For some reason, that stuck in my mind. The first time I went to see him, I started to approach him and he pointed at me and said, 'What's happening on June 18th?' And I said, 'That's the first night PBS is broadcasting "Der Ring des Nibelungen." ' You know, we'd studied Wagner at Ethical Culture. Granted, it was a lucky guess. But I showed him some respect, and you can imagine the rapport that engendered." Lewis says that even after talking for hours with someone guilty of horrendous crimes she never gets nightmares. She seems to be able to separate her everyday life from the task at hand-to draw a curtain between her home and her work. Once, I visited Lewis at her home: she and her husband, Mel, who is a professor of psychiatry at Yale, live in New Haven. The two dote on each other ("When I met Mel, I knew within a week that this was the man I wanted to marry," she says, flushing, "and I've never forgiven him, because it took him two weeks to ask me"), and as soon as I walked in they insisted on giving me a detailed tour of their house, picking up each memento, pointing out their children's works of art, and retelling the stories behind thirty years of anniversaries and birthdays: sometimes they told their stories in alternating sentences, and sometimes they told a story twice, first from Dorothy's perspective and then from Mel's. All in all, it was a full hour of domestic vaudeville. Then Dorothy sat on her couch, with her cat, Ptolemy, on her lap, and began to talk about serial killers, making a seamless transition from the sentimental to the unspeakable.
At the heart of Lewis's work with murderers is the search for evidence of childhood abuse. She looks for scars. She combs through old medical records for reports of suspicious injuries. She tries to talk to as many family members and friends as possible. She does all this because, of course, child abuse has devastating psychological consequences for children and the adults they become. But there is the more important reason-the one at the heart of the new theory of violence-which is that finding evidence of prolonged child abuse is a key to understanding criminal behavior because abuse also appears to change the anatomy of the brain.
When a child is born, the parts of his brain that govern basic physiological processes-that keep him breathing and keep his heart beating-are fully intact. But a newborn can't walk, can't crawl, can't speak, can't reason or do much of anything besides sleep and eat, because the higher regions of his brain-the cortex, in particular-aren't developed yet. In the course of childhood, neurons in the cortex begin to organize themselves-to differentiate and make connections-and that maturation process is in large part responsive to what happens in the child's environment. Bruce Perry, a psychiatrist at Baylor College of Medicine, has done brain scans of children who have been severely neglected, and has found that their cortical and sub-cortical areas never developed properly, and that, as a result, those regions were roughly twenty or thirty per cent smaller than normal. This kind of underdevelopment doesn't affect just intelligence; it affects emotional health. "There are parts of the brain that are involved in attachment behavior-the connectedness of one individual to another-and in order for that to be expressed we have to have a certain nature of experience and have that experience at the right time," Perry told me. "If early in life you are not touched and held and given all the somatosensory stimuli that are associated with what we call love, that part of the brain is not organized in the same way."
According to Perry, the section of the brain involved in attachment-which he places just below the cortex, in the limbic region-would look different in someone abused or neglected. The wiring wouldn't be as dense or as complex. "Such a person is literally lacking some brain organization that would allow him to actually make strong connections to other human beings. Remember the orphans in Romania? They're a classic example of children who, by virtue of not being touched and held and having their eyes gazed into, didn't get the somatosensory bath. It doesn't matter how much you love them after age two-they've missed that critical window."
In a well-known paper in the field of child abuse, Mary Main, a psychologist at Berkeley, and Carol George, now at Mills College, studied a group of twenty disadvantaged toddlers, half of whom had been subjected to serious physical abuse and half of whom had not. Main and George were interested in how the toddlers responded to a classmate in distress. What they found was that almost all the nonabused children responded to a crying or otherwise distressed peer with concern or sadness or, alternatively, showed interest and made some attempt to provide comfort. But not one of the abused toddlers showed any concern. At the most, they showed interest. The majority of them either grew distressed and fearful themselves or lashed out with threats, anger, and physical assaults. Here is the study's description of Martin, an abused boy of two and a half, who-emotionally retarded in the way that Perry describes-seemed incapable of normal interaction with another human being:
Martin . . . tried to take the hand of the crying other child, and when she resisted, he slapped her on the arm with his open hand. He then turned away from her to look at the ground and began vocalizing very strongly. "Cut it out! cut it out!," each time saying it a little faster and louder. He patted her, but when she became disturbed by his patting, he retreated "hissing at her and baring his teeth." He then began patting her on the back again, his patting became beating, and he continued beating her despite her screams.
Abuse also disrupts the brain's stress-response system, with profound results. When something traumatic happens-a car accident, a fight, a piece of shocking news-the brain responds by releasing several waves of hormones, the last of which is cortisol. The problem is that cortisol can be toxic. If someone is exposed to too much stress over too long a time, one theory is that all that cortisol begins to eat away at the organ of the brain known as the hippocampus, which serves as the brain's archivist: the hippocampus organizes and shapes memories and puts them in context, placing them in space and time and tying together visual memory with sound and smell. J. Douglas Bremner, a psychiatrist at Yale, has measured the damage that cortisol apparently does to the hippocampus by taking M.R.I. scans of the brains of adults who suffered severe sexual or physical abuse as children and comparing them with the brains of healthy adults. An M.R.I. scan is a picture of a cross-section of the brain-as if someone's head had been cut into thin slices like a tomato, and then each slice had been photographed-and in the horizontal section taken by Bremner the normal hippocampus is visible as two identical golf-ball-size organs, one on the left and one on the right, and each roughly even with the ear. In child-abuse survivors, Bremner found, the golf ball on the left is on average twelve per cent smaller than that of a healthy adult, and the theory is that it was shrunk by cortisol. Lewis says that she has examined murderers with dozens of scars on their backs, and that they have no idea how the scars got there. They can't remember their abuse, and if you look at Bremner's scans that memory loss begins to make sense: the archivist in their brain has been crippled.
Abuse also seems to affect the relationship between the left hemisphere of the brain, which plays a large role in logic and language, and the right hemisphere, which is thought to play a large role in creativity and depression. Martin Teicher, a professor of psychiatry at Harvard and McLean Hospital, recently gave EEGs to a hundred and fifteen children who had been admitted to a psychiatric facility, some of whom had a documented history of abuse. Not only did the rate of abnormal EEGs among the abused turn out to be twice that of the non-abused but all those abnormal brain scans turned out to be a result of problems on the left side of the brain. Something in the brain's stress response, Teicher theorized, was interfering with the balanced development of the brain's hemispheres.
Then Teicher did M.R.I.s of the brains of a subset of the abused children, looking at what is known as the corpus callosum. This is the fibre tract-the information superhighway-that connects the right and the left hemispheres. Sure enough, he found that parts of the corpus callosum of the abused kids were smaller than they were in the nonabused children. Teicher speculated that these abnormalities were a result of something wrong with the sheathing-the fatty substance, known as myelin, that coats the nerve cells of the corpus callosum. In a healthy person, the myelin helps the neuronal signals move quickly and efficiently. In the abused kids, the myelin seemed to have been eaten away, perhaps by the same excess cortisol that is thought to attack the hippocampus.
Taken together, these changes in brain hardware are more than simple handicaps. They are, in both subtle and fundamental ways, corrosive of self. Richard McNally, a professor of psychology at Harvard, has done memory studies with victims of serious trauma, and he has discovered that people with post-traumatic-stress disorder, or P.T.S.D., show marked impairment in recalling specific autobiographical memories. A healthy trauma survivor, asked to name an instance when he exhibited kindness, says, "Last Friday, I helped a neighbor plow out his driveway." But a trauma survivor with P.T.S.D. can only say something like "I was kind to people when I was in high school." This is what seems to happen when your hippocampus shrinks: you can't find your memories. "The ability to solve problems in the here and now depends on one's ability to access specific autobiographical memories in which one has encountered similar problems in the past," McNally says. "It depends on knowing what worked and what didn't." With that ability impaired, abuse survivors cannot find coherence in their lives. Their sense of identity breaks down.
It is a very short walk from this kind of psychological picture to a diagnosis often associated with child abuse; namely, dissociative identity disorder, or D.I.D. Victims of child abuse are thought sometimes to dissociate, as a way of coping with their pain, of distancing themselves from their environment, of getting away from the danger they faced. It's the kind of disconnection that would make sense if a victim's memories were floating around without context and identification, his left and right hemispheres separated and unequal, and his sense of self fragmented and elusive. It's also a short walk from here to understanding how someone with such neurological problems could become dangerous. Teicher argues that in some of his EEG and M.R.I. analyses of the imbalance between the left and the right hemispheres he is describing the neurological basis for the polarization so often observed in psychiatrically disturbed patients-the mood swings, the sharply contrasting temperaments. Instead of having two integrated hemispheres, these patients have brains that are, in some sense, divided down the middle. "What you get is a kind of erratic-ness," says Frank Putnam, who heads the Unit on Developmental Traumatology at the National Institute of Mental Health, in Maryland. "These kinds of people can be very different in one situation compared with another. There is the sense that they don't have a larger moral compass."
Several years ago, Lewis and Pincus worked together on an appeal for David Wilson, a young black man on death row in Louisiana. Wilson had been found guilty of murdering a motorist, Stephen Stinson, who had stopped to help when the car Wilson was in ran out of gas on I-10 outside New Orleans; and the case looked, from all accounts, almost impossible to appeal. Wilson had Stinson's blood on his clothes, in his pocket he had a shotgun shell of the same type and gauge as the one found in the gun at the murder scene, and the prosecution had an eyewitness to the whole shooting. At the trial, Wilson denied that the bloody clothes were his, denied that he had shot Stinson, denied that a tape-recorded statement the prosecution had played for the jury was of his voice, and claimed he had slept through the entire incident. It took the jury thirty-five minutes to convict him of first-degree murder and sixty-five minutes more, in the sentencing phase, to send him to the electric chair.
But when Lewis and Pincus examined him they became convinced that his story was actually much more complicated. In talking to Wilson's immediate family and other relatives, they gathered evidence that he had never been quite normal-that his personality had always seemed fractured and polarized. His mother recalled episodes from a very early age during which he would become "glassy-eyed" and seem to be someone else entirely. "David had, like, two personalities," his mother said. At times, he would wander off and be found, later, miles away, she recalled. He would have violent episodes during which he would attack his siblings' property, and subsequently deny that he had done anything untoward at all. Friends would say that they had seen someone who looked just like Wilson at a bar, but weren't sure that it had been Wilson, because he'd been acting altogether differently. On other occasions, Wilson would find things in his pockets and have no idea how they got there. He sometimes said he was born in 1955 and at other times said 1948.
What he had, in other words, were the classic symptoms of dissociation, and when Lewis and Pincus dug deeper into his history they began to understand why. Wilson's medical records detailed a seemingly endless list of hospitalizations for accidents, falls, periods of unconsciousness, and "sunstroke," dating from the time Wilson was two through his teens-the paper trail of a childhood marked by extraordinary trauma and violence. In his report to Wilson's attorneys, based on his examination of Wilson, Pincus wrote that there had been "many guns" in the home and that Wilson was often shot at as a child. He was also beaten "with a bull whip, 2x4's, a hose, pipes, a tree approximately 4 inches in diameter, wire, a piece of steel and belt buckles . . . on his back, legs, abdomen and face," until "he couldn't walk." Sometimes, when the beatings became especially intense, Wilson would have to "escape from the house and live in the fields for as long as two weeks." A kindly relative would leave food surreptitiously for him. The report goes on:
As a result of his beatings David was ashamed to go to school lest he be seen with welts. He would "lie down in the cold sand in a hut" near his home to recuperate for several days rather than go to school.
At the hearing, Lewis argued that when Wilson said he had no memory of shooting Stinson he was actually telling the truth. The years of abuse had hurt his ability to retrieve memories. Lewis also argued that Wilson had a violent side that he was, quite literally, unaware of; that he had the classic personality polarization of the severely abused who develop dissociative identity disorder. Lewis has videotapes of her sessions with Wilson: he is a handsome man with long fine black hair, sharply defined high cheekbones, and large, soft eyes. In the videotapes, he looks gentle. "During the hearing," Lewis recalls, "I was testifying, and I looked down at the defense table and David wasn't there. You know, David is a sweetie. He has a softness and a lovable quality. Instead, seated in his place there was this glowering kind of character, and I interrupted myself. I said, 'Excuse me, Your Honor, I just wanted to call to your attention that that is not David.' Everyone just looked." In the end, the judge vacated Wilson's death sentence.
Lewis talks a great deal about the Wilson case. It is one of the few instances in which she and Pincus succeeded in saving a defendant from the death penalty, and when she talks about what happened she almost always uses one of her favorite words-"poignant," spoken with a special emphasis, with a hesitation just before and just afterward. "In the course of evaluating someone, I always look for scars," Lewis told me. We were sitting in her Bellevue offices, watching the video of her examination of Wilson, and she was remembering the poignant moment she first met him. "Since I was working with a male psychologist, I said to him, 'Would you be good enough to go into the bathroom and look at David's back?' So he did that, and then he came back out and said, 'Dorothy! You must come and see this.' David had scars all over his back and chest. Burn marks. Beatings. I've seen a lot. But that was really grotesque."
5.
Abuse, in and of itself, does not necessarily result in violence, any more than neurological impairment or psychosis does. Lewis and Pincus argue, however, that if you mix these conditions together they become dangerous, that they have a kind of pathological synergy, that, like the ingredients of a bomb, they are troublesome individually but explosive in combination.
Several years ago, Lewis and some colleagues did a followup study of ninety-five male juveniles she and Pincus had first worked with in the late nineteen-seventies, in Connecticut. She broke the subjects into several groups: Group 1 consisted of those who did not have psychiatric or neurological vulnerabilities or an abusive childhood; Group 2 consisted of those with vulnerabilities but no abuse at home; Group 3 consisted of those with abuse but no vulnerabilities; yet another group consisted of those with abuse and extensive vulnerabilities. Seven years later, as adults, those in Group 1 had been arrested for an average of just over two criminal offenses, none of which were violent, so the result was essentially no jail time. Group 2, the psychiatrically or neurologically impaired kids, had been convicted of an average of almost ten offenses, two of which were violent, the result being just under a year of jail time. Group 3, the abused kids, had 11.9 offenses, 1.9 of them violent, the result being five hundred and sixty-two days in jail. But the group of children who had the most vulnerabilities and abuse were in another league entirely. In the intervening seven years, they had been arrested for, on average, 16.8 crimes, 5.4 of which were violent, the result being a thousand two hundred and fourteen days in prison.
In another study on this topic, a University of Southern California psychologist named Adrian Raine looked at four thousand two hundred and sixty-nine male children born and living in Denmark, and classified them according to two variables. The first was whether there were complications at birth-which correlates, loosely, with neurological impairment. The second was whether the child had been rejected by the mother (whether the child was unplanned, unwanted, and so forth)-which correlates, loosely, with abuse and neglect. Looking back eighteen years later, Raine found that those children who had not been rejected and had had no birth complications had roughly the same chance of becoming criminally violent as those with only one of the risk factors-around three per cent. For the children with both complications and rejection, however, the risk of violence tripled: in fact, the children with both problems accounted for eighteen per cent of all the violent crimes, even though they made up only 4.5 per cent of the group.
There is in these statistics a powerful and practical suggestion for how to prevent crime. In the current ideological climate, liberals argue that fighting crime requires fighting poverty, and conservatives argue that fighting crime requires ever more police and prisons; both of these things may be true, but both are also daunting. The studies suggest that there may be instances in which more modest interventions can bring large dividends. Criminal behavior that is associated with specific neurological problems is behavior that can, potentially, be diagnosed and treated like any other illness. Already, for example, researchers have found drugs that can mimic the cortical function of moderating violent behavior. The work is preliminary but promising. "We are on the cusp of a revolution in treating these conditions," Stuart Yudofsky told me. "We can use anticonvulsants, antidepressants, anti-hypertensive medications. There are medications out there that are F.D.A.-approved for other conditions which have profound effects on mitigating aggression." At the prevention end, as well, there's a strong argument for establishing aggressive child-abuse-prevention programs. Since 1992, for example, the National Committee to Prevent Child Abuse, a not-for-profit advocacy group based in Chicago, has been successfully promoting a program called Healthy Families America, which, working with hospitals, prenatal clinics, and physicians, identifies mothers in stressful and potentially abusive situations either before they give birth or immediately afterward, and then provides them with weekly home visits, counselling, and support for as long as five years. The main thing holding back nationwide adoption of programs like this is money: Healthy Families America costs up to two thousand dollars per family per year, but if we view it as a crime-prevention measure that's not a large sum.
These ideas, however, force a change in the way we think about criminality. Advances in the understanding of human behavior are necessarily corrosive of the idea of free will. That much is to be expected, and it is why courts have competency hearings, and legal scholars endlessly debate the definition and the use of the insanity defense. But the new research takes us one step further. If the patient of Yudofsky's who lashed out at his nurse because his orange juice was warm had, in the process, accidentally killed her, could we really hold him criminally responsible? Yudofsky says that that scenario is no different from one involving a man who is driving a car, has a heart attack, and kills a pedestrian. "Would you put him in jail?" he asks. Or consider Joseph Paul Franklin. By all accounts, he suffered through a brutal childhood on a par with that of David Wilson. What if he has a lesion on one of his frontal lobes, an atrophied hippocampus, a damaged and immature corpus callosum, a maldeveloped left hemisphere, a lack of synaptic complexity in the precortical limbic area, a profound left-right hemisphere split? What if in his remorselessness he was just the grownup version of the little boy Martin, whose ability to understand and relate to others was so retarded that he kept on hitting and hitting, even after the screams began? What if a history of abuse had turned a tendency toward schizophrenia-recall Franklin's colorful delusions-from a manageable impairment into the engine of murderousness? Such a person might still be sane, according to the strict legal definition. But that kind of medical diagnosis suggests, at the very least, that his ability to live by the rules of civilized society, and to understand and act on the distinctions between right and wrong, is quite different from that of someone who had a normal childhood and a normal brain.
What is implied by these questions is a far broader debate over competency and responsibility-an attempt to make medical considerations far more central to the administration of justice, so that we don't bring in doctors only when the accused seems really crazy but, rather, bring in doctors all the time, to add their expertise to the determination of responsibility.
One of the state-of-the-art diagnostic tools in neurology and psychiatry is the PET scan, a computerized imaging technique that tracks the movement and rate of the body's metabolism. When you sing, for instance, the neurons in the specific regions that govern singing will start to fire. Blood will flow toward those regions, and if you take a PET scan at that moment the specific areas responsible for singing will light up on the PET computer monitor. Bremner, at Yale, has done PET scans of Vietnam War veterans suffering from post-traumatic-stress disorder. As he scanned the vets, he showed them a set of slides of Vietnam battle scenes accompanied by an appropriate soundtrack of guns and helicopters. Then he did the same thing with vets who were not suffering from P.T.S.D. Bremner printed out the results of the comparison for me, and they are fascinating. The pictures are color-coded. Blue shows the parts of the brain that were being used identically in the two groups of veterans, and most of each picture is blue. A few parts are light blue or green, signifying that the P.T.S.D. vets were using those regions a little less than the healthy vets were. The key color, however, is white. White shows brain areas that the healthy vets were using as they watched the slide show and the unhealthy vets were hardly using at all; in Bremner's computer printout, there is a huge white blob in the front of every non-P.T.S.D. scan.
"That's the orbitofrontal region," Bremner told me. "It's responsible for the extinction of fear." The orbitofrontal region is the part of your brain that evaluates the primal feelings of fear and anxiety which come up from the brain's deeper recesses. It's the part that tells you that you're in a hospital watching a slide show of the Vietnam War, not in Vietnam living through the real thing. The vets with P.T.S.D. weren't using that part of their brain. That's why every time a truck backfires or they see a war picture in a magazine they are forced to relive their wartime experiences: they can't tell the difference.
It doesn't take much imagination to see that this technique might someday be used to evaluate criminals-to help decide whether to grant parole, for example, or to find out whether some kind of medical treatment might aid reëntry into normal society. We appear to be creating a brand-new criminal paradigm: the research suggests that instead of thinking about and categorizing criminals merely by their acts-murder, rape, armed robbery, and so on-we ought to categorize criminals also according to their disabilities, so that a murderer with profound neurological damage and a history of vicious childhood abuse is thought of differently from a murderer with no brain damage and mild child abuse, who is, in turn, thought of differently from a murderer with no identifiable impairment at all. This is a more flexible view. It can be argued that it is a more sophisticated view. But even those engaged in such research-for example, Pincus-confess to discomfort at its implications, since something is undoubtedly lost in the translation. The moral force of the old standard, after all, lay in its inflexibility. Murder was murder, and the allowances made for aggravated circumstances were kept to a minimum. Is a moral standard still a moral standard when it is freighted with exceptions and exemptions and physiological equivocation?
When Lewis went to see Bundy, in Florida, on the day before his execution, she asked him why he had invited her-out of a great many people lining up outside his door-to see him. He answered, "Because everyone else wants to know what I did. You are the only one who wants to know why I did it." It's impossible to be sure what the supremely manipulative Bundy meant by this: whether he genuinely appreciated Lewis, or whether he simply regarded her as his last conquest. What is clear is that, over the four or five times they met in Bundy's last years, the two reached a curious understanding: he was now part of her scientific enterprise.
"I wasn't writing a book about him," Lewis recalls. "That he knew. The context in which he had first seen me was a scientific study, and this convinced him that I wasn't using him. In the last meeting, as I recall, he said that he wanted any material that I found out about him to be used to understand what causes people to be violent. We even discussed whether he would allow his brain to be studied. It was not an easy thing to talk about with him, let me tell you." At times, Lewis says, Bundy was manic, "high as a kite." On one occasion, he detailed to her just how he had killed a woman, and, on another occasion, he stared at her and stated flatly, "The man sitting across from you did not commit any murders." But she says that at the end she sensed a certain breakthrough. "The day before he was executed, he asked me to turn off the tape recorder. He said he wanted to tell me things that he didn't want recorded, so I didn't record them. It was very confidential." To this day, Lewis has never told anyone what Bundy said. There is something almost admirable about this. But there is also something strange about extending the physician-patient privilege to a killer like Bundy-about turning the murderer so completely into a patient. It is not that the premise is false, that murderers can't also be patients. It's just that once you make that leap-once you turn the criminal into an object of medical scrutiny-the crime itself inevitably becomes pushed aside and normalized. The difference between a crime of evil and a crime of illness is the difference between a sin and a symptom. And symptoms don't intrude in the relationship between the murderer and the rest of us: they don't force us to stop and observe the distinctions between right and wrong, between the speakable and the unspeakable, the way sins do. It was at the end of that final conversation that Bundy reached down and kissed Lewis on the cheek. But that was not all that happened. Lewis then reached up, put her arms around him, and kissed him back.
Listening to Khakis
July 28, 1997
ANNALS OF STYLE
What America's most popular pants tell us about the way guys think.
1.
In the fall of 1987, Levi Strauss & Co. began running a series of national television commercials to promote Dockers, its new brand of men's khakis. All the spots-and there were twenty-eight-had the same basic structure. A handheld camera would follow a group of men as they sat around a living room or office or bar. The men were in their late thirties, but it was hard to tell, because the camera caught faces only fleetingly. It was trained instead on the men from the waist down-on the seats of their pants, on the pleats of their khakis, on their hands going in and out of their pockets. As the camera jumped in quick cuts from Docker to Docker, the men chatted in loose, overlapping non sequiturs-guy-talk fragments that, when they are rendered on the page, achieve a certain Dadaist poetry. Here is the entire transcript of "Poolman," one of the first-and, perhaps, best-ads in the series:
"She was a redhead about five foot six inches tall."
"And all of a sudden this thing starts spinning, and it's going round and round."
"Is that Nelson?"
"And that makes me safe, because with my wife, I'll never be that way."
"It's like your career, and you're frustrated. I mean that-that's-what you want."
"Of course, that's just my opinion."
"So money's no object."
"Yeah, money's no object."
"What are we going to do with our lives, now?"
"Well . . ."
"Best of all . . ."
[Voice-over] "Levi's one-hundred-per-cent-cotton Dockers. If you're not wearing Dockers, you're just wearing pants."
"And I'm still paying the loans off."
"You've got all the money in the world."
"I'd like to at least be your poolman."
By the time the campaign was over, at the beginning of the nineties, Dockers had grown into a six-hundred-million-dollar business-a brand that if it had spun off from Levi's would have been (and would still be) the fourth-largest clothing brand in the world. Today, seventy per cent of American men between the ages of twenty-five and forty-five own a pair of Dockers, and khakis are expected to be as popular as blue jeans by the beginning of the next century. It is no exaggeration to call the original Dockers ads one of the most successful fashion-advertising campaigns in history.
This is a remarkable fact for a number of reasons, not the least of which is that the Dockers campaign was aimed at men, and no one had ever thought you could hit a home run like that by trying to sell fashion to the American male. Not long ago, two psychologists at York University, in Toronto-Irwin Silverman and Marion Eals-conducted an experiment in which they had men and women sit in an office for two minutes, without any reading material or distraction, while they ostensibly waited to take part in some kind of academic study. Then they were taken from the office and given the real reason for the experiment: to find out how many of the objects in the office they could remember. This was not a test of memory so much as it was a test of awareness-of the kind and quality of unconscious attention that people pay to the particulars of their environment. If you think about it, it was really a test of fashion sense, because, at its root, this is what fashion sense really is-the ability to register and appreciate and remember the details of the way those around you look and dress, and then reinterpret those details and memories yourself.
When the results of the experiment were tabulated, it was found that the women were able to recall the name and the placement of seventy per cent more objects than the men, which makes perfect sense. Women's fashion, after all, consists of an endless number of subtle combinations and variations-of skirt, dress, pants, blouse, T-shirt, hose, pumps, flats, heels, necklace, bracelet, cleavage, collar, curl, and on and on-all driven by the fact that when a woman walks down the street she knows that other women, consciously or otherwise, will notice the name and the placement of what she is wearing. Fashion works for women because women can appreciate its complexity. But when it comes to men what's the point? How on earth do you sell fashion to someone who has no appreciation for detail whatsoever?
The Dockers campaign, however, proved that you could sell fashion to men. But that was only the first of its remarkable implications. The second-which remains as weird and mysterious and relevant to the fashion business today as it was ten years ago-was that you could do this by training a camera on a man's butt and having him talk in yuppie gibberish.
2.
I watched "Poolman" with three members of the new team handling the Dockers account at Foote, Cone & Belding (F.C.B.), Levi's ad agency. We were in a conference room at Levi's Plaza, in downtown San Francisco, a redbrick building decorated (appropriately enough) in khaki like earth tones, with the team members-Chris Shipman, Iwan Thomis, and Tanyia Kandohla-forming an impromptu critical panel. Shipman, who had thick black glasses and spoke in an almost inaudible laid-back drawl, put a videocassette of the first campaign into a VCR-stopping, starting, and rewinding-as the group analyzed what made the spots so special.
"Remember, this is from 1987," he said, pointing to the screen, as the camera began its jerky dance. "Although this style of film making looks everyday now, that kind of handheld stuff was very fresh when these were made."
"They taped real conversations," Kandohla chimed in. "Then the footage was cut together afterward. They were thrown areas to talk about. It was very natural, not at all scripted. People were encouraged to go off on tangents."
After "Poolman," we watched several of the other spots in the original group-"Scorekeeper" and "Dad's Chair," "Flag Football," and "The Meaning of Life"-and I asked about the headlessness of the commercials, because if you watch too many in a row all those anonymous body parts begin to get annoying. But Thomis maintained that the headlessness was crucial, because it was the absence of faces that gave the dialogue its freedom. "They didn't show anyone's head because if they did the message would have too much weight," he said. "It would be too pretentious. You know, people talking about their hopes and dreams. It seems more genuine, as opposed to something stylized."
The most striking aspect of the spots is how different they are from typical fashion advertising. If you look at men's fashion magazines, for example, at the advertisements for the suits of Ralph Lauren or Valentino or Hugo Boss, they almost always consist of a beautiful man, with something interesting done to his hair, wearing a gorgeous outfit. At the most, the man may be gesturing discreetly, or smiling in the demure way that a man like that might smile after, say, telling the supermodel at the next table no thanks he has to catch an early-morning flight to Milan. But that's all. The beautiful face and the clothes tell the whole story. The Dockers ads, though, are almost exactly the opposite. There's no face. The camera is jumping around so much that it's tough to concentrate on the clothes. And instead of stark simplicity, the fashion image is overlaid with a constant, confusing patter. It's almost as if the Dockers ads weren't primarily concerned with clothes at all-and in fact that's exactly what Levi's intended. What the company had discovered, in its research, was that baby-boomer men felt that the chief thing missing from their lives was male friendship. Caught between the demands of the families that many of them had started in the eighties and career considerations that had grown more onerous, they felt they had lost touch with other men. The purpose of the ads-the chatter, the lounging around, the quick cuts-was simply to conjure up a place where men could put on one-hundred-per-cent-cotton khakis and reconnect with one another. In the original advertising brief, that imaginary place was dubbed Dockers World.
This may seem like an awfully roundabout way to sell a man a pair of pants. But that was the genius of the campaign. One of the truisms of advertising is that it's always easier to sell at the extremes than in the middle, which is why the advertisements for Valentino and Hugo Boss are so simple. The man in the market for a thousand-dollar suit doesn't need to be convinced of the value of nice clothes. The man in the middle, though-the man in the market for a forty-dollar pair of khakis-does. In fact, he probably isn't comfortable buying clothes at all. To sell him a pair of pants you have to take him somewhere he is comfortable, and that was the point of Dockers World. Even the apparent gibberish of lines like " 'She was a redhead about five foot six inches tall.' / 'And all of a sudden this thing starts spinning, and it's going round and round.' / 'Is that Nelson?' " has, if you listen closely enough, a certain quintessentially guy-friendly feel. It's the narrative equivalent of the sports-highlight reel-the sequence of five-second film clips of the best plays from the day's basketball or football or baseball games, which millions of American men watch every night on television. This nifty couplet from "Scorekeeper," for instance-" 'Who remembers their actual first girlfriend?'/ 'I would have done better, but I was bald then, too' "-is not nonsense but a twenty-minute conversation edited down to two lines. A man schooled in the highlight reel no more needs the other nineteen minutes and fifty-eight seconds of that exchange than he needs to see the intervening catch and throw to make sense of a sinking liner to left and a close play at the plate.
"Men connected to the underpinnings of what was being said," Robert Hanson, the vice-president of marketing for Dockers, told me. "These guys were really being honest and genuine and real with each other, and talking about their lives. It may not have been the truth, but it was the fantasy of what a lot of customers wanted, which was not just to be work-focussed but to have the opportunity to express how you feel about your family and friends and lives. The content was very important. The thing that built this brand was that we absolutely nailed the emotional underpinnings of what motivates baby boomers."
Hanson is a tall, striking man in his early thirties. He's what Jeff Bridges would look like if he had gone to finishing school. Hanson said that when he goes out on research trips to the focus groups that Dockers holds around the country he often deliberately stays in the background, because if the men in the group see him "they won't necessarily respond as positively or as openly." When he said this, he was wearing a pair of stone-white Dockers, a deep-blue shirt, a navy blazer, and a brilliant-orange patterned tie, and these worked so well together that it was obvious what he meant. When someone like Hanson dresses up that fabulously in Dockers, he makes it clear just how many variations and combinations are possible with a pair of khakis-but that, of course, defeats the purpose of the carefully crafted Dockers World message, which is to appeal to the man who wants nothing to do with fashion's variations and combinations. It's no coincidence that every man in every one of the group settings profiled in each commercial is wearing-albeit in different shades-exactly the same kind of pants. Most fashion advertising sells distinctiveness. (Can you imagine, say, an Ann Taylor commercial where a bunch of thirtyish girlfriends are lounging around chatting, all decked out in matching sweater sets?) Dockers was selling conformity.
"We would never do anything with our pants that would frighten anyone away," Gareth Morris, a senior designer for the brand, told me. "We'd never do too many belt loops, or an unusual base cloth. Our customers like one-hundred-per-cent-cotton fabrics. We would never do a synthetic. That's definitely in the market, but it's not where we need to be. Styling-wise, we would never do a wide, wide leg. We would never do a peg-legged style. Our customers seem to have a definite idea of what they want. They don't like tricky openings or zips or a lot of pocket flaps and details on the back. We've done button-through flaps, to push it a little bit. But we usually do a welt pocket-that's a pocket with a button-through. It's funny. We have focus groups in New York, Chicago, and San Francisco, and whenever we show them a pocket with a flap-it's a simple thing-they hate it. They won't buy the pants. They complain, 'How do I get my wallet?' So we compromise and do a welt. That's as far as they'll go. And there's another thing. They go, 'My butt's big enough. I don't want flaps hanging off of it, too.' They like inseam pockets. They like to know where they put their hands." He gestured to the pair of experimental prototype Dockers he was wearing, which had pockets that ran almost parallel to the waistband of the pants. "This is a stretch for us," he said. "If you start putting more stuff on than we have on our product, you're asking for trouble."
The apotheosis of the notion of khakis as nonfashion-guy fashion came several years after the original Dockers campaign, when Haggar Clothing Co. hired the Goodby, Silverstein & Partners ad agency, in San Francisco, to challenge Dockers' khaki dominance. In retrospect, it was an inspired choice, since Goodby, Silverstein is Guy Central. It does Porsche ("Kills Bugs Fast") and Isuzu and the recent "Got Milk?" campaign and a big chunk of the Nike business, and it operates out of a gutted turn-of-the-century building downtown, refurbished in what is best described as neo-Erector set. The campaign that it came up with featured voice-overs by Roseanne's television husband, John Goodman. In the best of the ads, entitled "I Am," a thirtyish man wakes up, his hair all mussed, pulls on a pair of white khakis, and half sleepwalks outside to get the paper. "I am not what I wear. I'm not a pair of pants, or a shirt," Goodman intones. The man walks by his wife, handing her the front sections of the paper. "I'm not in touch with my inner child. I don't read poetry, and I'm not politically correct." He heads away from the kitchen, down a hallway, and his kid grabs the comics from him. "I'm just a guy, and I don't have time to think about what I wear, because I've got a lot of important guy things to do." All he has left now is the sports section and, gripping it purposefully, he heads for the bathroom. "One-hundred-per-cent-cotton wrinkle-free khaki pants that don't require a lot of thought. Haggar. Stuff you can wear."
"We softened it," Richard Silverstein told me as we chatted in his office, perched on chairs in the midst of-among other things--a lacrosse stick, a bike stand, a gym bag full of yesterday's clothing, three toy Porsches, and a giant model of a Second World War Spitfire hanging from the ceiling. "We didn't say 'Haggar Apparel' or 'Haggar Clothing.' We said, 'Hey, listen, guys, don't worry. It's just stuff. Don't worry about it.' The concept was 'Make it approachable.' " The difference between this and the Dockers ad is humor. F.C.B. assiduously documented men's inner lives. Goodby, Silverstein made fun of them. But it's essentially the same message. It's instructive, in this light, to think about the Casual Friday phenomenon of the past decade, the loosening of corporate dress codes that was spawned by the rise of khakis. Casual Fridays are commonly thought to be about men rejecting the uniform of the suit. But surely that's backward. Men started wearing khakis to work because Dockers and Haggar made it sound as if khakis were going to be even easier than a suit. The khaki-makers realized that men didn't want to get rid of uniforms; they just wanted a better uniform.
The irony, of course, is that this idea of nonfashion-of khakis as the choice that diminishes, rather than enhances, the demands of fashion-turned out to be a white lie. Once you buy even the plainest pair of khakis, you invariably also buy a sports jacket and a belt and a whole series of shirts to go with it-maybe a polo knit for the weekends, something in plaid for casual, and a button-down for a dressier look-and before long your closet is thick with just the kinds of details and options that you thought you were avoiding. You may not add these details as brilliantly or as consciously as, say, Hanson does, but you end up doing it nonetheless. In the past seven years, sales of men's clothing in the United States have risen an astonishing twenty-one per cent, in large part because of this very fact-that khakis, even as they have simplified the bottom half of the male wardrobe, have forced a steady revision of the top. At the same time, even khakis themselves-within the narrow constraints of khakidom-have quietly expanded their range. When Dockers were launched, in the fall of 1986, there were just three basic styles: the double-pleated Docker in khaki, olive, navy, and black; the Steamer, in cotton canvas; and the more casual flat-fronted Docker. Now there are twenty-four. Dockers and Haggar and everyone else have been playing a game of bait and switch: lure men in with the promise of a uniform and then slip them, bit by bit, fashion. Put them in an empty room and then, ever so slowly, so as not to scare them, fill the room with objects.
3.
There is a puzzle in psychology known as the canned-laughter problem, which has a deeper and more complex set of implications about men and women and fashion and why the Dockers ads were so successful. Over the years, several studies have been devoted to this problem, but perhaps the most instructive was done by two psychologists at the University of Wisconsin, Gerald Cupchik and Howard Leventhal. Cupchik and Leventhal took a stack of cartoons (including many from The New Yorker), half of which an independent panel had rated as very funny and half of which it had rated as mediocre. They put the cartoons on slides, had a voice-over read the captions, and presented the slide show to groups of men and women. As you might expect, both sexes reacted pretty much the same way. Then Cupchik and Leventhal added a laugh track to the voice-over-the subjects were told that it was actual laughter from people who were in the room during the taping-and repeated the experiment. This time, however, things got strange. The canned laughter made the women laugh a little harder and rate the cartoons as a little funnier than they had before. But not the men. They laughed a bit more at the good cartoons but much more at the bad cartoons. The canned laughter also made them rate the bad cartoons as much funnier than they had rated them before, but it had little or no effect on their ratings of the good cartoons. In fact, the men found a bad cartoon with a laugh track to be almost as funny as a good cartoon without one. What was going on?
The guru of male-female differences in the ad world is Joan Meyers-Levy, a professor at the University of Chicago business school. In a groundbreaking series of articles written over the past decade, Meyers-Levy has explained the canned-laughter problem and other gender anomalies by arguing that men and women use fundamentally different methods of processing information. Given two pieces of evidence about how funny something is-their own opinion and the opinion of others (the laugh track)-the women came up with a higher score than before because they added the two clues together: they integrated the information before them. The men, on the other hand, picked one piece of evidence and ignored the other. For the bad cartoons, they got carried away by the laugh track and gave out hugely generous scores for funniness. For the good cartoons, however, they were so wedded to their own opinion that suddenly the laugh track didn't matter at all.
This idea-that men eliminate and women integrate-is called by Meyers-Levy the "selectivity hypothesis." Men are looking for a way to simplify the route to a conclusion, so they seize on the most obvious evidence and ignore the rest, while women, by contrast, try to process information comprehensively. So-called bandwidth research, for example, has consistently shown that if you ask a group of people to sort a series of objects or ideas into categories, the men will create fewer and larger categories than the women will. They use bigger mental bandwidths. Why? Because the bigger the bandwidth the less time and attention you have to pay to each individual object. Or consider what is called the invisibility question. If a woman is being asked a series of personal questions by another woman, she'll say more if she's facing the woman she's talking to than she will if her listener is invisible. With men, it's the opposite. When they can't see the person who's asking them questions, they suddenly and substantially open up. This, of course, is a condition of male communication which has been remarked on by women for millennia. But the selectivity hypothesis suggests that the cause of it has been misdiagnosed. It's not that men necessarily have trouble expressing their feelings; it's that in a face-to-face conversation they experience emotional overload. A man can't process nonverbal information (the expression and body language of the person asking him questions) and verbal information (the personal question being asked) at the same time any better than he can process other people's laughter and his own laughter at the same time. He has to select, and it is Meyers-Levy's contention that this pattern of behavior suggests significant differences in the way men and women respond to advertising.
Joan Meyers-Levy is a petite woman in her late thirties, with a dark pageboy haircut and a soft voice. She met me in the downtown office of the University of Chicago with three large folders full of magazine advertisements under one arm, and after chatting about the origins and the implications of her research she handed me an ad from several years ago for Evian bottled water. It has a beautiful picture of the French Alps and, below that, in large type, "Our factory." The text ran for several paragraphs, beginning:
You're not just looking at the French Alps. You're looking at one of the most pristine places on earth. And the origin of Evian Natural Spring Water.
Here, it takes no less than 15 years for nature to purify every drop of Evian as it flows through mineral-rich glacial formations deep within the mountains. And it is here that Evian acquires its unique balance of minerals.
"Now, is that a male or a female ad?" she asked. I looked at it again. The picture baffled me. But the word "factory" seemed masculine, so I guessed male.
She shook her head. "It's female. Look at the picture. It's just the Alps, and then they label it 'Our factory.' They're using a metaphor. To understand this, you're going to have to engage in a fair amount of processing. And look at all the imagery they're encouraging you to build up. You're not just looking at the French Alps. It's 'one of the most pristine places on earth' and it will take nature 'no less than fifteen years' to purify." Her point was that this is an ad that works only if the viewer appreciates all its elements-if the viewer integrates, not selects. A man, for example, glancing at the ad for a fraction of a second, might focus only on the words "Our factory" and screen out the picture of the Alps entirely, the same way he might have screened out the canned laughter. Then he wouldn't get the visual metaphor. In fact, he might end up equating Evian with a factory, and that would be a disaster. Anyway, why bother going into such detail about the glaciers if it's just going to get lost in the big male bandwidth?
Meyers-Levy handed me another Evian advertisement. It showed a man-the Olympic Gold Medal swimmer Matt Biondi-by a pool drinking Evian, with the caption "Revival of the fittest." The women's ad had a hundred and nineteen words of text. This ad had just twenty-nine words: "No other water has the unique, natural balance of minerals that Evian achieves during its 15-year journey deep within the French Alps. To be the best takes time." Needless to say, it came from a men's magazine. "With men, you don't want the fluff," she said. "Women, though, participate a lot more in whatever they are processing. By giving them more cues, you give them something to work with. You don't have to be so literal. With women you can be more allusive, so you can draw them in. They will engage in elaboration, and the more associations they make the easier it is to remember and retrieve later on."
Meyers-Levy took a third ad from her pile, this one for the 1997 Mercury Mountaineer four-wheel-drive sport-utility vehicle. It covers two pages, has the heading "Take the Rough with the Smooth," and shows four pictures-one of the vehicle itself, one of a mother and her child, one of a city skyline, and a large one of the interior of the car, over which the ad's text is superimposed. Around the border of the ad are forty-four separate, tiny photographs of roadways and buildings and construction sites and manhole covers. Female. Next to it on the table she put another ad-this one a single page, with a picture of the Mountaineer's interior, fifteen lines of text, a picture of the car's exterior, and, at the top, the heading: "When the Going Gets Tough, the Tough Get Comfortable." Male. "It's details, details. They're saying lots of different stuff," she said, pointing to the female version. "With men, instead of trying to cover everything in a single execution, you'd probably want to have a whole series of ads, each making a different point."
After a while, the game got very easy-if a bit humiliating. Meyers-Levy said that her observations were not antimale-that both the male and the female strategies have their strengths and their weaknesses-and, of course, she's right. On the other hand, reading the gender of ads makes it painfully obvious how much the advertising world-consciously or not-talks down to men. Before I met Meyers-Levy, I thought that the genius of the famous first set of Dockers ads was their psychological complexity, their ability to capture the many layers of eighties guyness. But when I thought about them again after meeting Meyers-Levy, I began to think that their real genius lay in their heroic simplicity-in the fact that F.C.B. had the self-discipline to fill the allotted thirty seconds with as little as possible. Why no heads? The invisibility rule. Guys would never listen to that Dadaist extemporizing if they had to process nonverbal cues, too. Why were the ads set in people's living rooms and at the office? Bandwidth. The message was that khakis were wide-bandwidth pants. And why were all the ads shot in almost exactly the same way, and why did all the dialogue run together in one genial, faux-philosophical stretch of highlight reel? Because of canned laughter. Because if there were more than one message to be extracted men would get confused.
4.
In the early nineties, Dockers began to falter. In 1992, the company sold sixty-six million pairs of khakis, but in 1993, as competition from Haggar and the Gap and other brands grew fiercer, that number slipped to fifty-nine million six hundred thousand, and by 1994 it had fallen to forty-seven million. In marketing-speak, user reality was encroaching on brand personality; that is, Dockers were being defined by the kind of middle-aged men who wore them, and not by the hipper, younger men in the original advertisements. The brand needed a fresh image, and the result was the "Nice Pants" campaign currently being shown on national television-a campaign widely credited with the resurgence of Dockers' fortunes. In one of the spots, "Vive la France," a scruffy young man in his early twenties, wearing Dockers, is sitting in a café in Paris. He's obviously a tourist. He glances up and sees a beautiful woman (actually, the supermodel Tatjana Patitz) looking right at him. He's in heaven. She starts walking directly toward him, and as she passes by she says, "Beau pantalon." As he looks frantically through his French phrase book for a translation, the waiter comes by and cuffs him on the head: "Hey, she says, 'Nice pants.' " Another spot in the series, "Subway Love," takes place on a subway car in Chicago. He (a nice young man wearing Dockers) spots her (a total babe), and their eyes lock. Romantic music swells. He moves toward her, but somehow, in a sudden burst of pushing and shoving, they get separated. Last shot: she's inside the car, her face pushed up against the glass. He's outside the car, his face pushed up against the glass. As the train slowly pulls away, she mouths two words: "Nice pants."
It may not seem like it, but "Nice Pants" is as radical a campaign as the original Dockers series. If you look back at the way that Sansabelt pants, say, were sold in the sixties, each ad was what advertisers would call a pure "head" message: the pants were comfortable, durable, good value. The genius of the first Dockers campaign was the way it combined head and heart: these were all-purpose, no-nonsense pants that connected to the emotional needs of baby boomers. What happened to Dockers in the nineties, though, was that everyone started to do head and heart for khakis. Haggar pants were wrinkle-free (head) and John Goodman-guy (heart). The Gap, with its brilliant billboard campaign of the early nineties-"James Dean wore khakis," "Frank Lloyd Wright wore khakis"-perfected the heart message by forging an emotional connection between khakis and a particular nostalgic, glamorous all-Americanness. To reassert itself, Dockers needed to go an extra step. Hence "Nice Pants," a campaign that for the first time in Dockers history raises the subject of sex.
"It's always been acceptable for a man to be a success in business," Hanson said, explaining the rationale behind "Nice Pants." "It's always been expected of a man to be a good provider. The new thing that men are dealing with is that it's O.K. for men to have a sense of personal style, and that it's O.K. to be seen as sexy. It's less about the head than about the combination of the head, the heart, and the groin. It's those three things. That's the complete man."
The radical part about this, about adding the groin to the list, is that almost no other subject for men is as perilous as the issue of sexuality and fashion. What "Nice Pants" had to do was talk about sex the same way that "Poolman" talked about fashion, which was to talk about it by not talking about it-or, at least, to talk about it in such a coded, cautious way that no man would ever think Dockers was suggesting that he wear khakis in order to look pretty. When I took a videotape of the "Nice Pants" campaign to several of the top agencies in New York and Los Angeles, virtually everyone agreed that the spots were superb, meaning that somehow F.C.B. had managed to pull off this balancing act.
What David Altschiller, at Hill, Holliday/Altschiller, in Manhattan, liked about the spots, for example, was that the hero was naïve: in neither case did he know that he had on nice pants until a gorgeous woman told him so. Naïveté, Altschiller stressed, is critical. Several years ago, he did a spot for Claiborne for Men cologne in which a great-looking guy in a bar, wearing a gorgeous suit, was obsessing neurotically about a beautiful woman at the other end of the room: "I see this woman. She's perfect. She's looking at me. She's smiling. But wait. Is she smiling at me? Or laughing at me? . . . Or looking at someone else?" You'd never do this in an ad for women's cologne. Can you imagine? "I see this guy. He's perfect. Ohmigod. Is he looking at me?" In women's advertising, self-confidence is sexy. But if a man is self-confident-if he knows he is attractive and is beautifully dressed-then he's not a man anymore. He's a fop. He's effeminate. The cologne guy had to be neurotic or the ad wouldn't work. "Men are still abashed about acknowledging that clothing is important," Altschiller said. "Fashion can't be important to me as a man. Even when, in the first commercial, the waiter says 'Nice pants,' it doesn't compute to the guy wearing the nice pants. He's thinking, What do you mean, 'Nice pants'?" Altschiller was looking at a videotape of the Dockers ad as he talked-standing at a forty-five-degree angle to the screen, with one hand on the top of the monitor, one hand on his hip, and a small, bemused smile on his lips. "The world may think they are nice, but so long as he doesn't think so he doesn't have to be self-conscious about it, and the lack of self-consciousness is very important to men. Because 'I don't care.' Or 'Maybe I care, but I can't be seen to care.' " For the same reason, Altschiller liked the relative understatement of the phrase "nice pants," as opposed to something like "great pants," since somewhere between "nice" and "great" a guy goes from just happening to look good to the unacceptable position of actually trying to look good. "In focus groups, men said that to be told you had 'nice pants' was one of the highest compliments a man could wish for," Tanyia Kandohla told me later, when I asked about the slogan. "They wouldn't want more attention drawn to them than that."
In many ways, the "Nice Pants" campaign is a direct descendant of the hugely successful campaign that Rubin-Postaer & Associates, in Santa Monica, did for Bugle Boy Jeans in the early nineties. In the most famous of those spots, the camera opens on an attractive but slightly goofy-looking man in a pair of jeans who is hitchhiking by the side of a desert highway. Then a black Ferrari with a fabulous babe at the wheel drives by, stops, and backs up. The babe rolls down the window and says, "Excuse me. Are those Bugle Boy Jeans that you're wearing?" The goofy guy leans over and pokes his head in the window, a surprised half smile on his face: "Why, yes, they are Bugle Boy Jeans."
"Thank you," the babe says, and she rolls up the window and drives away.
This is really the same ad as "Nice Pants"-the babe, the naïve hero, the punch line. The two ads have something else in common. In the Bugle Boy spot, the hero wasn't some stunning male model. "I think he was actually a box boy at Vons in Huntington Beach," Larry Postaer, the creative director of Rubin-Postaer & Associates, told me. "I guess someone"-at Bugle Boy-"liked him." He's O.K.-looking, but not nearly in the same class as the babe in the Ferrari. In "Subway Love," by the same token, the Dockers man is medium-sized, almost small, and gets pushed around by much tougher people in the tussle on the train. He's cute, but he's a little bit of a wimp. Kandohla says that F.C.B. tried very hard to find someone with that look-someone who was, in her words, "aspirational real," not some "buff, muscle-bound jock." In a fashion ad for women, you can use Claudia Schiffer to sell a cheap pair of pants. But not in a fashion ad for men. The guy has to be believable. "A woman cannot be too gorgeous," Postaer explained. "A man, however, can be too gorgeous, because then he's not a man anymore. It's pretty rudimentary. Yet there are people who don't buy that, and have gorgeous men in their ads. I don't get it. Talk to Barneys about how well that's working. It couldn't stay in business trying to sell that high-end swagger to a mass market. The general public wouldn't accept it. Look at beer commercials. They always have these gorgeous girls-even now, after all the heat-and the guys are always just guys. That's the way it is. We only reflect what's happening out there, we're not creating it. Those guys who run the real high-end fashion ads-they don't understand that. They're trying to remold how people think about gender. I can't explain it, though I have my theories. It's like a Grecian ideal. But you can't be successful at advertising by trying to re-create the human condition. You can't alter men's minds, particularly on subjects like sexuality. It'll never happen."
Postaer is a gruff, rangy guy, with a Midwestern accent and a gravelly voice, who did Budweiser commercials in Chicago before moving West fifteen years ago. When he wasn't making fun of the pretentious style of East Coast fashion advertising, he was making fun of the pretentious questions of East Coast writers. When, for example, I earnestly asked him to explain the logic behind having the goofy guy screw up his face in such a-well, goofy-way when he says, "Why, yes, they are Bugle Boy Jeans," Postaer took his tennis shoes off his desk, leaned forward bemusedly in his chair, and looked at me as if my head came to a small point. "Because that's the only way he could say it," he said. "I suppose we might have had him say it a little differently if he could actually act."
Incredibly, Postaer said, the people at Bugle Boy wanted the babe to invite the goofy guy into the car, despite the fact that this would have violated the most important rule that governs this new style of groin messages in men's-fashion advertising, which is that the guy absolutely cannot ever get the girl. It's not just that if he got the girl the joke wouldn't work anymore; it's that if he got the girl it might look as if he had deliberately dressed to get the girl, and although at the back of every man's mind as he's dressing in the morning there is the thought of getting the girl, any open admission that that's what he's actually trying to do would undermine the whole unself-conscious, antifashion statement that men's advertising is about. If Tatjana Patitz were to say "Beau garçon" to the guy in "Vive la France," or the babe on the subway were to give the wimp her number, Dockers would suddenly become terrifyingly conspicuous-the long-pants equivalent of wearing a tight little Speedo to the beach. And if the Vons box boy should actually get a ride from the Ferrari babe, the ad would suddenly become believable only to that thin stratum of manhood which thinks that women in Ferraris find twenty-four-dollar jeans irresistible. "We fought that tooth and nail," Postaer said. "And it more or less cost us the account, even though the ad was wildly successful." He put his tennis shoes back up on the desk. "But that's what makes this business fun-trying to prove to clients how wrong they are."
5.
The one ad in the "Nice Pants" campaign which isn't like the Bugle Boy spots is called "Motorcycle." In it a nice young man happens upon a gleaming Harley on a dark back street of what looks like downtown Manhattan. He strokes the seat and then, unable to contain himself, climbs aboard the bike and bounces up and down, showing off his Dockers (the "product shot") but accidentally breaking a mirror on the handlebar. He looks up. The Harley's owner-a huge, leather-clad biker-is looking down at him. The biker glowers, looking him up and down, and says, "Nice pants." Last shot: the biker rides away, leaving the guy standing on the sidewalk in just his underwear.
What's surprising about this ad is that, unlike "Vive la France" and "Subway Love," it does seem to cross the boundaries of acceptable sex talk. The rules of guy advertising so carefully observed in those spots-the fact that the hero has to be naïve, that he can't be too good-looking, that he can't get the girl, and that he can't be told anything stronger than "Nice pants"-are all, in some sense, reactions to the male fear of appearing too concerned with fashion, of being too pretty, of not being masculine. But what is "Motorcycle"? It's an ad about a sweet-looking guy down in the Village somewhere who loses his pants to a butch-looking biker in leather. "I got so much feedback at the time of 'Well, God, that's kind of gay, don't you think?' " Robert Hanson said. "People were saying, 'This buff guy comes along and he rides off with the guy's pants. I mean, what the hell were they doing?' It came from so many different people within the industry. It came from some of our most conservative retailers. But do you know what? If you put these three spots up-'Vive la France,' 'Subway Love,' and 'Motorcycle'-which one do you think men will talk about ad nauseam? 'Motorcycle.' It's No. 1. It's because he's really cool. He's in a really cool environment, and it's every guy's fantasy to have a really cool, tricked-out fancy motorcycle."
Hanson paused, as if he recognized that what he was saying was quite sensitive. He didn't want to say that men failed to pick up the gay implications of the ad because they're stupid, because they aren't stupid. And he didn't want to sound condescending, because Dockers didn't build a six-hundred-million-dollar business in five years by sounding condescending. All he was trying to do was point out the fundamental exegetical error in calling this a gay ad, because the only way for a Dockers man to be offended by "Motorcycle" would be if he thought about it with a little imagination, if he picked up on some fairly subtle cues, if he integrated an awful lot of detail. In other words, a Dockers man could only be offended if he did precisely what, according to Meyers-Levy, men don't do. It's not a gay ad because it's a guy ad. "The fact is," Hanson said, "that most men's interpretation of that spot is: You know what? Those pants must be really cool, because they prevented him from getting the shit kicked out of him."
The Coolhunt
March 17, 1997
ANNALS OF STYLE
Who decides what's cool?
Certain kids in certain places--
and only the coolhunters know who they are.
1.
Baysie Wightman met DeeDee Gordon, appropriately enough, on a coolhunt. It was 1992. Baysie was a big shot for Converse, and DeeDee, who was barely twenty-one, was running a very cool boutique called Placid Planet, on Newbury Street in Boston. Baysie came in with a camera crew-one she often used when she was coolhunting-and said, "I've been watching your store, I've seen you, I've heard you know what's up," because it was Baysie's job at Converse to find people who knew what was up and she thought DeeDee was one of those people. DeeDee says that she responded with reserve-that "I was like, 'Whatever' "-but Baysie said that if DeeDee ever wanted to come and work at Converse she should just call, and nine months later DeeDee called. This was about the time the cool kids had decided they didn't want the hundred-and-twenty-five-dollar basketball sneaker with seventeen different kinds of high-technology materials and colors and air-cushioned heels anymore. They wanted simplicity and authenticity, and Baysie picked up on that. She brought back the Converse One Star, which was a vulcanized, suède, low-top classic old-school sneaker from the nineteen-seventies, and, sure enough, the One Star quickly became the signature shoe of the retro era. Remember what Kurt Cobain was wearing in the famous picture of him lying dead on the ground after committing suicide? Black Converse One Stars. DeeDee's big score was calling the sandal craze. She had been out in Los Angeles and had kept seeing the white teen-age girls dressing up like cholos, Mexican gangsters, in tight white tank tops known as "wife beaters," with a bra strap hanging out, and long shorts and tube socks and shower sandals. DeeDee recalls, "I'm like, 'I'm telling you, Baysie, this is going to hit. There are just too many people wearing it. We have to make a shower sandal.' " So Baysie, DeeDee, and a designer came up with the idea of making a retro sneaker-sandal, cutting the back off the One Star and putting a thick outsole on it. It was huge, and, amazingly, it's still huge.
Today, Baysie works for Reebok as general-merchandise manager-part of the team trying to return Reebok to the position it enjoyed in the mid-nineteen-eighties as the country's hottest sneaker company. DeeDee works for an advertising agency in Del Mar called Lambesis, where she puts out a quarterly tip sheet called the L Report on what the cool kids in major American cities are thinking and doing and buying. Baysie and DeeDee are best friends. They talk on the phone all the time. They get together whenever Baysie is in L.A. (DeeDee: "It's, like, how many times can you drive past O. J. Simpson's house?"), and between them they can talk for hours about the art of the coolhunt. They're the Lewis and Clark of cool.
What they have is what everybody seems to want these days, which is a window on the world of the street. Once, when fashion trends were set by the big couture houses-when cool was trickle-down-that wasn't important. But sometime in the past few decades things got turned over, and fashion became trickle-up. It's now about chase and flight-designers and retailers and the mass consumer giving chase to the elusive prey of street cool-and the rise of coolhunting as a profession shows how serious the chase has become. The sneakers of Nike and Reebok used to come out yearly. Now a new style comes out every season. Apparel designers used to have an eighteen-month lead time between concept and sale. Now they're reducing that to a year, or even six months, in order to react faster to new ideas from the street. The paradox, of course, is that the better coolhunters become at bringing the mainstream close to the cutting edge, the more elusive the cutting edge becomes. This is the first rule of the cool: The quicker the chase, the quicker the flight. The act of discovering what's cool is what causes cool to move on, which explains the triumphant circularity of coolhunting: because we have coolhunters like DeeDee and Baysie, cool changes more quickly, and because cool changes more quickly, we need coolhunters like DeeDee and Baysie.
DeeDee is tall and glamorous, with short hair she has dyed so often that she claims to have forgotten her real color. She drives a yellow 1977 Trans Am with a burgundy stripe down the center and a 1973 Mercedes 450 SL, and lives in a spare, Japanese-style cabin in Laurel Canyon. She uses words like "rad" and "totally," and offers non-stop, deadpan pronouncements on pop culture, as in "It's all about Pee-wee Herman." She sounds at first like a teen, like the same teens who, at Lambesis, it is her job to follow. But teen speech-particularly girl-teen speech, with its fixation on reported speech ("so she goes," "and I'm like," "and he goes") and its stock vocabulary of accompanying grimaces and gestures-is about using language less to communicate than to fit in. DeeDee uses teen speech to set herself apart, and the result is, for lack of a better word, really cool. She doesn't do the teen thing of climbing half an octave at the end of every sentence. Instead, she drags out her vowels for emphasis, so that if she mildly disagreed with something I'd said she would say "Maalcolm" and if she strongly disagreed with what I'd said she would say "Maaalcolm."
Baysie is older, just past forty (although you would never guess that), and went to Exeter and Middlebury and had two grandfathers who went to Harvard (although you wouldn't guess that, either). She has curly brown hair and big green eyes and long legs and so much energy that it is hard to imagine her asleep, or resting, or even standing still for longer than thirty seconds. The hunt for cool is an obsession with her, and DeeDee is the same way. DeeDee used to sit on the corner of West Broadway and Prince in SoHo-back when SoHo was cool-and take pictures of everyone who walked by for an entire hour. Baysie can tell you precisely where she goes on her Reebok coolhunts to find the really cool alternative white kids ("I'd maybe go to Portland and hang out where the skateboarders hang out near that bridge") or which snowboarding mountain has cooler kids-Stratton, in Vermont, or Summit County, in Colorado. (Summit, definitely.) DeeDee can tell you on the basis of the L Report's research exactly how far Dallas is behind New York in coolness (from six to eight months). Baysie is convinced that Los Angeles is not happening right now: "In the early nineteen-nineties a lot more was coming from L.A. They had a big trend with the whole Melrose Avenue look-the stupid goatees, the shorter hair. It was cleaned-up aftergrunge. There were a lot of places you could go to buy vinyl records. It was a strong place to go for looks. Then it went back to being horrible." DeeDee is convinced that Japan is happening: "I linked onto this future-technology thing two years ago. Now look at it, it's huge. It's the whole resurgence of Nike-Nike being larger than life. I went to Japan and saw the kids just bailing the most technologically advanced Nikes with their little dresses and little outfits and I'm like, 'Whoa, this is trippy!' It's performance mixed with fashion. It's really superheavy." Baysie has a theory that Liverpool is cool right now because it's the birthplace of the whole "lad" look, which involves soccer blokes in the pubs going superdressy and wearing Dolce & Gabbana and Polo Sport and Reebok Classics on their feet. But when I asked DeeDee about that, she just rolled her eyes: "Sometimes Baysie goes off on these tangents. Man, I love that woman!"
I used to think that if I talked to Baysie and DeeDee long enough I could write a coolhunting manual, an encyclopedia of cool. But then I realized that the manual would have so many footnotes and caveats that it would be unreadable. Coolhunting is not about the articulation of a coherent philosophy of cool. It's just a collection of spontaneous observations and predictions that differ from one moment to the next and from one coolhunter to the next. Ask a coolhunter where the baggy-jeans look came from, for example, and you might get any number of answers: urban black kids mimicking the jailhouse look, skateboarders looking for room to move, snowboarders trying not to look like skiers, or, alternatively, all three at once, in some grand concordance.
Or take the question of exactly how Tommy Hilfiger-a forty-five-year-old white guy from Greenwich, Connecticut, doing all-American preppy clothes-came to be the designer of choice for urban black America. Some say it was all about the early and visible endorsement given Hilfiger by the hip-hop auteur Grand Puba, who wore a dark-green-and-blue Tommy jacket over a white Tommy T-shirt as he leaned on his black Lamborghini on the cover of the hugely influential "Grand Puba 2000" CD, and whose love for Hilfiger soon spread to other rappers. (Who could forget the rhymes of Mobb Deep? "Tommy was my nigga /And couldn't figure /How me and Hilfiger / used to move through with vigor.") Then I had lunch with one of Hilfiger's designers, a twenty-six-year-old named Ulrich (Ubi) Simpson, who has a Puerto Rican mother and a Dutch-Venezuelan father, plays lacrosse, snowboards, surfs the long board, goes to hip-hop concerts, listens to Jungle, Edith Piaf, opera, rap, and Metallica, and has working with him on his design team a twenty-seven-year-old black guy from Montclair with dreadlocks, a twenty-two-year-old Asian-American who lives on the Lower East Side, a twenty-five-year-old South Asian guy from Fiji, and a twenty-one-year-old white graffiti artist from Queens. That's when it occurred to me that maybe the reason Tommy Hilfiger can make white culture cool to black culture is that he has people working for him who are cool in both cultures simultaneously. Then again, maybe it was all Grand Puba. Who knows?
One day last month, Baysie took me on a coolhunt to the Bronx and Harlem, lugging a big black canvas bag with twenty-four different shoes that Reebok is about to bring out, and as we drove down Fordham Road, she had her head out the window like a little kid, checking out what everyone on the street was wearing. We went to Dr. Jay's, which is the cool place to buy sneakers in the Bronx, and Baysie crouched down on the floor and started pulling the shoes out of her bag one by one, soliciting opinions from customers who gathered around and asking one question after another, in rapid sequence. One guy she listened closely to was maybe eighteen or nineteen, with a diamond stud in his ear and a thin beard. He was wearing a Polo baseball cap, a brown leather jacket, and the big, oversized leather boots that are everywhere uptown right now. Baysie would hand him a shoe and he would hold it, look at the top, and move it up and down and flip it over. The first one he didn't like: "Oh-kay." The second one he hated: he made a growling sound in his throat even before Baysie could give it to him, as if to say, "Put it back in the bag-now!" But when she handed him a new DMX RXT-a low-cut run/walk shoe in white and blue and mesh with a translucent "ice" sole, which retails for a hundred and ten dollars-he looked at it long and hard and shook his head in pure admiration and just said two words, dragging each of them out: "No doubt."
Baysie was interested in what he was saying, because the DMX RXT she had was a girls' shoe that actually hadn't been doing all that well. Later, she explained to me that the fact that the boys loved the shoe was critical news, because it suggested that Reebok had a potential hit if it just switched the shoe to the men's section. How she managed to distill this piece of information from the crowd of teenagers around her, how she made any sense of the two dozen shoes in her bag, most of which (to my eyes, anyway) looked pretty much the same, and how she knew which of the teens to really focus on was a mystery. Baysie is a Wasp from New England, and she crouched on the floor in Dr. Jay's for almost an hour, talking and joking with the homeboys without a trace of condescension or self-consciousness.
Near the end of her visit, a young boy walked up and sat down on the bench next to her. He was wearing a black woollen cap with white stripes pulled low, a blue North Face pleated down jacket, a pair of baggy Guess jeans, and, on his feet, Nike Air Jordans. He couldn't have been more than thirteen. But when he started talking you could see Baysie's eyes light up, because somehow she knew the kid was the real thing.
"How many pairs of shoes do you buy a month?" Baysie asked.
"Two," the kid answered. "And if at the end I find one more I like I get to buy that, too."
Baysie was onto him. "Does your mother spoil you?"
The kid blushed, but a friend next to him was laughing. "Whatever he wants, he gets."
Baysie laughed, too. She had the DMX RXT in his size. He tried them on. He rocked back and forth, testing them. He looked back at Baysie. He was dead serious now: "Make sure these come out."
Baysie handed him the new "Rush" Emmitt Smith shoe due out in the fall. One of the boys had already pronounced it "phat," and another had looked through the marbleized-foam cradle in the heel and cried out in delight, "This is bug!" But this kid was the acid test, because this kid knew cool. He paused. He looked at it hard. "Reebok," he said, soberly and carefully, "is trying to get butter."
In the car on the way back to Manhattan, Baysie repeated it twice. "Not better. Butter! That kid could totally tell you what he thinks." Baysie had spent an hour coolhunting in a shoe store and found out that Reebok's efforts were winning the highest of hip-hop praise. "He was so fucking smart."
2.
If you want to understand how trends work, and why coolhunters like Baysie and DeeDee have become so important, a good place to start is with what's known as diffusion research, which is the study of how ideas and innovations spread. Diffusion researchers do things like spending five years studying the adoption of irrigation techniques in a Colombian mountain village, or developing complex matrices to map the spread of new math in the Pittsburgh school system. What they do may seem like a far cry from, say, how the Tommy Hilfiger thing spread from Harlem to every suburban mall in the country, but it really isn't: both are about how new ideas spread from one person to the next.
One of the most famous diffusion studies is Bruce Ryan and Neal Gross's analysis of the spread of hybrid seed corn in Greene County, Iowa, in the nineteen-thirties. The new seed corn was introduced there in about 1928, and it was superior in every respect to the seed that had been used by farmers for decades. But it wasn't adopted all at once. Of two hundred and fifty-nine farmers studied by Ryan and Gross, only a handful had started planting the new seed by 1933. In 1934, sixteen took the plunge. In 1935, twenty-one more followed; the next year, there were thirty-six, and the year after that a whopping sixty-one. The succeeding figures were then forty-six, thirty-six, fourteen, and three, until, by 1941, all but two of the two hundred and fifty-nine farmers studied were using the new seed. In the language of diffusion research, the handful of farmers who started trying hybrid seed corn at the very beginning of the thirties were the "innovators," the adventurous ones. The slightly larger group that followed them was the "early adopters." They were the opinion leaders in the community, the respected, thoughtful people who watched and analyzed what those wild innovators were doing and then did it themselves. Then came the big bulge of farmers in 1936, 1937, and 1938-the "early majority" and the "late majority," which is to say the deliberate and the skeptical masses, who would never try anything until the most respected farmers had tried it. Only after they had been converted did the "laggards," the most traditional of all, follow suit. The critical thing about this sequence is that it is almost entirely interpersonal. According to Ryan and Gross, only the innovators relied to any great extent on radio advertising and farm journals and seed salesmen in making their decision to switch to the hybrid. Everyone else made his decision overwhelmingly because of the example and the opinions of his neighbors and peers.
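For readers who like to check the arithmetic, the yearly figures in that passage trace the familiar S-shaped diffusion curve. Here is a minimal sketch in Python that simply sums the cohorts; the count for the initial "handful" of innovators is not stated above and is inferred by subtraction (259 farmers, minus the two holdouts, minus the later cohorts), and the category labels are the passage's own groupings, applied loosely rather than by any formal rule.

# A minimal sketch of the Greene County hybrid-corn adoption curve described above.
# The figures for 1934-1941 are the ones quoted in the text; the "by 1933" count is
# inferred so the totals reach 257 (all but two of the 259 farmers) by 1941.
adoption = [
    ("by 1933", 24, "innovators"),           # inferred, not stated in the text
    ("1934",    16, "early adopters"),
    ("1935",    21, "early adopters"),
    ("1936",    36, "early/late majority"),
    ("1937",    61, "early/late majority"),
    ("1938",    46, "early/late majority"),
    ("1939",    36, "laggards"),
    ("1940",    14, "laggards"),
    ("1941",     3, "laggards"),
]

total_farmers = 259
running = 0
for year, new_adopters, label in adoption:
    running += new_adopters
    share = running / total_farmers
    print(f"{year:>8}: +{new_adopters:3d} -> {running:3d} of {total_farmers} ({share:4.0%})  {label}")
# The running total climbs slowly, surges in the mid-thirties, and flattens out at 257,
# which is the S-curve that diffusion researchers keep finding.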
Isn't this just how fashion works? A few years ago, the classic brushed-suède Hush Puppies with the lightweight crêpe sole-the moc-toe oxford known as the Duke and the slip-on with the golden buckle known as the Columbia-were selling barely sixty-five thousand pairs a year. The company was trying to walk away from the whole suède casual look entirely. It wanted to do "aspirational" shoes: "active casuals" in smooth leather, like the Mall Walker, with a Comfort Curve technology outsole and a heel stabilizer-the kind of shoes you see in Kinney's for $39.95. But then something strange started happening. Two Hush Puppies executives-Owen Baxter and Jeff Lewis-were doing a fashion shoot for their Mall Walkers and ran into a creative consultant from Manhattan named Jeffrey Miller, who informed them that the Dukes and the Columbias weren't dead, they were dead chic. "We were being told," Baxter recalls, "that there were areas in the Village, in SoHo, where the shoes were selling-in resale shops-and that people were wearing the old Hush Puppies. They were going to the ma-and-pa stores, the little stores that still carried them, and there was this authenticity of being able to say, 'I am wearing an original pair of Hush Puppies.' "
Baxter and Lewis-tall, solid, fair-haired Midwestern guys with thick, shiny wedding bands-are shoe men, first and foremost. Baxter was working the cash register at his father's shoe store in Mount Prospect, Illinois, at the age of thirteen. Lewis was doing inventory in his father's shoe store in Pontiac, Michigan, at the age of seven. Baxter was in the National Guard during the 1968 Democratic Convention, in Chicago, and was stationed across the street from the Conrad Hilton downtown, right in the middle of things. Today, the two men work out of Rockford, Michigan (population thirty-eight hundred), where Hush Puppies has been making the Dukes and the Columbias in an old factory down by the Rogue River for almost forty years. They took me to the plant when I was in Rockford. In a crowded, noisy, low-slung building, factory workers stand in long rows, gluing, stapling, and sewing together shoes in dozens of bright colors, and the two executives stopped at each production station and described it in detail. Lewis and Baxter know shoes. But they would be the first to admit that they don't know cool. "Miller was saying that there is something going on with the shoes-that Isaac Mizrahi was wearing the shoes for his personal use," Lewis told me. We were seated around the conference table in the Hush Puppies headquarters in Rockford, with the snow and the trees outside and a big water tower behind us. "I think it's fair to say that at the time we had no idea who Isaac Mizrahi was."
By late 1994, things had begun to happen in a rush. First, the designer John Bartlett called. He wanted to use Hush Puppies as accessories in his spring collection. Then Anna Sui called. Miller, the man from Manhattan, flew out to Michigan to give advice on a new line ("Of course, packing my own food and thinking about 'Fargo' in the corner of my mind"). A few months later, in Los Angeles, the designer Joel Fitzpatrick put a twenty-five-foot inflatable basset hound on the roof of his store on La Brea Avenue and gutted his adjoining art gallery to turn it into a Hush Puppies department, and even before he opened-while he was still painting and putting up shelves-Pee-wee Herman walked in and asked for a couple of pairs. Pee-wee Herman! "It was total word of mouth. I didn't even have a sign back then," Fitzpatrick recalls. In 1995, the company sold four hundred and thirty thousand pairs of the classic Hush Puppies. In 1996, it sold a million six hundred thousand, and that was only scratching the surface, because in Europe and the rest of the world, where Hush Puppies have a huge following-where they might outsell the American market four to one-the revival was just beginning.
The cool kids who started wearing old Dukes and Columbias from thrift shops were the innovators. Pee-wee Herman, wandering in off the street, was an early adopter. The million six hundred thousand people who bought Hush Puppies last year are the early majority, jumping in because the really cool people have already blazed the trail. Hush Puppies are moving through the country just the way hybrid seed corn moved through Greene County-all of which illustrates what coolhunters can and cannot do. If Jeffrey Miller had been wrong-if cool people hadn't been digging through the thrift shops for Hush Puppies-and he had arbitrarily decided that Baxter and Lewis should try to convince non-cool people that the shoes were cool, it wouldn't have worked. You can't convince the late majority that Hush Puppies are cool, because the late majority makes its coolness decisions on the basis of what the early majority is doing, and you can't convince the early majority, because the early majority is looking at the early adopters, and you can't convince the early adopters, because they take their cues from the innovators. The innovators do get their cool ideas from people other than their peers, but the fact is that they are the last people who can be convinced by a marketing campaign that a pair of suède shoes is cool. These are, after all, the people who spent hours sifting through thrift-store bins. And why did they do that? Because their definition of cool is doing something that nobody else is doing. A company can intervene in the cool cycle. It can put its shoes on really cool celebrities and on fashion runways and on MTV. It can accelerate the transition from the innovator to the early adopter and on to the early majority. But it can't just manufacture cool out of thin air, and that's the second rule of cool.
At the peak of the Hush Puppies craziness last year, Hush Puppies won the prize for best accessory at the Council of Fashion Designers' awards dinner, at Lincoln Center. The award was accepted by the Hush Puppies president, Louis Dubrow, who came out wearing a pair of custom-made black patent-leather Hush Puppies and stood there blinking and looking at the assembled crowd as if it were the last scene of "Close Encounters of the Third Kind." It was a strange moment. There was the president of the Hush Puppies company, of Rockford, Michigan, population thirty-eight hundred, sharing a stage with Calvin Klein and Donna Karan and Isaac Mizrahi-and all because some kids in the East Village began combing through thrift shops for old Dukes. Fashion was at the mercy of those kids, whoever they were, and it was a wonderful thing if the kids picked you, but a scary thing, too, because it meant that cool was something you could not control. You needed someone to find cool and tell you what it was.
3.
When Baysie Wightman went to Dr. Jay's, she was looking for customer response to the new shoes Reebok had planned for the fourth quarter of 1997 and the first quarter of 1998. This kind of customer testing is critical at Reebok, because the last decade has not been kind to the company. In 1987, it had a third of the American athletic-shoe market, well ahead of Nike. Last year, it had sixteen per cent. "The kid in the store would say, 'I'd like this shoe if your logo wasn't on it,' " E. Scott Morris, who's a senior designer for Reebok, told me. "That's kind of a punch in the mouth. But we've all seen it. You go into a shoe store. The kid picks up the shoe and says, 'Ah, man, this is nice.' He turns the shoe around and around. He looks at it underneath. He looks at the side and he goes, 'Ah, this is Reebok,' and says, 'I ain't buying this,' and puts the shoe down and walks out. And you go, 'You was just digging it a minute ago. What happened?' " Somewhere along the way, the company lost its cool, and Reebok now faces the task not only of rebuilding its image but of making the shoes so cool that the kids in the store can't put them down.
Every few months, then, the company's coolhunters go out into the field with prototypes of the upcoming shoes to find out what kids really like, and come back to recommend the necessary changes. The prototype of one recent Emmitt Smith shoe, for example, had a piece of molded rubber on the end of the tongue as a design element; it was supposed to give the shoe a certain "richness," but the kids said they thought it looked overbuilt. Then Reebok gave the shoes to the Boston College football team for wear-testing, and when they got the shoes back they found out that all the football players had cut out the rubber component with scissors. As messages go, this was hard to miss. The tongue piece wasn't cool, and on the final version of the shoe it was gone. The rule of thumb at Reebok is that if the kids in Chicago, New York, and Detroit all like a shoe, it's a guaranteed hit. More than likely, though, the coolhunt is going to turn up subtle differences from city to city, so that once the coolhunters come back the designers have to find out some way to synthesize what was heard, and pick out just those things that all the kids seemed to agree on. In New York, for example, kids in Harlem are more sophisticated and fashion-forward than kids in the Bronx, who like things a little more colorful and glitzy. Brooklyn, meanwhile, is conservative and preppy, more like Washington, D.C. For reasons no one really knows, Reeboks are coolest in Philadelphia. In Philly, in fact, the Reebok Classics are so huge they are known simply as National Anthems, as in "I'll have a pair of blue Anthems in nine and a half." Philadelphia is Reebok's innovator town. From there trends move along the East Coast, trickling all the way to Charlotte, North Carolina.
Reebok has its headquarters in Stoughton, Massachusetts, outside Boston-in a modern corporate park right off Route 24. There are basketball and tennis courts next to the building, and a health club on the ground floor that you can look directly into from the parking lot. The front lobby is adorned with shrines for all of Reebok's most prominent athletes-shrines complete with dramatic action photographs, their sports jerseys, and a pair of their signature shoes-and the halls are filled with so many young, determinedly athletic people that when I visited Reebok headquarters I suddenly wished I'd packed my gym clothes in case someone challenged me to wind sprints. At Stoughton, I met with a handful of the company's top designers and marketing executives in a long conference room on the third floor. In the course of two hours, they put one pair of shoes after another on the table in front of me, talking excitedly about each sneaker's prospects, because the feeling at Reebok is that things are finally turning around. The basketball shoe that Reebok brought out last winter for Allen Iverson, the star rookie guard for the Philadelphia 76ers, for example, is one of the hottest shoes in the country. Dr. Jay's sold out of Iversons in two days, compared with the week it took the store to sell out of Nike's new Air Jordans. Iverson himself is brash and charismatic and faster from foul line to foul line than anyone else in the league. He's the equivalent of those kids in the East Village who began wearing Hush Puppies way back when. He's an innovator, and the hope at Reebok is that if he gets big enough the whole company can ride back to coolness on his coattails, the way Nike rode to coolness on the coattails of Michael Jordan. That's why Baysie was so excited when the kid said Reebok was trying to get butter when he looked at the Rush and the DMX RXT: it was a sign, albeit a small one, that the indefinable, abstract thing called cool was coming back.
When Baysie comes back from a coolhunt, she sits down with marketing experts and sales representatives and designers, and reconnects them to the street, making sure they have the right shoes going to the right places at the right price. When she got back from the Bronx, for example, the first thing she did was tell all these people they had to get a new men's DMX RXT out, fast, because the kids on the street loved the women's version. "It's hotter than we realized," she told them. The coolhunter's job in this instance is very specific. What DeeDee does, on the other hand, is a little more ambitious. With the L Report, she tries to construct a kind of grand matrix of cool, comprising not just shoes but everything kids like, and not just kids of certain East Coast urban markets but kids all over. DeeDee and her staff put it out four times a year, in six different versions-for New York, Los Angeles, San Francisco, Austin-Dallas, Seattle, and Chicago-and then sell it to manufacturers, retailers, and ad agencies (among others) for twenty thousand dollars a year. They go to each city and find the coolest bars and clubs, and ask the coolest kids to fill out questionnaires. The information is then divided into six categories-You Saw It Here First, Entertainment and Leisure, Clothing and Accessories, Personal and Individual, Aspirations, and Food and Beverages-which are, in turn, broken up into dozens of subcategories, so that Personal and Individual, for example, includes Cool Date, Cool Evening, Free Time, Favorite Possession, and on and on. The information in those subcategories is subdivided again by sex and by age bracket (14-18, 19-24, 25-30), and then, as a control, the L Report gives you the corresponding set of preferences for "mainstream" kids.
Few coolhunters bother to analyze trends with this degree of specificity. DeeDee's biggest competitor, for example, is something called the Hot Sheet, out of Manhattan. It uses a panel of three thousand kids a year from across the country and divides up their answers by sex and age, but it doesn't distinguish between regions, or between trendsetting and mainstream respondents. So what you're really getting is what all kids think is cool-not what cool kids think is cool, which is a considerably different piece of information. Janine Misdom and Joanne DeLuca, who run the Sputnik coolhunting group out of the garment district in Manhattan, meanwhile, favor an entirely impressionistic approach, sending out coolhunters with video cameras to talk to kids on the ground that it's too difficult to get cool kids to fill out questionnaires. Once, when I was visiting the Sputnik girls-as Misdom and DeLuca are known on the street, because they look alike and their first names are so similar and both have the same awesome New York accents-they showed me a video of the girl they believe was the patient zero of the whole eighties revival going on right now. It was back in September of 1993. Joanne and Janine were on Seventh Avenue, outside the Fashion Institute of Technology, doing random street interviews for a major jeans company, and, quite by accident, they ran into this nineteen-year-old raver. She had close-cropped hair, which was green at the top, and at the temples was shaved even closer and dyed pink. She had rings and studs all over her face, and a thick collection of silver tribal jewelry around her neck, and vintage jeans. She looked into the camera and said, "The sixties came in and then the seventies came in and I think it's ready to come back to the eighties. It's totally eighties: the eye makeup, the clothes. It's totally going back to that." Immediately, Joanne and Janine started asking around. "We talked to a few kids on the Lower East Side who said they were feeling the need to start breaking out their old Michael Jackson jackets," Joanne said. "They were joking about it. They weren't doing it yet. But they were going to, you know? They were saying, 'We're getting the urge to break out our Members Only jackets.' " That was right when Joanne and Janine were just starting up; calling the eighties revival was their first big break, and now they put out a full-blown videotaped report twice a year which is a collection of clips of interviews with extremely progressive people.
What DeeDee argues, though, is that cool is too subtle and too variegated to be captured with these kinds of broad strokes. Cool is a set of dialects, not a language. The L Report can tell you, for example, that nineteen-to-twenty-four-year-old male trendsetters in Seattle would most like to meet, among others, King Solomon and Dr. Seuss, and that nineteen-to-twenty-four-year-old female trendsetters in San Francisco have turned their backs on Calvin Klein, Nintendo Gameboy, and sex. What's cool right now? Among male New York trendsetters: North Face jackets, rubber and latex, khakis, and the rock band Kiss. Among female trendsetters: ska music, old-lady clothing, and cyber tech. In Chicago, snowboarding is huge among trendsetters of both sexes and all ages. Women over nineteen are into short hair, while those in their teens have embraced mod culture, rock climbing, tag watches, and bootleg pants. In Austin-Dallas, meanwhile, twenty-five-to-thirty-year-old women trendsetters are into hats, heroin, computers, cigars, Adidas, and velvet, while men in their twenties are into video games and hemp. In all, the typical L Report runs over one hundred pages. But with that flood of data comes an obsolescence disclaimer: "The fluctuating nature of the trendsetting market makes keeping up with trends a difficult task." By the spring, in other words, everything may have changed.
The key to coolhunting, then, is to look for cool people first and cool things later, and not the other way around. Since cool things are always changing, you can't look for them, because the very fact they are cool means you have no idea what to look for. What you would be doing is thinking back on what was cool before and extrapolating, which is about as useful as presuming that because the Dow rose ten points yesterday it will rise another ten points today. Cool people, on the other hand, are a constant.
When I was in California, I met Salvador Barbier, who had been described to me by a coolhunter as "the Michael Jordan of skateboarding." He was tall and lean and languid, with a cowboy's insouciance, and we drove through the streets of Long Beach at fifteen miles an hour in a white late-model Ford Mustang, a car he had bought as a kind of ironic status gesture ("It would look good if I had a Polo jacket or maybe Nautica," he said) to go with his '62 Econoline van and his '64 T-bird. Sal told me that he and his friends, who are all in their mid-twenties, recently took to dressing up as if they were in eighth grade again and gathering together-having a "rally"-on old BMX bicycles in front of their local 7-Eleven. "I'd wear muscle shirts, like Def Leppard or Foghat or some old heavy-metal band, and tight, tight tapered Levi's, and Vans on my feet-big, like, checkered Vans or striped Vans or camouflage Vans-and then wristbands and gloves with the fingers cut off. It was total eighties fashion. You had to look like that to participate in the rally. We had those denim jackets with patches on the back and combs that hung out the back pocket. We went without I.D.s, because we'd have to have someone else buy us beers." At this point, Sal laughed. He was driving really slowly and staring straight ahead and talking in a low drawl-the coolhunter's dream. "We'd ride to this bar and I'd have to carry my bike inside, because we have really expensive bikes, and when we got inside people would freak out. They'd say, 'Omigod,' and I was asking them if they wanted to go for a ride on the handlebars. They were like, 'What is wrong with you. My boyfriend used to dress like that in the eighth grade!' And I was like, 'He was probably a lot cooler then, too.' "
This is just the kind of person DeeDee wants. "I'm looking for somebody who is an individual, who has definitely set himself apart from everybody else, who doesn't look like his peers. I've run into trendsetters who look completely Joe Regular Guy. I can see Joe Regular Guy at a club listening to some totally hardcore band playing, and I say to myself 'Omigod, what's that guy doing here?' and that totally intrigues me, and I have to walk up to him and say, 'Hey, you're really into this band. What's up?' You know what I mean? I look at everything. If I see Joe Regular Guy sitting in a coffee shop and everyone around him has blue hair, I'm going to gravitate toward him, because, hey, what's Joe Regular Guy doing in a coffee shop with people with blue hair?"
We were sitting outside the Fred Segal store in West Hollywood. I was wearing a very conservative white Brooks Brothers button-down and a pair of Levi's, and DeeDee looked first at my shirt and then my pants and dissolved into laughter: "I mean, I might even go up to you in a cool place."
Picking the right person is harder than it sounds, though. Piney Kahn, who works for DeeDee, says, "There are a lot of people in the gray area. You've got these kids who dress ultra funky and have their own style. Then you realize they're just running after their friends." The trick is not just to be able to tell who is different but to be able to tell when that difference represents something truly cool. It's a gut thing. You have to somehow just know. DeeDee hired Piney because Piney clearly knows: she is twenty-four and used to work with the Beastie Boys and has the formidable self-possession of someone who is not only cool herself but whose parents were cool. "I mean," she says, "they named me after a tree."
Piney and DeeDee said that they once tried to hire someone as a coolhunter who was not, himself, cool, and it was a disaster.
"You can give them the boundaries," Piney explained. "You can say that if people shop at Banana Republic and listen to Alanis Morissette they're probably not trendsetters. But then they might go out and assume that everyone who does that is not a trendsetter, and not look at the other things."
"I mean, I myself might go into Banana Republic and buy a T-shirt," DeeDee chimed in.
Their non-cool coolhunter just didn't have that certain instinct, that sense that told him when it was O.K. to deviate from the manual. Because he wasn't cool, he didn't know cool, and that's the essence of the third rule of cool: you have to be one to know one. That's why Baysie is still on top of this business at forty-one. "It's easier for me to tell you what kid is cool than to tell you what things are cool," she says. But that's all she needs to know. In this sense, the third rule of cool fits perfectly into the second: the second rule says that cool cannot be manufactured, only observed, and the third says that it can only be observed by those who are themselves cool. And, of course, the first rule says that it cannot accurately be observed at all, because the act of discovering cool causes cool to take flight, so if you add all three together they describe a closed loop, the hermeneutic circle of coolhunting, a phenomenon whereby not only can the uncool not see cool but cool cannot even be adequately described to them. Baysie says that she can see a coat on one of her friends and think it's not cool but then see the same coat on DeeDee and think that it is cool. It is not possible to be cool, in other words, unless you are-in some larger sense-already cool, and so the phenomenon that the uncool cannot see and cannot have described to them is also something that they cannot ever attain, because if they did it would no longer be cool. Coolhunting represents the ascendancy, in the marketplace, of high school.
Once, I was visiting DeeDee at her house in Laurel Canyon when one of her L Report assistants, Jonas Vail, walked in. He'd just come back from Niketown on Wilshire Boulevard, where he'd bought seven hundred dollars' worth of the latest sneakers to go with the three hundred dollars' worth of skateboard shoes he'd bought earlier in the afternoon. Jonas is tall and expressionless, with a peacoat, dark jeans, and short-cropped black hair. "Jonas is good," DeeDee says. "He works with me on everything. That guy knows more pop culture. You know: What was the name of the store Mrs. Garrett owned on 'The Facts of Life'? He knows all the names of the extras from eighties sitcoms. I can't believe someone like him exists. He's fucking unbelievable. Jonas can spot a cool person a mile away."
Jonas takes the boxes of shoes and starts unpacking them on the couch next to DeeDee. He picks up a pair of the new Nike ACG hiking boots, and says, "All the Japanese in Niketown were really into these." He hands the shoes to DeeDee.
"Of course they were!" she says. "The Japanese are all into the tech-looking shit. Look how exaggerated it is, how bulbous." DeeDee has very ambivalent feelings about Nike, because she thinks its marketing has got out of hand. When she was in the New York Niketown with a girlfriend recently, she says, she started getting light-headed and freaked out. "It's cult, cult, cult. It was like, 'Hello, are we all drinking the Kool-Aid here?' " But this shoe she loves. It's Dr. Jay's in the Bronx all over again. DeeDee turns the shoe around and around in the air, tapping the big clear-blue plastic bubble on the side-the visible Air-Sole unit- with one finger. "It's so fucking rad. It looks like a platypus!" In front of me, there is a pair of Nike's new shoes for the basketball player Jason Kidd.
I pick it up. "This looks . . . cool," I venture uncertainly.
DeeDee is on the couch, where she's surrounded by shoeboxes and sneakers and white tissue paper, and she looks up reprovingly because, of course, I don't get it. I can't get it. "Beyooond cool, Maalcolm. Beyooond cool."
The Sports Taboo
May 19, 1997
DEPT. OF DISPUTATION
Why blacks are like boys and whites are like girls.
1.
The education of any athlete begins, in part, with an education in the racial taxonomy of his chosen sport-in the subtle, unwritten rules about what whites are supposed to be good at and what blacks are supposed to be good at. In football, whites play quarterback and blacks play running back; in baseball whites pitch and blacks play the outfield. I grew up in Canada, where my brother Geoffrey and I ran high-school track, and in Canada the rule of running was that anything under the quarter-mile belonged to the West Indians. This didn't mean that white people didn't run the sprints. But the expectation was that they would never win, and, sure enough, they rarely did. There was just a handful of West Indian immigrants in Ontario at that point-clustered in and around Toronto-but they owned Canadian sprinting, setting up under the stands at every major championship, cranking up the reggae on their boom boxes, and then humiliating everyone else on the track. My brother and I weren't from Toronto, so we weren't part of that scene. But our West Indian heritage meant that we got to share in the swagger. Geoffrey was a magnificent runner, with powerful legs and a barrel chest, and when he was warming up he used to do that exaggerated, slow-motion jog that the white guys would try to do and never quite pull off. I was a miler, which was a little outside the West Indian range. But, the way I figured it, the rules meant that no one should ever outkick me over the final two hundred metres of any race. And in the golden summer of my fourteenth year, when my running career prematurely peaked, no one ever did.
When I started running, there was a quarter-miler just a few years older than I was by the name of Arnold Stotz. He was a bulldog of a runner, hugely talented, and each year that he moved through the sprinting ranks he invariably broke the existing four-hundred-metre record in his age class. Stotz was white, though, and every time I saw the results of a big track meet I'd keep an eye out for his name, because I was convinced that he could not keep winning. It was as if I saw his whiteness as a degenerative disease, which would eventually claim and cripple him. I never asked him whether he felt the same anxiety, but I can't imagine that he didn't. There was only so long that anyone could defy the rules. One day, at the provincial championships, I looked up at the results board and Stotz was gone.
Talking openly about the racial dimension of sports in this way, of course, is considered unseemly. It's all right to say that blacks dominate sports because they lack opportunities elsewhere. That's the "Hoop Dreams" line, which says whites are allowed to acknowledge black athletic success as long as they feel guilty about it. What you're not supposed to say is what we were saying in my track days-that we were better because we were black, because of something intrinsic to being black. Nobody said anything like that publicly last month when Tiger Woods won the Masters or when, a week later, African men claimed thirteen out of the top twenty places in the Boston Marathon. Nor is it likely to come up this month, when African-Americans will make up eighty per cent of the players on the floor for the N.B.A. playoffs. When the popular television sports commentator Jimmy (the Greek) Snyder did break this taboo, in 1988- infamously ruminating on the size and significance of black thighs-one prominent N.A.A.C.P. official said that his remarks "could set race relations back a hundred years." The assumption is that the whole project of trying to get us to treat each other the same will be undermined if we don't all agree that under the skin we actually are the same.
The point of this, presumably, is to put our discussion of sports on a par with legal notions of racial equality, which would be a fine idea except that civil-rights law governs matters like housing and employment and the sports taboo covers matters like what can be said about someone's jump shot. In his much heralded new book "Darwin's Athletes," the University of Texas scholar John Hoberman tries to argue that these two things are the same, that it's impossible to speak of black physical superiority without implying intellectual inferiority. But it isn't long before the argument starts to get ridiculous. "The spectacle of black athleticism," he writes, inevitably turns into "a highly public image of black retardation." Oh, really? What, exactly, about Tiger Woods's victory in the Masters resembled "a highly public image of black retardation"? Today's black athletes are multimillion- dollar corporate pitchmen, with talk shows and sneaker deals and publicity machines and almost daily media opportunities to share their thoughts with the world, and it's very hard to see how all this contrives to make them look stupid. Hoberman spends a lot of time trying to inflate the significance of sports, arguing that how we talk about events on the baseball diamond or the track has grave consequences for how we talk about race in general. Here he is, for example, on Jackie Robinson:
The sheer volume of sentimental and intellectual energy that has been invested in the mythic saga of Jackie Robinson has discouraged further thinking about what his career did and did not accomplish. . . . Black America has paid a high and largely unacknowledged price for the extraordinary prominence given the black athlete rather than other black men of action (such as military pilots and astronauts), who represent modern aptitudes in ways that athletes cannot.
Please. Black America has paid a high and largely unacknowledged price for a long list of things, and having great athletes is far from the top of the list. Sometimes a baseball player is just a baseball player, and sometimes an observation about racial difference is just an observation about racial difference. Few object when medical scientists talk about the significant epidemiological differences between blacks and whites-the fact that blacks have a higher incidence of hypertension than whites and twice as many black males die of diabetes and prostate cancer as white males, that breast tumors appear to grow faster in black women than in white women, that black girls show signs of puberty sooner than white girls. So why aren't we allowed to say that there might be athletically significant differences between blacks and whites?
According to the medical evidence, African-Americans seem to have, on the average, greater bone mass than do white Americans-a difference that suggests greater muscle mass. Black men have slightly higher circulating levels of testosterone and human-growth hormone than their white counterparts, and blacks over all tend to have proportionally slimmer hips, wider shoulders, and longer legs. In one study, the Swedish physiologist Bengt Saltin compared a group of Kenyan distance runners with a group of Swedish distance runners and found interesting differences in muscle composition: Saltin reported that the Africans appeared to have more blood-carrying capillaries and more mitochondria (the body's cellular power plant) in the fibres of their quadriceps. Another study found that, while black South African distance runners ran at the same speed as white South African runners, they were able to use more oxygen-eighty-nine per cent versus eighty-one per cent-over extended periods: somehow, they were able to exert themselves more. Such evidence suggests that there are physical differences in black athletes which have a bearing on activities like running and jumping, which should hardly come as a surprise to anyone who follows competitive sports.
To use track as an example-since track is probably the purest measure of athletic ability-Africans recorded fifteen out of the twenty fastest times last year in the men's ten-thousand-metre event. In the five thousand metres, eighteen out of the twenty fastest times were recorded by Africans. In the fifteen hundred metres, thirteen out of the twenty fastest times were African, and in the sprints, in the men's hundred metres, you have to go all the way down to the twenty-third place in the world rankings-to Geir Moen, of Norway-before you find a white face. There is a point at which it becomes foolish to deny the fact of black athletic prowess, and even more foolish to banish speculation on the topic. Clearly, something is going on. The question is what.
2.
If we are to decide what to make of the differences between blacks and whites, we first have to decide what to make of the word "difference," which can mean any number of things. A useful case study is to compare the ability of men and women in math. If you give a large, representative sample of male and female students a standardized math test, their mean scores will come out pretty much the same. But if you look at the margins, at the very best and the very worst students, sharp differences emerge. In the math portion of an achievement test conducted by Project Talent-a nationwide survey of fifteen-year-olds-there were 1.3 boys for every girl in the top ten per cent, 1.5 boys for every girl in the top five per cent, and seven boys for every girl in the top one per cent. In the fifty-six-year history of the Putnam Mathematical Competition, which has been described as the Olympics of college math, all but one of the winners have been male. Conversely, if you look at people with the very lowest math ability, you'll find more boys than girls there, too. In other words, although the average math ability of boys and girls is the same, the distribution isn't: there are more males than females at the bottom of the pile, more males than females at the top of the pile, and fewer males than females in the middle. Statisticians refer to this as a difference in variability.
This pattern, as it turns out, is repeated in almost every conceivable area of gender difference. Boys are more variable than girls on the College Board entrance exam and in routine elementary-school spelling tests. Male mortality patterns are more variable than female patterns; that is, many more men die in early and middle age than women, who tend to die in more of a concentrated clump toward the end of life. The problem is that variability differences are regularly confused with average differences. If men had higher average math scores than women, you could say they were better at the subject. But because they are only more variable, the word "better" seems inappropriate.
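The statistical point can be made concrete with a minimal sketch in Python. The scores, spreads, and cutoffs below are invented purely for illustration; they are not taken from Project Talent, the Putnam, or any other study mentioned here.

import random

random.seed(0)

# Two hypothetical groups with the same mean score (500) but different
# spreads: group A has a standard deviation of 100, group B of 130.
group_a = [random.gauss(500, 100) for _ in range(100_000)]
group_b = [random.gauss(500, 130) for _ in range(100_000)]

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)

# Count how many people in each group land in the extreme tails.
top_a = sum(score > 750 for score in group_a)
top_b = sum(score > 750 for score in group_b)
bottom_a = sum(score < 250 for score in group_a)
bottom_b = sum(score < 250 for score in group_b)

print(f"means: {mean_a:.0f} vs {mean_b:.0f}")   # nearly identical averages
print(f"above 750: {top_a} vs {top_b}")          # far more of group B at the top
print(f"below 250: {bottom_a} vs {bottom_b}")    # and far more of group B at the bottom

Run it and the averages come out essentially equal, while the more variable group supplies several times as many people at both extremes, which is all that "a difference in variability" means.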
The same holds true for differences between the races. A racist stereotype is the assertion of average difference-it's the claim that the typical white is superior to the typical black. It allows a white man to assume that the black man he passes on the street is stupider than he is. By contrast, if what racists believed was that black intelligence was simply more variable than white intelligence, then it would be impossible for them to construct a stereotype about black intelligence at all. They wouldn't be able to generalize. If they wanted to believe that there were a lot of blacks dumber than whites, they would also have to believe that there were a lot of blacks smarter than they were. This distinction is critical to understanding the relation between race and athletic performance. What are we seeing when we remark black domination of élite sporting events-an average difference between the races or merely a difference in variability?
This question has been explored by geneticists and physical anthropologists, and some of the most notable work has been conducted over the past few years by Kenneth Kidd, at Yale. Kidd and his colleagues have been taking DNA samples from two African Pygmy tribes in Zaire and the Central African Republic and comparing them with DNA samples taken from populations all over the world. What they have been looking for is variants-subtle differences between the DNA of one person and another-and what they have found is fascinating. "I would say, without a doubt, that in almost any single African population-a tribe or however you want to define it-there is more genetic variation than in all the rest of the world put together," Kidd told me. In a sample of fifty Pygmies, for example, you might find nine variants in one stretch of DNA. In a sample of hundreds of people from around the rest of the world, you might find only a total of six variants in that same stretch of DNA-and probably every one of those six variants would also be found in the Pygmies. If everyone in the world was wiped out except Africans, in other words, almost all the human genetic diversity would be preserved.
The likelihood is that these results reflect Africa's status as the homeland of Homo sapiens: since every human population outside Africa is essentially a subset of the original African population, it makes sense that everyone in such a population would be a genetic subset of Africans, too. So you can expect groups of Africans to be more variable in respect to almost anything that has a genetic component. If, for example, your genes control how you react to aspirin, you'd expect to see more Africans than whites for whom one aspirin stops a bad headache, more for whom no amount of aspirin works, more who are allergic to aspirin, and more who need to take, say, four aspirin at a time to get any benefit-but far fewer Africans for whom the standard two-aspirin dose would work well. And to the extent that running is influenced by genetic factors you would expect to see more really fast blacks-and more really slow blacks-than whites but far fewer Africans of merely average speed. Blacks are like boys. Whites are like girls.
There is nothing particularly scary about this fact, and certainly nothing to warrant the kind of gag order on talk of racial differences which is now in place. What it means is that comparing élite athletes of different races tells you very little about the races themselves. A few years ago, for example, a prominent scientist argued for black athletic supremacy by pointing out that there had never been a white Michael Jordan. True. But, as the Yale anthropologist Jonathan Marks has noted, until recently there was no black Michael Jordan, either. Michael Jordan, like Tiger Woods or Wayne Gretzky or Cal Ripken, is one of the best players in his sport not because he's like the other members of his own ethnic group but precisely because he's not like them-or like anyone else, for that matter. Élite athletes are élite athletes because, in some sense, they are on the fringes of genetic variability. As it happens, African populations seem to create more of these genetic outliers than white populations do, and this is what underpins the claim that blacks are better athletes than whites. But that's all the claim amounts to. It doesn't say anything at all about the rest of us, of all races, muddling around in the genetic middle.
3.
There is a second consideration to keep in mind when we compare blacks and whites. Take the men's hundred-metre final at the Atlanta Olympics. Every runner in that race was of either Western African or Southern African descent, as you would expect if Africans had some genetic affinity for sprinting. But suppose we forget about skin color and look just at country of origin. The eight-man final was made up of two African-Americans, two Africans (one from Namibia and one from Nigeria), a Trinidadian, a Canadian of Jamaican descent, an Englishman of Jamaican descent, and a Jamaican. The race was won by the Jamaican-Canadian, in world-record time, with the Namibian coming in second and the Trinidadian third. The sprint relay-the 4 x 100-was won by a team from Canada, consisting of the Jamaican-Canadian from the final, a Haitian-Canadian, a Trinidadian-Canadian, and another Jamaican-Canadian. Now it appears that African heritage is important as an initial determinant of sprinting ability, but also that the most important advantage of all is some kind of cultural or environmental factor associated with the Caribbean.
Or consider, in a completely different realm, the problem of hypertension. Black Americans have a higher incidence of hypertension than white Americans, even after you control for every conceivable variable, including income, diet, and weight, so it's tempting to conclude that there is something about being of African descent that makes blacks prone to hypertension. But it turns out that although some Caribbean countries have a problem with hypertension, others-Jamaica, St. Kitts, and the Bahamas-don't. It also turns out that people in Liberia and Nigeria-two countries where many New World slaves came from-have similar and perhaps even lower blood-pressure rates than white North Americans, while studies of Zulus, Indians, and whites in Durban, South Africa, showed that urban white males had the highest hypertension rates and urban white females had the lowest. So it's likely that the disease has nothing at all to do with Africanness.
The same is true for the distinctive muscle characteristic observed when Kenyans were compared with Swedes. Saltin, the Swedish physiologist, subsequently found many of the same characteristics in Nordic skiers who train at high altitudes and Nordic runners who train in very hilly regions-conditions, in other words, that resemble the mountainous regions of Kenya's Rift Valley, where so many of the country's distance runners come from. The key factor seems to be Kenya, not genes.
Lots of things that seem to be genetic in origin, then, actually aren't. Similarly, lots of things that we wouldn't normally think might affect athletic ability actually do. Once again, the social-science literature on male and female math achievement is instructive. Psychologists argue that when it comes to subjects like math, boys tend to engage in what's known as ability attribution. A boy who is doing well will attribute his success to the fact that he's good at math, and if he's doing badly he'll blame his teacher or his own lack of motivation-anything but his ability. That makes it easy for him to bounce back from failure or disappointment, and gives him a lot of confidence in the face of a tough new challenge. After all, if you think you do well in math because you're good at math, what's stopping you from being good at, say, algebra, or advanced calculus? On the other hand, if you ask a girl why she is doing well in math she will say, more often than not, that she succeeds because she works hard. If she's doing poorly, she'll say she isn't smart enough. This, as should be obvious, is a self-defeating attitude. Psychologists call it "learned helplessness"-the state in which failure is perceived as insurmountable. Girls who engage in effort attribution learn helplessness because in the face of a more difficult task like algebra or advanced calculus they can conceive of no solution. They're convinced that they can't work harder, because they think they're working as hard as they can, and that they can't rely on their intelligence, because they never thought they were that smart to begin with. In fact, one of the fascinating findings of attribution research is that the smarter girls are, the more likely they are to fall into this trap. High achievers are sometimes the most helpless. Here, surely, is part of the explanation for greater math variability among males. The female math whizzes, the ones who should be competing in the top one and two per cent with their male counterparts, are the ones most often paralyzed by a lack of confidence in their own aptitude. They think they belong only in the intellectual middle.
The striking thing about these descriptions of male and female stereotyping in math, though, is how similar they are to black and white stereotyping in athletics-to the unwritten rules holding that blacks achieve through natural ability and whites through effort. Here's how Sports Illustrated described, in a recent article, the white basketball player Steve Kerr, who plays alongside Michael Jordan for the Chicago Bulls. According to the magazine, Kerr is a "hard-working overachiever," distinguished by his "work ethic and heady play" and by a shooting style "born of a million practice shots." Bear in mind that Kerr is one of the best shooters in basketball today, and a key player on what is arguably one of the finest basketball teams in history. Bear in mind, too, that there is no evidence that Kerr works any harder than his teammates, least of all Jordan himself, whose work habits are legendary. But you'd never guess that from the article. It concludes, "All over America, whenever quicker, stronger gym rats see Kerr in action, they must wonder, How can that guy be out there instead of me?"
There are real consequences to this stereotyping. As the psychologists Carol Dweck and Barbara Licht write of high- achieving schoolgirls, "[They] may view themselves as so motivated and well disciplined that they cannot entertain the possibility that they did poorly on an academic task because of insufficient effort. Since blaming the teacher would also be out of character, blaming their abilities when they confront difficulty may seem like the most reasonable option." If you substitute the words "white athletes" for "girls" and "coach" for "teacher," I think you have part of the reason that so many white athletes are underrepresented at the highest levels of professional sports. Whites have been saddled with the athletic equivalent of learned helplessness-the idea that it is all but fruitless to try and compete at the highest levels, because they have only effort on their side. The causes of athletic and gender discrimination may be diverse, but its effects are not. Once again, blacks are like boys, and whites are like girls.
4.
When I was in college, I once met an old acquaintance from my high-school running days. Both of us had long since quit track, and we talked about a recurrent fantasy we found we'd both had for getting back into shape. It was that we would go away somewhere remote for a year and do nothing but train, so that when the year was up we might finally know how good we were. Neither of us had any intention of doing this, though, which is why it was a fantasy. In adolescence, athletic excess has a certain appeal-during high school, I happily spent Sunday afternoons running up and down snow-covered sandhills-but with most of us that obsessiveness soon begins to fade. Athletic success depends on having the right genes and on a self-reinforcing belief in one's own ability. But it also depends on a rare form of tunnel vision. To be a great athlete, you have to care, and what was obvious to us both was that neither of us cared anymore. This is the last piece of the puzzle about what we mean when we say one group is better at something than another: sometimes different groups care about different things. Of the seven hundred men who play major-league baseball, for example, eighty-six come from either the Dominican Republic or Puerto Rico, even though those two islands have a combined population of only eleven million. But then baseball is something that Dominicans and Puerto Ricans care about-and you can say the same thing about African-Americans and basketball, West Indians and sprinting, Canadians and hockey, and Russians and chess. Desire is the great intangible in performance, and unlike genes or psychological affect we can't measure it and trace its implications. This is the problem, in the end, with the question of whether blacks are better at sports than whites. It's not that it's offensive, or that it leads to discrimination. It's that, in some sense, it's not a terribly interesting question; "better" promises a tidier explanation than can ever be provided.
I quit competitive running when I was sixteen-just after the summer I had qualified for the Ontario track team in my age class. Late that August, we had travelled to St. John's, Newfoundland, for the Canadian championships. In those days, I was whippet-thin, as milers often are, five feet six and not much more than a hundred pounds, and I could skim along the ground so lightly that I barely needed to catch my breath. I had two white friends on that team, both distance runners, too, and both, improbably, even smaller and lighter than I was. Every morning, the three of us would run through the streets of St. John's, charging up the hills and flying down the other side. One of these friends went on to have a distinguished college running career, the other became a world-class miler; that summer, I myself was the Canadian record holder in the fifteen hundred metres for my age class. We were almost terrifyingly competitive, without a shred of doubt in our ability, and as we raced along we never stopped talking and joking, just to prove how absurdly easy we found running to be. I thought of us all as equals. Then, on the last day of our stay in St. John's, we ran to the bottom of Signal Hill, which is the town's principal geographical landmark-an abrupt outcrop as steep as anything in San Francisco. We stopped at the base, and the two of them turned to me and announced that we were all going to run straight up Signal Hill backward. I don't know whether I had more running ability than those two or whether my Africanness gave me any genetic advantage over their whiteness. What I do know is that such questions were irrelevant, because, as I realized, they were willing to go to far greater lengths to develop their talent. They ran up the hill backward. I ran home.
Love that bomb
May 25, 1998
It's not just the Indians who fetishize nukes.
In the American hierarchy of things that kill people, bombs have never had much moral cachet. There is no National Bomb Association dedicated to the inalienable right of hunters to off Bambi's mom with, say, fragmentation grenades. Even in the shrillest debates on Capitol Hill, no freshman Republican has ever got up to declare that "bombs don't kill people, people do." In the movies, it's the bad guy who plants the bomb in the good guy's car; the good guy uses a .44 Magnum. When it comes to how we think about blowing each other away, there is an unspoken assumption that the kind of weapon you point is nobler than the kind you wire up to an alarm clock.
A similar ordering obtains on the level of international relations. "Weapons of mass destruction"--poison gas, germ-warfare agents, and, above all, nuclear bombs and missiles--are bad. Guns are, if not exactly good, the backbone of one of America's most profitable export businesses. Such distinctions have little to do with how deadly guns and bombs have proved to be in real life. On that score, firearms, which claimed tens of thousands of American lives last year, win hands down. In most states, they can be bought almost as easily as toaster ovens. (We will ban cigarettes long before we will ban handguns.) But a bomb is something that evil geniuses like the Unabomber use. Bombs have elaborate detonators and timing devices, and must be defused by experts. Guns are dumb; bombs are brainy (and nuclear bombs are the brainiest of all).
Imagine if, last week, India had conducted provocative military maneuvers along the Pakistani border, or had started infiltrating commandos across it, or, for that matter, had merely continued along the pugnacious path that the new government there had publicly charted. None of those conventionally bellicose acts would have turned India into International Public Enemy No. 1.
This is not to minimize or excuse what India has done in conducting the underground nuclear tests that disturbed the world's peace last week. Nuclear weapons have always received special consideration, for very good and very obvious reasons, and the fact that not one has been fired in anger since the Second World War is among the greatest successes of modern diplomacy. That India has chosen to flaunt its sinister expertise in this area is rightly a cause for indignation: India has triggered an arms race on the Asian subcontinent; it has further destabilized an already unstable region. The imposition of sanctions is entirely justified. Still, it is worth asking why this particular act--among the infinite variety of nasty things that countries do to one another and to their own citizens--is treated as uniquely outrageous. For if there is a lesson to be learned from the last fifty years--from what has been done by Hitler, Stalin, Mao, Pol Pot, the Rwandans, and the Bosnians, among others--it is that human beings don't need weapons of mass destruction in order to engage in mass destruction. Human ingenuity and human depravity are such that guns and machetes will do. It is true that with a bomb you can kill people faster and more emphatically. But here we risk reënacting, on the international stage, the inanity of our domestic things-that- kill-people hierarchy. Pol Pot, instead of shooting and starving millions of his countrymen in the course of several years, could presumably have herded them together and detonated a small nuclear device in their midst. That the former genocidal act did not compel us to action but the latter most assuredly would have is not evidence of moral seriousness on our part. It is evidence of moral myopia.
The Indians are not the only ones to fetishize the bomb. A decade after the fading of the Cold War, the United States and Russia continue to maintain arsenals of tens of thousands of atomic and hydrogen bombs, for what purpose no one can say--unless it is to serve some sort of vague national prestige. To the Indians, this action, or inaction, evidently speaks louder than all our anti-proliferation words. No wonder the announcement of the tests last week was greeted with euphoria by so many Indians. To them, the bomb is a way of earning respect--a salve for what President Clinton quickly diagnosed as their belief that "they have been underappreciated in the world." Clinton spoke of India as a school principal might speak of a troubled but promising adolescent suffering from low self-esteem. If this was diplomacy as guidance counselling, it was nonetheless a shrewd insight.
It is important that India be firmly divested of the nuclear illusion, but it is equally important that we divest ourselves of it as well. The bomb fetish is a James Bondish fantasy. It is also an embodiment of the high-modernist creed that form--in this case, the mastery of the scientific--carries automatic moral weight. And it is this creed that has led us to channel our indignation disproportionately into those instances in which evil meets a predetermined set of technical criteria. To the victims of mass slaughter, the distinction is without a difference.
The Estrogen Question
June 9, 1997
MEDICAL DISPATCHES
How wrong is Dr. Susan Love?
1.
When Dr. Susan Love gives speeches, she stands informally, with her feet slightly apart and her hands in casual motion. She talks without notes, as if she were holding a conversation, and translates the complexities of medicine and women's health with clarity and wit. "I see my role as a truthteller," she told a sold-out audience of middle-aged women at the Smithsonian, in Washington, last month, and everybody roared with approval, because that's what they've come to expect of Susan Love. She was, as usual, dressed simply that day--in a blue pants suit, with no makeup and with her hair in a short perm that looked as if she had combed it with her fingers. She had no briefcase or purse or adornment of any sort, and certainly none of a surgeon's customary hauteur, since it is Love's belief that medicine has for too long condescended to women. She was there to promote her new best-seller on estrogen therapy and menopause, but she made it clear right away that she wasn't about to preach. "Don't expect to leave here tonight knowing what to do," she said. Love wanted her audience to hear the evidence but, above all, to listen to their own feelings. "You have lived in your body a long time," she told the crowd, smiling warmly and reassuringly. "You know it pretty well--you know how it reacts to things, and you can trust it."
There are at least three doctors in America who fall into the category of media-celebrity--who can reliably write a best-seller or fill a lecture hall. The first is Deepak Chopra, practitioner of quantum healing and mind-body medicine. The second is Andrew Weil, whose seventh book, "Eight Weeks to Optimum Health," has been on the best-seller lists since March. And the third is Susan Love, breast surgeon, co-founder of the National Breast Cancer Coalition, and the author of two hugely successful books: "Dr. Susan Love's Breast Book," in 1990, and this year's "Dr. Susan Love's Hormone Book."
These celebrity doctors are all, in one way or another, proponents of what is fast becoming a basic tenet of popular medicine: that the system of health care devised by doctors and drug companies and hospitals is close-minded, arrogant, and paternalistic--dismissive of the role that nontraditional remedies, and patients themselves, can play in treating illness. "Blind faith in professional medicine is not healthy," Weil states flatly in "Natural Health, Natural Medicine"--and it's a sentence that could easily have appeared in any of the books by Love or Chopra.
Of the three, though, Love's critique is the most sophisticated. She's not a hippie, like Weil, or a mystic, like Chopra. She's a respected clinician--the former director of the Revlon-U.C.L.A. Breast Center, and an adviser to the National Institutes of Health's vast Women's Health Initiative--and her criticisms have the power of the insider. Karen Stabiner writes, in "To Dance with the Devil," her brilliant, recently published account of Love's tenure at U.C.L.A.:
Love had a set of immutable rules about proper examining room behavior, all designed to even out what she saw as an impossibly inequitable relationship. She always had the patient get dressed after an exam and threatened that otherwise she would have to disrobe to even things out. She never stood with her hands folded across her chest, which would make her seem inaccessible. She tried never to stand near the door, which made the patient feel that the doctor was in a hurry. Love had been known to breeze into a room and sit on the floor, legs splayed, her notes on her lap. She often sat on the footstool the patients used to step up onto the table. It was a conscious maneuver. These women felt helpless enough without having to assume a supplicant's posture, staring up at the all-knowing physician.
In "Dr. Susan Love's Hormone Book" Love applies this skepticism to perhaps the most important topic in women's health today: whether older women should take estrogen. The medical establishment and the pharmaceutical industry, she says, have told women that they have a disease, menopause, and have then given them a cure, estrogen, even though it's not clear that the disease is a disease or that the cure is a cure. "The reason I got into this is that a lot of the books out there were 'Don't worry, dear, we'll take care of it,' " Love told me just before she took the stage at the Smithsonian. "Women were dying to get more information, literally and figuratively. They weren't hearing the voice that says, 'You can figure this out. This is your choice, this is your body, this is your life. You don't have to do what the doctor says. You can do what feels right for you.' That's the voice that was missing." It's a nearly irresistible argument, made all the more so by the way Love presents it--honestly, passionately, forthrightly. So why, after even the slightest scrutiny, does so much of what Love has to say begin to fall apart?
2.
Estrogen, or Premarin (the trade name under which it is principally sold), is the most widely used prescription drug in the United States. Taken in the short term, during the onset of menopause, it offers relief from hot flashes and other symptoms. Taken over the long term, as part of a regime of hormone-replacement therapy (H.R.T.), it has been shown to reduce the risk of hip and spinal fractures in older women by as much as half, to lower the risk of heart disease by somewhere between forty and fifty per cent, and even--in recent and very preliminary work--to either forestall or modify the ravages of Alzheimer's disease and osteoarthritis. H.R.T. has two potential side effects, however. It raises the risk of uterine cancer, and that's why many women who take Premarin add a dose of the hormone progestin, which blocks the action of estrogen in the uterus. Long-term H.R.T. may also lead to higher rates of breast cancer.
It is the last fact that Love considers the most important. She has spent almost all her professional career fighting breast cancer, and was one of the earliest and most vocal opponents of radical mastectomies. Through the National Breast Cancer Coalition, she helped lead the fight to increase government funding for breast-cancer research, and it's hardly an exaggeration to say that her first book, "Dr. Susan Love's Breast Book," is to women's health what Benjamin Spock's "Baby and Child Care" was to parenting. Love is concerned about breast cancer above all else: she's worried about anything that might increase the risk of such an implacable disease. What's more, she believes that the benefits of estrogen are vastly exaggerated. Women humped over with osteoporosis are, according to Love, "far more common in Premarin ads than in everyday life," and she says that, since serious bone loss doesn't occur until very late in life, taking estrogen over the long term is unnecessary. On the question of heart disease, she says that the studies purporting to show estrogen's benefits are critically flawed. In any case, she points out, there are ways women can cut their risk of heart attack which don't involve taking drugs--such as eating right and exercising. So why take the chance? "It's only very recently that we've started talking about using drugs for prevention, and that's O.K. when we talk about high cholesterol or high blood pressure," she told me. "Those are people who have something wrong. But when you talk about H.R.T. for postmenopausal women, you're talking about women who have nothing wrong, who are normal, who may never get these diseases, and who are not necessarily at high risk. There is no drug that is a free lunch. There are always side effects, so why would we put women on a drug that has the side effect of a potentially life-threatening disease?"
What Love has done is recalculate the risk/benefit equation for estrogen, which is fine, except that she consistently overstates the risks and understates the benefits. In the case of osteoporosis, for example, it is true that most women don't experience the effects of bone loss until their seventies. But some--about ten to fifteen per cent of women--do, with quite serious consequences. It's also the case that the maximum protection against hip fractures comes only after ten years of H.R.T., which, considering how debilitating hip fractures are to the well-being of the elderly, is a strong argument for long-term estrogen use. Or consider how Love deals with the question of heart disease. All the major studies from which conclusions have been drawn are what are called observational studies: epidemiologists have found a large group of women who were taking estrogen, followed them for a number of years, and then determined that those women had about half the number of heart attacks that women who weren't taking estrogen had. The problem with this kind of study, of course, is that it doesn't tell you whether estrogen lowers the risk of heart disease or whether the kind of women who have the lower risk of heart disease are the kind of women who take estrogen. Love suspects that it's the latter. In all the studies, she points out in her new book, "the women who took estrogen were of higher socioeconomic status, better educated, thinner, more likely to be non-smokers . . . more likely to go to doctors . . . and therefore more likely to have had overall preventative care, such as having their blood pressure checked and their cholesterol monitored."
What Love doesn't point out, though, is that over the past decade estrogen researchers have been scrupulously attempting to account for this problem, by breaking down the data in order to match up the estrogen users more closely with the nonusers. Women on hormones who smoke, have a college degree, and have high blood pressure, say, are matched up with women who smoke, have a college degree and high blood pressure, and don't take hormones. It's an imperfect way of breaking down the data, since the resulting samples are not always large enough to be statistically significant. But it gives you some idea of how real the effect of estrogen is, and when researchers have done this kind of reanalysis they've found that estrogen cuts heart attacks by about forty per cent, which is a lower figure than before the reanalysis but still awfully impressive.
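The reanalysis described above is, in essence, a stratified comparison. Here is a minimal sketch of the idea in Python, with a handful of invented records; none of the fields, values, or rates come from the actual studies, and real epidemiologists work with far larger samples and more careful adjustments.

from collections import defaultdict

# Hypothetical records, invented for illustration only:
# (takes_estrogen, smoker, college_degree, high_blood_pressure, had_heart_attack)
records = [
    (True,  True,  True,  True,  False),
    (False, True,  True,  True,  True),
    (True,  False, True,  False, False),
    (False, False, True,  False, False),
]

# Group the women into strata defined by the confounding traits, then compare
# heart-attack rates between users and nonusers within each stratum, so that
# estrogen users are only ever compared with similar nonusers.
strata = defaultdict(lambda: {"users": [0, 0], "nonusers": [0, 0]})
for estrogen, smoker, college, high_bp, attack in records:
    counts = strata[(smoker, college, high_bp)]["users" if estrogen else "nonusers"]
    counts[0] += 1            # women in this cell
    counts[1] += int(attack)  # heart attacks in this cell

for stratum, cells in sorted(strata.items()):
    for label, (n, attacks) in cells.items():
        if n:
            print(stratum, label, "heart-attack rate:", round(attacks / n, 2))

The point of the exercise is simply that any difference that survives within a stratum cannot be explained by the trait that defines the stratum.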
With breast cancer, Love takes the opposite approach--taking a relatively weak piece of evidence and making it appear more robust than it is. Her logic goes something like this. We know that hysterectomies, regular exercise, and early pregnancy--all things that lower a woman's exposure to her own estrogen--reduce the risk of breast cancer. We also know that having one's first period before the age of twelve, having children late or never having children, reaching menopause late, drinking a lot of alcohol, or being overweight--things that raise a woman's exposure to her own estrogen--increase the risk of breast cancer. "Since your body's own hormones can cause breast cancer," Love writes, "it makes sense to conclude that hormones taken as drugs will also increase your risk."
That sounds persuasive. But where's the clinical evidence to support it? "I just reviewed the hormone/breast-cancer research from the last five years," Trudy Bush, an epidemiologist at the University of Maryland, told me. "I found one report, from the Nurses' Health Study, showing a forty-percent increase in breast-cancer risk. I found four reports--two very large and well done--showing no effect, and I found another study showing that estrogen gave women significant protection against breast cancer. They're all over the place."
The problem is that measuring the link between estrogen and breast-cancer risk is tricky. The Nurses' Health Study, for example, which showed that women on H.R.T. had a forty-per-cent greater chance of getting breast cancer, is the study that has received the most media attention and the one that preoccupies Love: it is among the largest and best of the studies, and its conclusions are worrying. But it has some of the same selection-bias problems as the heart-disease studies. The estrogen users in the study, for example, had fewer pregnancies, got their periods earlier, and have other differences with the control group which would lead you to believe that they might have had a higher risk of breast cancer anyway.
There is another possible complication: estrogen does such a good job of fighting heart disease that most women who are on H.R.T. live substantially longer than women who aren't. In a recent computer analysis, Nananda Col, who teaches at the Tufts School of Medicine, and her colleagues there took the most conservative possible estimates--the highest available estimate for breast-cancer risk and the lowest one for heart-disease benefit--and devised an H.R.T. risk/benefit table, from which any woman can figure out on the basis of her own risk factors what her expected benefit would be. It shows that a woman who smokes, has relatively high cholesterol, high blood pressure, and moderate breast-cancer risk can expect to live two and a half years longer if she takes estrogen. That's two and a half years in which she has a chance to develop another disease of old age--for example, breast cancer. In other words, you'd expect to see more breast cancer in women who are on estrogen than in women who aren't, even if estrogen has nothing whatever to do with cancer, for the simple reason that women on estrogen live so much longer that they have a greater chance of getting the disease naturally.
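The arithmetic behind that last point can be sketched in a few lines of Python. The annual risk and the life expectancies below are invented numbers, chosen only to show the direction of the effect; they are not estimates from Col's analysis or any study cited here.

# A back-of-the-envelope sketch of the survival effect described above.
annual_breast_cancer_risk = 0.003   # hypothetical 0.3 per cent chance per year

def lifetime_cases(years_lived, annual_risk):
    """Chance of developing the disease at some point over the years lived."""
    return 1 - (1 - annual_risk) ** years_lived

# Suppose women not on H.R.T. live twenty more years on average, and women on
# H.R.T. live twenty-two and a half more years (the extra two and a half years
# from fewer heart attacks), with the same annual breast-cancer risk in both groups.
without_hrt = lifetime_cases(20, annual_breast_cancer_risk)
with_hrt = lifetime_cases(22.5, annual_breast_cancer_risk)

print(f"without H.R.T.: {without_hrt:.3f}")   # about 0.058
print(f"with H.R.T.:    {with_hrt:.3f}")      # about 0.065: more cases, with no causal link

Even with an identical annual risk in both groups, the longer-lived group shows more lifetime cases, which is exactly the bias a naive comparison would misread as an effect of the drug.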
Most experts agree that, in the end, H.R.T. is probably linked to some increased breast-cancer risk. What all the questions suggest, though, is that the effect is probably not huge and is certainly nowhere close to cancelling out the benefits of estrogen in fighting heart disease. Col, in her computer analysis, estimates that only about one per cent of women--those with the very highest risk of breast cancer and only a slight heart-disease risk--can expect no gain, or even a loss, in life expectancy from H.R.T. Everybody else--even those who have a close relative with breast cancer--is likely to benefit from the drug, and for some women taking estrogen is as good a way of living longer as quitting smoking. It is, unfortunately, very hard to convince most women of this fact. As few as a quarter of those who begin H.R.T. stay on the treatment for more than two years, and much of that has to do with the persistent inclination of many women to overestimate their risks of getting breast cancer and underestimate their risks of developing heart disease. In one recent study of several hundred educated middle-aged women, almost three- quarters of those polled thought that their risk of developing heart disease by age seventy was less than one per cent--when, in fact, statistically, it's more like twenty per cent. In making her argument the way she does, then, Love is not "truthtelling"; she's simply furthering an existing--and dangerous--myth. "You can understand where she's coming from," Trudy Bush says. "Fourteen hours a day, six days a week, she sees women with breast cancer, and that's all she sees. Your world becomes very narrowly defined. It happens with everyone who is a breast surgeon. But I also think that there is a perception on the part of some women who are activists that there is a conspiracy to force women to buy these hormones and force doctors to prescribe them. Instead of the military-industrial complex, it would be the A.M.A.-pharmaceutical complex. But things aren't so simple. In my opinion, we're all struggling here, trying to tease this out. We can only look at the data."
In March, Love published an Op-Ed piece in the Times, in which she directly addressed the question of the relative risks facing women. "Pharmaceutical companies defend their products by pointing out that one in three women dies of heart disease, while one in eight women gets breast cancer," she wrote. "Although this is true, it is important to note that in women younger than age 75 there are actually three times as many deaths from breast cancer as there are from heart disease."
This statistic is central to Love's argument. She is saying that it makes no sense to avoid something that will kill you tomorrow if it increases your chances of dying of something else today. Incredibly, however, Love has her numbers backward: in women younger than seventy-five, there are actually more than three times as many deaths from heart disease as from breast cancer. (In 1993, about ninety-six thousand women between thirty-five and seventy-four died of heart disease, while twenty-eight thousand died of breast cancer.) Even the general idea behind this argument--that heart disease is more of a problem for older women and breast cancer is more of a problem for younger women--is wrong. In every menopausal and postmenopausal age category, more women die of heart attacks than die of breast cancer. For women between the ages of forty-five and fifty-four, death rates for heart disease are roughly 1.4 times those for breast cancer. For women between the ages of fifty-five and sixty-four, it's nearly three times the problem; for women between the ages of sixty-five and seventy-four, it's five and a half times the problem; and for women seventy-five and older it's almost twenty times the problem.
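For what it's worth, the correction is simple arithmetic on the 1993 figures quoted above:

# Deaths among women aged thirty-five to seventy-four in 1993, as quoted in the paragraph above
heart_disease_deaths = 96_000
breast_cancer_deaths = 28_000
print(heart_disease_deaths / breast_cancer_deaths)  # ~3.4 -- heart disease, not breast cancer, is the larger killer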
It's hard to know what to make of this kind of error. The Harvard epidemiologist Meir Stampfer was so dismayed by it that he wrote a letter to the editor of the Times, which was published a week after Love's article appeared. But he didn't think that her mistake was deliberate. He thought that she had just looked at the government's mortality tables and confused the heart-disease category with the breast-cancer category. "Somebody told me that they heard her on the radio or TV giving those wrong numbers, and I was pretty astonished," Stampfer told me. "And then, when I saw it in print, I flipped my lid a little bit. I'm assuming that it's just an unwitting transposition of the numbers."
That, at least, is the charitable explanation. When I met with Love, a month or so after Stampfer's letter appeared, I asked her about the relative risks of breast cancer and heart disease. We were sitting together in a booth at a hotel coffee shop in downtown Washington. It was the kind of situation, you'd think, where she might have felt free to admit to embarrassment or to offer some kind of candid explanation for what went wrong. But that's not what happened. "One of the problems with that comparison is that they act like these diseases are all at the same time," she answered. "Most women at fifty know someone who has died of breast cancer. Most women at fifty don't know someone who has had heart disease." Her eyes locked reassuringly on mine. "That's because under seventy-five there are three times as many deaths from breast cancer as from heart disease."
3.
There is an even more striking problem with the anti-estrogen movement, and that is the way that it ignores the next generation of H.R.T., the compounds known as SERMs (for "selective estrogen-receptor modulators"). For many years, it was thought that estrogen was a kind of blunt instrument, that if a woman took the hormone it would affect her bones and her breasts and her heart and her uterus in the same way. In other words, it was thought that a woman's body had one kind of molecular switch that would be turned on all over the body whenever she took the hormone. But when scientists were testing the drug Tamoxifen on women with breast cancer several years ago, they found out something unexpected. Tamoxifen was supposed to turn off the estrogen switch, so someone with breast cancer would take it on the theory that starving breast tissue of natural estrogen would help shrink or prevent tumors. "Everyone expected that the bone quality in these women on Tamoxifen would not be good," Donald McDonnell, a pharmacologist at Duke University, told me, explaining that people had assumed that if there was no estrogen going to the breasts there would be none going to the bones, either. In fact, though, the women's bones were fine. Somehow, Tamoxifen was turning off the estrogen switch in the breasts by acting just like estrogen in the bones. "What that suggested for the first time was that maybe estrogen doesn't work the same way in every cell and maybe we could use this information to build better compounds that would be tissue-selective," McDonnell said.
What researchers now believe is that there are many kinds of estrogen switches in the body, and that whether they turn on or off is highly dependent on the type of estrogen-like compound they are presented with. Tamoxifen, by purest chance, happens to be a compound that turns the switch on in the bones and off in the breasts. Unfortunately, it also turns the switch on in the uterus, raising the risk of uterine cancer. But a second generation of SERMs is now in development; these act like estrogen in the heart and the bones but block the harmful effects of estrogen in the breasts and the uterus. McDonnell has one such compound that is about to go into clinical trials. The Indianapolis-based drug firm Eli Lilly has another--Raloxifene--that is much further along and could be on the market within a year or so. Before very long, in short, women worried about raising their breast-cancer risk will have the option of taking a different kind of hormone that doesn't affect their breasts at all--or that may even protect against breast cancer.
"In the past, what you were looking at was a risk/benefit game," John D. Termine, a vice-president at Lilly's research laboratories, told me. "There was estrogen with all these terrific properties, but at the same time there was this downside, that women were afraid of breast cancer. Now Raloxifene and the other serms come along, and we're going to have alternatives. Now the risks and benefits are much different, because we have something else. . . . One of the physicians on our advisory board said that it's like when beta-blockers were introduced for heart diseases. It changed the game completely."
You might think that this development would be of enormous significance to Love, answering, as it does, her great worry about the potential side effects of H.R.T. In fact, she mentions SERMs just twice in her book and, each time, only briefly. It's a bit as if someone had written a book about computers in 1984 and
Scientists are hoping to use some of this new information to design the perfect hormone: one that will protect the uterus and breast from cancer, stop hot flashes, and prevent osteoporosis and heart disease. It would be lovely--could it do housework too?--but I'm skeptical. It would still be a drug. And I have yet to see a drug that doesn't have some side effects.
This is an extraordinary passage. It would still be a drug? What form does a successful medical intervention have to take before Love finds it acceptable? And, for that matter, since when does the possibility of side effects negate the usefulness of a drug? Drugs have side effects, but we take them anyway, because in most cases the side effects are a lot less significant than the main effects. (That's why they're called side effects.) At one point in her speech in Washington, Love spoke of her daily breakfast of soy milk and flax-seed granola, and boasted jokingly that it was so rich in natural plant estrogens that "one bowl is as good as a Premarin pill." Now it turns out that one bowl is not as good as a Premarin pill, because plant estrogens are much weaker than animal estrogens. Nor are plant estrogens exactly "natural," because plant estrogens are, technically, nonsteroidal while Premarin--like the estrogen a woman makes herself--is a steroid. But Love wasn't really intending to enter into a discussion of estrogen chemistry. She was simply expressing her skepticism of modern medicine--of the idea that medical salvation can come in the form of a pill. Her objection is not to Premarin itself so much as it is to the idea that postmenopausal women should rely on any sort of drug at all.
This is where, sooner or later, you end up when you start down the path of people like Andrew Weil and Deepak Chopra and Susan Love. To read the health books on the best-seller lists right now is to be left with the impression that exercise and a good diet are all that matter--that medicine is too ineffectual to help us if we do not first help ourselves. That's one of the reasons these books are so successful: they take the language of emotional and spiritual fulfillment and apply it to medicine, prompting people to find and follow their own instincts about health in the same way they have been taught to find and follow their own instincts in relationships, say. When Love told me in Washington that "this is your choice, this is your body, this is your life," that's what she meant--that the medical was the personal. This kind of talk may inspire people to shape up, which is all to the good. But it does not begin to reflect how sophisticated and powerful medicine has become. In the introduction to his 1990 book "Natural Health, Natural Medicine," Weil claims that "professional medicine" is "bad" at treating, among other diseases, cancer and viral infections. Yet today, just a few years after he wrote those words, not only are we on the verge of getting a new class of anti-influenza drugs but a combination therapy for H.I.V. appears to have dramatically extended the lives of AIDS patients, and over the next several years the biotechnology industry is likely to get approval for almost two dozen new cancer drugs, representing a second generation of treatments, to replace chemotherapy and radiation. The list of things that traditional medicine is bad at gets shorter all the time.
Earlier this year, a study appeared in the Journal of the American Medical Association that put many of these changes in perspective. The study, conducted by a team from Harvard University's School of Public Health, attempted to figure out why the mortality rate from coronary heart disease dropped so dramatically in the nineteen-eighties. In that decade, the decline averaged 3.4 per cent a year, which means that in 1990 there were about a hundred and thirty thousand fewer deaths from heart disease in America than there would have been if the mortality rate had been the same as it was in 1980. Most people, I think, would credit this extraordinary achievement to our getting more exercise in the nineteen-eighties and losing weight. But we didn't, much. Smoking, which is obviously a major risk factor for heart disease, was down, but not by a lot: the Harvard group estimated that declines in smoking probably account for about six per cent of the decrease. People did eat better as the decade progressed, but better diet probably accounts for only about a quarter of the difference. Most of the drop--about seventy per cent of the total--happened because of the increased use of procedures like angioplasty and coronary bypass and, more important, the advent of a new class of powerful clot-dissolving drugs, like streptokinase and tissue-plasminogen activator.
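A quick check of the compounding, using only the figures from the study as they are quoted here:

annual_decline = 0.034
remaining_after_decade = (1 - annual_decline) ** 10
print(round(remaining_after_decade, 2))  # ~0.71, i.e. roughly a 30 per cent drop in the death rate over the decade

# The study's rough attribution of the improvement, as quoted above
attribution = {"less smoking": 0.06, "better diet": 0.25, "procedures and clot-dissolving drugs": 0.70}
print(round(sum(attribution.values()), 2))  # 1.01 -- the listed factors account for essentially the whole decline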
This does not, of course, change the fact that people should exercise and eat properly and take charge of their lives. We should all listen to our bodies and make our own choices. But there are times when what we can find out about our bodies and do for ourselves pales in comparison to what we do not know and cannot do--when we have to rely on doctors and medicine to do things for us. There is more to medicine than can be explained by the language of personal fulfillment.
The Dead Zone
download pdf
homethe dogoutliersblinkthe tipping pointthe new yorker archiveetc.blog
September 29, 1997
A REPORTER AT LARGE
Seven bodies buried in the Arctic tundra might
solve the riddle of the worst flu pandemic
in history -- and might help us prevent it
from happening again.
I--Permafrost
On September 24, 1918, three days after setting sail from Norway's northern coast, the Forsete arrived in Longyearbyen, a tiny mining town on one of the Norwegian islands north of the Arctic Circle. It was the last ship of the year, before ice made the Arctic fjords impassable, and it carried among its passengers a number of fishermen and farmers going north for the winter to earn extra money in Longyearbyen's coal mines. During the voyage, however, the ship had been hit with an outbreak of flu. Upon landing, many of the passengers had to be taken to the local hospital, and over the next two weeks seven of them died. They were buried side by side in the local cemetery, their graves marked by six white crosses and one headstone:
Ole Kristoffersen
February 1, 1896-October 1, 1918
Magnus Gabrielsen
May 10, 1890-October 2, 1918
Hans Hansen
September 14, 1891-October 3, 1918
Tormod Albrigtsen
February 2, 1899-October 3, 1918
Johan Bjerk
July 3, 1892-October 4, 1918
William Henry Richardsen
April 7, 1893-October 4, 1918
Kristian Hansen
March 10, 1890-October 7, 1918
The Longyearbyen cemetery is at the base of a steep hill, just beyond the town limits. If you look up from the cemetery, you can see the gray wooden skeleton of the coal mine that used to burrow into the side of the hill, and if you look to your left you can see the icy fringes of a glacier. Farther down the mountain are a shallow stream, a broad shale plain, and then, half a mile or so across the valley, Longyearbyen itself: a small cluster of red-roofed, brightly painted frame buildings. There are no trees, because Longyearbyen is many miles above the tree line, and from almost anywhere in the valley the cemetery is in plain view. Each grave site is slightly elevated and surrounded by rocks, and there are well-worn pathways among the rows of crosses. A chain-link fence rings the periphery. When I was there in late August, the ground had been warmed by the Arctic summer sun and was soft and spongy, carpeted with orange and red and white lichen. In the last row I found the miners' graves--seven deaths separated by six days.
It is possible to go to almost any cemetery in the world and find a similar cluster of graves from the fall of 1918. Between September and November of that year, as the First World War came to an end, an extraordinarily lethal strain of influenza swept the globe, killing between twenty million and forty million people. More Americans died of the flu over the next few months than were killed during the First World War, the Second World War, the Korean War, and the Vietnam War combined. The Spanish flu, as it came to be known, reached every continent and virtually every country on the map, going wherever ships sailed or cars or trucks or trains travelled, killing so many so quickly that some cities were forced to convert streetcars into hearses, and others buried their dead in mass graves, because they ran out of coffins. Of all those millions of graves, though, the seven in Longyearbyen stand apart. There, less than eight hundred miles from the North Pole, the ground beneath the lichen is hard-frozen permafrost. The bodies of the seven miners may well be intact, cryogenically preserved in the tundra, and, if so, the flu virus they caught on board the Forsete--the deadliest virus that the world has ever known--may still be down there with them.
At the beginning of next month, a scientific team led by the Canadian geographer Kirsty Duncan will fly to Longyearbyen and set up a workstation in the church graveyard. The team will map the site, and then scan it with ground-penetrating radar, passing what looks like a small black vacuum cleaner over the tundra to see how deep the bodies are buried. If the radar shows that they are below the active layer of the permafrost--that is, below the layer that thaws each summer--the team will return next fall with enough medical equipment and gear to outfit a small morgue. The site will be prepared with tarpaulins and duckboards. Pavement breakers--electric jackhammers--will be used to break up the tundra, and the chunks of earth will be scooped out with a shovel. As the excavation gets close to the coffins, the diggers will don biohazard spacesuits, and a dome or a tent will be erected over the operation.
To minimize the possibility of infection, the bodies will be left where they are, in their coffins, and autopsies will be performed in the ground. If the clothes on the corpses are frozen to the skin or tightly matted, someone on the team might run a hair dryer over the material to loosen it up--but only briefly. "If the bodies are thawed out and this material is taken out, it will melt, and then there is always the chance of the spread of microdroplets," Peter Lewin, one of the team members, told me. Lewin is a pediatrician at Toronto's Hospital for Sick Children who doubles as a medical archeologist, and he has earned international renown for his pioneering CAT scans of Egyptian mummies. (He helped determine that Ramses V died of smallpox.) "Say you're doing an autopsy"--he gestured to indicate a body spread out on the desk in front of him--"if it melts, there may be a mucousy, secondary blood product--some type of liquid exudation. The liquid seeping out of that material may suddenly, by mistake, be aerosolized and someone inhales it. You just don't want to take any chances."
From the ad-hoc morgue in the Longyearbyen cemetery, the samples will be flown to a BSL-4 facility--4 is the highest level of biological containment--either in England or at the United States Army's infectious-disease research facility, at Fort Detrick, Maryland. There's a small possibility that what scientists will find is a live virus--a virus that, once thawed, could be as deadly and infectious as it was in 1918. If they don't, the hope is that they'll at least be able to recover the virus's genetic footprint--what scientists call RNA residue. Samples of the virus will then be sent to laboratories around the world. Its genetic code will be sequenced and compared with every major sample of the flu virus on file in the world's virological centers.
This task has a certain urgency. Scientists know that global outbreaks of deadly influenza go back at least four hundred years, and that there have been two more since 1918--the Asian flu, of 1957, which killed seventy thousand Americans, and the Hong Kong flu, which killed thirty-three thousand during the winter of 1968-69. With luck, we'll be able to anticipate the next Spanish flu before it does much damage. The problem is that we're not really sure what to look for. No one kept a sample of the virus in 1918, because the flu virus wasn't isolated until fifteen years later. And, because influenza mutates so rapidly, there's almost nothing to be learned about the peculiarities of the 1918 virus from looking at the influenzas in circulation today. The only way to find out about the 1918 virus is to find the 1918 virus.
"We've designed core-biopsy-removal equipment to take core samples," Peter Lewin said. "You drill into the body, because it's solid. It's a technique taken from forestry. You use what's called a hole-saw tube." He drew a diagram on the back of a file folder, outlining a long, hollow cylinder, with circular, screw like grooves on its outside, a serrated edge on its tip, and a T-shaped handle at its other end. "It's about nine inches long, about a quarter inch in diameter," he went on, explaining that as the tube is twisted into a body it will collect a long cross-sectional sample of tissue. "We'll probably take four core samples of the lung"--he pointed at the upper and lower chambers of his left and right lung--"one of the brain, one of the trachea, perhaps two of the bowel and liver."
Lewin was raised in Egypt, where his father was a British military officer--two of Lewin's schoolmates were Adnan Khashoggi and the future King Hussein--and he has the unflappable, genteel air of a nineteenth-century colonial explorer. He ticked off the details of the exhumation in Longyearbyen as if he were reciting a grocery list. "We're doing some practice runs on frozen material--basically, on frozen pigs--to see if this thing works. We were initially going to use a drill. But the drill goes so fast that it heats the tissue up, and, of course, we don't want that. So why not just slowly twist it in?" He rotated his hand. "They use hole saws on trees to get core samples of rings. They're very useful. But no one has ever used them here. I mean"--he laughed--"how often do you do core samples of frozen bodies?"
II--The Second Wave
The first known case of Spanish flu was reported on March 4, 1918, at Camp Funston, in Kansas. By April, it had spread to most cities in America and had reached Europe, following the trail of the hundreds of thousands of American soldiers who crossed the Atlantic that spring for the closing offensives of the First World War. The spring wave was serious but not disastrous, and by midsummer it had subsided. A month or so later, however, the Spanish flu resurfaced. It was the same virus in the sense that if you'd got the flu in the spring you were resistant to it in the fall. But somehow over the summer it had mutated. Now it was a killer.
The first case of the second wave was recorded on August 22nd, in Brest, a major port for incoming American troops. Within days, it appeared simultaneously in Boston and Freetown, Sierra Leone, carried in the former case by returning American soldiers and in the latter by H.M.S. Mantua, a British navy ship. The virus crossed Europe in a matter of weeks. It attacked Spain through Portugal, in the west, and across the Pyrenees, in the north, lingering long enough to be dubbed--erroneously, as it turned out--the Spanish flu. Scandinavia was infected by England; Italy was infected by France; and Sicily was infected by Italy. Allied soldiers coming to the aid of anti-Bolshevik forces during the Russian Revolution carried the flu to the White Sea area of northwestern Russia. European and American ships brought the flu to Iceland in mid-October, and American ships brought the flu to New Zealand at around the same time. In India, the virus came by sea and raced inland along the country's railroad lines. As many as half of all those who died in the pandemic died within India's borders. In America, an estimated six hundred and seventy-five thousand people died. In Philadelphia, seventy-six hundred people died within fourteen days. Putrefying bodies were stacked up three and four deep in the corridors of the city morgue, creating such a stench that the morgue was forced to throw open its doors for ventilation. In "America's Forgotten Pandemic" (1976), a definitive history of the Spanish flu, the historian Alfred Crosby offered this description of the flu's advance on Alaska:
On or about November 1 the virus reached the finest medium for its propagation in Nome and vicinity, the city's Eskimo village. Few Eskimos escaped infection. In a single eight-day period 162 of them died. Some Eskimos, hounded by superstitious horror, fled from cabin to cabin, infecting increasing numbers with disease and panic. The temperature fell below freezing, and when rescuers broke into cabins from whose chimney came no sign of smoke, they found many, sometimes whole families, who had been too sick to renew their fires and who had frozen to death. When a number of Eskimos were rounded up from their separate cabins and placed in a single large building so they could be cared for efficiently, several of them responded to what they apparently perceived as incarceration in a death house by hanging themselves.
This was not the flu as we normally think of it. Typically, influenza infects the inner lining of the respiratory tract, damaging the air-filled cells of the lungs known as alveoli. Sometimes it brings on pneumonia. Usually it passes. This was much worse. "If you autopsied some of the worst cases, you'd find the lungs very red and very firm," said Jeffery Taubenberger, a pathologist at the Armed Forces Institute of Pathology, in Washington, D.C. "The lungs are normally filled with air, so they are compressible. These would be very heavy and very dense. It's the difference between a dry sponge and a wet sponge. A normal piece of lung would float in water because it was basically filled with air. These would sink. Microscopically, you would see that the alveoli would be filled with fluid, which made it impossible to breathe. These people were drowning. There was so much liquid in the air spaces of their lungs that patients would have bloody fluid coming out of their noses. When they died, it would often drench the bedsheets."
Without sufficient oxygen, patients would suffer from cyanosis--a discoloration of the skin. "Two hours after admission they have the mahogany spots over the cheek bones," a physician wrote at the time, describing the epidemic at Camp Devens, Massachusetts. "And in a few hours you can begin to see the cyanosis extended from the ears and spreading all over the face, until it is hard to distinguish the colored man from the white." Nurses would triage incoming flu patients by looking at the color of their feet. Patients whose feet were black were considered as good as dead.
Something else was strange about the 1918 strain, and that was its choice of victim. Flu epidemics kill mostly at the demographic fringes--the very old, whose immune systems are the least robust, and the very young. Other adults do get sick, but they rarely die. In 1918, however, the usual pattern of mortality was reversed. The Longyearbyen seven, for example, were all between the ages of nineteen and twenty-eight, and that was by no means unusual. In the United States, men between twenty-five and twenty-nine died of the Spanish flu at several times the rate of men between seventy and seventy-four. This wasn't just a deadly infectious disease. It was a deadly infectious disease with the singular and terrifying quality of being better at killing the young and healthy than the old and the infirm.
III--Process of Elimination
Kirsty Duncan, the leader of the Longyearbyen expedition, is a medical geographer and climatologist by training, with dual appointments at the University of Windsor and the University of Toronto. We met in her parents' house, a bungalow in the Toronto suburb of Etobicoke, she on one side of the family dining-room table, I on the other. Between us were five overstuffed black binders, filled with the results of four and a half years that Duncan had spent searching for frozen flu victims. In the kitchen behind us, Duncan's mother prepared lunch. Whenever the phone rang, or the banging from the kitchen got too loud, or Duncan was coming to a critical part of her story, she dropped her voice almost to a whisper, so that I had to lean forward to hear what she was saying. She has large dark eyes and straight dark-brown hair that runs so far down her back that once when she got up her hair got caught in the chair. She's thirty, but she looks much younger. When I first walked up to the house, I approached a woman watering the flowers and said, "Professor Duncan?" The woman replied, "Oh no. I'm her mother. Kirsty's inside."
Duncan's obsession with the Spanish flu began when she read Crosby's book on the pandemic. "I was absolutely fascinated--horrified, more than anything--that we didn't know what caused this disease," she told me. "I said to my husband, 'I'm going to find the cause of the Spanish flu.'" The logical place to start, it seemed to her at the time, was Alaska, so she wrote to the Alaska bureau of Vital Statistics and had it ship her records from 1918. "I went through thousands of death certificates, and I found all kinds of cases of Spanish flu. The problem was trying to decipher where the permafrost was." In 1951, the Army had led a secret expedition to a grave site near Marks Air Force Base, in Nome, to dig up 1918 corpses, but the mission--code-named Project George--failed for that very reason: the bodies weren't in permafrost, and they had melted and decomposed. After Alaska, Duncan thought of Iceland. "But, of course, with all that geothermal energy it's too warm," she said. "Then I had a friend returning from Norway, and he mentioned permafrost, and I became excited, because I knew flu had been in Norway."
Duncan's focus was on the huge archipelago of islands, about six hundred miles north of Norway, that is known as Svalbard--and, in particular, on the town of Longyearbyen, a settlement of just over a thousand people which has served as Svalbard's major port for the better part of the century. "I knew that people used to do coal mining in Svalbard," she said. "I contacted the Norwegian Polar Institute. But they told me I had a really difficult task ahead of me. There are no medical records, because the hospital was bombed in the Second World War; no church records, because the first minister didn't come out until the nineteen-twenties; and no government records, because Svalbard didn't officially become part of Norway until 1925. They said there are these diaries, though, that the coal company kept." Duncan called the coal company, which referred her to a schoolteacher in Longyearbyen. She called the schoolteacher. He found, in the 1918 entries, a record of the deaths of seven young miners from Spanish flu. "So now I knew that there were seven bodies, and that they were buried in the churchyard in Longyearbyen," Duncan said. "I contacted the minister at the church. I said I wanted to know if the graves were marked. He said they were."
The bodies of the seven miners are not, in all likelihood, perfectly preserved. Prolonged freezing desiccates soft tissue, so the best Duncan's team can hope for is, essentially, natural mummies. "In a frozen state, the fluids in the body simply evaporate," Michael Zimmerman, an anthropologist at the University of Pennsylvania and an expert on mummification, explained to me. "The process is called sublimation. It's the change from the solid state to a gaseous state without going through a liquid state. If you put a tray of ice cubes in your freezer and go back two weeks later, they're a lot smaller. That's what we're talking about." Zimmerman estimated that the Longyearbyen seven, if they had been properly buried, would probably be down to about half their original weight, and maybe even less, so that their skin would be stretched tight over their bones, and every one of their ribs would be showing, as if they had been deprived of food for an extended period. "The eyes collapse, because there is a large fat pad behind the eye that's mostly water, and when that dries the eye falls back into the socket," he said. "Like everything else, the lips will tend to retract, so the teeth will become more prominent." Nonetheless, Zimmerman thought that a full autopsy would still be possible. "I don't see a problem," he went on, "especially given that these bodies were buried only about eighty years ago. The tissues are probably still fairly flexible. They're not like Egyptian mummies. Their tissues are like old leather, like an ancient book, and unless you're careful they'll crumble. Frozen bodies, since they don't completely desiccate until they've been frozen for a thousand years, are still flexible. You can get big pieces out pretty easily." There is a catch, though. During the summer months, the top layer of the permafrost thaws. In Longyearbyen, that layer is between one and 1.2 metres deep. If the miners were buried in that layer--if the gravediggers in 1918 hadn't gone to the trouble of blasting or pickaxing their way deep into the tundra--the bodies would be dust and bone by now. "I contacted the Norwegian authorities and asked what depth the bodies would have been buried, and they said, 'Well, no one knows,'" Duncan went on. "Back then, that was no man's land. But they assumed they would have followed the practice of the time, which was about two metres. The church minister believes they will be at two metres."
This was more than simply a guess. In the permafrost, anything buried in the active layer will, over time, "float"--that is, be pushed up toward the surface by the continual expansion and contraction of the ground. For that reason, it's relatively common in the hills around Longyearbyen to stumble across skeletons.
"If you go places where trappers are buried, you often see the coffin, open on the ground," Kjell Mork, a Longyearbyen high-school teacher who serves as the town's unofficial historian, told me. Mork is the man who gave the coal-company diaries to Duncan. He's a dead ringer for the novelist Robertson Davies, and has in his house a polar-bear pelt that takes up almost an entire wall. "I see it all the time. Back in the sixteenth, seventeenth, and eighteenth centuries, the trapping teams had only two or three people, so they couldn't take the effort to bury the bodies deep enough. Up at the northwest corner of the fjord, there used to be plenty of them. But now there's a new ethic--to cover them up again. I think the polar bears were going there." In the Longyearbyen churchyard, however, nothing has ever floated. Next to one of the crosses, just a few feet beyond the fence, I had seen a pile of fairly sizable white bones, including what looked like a human-size femur. But when I asked Mork about this he shook his head. "I think that's just reindeer," he said. "They come down the mountain to die."
Duncan's next big problem was to find out what had happened to the bodies before they were buried. The flu virus, after all, is notoriously unstable. It's an RNA virus, as opposed to a DNA virus, and that means that instead of being composed of double strands of genetic code it has just one strand, and is much more vulnerable. The moment someone dies, enzymes are released that begin breaking down these nucleic acids and the genetic information they carry. A DNA virus, like herpes or hepatitis, could probably last in a body for a few days before being totally destroyed. But an RNA virus, like flu, would last between twelve and twenty-four hours at the most. The diaries kept by the manager of the coal company show that the bodies of the Longyearbyen seven were not buried until October 17th, ten days after the last of them died. What happened in those ten days? Did the bodies start to decompose before they were buried? Duncan was told that orderlies at the Longyearbyen hospital would have taken the bodies to an outdoor morgue while they waited for the graves to be dug. For there still to be RNA residue, the weather would have had to have been cold enough in those first two weeks of October to keep the RNA-dissolving enzymes at bay. She checked the weather records. The average temperature in early October was minus five degrees Celsius. Duncan had her bodies, and she knew where to find them.
IV--Wax Museum
The Spanish-flu virus has been glimpsed just once, and that was in a scrap of lung tissue found two years ago in the National Tissue Repository, a division of the Armed Forces Institute of Pathology. The repository is in an annex of the Walter Reed Army Medical Center, in Maryland, just across the District of Columbia line, in a windowless corrugated-steel building behind a former elementary school. At the side are a parking lot and a loading dock, and there is an ill-kempt lawn out front. It looks like an industrial warehouse. Inside, there are three rooms, the largest of which is filled with rows of tall metal shelves, all stacked high with small brown cardboard boxes. Inside each of those boxes are pieces of human tissue about the size of a fingernail which have been preserved in formaldehyde and encased in a block of transparent paraffin wax. The repository holds more than two and a half million samples--some pressed between glass slides, some in boxes, some fully preserved organs--from autopsies on soldiers dating back to before the First World War. It's the world's largest library of death.
The supervisor of the repository is Al Riddick, a powerfully built black man in his mid-forties with a bald head, a gold chain, and glasses. When I toured the repository in late summer, Riddick took me to the back of the main room and pulled out a cardboard box from one of the shelves. Inside it were seventeen wax blocks, measuring roughly an inch by an inch by half an inch. "This is from a 1958 autopsy," he said. He picked up one of the blocks, tilting it so that I could see a speckled, bright-orange sliver of tissue embedded in the wax. "That's a brain block right there," he said. Then he picked out another block, this one encasing a dark-reddish pockmarked rectangle that looked like a dried scab. "I would say that's liver."
Next, we walked into an adjacent room, where the Army keeps its collection of organs. On a lab bench was a plastic Ziploc bag with a heavy-looking, linen-wrapped object inside. "That's a large surgical case," Riddick said. "Could be a breast. Could be a lung. It's big. Looks like a lung." He picked it up in his hands, and began to knead the package delicately, as one might check a mango for ripeness. "No, there's some bone in there." He was as matter-of-fact as Peter Lewin had been in describing how to use a hole saw on a frozen corpse. I asked Riddick whether he was ever spooked, working in roomfuls of human parts. He shook his head. "Son," he said. "I'm a Vietnam vet. It's the people who move that bother me."
In March of 1995, Jeffery Taubenberger, who heads the institute's division of molecular pathology, called over to the repository to see whether it had any tissue samples from Spanish-flu victims. Taubenberger is not a "flu man," meaning that he is not one of the small circle of scientists who have devoted their lives to influenza research. But he is one of the world's experts in the arcane art of recovering genetic information from preserved tissue samples, and it occurred to him that he stood as good a chance of finding the Spanish flu as the scientists looking for frozen bodies. The archivists told Taubenberger that they had a hundred and twenty autopsy samples of flu victims. Some, though, were just microscopic slices of tissue between glass slides, and they didn't give him enough material to work with. Taubenberger wanted wax blocks, which reduced his choices to seventy. Taubenberger and Ann Reid, a technician who worked with him on the project, randomly selected the medical records of thirty of those seventy cases. Of those, in turn, they rejected all the soldiers whose disease had not progressed rapidly, on the theory that those victims were less likely to have had the virus in their lungs when they died. That left them with seven cases. They were ready to begin.
Taubenberger and Reid started by taking lung samples from all seven and slicing off a microscopically thin sliver from the end of each block. "You take that slice and put it in a test tube and get rid of the wax," Taubenberger explained. "And you take that tissue and spin it really fast, so it all goes to the bottom of the tube. You digest it in chemicals to chew up the membranes and the proteins, you go through a series of chemical purifications, and what you end up with is something highly enriched with the RNA." Over an entire year, Taubenberger, Reid, and other members of their team worked to perfect a method of genetic analysis that could isolate the right material and stretch the tiny pieces of tissue they had as far as possible. Given the fragility of RNA, it was not an easy task. No one had ever recovered RNA from a sample so old. Early last year, they began testing the seven samples. One turned up positive.
Taubenberger is wiry and intense, with thick dark-brown hair and a patient, precise manner. He speaks in complete sentences and strings them together in complete paragraphs, until he has made even the most abstruse point crystal clear. I met him and Ann Reid in his office at the institute, a squat, five-story concrete bunker originally built as a nuclear shelter for President Eisenhower. The building has no windows, only a battered concrete doorway, and the walls inside are covered with tiles of a disorienting government-issue yellow. Taubenberger, Reid, and I sat in a circle, and the room was so small and cluttered that our feet were nearly touching. "He was a twenty-one-year-old army private," Taubenberger said. "We know he died in South Carolina, at Fort Jackson. I believe he was from the state of New York. He had no prior medical history. He got sick at the height of the pandemic at Fort Jackson and had a fast downhill course. He presented with massive pneumonia and died six days later, on September 26th, at six-thirty in the morning. His autopsy was performed around noon."
V--Viral Sex
One would think that, with the soldier's sample in hand, many of the questions that surround the Spanish flu could be answered. In a certain sense, that's true. Taubenberger and Reid have so far decoded about fifteen per cent of the genes in the soldier's virus, and their work has made possible a few preliminary conclusions about the Spanish flu. It had already been hypothesized, for example, that the Spanish flu originated--at least, in part--with a bird, probably a wild duck. Waterfowl are what virologists call the "reservoir" for influenza. They carry most of the known subtypes of influenza--without apparent ill effect--and excrete them all in their feces, thereby spreading them through land and water to the rest of the animal kingdom. All animals that get the flu--horses, ferrets, seals, pigs, among others--and human beings probably get it originally from birds.
"At this time of the year in Canada, if you look at the wild ducks that are about to migrate south before the winter, around thirty per cent probably have the flu," I was told by Robert Webster, a leading flu expert at St. Jude Children's Research Hospital, in Memphis. "They're popping it out in the water. If you sampled the lakes in Canada, you'd find all kinds of avian influenza." At some point prior to the spring of 1918, then, a flu-carrying duck must have shed feces while flying over or nesting in some inhabited part of the world. If, in fact, the pandemic started in the place where the first case was reported--Camp Funston--the precipitating event was probably somewhere in or around Kansas.
That bird virus probably didn't directly infect a human being, though, because human beings generally can't catch flu directly from birds. Viruses are particular in that way. A virus infects and takes over a cell by latching onto what is called a receptor, but--as far as we know--there isn't a receptor for avian flu in human beings. So how did the 1918 virus get from ducks to people? One possibility, according to Taubenberger's analysis, is through pigs--one of the genes he studied looks like classic swine flu. This makes sense, because pigs, uniquely, have both human and avian flu receptors; they're the perfect bridge between species. So perhaps the flu-contaminated duck feces dropped into a barnyard, whereupon a pig became infected while nosing in the dirt and passed the virus on to a farmer.
It's not quite as simple as that, though, since another of the flu genes analyzed by Taubenberger looks very much as if it came from human flu. This wasn't just bird flu passed on by a pig, in other words. This could well have been bird, pig, and human flu that somehow got mixed up together. The pig must have already been infected with one flu when it picked up the other: what it passed on to the farmer was a hybrid.
This is not as far-fetched as it sounds. A flu virus consists of eight gene segments that are so loosely bundled that they are like pieces of a jigsaw puzzle thrown together in a bag. If a pig got infected with avian and human flu simultaneously, the eight jigsaw pieces from the duck and the eight jigsaw pieces from the human being would be thrown together, and an entirely new puzzle could emerge.
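The combinatorics are easy to make concrete. In the sketch below, the segment names are the standard influenza gene segments (included only to make the count tangible); the point is simply that eight segments drawn from two parents allow 2^8 = 256 possible mixes:

from itertools import product

SEGMENTS = ["PB2", "PB1", "PA", "HA", "NP", "NA", "M", "NS"]  # the eight influenza gene segments

# Every way of choosing each segment from either the duck virus or the human virus
mixes = list(product(["duck", "human"], repeat=len(SEGMENTS)))
print(len(mixes))      # 256 possible segment combinations
print(len(mixes) - 2)  # 254 of them are new hybrids (excluding the two unchanged parents)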
Some scientists call this process of two viruses combining "viral sex," which is an apt term, because, as in human reproduction, offspring split the genetic inheritance of mother and father. According to many influenza experts, this flukish interaction of separate species is probably how almost all the pandemic strains that periodically sweep the world first arise. The Hong Kong flu, for example, consisted of seven genes from an everyday human virus and one gene from a duck that combined inside a pig to create a nasty new hybrid. The Asian flu resulted from the same kind of reassortment.
Taubenberger couldn't tell from his sample, though, what everyone really wants to know, which is what made the Spanish flu so devastating. It is possible to look at the flu strains that have proved deadly in domestic poultry and to explain their lethality almost entirely by pointing to an insertion mutation in one of the genes--a curious genetic glitch that allows the virus to attack almost any cell. One idea had been that the Spanish flu shared this same mutation. But Taubenberger showed that this wasn't the case. The relevant segment of the soldier's virus showed no such anomalies, and that meant that the secret of the Spanish flu's lethality is probably somewhere else. Perhaps it lies in one of the genes that Taubenberger and Reid haven't looked at yet. Or perhaps it's not one mutation at all, but several, all combining in some subtle way.
"One thing to keep in mind is whether the virus that Taubenberger has is just a precursor," Webster pointed out. "We've only got one virus so far. It might have been early in the pandemic's evolution. Have we yet looked at the nasty bastard? We need more than one virus. One's not enough." With an earlier or later sample, Taubenberger could see what specific changes the virus made over the summer to become a killer. Just as good would be to find a strain from another part of the world which might have had a slightly different evolution, so that Taubenberger could eliminate the differences, and focus only on what the viruses had in common. But finding that second virus has proved difficult, since only the United States Army seems to have been so assiduous in hanging on to autopsy samples from the First World War. One famous pathology archive in Germany was destroyed in the Second World War. A handful of samples found in England have yet to turn up anything. Taubenberger has put out feelers to Spain and Italy, and found nothing. I asked him about Russia, since the Russians were also pioneers in medical record-keeping, but at that he and Reid burst out laughing. "There is an epidemiologist at the Centers for Disease Control who is Russian, and I spent some time talking to him about this when I was down there," Taubenberger said. At this point, he dropped his voice an octave, imitating a thick Russian accent, and said, "October, 1918. Very bad time for Russia. Very bad time."
This spring, Taubenberger met Duncan at a conference on the Spanish flu at the Centers for Disease Control, in Atlanta, and agreed to join her team. His lab will analyze whatever frozen samples she collects. The best hope for another copy of the Spanish flu, a second copy that will help make sense of the first, may well be lying in the permafrost of Longyearbyen.
VI--Drift and Shift
Every year, early in the winter, the Food and Drug Administration hosts what some call the Flu Meeting, to insure that if the Spanish flu ever happened again we would not be unprepared. This year, the meeting took place on January 30th in the Versailles Ballroom of the Holiday Inn in Bethesda, Maryland, beginning at eight in the morning and ending at four. At the front of the auditorium, twenty or so medical experts sat behind a long table. Off to the side was a lectern, where throughout the day officials from the Centers for Disease Control and the World Health Organization gave presentations. The audience was large--well over a hundred--and included public-health officials from around the world, and vaccine manufacturers eager to get guidance from the government about what kinds of flu strains to put in the upcoming fall flu shot. Video cameras recorded the proceedings for those who couldn't attend. Of the dozens of daylong conferences that the F.D.A. hosts every year, none are as important.
The first two speakers at this year's meeting were from the surveillance section of the C.D.C.'s flu division--the eyes and ears of the flu world. Flu surveillance is critical because the flu virus comes in so many shapes and varieties. All flu viruses wear a kind of protective coat--an outer covering made up of two proteins known as hemagglutinin (h) and neuraminidase (n). That's how you can tell a flu virus under a microscope. But there are at least fifteen varieties of h and nine varieties of n, and any one of the former can combine with any one of the latter to create a different virus family. For the past twenty years, the world has been dominated by two of these flu families--the descendants of the Hong Kong flu of 1968 and the Russian flu of 1977--and every year each of them spawns dozens of offspring: genetic variants that result as individual viruses spread from person to person and change to stay one step ahead of the human immune system. Whenever a new offspring emerges, virologists say the virus has "drifted." At the same time, there is always the possibility that another avian strain will get mixed up with a human strain inside a pig and an entirely new family will emerge. If that were to happen, virologists would say the virus had "shifted."
It's this constant drifting and shifting that makes the flu so dangerous. If the flu stayed the same each year, you could be vaccinated against it the way you can be vaccinated against polio--for life. But, since the flu is always changing, the World Health Organization has had to set up a far-flung international surveillance network. Every day, in Moscow or Berlin or Iowa City or some distant Chinese province, doctors take nose and throat samples from flu sufferers, pack them in plastic vials, and send them to laboratories to be tested. The labs send isolates of the most interesting cases to the C.D.C. or to one of three other national labs working with W.H.O., in Tokyo, Melbourne, and London, for complete analysis, from which virus family trees are constructed.
Every known subtype of h and n has been identified and numbered, and every known strain has been labelled as well, with the city or the place-name where it was first isolated. If you got the flu last winter, for example, chances are you came down with h3n2 A/Wuhan/359/95; that is, a virus with No. 3 hemagglutinin, No. 2 neuraminidase, which was the three-hundred-and-fifty-ninth sample isolated from the Wuhan area of China in 1995. (The Wuhans were very big last year.) If you got the flu two years ago, on the other hand, chances are that you came down with something very similar to h3n2 A/Johannesburg/33/94.
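The naming scheme is regular enough to parse mechanically. The sketch below is only an illustration of the convention described above--type, place of isolation, isolate number, year, plus the H-and-N subtype, written here in its conventional capitalization; the exact string format is an assumption, not an official specification:

import re

# Hypothetical parser for strain names of the form "A/Wuhan/359/95 (H3N2)":
# type, place of isolation, isolate number, year, and the H/N subtype.
STRAIN = re.compile(
    r"(?P<type>[AB])/(?P<place>[^/]+)/(?P<isolate>\d+)/(?P<year>\d{2,4})\s*\((?P<subtype>H\d+N\d+)\)"
)

def parse_strain(name):
    match = STRAIN.fullmatch(name.strip())
    if match is None:
        raise ValueError("unrecognized strain name: " + name)
    return match.groupdict()

print(parse_strain("A/Wuhan/359/95 (H3N2)"))
# {'type': 'A', 'place': 'Wuhan', 'isolate': '359', 'year': '95', 'subtype': 'H3N2'}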
At the Flu Meeting, the C.D.C. presented a road map of where the virus had travelled, and what forms it had taken during the previous year. It's a fantastically detailed account, in which the flu virus comes across as a malevolent hitchhiker, stopping only to infect the locals before moving on. "February, there was a ship outbreak, the U.S.S. Arkansas, so severe that they brought the ship back into port," Helen Regnery, the chief of the C.D.C.'s surveillance section, told the meeting as she explained the travels of the h3n2 strain throughout America last year. "The people on board the ship, almost one hundred per cent, were ill, with varying degrees of severity of illness." On an overhead projector was a list of all the known offspring of the American h3n2 family, and Regnery pointed to another strain. "The Alaska/02 was an isolate in July. It is from a sporadic case and it has been sequenced, and will be on the sequencing tree. Hawaii in July had a nursing-home outbreak and increased activities.... Wisconsin, at a university, had an outbreak in September. New York/43 is from an H.I.V.-positive patient in November." New Jersey followed, and then Indiana and Texas.
Regnery's road map was intended to give the F.D.A. and vaccine makers a guide to the upcoming flu season. Vaccines consist, essentially, of a dose of virus that has been chemically deactivated, so that it will stimulate the immune system without causing disease. She was helping them to decide what virus strain to use. But if you want to inoculate a hundred million people you've got to grow enough virus to make a hundred million flu shots, and that takes time. Drug companies grow the virus in chicken eggs, injecting a microscopic droplet of flu virus into the air sac above the embryo and the yolk. There, in the nutrient-rich membrane of the sac, the virus grows until, after two or three days, the original droplet has become a tablespoonful. At that point, the tops of the eggs are lopped off and the virus is suctioned out. Mary Ritchey, an executive at Wyeth-Ayerst, one of the nation's biggest flu-vaccine makers, told me that her company might use a hundred and fifty thousand eggs at a time, from which it might harvest two hundred and fifty gallons of pure virus. To supply the entire country with enough virus, vaccine makers have to do dozens of those batches, totalling millions of eggs. Then they have to purify the virus, test it, run it by the F.D.A., and then have it packaged, labelled, and sent to clinics around the country--all of which takes at least six months.
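The scale of the egg-based process can be worked out from the figures Ritchey gives. The batch count below is a hypothetical reading of "dozens"; everything else comes from the paragraph above:

eggs_per_batch = 150_000        # one production run at Wyeth-Ayerst, as quoted above
gallons_per_batch = 250         # pure virus harvested from that run
print(eggs_per_batch / gallons_per_batch)  # 600.0 -- roughly six hundred eggs per gallon of virus

batches = 30                    # a hypothetical reading of "dozens of those batches"
print(batches * eggs_per_batch)       # 4,500,000 eggs -- consistent with "millions of eggs"
print(batches * gallons_per_batch)    # 7,500 gallons of pure virus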
If the drug companies are going to have a flu shot ready for the fall flu season, then, they have to be told what strains to use by February or March. That means that the W.H.O.'s international surveillance teams have to guess what's going to happen in the fall based on what they have seen the previous winter. There was a time, ten or twenty years ago, when this process was notoriously inexact: a flu shot might be prepared in the summer that offered only marginal protection against the flu strains that surfaced the following fall. With an improved surveillance system and more sophisticated genetic analysis, though, that has now changed. Every year, the C.D.C. gives itself a grade based on how closely the guesses made at the Flu Meeting correlate with the actual flu in the fall. For the last four years, those grades have been perfect.
If something like the Spanish flu ever came back, this is the system we are relying on to protect us. Right now, virtually all the flu in the human population is either h1n1 or h3n2, so the road map presented by the C.D.C. at the Flu Meeting was almost entirely an account of genetic drift within those two families. The minute that the C.D.C. or a W.H.O. laboratory received a flu that didn't fall into the h1n1 or h3n2 families, it would sound the alarm. The surveillance system is also specifically focussed on those parts of the world where flu is prevalent and the inter-species movement that creates pandemic strains is more likely to occur. That means China, where there are as many ducks as people, and where pigs are often raised on farms in close proximity to wild and domestic poultry. China has been the source of the last two pandemics, and most observers think it likely that the next will be from there as well, possibly arising out of the marshy resting sites for ducks both along the nation's eastern seaboard and inland in an arc extending from Gansu Province to Guangxi, on the southern coast. Over the past few years, the Centers for Disease Control has funded ten flu laboratories in China. The number of strains sent to the C.D.C. from China every year has now reached two hundred, up from about a dozen several years ago.
Perhaps more important, flu-watchers have a sense of when to be on the lookout for new and vicious flu strains, because any kind of major social upheaval can serve as a pandemic breeding ground. This is probably what happened with the Spanish flu: the 1918 virus was the result of a shift to h1n1. But that alone doesn't explain its lethality. In the 1957 Asian-flu epidemic, h1n1 shifted to h2n2, and not nearly as many people died. The difference was probably the First World War.
As the Amherst College biologist Paul Ewald argues in his brilliant 1994 book, "Evolution of Infectious Disease," under normal circumstances the mildest offspring of any flu family will always triumph, because people who are infected with the worst strains go home and go to bed, whereas people infected with the mild strains go to work, ride the bus, and go to the movies. You're much more likely, in other words, to catch a mild virus than a nasty virus because you're more likely to run into someone with a mild case of flu than with a nasty case of flu. In 1918, Ewald says, these rules got inverted by the war. The Spanish flu turned nasty in the late summer in France. A mild strain of flu spreading from soldier to soldier in the trenches stayed in the trenches because none of the soldiers got so sick that they had to leave their posts. A debilitating strain, though, resulted in a soldier's being shipped out in a crowded troop transport, then moved to an even more crowded hospital, where he had every opportunity to infect others. Wars and refugee camps and urban overcrowding give the worst flu strains a huge evolutionary advantage. If there were ever again a civil war in China, flu-watchers would be on full alert.
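Ewald's point can be illustrated with a toy calculation--this is a sketch of the logic, not his model, and every number in it is invented: a strain's spread scales with how many people an infected person encounters while still circulating.

def contacts_while_infectious(contacts_per_day):
    # total people an infected person encounters over the course of the illness
    return sum(contacts_per_day)

# Peacetime: the mildly ill keep circulating; the severely ill go home to bed after day one
mild_peacetime   = contacts_while_infectious([20, 20, 20, 20, 20])
severe_peacetime = contacts_while_infectious([20, 2, 2, 2, 2])

# 1918, per Ewald's account: a severely ill soldier is shipped out on a packed transport
# and moved to a crowded hospital, so his contacts go up rather than down
mild_trenches    = contacts_while_infectious([10, 10, 10, 10, 10])
severe_evacuated = contacts_while_infectious([10, 60, 60, 40, 40])

print(mild_peacetime, severe_peacetime)   # 100 28  -- the mild strain spreads more widely
print(mild_trenches, severe_evacuated)    # 50 210  -- in wartime the rules get inverted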
It doesn't take much, however, to see that our pandemic preparedness is not foolproof. What if, for example, the new strain emerges not in the spring but in midsummer? How, under those circumstances, could a vaccine be made in time for the fall, which is when--for reasons that are unclear--any flu in temperate zones tends to strike in earnest? And what if it didn't emerge from China, where there is good surveillance, but from Africa, say, where neither the C.D.C. nor any other W.H.O. center has the infrastructure to monitor flu strains? Most troubling, though, is that knowing a virus's type and source doesn't tell you nearly enough.
On May 10th of this year, for example, a three-year-old boy from Hong Kong's New Territories came down with the flu. He died on May 21st with what looked like an unusual and severe case of viral pneumonia, compounded by Reye's syndrome. A routine respiratory sample was taken from the boy's body and analyzed at Hong Kong's Queen Mary Hospital. It didn't seem to be either h1n1 or h3n2, though, and the doctor, puzzled, forwarded the sample to the Centers for Disease Control in early July, and also to other flu labs in the Netherlands and London. "The doctor said she had a virus that was reacting differently with the reagents," the C.D.C.'s Helen Regnery told me. "This happens sometimes, and all viruses that don't behave well are sent here right away. But at the time the kind of reaction was such that we thought it might be a human strain. Then we got another batch from Hong Kong reacting the same way. Now it was a red flag: she'd identified two others. She E-mailed me, asking if we could confirm that particular isolate." This was on the first Friday in August. The lab staff worked through the weekend. On Monday, Regnery got a call from researchers at the flu lab in the Netherlands. They had identified the boy's virus. It was a pure avian flu, one that had never been seen in humans before: h5n1. "He may have had direct contact with chickens that were sick," Regnery said. "They had chickens at his preschool. We found this out in a conference call the other night. The epidemiologists we have over there tracked down the day-care center, and they found some sick chickens."
When I visited the C.D.C. recently, Regnery and other senior C.D.C. officials I spoke with were careful not to be alarmist when they discussed the case. The two subsequent isolates sent from Hong Kong turned out to be human flu, they told me, and none of the other children at the preschool got sick. Some members of the boy's family had mild respiratory ailments immediately before his death, but none appeared to have caught the same flu. A C.D.C. team of three epidemiologists and a virologist recently returned from a three-week trip to Hong Kong and mainland China to help the local health authorities investigate and collect serum samples to see if anyone else was exposed, but so far nothing has come up. "For there to be a pandemic, there has to be a strain to which all or most of the population has no immunity, and that is capable of spreading from person to person," Nancy Arden, a senior epidemiologist at the C.D.C., told me. "So far, this doesn't meet the second criterion."
Nonetheless, the situation was a little disturbing. Ducks fly across virtually every continent in the world, dive-bombing the landmass with flu virus. They pass it to chickens and chickens come in close contact with humans every day--on farms, in poultry markets, in chicken-processing plants. If avian flu can't infect humans, of course, this is irrelevant. But what if the Hong Kong case means that there may now be strains of avian flu which can infect humans directly? Arden said, "There's still a question of whether avian strains may be evolving to the point where they can replicate in humans and where a strain could be transmitted from bird to person. And, once an avian virus gets into a mammal, it's possible that the evolution would be speeded up. No one's so alarmed that they're saying this is the start of the next pandemic. But it's not something anyone would want to be complacent about."
Then, there's the type of flu the boy got. Avian h5 is famous among virologists as the strain that passed to domestic chickens in Pennsylvania in 1983, apparently from wild ducks. Originally, it was harmless. But as it raced from chicken to chicken in the giant commercial chicken warehouses it underwent an unusual mutation. Instead of just infecting the cells of the chicken's intestinal tract, the virus became systemic--capable of infecting all the cells in a chicken's body. The eyes of the chickens became swollen. The chickens had difficulty breathing. They stopped laying eggs. They became weak, and in some cases blood spots appeared in their eyes and on their legs. Upon autopsy, it turned out that they had been hemorrhaging throughout their bodies. In a matter of months, seventeen million chickens died or had to be destroyed. "It was chicken Ebola," the flu expert Robert Webster told me. The little boy's h5 doesn't seem capable of the same destruction in humans. But is there any other type of h5 that would be? And, if there is, what genetic clues in the virus would tip us off? This is the big unanswered question behind our plans for the next pandemic, and it is also, of course, the big unanswered question that drives the search for the Spanish flu. What is it, specifically, that turns influenza into a killer?
To get to Longyearbyen, you fly from Oslo for about an hour and a half to Tromsø, a small town on Norway's northern coast, and then for another ninety minutes over the Norwegian Sea to the Longyearbyen airport. The second leg of the trip retraces by air the route that the Forsete took seventy-nine years ago. It is an extraordinary journey. First, the choppy, frigid waters of the Norwegian Sea and then, out of the Arctic mist, the forbidding mountains and glaciers of Svalbard. Longyearbyen is at Svalbard's southernmost point, huddled on the fringes of the island of Spitsbergen, a small gray stain in a blanket of white. On a clear day, from the air, it seems as if you could see the North Pole.
All this, of course, is what is so strange about the Spanish flu--that after killing so many it must now be sought out at the ends of the earth. Crosby, in the final chapter of his book on the pandemic, wonders about the disappearance of the pandemic from the American memory as well. In the Readers' Guide to Periodical Literature, 1919-21, he reports, there are thirteen inches of column space devoted to citations of baseball stories, forty-seven inches devoted to Prohibition, twenty inches devoted to Bolshevism, and eight inches devoted to the flu. John Dos Passos, who crossed the Atlantic on a troopship on which soldiers were dying of the Spanish flu every day, has just one reference to the flu in his novel "1919" and a brief mention of the pandemic in his fictionalized war memoir, "Three Soldiers." The pandemic is largely absent from the writing of Fitzgerald, Faulkner, and Hemingway as well, all of whom witnessed its savagery at first hand. "The average college graduate born since 1918 literally knows more about the Black Death of the fourteenth century than the World War I pandemic," Crosby writes. He offers a number of explanations for this. In the end, though, he concludes that the virus's figurative disappearance is of a piece with its literal disappearance, that we don't remember it because we can't find it. The Longyearbyen expedition is, if nothing else, an attempt to recover our memory of the Spanish flu.
Kirsty Duncan made the trip to Svalbard for the first time last spring. Seven months earlier, she had written to the governor of Svalbard, who had, in turn, approached the Norwegian medical-research community, the church in Longyearbyen, the church council, the bishop, the town council, and the victims' families and secured approval from each. She flew to Norway in May. "I had been in Longyearbyen about a day before I went to see the minister of the church, and I was really concerned about meeting him, because of what I was asking to do," she told me. "I introduced myself, and I said, 'I hope in no way have I offended you or the church,' and he said, 'No. This is exciting, this is important work. It has to be done,' and I was so relieved, and then he asked me if I had been to the cemetery, and I said no, that I had no right to go there until I had spoken to him, and he said, 'You go.'"
The cemetery is a ten-minute walk from the church, along a gravel road that runs parallel to the mountain. You walk away from the water and the docks, and toward the glacier, and for the entire walk the cemetery is straight ahead--a lonely stand of crosses climbing up the side of the mountain. When Duncan talked about that walk from the church to the graveyard, she looked away, her eyes misting up and her voice catching with emotion. "It was May, and everything was completely ice-covered. Completely white. Longyearbyen is in a valley, and the cemetery is up on the side of the valley floor. I knew that the seven graves I was interested in were the last seven graves at the top of the cemetery, and walking up there"--she stopped for a moment--"walking up there was really hard. I was just one year older than the oldest of them, and going to look at them made me realize that they had just come of age. You think about how they were just beginning their lives. And then you see those crosses."
The Pima Paradox
February 2, 1998
ANNALS OF MEDICINE
Can we learn how to lose weight from one of
the most obese people in the world?
1.
Sacaton lies in the center of Arizona, just off Interstate 10, on the Gila River reservation of the Pima Indian tribe. It is a small town, dusty and unremarkable, which looks as if it had been blown there by a gust of desert wind. Shacks and plywood bungalows are scattered along a dirt-and-asphalt grid. Dogs crisscross the streets. Back yards are filled with rusted trucks and junk. The desert in these parts is scruffy and barren, drained of water by the rapid growth of Phoenix, just half an hour's drive to the north. The nearby Gila River is dry, and the fields of wheat and cushaw squash and tepary beans which the Pima used to cultivate are long gone. The only prepossessing building in Sacaton is a gleaming low-slung modern structure on the outskirts of town--the Hu Hu Kam Memorial Hospital. There is nothing bigger or more impressive for miles, and that is appropriate, since medicine is what has brought Sacaton any wisp of renown it has.
Thirty-five years ago, a team of National Institutes of Health researchers arrived in Sacaton to study rheumatoid arthritis. They wanted to see whether the Pima had higher or lower rates of the disease than the Blackfoot of Montana. A third of the way through their survey, however, they realized that they had stumbled on something altogether strange--a population in the grip of a plague. Two years later, the N.I.H. returned to the Gila River Indian Reservation in force. An exhaustive epidemiological expedition was launched, in which thousands of Pima were examined every two years by government scientists, their weight and height and blood pressure checked, their blood sugar monitored, and their eyes and kidneys scrutinized. In Phoenix, a modern medical center devoted to Native Americans was built; on its top floor, the N.I.H. installed a state-of-the-art research lab, including the first metabolic chamber in North America--a sealed room in which to measure the precise energy intake and expenditure of Pima research subjects. Genetic samples were taken; family histories were mapped; patterns of illness and death were traced from relative to relative and generation to generation. Today, the original study group has grown from four thousand people to seven thousand five hundred, and so many new studies have been added to the old that the total number of research papers arising from the Gila River reservation takes up almost forty feet of shelf space in the N.I.H. library in Phoenix.
The Pima are famous now--famous for being fatter than any other group in the world, with the exception only of the Nauru islanders of the West Pacific. Among those over thirty-five on the reservation, the rate of diabetes, the disease most closely associated with obesity, is fifty per cent, eight times the national average and a figure unmatched in medical history. It is not unheard of in Sacaton for adults to weigh five hundred pounds, for teen-agers to be suffering from diabetes, or for relatively young men and women to be already disabled by the disease--to be blind, to have lost a limb, to be confined to a wheelchair, or to be dependent on kidney dialysis.
When I visited the town, on a monotonously bright desert day not long ago, I watched a group of children on a playing field behind the middle school moving at what seemed to be half speed, their generous shirts and baggy jeans barely concealing their bulk. At the hospital, one of the tribe's public-health workers told me that when she began an education program on nutrition several years ago she wanted to start with second graders, to catch the children before it was too late. "We were under the delusion that kids didn't gain weight until the second grade," she said, shaking her head. "But then we realized we'd have to go younger. Those kids couldn't run around the block."
From the beginning, the N.I.H. researchers have hoped that if they can understand why the Pima are so obese they can better understand obesity in the rest of us; the assumption is that obesity in the Pima is different only in degree, not in kind. One hypothesis for the Pima's plight, favored by Eric Ravussin, of the N.I.H.'s Phoenix team, is that after generations of living in the desert the only Pima who survived famine and drought were those highly adept at storing fat in times of plenty. Under normal circumstances, this disposition was kept in check by the Pima's traditional diet: cholla-cactus buds, honey mesquite, povertyweed, and prickly pears from the desert floor; mule deer, white-winged dove, and black-tailed jackrabbit; squawfish from the Gila River; and wheat, squash, and beans grown in irrigated desert fields. By the end of the Second World War, however, the Pima had almost entirely left the land, and they began to eat like other Americans. Their traditional diet had been fifteen to twenty per cent fat. Their new diet was closer to forty per cent fat. Famine, which had long been a recurrent condition, gave way to permanent plenty, and so the Pima's "thrifty" genes, once an advantage, were now a liability. N.I.H. researchers are trying to find these genes, on the theory that they may be the same genes that contribute to obesity in the rest of us. Their studies at Sacaton have also uncovered valuable clues to how diabetes works, how obesity in pregnant women affects their children, and how human metabolism is altered by weight gain. All told, the collaboration between the N.I.H. and the Pima is one of the most fruitful relationships in modern medical science--with one fateful exception. After thirty-five years, no one has had any success helping the Pima lose weight. For all the prodding and poking, the hundreds of research papers describing their bodily processes, and the determined efforts of health workers, year after year the tribe grows fatter.
"I used to be a nurse, I used to work in the clinic, I used to be all gung ho about going out and teaching people about diabetics and obesity," Teresa Wall, who heads the tribe's public-health department, told me. "I thought that was all people needed--information. But they weren't interested. They had other issues." Wall is a Pima, short and stocky, who has long, straight black hair, worn halfway down her back. She spoke softly. "There's something missing. It's one thing to say to people, 'This is what you should do.' It's another to actually get them to take it in."
The Pima have built a new wellness center in downtown Sacaton, with a weight room and a gymnasium. They now have an education program on nutrition aimed at preschoolers and first graders, and at all tribal functions signs identify healthful food choices--a tray of vegetables or of fruit, say. They are doing, in other words, what public-health professionals are supposed to be doing. But results are hard to see.
"We've had kids who were diabetic, whose mothers had diabetes and were on dialysis and had died of kidney failure," one of the tribe's nutritionists told me. "You'd think that that would make a difference--that it would motivate them to keep their diet under control. It doesn't." She got up from her desk, walked to a bookshelf, and pulled out two bottles of Coca-Cola. One was an old glass bottle. The other was a modern plastic bottle, which towered over it. "The original Coke bottle, in the nineteen-thirties, was six and a half ounces." She held up the plastic bottle. "Now they are marketing one litre as a single serving. That's five times the original serving size. The McDonald's regular hamburger is two hundred and sixty calories, but now you've got the double cheeseburger, which is four hundred and forty-five calories. Portion sizes are getting way out of whack. Eating is not about hunger anymore. The fact that people are hungry is way down on the list of why they eat." I told her that I had come to Sacaton, the front lines of the weight battle, in order to find out what really works in fighting obesity. She looked at me and shrugged. "We're the last people who could tell you that," she said.
In the early nineteen-sixties, at about the time the N.I.H. team stumbled on the Pima, seventeen per cent of middle-aged Americans met the clinical definition of obesity. Today, that figure is 32.3 per cent. Between the early nineteen-seventies and the early nineteen-nineties, the percentage of preschool girls who were overweight went from 5.8 per cent to ten per cent. The number of Americans who fall into what epidemiologists call Class Three Obesity--that is, people too grossly overweight, say, to fit into an airline seat--has risen three hundred and fifty per cent in the past thirty years. "We've looked at trends by educational level, race, and ethnic group, we've compared smokers and non-smokers, and it's very hard to say that there is any group that is not experiencing this kind of weight gain," Katherine Flegal, a senior research epidemiologist at the National Center for Health Statistics, says. "It's all over the world. In China, the prevalence of obesity is vanishingly low, yet they are showing an increase. In Western Samoa, it is very high, and they are showing an increase." In the same period, science has unlocked many of obesity's secrets, the American public has been given a thorough education in the principles of good nutrition, health clubs have sprung up from one end of the country to another, dieting has become a religion, and health food a marketing phenomenon. None of it has mattered. It is the Pima paradox: in the fight against obesity all the things that worked in curbing behaviors like drunk driving and smoking and in encouraging things like safe sex and the use of seat belts--education, awareness, motivation--don't seem to work. For one reason or another, we cannot stop eating. "Since many people cannot lose much weight no matter how hard they try, and promptly regain whatever they do lose," the editors of The New England Journal of Medicine wearily concluded last month, "the vast amount of money spent on diet clubs, special foods and over-the-counter remedies, estimated to be on the order of $30 billion to $50 billion yearly, is wasted." Who could argue? If the Pima--who are surrounded by the immediate and tangible consequences of obesity, who have every conceivable motivation--can't stop themselves from eating their way to illness, what hope is there for the rest of us?
In the scientific literature, there is something called Gourmand Syndrome--a neurological condition caused by anterior brain lesions and characterized by an unusual passion for eating. The syndrome was described in a recent issue of the journal Neurology, and the irrational, seemingly uncontrollable obsession with food evinced by its victims seems a perfect metaphor for the irrational, apparently uncontrollable obsession with food which seems to have overtaken American society as a whole. Here is a diary entry from a Gourmand Syndrome patient, a fifty-five-year-old stroke victim who had previously displayed no more than a perfunctory interest in food.
After I could stand on my feet again, I dreamt to go downtown and sit down in this well-known restaurant. There I would get a beer, sausage, and potatoes. Slowly my diet improved again and thus did quality of life. The day after discharge, my first trip brought me to this restaurant, and here I order potato salad, sausage, and a beer. I feel wonderful. My spouse anxiously registers everything I eat and nibble. It irritates me. A few steps down the street, we enter a coffee-house. My hand is reaching for a pastry, my wife's hand reaches between. Through the window I see my bank. If I choose, I could buy all the pastry I wanted, including the whole store. The creamy pastry slips from the foil like a mermaid. I take a bite.
2.
Is there an easy way out of this problem? Every year, millions of Americans buy books outlining new approaches to nutrition and diet, nearly all of which are based on the idea that overcoming our obsession with food is really just a matter of technique: that the right foods eaten in the right combination can succeed where more traditional approaches to nutrition have failed. A cynic would say, of course, that the seemingly endless supply of these books proves their lack of efficacy, since if one of these diets actually worked there would be no need for another. But that's not quite fair. After all, the medical establishment, too, has been giving Americans nutritional advice without visible effect. We have been told that we must not take in more calories than we burn, that we cannot lose weight if we don't exercise consistently, that an excess of eggs, red meat, cheese, and fried food clogs arteries, that fresh vegetables and fruits help to ward off cancer, that fibre is good and sugar is bad and whole-wheat bread is better than white bread. That few of us are able to actually follow this advice is either our fault or the fault of the advice. Medical orthodoxy, naturally, tends toward the former position. Diet books tend toward the latter. Given how often the medical orthodoxy has been wrong in the past, that position is not, on its face, irrational. It's worth finding out whether it is true.
Arguably the most popular diet of the moment, for example, is one invented by the biotechnology entrepreneur Barry Sears. Sears's first book, "The Zone," written with Bill Lawren, sold a million and a half copies and has been translated into fourteen languages. His second book, "Mastering the Zone," was on the best-seller lists for eleven weeks. Madonna is rumored to be on the Zone diet, and so are Howard Stern and President Clinton, and if you walk into almost any major bookstore in the country right now Sears's two best-sellers--plus a new book, "Zone Perfect Meals in Minutes"--will quite likely be featured on a display table near the front. They are ambitious books, filled with technical discussions of food chemistry, metabolism, evolutionary theory, and obscure scientific studies, all apparently serving as proof of the idea that through careful management of "the most powerful and ubiquitous drug we have: food" we can enter a kind of high-efficiency, optimal metabolic state--the Zone.
The key to entering the Zone, according to Sears, is limiting your carbohydrates. When you eat carbohydrates, he writes, you stimulate the production of insulin, and insulin is a hormone that evolved to put aside excess carbohydrate calories in the form of fat in case of future famine. So the insulin that's stimulated by excess carbohydrates aggressively promotes the accumulation of body fat. In other words, when we eat too much carbohydrate, we're essentially sending a hormonal message, via insulin, to the body (actually to the adipose cells). The message: "Store fat."
His solution is a diet in which carbohydrates make up no more than forty per cent of all calories consumed (as opposed to the fifty per cent or more consumed by most Americans), with fat and protein coming to thirty per cent each. Maintaining that precise four-to-three ratio between carbohydrates and protein is, in Sears's opinion, critical for keeping insulin in check. "The Zone" includes all kinds of complicated instructions to help readers figure out how to do things like calculate their precise protein requirements in restaurants. ("Start with the protein, using the palm of your hand as a guide. The amount of protein that can fit into your palm is usually four protein blocks. That's about one chicken breast or 4 ounces sliced turkey.")
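To make the arithmetic behind that four-to-three ratio concrete, here is a minimal sketch, not taken from Sears's books, that converts a daily calorie budget into the 40/30/30 split described above. The 2,000-calorie figure is an assumption chosen purely for illustration, and the conversion factors are the standard ones of four calories per gram of carbohydrate or protein and nine per gram of fat.

```python
# Illustrative arithmetic for the 40/30/30 "Zone" split described above.
# Assumptions (not from Sears): a 2,000-calorie day, and the standard
# factors of 4 kcal per gram of carbohydrate or protein, 9 kcal per gram of fat.

CALORIES_PER_GRAM = {"carbohydrate": 4, "protein": 4, "fat": 9}
ZONE_SPLIT = {"carbohydrate": 0.40, "protein": 0.30, "fat": 0.30}

def zone_grams(daily_calories):
    """Return grams of each macronutrient under the 40/30/30 split."""
    return {
        nutrient: round(daily_calories * share / CALORIES_PER_GRAM[nutrient])
        for nutrient, share in ZONE_SPLIT.items()
    }

print(zone_grams(2000))
# {'carbohydrate': 200, 'protein': 150, 'fat': 67}
# Note the four-to-three carbohydrate-to-protein ratio: 800 calories of
# carbohydrate to 600 of protein, or 200 grams to 150 grams.
```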
It should be said that the kind of diet Sears suggests is perfectly nutritious. Following the Zone diet, you'll eat lots of fibre, fresh fruit, fresh vegetables, and fish, and very little red meat. Good nutrition, though, isn't really the point. Sears's argument is that being in the Zone can induce permanent weight loss--that by controlling carbohydrates and the production of insulin you can break your obsession with food and fundamentally alter the way your body works. "Weight loss . . . can be an ongoing and usually frustrating struggle for most people," he writes. "In the Zone it is painless, almost automatic."
Does the Zone exist? Yes and no. Certainly, if people start eating a more healthful diet they'll feel better about themselves. But the idea that there is something magical about keeping insulin within a specific range is a little strange. Insulin is simply a hormone that regulates the storage of energy. Precisely how much insulin you need to store carbohydrates is dependent on all kinds of things, including how fit you are and whether, like many diabetics, you have a genetic predisposition toward insulin resistance. Generally speaking, the heavier and more out of shape you are, the more insulin your body needs to do its job. The Pima have a problem with obesity and that makes their problem with diabetes worse--not the other way around. High levels of insulin are the result of obesity. They aren't the cause of obesity. When I read the insulin section of "The Zone" to Gerald Reaven, an emeritus professor of medicine at Stanford University, who is acknowledged to be the country's leading insulin expert, I could hear him grinding his teeth. "I had the experience of being on a panel discussion with Sears, and I couldn't believe the stuff that comes out of this guy's mouth," he said. "I think he's full of it."
What Sears would have us believe is that when it comes to weight loss your body treats some kinds of calories differently from others--that the combination of the food we eat is more critical than the amount. To this end, he cites what he calls an "amazing" and "landmark" study published in 1956 in the British medical journal Lancet. (It should be a tipoff that the best corroborating research he can come up with here is more than forty years old.) In the study, a couple of researchers compared the effects of two different thousand-calorie diets--the first high in fat and protein and low in carbohydrates, and the second low in fat and protein and high in carbohydrates--on two groups of obese men. After eight to ten days, the men on the low-carbohydrate diet had lost more weight than the men on the high-carbohydrate diet. Sears concludes from the study that if you want to lose weight you should eat protein and shun carbohydrates. Actually, it shows nothing of the sort. Carbohydrates promote water retention; protein acts like a diuretic. Over a week or so, someone on a high-protein diet will always look better than someone on a high-carbohydrate diet, simply because of dehydration. When a similar study was conducted several years later, researchers found that after about three weeks--when the effects of dehydration had evened out--the weight loss on the two diets was virtually identical. The key isn't how you eat, in other words; it's how much you eat. Calories, not carbohydrates, are still what matters. The dirty little secret of the Zone system is that, despite Sears's expostulations about insulin, all he has done is come up with another low-calorie diet. He doesn't do the math for his readers, but some nutritionists have calculated that if you follow Sears's prescriptions religiously you'll take in at most seventeen hundred calories a day, and at seventeen hundred calories a day virtually anyone can lose weight. The problem with low-calorie diets, of course, is that no one can stay on them for very long. Just ask Sears. "Diets based on choice restriction and calorie limits usually fail," he writes in the second chapter of "The Zone," just as he is about to present his own choice-restricted and calorie-limited diet. "People on restrictive diets get tired of feeling hungry and deprived. They go off their diets, put the weight back on (primarily, as increased body fat) and then feel bad about themselves for not having enough will power, discipline, or motivation."
These are not, however, the kinds of contradiction that seem to bother Sears. His first book's dust jacket claims that in the Zone you can "reset your genetic code" and "burn more fat watching TV than by exercising." By the time he's finished, Sears has held up his diet as the answer to virtually every medical ill facing Western society, from heart disease to cancer and on to alcoholism and PMS. He writes, "Dr. Paul Kahl, the same physician with whom I did the aids pilot study"--yes, Sears's diet is just the thing for aids, too--"told me the story of one of his patients, a fifty-year-old woman with MS."
Paul put her on a Zone-favorable diet, and after a few months on the program she came in for a checkup. Paul asked the basic question: "How are you feeling?" Her answer was "Great!" Noticing that she was still using a cane for stability, Paul asked her, "If you're feeling so great, why are you still using the cane?" Her only response was that since developing MS she always had. Paul took the cane away and told her to walk to the end of the hallway and back. After a few tentative steps, she made the round trip quickly. When Paul asked her if she wanted her cane back, she just smiled and told him to keep it for someone who really needed it.
Put down your carbohydrates and walk!
It is hard, while reading this kind of thing, to escape the conclusion that what is said in a diet book somehow matters less than how it's said. Sears, after all, isn't the only diet specialist who seems to be making things up. They all seem to be making things up. But if you read a large number of popular diet books in succession, what is striking is that they all seem to be making things up in precisely the same way. It is as if the diet-book genre had an unspoken set of narrative rules and conventions, and all that matters is how skillfully those rules and conventions are adhered to. Sears, for example, begins fearful and despondent, his father dead of a heart attack at fifty-three, a "sword of Damocles" over his head. Judy Moscovitz, author of "The Rice Diet Report" (three months on the Times best-seller list), tells us, "I was always the fattest kid in the class, and I knew all the pain that only a fat kid can know.... I was always the last one reluctantly chosen for the teams." Martin Katahn, in his best-seller "The Rotation Diet," writes, "I was one of those fat kids who had no memory of ever being thin. Instead, I have memories such as not being able to run fast enough to keep up with my playmates, being chosen last for all games that required physical movement."
Out of that darkness comes light: the Eureka Moment, when the author explains how he stumbled on the radical truth that inspired his diet. Sears found himself in the library of the Boston University School of Medicine, reading everything he could on the subject: "I had no preconceptions, no base of knowledge to work from, so I read everything. I eventually came across an obscure report..." Rachael Heller, who was a co-author of the best-selling "The Carbohydrate Addict's Diet" (and, incidentally, so fat growing up that she was "always the last one picked for the team"), was at home in bed when her doctor called, postponing her appointment and thereby setting in motion an extraordinary chain of events that involved veal parmigiana, a Greek salad, and two French crullers: "I will always be grateful for that particular arrangement of circumstances.... Sometimes we are fortunate enough to recognize and take advantage of them, sometimes not. This time I did. I believe it saved my life." Harvey Diamond, the co-author of the three-million-copy-selling "Fit for Life," was at a music festival two thousand miles from home, when he happened to overhear two people in front of him discussing the theories of a friend in Santa Barbara: "'Excuse me,' I interrupted, 'who is this fellow you are discussing?' In less than twenty-four hours I was on my way to Santa Barbara. Little did I know that I was on the brink of one of the most remarkable discoveries of my life."
The Eureka Moment is followed, typically within a few pages, by the Patent Claim--the point at which the author shows why his Eureka Moment, which explains how weight can be lost without sacrifice, is different from the Eureka Moment of all those other diet books explaining how weight can be lost without sacrifice. This is harder than it appears. Dieters are actually attracted to the idea of discipline, because they attribute their condition to a failure of discipline. It's just that they know themselves well enough to realize that if a diet requires discipline they won't be able to follow it. At the same time, of course, even as the dieter realizes that what he is looking for--discipline without the discipline--has never been possible, he still clings to the hope that someday it might be. The Patent Claim must negotiate both paradoxes. Here is Sears, in his deft six-paragraph Patent Claim: "These are not unique claims. The proponents of every new diet that comes along say essentially the same thing. But if you're reading this book, you probably know that these diets don't really work." Why don't they work? Because they "violate the basic biochemical laws required to enter the Zone." Other diets don't have discipline. The Zone does. Yet, he adds, "The beauty of the dietary system presented in this book is that . . . it doesn't call for a great deal of the kind of unrealistic self-sacrifice that causes many people to fall off the diet wagon. . . . In fact, I can even show you how to stay within these dietary guidelines while eating at fast-food restaurants." It is the very discipline of the Zone system that allows its adherent to lose weight without discipline.
Or consider this from Adele Puhn's recent runaway best-seller, "The 5-Day Miracle Diet." America's No. 1 diet myth, she writes, is that "you have to deprive yourself to lose weight":
Even though countless diet programs have said you can have your cake and eat it, too, in your heart of hearts, you have that "nibbling" doubt: For a diet to really work, you have to sacrifice. I know. I bought into this myth for a long time myself. And the fact is that on every other diet, deprivation is involved. Motivation can only take you so far. Eventually you're going to grab for that extra piece of cake, that box of cookies, that cheeseburger and fries. But not the 5-Day Miracle Diet.
Let us pause and savor the five-hundred-and-forty-degree rhetorical triple gainer taken in those few sentences: (1) the idea that diet involves sacrifice is a myth; (2) all diets, to be sure, say that on their diets dieting without sacrifice is not a myth; (3) but you believe that dieting without sacrifice is a myth; (4) and I, too, believed that dieting without sacrifice is a myth; (5) because in fact on all diets dieting without sacrifice is a myth; (6) except on my diet, where dieting without sacrifice is not a myth.
The expository sequence that these books follow--last one picked, moment of enlightenment, assertion of the one true way--finally amounts to nothing less than a conversion narrative. In conception and execution, diet books are self-consciously theological. (Whom did Harvey Diamond meet after his impulsive, desperate mission to Santa Barbara? A man he will only identify, pseudonymously and mysteriously, as Mr. Jensen, an ethereal figure with "clear eyes, radiant skin, serene demeanor and well-proportioned body.") It is the appropriation of this religious narrative that permits the suspension of disbelief.
There is a more general explanation for all this in the psychological literature--a phenomenon that might be called the Photocopier Effect, after the experiments of the Harvard social scientist Ellen Langer. Langer examined the apparently common-sense idea that if you are trying to persuade someone to do something for you, you are always better off if you provide a reason. She went up to a group of people waiting in line to use a library copying machine and said, "Excuse me, I have five pages. May I use the Xerox machine?" Sixty per cent said yes. Then she repeated the experiment on another group, except that she changed her request to "Excuse me, I have five pages. May I use the Xerox machine, because I'm in a rush?" Ninety-four per cent said yes. This much sounds like common sense: if you say, "because I'm in a rush"--if you explain your need--people are willing to step aside. But here's where the study gets interesting. Langer then did the experiment a third time, in this case replacing the specific reason with a statement of the obvious: "Excuse me, I have five pages. May I use the Xerox machine, because I have to make some copies?" The percentage who let her do so this time was almost exactly the same as the one in the previous round--ninety-three per cent. The key to getting people to say yes, in other words, wasn't the explanation "because I'm in a rush" but merely the use of the word "because." What mattered wasn't the substance of the explanation but merely the rhetorical form--the conjunctional footprint--of an explanation.
Isn't this how diet books work? Consider the following paragraph, taken at random from "The Zone":
In paracrine hormonal responses, the hormone travels only a very short distance from a secreting cell to a target cell. Because of the short distance between the secreting cell and the target cell, paracrine responses don't need the long-distance capabilities of the bloodstream. Instead, they use the body's version of a regional system: the paracrine system. Finally, there are the autocrine hormone systems, analogous to the cord that links the handset of the phone to the phone itself. Here the secreting cells release a hormone that comes immediately back to affect the secreting cell itself.
Don't worry if you can't follow what Sears is talking about here--following isn't really the point. It is enough that he is using the word "because."
3.
If there is any book that defines the diet genre, however, it is "Dr. Atkins' New Diet Revolution." Here is the conversion narrative at its finest. Dr. Atkins, a humble corporate physician, is fat. ("I had three chins.") He begins searching for answers. ("One evening I read about the work that Dr. Garfield Duncan had done in nutrition at the University of Pennsylvania. Fasting patients, he reported, lose all sense of hunger after forty-eight hours without food. That stunned me. . . . That defied logic.") He tests his unorthodox views on himself. As if by magic, he loses weight. He tests his unorthodox views on a group of executives at A.T. & T. As if by magic, they lose weight. Incredibly, he has come up with a diet that "produces steady weight loss" while setting "no limit on the amount of food you can eat." In 1972, inspired by his vision, he puts pen to paper. The result is "Dr. Atkins' Diet Revolution," one of the fifty best-selling books of all time. In the early nineties, he publishes "Dr. Atkins' New Diet Revolution," which sells more than three million copies and is on the Times best-seller list for almost all of 1997. More than two decades of scientific research into health and nutrition have elapsed in the interim, but Atkins' message has remained the same. Carbohydrates are bad. Everything else is good. Eat the hamburger, hold the bun. Eat the steak, hold the French fries. Here is the list of ingredients for one of his breakfast "weight loss" recommendations: scrambled eggs for six. Keep in mind that Atkins is probably the most influential diet doctor in the world.
12 link sausages (be sure they contain no sugar)
1 3-ounce package cream cheese
1 tablespoon butter
3/4 cup cream
1/4 cup water
1 teaspoon seasoned salt
2 teaspoons parsley
8 eggs, beaten
Atkins' Patent Claim centers on the magical weight-loss properties of something called "ketosis." When you eat carbohydrates, your body converts them into glycogen and stores them for ready use. If you are deprived of carbohydrates, however, your body has to turn to its own stores of fat and muscle for energy. Among the intermediate metabolic products of this fat breakdown are ketones, and when you produce lots of ketones, you're in ketosis. Since an accumulation of these chemicals swiftly becomes toxic, your body works very hard to get rid of them, either through the kidneys, as urine, or through the lungs, by exhaling, so people in ketosis commonly spend a lot of time in the bathroom and have breath that smells like rotten apples. Ketosis can also raise the risk of bone fracture and cardiac arrhythmia and can result in light-headedness, nausea, and the loss of nutrients like potassium and sodium. There is no doubt that you can lose weight while you're in ketosis. Between all that protein and those trips to the bathroom, you'll quickly become dehydrated and drop several pounds just through water loss. The nausea will probably curb your appetite. And if you do what Atkins says, and suddenly cut out virtually all carbohydrates, it will take a little while for your body to compensate for all those lost calories by demanding extra protein and fat. The weight loss isn't permanent, though. After a few weeks your body adjusts, and the weight--and your appetite--comes back.
For Atkins, however, ketosis is as "delightful as sex and sunshine," which is why he wants dieters to cut out carbohydrates almost entirely. (To avoid bad breath he recommends carrying chlorophyll tablets and purse-size aerosol breath fresheners at all times; to avoid other complications, he recommends regular blood tests.) Somehow, he has convinced himself that his kind of ketosis is different from the bad kind of ketosis, and that his ketosis can actually lead to permanent weight loss. Why he thinks this, however, is a little unclear. In "Dr. Atkins' Diet Revolution" he thought that the key was in the many trips to the bathroom: "Hundreds of calories are sneaked out of your body every day in the form of ketones and a host of other incompletely broken down molecules of fat. You are disposing of these calories not by work or violent exercise--but just by breathing and allowing your kidneys to function. All this is achieved merely by cutting out your carbohydrates." Unfortunately, the year after that original edition of Atkins' book came out, the American Medical Association published a devastating critique of this theory, pointing out, among other things, that ketone losses in the urine and the breath rarely exceed a hundred calories a day--a quantity, the A.M.A. pointed out, "that could not possibly account for the dramatic results claimed for such diets." In "Dr. Atkins' New Diet Revolution," not surprisingly, he's become rather vague on the subject, mysteriously invoking something he calls Fat Mobilizing Substance. Last year, when I interviewed him, he offered a new hypothesis: that ketosis takes more energy than conventional food metabolism does, and that it is "a much less efficient pathway to burn up your calories via stored fat than it is via glucose." But he didn't want to be pinned down. "Nobody has really been able to work out that mechanism as well as I would have liked," he conceded.
Atkins is a big, white-haired man in his late sixties, well over six feet, with a barrel chest and a gruff, hard-edged voice. On the day we met, he was wearing a high-lapelled, four-button black suit. Given a holster and a six-shooter, he could have passed for the sheriff in a spaghetti western. He is an intimidating figure, his manner brusque and impatient. He gives the impression that he doesn't like having to explain his theories, that he finds the details tedious and unnecessary. Given the Photocopier Effect, of course, he is quite right. The appearance of an explanation is more important than the explanation itself. But Atkins seems to take this principle farther than anyone else.
For example, in an attempt to convince his readers that eating pork chops, steaks, duck, and rack of lamb in abundance is good for them, Atkins points out that primitive Eskimo cultures had virtually no heart disease, despite a high-fat diet of fish and seal meat. But one obvious explanation for the Eskimo paradox is that cold-water fish and seal meat are rich in n-3 fatty acids--the "good" kind of fat. Red meat, on the other hand, is rich in saturated fat--the "bad" kind of fat. That dietary fats come in different forms, some of which are particularly bad for you and some of which are not, is the kind of basic fact that seventh graders are taught in Introduction to Nutrition. Atkins has a whole chapter on dietary fat in "New Diet Revolution" and doesn't make the distinction once. All diet-book authors profit from the Photocopier Effect. Atkins lives it.
I watched Atkins recently as he conducted his daily one-hour radio show on New York's WEVD. We were in a Manhattan town house in the East Fifties, where he has his headquarters, in a sleek, modernist office filled with leather furniture and soapstone sculpture. He sat behind his desk--John Wayne in headphones--as his producer perched in front of him. It was a bravura performance. He spoke quickly and easily, glancing at his notes only briefly, and then deftly gave counsel to listeners around the region.
The first call came from George, on his car phone. George told Atkins his ratio of triglycerides to cholesterol. It wasn't good. George was a very unhealthy man. "You're in big trouble," Atkins said. "You have to change your diet. What do you generally eat? What's your breakfast?"
"I've stopped taking junk foods," George says. "I don't eat eggs. I don't eat bacon."
"Then that's-- See there." Atkins' voice rose in exasperation. "What do you have for breakfast?"
"I have skim milk, cereal, with banana."
"That's three carbs!" Atkins couldn't believe that in this day and age people were still consuming fruit and skim milk. "That's how you are getting into trouble!... What you need to do, George, seriously, is get ahold of'New Diet Revolution' and just read what it says."
Atkins took another call. This time, it was from Robert, forty-one years old, three hundred pounds, and possessed of a formidable Brooklyn accent. He was desperate to lose weight--up on a ledge and wanting Atkins to talk him down. "I really don't know anything about dieting," he said. "I'm getting a little discouraged."
"It's really very easy," Atkins told him, switching artfully to the Socratic method. "Do you like meat?"
"Yes."
"Could you eat a steak?"
"Yes."
"All by itself, without any French fries?"
"Yes."
"And let's say we threw in a salad, but you couldn't have any bread or anything else."
"Yeah, I could do that."
"Well, if you could go through life like that.... Do you like eggs in the morning? Or a cheese omelette?"
"Yes,"Robert said, his voice almost giddy with relief. He called expecting a life sentence of rice cakes. Now he was being sent forth to eat cheeseburgers. "Yes, I do!"
"If you just eat that way," Atkins told him, "you'll have eighty pounds off in six months."
When I first arrived at Atkins' headquarters, two members of his staff took me on a quick tour of the facility, a vast medical center, where Atkins administers concoctions of his own creation to people suffering from a variety of disorders. Starting from the fifth floor, we went down to the third, and then from the third to the second, taking the elevator each time. It's a small point, but it did strike me as odd that I should be in the headquarters of the world's most popular weight-loss expert and be taking the elevator one floor at a time. After watching Atkins' show, I was escorted out by his public-relations assistant. We were on the second floor. He pressed the elevator button, down. "Why don't we take the stairs?" I asked. It was just a suggestion. He looked at me and then at the series of closed doors along the corridor. Tentatively, he opened the second. "I think this is it," he said, and we headed down, first one flight and then another. At the base of the steps was a door. The P.R. man, a slender fellow in a beautiful Italian suit, peered through it: for the moment, he was utterly lost. We were in the basement. It seemed as if nobody had gone down those stairs in a long time.
4.
Why are the Pima so fat? The answer that diet books would give is that the Pima don't eat as well as they used to. But that's what is ultimately wrong with diet books. They talk as if food were the only cause of obesity and its only solution, and we know, from just looking at the Pima, that things are not that simple. The diet of the Pima is bad, but no worse than anyone else's diet.
Exercise is also clearly part of the explanation for why obesity has become epidemic in recent years. Half as many Americans walk to work today as did twenty years ago. Over the same period, the number of calories burned by the average American every day has dropped by about two hundred and fifty. But this doesn't explain why obesity has hit the Pima so hard, either, since they don't seem to be any less active than the rest of us.
The answer, of course, is that there is something beyond diet and exercise that influences obesity--that can make the consequences of a bad diet or of a lack of exercise much worse than they otherwise would be--and this is genetic inheritance. Claude Bouchard, a professor of social and preventive medicine at Laval University, in Quebec City, and one of the world's leading obesity specialists, estimates that we human beings probably carry several dozen genes that are directly related to our weight. "Some affect appetite, some affect satiety. Some affect metabolic rate, some affect the partitioning of excess energy in fat or lean tissue," he told me. "There are also reasons to believe that there are genes affecting physical-activity level." Bouchard did a study not long ago in which he took a group of men of similar height, weight, and life style and overfed them by a thousand calories a day, six days a week, for a hundred days. The average weight gain in the group was eighteen pounds. But the range was from nine to twenty-six pounds. Clearly, the men who gained just nine pounds were the ones whose genes had given them the fastest possible metabolism--the ones who burn the most calories in daily living and are the least efficient at storing fat. These are people who have the easiest time staying thin. The men at the other end of the scale are closer to the Pima in physiology. Their obesity genes thriftily stored away as much of the thousand extra calories a day as possible.
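The spread in that experiment is easier to appreciate if you multiply out the surplus. The sketch below is not from Bouchard's paper; it simply totals the extra calories described above and compares the observed gains with the ceiling implied by the rough rule of thumb of about thirty-five hundred stored calories per pound, a heuristic added here purely for illustration.

```python
# Rough arithmetic for the overfeeding study described above.
# Numbers from the text: +1,000 kcal on feeding days, 6 feeding days a week,
# over 100 days; observed gains ranged from 9 to 26 pounds (18-pound average).
# Assumption added for illustration: roughly 3,500 kcal stored per pound gained.

SURPLUS_PER_FEEDING_DAY = 1000
FEEDING_DAYS = 100 * 6 / 7          # about 86 feeding days in the 100-day span
KCAL_PER_POUND = 3500               # rough storage heuristic, not from the study

total_surplus = SURPLUS_PER_FEEDING_DAY * FEEDING_DAYS   # about 85,700 kcal
ceiling = total_surplus / KCAL_PER_POUND                 # about 24.5 pounds

for gained in (9, 18, 26):
    print(f"{gained} lb gained is about {gained / ceiling:.0%} of the storage ceiling")

# The heaviest gainers stored essentially the whole surplus (the crude heuristic
# even puts them slightly over the ceiling); the nine-pound gainers burned off
# roughly two-thirds of the same extra calories.
```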
One of the key roles for genes appears to be in determining what obesity researchers refer to as setpoints. In the classic experiment in the field, researchers took a group of rats and made a series of lesions in the base of each rat's brain. As a result, the rats began overeating and ended up much more obese than normal rats. The first conclusion is plain: there is a kind of thermostat in the brain that governs appetite and weight, and if you change the setting on that thermostat appetite and weight will change accordingly. With that finding in mind, the researchers took a second step. They took those same brain-damaged rats and put them on a diet, severely limiting the amount of food they could eat. What happened? The rats didn't lose weight. In fact, after some initial fluctuations, they ended up at exactly the same weight as before. Only, this time, being unable to attain their new thermostat setting by eating, they reached it by becoming less active--by burning less energy.
Two years ago, a group at Rockefeller University in New York published a landmark study essentially duplicating in human beings what had been done years ago in rats. They found that if you lose weight your body responds by starting to conserve energy: your metabolism slows down; your muscles seem to work more efficiently, burning fewer calories to do the same work. "Let's say you have two people, side by side, and these people have exactly the same body composition," Jules Hirsch, a member of the Rockefeller team, says. "They both weigh a hundred and thirty pounds. But there is one difference--the first person maintains his weight effortlessly, while the second person, who used to weigh two hundred pounds, is trying to maintain a lower weight. The second will need fifteen per cent fewer calories per day to do his work. He needs less oxygen and will burn less energy." The body of the second person is backpedalling furiously in response to all that lost weight. It is doing everything it can to gain it back. In response to weight gain, by contrast, the Rockefeller team found that the body speeds up metabolism and burns more calories during exercise. It tries to lose that extra weight. Human beings, like rats, seem to have a predetermined setpoint, a weight that their body will go to great lengths to maintain.
One key player in this regulatory system may be a chemical called leptin--or, as it is sometimes known, Ob protein--whose discovery four years ago, by Jeff Friedman, of the Howard Hughes Medical Institute at Rockefeller University, prompted a flurry of headlines. In lab animals, leptin tells the brain to cut back on appetite, to speed up metabolism, and to burn stored fat. The theory is that the same mechanism may work in human beings. If you start to overeat, your fat cells will produce more leptin, so your body will do everything it can to get back to the setpoint. That's why after gaining a few pounds over the holiday season most of us soon return to our normal weight. But if you eat too little or exercise too much, the theory goes, the opposite happens: leptin levels fall. "This is probably the reason that virtually every weight-loss program known to man fails," José F. Caro, vice-president of endocrine research and clinical investigation at Eli Lilly & Company, told me. "You go to Weight Watchers. You start losing weight. You feel good. But then your fat cells stop producing leptin. Remember, leptin is the hormone that decreases appetite and increases energy expenditure, so just as you are trying to lose weight you lose the hormone that helps you lose weight."
Obviously, our body's fat thermostat doesn't keep us at one weight all our adult lives. "There isn't a single setpoint for a human being or an animal," Thomas Wadden, the director of the Weight and Eating Disorders Clinic at the University of Pennsylvania, told me. "The body will regulate a stable weight but at very different levels, depending on food intake--quality of the diet, high fat versus low fat, high sweet versus low sweet--and depending on the amount of physical activity." It also seems to be a great deal easier to move the setpoint up than to move it down--which, if you think about the Pima, makes perfect sense. In their long history in the desert, those Pima who survived were the ones who were very good at gaining weight during times of plenty--very good, in other words, at overriding the leptin system at the high end. But there would have been no advantage for the ones who were good at losing weight in hard times. The same is probably true for the rest of us, albeit in a less dramatic form. In our evolutionary history, there was advantage in being able to store away whatever calorific windfalls came our way. To understand this interplay between genes and environment, imagine two women, both five feet five. The first might have a setpoint range of a hundred and ten to a hundred and fifty pounds; the second a range of a hundred and twenty-five to a hundred and eighty. The difference in the ranges of the two women is determined by their genes. Where they are in that range is determined by their life styles.
Not long after leptin was discovered, researchers began testing obese people for the hormone, to see whether a fat person was fat because his body didn't produce enough leptin. They found the opposite: fat people had lots of leptin. Some of the researchers thought this meant that the leptin theory was wrong--that leptin didn't do what it was supposed to do. But some other scientists now think that as people get fatter and fatter, their bodies simply get less and less sensitive to leptin. The body still pumps out messages to the brain calling for the metabolism to speed up and the appetite to shrink, but the brain just doesn't respond to those messages with as much sensitivity as it did. This is probably why it is so much easier to gain weight than it is to lose it. The fatter you get, the less effective your own natural weight-control system becomes.
This doesn't mean that diets can't work. In those instances in which dieters have the discipline and the will power to restrict their calories permanently, to get regular and vigorous exercise, and to fight the attempt by their own bodies to maintain their current weight, pounds can be lost. (There is also some evidence that if you can keep weight off for an extensive period--three years, say--a lower setpoint can be established.) Most people, though, don't have that kind of discipline, and even if they do have it the amount of weight that most dieters can expect to lose on a permanent basis may be limited by their setpoint range. The N.I.H. has a national six-year diabetes-prevention study going on right now, in which it is using a program of intensive, one-on-one counselling, dietary modification, and two and a half hours of exercise weekly to see if it can get overweight volunteers to lose seven per cent of their body weight. If that sounds like a modest goal, it should. "A lot of studies look at ten-per-cent weight loss," said Mary Hoskin, who is coördinating the section of the N.I.H. study involving the Pima. "But if you look at long-term weight loss nobody can maintain ten per cent. That's why we did seven."
On the other hand, now that we're coming to understand the biology of weight gain, it is possible to conceive of diet drugs that would actually work. If your body sabotages your diet by lowering leptin levels as you lose weight, why not give extra leptin to people on diets? That's what a number of drug companies, including Amgen and Eli Lilly, are working on now. They are trying to develop a leptin or leptin-analogue pill that dieters could take to fool their bodies into thinking they're getting fatter when they're actually getting thinner. "It is very easy to lose weight," José Caro told me. "The difficult thing is to maintain your weight loss. The thinking is that people fail because their leptin goes down. Here is where replacement therapy with leptin or an Ob-protein analogue might prevent the relapse. It is a subtle and important concept. What it tells you is that leptin is not going to be a magic bullet that allows you to eat whatever you want. You have to initiate the weight loss. Then leptin comes in."
Another idea, which the Hoffmann-La Roche company is exploring, is to focus on the problems obese people have with leptin. Just as Type II diabetics can become resistant to insulin, many overweight people may become resistant to leptin. So why not try to resensitize them? The idea is to find the leptin receptor in the brain and tinker with it to make it work as well in a fat person as it does in a thin person. (Drug companies have actually been pursuing the same strategy with the insulin receptors of diabetics.) Arthur Campfield, who heads the leptin project for Roche, likens the process by which leptin passes the signal about fat to the brain to a firemen's bucket brigade, where water is passed from hand to hand. "If you have all tall people, you can pass the bucket and it's very efficient," he said. "But if two of the people in the chain are small children, then you're going to spill a lot of water and slow everything down. We want to take a tablet or a capsule that goes into your brain and puts a muscular person in the chain and overcomes that weakness. The elegant solution is to find the place in the chain where we are losing water."
The steps that take place in the brain when it receives the leptin message are known as the Ob pathway, and any number of these steps may lend themselves to pharmaceutical intervention. Using the Ob pathway to fight obesity represents a quantum leap beyond the kinds of diet drugs that have been available so far. Fen-phen, the popular medication removed from the market last year because of serious side effects, was, by comparison, a relatively crude product, which worked indirectly to suppress appetite. Hoffmann-La Roche is working now on a drug called Xenical, a compound that blocks the absorption of dietary fat by the intestine. You can eat fat; you just don't keep as much of it in your system. The drug is safe and has shown real, if modest, success in helping chronically obese patients lose weight. It will probably be the next big diet drug. But no one is pretending that it has anywhere near the potential of, say, a drug that would resensitize your leptin receptors.
Campfield talks about the next wave of drug therapy as the third leg of a three-legged stool--as the additional element that could finally make diet and exercise an easy and reliable way to lose weight. Wadden speaks of the new drugs as restoring sanity: "What I think will happen is that people on these medications will report that they are less responsive to their environment. They'll say that they are not as turned on by Wendy's or McDonald's. Food in America has become a recreational activity. It is divorced from nutritional need and hunger. We eat to kill time, to stimulate ourselves, to alter our mood. What these drugs may mean is that we're going to become less susceptible to these messages." In the past thirty years, the natural relationship between our bodies and our environment--a relation that was developed over thousands of years--has fallen out of balance. For people who cannot restore that natural balance themselves--who lack the discipline, the wherewithal, or, like the Pima, the genes--drugs could be a way of restoring it for them.
5.
Seven years ago, Peter Bennett, the epidemiologist who first stumbled on the Gila River Pima twenty-eight years earlier, led an N.I.H. expedition to Mexico's Sierra Madre Mountains. Their destination was a tiny Indian community on the border of Sonora and Chihuahua, seven thousand feet above the desert. "I had known about their existence for at least fifteen years before that," Bennett says. "The problem was that I could never find anyone who knew much about them. In 1991, it just happened that we linked up with an investigator down in Mexico." The journey was a difficult one, but the Mexican government had just built a road linking Sonora and Chihuahua, so the team didn't have to make the final fifty- or sixty-mile trek on horseback. "They were clearly a group who have got along together for a very long time," Bennett recalls. "My reaction as a stranger going in was: Gee, I think these people are really very friendly, very coöperative. They seem to be interested in what we want to do, and they are willing to stick their arms out and let us take blood samples." He laughed. "Which is always a good sign."
The little town in the Sierra Madre is home to the Mexican Pima, the southern remnants of a tribe that once stretched from present-day Arizona down to central Mexico. Like the Pima of the Gila River reservation, they are farmers, living in small clusters of wood-and-adobe rancherías among the pine trees, cultivating beans, corn, and potatoes in the valleys. On that first trip, the N.I.H. team examined no more than a few dozen Pima. Since then, the team has been back five or six times, staying for as many as ten days at a time. Two hundred and fifty of the mountain Pima have now been studied. They have been measured and weighed, their blood sugar has been checked, and their kidneys and eyes have been examined for signs of damage. Genetic samples have been taken and their metabolism has been monitored. The Mexican Pima, it turns out, eat a diet consisting almost entirely of beans, potatoes, and corn tortillas, with chicken perhaps once a month. They take in twenty-two hundred calories a day, which is slightly more than the Pima of Arizona do. But on the average each of them puts in twenty-three hours a week of moderate to hard physical labor, whereas the average Arizona Pima puts in two hours. The Mexican Pima's rates of diabetes are normal. They are slightly shorter than their American counterparts. In weight, there is no comparison: "I would say they are thin," Bennett says. "Thin. Certainly by American standards."
There are, of course, a hundred reasons not to draw any great lessons from this. Subsistence farming is no way to make a living in America today, nor are twenty-three hours of hard physical labor feasible in a society where most people sit at a desk from nine to five. And even if the Arizona Pima wanted to return to the land, they couldn't. It has been more than a hundred years since the Gila River, which used to provide the tribe with fresh fish and with water for growing beans and squash, was diverted upstream for commercial farming. Yet there is value in the example of the Mexican Pima. People who work with the Pima of Arizona say that the biggest problem they have in trying to fight diabetes and obesity is fatalism--a sense among the tribe that nothing can be done, that the way things are is the way things have to be. It is possible to see in the attitudes of Americans toward weight loss the same creeping resignation. As the world grows fatter, and as one best-selling diet scheme after another inevitably fails, the idea that being slender is an attainable--or even an advisable--condition is slowly receding. Last month, when The New England Journal of Medicine published a study suggesting that the mortality costs of obesity had been overstated, the news was greeted with resounding relief, as if we were all somehow off the hook, as if the issue with obesity were only mortality and not the thousand ways in which being fat undermines our quality of life: the heightened risk of heart disease, hypertension, diabetes, cancer, arthritis, gallbladder disease, trauma, gout, blindness, birth defects, and other aches, pains, and physical indignities too numerous to mention. What we are in danger of losing in the epidemic of obesity is not merely our health but our memory of health. Those Indian towns high in the Sierra Madre should remind the people of Sacaton--and all the rest of us as well--that it is still possible, even for a Pima, to be fit.
The Spin Myth
July 6, 1998
A CRITIC AT LARGE
Are our spin meisters just spinning one another?
On Easter Sunday, 1929, the legendary public-relations man Edward L. Bernays rounded up ten carefully chosen women, put cigarettes in their hands, and sent them down Fifth Avenue in what was billed as the Torches of Freedom march. The marchers were given detailed instructions, including when and how their cigarettes should be lit. Spokeswomen were enlisted to describe the protest as an advance for feminism. Photographers were hired to take pictures. It was an entirely contrived event that nonetheless looked so "real" that the next day it made front-page headlines across the country, prompting a debate over whether women should be allowed to smoke as freely as men, and--some historians believe--forever changing the social context of cigarettes. What Bernays never told anyone was that he was working for the American Tobacco Company.
It is difficult to appreciate how brazen Bernays's ruse was at the time. In the twenties, the expectation was that if you were trying to sell people something--even if you were planning to deceive them in the process--you had at least to admit that you were trying to sell them something. Bernays was guided by the principle that this wasn't true: that sometimes the best way to sell something (cigarettes, say) was to pretend to be selling something else (freedom, say).
Bernays helped the brewing industry establish beer as "the beverage of moderation." For Dixie cups, he founded the Committee for the Study and Promotion of the Sanitary Dispensing of Food and Drink. For the Mack truck company, he drummed up national support for highway construction through front groups called the Trucking Information Service, the Trucking Service Bureau, and Better Living Through Increased Highway Transportation. In a torrent of books and articles (including one book, "Crystallizing Public Opinion," that was found in Joseph Goebbels's library) he argued that the P.R. professional could "continuously and systematically" perform the task of "regimenting the public mind." He wasn't talking about lying. He was talking about artful, staged half-truth. It's the kind of sly deception that we've come to associate with the Reagan Administration's intricately scripted photo ops (the cowboy hats, the flannel shirts, the horse), with the choreographed folksiness of Clinton's Town Hall meetings, with the "Wag the Dog" world of political operatives, and with the Dilbertian byways of boardroom euphemism, in which firing is "rightsizing" and dismembering companies becomes "unlocking shareholder value." Edward L. Bernays invented spin.
Today, we're told, Bernays's touch is everywhere. The advertising critic Randall Rothenberg has suggested that there is something called a Media-Spindustrial Complex, which encompasses advertising, P.R., lobbying, polling, direct mail, investor relations, focus groups, jury consulting, speechwriting, radio and television stations, and newspapers--all in the business of twisting and turning and gyrating. Argument now masquerades as conversation. Spin, the political columnist E.J. Dionne wrote recently, "obliterates the distinction between persuasion and deception." Should P.R. people tell "the whole truth about our clients? No sirree!" Thomas Madden, the chairman of one of the largest P.R. firms in the country, declares in his recent memoir, entitled "Spin Man." In the best-seller "Spin Cycle: Inside the Clinton Propaganda Machine," Howard Kurtz, the media critic for the Washington Post, even describes as spin the White House's decision in the spring of 1997 to release thousands of pages of documents relating to the Democratic fund-raising scandal.
This was the documentation that the press had been clamoring for. You might have thought that it was full disclosure. Not so, says Kurtz, who dubs the diabolical plan Operation Candor. In playing the honesty card, he argues, the White House preëmpted embarrassing leaks by congressional investigators and buried incriminating documents under an avalanche of paper. Of course, not releasing any documents at all would also have been spin (Stonewall Spin), and so would releasing only a handful of unrepresentative documents (Selection Spin). But, if you think that calling everything "spin" renders the term meaningless (if this is all spin, then what is not spin?), you've missed the point. The notion that this is the age of spin rests on the premise that everything, including the truth, is potentially an instrument of manipulation.
In "P.R.!:ASocial History of Spin," the media critic Stuart Ewen describes how, in 1990, he went to visit Bernays at his home near Harvard Square, in Cambridge. He was ushered in by a maid and waited in the library, looking, awestruck, at the shelves. "It was a remarkable collection of books, thousands of them: about public opinion, individual and social psychology, survey research, propaganda, psychological warfare, and so forth--a comprehensive library spanning matters of human motivation and strategies of influence, scanning a period of more than one hundred years," he writes. "These were not the bookshelves of some shallow huckster, but the arsenal of an intellectual. The cross- hairs of nearly every volume were trained on the target of forging public attitudes. Here--in a large white room in Cambridge, Massachusetts--was the constellation of ideas that had inspired and informed a twentieth century preoccupation: the systematic molding of public opinion."
Suddenly, Ewen's reverie was broken. In walked Bernays, a "puckish little man" of ninety-eight, with "swift eyes," who looked like "an aged Albert Einstein." Bernays led Ewen past his picture gallery--Bernays and Henry Ford, Bernays and Thomas Edison, Bernays and Eisenhower, Bernays en route to the 1919 Paris Peace Conference, an autographed photo of Sigmund Freud, who was Bernays's uncle. And for four hours Bernays and Ewen talked. Ewen was "entranced": he had located the fountainhead of all spin. At one point, Bernays hypothesized about how he might have promoted Ewen's previous book, which was an account of consumer imagery in the modern economy. He would, he said, have called the big consumer organizations and suggested to them that they devote one of their annual meetings to a discussion of consumers and images. Ewen thought nothing of it. Then, three months later, he got a call from the president of the Consumer Federation of America asking him if he wanted to be the keynote speaker at its annual meeting. Was Bernays behind it? Was he still spinning, even as he approached his hundredth birthday? Ewen never found out. "Yet the question remained, and remains, open," he writes, in the breathless opening chapter of his book. "Things had uncannily come to pass much as Bernays had described in his hypothetical disquisition on the work of a P.R. practitioner, and I was left to ponder whether there is any reality anymore, save the reality of public relations."
The curious thing about our contemporary obsession with spin, however, is that we seldom consider whether spin works. We simply assume that, because people are everywhere trying to manipulate us, we're being manipulated. Yet it makes just as much sense to assume the opposite: that the reason spin is everywhere today is that it doesn't work--that, because the public is getting increasingly inured to spin, spinners feel they must spin even harder, on and on, in an ever-escalating arms race. The Torches of Freedom march worked because nobody had ever pulled a stunt like that before. Today, those same marchers would be stopped cold at ten feet. (First question at the press conference: Who put you up to this?) Once spun, twice shy. When, last week, the Clinton spokesman Rahm Emanuel called Steven Brill's revelations about Kenneth Starr's leaking to the press a "bombshell," that was spin, but we are so accustomed to Rahm Emanuel's spinning that the principal effect of his comment was to prompt a meta-discussion about, of all things, his comment. ("If the wonderful word oleaginous didn't exist," Frank Rich wrote in the Times, "someone would have to invent it to describe Rahm Emanuel.") Emanuel might have been better off saying nothing at all, except that--under the Howard Kurtz rule--this, too, would have been decoded as an attempt to spin us, by ostentatiously letting the Brill revelations speak for themselves: Silent Spin, perhaps. Spin sets into motion a never-ending cycle of skepticism.
There is a marvellous illustration of this arms-race problem in the work of two psychology professors, Deborah Gruenfeld and Robert Wyer, Jr. They gave people statements that were said to be newspaper headlines, and asked them to rate their plausibility, on a scale of zero to ten. Since the headlines basically stated the obvious--for example, "black democrats supported jesse jackson for president in 1988"--the scores were all quite high. The readers were then given a series of statements that contradicted the headlines. Not surprisingly, the belief scores went down significantly. Then another group of people was asked to read a series of statements that supported the headlines--statements like "Black Democrats presently support Jesse Jackson for President." This time, the belief scores still dropped. Telling people that what they think is true actually is true, in other words, has almost the same effect as telling them that what they think is true isn't true. Gruenfeld and Wyer call this a "boomerang effect," and it suggests that people are natural skeptics. How we respond to a media proposition has at least as much to do with its pragmatic meaning (why we think the statement is being made) as with its semantic meaning (what is literally being said). And when the pragmatic meaning is unclear--why, for example, would someone tell us over and over that Jesse Jackson has the support of black Democrats--we start to get suspicious. This is the dilemma of spin. When Rahm Emanuel says "bombshell," we focus not on the actual bombshell but on why he used the word "bombshell."
The point is that spin is too clever by half. In a forthcoming biography, "The Father of Spin," Larry Tye writes that in 1930 Bernays went to work for a number of major book publishers, including Simon & Schuster and Harcourt Brace: "'Where there are bookshelves,' he reasoned, 'there will be books.' So he got respected public figures to endorse the importance of books to civilization, and then he persuaded architects, contractors, and decorators to put up shelves on which to store the precious volumes--which is why so many homes from that era have built-in bookshelves."
This is the kind of slick move that makes Bernays such an inspiration for contemporary spin meisters. (Tye, admiringly, calls it "infinitely more effective" than simply promoting books one by one, in the conventional way.) But wait a minute. Did Bernays really reach all these architects and contractors? If so, how? Wouldn't there have been thousands of them? And, if he did, why would they ever have listened to him? (My limited experience with contractors and architects is that advice from someone outside their field has the opposite of its intended effect.) And, even if we assume that he did cause a surge in bookshelf building, is there a magical relationship between built-in shelves and the purchase of books? Most of us, I think, acquire books because we like books and we want to read them--not because we have customized space to fill in our apartments. The best way to promote cigarettes probably isn't to subsidize ashtrays.
People who worry about spin have bought into a particular mythology about persuasion--a mythology that runs from Tom Sawyer to Vance Packard--according to which the best way to persuade someone to do something is to hide the act of persuasion. The problem is, though, that if the seller is too far removed from the transaction, if his motives are too oblique, there's a good chance that his message will escape the buyer entirely. (People don't always think books when they think shelves.) In fact, successful persuasion today is characterized by the opposite principle--that it is better to be obvious and get your message across than it is to pull invisible strings and risk having your message miss the mark. Bernays sacrificed clarity for subtlety. Most effective advertising today sacrifices subtlety for clarity. Recently, at a Robert Wood Johnson Foundation conference on how to fight teen-age smoking, one prominent California ad executive talked about the reason for the success of the Marlboro and the Camel brands. It was not, he said, because of any of the fancy behind-the-scenes psychological tricks that Big Tobacco is so often accused of by its critics. On the contrary. The tobacco companies, he said, understand what Nike and Coca-Cola understand: that if they can make their brands ubiquitous--if they can plaster them on billboards, on product displays inside grocery stores, on convenience-store windows, on the sides of buildings, on T-shirts and baseball caps, on the hoods and the roofs of racing cars, in colorful spreads in teen magazines--they can make their message impossible to ignore. The secret is not deception but repetition, not artful spinning but plain speaking.
There's a second, related difficulty with spin--one that people in the marketing business call the internal-audience problem. Let's say you are the head of the ad agency that has the Burger King account. Your ultimate goal is to make ads that appeal to the kind of people who buy Burger King burgers. But, in order to keep Burger King's business and get your commercials on the air, you must first appeal to Burger King's marketing executives, who are probably quite different in temperament and taste from the target Burger King customer. Ideally, your ads will appeal to the folks at Burger King because they appeal to the Burger King customers; that is, the internal audience will be pleased because the external audience is pleased. But it has always been extremely difficult to measure the actual impact of a television commercial (especially, as is the case with many ads, where the aim is simply to maintain the current market share). Unless you're careful, then, you may start creating ads that appeal only to your internal audience, with the unfortunate result that the relationship between ad agency and ad buyer becomes a kind of closed loop. The internal audience supplants the real audience.
The internal-audience effect can be seen in all sorts of businesses. The reason so many magazines look alike is that their Manhattan-based editors and writers end up trying to impress not readers but other Manhattan-based editors and writers. It was in an effort to avoid this syndrome that Lincoln Mercury recently decided to move its headquarters from Detroit to California. The company said that the purpose was to get closer to its customers; more precisely, the purpose was to get away from people who weren't its customers. Why do you think it took so long to get Detroit to install seat belts? Because to the internal audience a seat belt is a cost center. It is only to the external audience that it's a life saver.
Edward Bernays was a master of the internal audience. He was intellectually indefatigable, a diminutive, mustachioed, impatient dervish. Larry Tye writes that as Bernays sat in his office "four or five young staff members, their chairs pulled close, would have been listening to him spew forth a stream of thoughts about peddling Ivory or keeping Luckies number one. With each new idea he'd scratch out a note, wad it up, and toss it on the floor." Afterward, the floor looked blanketed by snow. But it was all an inside joke. The wadded-up pieces of paper were, Tye quotes one former employee as saying, "a trick to demonstrate all the ideas he was generating." To promote bacon, Bernays persuaded prominent doctors to testify to the benefits of a hearty breakfast. His client, a bacon producer, no doubt regarded this as a dazzling feat. But does a hearty breakfast mean bacon? And does bacon mean his client's bacon? Bernays's extraordinary success is proof that in the P.R. world, where no hard-and-fast measures exist to gauge the true effectiveness of a message, he could prosper by playing only to his internal audience. But often the very things that make you successful with that audience prevent you from being successful with your real audience. To Simon & Schuster--to people in the book business--bookshelves really do mean books. To the rest of us, a bookshelf may be no more than a place to put unopened mail.
This is the mistake Howard Kurtz makes in "Spin Cycle." His book is a detailed account of how in the year following the 1996 elections Clinton's spokesman Mike McCurry successfully spun the White House press corps during the fund-raising and Whitewater scandals. Kurtz tries to argue that this, in turn, reflects Clinton's ability to manage his image with the wider public--with the external audience. In fact, "Spin Cycle" reads more like an extended treatise on the internal-audience problem, a three-hundred-page account of how McCurry's heroic attempts to spin the White House press corps had the effect of, well, spinning the White House press corps.
For example, Kurtz recounts the story of Rita Braver, a former White House correspondent for CBS television. Braver believed that the Clinton Administration would go to "unbelievable lengths" to keep her from breaking a story--on the ground, Kurtz says, that "bad stories came across as more sensational on television." In one instance, early in Clinton's second term, the White House announced that it was turning over a large number of Whitewater documents to the Justice Department. Braver, according to Kurtz, smelled a rat. She knew that you don't just turn over documents to the Justice Department. She made some calls and found out that, sure enough, the White House had actually been subpoenaed. Braver wrote a script: "CBS News has learned..." Then disaster struck. "Half an hour before the evening news began," Kurtz writes, "White House officials publicly announced the subpoena. No way they were going to let her break the news and look like they were hiding something, which they had been. They were determined to beat her to the punch."
Let's deconstruct this episode. Braver wanted to write a story that said, in effect, The documents the White House said it is handing over to the Justice Department today are, I have learned, being handed over because of a subpoena. Instead, she was forced to say, The documents that the White House is handing over to the Justice Department today are, the White House said, being handed over because of a subpoena. To the internal audience--to Braver and her colleagues--there is a real distinction between Statements A and B. In the first case, the White House is seen as reluctant to disclose the existence of a subpoena. In the second, it is not. More important, in the first case it is clear that the subpoena story is the result of the efforts of Rita Braver--of the efforts, in other words, of the White House press corps--and in the second that role has been erased. This distinction also matters to Clinton, McCurry's other internal audience. But why does this matter to the rest of us? The news of interest to the external audience is not the nuance of the White House's reaction to a subpoena, or the particular reporting talents of Rita Braver; it is the fact of the subpoena itself. Kurtz is entirely correct that the Braver episode is an example of the ascendancy of spin. But the only thing that's being spun here is ten square blocks in the center of Washington, D.C. This is dog-whistle politics.
The irony of Edward L. Bernays's enshrinement in the spin literature is that, in fact, he is not the father of contemporary persuasion. That honor belongs--if it belongs to anyone--to the wizard of direct marketing, Lester Wunderman. Wunderman was Bernays's antithesis. He was born in a tenement in the East Bronx, far from the privilege and wealth of Bernays's Manhattan. While Bernays was sending women marching down Fifth Avenue, Wunderman was delivering chickens for Izzy, a local kosher butcher. He started off in advertising making twenty-five dollars a week at Casper Pinsker's mail-order ad agency, in lower Manhattan, and in one of his first successes he turned the memoirs of Hitler's personal physician into a wartime best-seller by promoting them on the radio with some of the first-ever infomercials. If Bernays was the master of what Tye calls Big Think--splashy media moments, behind-the-scenes manipulations, concocted panels of "experts"--Wunderman, in the course of his career, established himself as the genius of Little Think, of the small but significant details that turn a shopper into a buyer. He was the person who first put bound-in subscription cards in magazines, who sold magazines on late-night television with an 800 number, who invented the forerunner of the scratch-'n'-sniff ad, who revolutionized the mail-order business, and who, in a thousand other ways, perfected the fine detail of true salesmanship.
In "Being Direct," his recent autobiography, Wunderman relates the story of how he turned the Columbia Record Club into the largest marketing club of its kind in the world. It's a story worth retelling, if only because it provides such an instructive counterpoint to the ideas of Edward Bernays. The year was 1955. Wunderman was already the acknowledged king of mail order, long since gone from Casper Pinsker, and by then a senior vice-president at the ad firm Maxwell Sackheim & Company, and Columbia came to him with a problem. Independent mail-order companies, using the model of the Book-of-the-Month Club, were starting to chip away at retail sales of records. (In those days, record companies sold records through dealers, the same way that car companies sell cars.) To stem the tide, Columbia wanted to start a club of its own.
Wunderman's response was to create a kind of mail-order department store, with four sections--classical, Broadway, jazz, and listening and dancing--and a purchase plan that offered a free record for joining. The offer was then advertised in magazines, with a coupon to clip and mail back. This initial campaign did respectably, but not well enough to break even. Wunderman went back to the drawing board. In 1956, he began testing hundreds of different kinds of ads in different publications and in different markets, comparing the response rate to each. The best response was to a plan that allowed the customer, for every four records purchased, to choose three free records from a list of twelve options. He went with that nationwide. By 1957, the club had a million members. But that summer the club-advertising response rate suddenly fell off by twenty per cent. Wunderman, who was travelling in Europe at the time, had another brainstorm:
What I had discovered in Italy was antipasto.... The idea of so many choices intrigued me, and the larger the selection, the longer the line at the antipasto table. Restaurant owners seemed to know this, because antipasto carts and tables were usually displayed prominently at the entrance. I made a point of counting the number of individual antipasto choices people took in relation to the number that were offered, and I discovered that they helped themselves to about the same number of dishes no matter how many were set in front of them.
Wunderman rushed back to New York with the solution. The free records Columbia offered to new members were "antipasto." But three free records from a list of twelve weren't enough, Wunderman argued. That wasn't a true antipasto bar. He persuaded Columbia that it should test an ad that increased the choice from twelve records to thirty-two. The response rate doubled. (Today, Columbia members get to choose from more than four hundred albums.) Columbia scrapped its old ad run, and replaced it with the antipasto campaign. The year 1958 was the best one in the club's history.
The next challenge Wunderman faced was that the club seemed stalled at a million members, so he began searching for new ways to get his message across. Taking an idea he had pioneered several years earlier, while he was selling mail-order roses for Jackson & Perkins, he persuaded a number of publishers to put post-paid insert cards in magazines--the now familiar little cards that are an ad on one side and a coupon on the other. Columbia jumped to two million members. In Life he inserted a sheet of "value stamps," each with the title of an album on it, which readers could stick to a response card. He was also the first to use an "answer card" in newspapers, and among the first to get newspapers to insert freestanding four- and eight-page special advertising sections in their Sunday editions. On another occasion, he put a little gold box in the corner of all Columbia's print ads, and, in a series of television commercials, instructed viewers to look in the ads for the "buried treasure"; if they found the box, they could get another free record. The gold-box campaign raised responses by eighty per cent. "The Gold Box," he writes, "had made the reader/viewer part of an interactive advertising system. Viewers were not just an audience but had become participants. It was like playing a game."
All these strategies amount to a marketing system of extraordinary sensitivity. Answer cards and gold boxes and antipasto and the other techniques of Little Think are sophisticated ways of listening, of overcoming the problems of distance and distortion which so handicap other forms of persuasion. There are times when we all get annoyed at the business reply cards that Wunderman invented. But at a conceptual level, surely, those cards are a thing of beauty. To the consumer--to us--they offer almost perfect convenience. Is there an easier way to subscribe to a magazine? To the client, they offer ubiquity: it knows that every time a magazine is opened a response card falls on someone's lap. And to the ad agency they offer a finely calibrated instrument to measure effectiveness: the adman can gauge precisely how successful his campaign is merely by counting the number of cards that come back. Much of the apparatus of modern-day marketing--the computer databases, the psychographic profiles, the mailing lists, the market differentiations, the focus groups--can be seen, in some sense, as an attempt to replicate the elegance and transparency of this model. Marketers don't want to spin us. They want to hold us perfectly still, so they can figure out who we are, what we want, and how to reach us.
There is a moment in Kurtz's book in which he stumbles on this truth: that it isn't spin, after all, that accounts for Clinton's popularity but, rather, the opposite of spin--the President's ability to listen, to offer his agenda like antipasto, to sidestep the press and speak directly to the public. Kurtz writes that the former White House communications director Don Baer believed that Clinton "cut through what was said about him":
He was having his own conversation with America, one that, if all went well, sailed over the heads of the journalists, who were nothing but theater critics and did little to shape public opinion. Baer saw the phenomenon time and again. When Clinton unveiled his plan for hope scholarships, which would give parents of college students up to $1500 a year in tax credits, the media verdict was swift: cynical political ploy to pander to middle-class voters. But what the public heard was that Clinton was concerned about the difficulty of sending kids to college and was willing to help them with tax credits. Voters got it. They liked constructive proposals and hated partisan sniping.
But Kurtz doesn't believe Baer, and why would he? The spin fantasy offers a far more satisfying explanation for the world around us. Spin suggests a drama, a script to decode, a game played at the highest of levels. Spinning is the art of telling a story, even when there is no story to tell, and this is irresistible (particularly to journalists, who make a living by telling stories even when there is no story to tell). In truth, the world of persuasion is a good deal more prosaic. Ideas and candidacies--not to mention albums--are sold by talking plainly and clearly, and the louder and faster the whirring of the spinners becomes, the more effective this clarity and plainspokenness will be. We think we belong to the world of Edward L. Bernays. We don't. We are all Wundermanians now.
Perils of a Parable
January 11, 1999
COMMENT
Science and the Perils of a Parable
In the movie "A Civil Action," the families of eight leukemia victims accuse two major corporations of contaminating the drinking water of Woburn, Massachusetts. John Travolta's portrayal of the lawyer who argues their case has been justifiably praised by critics for its subtlety: he is neither a villain nor a hero but an uncomfortable and ambiguous combination of the two--a man of equal parts greed and idealism who is in the grip of a powerful obsession. Curiously, though, when it comes to the scientific premise of the story, "A Civil Action" (like Jonathan Harr's best-seller, on which it is based) permits no ambiguity at all. It is taken as a given that the chemical allegedly dumped, trichloroethylene (TCE), is a human carcinogen-- even though, in point of fact, TCE is only a probable human carcinogen: tests have been made on animals, but no human-based data have tied it to cancer. It is also taken as a given that the particular carcinogenic properties of TCE were what resulted in the town's leukemia outbreak, even though the particular causes and origins of that form of cancer remain mysterious. The best that can be said is that there might be a link between TCE and disease. But the difference between what "might be" and what "is"--which in scientific circles is all the difference in the world--does not appear to amount to much among the rest of us. We know that human character can be complex and ambiguous. But we want science to conform to a special kind of narrative simplicity: to begin from obvious premises and proceed, tidily and expeditiously, to morally satisfying conclusions.
Consider the strange saga of silicone breast implants. Almost seven years ago, the Food and Drug Administration placed a moratorium on most uses of silicone implants, because the devices had been inadequately tested and the agency wanted to give researchers time to gather new data on their safety. Certain that the data would indict implants in the end, personal-injury lawyers rounded up hundreds of thousands of women in a massive class-action suit. By 1994, four manufacturers of implants had been instructed to pay out the largest class-action settlement in history: $4.25 billion. And when that amount proved insufficient for all the plaintiffs, the largest of the defendants--Dow Corning--filed for Chapter 11, offering $3.2 billion last November to settle its part of the suit.
Now, however, we actually have the evidence on implant safety. More than twenty studies have been completed, by institutions ranging from Harvard Medical School to the Mayo Clinic. The governments of Germany, Australia, and Britain have convened scientific panels. The American College of Rheumatology, the American Academy of Neurology, and the Council on Scientific Affairs of the American Medical Association have published reviews of the evidence, and last month, in a long-awaited decision, an independent scientific panel, appointed by a federal court, released its findings. All of the groups have reached the same conclusion: there is little or no reason to believe that breast implants cause disease of any kind. The author of the toxicological section of the federal court's panel concluded, "There is no evidence silicone breast implants precipitate novel immune responses or induce systemic inflammation," and the author of the immunology section of the same report stated, "Women with silicone breast implants do not display a silicone-induced systemic abnormality in the types or functions of cells of the immune system."
There is some sense now that, with the unequivocal findings of the December report, the tide against implants may finally be turning. But that is small consolation. For almost seven years, at a cost of billions and in the face of some twenty-odd studies to the contrary, the courts and the public clung to a conclusion with no particular merit other than that it sounded as if it might be true. Here, after all, was a group of profit-driven multinationals putting gooey, leaky, largely untested patties of silicone into the chests of a million American women. In the narrative we have imposed on science, that act ought to have consequences, just as the contamination of groundwater by a profit-seeking multinational ought to cause leukemia. Our moral sense said so, and, apparently, that was enough. Of course, if science always made moral sense we would not need scientists. We could staff our laboratories with clergy.
It may be hard to shed a tear for implant manufacturers like Dow Corning, even though their shareholders have been royally ransomed for no good reason. Those who sell drugs and medical devices must expect to be held hostage, from time to time, by the irrationalities of the legal system. The women in this country with breast implants do, however, deserve our compassion. They chose cosmetic surgery in order to feel better about themselves. For this, they were first accused of an unnatural vanity and then warned that they had placed themselves in physical peril, and the first charge informed the second until the imagined threat of silicone implants took on the force of moral judgment: they asked for it, these women. They should have been satisfied with what God gave them and stayed home to reread "The Beauty Myth." Well, they didn't ask for anything, and what they did with their bodies turns out to have no larger meaning at all. Science, tempting though it is to believe otherwise, is not in the business of punishing the politically retrograde, nor is it a means of serving retribution to the wicked and the irresponsible. In the end, one may find that the true health toll of breast implants was the seven years of needless anxiety suffered by implant wearers at the hands of all those lawyers and health "advocates" who were ostensibly acting on their behalf.
Six Degrees of Lois Weisberg
January 11, 1999
ANNALS OF SOCIETY
She's a grandmother, she lives in a big house in Chicago, and you've never heard of her. Does she run the world?
1.
Everyone who knows Lois Weisberg has a story about meeting Lois Weisberg, and although she has done thousands of things in her life and met thousands of people, all the stories are pretty much the same. Lois (everyone calls her Lois) is invariably smoking a cigarette and drinking one of her dozen or so daily cups of coffee. She will have been up until two or three the previous morning, and up again at seven or seven-thirty, because she hardly seems to sleep. In some accounts -- particularly if the meeting took place in the winter -- she'll be wearing her white, fur-topped Dr. Zhivago boots with gold tights; but she may have on her platform tennis shoes, or the leather jacket with the little studs on it, or maybe an outrageous piece of costume jewelry, and, always, those huge, rhinestone-studded glasses that make her big eyes look positively enormous. "I have no idea why I asked you to come here, I have no job for you," Lois told Wendy Willrich when Willrich went to Lois's office in downtown Chicago a few years ago for an interview. But by the end of the interview Lois did have a job for her, because for Lois meeting someone is never just about meeting someone. If she likes you, she wants to recruit you into one of her grand schemes -- to sweep you up into her world. A while back, Lois called up Helen Doria, who was then working for someone on Chicago's city council, and said, "I don't have a job for you. Well, I might have a little job. I need someone to come over and help me clean up my office." By this, she meant that she had a big job for Helen but just didn't know what it was yet. Helen came, and, sure enough, Lois got her a big job.
Cindy Mitchell first met Lois twenty-three years ago, when she bundled up her baby and ran outside into one of those frigid Chicago winter mornings because some people from the Chicago Park District were about to cart away a beautiful sculpture of Carl von Linné from the park across the street. Lois happened to be driving by at the time, and, seeing all the commotion, she slammed on her brakes, charged out of her car -- all five feet of her -- and began asking Cindy questions, rat-a-tat-tat: "Who are you? What's going on here? Why do you care?" By the next morning, Lois had persuaded two Chicago Tribune reporters to interview Cindy and turn the whole incident into a cause célèbre, and she had recruited Cindy to join an organization she'd just started called Friends of the Parks, and then, when she found out that Cindy was a young mother at home who was too new in town to have many friends, she told her, "I've found a friend for you. Her name is Helen, and she has a little boy your kid's age, and you will meet her next week and the two of you will be best friends." That's exactly what happened, and, what's more, Cindy went on to spend ten years as president of Friends of the Park. "Almost everything that I do today and eighty to ninety per cent of my friends came about because of her, because of that one little chance meeting," Cindy says. "That's a scary thing. Try to imagine what would have happened if she had come by five minutes earlier."
It could be argued, of course, that even if Cindy hadn't met Lois on the street twenty-three years ago she would have met her somewhere else, maybe a year later or two years later or ten years later, or, at least, she would have met someone who knew Lois or would have met someone who knew someone who knew Lois, since Lois Weisberg is connected, by a very short chain, to nearly everyone. Weisberg is now the Commissioner of Cultural Affairs for the City of Chicago. But in the course of her seventy-three years she has hung out with actors and musicians and doctors and lawyers and politicians and activists and environmentalists, and once, on a whim, she opened a secondhand-jewelry store named for her granddaughter Becky Fyffe, and every step of the way Lois has made friends and recruited people, and a great many of those people have stayed with her to this day. "When we were doing the jazz festival, it turned out -- surprise, surprise -- that she was buddies with Dizzy Gillespie," one of her friends recalls. "This is a woman who cannot carry a tune. She has no sense of rhythm. One night Tony Bennett was in town, and so we hang out with Tony Bennett, hearing about the old days with him and Lois."
Once, in the mid-fifties, on a whim, Lois took the train to New York to attend the World Science Fiction Convention and there she met a young writer by the name of Arthur C. Clarke. Clarke took a shine to Lois, and next time he was in Chicago he called her up. "He was at a pay phone," Lois recalls. "He said, 'Is there anyone in Chicago I should meet?' I told him to come over to my house." Lois has a throaty voice, baked hard by half a century of nicotine, and she pauses between sentences to give herself the opportunity for a quick puff. Even when she's not smoking, she pauses anyway, as if to keep in practice. "I called Bob Hughes, one of the people who wrote for my paper." Pause. "I said, 'Do you know anyone in Chicago interested in talking to Arthur Clarke?' He said, 'Yeah, Isaac Asimov is in town. And this guy Robert, Robert...Robert Heinlein.' So they all came over and sat in my study." Pause. "Then they called over to me and they said, 'Lois' -- I can't remember the word they used. They had some word for me. It was something about how I was the kind of person who brings people together."
This is in some ways the archetypal Lois Weisberg story. First, she reaches out to somebody -- somebody outside her world. (At the time, she was running a drama troupe, whereas Arthur C. Clarke wrote science fiction.) Equally important, that person responds to her. Then there's the fact that when Arthur Clarke came to Chicago and wanted to meet someone Lois came up with Isaac Asimov. She says it was a fluke that Asimov was in town. But if it hadn't been Asimov it would have been someone else. Lois ran a salon out of her house on the North Side in the late nineteen-fifties, and one of the things that people remember about it is that it was always, effortlessly, integrated. Without that salon, blacks would still have socialized with whites on the North Side -- though it was rare back then, it happened. But it didn't happen by accident: it happened because a certain kind of person made it happen. That's what Asimov and Clarke meant when they said that Lois has this thing -- whatever it is -- that brings people together.
2.
Lois is a type -- a particularly rare and extraordinary type, but a type nonetheless. She's the type of person who seems to know everybody, and this type can be found in every walk of life. Someone I met at a wedding (actually, the wedding of the daughter of Lois's neighbors, the Newbergers) told me that if I ever went to Massapequa I should look up a woman named Marsha, because Marsha was the type of person who knew everybody. In Cambridge, Massachusetts, the word is that a tailor named Charlie Davidson knows everybody. In Houston, I'm told, there is an attorney named Harry Reasoner who knows everybody. There are probably Lois Weisbergs in Akron and Tucson and Paris and in some little town in the Yukon Territory, up by the Arctic Circle. We've all met someone like Lois Weisberg. Yet, although we all know a Lois Weisberg type, we don't know much about the Lois Weisberg type. Why is it, for example, that these few, select people seem to know everyone and the rest of us don't? And how important are the people who know everyone? This second question is critical, because once you begin even a cursory examination of the life of someone like Lois Weisberg you start to suspect that he or she may be far more important than we would ever have imagined -- that the people who know everyone, in some oblique way, may actually run the world. I don't mean that they are the sort who head up the Fed or General Motors or Microsoft, but that, in a very down-to-earth, day-to-day way, they make the world work. They spread ideas and information. They connect varied and isolated parts of society. Helen Doria says someone high up in the Chicago government told her that Lois is "the epicenter of the city administration," which is the right way to put it. Lois is far from being the most important or the most powerful person in Chicago. But if you connect all the dots that constitute the vast apparatus of government and influence and interest groups in the city of Chicago you'll end up coming back to Lois again and again. Lois is a connector.
Lois, it must be said, did not set out to know everyone. "She doesn't network for the sake of networking," says Gary Johnson, who was Lois's boss years ago, when she was executive director of the Chicago Council of Lawyers. "I just think she has the confidence that all the people in the world, whether she's met them or not, are in her Rolodex already, and that all she has to do is figure out how to reach them and she'll be able to connect with them."
Nor is Lois charismatic -- at least, not in the way that we think of extroverts and public figures as being charismatic. She doesn't fill a room; eyes don't swivel toward her as she makes her entrance. Lois has frizzy blond hair, and when she's thinking -- between her coffee and her cigarette -- she kneads the hair on the top of her head, so that by the end of a particularly difficult meeting it will be standing almost straight up. "She's not like the image of the Washington society doyenne," Gary Johnson says. "You know, one of those people who identify you, take you to lunch, give you the treatment. Her social life is very different. When I bump into her and she says, 'Oh, we should catch up,' what she means is that someday I should go with her to her office, and we'd go down to the snack bar and buy a muffin and then sit in her office while she answered the phone. For a real treat, when I worked with her at the Council of Lawyers she would take me to the dining room in the Wieboldt's department store." Johnson is an old-school Chicago intellectual who works at a fancy law firm and has a corner office with one of those Midwestern views in which, if you look hard enough, you can almost see Nebraska, and the memory of those lunches at Wieboldt's seems to fill him with delight. "Now, you've got to understand that the Wieboldt's department store -- which doesn't exist anymore -- was a notch below Field's, where the suburban society ladies have their lunch, and it's also a notch below Carson's," he says. "There was a kind of room there where people who bring their own string bags to go shopping would have a quick lunch. This was her idea of a lunch out. We're not talking Pamela Harriman here."
In the mid-eighties, Lois quit a job she'd had for four years, as director of special events in the administration of Harold Washington, and somehow hooked up with a group of itinerant peddlers who ran the city's flea markets. "There was this lady who sold jewelry," Lois said. "She was a person out of Dickens. She was bedraggled. She had a houseful of cats. But she knew how to buy jewelry, and I wanted her to teach me. I met her whole circle of friends, all these old gay men who had antique stores. Once a week, we would go to the Salvation Army." Lois was arguably the most important civic activist in the city. Her husband was a judge. She lived in a huge house in one of Chicago's nicest neighborhoods. Yet somehow she managed to be plausible as a flea-market peddler to a bunch of flea-market peddlers, the same way she managed to be plausible as a music lover to a musician like Tony Bennett. It doesn't matter who she's with or what she's doing; she always manages to be in the thick of things. "There was a woman I knew -- Sandra -- who had a kid in school with my son Joseph," Lois told me. Lois has a habit of telling stories that appear to be tangential and digressive but, on reflection, turn out to be parables of a sort. "She helped all these Asians living uptown. One day, she came over here and said there was this young Chinese man who wanted to meet an American family and learn to speak English better and was willing to cook for his room and board. Well, I'm always eager to have a cook, and especially a Chinese cook, because my family loves Chinese food. They could eat it seven days a week. So Sandra brought this man over here. His name was Shi Young. He was a graduate student at the Art Institute of Chicago." Shi Young lived with Lois and her family for two years, and during that time Chicago was in the midst of political turmoil. Harold Washington, who would later become the first black mayor of the city, was attempting to unseat the remains of the Daley political machine, and Lois's house, naturally, was the site of late-night, top-secret strategy sessions for the pro-Washington reformers of Chicago's North Side. "We'd have all these important people here, and Shi Young would come down and listen," Lois recalls. "I didn't think anything of it." But Shi Young, as it turns out, was going back up to his room and writing up what he heard for the China Youth Daily, a newspaper with a circulation in the tens of millions. Somehow, in the improbable way that the world works, a portal was opened up, connecting Chicago's North Side reform politics and the readers of the China Youth Daily, and that link was Lois's living room. You could argue that this was just a fluke -- just as it was a fluke that Isaac Asimov was in town and that Lois happened to be driving by when Cindy Mitchell came running out of her apartment. But sooner or later all those flukes begin to form a pattern.
3.
In the late nineteen-sixties, a Harvard social psychologist named Stanley Milgram conducted an experiment in an effort to find an answer to what is known as the small-world problem, though it could also be called the Lois Weisberg problem. It is this: How are human beings connected? Do we belong to separate worlds, operating simultaneously but autonomously, so that the links between any two people, anywhere in the world, are few and distant? Or are we all bound up together in a grand, interlocking web? Milgram's idea was to test this question with a chain letter. For one experiment, he got the names of a hundred and sixty people, at random, who lived in Omaha, Nebraska, and he mailed each of them a packet. In the packet was the name and address of a stockbroker who worked in Boston and lived in Sharon, Massachusetts. Each person was instructed to write his name on a roster in the packet and send it on to a friend or acquaintance who he thought would get it closer to the stockbroker. The idea was that when the letters finally arrived at the stockbroker's house Milgram could look at the roster of names and establish how closely connected someone chosen at random from one part of the country was to another person chosen at random in another part. Milgram found that most of the letters reached the stockbroker in five or six steps. It is from this experiment that we got the concept of six degrees of separation.
That phrase is now so familiar that it is easy to lose sight of how surprising Milgram's finding was. Most of us don't have particularly diverse groups of friends. In one well-known study, two psychologists asked people living in the Dyckman public-housing project, in uptown Manhattan, about their closest friend in the project; almost ninety per cent of the friends lived in the same building, and half lived on the same floor. In general, people chose friends of similar age and race. But if the friend lived down the hall, both age and race became a lot less important. Proximity overpowered similarity. Another study, involving students at the University of Utah, found that if you ask someone why he is friendly with someone else he'll say that it is because they share similar attitudes. But if you actually quiz the pairs of students on their attitudes you'll find out that this is an illusion, and that what friends really tend to have in common are activities. We're friends with the people we do things with, not necessarily with the people we resemble. We don't seek out friends; we simply associate with the people who occupy the same physical places that we do: People in Omaha are not, as a rule, friends with people who live in Sharon, Massachusetts. So how did the packets get halfway across the country in just five steps? "When I asked an intelligent friend of mine how many steps he thought it would take, he estimated that it would require 100 intermediate persons or more to move from Nebraska to Sharon," Milgram wrote. "Many people make somewhat similar estimates, and are surprised to learn that only five intermediaries will -- on the average -- suffice. Somehow it does not accord with intuition."
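One way to see both the surprise and the arithmetic plausibility of Milgram's result is a back-of-the-envelope calculation. The short Python sketch below is an illustration only, not anything from Milgram's paper, and the figure of a hundred acquaintances per person is an assumption made for the sake of the example.

# A back-of-the-envelope sketch, not from Milgram's paper. The figure of a
# hundred acquaintances per person is an illustrative assumption. If each
# forwarding step really reached a hundred people not already in the chain,
# the reachable population would grow exponentially with the number of steps:
ACQUAINTANCES = 100

for steps in range(1, 7):
    print(f"{steps} step(s): about {ACQUAINTANCES ** steps:,} people reachable in principle")

# By five steps the naive count (10,000,000,000) already exceeds the population
# of the planet. The puzzle is that real acquaintance circles overlap heavily,
# so each step adds far fewer genuinely new people than this count suggests --
# and yet five or six steps still turned out to be enough.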
The explanation is that in the six degrees of separation not all degrees are equal. When Milgram analyzed his experiments, for example, he found that many of the chains reaching to Sharon followed the same asymmetrical pattern. Twenty-four packets reached the stockbroker at his home, in Sharon, and sixteen of those were given to him by the same person, a clothing merchant whom Milgram calls Mr. Jacobs. The rest of the packets were sent to the stockbroker at his office, and of those the majority came through just two men, whom Milgram calls Mr. Brown and Mr. Jones. In all, half of the responses that got to the stockbroker were delivered to him by these three people. Think of it. Dozens of people, chosen at random from a large Midwestern city, sent out packets independently. Some went through college acquaintances. Some sent their packets to relatives. Some sent them to old workmates. Yet in the end, when all those idiosyncratic chains were completed, half of the packets passed through the hands of Jacobs, Jones, and Brown. Six degrees of separation doesn't simply mean that everyone is linked to everyone else in just six steps. It means that a very small number of people are linked to everyone else in a few steps, and the rest of us are linked to the world through those few.
There's an easy way to explore this idea. Suppose that you made a list of forty people whom you would call your circle of friends (not including family members or co-workers), and you worked backward from each person until you could identify who was ultimately responsible for setting in motion the series of connections which led to that friendship. I met my oldest friend, Bruce, for example, in first grade, so I'm the responsible party. That's easy. I met my college friend Nigel because he lived down the hall in the dormitory from Tom, whom I had met because in my freshman year he invited me to play touch football. Tom, then, is responsible for Nigel. Once you've made all the connections, you will find the same names coming up again and again. I met my friend Amy when she and her friend Katie came to a restaurant where I was having dinner. I know Katie because she is best friends with my friend Larissa, whom I know because I was told to look her up by a mutual friend, Mike A., whom I know because he went to school with another friend of mine, Mike H., who used to work at a political weekly with my friend Jacob. No Jacob, no Amy. Similarly, I met my friend Sarah S. at a birthday party a year ago because she was there with a writer named David, who was there at the invitation of his agent, Tina, whom I met through my friend Leslie, whom I know because her sister Nina is best friends with my friend Ann, whom I met through my old roommate Maura, who was my roommate because she had worked with a writer named Sarah L., who was a college friend of my friend Jacob. No Jacob, no Sarah S. In fact, when I go down my list of forty friends, thirty of them, in one way or another, lead back to Jacob. My social circle is really not a circle but an inverted pyramid. And the capstone of the pyramid is a single person, Jacob, who is responsible for an overwhelming majority of my relationships. Jacob's full name, incidentally, is Jacob Weisberg. He is Lois Weisberg's son.
This isn't to say, though, that Jacob is just like Lois. Jacob may be the capstone of my pyramid, but Lois is the capstone of lots and lots of people's pyramids, and that makes her social role different. In Milgram's experiment, Mr. Jacobs the clothing merchant was the person to go through to get to the stockbroker. Lois is the kind of person you would use to get to the stockbrokers of Sharon and also the cabaret singers of Sharon and the barkeeps of Sharon and the guy who gave up a thriving career in orthodontics to open a small vegetarian falafel hut.
4.
There is another way to look at this question, and that's through the popular parlor game Six Degrees of Kevin Bacon. The idea behind the game is to try to link in fewer than six steps any actor or actress, through the movies they've been in, to the actor Kevin Bacon. For example, O. J. Simpson was in "Naked Gun" with Priscilla Presley, who was in "The Adventures of Ford Fairlane" with Gilbert Gottfried, who was in "Beverly Hills Cop II" with Paul Reiser, who was in "Diner" with Kevin Bacon. That's four steps. Mary Pickford was in "Screen Snapshots" with Clark Gable, who was in "Combat America" with Tony Romano, who, thirty-five years later, was in "Starting Over" with Bacon. That's three steps. What's funny about the game is that Bacon, although he is a fairly young actor, has already been in so many movies with so many people that there is almost no one to whom he can't be easily connected. Recently, a computer scientist at the University of Virginia by the name of Brett Tjaden actually sat down and figured out what the average degree of connectedness is for the quarter million or so actors and actresses listed in the Internet Movie Database: he came up with 2.8312 steps. That sounds impressive, except that Tjaden then went back and performed an even more heroic calculation, figuring out what the average degree of connectedness was for everyone in the database. Bacon, it turns out, ranks only six hundred and sixty-eighth. Martin Sheen, by contrast, can be connected, on average, to every other actor, in 2.63681 steps, which puts him almost six hundred and fifty places higher than Bacon. Elliott Gould can be connected even more quickly, in 2.63601. Among the top fifteen are people like Robert Mitchum, Gene Hackman, Donald Sutherland, Rod Steiger, Shelley Winters, and Burgess Meredith.
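To make the arithmetic behind a "degree of connectedness" concrete, here is a minimal Python sketch of the same kind of calculation on a toy graph. It is not Tjaden's program: the only links included are the ones named in the chain above, the cast lists are trimmed to the actors mentioned, and real filmographies are vastly larger.

# A minimal sketch: actors are nodes, and two actors share an edge when they
# appeared in the same film. The cast lists below are illustrative fragments,
# not complete filmographies.
from collections import deque, defaultdict

films = {
    "Diner": ["Kevin Bacon", "Paul Reiser"],
    "Beverly Hills Cop II": ["Paul Reiser", "Gilbert Gottfried"],
    "The Adventures of Ford Fairlane": ["Gilbert Gottfried", "Priscilla Presley"],
    "Naked Gun": ["Priscilla Presley", "O. J. Simpson"],
}

# Build the actor-to-actor adjacency list.
graph = defaultdict(set)
for cast in films.values():
    for a in cast:
        for b in cast:
            if a != b:
                graph[a].add(b)

def average_connectedness(start):
    # Breadth-first search gives the number of steps to every reachable actor.
    dist = {start: 0}
    queue = deque([start])
    while queue:
        actor = queue.popleft()
        for co_star in graph[actor]:
            if co_star not in dist:
                dist[co_star] = dist[actor] + 1
                queue.append(co_star)
    steps = [d for actor, d in dist.items() if actor != start]
    return sum(steps) / len(steps)

print(average_connectedness("Kevin Bacon"))   # 2.5 on this toy graph

On the full Internet Movie Database, the same breadth-first averaging, run from every actor in turn, is the kind of computation that produces scores like Bacon's 2.8312.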
Why is Kevin Bacon so far behind these actors? Recently, in the journal Nature, the mathematicians Duncan Watts and Steven Strogatz published a dazzling theoretical explanation of connectedness, but a simpler way to understand this question is to look at who Bacon is. Obviously, he is a lot younger than the people at the top of the list are and has made fewer movies. But that accounts for only some of the difference. A top-twenty person, like Burgess Meredith, made a hundred and fourteen movies in the course of his career. Gary Cooper, though, starred in about the same number of films and ranks only eight hundred and seventy-eighth, with a 2.85075 score. John Wayne made a hundred and eighty-three movies in his fifty-year career and still ranks only a hundred and sixteenth, at 2.7173. What sets someone like Meredith apart is his range. More than half of John Wayne's movies were Westerns, and that means he made the same kind of movie with the same kind of actors over and over again. Burgess Meredith, by contrast, was in great movies, like the Oscar-winning "Of Mice and Men" (1939), and in dreadful movies, like "Beware! The Blob" (1972). He was nominated for an Oscar for his role in "The Day of the Locust" and also made TV commercials for Skippy peanut butter. He was in four "Rocky" movies, and also played Don Learo in Godard's "King Lear." He was in schlocky made-for-TV movies, in B movies that pretty much went straight to video, and in pictures considered modern classics. He was in forty-two dramas, twenty-two comedies, eight adventure films, seven action films, five sci-fi films, five horror flicks, five Westerns, five documentaries, four crime movies, four thrillers, three war movies, three films noir, two children's films, two romances, two mysteries, one musical, and one animated film. Burgess Meredith was the kind of actor who was connected to everyone because he managed to move up and down and back and forth among all the different worlds and subcultures that the acting profession has to offer. When we say, then, that Lois Weisberg is the kind of person who "knows everyone," we mean it in precisely this way. It is not merely that she knows lots of people. It is that she belongs to lots of different worlds.
In the nineteen-fifties, Lois started her drama troupe in Chicago. The daughter of a prominent attorney, she was then in her twenties, living in one of the suburbs north of the city with two small children. In 1956, she decided to stage a festival to mark the centenary of George Bernard Shaw's birth. She hit up the reclusive billionaire John D. MacArthur for money. ("I go to the Pump Room for lunch. Booth One. There is a man, lurking around a pillar, with a cowboy hat and dirty, dusty boots. It's him.") She invited William Saroyan and Norman Thomas to speak on Shaw's legacy; she put on Shaw plays in theatres around the city; and she got written up in Life. She then began putting out a newspaper devoted to Shaw, which mutated into an underground alternative weekly called the Paper. By then, Lois was living in a big house on Chicago's near North Side, and on Friday nights people from the Paper gathered there for editorial meetings. William Friedkin, who went on to direct "The French Connection" and "The Exorcist," was a regular, and so were the attorney Elmer Gertz (who won parole for Nathan Leopold) and some of the editors from Playboy, which was just up the street. People like Art Farmer and Thelonious Monk and Dizzy Gillespie and Lenny Bruce would stop by when they were in town. Bruce actually lived in Lois's house for a while. "My mother was hysterical about it, especially one day when she rang the doorbell and he answered in a bath towel," Lois told me. "We had a window on the porch, and he didn't have a key, so the window was always left open for him. There were a lot of rooms in that house, and a lot of people stayed there and I didn't know they were there." Pause. Puff. "I never could stand his jokes. I didn't really like his act. I couldn't stand all the words he was using."
Lois's first marriage -- to a drugstore owner named Leonard Solomon -- was breaking up around this time, so she took a job doing public relations for an injury-rehabilitation institute. From there, she went to work for a public-interest law firm called B.P.I., and while she was at B.P.I. she became concerned about the fact that Chicago's parks were neglected and crumbling, so she gathered together a motley collection of nature lovers, historians, civic activists, and housewives, and founded the lobbying group Friends of the Parks. Then she became alarmed on discovering that a commuter railroad that ran along the south shore of Lake Michigan -- from South Bend to Chicago -- was about to shut down, so she gathered together a motley collection of railroad enthusiasts and environmentalists and commuters, and founded South Shore Recreation, thereby saving the railroad. Lois loved the railroad buffs. "They were all good friends of mine," she says. "They all wrote to me. They came from California. They came from everywhere. We had meetings. They were really interesting. I came this close" -- and here she held her index finger half an inch above her thumb -- "to becoming one of them." Instead, though, she became the executive director of the Chicago Council of Lawyers, a progressive bar association. Then she ran Congressman Sidney Yates's reëlection campaign. Then her sister June introduced her to someone who got her the job with Mayor Washington. Then she had her flea-market period. Finally, she went to work for Mayor Daley as Chicago's Commissioner of Cultural Affairs.
If you go through that history and keep count, the number of worlds that Lois has belonged to comes to eight: the actors, the writers, the doctors, the lawyers, the park lovers, the politicians, the railroad buffs, and the flea-market aficionados. When I asked Lois to make her own list, she added musicians and the visual artists and architects and hospitality-industry people whom she works with in her current job. But if you looked harder at Lois's life you could probably subdivide her experiences into fifteen or twenty worlds. She has the same ability to move among different subcultures and niches that the busiest actors do. Lois is to Chicago what Burgess Meredith is to the movies.
Lois was, in fact, a friend of Burgess Meredith. I learned this by accident, which is the way I learned about most of the strange celebrity details of Lois's life, since she doesn't tend to drop names. It was when I was with her at her house one night, a big, rambling affair just off the lakeshore, with room after room filled with odds and ends and old photographs and dusty furniture and weird bric-a-brac, such as a collection of four hundred antique egg cups. She was wearing bluejeans and a flowery-print top and she was smoking Carlton Menthol 100s and cooking pasta and holding forth to her son Joe on the subject of George Bernard Shaw, when she started talking about Burgess Meredith. "He was in Chicago in a play called 'Teahouse of the August Moon,' in 1956," she said, "and he came to see my production of 'Back to Methuselah,' and after the play he came up to me and said he was teaching acting classes, and asked would I come and talk to his class about Shaw. Well, I couldn't say no." Meredith liked Lois, and when she was running her alternative newspaper he would write letters and send in little doodles, and later she helped him raise money for a play he was doing called "Kicks and Company." It starred a woman named Nichelle Nichols, who lived at Lois's house for a while. "Nichelle was a marvellous singer and dancer," Lois said. "She was the lead. She was also the lady on the first..." Lois was doing so many things at once -- chopping and stirring and smoking and eating and talking -- that she couldn't remember the name of the show that made Nichols a star. "What's that space thing?" She looked toward Joe for help. He started laughing. "Star something," she said. "'Star...Star Trek'! Nichelle was Lieutenant Uhura!"
5.
On a sunny morning not long ago, Lois went to a little café just off the Magnificent Mile, in downtown Chicago, to have breakfast with Mayor Daley. Lois drove there in a big black Mercury, a city car. Lois always drives big cars, and, because she is so short and the cars are so big, all that you can see when she drives by is the top of her frizzy blond head and the lighted ember of her cigarette. She was wearing a short skirt and a white vest and was carrying a white cloth shopping bag. Just what was in the bag was unclear, since Lois doesn't have a traditional relationship to the trappings of bureaucracy. Her office, for example, does not have a desk in it, only a sofa and chairs and a coffee table. At meetings, she sits at the head of a conference table in the adjoining room, and, as often as not, has nothing in front of her except a lighter, a pack of Carltons, a cup of coffee, and an octagonal orange ceramic ashtray, which she moves a few inches forward or a few inches back when she's making an important point, or moves a few inches to the side when she is laughing at something really funny and feels the need to put her head down on the table.
Breakfast was at one of the city's tourist centers. The Mayor was there in a blue suit, and he had two city officials by his side and a very serious and thoughtful expression on his face. Next to him was a Chicago developer named Al Friedman, a tall and slender and very handsome man who is the chairman of the Commission on Chicago Landmarks. Lois sat across from them, and they all drank coffee and ate muffins and batted ideas back and forth in the way that people do when they know each other very well. It was a "power breakfast," although if you went around the table you'd find that the word "power" meant something very different to everyone there. Al Friedman is a rich developer. The Mayor, of course, is the administrative leader of one of the largest cities in the country. When we talk about power, this is usually what we're talking about: money and authority. But there is a third kind of power as well -- the kind Lois has -- which is a little less straightforward. It's social power.
At the end of the nineteen-eighties, for example, the City of Chicago razed an entire block in the heart of downtown and then sold it to a developer. But before he could build on it the real-estate market crashed. The lot was an eyesore. The Mayor asked for ideas about what to do with it. Lois suggested that they cover the block with tents. Then she heard that Keith Haring had come to Chicago in 1989 and worked with Chicago high-school students to create a giant five-hundred-foot-long mural. Lois loved the mural. She began to think. She'd long had a problem with the federal money that Chicago got every year to pay for summer jobs for disadvantaged kids. She didn't think it helped any kid to be put to work picking up garbage. So why not pay the kids to do arts projects like the Haring mural, and put the whole program in the tents? She called the program Gallery 37, after the number of the block. She enlisted the help of the Mayor's wife, Maggie Daley, whose energy and clout were essential in order to make the program a success. Lois hired artists to teach the kids. She realized, though, that the federal money was available only for poor kids, and, Lois says, "I don't believe poor kids can advance in any way by being lumped together with other poor kids." So Lois raised money privately to bring in middle-income kids, to mix with the poor kids and be put in the tents with the artists. She started small, with two hundred and sixty "apprentices" the first year, 1990. This year, there were more than three thousand. The kids study sculpture, painting, drawing, poetry, theatre, graphic design, dance, textile design, jewelry-making, and music. Lois opened a store downtown, where students' works of art are sold. She has since bought two buildings to house the project full time. She got the Parks Department to run Gallery 37 in neighborhoods around the city, and the Board of Education to let them run it as an after-school program in public high schools. It has been copied all around the world. Last year, it was given the Innovations in American Government Award by the Ford Foundation and the Harvard school of government.
Gallery 37 is at once a jobs program, an arts program, a real-estate fix, a schools program, and a parks program. It involves federal money and city money and private money, stores and buildings and tents, Maggie Daley and Keith Haring, poor kids and middle-class kids. It is everything, all at once -- a jumble of ideas and people and places which Lois somehow managed to make sense of. The ability to assemble all these disparate parts is, as should be obvious, a completely different kind of power from the sort held by the Mayor and Al Friedman. The Mayor has key allies on the city council or in the statehouse. Al Friedman can do what he does because, no doubt, he has a banker who believes in him, or maybe a lawyer whom he trusts to negotiate the twists and turns of the zoning process. Their influence is based on close relationships. But when Lois calls someone to help her put together one of her projects, chances are she's not calling someone she knows particularly well. Her influence suggests something a little surprising -- that there is also power in relationships that are not close at all.
6.
The sociologist Mark Granovetter examined this question in his classic 1974 book "Getting a Job." Granovetter interviewed several hundred professional and technical workers from the Boston suburb of Newton, asking them in detail about their employment history. He found that almost fifty-six per cent of those he talked to had found their jobs through a personal connection, about twenty per cent had used formal means (advertisements, headhunters), and another twenty per cent had applied directly. This much is not surprising: the best way to get in the door is through a personal contact. But the majority of those personal connections, Granovetter found, did not involve close friends. They were what he called "weak ties." Of those who used a contact to find a job, for example, only 16.7 per cent saw that contact "often," as they would have if the contact had been a good friend; 55.6 per cent saw their contact only "occasionally"; and 27.8 per cent saw the contact "rarely." People were getting their jobs not through their friends but through acquaintances.
Granovetter argues that when it comes to finding out about new jobs -- or, for that matter, gaining new information, or looking for new ideas -- weak ties tend to be more important than strong ties. Your friends, after all, occupy the same world that you do. They work with you, or live near you, and go to the same churches, schools, or parties. How much, then, do they know that you don't know? Mere acquaintances, on the other hand, are much more likely to know something that you don't. To capture this apparent paradox, Granovetter coined a marvellous phrase: "the strength of weak ties." The most important people in your life are, in certain critical realms, the people who aren't closest to you, and the more people you know who aren't close to you the stronger your position becomes.
Granovetter then looked at what he called "chain lengths" -- that is, the number of people who had to pass along the news about your job before it got to you. A chain length of zero means that you learned about your job from the person offering it. A chain length of one means that you heard about the job from someone who had heard about the job from the employer. The people who got their jobs from a zero chain were the most satisfied, made the most money, and were unemployed for the shortest amount of time between jobs. People with a chain of one stood second in the amount of money they made, in their satisfaction with their jobs, and in the speed with which they got their jobs. People with a chain of two stood third in all three categories, and so on. If you know someone who knows someone who knows someone who has lots of acquaintances, in other words, you have a leg up. If you know someone who knows someone who has lots of acquaintances, your chances are that much better. But if you know someone who has lots of acquaintances -- if you know someone like Lois -- you are still more fortunate, because suddenly you are just one step away from musicians and actors and doctors and lawyers and park lovers and politicians and railroad buffs and flea-market aficionados and all the other weak ties that make Lois so strong.
This sounds like a reformulation of the old saw that it's not what you know, it's who you know. It's much more radical than that, though. The old idea was that people got ahead by being friends with rich and powerful people -- which is true, in a limited way, but as a practical lesson in how the world works is all but useless. You can expect that Bill Gates's godson is going to get into Harvard and have a fabulous job waiting for him when he gets out. And, of course, if you play poker with the Mayor and Al Friedman it is going to be a little easier to get ahead in Chicago. But how many godsons can Bill Gates have? And how many people can fit around a poker table? This is why affirmative action seems pointless to so many people: It appears to promise something -- entry to the old-boy network -- that it can't possibly deliver. The old-boy network is always going to be just for the old boys.
Granovetter, by contrast, argues that what matters in getting ahead is not the quality of your relationships but the quantity -- not how close you are to those you know but, paradoxically, how many people you know whom you aren't particularly close to. What he's saying is that the key person at that breakfast in downtown Chicago is not the Mayor or Al Friedman but Lois Weisberg, because Lois is the kind of person who it really is possible for most of us to know. If you think about the world in this way, the whole project of affirmative action suddenly starts to make a lot more sense. Minority-admissions programs work not because they give black students access to the same superior educational resources as white students, or access to the same rich cultural environment as white students, or any other formal or grandiose vision of engineered equality. They work by giving black students access to the same white students as white students -- by allowing them to make acquaintances outside their own social world and so shortening the chain lengths between them and the best jobs.
This idea should also change the way we think about helping the poor. When we're faced with an eighteen-year-old high-school dropout whose only career option is making five dollars and fifty cents an hour in front of the deep fryer at Burger King, we usually talk about the importance of rebuilding inner-city communities, attracting new jobs to depressed areas, and re-investing in neglected neighborhoods. We want to give that kid the option of another, better-paying job, right down the street. But does that really solve his problem? Surely what that eighteen-year-old really needs is not another marginal inducement to stay in his neighborhood but a way to get out of his neighborhood altogether. He needs a school system that provides him with the skills to compete for jobs with middle-class kids. He needs a mass-transit system to take him to the suburbs, where the real employment opportunities are. And, most of all, he needs to know someone who knows someone who knows where all those good jobs are. If the world really is held together by people like Lois Weisberg, in other words, how poor you are can be defined quite simply as how far you have to go to get to someone like her. Wendy Willrich and Helen Doria and all the countless other people in Lois's circle needed to make only one phone call. They are well-off. The dropout wouldn't even know where to start. That's why he's poor. Poverty is not deprivation. It is isolation.
7.
I once met a man named Roger Horchow. If you ever go to Dallas and ask around about who is the kind of person who might know everyone, chances are you will be given his name. Roger is slender and composed. He talks slowly, with a slight Texas drawl. He has a kind of wry, ironic charm that is utterly winning. If you sat next to him on a plane ride across the Atlantic, he would start talking as the plane taxied to the runway, you would be laughing by the time the seat-belt sign was turned off, and when you landed at the other end you'd wonder where the time had gone.
I met Roger through his daughter Sally, whose sister Lizzie went to high school in Dallas with my friend Sara M., whom I know because she used to work with Jacob Weisberg. (No Jacob, no Roger.) Roger spent at least part of his childhood in Ohio, which is where Lois's second husband, Bernie Weisberg, grew up, so I asked Roger if he knew Bernie. It would have been a little too apt if he did -- that would have made it all something out of "The X-Files" -- but instead of just answering, "Sorry, I don't," which is what most of us would have done, he paused for a long time, as if to flip through the "W"s in his head, and then said, "No, but I'm sure if I made two phone calls..."
Roger has a very good memory for names. One time, he says, someone was trying to talk him into investing his money in a business venture in Spain, and when he asked the names of the other investors he recognized one of them as the same man with whom one of his ex-girlfriends had had a fling during her junior year abroad, fifty years before. Roger sends people cards on their birthdays: he has a computerized Rolodex with sixteen hundred names on it. When I met him, I became convinced that these techniques were central to the fact that he knew everyone -- that knowing everyone was a kind of skill. Horchow is the founder of the Horchow Collection, the first high-end mail-order catalogue, and I kept asking him how all the connections in his life had helped him in the business world, because I thought that this particular skill had to have been cultivated for a reason. But the question seemed to puzzle him. He didn't think of his people collection as a business strategy, or even as something deliberate. He just thought of it as something he did -- as who he was. One time, Horchow said, a close friend from childhood suddenly resurfaced. "He saw my catalogue and knew it had to be me, and when he was out here he showed up on my doorstep. I hadn't seen him since I was seven. We had zero in common. It was wonderful." The juxtaposition of those last two sentences was not ironic; he meant it.
In the book "The Language Instinct," the psychologist Steven Pinker argues against the idea that language is a cultural artifact -- something that we learn "the way we learn to tell time." Rather, he says, it is innate. Language develops "spontaneously," he writes, "without conscious effort or formal instruction," and "is deployed without awareness of its underlying logic.... People know how to talk in more or less the sense that spiders know how to spin webs." The secret to Roger Horchow and Lois Weisberg is, I think, that they have a kind of social equivalent of that instinct -- an innate and spontaneous and entirely involuntary affinity for people. They know everyone because -- in some deep and less than conscious way -- they can't help it.
8.
Once, in the very early nineteen-sixties, after Lois had broken up with her first husband, she went to a party for Ralph Ellison, who was then teaching at the University of Chicago. There she spotted a young lawyer from the South Side named Bernie Weisberg. Lois liked him. He didn't notice her, though, so she decided to write a profile of him for the Hyde Park Herald. It ran with a huge headline. Bernie still didn't call. "I had to figure out how I was going to get to meet him again, so I remembered that he was standing in line at the reception with Ralph Ellison," Lois says. "So I called up Ralph Ellison" -- whom she had never met -- "and said, 'It's so wonderful that you are in Chicago. You really should meet some people on the North Side. Would it be O.K. if I have a party for you?'" He said yes, and Lois sent out a hundred invitations, including one to Bernie. He came. He saw Dizzy Gillespie in the kitchen and Ralph Ellison in the living room. He was impressed. He asked Lois to go with him to see Lenny Bruce. Lois was mortified; she didn't want this nice Jewish lawyer from the South Side to know that she knew Lenny Bruce, who was, after all, a drug addict. "I couldn't get out of it," she said. "They sat us down at a table right at the front, and Lenny keeps coming over to the edge of the stage and saying" -- here Lois dropped her voice down very low -- "'Hello, Lois.' I was sitting there like this." Lois put her hands on either side of her face. "Finally I said to Bernie, 'There are some things I should tell you about. Lenny Bruce is a friend of mine. He's staying at my house. The second thing is I'm defending a murderer.'" (But that's another story.) Lois and Bernie were married a year later.
The lesson of this story isn't obvious until you diagram it culturally: Lois got to Bernie through her connections with Ralph Ellison and Lenny Bruce, one of whom she didn't know (although later, naturally, they became great friends) and one of whom she was afraid to say that she knew, and neither of whom, it is safe to speculate, had ever really been connected with each other before. It seems like an absurdly roundabout way to meet someone. Here was a thirtyish liberal Jewish intellectual from the North Side of Chicago trying to meet a thirtyish liberal Jewish intellectual from the South Side of Chicago, and to get there she charted a cross-cultural social course through a black literary lion and an avant-garde standup comic. Yet that's a roundabout journey only if you perceive the worlds of Lenny Bruce and Ralph Ellison and Bernie Weisberg to be impossibly isolated. If you don't -- if, like Lois, you see them all as three points of an equilateral triangle -- then it makes perfect sense. The social instinct makes everyone seem like part of a whole, and there is something very appealing about this, because it means that people like Lois aren't bound by the same categories and partitions that defeat the rest of us. This is what the power of the people who know everyone comes down to in the end. It is not -- as much as we would like to believe otherwise -- something rich and complex, some potent mixture of ambition and energy and smarts and vision and insecurity. It's much simpler than that. It's the same lesson they teach in Sunday school. Lois knows lots of people because she likes lots of people. And all those people Lois knows and likes invariably like her, too, because there is nothing more irresistible to a human being than to be unqualifiedly liked by another.
Not long ago, Lois took me to a reception at the Museum of Contemporary Art, in Chicago -- a brand-new, Bauhaus-inspired building just north of the Loop. The gallery space was impossibly beautiful -- cool, airy, high-ceilinged. The artist on display was Chuck Close. The crowd was sleek and well groomed. Black-clad young waiters carried pesto canapés and glasses of white wine. Lois seemed a bit lost. She can be a little shy sometimes, and at first she stayed on the fringes of the room, standing back, observing. Someone important came over to talk to her. She glanced up uncomfortably. I walked away for a moment to look at the show, and when I came back her little corner had become a crowd. There was her friend from the state legislature. A friend in the Chicago Park District. A friend from her neighborhood. A friend in the consulting business. A friend from Gallery 37. A friend from the local business-development group. And on and on. They were of all ages and all colors, talking and laughing, swirling and turning in a loose circle, and in the middle, nearly hidden by the commotion, was Lois, clutching her white bag, tiny and large-eyed, at that moment the happiest person in the room.
Running from Ritalin
February 2, 1999
BOOKS
Is the hectic pace of contemporary
life really to blame for A.D.D.? Not so fast.
1.
There has always been a temptation in American culture to think of drugs as social metaphors. In the early sixties, the pharmaceutical metaphor for the times was Valium. During the sexual revolution, it was the Pill, and that was followed, in quick succession, by marijuana in the nineteen-seventies, cocaine in the nineteen-eighties, and Prozac in the early nineteen-nineties. Today, of course, the drug that has come to symbolize our particular predicaments is Ritalin, the widely prescribed treatment for attention-deficit hyperactivity disorder, or attention-deficit disorder, as it is more popularly known. In his new book, "The Hyperactivity Hoax," the neuropsychiatrist Sydney Walker calls attention disorders and the rise of Ritalin "symptoms of modern life, rather than symptoms of modern disease." In "Ritalin Nation" the psychologist Richard DeGrandpre argues that Ritalin and A.D.H.D. are the inevitable by-products of a culture-wide addiction to speed--to cellular phones and beepers and faxes and overnight mail and computers with powerful chips and hard-driving rock music and television shows that splice together images at hundredth-of-a-second intervals, and a thousand other social stimulants that have had the effect of transforming human expectations. The soaring use of Ritalin, the physician Lawrence Diller concludes in his new book, "Running on Ritalin," "reveals something about the kind of society we are at the turn of the millennium.... It throws a spotlight on some of our most sensitive issues: what kind of parents we are, what kind of schools we have, what kind of health care is available to us. It brings into question our cultural standards for behavior, performance, and punishment; it reaches into the workplace, the courts and the halls of Congress. It highlights the most basic psychological conundrum of nature versus nurture, and it raises fundamental philosophical questions about the nature of free will and responsibility."
In a recent Time cover story on Ritalin, the mother of a child with A.D.H.D. is described as tearing up her daughter's Ritalin prescription. "I thought, maybe there is something else we can do," she says. "I knew that medicine can mask things." That is the kind of question that Ritalin provokes--not the simple, traditional "Will this cure my child?" but the harder, postmodern question "In curing my child, what deeper pathology might this drug be hiding?"
It's important that we ask questions like this, particularly of drugs that are widely used. The problem with Ritalin is that many of the claims made to support the drug's status as a symbol turn out, on closer examination, to be vague or confusing. Diller, DeGrandpre, and Walker are all, for example, deeply suspicious of our reliance on Ritalin. They think that it is overprescribed--that it is being used to avoid facing broader questions about our values and our society. This sounds plausible: the amount of Ritalin consumed in the United States has more than tripled since 1990. Then again, it has been only in the last ten years that clinical trials have definitively proved that Ritalin is effective in treating A.D.H.D. And, even with that dramatic increase, the share of American children taking Ritalin is estimated at one or two per cent. Given that most estimates put the incidence of A.D.H.D. at between three and five per cent, are too many children taking the drug--or too few? "You really run into problems with teen-agers," William Pelham, a professor of psychology at SUNY Buffalo and a prominent A.D.H.D. expert, told me. "They don't want to take this medication. They don't feel they need to. It's part of the oppositional stuff you run into. The kids whom you most want to take it are the ones who are aggressive, and they are the most likely to blow it off."
Or consider how A.D.H.D. is defined. According to the Diagnostic and Statistical Manual-IV, a child has A.D.H.D. if, for a period of six months, he or she exhibits at least six symptoms from a list of behavioral signs. Among them: "often has difficulty organizing tasks and activities," "often does not seem to listen when spoken to directly," "is often easily distracted by extraneous stimuli," "is often 'on the go' or acts as if 'driven by a motor,'" "often blurts out answers before questions have been completed," and so on. "Ritalin Nation" argues that all these are essentially symptoms of boredom--the impatience of those used to the rapid-fire pace of MTV, Nintendo, and the rest of contemporary culture. The A.D.H.D. child blurts out answers before questions have been completed because, DeGrandpre says, "listening is usually a waiting situation that provides a low level of stimulation." The A.D.H.D. child is easily distracted because, "by definition, extraneous stimuli are novel." Give A.D.H.D. kids something novel to do, something that can satisfy their addiction, DeGrandpre argues, and they'll be fine. Diller works with a different definition of A.D.H.D. but comes to some of the same conclusions. High-stimulus activities like TV and video games "constitute a strange sort of good-fit situation for distractible children," he writes. "These activities are among the few things they can concentrate on well."
2.
When A.D.H.D. kids are actually tested on activities like video games, however, this alleged "good fit" disappears. Rosemary Tannock, a behavioral scientist at the Hospital for Sick Children, in Toronto, recently looked at how well a group of boys between the ages of eight and twelve actually did at Pac Man and Super Mario World, and she found that the ones with A.D.H.D. completed fewer levels and had to restart more games than their unaffected peers. "They often failed to inhibit their forward trajectory and crashed headlong into obstacles," she explained. A.D.H.D. kids may like the stimulation of a video game, but that doesn't mean they can handle it. Tannock has also given a group of A.D.H.D. children what's called a letter-naming test. The child is asked to read as quickly as he can five rows of letters, each of which consists of five letters repeated in different orders--"A, B, C, D, E," for example, followed by "D, E, B, A, C," and so on. A normal eight-year-old might take twenty-five seconds to complete the list. His counterpart with attention deficit might take thirty-five seconds, which is the kind of performance usually associated with dyslexia. "Some of our most articulate [A.D.H.D.] youngsters describe how doing this test is like speaking a foreign language in a foreign land," Tannock told me. "You get exhausted. That's how they feel. They have a thousand different ideas crowding into their heads at the same time." This doesn't sound like a child attuned to the quicksilver rhythms of the modern age. This sounds like a garden-variety learning disorder.
What further confounds the culture-of-Ritalin school is that A.D.H.D. turns out to have a considerable genetic component. As a result of numerous studies of twins conducted around the world over the past decade, scientists now estimate that A.D.H.D. is about seventy per cent heritable. This puts it up there with the most genetically influenced of traits--traits such as blood pressure, height, and weight. Meanwhile, the remaining thirty per cent--the environmental contribution to the disorder--seems to fall under what behavioral geneticists call "non-shared environment," meaning that it is likely to be attributable to such factors as fetal environment or illness and injury rather than factors that siblings share, such as parenting styles or socioeconomic class. That's why the way researchers describe A.D.H.D. has changed over the past decade. There is now less discussion of the role of bad parents, television, and diet and a lot more discussion of neurology and the role of specific genes.
This doesn't mean that there is no social role at all in the expression of A.D.H.D. Clearly, something has happened to make us all suddenly more aware of the disorder. But when, for instance, Diller writes that "the conditions that have fueled the A.D.D. epidemic and the Ritalin boom" will not change until "America somehow regains its balance between material gain and emotional and spiritual satisfaction," it's clear that he is working with a definition of A.D.H.D. very different from that of the scientific mainstream. In fact, books like "Running on Ritalin" and "Ritalin Nation" don't seem to have a coherent definition of A.D.H.D. at all. This is what is so confusing about the popular debate over this disorder: it's backward. We've become obsessed with what A.D.H.D. means. Don't we first have to figure out what it is?
3.
One of the tests researchers give to children with A.D.H.D. is called a stop-signal task. A child sits down at a computer and is told to hit one key if he sees an "X" on the screen and another key if he sees an "O." If he hears a tone, however, he is to refrain from hitting the key. By changing the timing of the tone--playing it just before or just as or just a millisecond after the "X" or "O" appears on the screen--you can get a very good idea of how well someone reacts. "Kids with A.D.H.D. have a characteristically longer reaction time," Gordon Logan, a cognitive psychologist at the University of Illinois, told me. "They're fifty per cent slower than other kids." Unless the tone is played very early, giving them plenty of warning, they can't stop themselves from hitting the keys.
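Since the passage describes what is essentially a small computer program, a sketch may help. The Python code below is an illustration only, not Logan's laboratory software: the timings are invented, and each trial is treated as a simple race between the "go" response and the "stop" process, with the A.D.H.D. subject's stop process set roughly fifty per cent slower, in line with the figure Logan cites.

# A hedged sketch of stop-signal logic, with made-up timings. Each trial is a
# race: if the stop process finishes before the "go" response, the keypress is
# withheld; otherwise the subject fails to inhibit it.
import random

def trial(stop_delay, stop_speed):
    go_finish = random.gauss(0.50, 0.05)              # seconds to press the key
    stop_finish = stop_delay + random.gauss(stop_speed, 0.03)
    return stop_finish < go_finish                    # True = successfully inhibited

def inhibition_rate(stop_delay, stop_speed, n=10_000):
    return sum(trial(stop_delay, stop_speed) for _ in range(n)) / n

for delay in (0.0, 0.1, 0.2, 0.3):
    typical = inhibition_rate(delay, stop_speed=0.20)  # typical child (illustrative)
    adhd = inhibition_rate(delay, stop_speed=0.30)     # ~50 per cent slower to stop
    print(f"tone at {delay:.1f}s  typical: {typical:.0%} stopped   A.D.H.D.: {adhd:.0%} stopped")

Run with these illustrative numbers, the simulation shows the pattern the paragraph describes: the slower stop process keeps up only when the tone comes very early, and fails more and more often as the warning shrinks.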
The results may seem a relatively trivial matter--these are differences measured in fractions of a second, after all. But for many researchers the idea that children with A.D.H.D. lack some fundamental ability to inhibit themselves, to stop a pre-programmed action, is at the heart of the disorder. Suppose, for example, that you have been given a particularly difficult math problem. Your immediate, impulsive response might be to throw down your pencil in frustration. But most of us wouldn't do that. We would check those impulses, and try to slog our way through the problem, and, with luck, maybe get it right. Part of what it takes to succeed in a complex world, in other words, is the ability to inhibit our impulses. But the child with A.D.H.D., according to the official diagnosis, "often does not follow through on instructions and fails to finish schoolwork, chores, or duties in the workplace" and "often runs about or climbs excessively in situations in which it is inappropriate." He cannot apply himself because he cannot regulate his behavior in a consistent manner. He is at the mercy of the temptations and distractions in his immediate environment. "It's not that a child or an individual is always hyperactive or always inattentive or distracted," Tannock says. "The same individual can one minute be restless and fidgeting or the next minute lethargic or yawning. The individual can be overfocussed one minute and incredibly distractible the next. It is this variability, from day to day and moment to moment, that is the most robust finding we have."
Russell Barkley, a professor of psychiatry at the University of Massachusetts at Worcester, has done experiments that look at the way A.D.H.D. kids experience time, and the results demonstrate how this basic problem with self-regulation can have far-reaching consequences. In one experiment, he turns on a light for a predetermined length of time and then asks a child to turn the light back on and off for what the child guesses to be the same interval. Children without A.D.H.D. perform fairly consistently. At twelve seconds, for example, their guesses are just a little low. At thirty-six seconds, they are slightly less accurate--still on the low side--and at sixty seconds their guesses are coming in at about fifty seconds. A.D.H.D. kids, on the other hand, are terrible at this game. At twelve seconds, they are well over; apparently, twelve seconds seems much, much longer to them. But at sixty seconds their guesses are much lower than everyone else's; apparently, the longer interval is impossible to comprehend. The consequences of having so profoundly subjective a sense of time are obvious. It's no surprise that people with A.D.H.D. often have problems with punctuality and with patience. An accurate sense of time is a function of a certain kind of memory--an ability to compare the duration of ongoing events with that of past events, so that a red light doesn't seem like an outrageous imposition, or five minutes doesn't seem so impossibly long that you can imagine getting from one side of town to the other in that amount of time. Time is about imposing order, about exercising control over one's perceptions, and that's something that people with attention deficit have trouble with.
This way of thinking about A.D.H.D. clarifies some of the more confusing aspects of the disorder. In DeGrandpre's formulation, the A.D.H.D. child can't follow through on instructions or behaves inappropriately because there isn't enough going on in his environment. What the inhibition theory implies is the opposite: that the A.D.H.D. child can't follow through or behaves inappropriately because there is too much going on; he falters in situations that require him to exercise self-control and his higher cognitive skills. DeGrandpre cannot explain why A.D.H.D. kids like video games but are also so bad at them. Shouldn't they thrive in that most stimulating of environments? If their problem is self-control, that apparent contradiction makes perfect sense. The A.D.H.D. child likes video games because they permit--even encourage--him to play impulsively. But he's not very good at them because to succeed at Pac Man or Super Mario World a child must learn to overcome the temptation posed by those games to respond impulsively to every whiz and bang: the child has to learn to stop and think (ever so quickly) before reacting.
At the same time, this theory makes it a lot clearer what kind of problem A.D.H.D. represents. The fact that children with the disorder can't finish the hard math problem doesn't mean that they're not smart enough to know the answer. It means they can't focus long enough to get to the answer. As Barkley puts it, A.D.H.D. is a problem not of knowing what you should do but, rather, of doing what you know. Motivation and memory and higher cognitive skills are intact in people with attention deficit. "But they are secondarily delayed," Barkley says. "They have no chance. They are rarely engaged and highly ineffective, because impulsive actions take precedence." The inability to stop pressing that "X" or "O" key ends up causing much more serious problems down the road.
This way of thinking about A.D.H.D. also demystifies Ritalin. Implicit in the popular skepticism about the drug has always been the idea that you cannot truly remedy something as complicated as A.D.H.D. with a pill. That's why the mother quoted in the Time story ripped up her child's Ritalin prescription, and why Diller places so much emphasis on the need for "real" social and spiritual solutions. But if A.D.H.D. is merely a discrete problem in inhibition, why couldn't Ritalin be a complete solution? People with A.D.H.D. don't need a brain overhaul. They just need a little help with stopping.
4.
There is another way to look at the A.D.H.D.-Ritalin question, which is known as the dopamine theory. This is by no means a conclusive account of A.D.H.D., but it may help clarify some of the issues surrounding the disorder. Dopamine is the chemical in the brain--the neurotransmitter--that appears to play a major role in things like attention and inhibition. When you tackle a difficult task or pay attention to a complex social situation, you are essentially generating dopamine in the parts of the brain that deal with higher cognitive tasks. If you looked at a thousand people at random, you would find a huge variation in their dopamine systems, just as you would if you looked at, say, blood pressure in a random population. A.D.H.D., according to this theory, is the name we give to people whose dopamine falls at the lower end of the scale, the same way we say that people suffer from hypertension if their blood pressure is above a certain point. In order to get normal levels of attention and inhibition, you have to produce normal levels of dopamine.
This is what Ritalin does. Dopamine is manufactured in the brain by special receptors, and each of those receptors has a "transport," a kind of built-in vacuum cleaner that sucks up any excess dopamine floating around and stores it inside the neuron. Ritalin shuts down that transport, so the amount of dopamine available for cognition remains higher than it would be otherwise. In about sixty-five per cent of those who take the drug, Ritalin appears to make them "normal," and in an additional ten per cent it appears to bring about substantial improvement. It does have a few minor side effects--appetite loss and insomnia, in some users--but by and large it's a remarkably safe drug, with remarkably specific effects.
So what does the fact that we seem to be relying more and more on Ritalin mean? The beginning of the answer, I think, lies in the fact that Ritalin is not the only drug in existence that enhances dopamine. Cocaine affects the brain in almost exactly the same way. Nicotine, too, is a dopamine booster, although its mechanism is somewhat different. Obviously, taking Ritalin doesn't have the same consequences as snorting cocaine or smoking a cigarette. It's not addictive, and its effect is a lot more specific. Still, nicotine, cocaine, and Ritalin are all performing the same basic neurological function.
What, for instance, was the appeal of cocaine at the beginning of the coke epidemic of the eighties? It was a feel-good drug. But it was a feel-good drug of a certain kind--a drug that people thought would help them master the complexity and the competitive pressures of the world around them. In the now infamous Time story on cocaine that ran in the summer of 1981, there is a picture of a "freelance artist" in Manhattan doing lines on his lunch break, with the caption "Feeling stronger, smarter, faster, more able to cope." Cocaine, the article begins, "is becoming the all-American drug," and its popularity, in the words of one expert, is a symptom of the fact that "right from childhood in this country there is pressure for accomplishment." At the moment of its greatest popularity, cocaine was considered a thinking drug, an achievement drug, a drug for the modern world. Does that sound familiar?
Nicotine has a similar profile. Cigarettes aid concentration. Understandably, this isn't a fact that has received much publicity in recent years. But there are plenty of data showing that nicotine does exactly what you would expect a dopamine enhancer to do. In one experiment, for example, smokers were given three minutes to read randomly ordered letters, in rows of thirty, and cross out the letter "e" every time they encountered it. The smokers took the test twice, before and after smoking a cigarette, and, on average, they were able to read 161.5 more letters--or more than five extra lines--after smoking than before. It's no surprise that this test sounds a lot like the test that A.D.H.D. kids do so poorly on, because we are really talking about the same set of cognitive skills--the ability to concentrate and screen out distractions. Numerous studies have shown that children with A.D.H.D. are much more likely to smoke and take illegal drugs in later life; what the dopamine theory suggests is that many people resort to such substances as a way of medicating themselves. Nora Volkow, the chairman of medicine at Brookhaven National Laboratory, says that between ten and twenty per cent of drug addicts have A.D.H.D. "In studies, when they were given Ritalin they would stop taking cocaine," she told me. Timothy Wilens, a psychiatrist at Harvard Medical School, presented data at a recent National Institutes of Health conference on A.D.H.D. which showed that treating A.D.H.D. kids with Ritalin and the like lowered the risk of their developing drug problems in adolescence by an extraordinary sixty-eight per cent. Among people with dopamine deficits, Ritalin is becoming a safe pharmaceutical alternative to the more dangerous dopamine boosters of the past.
Here, surely, is one of the deeper implications of the rise of Ritalin--particularly among adults, whose use of the drug has increased rapidly in recent years. For decades, in this country and around the world, millions of people used smoking as a way of boosting their dopamine and sharpening focus and concentration. Over the past twenty years, we have gradually taken away that privilege, by making it impossible for people to smoke at work and by marshalling an array of medical evidence to convince people that they should not start at all. From a public-health standpoint, this has been of critical importance: countless lives have been saved. But the fact remains that millions of people have lost a powerful pharmacological agent--nicotine--that they had been using to cope with the world around them. In fact, they have lost it precisely at a moment when the rising complexity of modern life would seem to make dopamine enhancement more important than ever. Among adults, Ritalin is a drug that may fill the void left by nicotine.
Among children, Ritalin is clearly performing a similar function. We are extending to the young cognitive aids of a kind that used to be reserved exclusively for the old. It is this reliance on a drug--the idea that children should have to be medicated--that, of course, people like Diller, Walker, and DeGrandpre find so upsetting. If some children need to take a drug in order to be "normal," they think that the problem is with our definition of "normal." Diller asks, "Is there still a place for childhood in the anxious, downsizing America of the late nineteen-nineties? What if Tom Sawyer or Huckleberry Finn were to walk into my office tomorrow? Tom's indifference to schooling and Huck's 'oppositional' behavior would surely have been cause for concern. Would I prescribe Ritalin for them, too?" But this is just the point. Huck Finn and Tom Sawyer lived in an age where difficult children simply dropped out of school, or worked on farms, or drifted into poverty and violence. The "childhood" Diller romanticizes was a ruthlessly Darwinian place, which provided only for the most economically--and genetically--privileged. Children are now being put into situations that demand attention and intellectual consideration, and it is no longer considered appropriate simply to cast aside those who because of some neurological quirk have difficulty coping. Only by a strange inversion of moral responsibility do books like "Ritalin Nation" and "Running on Ritalin" seek to make those parents and physicians trying to help children with A.D.H.D. feel guilty for doing so. The rise of A.D.H.D. is a consequence of what might otherwise be considered a good thing: that the world we live in increasingly values intellectual consideration and rationality--increasingly demands that we stop and focus. Modernity didn't create A.D.H.D. It revealed it.
Drunk Drivers
March 8, 1999
talk of the town
Drunk Drivers and Other Dangers
Last week, New York City began confiscating the automobiles of people caught drinking and driving. On the first day of the crackdown, the police seized three cars, including one from a man who had been arrested for drunk driving on eight previous occasions. The tabloids cheered. Mothers Against Drunk Driving nodded in approval. After a recent series of brutal incidents involving the police tarnished the Giuliani administration, the Mayor's anti-crime crusade appeared to right itself. The city now has the toughest anti-drunk-driving policy in the country, and the public was given a welcome reminder that the vast majority of the city's thirty-eight thousand cops are neither racist nor reckless and that the justice they mete out is largely deserved. "There's a very simple way to stay out of this problem, for you, your family, and anyone else," a triumphant Giuliani said. "Do not drink and get behind the wheel of a car."
Let's leave aside, for a moment, the question of whether the new policy is constitutional. That is a matter for the courts. A more interesting issue is what the willing acceptance of such a hard-line stance on drunk driving says about the sometimes contradictory way we discuss traffic safety. Suppose, for example, that I was stopped by the police for running a red light on Madison Avenue. I would get points on my license and receive a fine. If I did the same thing while my blood-alcohol level was above the prescribed limit, however, I would be charged with drunk driving and lose my car. The behavior is the same in both cases, but the consequences are very different. We believe, as a society, that the combination of alcohol and driving deserves particular punishment. And that punishment isn't necessarily based on what you have actually done. It's often based on what you could do--or, to be more precise, on the extra potential for harm that your drinking poses.
There is nothing wrong with this approach. We have laws against threatening people with guns for the same reason. It hardly makes sense to wait for drunks or people waving guns to kill someone before we arrest them. But if merely posing a threat to others on the road is the threshold for something as drastic as civil forfeiture, then why are we stopping with drunks? Fifty per cent of all car accidents in the United States are attributed to driver inattention, for example. Some of that inattention is caused by inebriation, but there are other common and obvious distractions. Two studies made in the past three years--the first conducted at the Rochester Institute of Technology and the second published in the New England Journal of Medicine-- suggest that the use of car phones is associated with a four-to-fivefold increase in the risk of accidents, and that hands-free phones may not be any safer than conventional ones. The driver on the phone is a potential risk to others, just as the driver who has been drinking is. It is also now abundantly clear that sport-utility vehicles and pickup trucks can--by virtue of their weight, high clearance, and structural rigidity--do far more damage in an accident than conventional automobiles can. S.U.V.s and light trucks account for about a third of the vehicles on the road. But a disproportionate number of the fatalities in two-vehicle crashes are caused by collisions between those bigger vehicles and conventional automobiles, and the people riding in the cars make up a stunning eighty-one per cent of those killed.
The reason we don't like drunk drivers is that by making the decision to drink and drive, an individual deliberately increases his or her chance of killing someone else with a vehicle. But how is the moral culpability of the countless Americans who have walked into a dealership and made a decision to buy a fifty-six-hundred-pound sport utility any different? Of course, there are careful S.U.V. drivers and careful car-phone users. Careful people can get drunk, too, and overcompensate for their impairment by creeping along at twenty-five miles an hour, and in New York City we won't hesitate to take away their vehicles. Obviously, Giuliani, even in his most crusading moments, isn't about to confiscate all the car phones and S.U.V.s on the streets of New York. States should, however, stop drivers from using car phones while the car is in motion, as some countries, including England, do. And a prohibitive weight tax on sport utilities would probably be a good idea. The moneys collected could be used to pay the medical bills and compensate the family of anyone hit by some cell-phone-wielding yuppie in a four-wheeled behemoth.
True Colors
March 22, 1999
ANNALS OF ADVERTISING
Hair dye and the hidden history of postwar America.
1.
During the Depression--long before she became one of the most famous copywriters of her day--Shirley Polykoff met a man named George Halperin. He was the son of an Orthodox rabbi from Reading, Pennsylvania, and soon after they began courting he took her home for Passover to meet his family. They ate roast chicken, tzimmes, and sponge cake, and Polykoff hit it off with Rabbi Halperin, who was warm and funny. George's mother was another story. She was Old World Orthodox, with severe, tightly pulled-back hair; no one was good enough for her son.
"How'd I do, George?" Shirley asked as soon as they got in the car for the drive home. "Did your mother like me?"
He was evasive. "My sister Mildred thought you were great."
"That's nice, George," she said. "But what did your mother say?"
There was a pause. "She says you paint your hair." Another pause. "Well, do you?"
Shirley Polykoff was humiliated. In her mind she could hear her future mother-in-law: Fahrbt zi der huer? Oder fahrbt zi nisht? Does she color her hair? Or doesn't she?
The answer, of course, was that she did. Shirley Polykoff always dyed her hair, even in the days when the only women who went blond were chorus girls and hookers. At home in Brooklyn, starting when she was fifteen, she would go to Mr. Nicholas's beauty salon, one flight up, and he would "lighten the back" until all traces of her natural brown were gone. She thought she ought to be a blonde-or, to be more precise, she thought that the decision about whether she could be a blonde was rightfully hers, and not God's. Shirley dressed in deep oranges and deep reds and creamy beiges and royal hues. She wore purple suede and aqua silk, and was the kind of person who might take a couture jacket home and embroider some new detail on it. Once, in the days when she had her own advertising agency, she was on her way to Memphis to make a presentation to Maybelline and her taxi broke down in the middle of the expressway. She jumped out and flagged down a Pepsi-Cola truck, and the truck driver told her he had picked her up because he'd never seen anyone quite like her before. "Shirley would wear three outfits, all at once, and each one of them would look great," Dick Huebner, who was her creative director, says. She was flamboyant and brilliant and vain in an irresistible way, and it was her conviction that none of those qualities went with brown hair. The kind of person she spent her life turning herself into did not go with brown hair. Shirley's parents were Hyman Polykoff, small-time necktie merchant, and Rose Polykoff, housewife and mother, of East New York and Flatbush, by way of the Ukraine. Shirley ended up on Park Avenue at Eighty-second. "If you asked my mother 'Are you proud to be Jewish?' she would have said yes," her daughter, Alix Nelson Frick, says. "She wasn't trying to pass. But she believed in the dream, and the dream was that you could acquire all the accouterments of the established affluent class, which included a certain breeding and a certain kind of look. Her idea was that you should be whatever you want to be, including being a blonde."
In 1956, when Shirley Polykoff was a junior copywriter at Foote, Cone & Belding, she was given the Clairol account. The product the company was launching was Miss Clairol, the first hair-color bath that made it possible to lighten, tint, condition, and shampoo at home, in a single step-to take, say, Topaz (for a champagne blond) or Moon Gold (for a medium ash), apply it in a peroxide solution directly to the hair, and get results in twenty minutes. When the Clairol sales team demonstrated their new product at the International Beauty Show, in the old Statler Hotel, across from Madison Square Garden, thousands of assembled beauticians jammed the hall and watched, openmouthed, demonstration after demonstration. "They were astonished," recalls Bruce Gelb, who ran Clairol for years, along with his father, Lawrence, and his brother Richard. "This was to the world of hair color what computers were to the world of adding machines. The sales guys had to bring buckets of water and do the rinsing off in front of everyone, because the hairdressers in the crowd were convinced we were doing something to the models behind the scenes."
Miss Clairol gave American women the ability, for the first time, to color their hair quickly and easily at home. But there was still the stigma-the prospect of the disapproving mother-in-law. Shirley Polykoff knew immediately what she wanted to say, because if she believed that a woman had a right to be a blonde she also believed that a woman ought to be able to exercise that right with discretion. "Does she or doesn't she?" she wrote, translating from the Yiddish to the English. "Only her hairdresser knows for sure." Clairol bought thirteen ad pages in Life in the fall of 1956, and Miss Clairol took off like a bird. That was the beginning. For Nice 'n Easy, Clairol's breakthrough shampoo-in hair color, she wrote, "The closer he gets, the better you look." For Lady Clairol, the cream-and-bleach combination that brought silver and platinum shades to Middle America, she wrote, "Is it true blondes have more fun?" and then, even more memorably, "If I've only one life, let me live it as a blonde!" (In the summer of 1962, just before "The Feminine Mystique" was published, Betty Friedan was, in the words of her biographer, so "bewitched" by that phrase that she bleached her hair.) Shirley Polykoff wrote the lines; Clairol perfected the product. And from the fifties to the seventies, when Polykoff gave up the account, the number of American women coloring their hair rose from seven per cent to more than forty per cent.
Today, when women go from brown to blond to red to black and back again without blinking, we think of hair-color products the way we think of lipstick. On drugstore shelves there are bottles and bottles of hair-color products with names like Hydrience and Excellence and Preference and Natural Instincts and Loving Care and Nice 'n Easy, and so on, each in dozens of different shades. Feria, the new, youth-oriented brand from L'Oreal, comes in Chocolate Cherry and Champagne Cocktail--colors that don't ask "Does she or doesn't she?" but blithely assume "Yes, she does." Hair dye is now a billion-dollar-a-year commodity.
Yet there was a time, not so long ago--between, roughly speaking, the start of Eisenhower's Administration and the end of Carter's--when hair color meant something. Lines like "Does she or doesn't she?" or the famous 1973 slogan for L'Oreal's Preference--"Because I'm worth it"--were as instantly memorable as "Winston tastes good like a cigarette should" or "Things go better with Coke." They lingered long after advertising usually does and entered the language; they somehow managed to take on meanings well outside their stated intention. Between the fifties and the seventies, women entered the workplace, fought for social emancipation, got the Pill, and changed what they did with their hair. To examine the hair-color campaigns of the period is to see, quite unexpectedly, all these things as bound up together, the profound with the seemingly trivial. In writing the history of women in the postwar era, did we forget something important? Did we leave out hair?
2.
When the "Does she or doesn't she?" campaign first ran, in 1956, most advertisements that were aimed at women tended to be high glamour-"cherries in the snow, fire and ice," as Bruce Gelb puts it. But Shirley Polykoff insisted that the models for the Miss Clairol campaign be more like the girl next door-"Shirtwaist types instead of glamour gowns," she wrote in her original memo to Clairol. "Cashmere-sweater-over-the-shoulder types. Like larger-than-life portraits of the proverbial girl on the block who's a little prettier than your wife and lives in a house slightly nicer than yours." The model had to be a Doris Day type-not a Jayne Mansfield-because the idea was to make hair color as respectable and mainstream as possible. One of the earliest "Does she or doesn't she?" television commercials featured a housewife, in the kitchen preparing hors d'ouvres for a party. She is slender and pretty and wearing a black cocktail dress and an apron. Her husband comes in, kisses her on the lips, approvingly pats her very blond hair, then holds the kitchen door for her as she takes the tray of hors d'ouvres out for her guests. It is an exquisitely choreographed domestic tableau, down to the little dip the housewife performs as she hits the kitchen light switch with her elbow on her way out the door. In one of the early print ads-which were shot by Richard Avedon and then by Irving Penn-a woman with strawberry-blond hair is lying on the grass, holding a dandelion between her fingers, and lying next to her is a girl of about eight or nine. What's striking is that the little girl's hair is the same shade of blond as her mother's. The "Does she or doesn't she?" print ads always included a child with the mother to undercut the sexual undertones of the slogan-to make it clear that mothers were using Miss Clairol, and not just "fast" women-and, most of all, to provide a precise color match. Who could ever guess, given the comparison, that Mom's shade came out of a bottle?
The Polykoff campaigns were a sensation. Letters poured in to Clairol. "Thank you for changing my life," read one, which was circulated around the company and used as the theme for a national sales meeting. "My boyfriend, Harold, and I were keeping company for five years but he never wanted to set a date. This made me very nervous. I am twenty-eight and my mother kept saying soon it would be too late for me." Then, the letter writer said, she saw a Clairol ad in the subway. She dyed her hair blond, and "that is how I am in Bermuda now on my honeymoon with Harold." Polykoff was sent a copy with a memo: "It's almost too good to be true!" With her sentimental idyll of blond mother and child, Shirley Polykoff had created something iconic.
"My mother wanted to be that woman in the picture," Polykoff's daughter, Frick, says. "She was wedded to the notion of that suburban, tastefully dressed, well-coddled matron who was an adornment to her husband, a loving mother, a long-suffering wife, a person who never overshadowed him.She wanted the blond child. In fact, I was blond as a kid, but when I was about thirteen my hair got darker and my mother started bleaching it." Of course-and this is the contradiction central to those early Clairol campaigns-Shirley Polykoff wasn't really that kind of woman at all. She always had a career. She never moved to the suburbs. "She maintained that women were supposed to be feminine, and not too dogmatic and not overshadow their husband, but she greatly overshadowed my father, who was a very pure, unaggressive, intellectual type," Frick says. "She was very flamboyant, very emotional, very dominating."
One of the stories Polykoff told about herself repeatedly- and that even appeared after her death last year, in her Times obituary-was that she felt that a woman never ought to make more than her husband, and that only after George's death, in the early sixties, would she let Foote, Cone & Belding raise her salary to its deserved level. "That's part of the legend, but it isn't the truth," Frick says. "The ideal was always as vividly real to her as whatever actual parallel reality she might be living. She never wavered in her belief in that dream, even if you would point out to her some of the fallacies of that dream, or the weaknesses, or the internal contradictions, or the fact that she herself didn't really live her life that way." For Shirley Polykoff, the color of her hair was a kind of useful fiction, a way of bridging the contradiction between the kind of woman she was and the kind of woman she felt she ought to be. It was a way of having it all. She wanted to look and feel like Doris Day without having to be Doris Day. In twenty-seven years of marriage, during which she bore two children, she spent exactly two weeks as a housewife, every day of which was a domestic and culinary disaster. "Listen, sweetie," an exasperated George finally told her. "You make a lousy little woman in the kitchen." She went back to work the following Monday.
This notion of the useful fiction-of looking the part without being the part-had a particular resonance for the America of Shirley Polykoff's generation. As a teen-ager, Shirley Polykoff tried to get a position as a clerk at an insurance agency and failed. Then she tried again, at another firm, applying as Shirley Miller. This time, she got the job. Her husband, George, also knew the value of appearances. The week Polykoff first met him, she was dazzled by his worldly sophistication, his knowledge of out-of-the-way places in Europe, his exquisite taste in fine food and wine. The second week, she learned that his expertise was all show, derived from reading the Times. The truth was that George had started his career loading boxes in the basement of Macy's by day and studying law at night. He was a faker, just as, in a certain sense, she was, because to be Jewish-or Irish or Italian or African-American or, for that matter, a woman of the fifties caught up in the first faint stirrings of feminism--was to be compelled to fake it in a thousand small ways, to pass as one thing when, deep inside, you were something else. "That's the kind of pressure that comes from the immigrants' arriving and thinking that they don't look right, that they are kind of funny-looking and maybe shorter than everyone else, and their clothes aren't expensive," Frick says. "That's why many of them began to sew, so they could imitate the patterns of the day. You were making yourself over. You were turning yourself into an American." Frick, who is also in advertising (she's the chairman of Spier NY), is a forcefully intelligent woman, who speaks of her mother with honesty and affection. "There were all those phrases that came to fruition at that time-you know, 'clothes make the man' and 'first impressions count.'" So the question "Does she or doesn't she?" wasn't just about how no one could ever really know what you were doing. It was about how no one could ever really know who you were. It really meant not "Does she?" but "Is she?" It really meant "Is she a contented homemaker or a feminist, a Jew or a Gentile--or isn't she?"
3. I am Ilon Specht, hear me roar
In 1973, Ilon Specht was working as a copywriter at the McCann-Erickson advertising agency, in New York. She was a twenty-three-year-old college dropout from California. She was rebellious, unconventional, and independent, and she had come East to work on Madison Avenue, because that's where people like that went to work back then. "It was a different business in those days," Susan Schermer, a long-time friend of Specht's, says. "It was the seventies. People were wearing feathers to work." At her previous agency, while she was still in her teens, Specht had written a famous television commercial for the Peace Corps. (Single shot. No cuts. A young couple lying on the beach. "It's a big, wide wonderful world" is playing on a radio. Voice-over recites a series of horrible facts about less fortunate parts of the world: in the Middle East half the children die before their sixth birthday, and so forth. A news broadcast is announced as the song ends, and the woman on the beach changes the station.)
"Ilon? Omigod! She was one of the craziest people I ever worked with," Ira Madris, another colleague from those years, recalls, using the word "crazy" as the highest of compliments. "And brilliant. And dogmatic. And highly creative. We all believed back then that having a certain degree of neurosis made you interesting. Ilon had a degree of neurosis that made her very interesting."
At McCann, Ilon Specht was working with L'Oreal, a French company that was trying to challenge Clairol's dominance in the American hair-color market. L'Oreal had originally wanted to do a series of comparison spots, presenting research proving that their new product-Preference-was technologically superior to Nice 'n Easy, because it delivered a more natural, translucent color. But at the last minute the campaign was killed because the research hadn't been done in the United States. At McCann, there was panic. "We were four weeks before air date and we had nothing-nada," Michael Sennott, a staffer who was also working on the account, says. The creative team locked itself away: Specht, Madris-who was the art director on the account-and a handful of others. "We were sitting in this big office," Specht recalls. "And everyone was discussing what the ad should be. They wanted to do something with a woman sitting by a window, and the wind blowing through the curtains. You know, one of those fake places with big, glamorous curtains. The woman was a complete object. I don't think she even spoke. They just didn't get it. We were in there for hours."
Ilon Specht is now the executive creative director of Jordan, McGrath, Case & Partners, in the Flatiron district, with a big office overlooking Fifth Avenue. She has long, thick black hair, held in a loose knot at the top of her head, and lipstick the color of maraschino cherries. She talks fast and loud, and swivels in her chair as she speaks, and when people walk by her office they sometimes bang on her door, as if the best way to get her attention is to be as loud and emphatic as she is. Reminiscing not long ago about the seventies, she spoke about the strangeness of corporate clients in shiny suits who would say that all the women in the office looked like models. She spoke about what it meant to be young in a business dominated by older men, and about what it felt like to write a line of copy that used the word "woman" and have someone cross it out and write "girl."
"I was a twenty-three-year-old girl-a woman," she said. "What would my state of mind have been? I could just see that they had this traditional view of women, and my feeling was that I'm not writing an ad about looking good for men, which is what it seems to me that they were doing. I just thought, Fuck you. I sat down and did it, in five minutes. It was very personal. I can recite to you the whole commercial, because I was so angry when I wrote it."
Specht sat stock still and lowered her voice: "I use the most expensive hair color in the world. Preference, by L'Oreal. It's not that I care about money. It's that I care about my hair. It's not just the color. I expect great color. What's worth more to me is the way my hair feels. Smooth and silky but with body. It feels good against my neck. Actually, I don't mind spending more for L'Oreal. Because I'm"--and here Specht took her fist and struck her chest--"worth it."
The power of the commercial was originally thought to lie in its subtle justification of the fact that Preference cost ten cents more than Nice 'n Easy. But it quickly became obvious that the last line was the one that counted. On the strength of "Because I'm worth it," Preference began stealing market share from Clairol. In the nineteen-eighties, Preference surpassed Nice 'n Easy as the leading hair-color brand in the country, and two years ago L'Oreal took the phrase and made it the slogan for the whole company. An astonishing seventy-one per cent of American women can now identify that phrase as the L'Oreal signature, which, for a slogan--as opposed to a brand name--is almost without precedent.
4.
From the very beginning, the Preference campaign was unusual. Polykoff's Clairol spots had male voice-overs. In the L'Oreal ads, the model herself spoke, directly and personally. Polykoff's commercials were "other-directed" -they were about what the group was saying ("Does she or doesn't she?") or what a husband might think ("The closer he gets, the better you look"). Specht's line was what a woman says to herself. Even in the choice of models, the two campaigns diverged. Polykoff wanted fresh, girl-next-door types. McCann and L'Oreal wanted models who somehow embodied the complicated mixture of strength and vulnerability implied by "Because I'm worth it." In the late seventies, Meredith Baxter Birney was the brand spokeswoman. At that time, she was playing a recently divorced mom going to law school on the TV drama "Family." McCann scheduled her spots during "Dallas" and other shows featuring so-called "silk blouse" women--women of strength and independence. Then came Cybill Shepherd, at the height of her run as the brash, independent Maddie on "Moonlighting," in the eighties. Now the brand is represented by Heather Locklear, the tough and sexy star of "Melrose Place." All the L'Oreal spokeswomen are blondes, but blondes of a particular type. In his brilliant 1995 book, "Big Hair: A Journey into the Transformation of Self," the Canadian anthropologist Grant McCracken argued for something he calls the "blondness periodic table," in which blondes are divided into six categories: the "bombshell blonde" (Mae West, Marilyn Monroe), the "sunny blonde" (Doris Day, Goldie Hawn), the "brassy blonde" (Candice Bergen), the "dangerous blonde" (Sharon Stone), the "society blonde" (C.Z. Guest), and the "cool blonde" (Marlene Dietrich, Grace Kelly). L'Oreal's innovation was to carve out a niche for itself in between the sunny blondes-the "simple, mild, and innocent" blondes-and the smart, bold, brassy blondes, who, in McCracken's words, "do not mediate their feelings or modulate their voices."
This is not an easy sensibility to capture. Countless actresses have auditioned for L'Oreal over the years and been turned down. "There was one casting we did with Brigitte Bardot," Ira Madris recalls (this was for another L'Oreal product), "and Brigitte, being who she is, had the damnedest time saying that line. There was something inside of her that didn't believe it. It didn't have any conviction." Of course it didn't: Bardot is bombshell, not sassy. Clairol made a run at the Preference sensibility for itself, hiring Linda Evans in the eighties as the pitchwoman for Ultress, the brand aimed at Preference's upscale positioning. This didn't work, either. Evans, who played the adoring wife of Blake Carrington on "Dynasty," was too sunny. ("The hardest thing she did on that show," Michael Sennott says, perhaps a bit unfairly, "was rearrange the flowers.")
Even if you got the blonde right, though, there was still the matter of the slogan. For a Miss Clairol campaign in the seventies, Polykoff wrote a series of spots with the tag line "This I do for me." But "This I do for me" was at best a halfhearted approximation of "Because I'm worth it"--particularly for a brand that had spent its first twenty years saying something entirely different. "My mother thought there was something too brazen about 'I'm worth it,'" Frick told me. "She was always concerned with what people around her might think. She could never have come out with that bald-faced an equation between hair color and self-esteem."
The truth is that Polykoff's sensibility-which found freedom in assimilation-had been overtaken by events. In one of Polykoff's "Is it true blondes have more fun?" commercials for Lady Clairol in the sixties, for example, there is a moment that by 1973 must have been painful to watch. A young woman, radiantly blond, is by a lake, being swung around in the air by a darkly handsome young man. His arms are around her waist. Her arms are around his neck, her shoes off, her face aglow. The voice-over is male, deep and sonorous. "Chances are," the voice says, "she'd have gotten the young man anyhow, but you'll never convince her of that." Here was the downside to Shirley Polykoff's world. You could get what you wanted by faking it, but then you would never know whether it was you or the bit of fakery that made the difference. You ran the risk of losing sight of who you really were. Shirley Polykoff knew that the all-American life was worth it, and that "he" -the handsome man by the lake, or the reluctant boyfriend who finally whisks you off to Bermuda-was worth it. But, by the end of the sixties, women wanted to know that they were worth it, too.
5. What Herta Herzog knew
Why are Shirley Polykoff and Ilon Specht important? That seems like a question that can easily be answered in the details of their campaigns. They were brilliant copywriters, who managed in the space of a phrase to capture the particular feminist sensibilities of the day. They are an example of a strange moment in American social history when hair dye somehow got tangled up in the politics of assimilation and feminism and self-esteem. But in a certain way their stories are about much more: they are about the relationship we have to the products we buy, and about the slow realization among advertisers that unless they understood the psychological particulars of that relationship-unless they could dignify the transactions of everyday life by granting them meaning-they could not hope to reach the modern consumer. Shirley Polykoff and Ilon Specht perfected a certain genre of advertising which did just this, and one way to understand the Madison Avenue revolution of the postwar era is as a collective attempt to define and extend that genre. The revolution was led by a handful of social scientists, chief among whom was an elegant, Viennese-trained psychologist by the name of Herta Herzog. What did Herta Herzog know? She knew-or, at least, she thought she knew-the theory behind the success of slogans like "Does she or doesn't she?" and "Because I'm worth it," and that makes Herta Herzog, in the end, every bit as important as Shirley Polykoff and Ilon Specht.
Herzog worked at a small advertising agency called Jack Tinker & Partners, and people who were in the business in those days speak of Tinker the way baseball fans talk about the 1927 Yankees. Tinker was the brainchild of the legendary adman Marion Harper, who came to believe that the agency he was running, McCann-Erickson, was too big and unwieldy to be able to consider things properly. His solution was to pluck a handful of the very best and brightest from McCann and set them up, first in the Waldorf Towers (in the suite directly below the Duke and Duchess of Windsor's and directly above General Douglas MacArthur's) and then, more permanently, in the Dorset Hotel, on West Fifty-fourth Street, overlooking the Museum of Modern Art. The Tinker Group rented the penthouse, complete with a huge terrace, Venetian-tiled floors, a double-height living room, an antique French polished-pewter bar, a marble fireplace, spectacular skyline views, and a rotating exhibit of modern art (hung by the partners for motivational purposes), with everything--walls, carpets, ceilings, furnishings--a bright, dazzling white. It was supposed to be a think tank, but Tinker was so successful so fast that clients were soon lined up outside the door. When Buick wanted a name for its new luxury coupé, the Tinker Group came up with Riviera. When Bulova wanted a name for its new quartz watch, Tinker suggested Accutron. Tinker also worked with Coca-Cola and Exxon and Westinghouse and countless others, whose names--according to the strict standards of secrecy observed by the group--they would not divulge. Tinker started with four partners and a single phone. But by the end of the sixties it had taken over eight floors of the Dorset.
What distinguished Tinker was its particular reliance on the methodology known as motivational research, which was brought to Madison Avenue in the nineteen-forties by a cadre of European intellectuals trained at the University of Vienna. Advertising research up until that point had been concerned with counting heads--with recording who was buying what. But the motivational researchers were concerned with why: Why do people buy what they do? What motivates them when they shop? The researchers devised surveys, with hundreds of questions, based on Freudian dynamic psychology. They used hypnosis, the Rosenzweig Picture-Frustration Study, role-playing, and Rorschach blots, and they invented what we now call the focus group. There was Paul Lazarsfeld, one of the giants of twentieth-century sociology, who devised something called the Lazarsfeld-Stanton Program Analyzer, a little device with buttons to record precisely the emotional responses of research subjects. There was Hans Zeisel, who had been a patient of Alfred Adler's in Vienna, and went to work at McCann-Erickson. There was Ernest Dichter, who had studied under Lazarsfeld at the Psychological Institute in Vienna, and who did consulting for hundreds of the major corporations of the day. And there was Tinker's Herta Herzog, perhaps the most accomplished motivational researcher of all, who trained dozens of interviewers in the Viennese method and sent them out to analyze the psyche of the American consumer.
"For Puerto Rican rum once, Herta wanted to do a study of why people drink, to tap into that below-the-surface kind of thing," Rena Bartos, a former advertising executive who worked with Herta in the early days, recalls. "We would would invite someone out to drink and they would order whatever they normally order, and we would administer a psychological test. Then we'd do it again at the very end of the discussion, after the drinks. The point was to see how people's personality was altered under the influence of alcohol." Herzog helped choose the name of Oasis cigarettes, because her psychological research suggested that the name-with its connotations of cool, bubbling springs-would have the greatest appeal to the orally-fixated smoker.
"Herta was graceful and gentle and articulate," Herbert Krugman, who worked closely with Herzog in those years, says. "She had enormous insights. Alka-Seltzer was a client of ours, and they were discussing new approaches for the next commercial. She said, 'You show a hand dropping an Alka-Seltzer tablet into a glass of water. Why not show the hand dropping two? You'll double sales.' And that's just what happened. Herta was the gray eminence. Everybody worshipped her."
Herta Herzog is now eighty-nine. After retiring from Tinker, she moved back to Europe, first to Germany and then to Austria, her homeland. She wrote an analysis of the TV show "Dallas" for the academic journal Society. She taught college courses on communications theory. She conducted a study on the Holocaust for the Vidal Sassoon Center for the Study of Anti-Semitism, in Jerusalem. Today, she lives in the mountain village of Leutasch, half an hour's hard drive up into the Alps from Innsbruck, in a white picture-book cottage with a sharply pitched roof. She is a small woman, slender and composed, her once dark hair now streaked with gray. She speaks in short, clipped, precise sentences, in flawless, though heavily accented, English. If you put her in a room with Shirley Polykoff and Ilon Specht, the two of them would talk and talk and wave their long, bejeweled fingers in the air, and she would sit unobtrusively in the corner and listen. "Marion Harper hired me to do qualitative research--the qualitative interview, which was the specialty that had been developed in Vienna at the Österreichische Wirtschaftspsychologische Forschungsstelle," Herzog told me. "It was interviewing not with direct questions and answers but where you open some subject of the discussion relevant to the topic and then let it go. You have the interviewer not talk but simply help the person with little questions like 'And anything else?' As an interviewer, you are not supposed to influence me. You are merely trying to help me. It was a lot like the psychoanalytic method." Herzog was sitting, ramrod straight, in a chair in her living room. She was wearing a pair of black slacks and a heavy brown sweater to protect her against the Alpine chill. Behind her was row upon row of bookshelves, filled with the books of a postwar literary and intellectual life: Mailer in German, Riesman in English. Open and face down on a long couch perpendicular to her chair was the latest issue of the psychoanalytic journal Psyche. "Later on, I added all kinds of psychological things to the process, such as word-association tests, or figure drawings with a story. Suppose you are my respondent and the subject is soap. I've already talked to you about soap. What you see in it. Why you buy it. What you like about it. Dislike about it. Then at the end of the interview I say, 'Please draw me a figure--anything you want--and after the figure is drawn tell me a story about the figure.'"
When Herzog asked her subjects to draw a figure at the end of an interview, she was trying to extract some kind of narrative from them, something that would shed light on their unstated desires. She was conducting, as she says, a psychoanalytic session. But she wouldn't ask about hair-color products in order to find out about you, the way a psychoanalyst might; she would ask about you in order to learn about hair-color products. She saw that the psychoanalytic interview could go both ways. You could use the techniques of healing to figure out the secrets of selling. "Does she or doesn't she?" and "Because I'm worth it" did the same thing: they not only carried a powerful and redemptive message, but-and this was their real triumph-they succeeded in attaching that message to a five-dollar bottle of hair dye. The lasting contribution of motivational research to Madison Avenue was to prove that you could do this for just about anything-that the products and the commercial messages with which we surround ourselves are as much a part of the psychological furniture of our lives as the relationships and emotions and experiences that are normally the subject of psychoanalytic inquiry.
"There is one thing we did at Tinker that I remember well,"Herzog told me, returning to the theme of one of her, and Tinker's, coups. "I found out that people were using Alka-Seltzer for stomach upset, but also for headaches," Herzog said. "We learned that the stomach ache was the kind of ache where many people tended to say 'It was my fault.' Alka-Seltzer had been mostly advertised in those days as a cure for overeating, and overeating is something you have done. But the headache is quite different. It is something imposed on you." This was, to Herzog, the classic psychological insight. It revealed Alka-Seltzer users to be divided into two apparently incompatible camps-the culprit and the victim-and it suggested that the company had been wooing one at the expense of the other. More important, it suggested that advertisers, with the right choice of words, could resolve that psychological dilemma with one or, better yet, two little white tablets. Herzog allowed herself a small smile. "So I said the nice thing would be if you could find something that combines these two elements. The copywriter came up with 'the blahs.'" Herzog repeated the phrase, "the blahs," because it was so beautiful. "The blahs was not one thing or the other-it was not the stomach or the head. It was both."
6.
This notion of household products as psychological furniture is, when you think about it, a radical idea. When we give an account of how we got to where we are, we're inclined to credit the philosophical over the physical, and the products of art over the products of commerce. In the list of sixties social heroes, there are musicians and poets and civil-rights activists and sports figures. Herzog's implication is that such a high-minded list is incomplete. What, say, of Vidal Sassoon? In the same period, he gave the world the Shape, the Acute Angle, and the One-Eyed Ungaro. In the old "cosmology of cosmetology," McCracken writes, "the client counted only as a plinth...the conveyor of the cut." But Sassoon made individualization the hallmark of the haircut, liberating women's hair from the hair styles of the times-from, as McCracken puts it, those "preposterous bits of rococo shrubbery that took their substance from permanents, their form from rollers, and their rigidity from hair spray." In the Herzogian world view, the reasons we might give to dismiss Sassoon's revolution-that all he was dispensing was a haircut, that it took just half an hour, that it affects only the way you look, that you will need another like it in a month-are the very reasons that Sassoon is important. If a revolution is not accessible, tangible, and replicable, how on earth can it be a revolution?
"Because I'm worth it" and "Does she or doesn't she?" were powerful, then, precisely because they were commercials, for commercials come with products attached, and products offer something that songs and poems and political movements and radical ideologies do not, which is an immediate and affordable means of transformation. "We discovered in the first few years of the 'Because I'm worth it' campaign that we were getting more than our fair share of new users to the category-women who were just beginning to color their hair," Sennott told me. "And within that group we were getting those undergoing life changes, which usually meant divorce. We had far more women who were getting divorced than Clairol had. Their children had grown, and something had happened, and they were reinventing themselves." They felt different, and Ilon Specht gave them the means to look different-and do we really know which came first, or even how to separate the two? They changed their lives and their hair. But it wasn't one thing or the other. It was both.
7.
Since the mid-nineties, the spokesperson for Clairol's Nice 'n Easy has been Julia Louis-Dreyfus, better known as Elaine, from "Seinfeld." In the Clairol tradition, she is the girl next door--a postmodern Doris Day. But the spots themselves could not be less like the original Polykoff campaigns for Miss Clairol. In the best of them, Louis-Dreyfus says to the dark-haired woman in front of her on a city bus, "You know, you'd look great as a blonde." Louis-Dreyfus then shampoos in Nice 'n Easy Shade 104 right then and there, to the gasps and cheers of the other passengers. It is Shirley Polykoff turned upside down: funny, not serious; public, not covert.
L'Oreal, too, has changed. Meredith Baxter Birney said "Because I'm worth it" with an earnestness appropriate to the line. By the time Cybill Shepherd became the brand spokeswoman, in the eighties, it was almost flip--a nod to the materialism of the times--and today, with Heather Locklear, the spots have a lush, indulgent feel. "New Preference by L'Oreal," she says in one of the current commercials. "Pass it on. You're worth it." The "because"--which gave Ilon Specht's original punch line such emphasis--is gone. The forceful "I'm" has been replaced by "you're." The Clairol and L'Oreal campaigns have converged. According to the Spectra marketing firm, there are almost exactly as many Preference users as Nice 'n Easy users who earn between fifty thousand and seventy-five thousand dollars a year, listen to religious radio, rent their apartments, watch the Weather Channel, bought more than six books last year, are fans of professional football, and belong to a union.
But it is a tribute to Ilon Specht and Shirley Polykoff's legacy that there is still a real difference between the two brands. It's not that there are Clairol women or L'Oreal women. It's something a little subtler. As Herzog knew, all of us, when it comes to constructing our sense of self, borrow bits and pieces, ideas and phrases, rituals and products from the world around us-over-the-counter ethnicities that shape, in some small but meaningful way, our identities. Our religion matters, the music we listen to matters, the clothes we wear matter, the food we eat matters-and our brand of hair dye matters, too. Carol Hamilton, L'Oreal's vice-president of marketing, says she can walk into a hair-color focus group and instantly distinguish the Clairol users from the L'Oreal users. "The L'Oreal user always exhibits a greater air of confidence, and she usually looks better-not just her hair color, but she always has spent a little more time putting on her makeup, styling her hair," Hamilton told me. "Her clothing is a little bit more fashion-forward. Absolutely, I can tell the difference." Jeanne Matson, Hamilton's counterpart at Clairol, says she can do the same thing. "Oh, yes," Matson told me. "There's no doubt. The Clairol woman would represent more the American-beauty icon, more naturalness. But it's more of a beauty for me, as opposed to a beauty for the external world. L'Oreal users tend to be a bit more aloof. There is a certain warmth you see in the Clairol people. They interact with each other more. They'll say, 'I use Shade 101.' And someone else will say, 'Ah, I do, too!' There is this big exchange."
These are not exactly the brand personalities laid down by Polykoff and Specht, because this is 1999, and not 1956 or 1973. The complexities of Polykoff's artifice have been muted. Specht's anger has turned to glamour. We have been left with just a few bars of the original melody. But even that is enough to insure that "Because I'm worth it" will never be confused with "Does she or doesn't she?" Specht says, "It meant I know you don't think I'm worth it, because that's what it was with the guys in the room. They were going to take a woman and make her the object. I was defensive and defiant. I thought, I'll fight you. Don't you tell me what I am. You've been telling me what I am for generations." As she said "fight," she extended the middle finger of her right hand. Shirley Polykoff would never have given anyone the finger. She was too busy exulting in the possibilities for self-invention in her America-a land where a single woman could dye her hair and end up lying on a beach with a ring on her finger. At her retirement party, in 1973, Polykoff reminded the assembled executives of Clairol and of Foote, Cone & Belding about the avalanche of mail that arrived after their early campaigns: "Remember that letter from the girl who got to a Bermuda honeymoon by becoming a blonde?"
Everybody did.
"Well," she said, with what we can only imagine was a certain sweet vindication, "I wrote it."
Dept. of Finales
May 24, 1999
talk of the town
"Melrose Place," 1992-1999, R.I.P.
During the 1995-96 season of "Melrose Place"--unquestionably the finest in the seven-year run of the prime-time soap, which comes to an end this week--the winsome redhead known as Dr. Kimberly Shaw experienced a sudden breakthrough in her therapy with Dr. Peter Burns, whom, according to the convolutions of the Melrose narrative, she happened to be living with at the moment. Burns was also acting as her lawyer and guardian, in addition to being her lover and therapist, although those last two descriptions are not quite accurate, since Kimberly and Dr. Burns weren't sleeping together at the time of her therapy breakthrough and, what's more, Burns wasn't really a therapist. From all appearances, he was actually a surgeon, or--since he also treated the show's central figure (and his future lover), Amanda, when she had her cancer scare--an oncologist, or, at the very least, a hunky guy with a stethoscope and a pager, which, in the Melrose universe, is all you really need to be to pass your medical boards.
In any case, in the first or second session between Kimberly and Dr. Peter Burns--her lawyer, suitor, guardian, non-therapist therapist, landlord, and room-mate--Kimberly realized that the reason she had been exhibiting strong homicidal tendencies was that she had been suppressing the childhood memory of having killed a very evil man who bore a distinct resemblance to a ferret. In a daring plot twist, Michael and Sydney--Kimberly's ex-lover and her romantic rival, respectively--got hold of a sketch she had made of her ferret-faced tormentor and hired an actor to impersonate him in an effort to make Kimberly think that she was still as crazy as ever. And that's exactly what happened, until Kimberly, toward the end of one episode, confronted the actor playing the ferret man and peeled off his prosthetic makeup, vanquishing her personal demon once and for all.
If you talk to aficionados of "Melrose Place," they will tell you that the ferret-man moment, more than any other, captured what was truly important about the series: here was an actor playing a doctor, in therapy with another actor playing a doctor who was himself impersonating a therapist, confronting an actor playing an actor playing her own personal demon, and when she unmasked him she found . . . that he was just another actor! Or something like that. The wonderful thing about "Melrose Place" was that just when you thought that the show was about to make some self-consciously postmodern commentary on, say, the relationship between art and life, it had the courage to take the easy way out and go for the laugh.
"Melrose Place" was often, mistakenly, lumped with its sister show on Fox, "Beverly Hills, 90210," which, like "Melrose," was an Aaron Spelling Production. At one point, Fox even ran the two shows back to back on Wednesday nights. But they were worlds apart. "90210" was the most conventional kind of television. It played to the universal desire of adolescents to be grownups, and it presented the world inside West Beverly High as one driven by the same social and ethical and political issues as the real world. "90210" was all about teens behaving like adults. "Melrose" was the opposite. It started with a group of adults--doctors, advertising executives, fashion designers--and dared to have them behave as foolishly and as naively as adolescents. Most of them lived in the same apartment building, where they fought and drank and wore really tight outfits and slept together in every conceivable permutation. They were all dumb, and the higher they rose in the outside world the dumber they got when they came home to Melrose Place.
In the mid-nineteen-nineties, when a generation of Americans reached adulthood and suddenly realized that they didn't want to be there, the inverted world of Melrose was a wonderfully soothing place. Here, after all, was a show that ostensibly depicted sophisticated grownup society, and every viewer was smarter than the people on the screen. Could anyone believe, for example, that when Kimberly came back from her breakthrough session with Peter Burns and went home to make dinner for Peter Burns, and Peter Burns sidled up to kiss her as she was slicing carrots, bra-less, he never stopped to think that here was his client and patient and tenant and analysand--a woman who had just tried to kill all kinds of people--and she was in his kitchen holding a knife? Peter! You moron! Watch the knife!
Dept. of Straight Thinking
July 12, 1999
talk of the town
Is the Belgian Coca-Cola hysteria the real thing?
The wave of illness among Belgian children last month had the look and feel--in the beginning, at least--of an utterly typical food-poisoning outbreak. First, forty-two children in the Belgian town of Bornem became mysteriously ill after drinking Coca-Cola and had to be hospitalized. Two days later, eight more schoolchildren fell sick in Bruges, followed by thirteen in Harelbeke the next day and forty-two in Lochristi three days after that--and on and on in a widening spiral that, in the end, sent more than one hundred children to the hospital complaining of nausea, dizziness, and headaches, and forced Coca-Cola into the biggest product recall in its hundred-and-thirteen-year history. Upon investigation, an apparent culprit was found. In the Coca-Cola plant in Antwerp, contaminated carbon dioxide had been used to carbonate a batch of the soda's famous syrup. With analysts predicting that the scare would make a dent in Coca-Cola's quarterly earnings, the soft-drink giant apologized to the Belgian people, and the world received a sobering reminder of the fragility of food safety.
The case isn't as simple as it seems, though. A scientific study ordered by Coca-Cola found that the contaminants in the carbon dioxide were sulfur compounds left over from the production process. In the tainted bottles of Coke, these residues were present at between five and seventeen parts per billion. These sulfides can cause illness, however, only at levels about a thousand times greater than that. At seventeen parts per billion, they simply leave a bad smell--like rotten eggs--which means that Belgium should have experienced nothing more than a minor epidemic of nose-wrinkling. More puzzling is the fact that, in four of the five schools where the bad Coke allegedly struck, half of the kids who got sick hadn't drunk any Coke that day. Whatever went on in Belgium, in other words, probably wasn't Coca-Cola poisoning. So what was it? Maybe nothing at all.
"You know, when this business started I bet two of my friends a bottle of champagne each that I knew the cause," Simon Wessely, a psychiatrist who teaches at the King's College School of Medicine in London, said.
"It's quite simple. It's just mass hysteria. These things usually are."
Wessely has been collecting reports of this kind of hysteria for about ten years and now has hundreds of examples, dating back as far as 1787, when millworkers in Lancashire suddenly took ill after they became persuaded that they were being poisoned by tainted cotton. According to Wessely, almost all cases fit a pattern. Someone sees a neighbor fall ill and becomes convinced that he is being contaminated by some unseen evil--in the past it was demons and spirits; nowadays it tends to be toxins and gases--and his fear makes him anxious. His anxiety makes him dizzy and nauseous. He begins to hyperventilate. He collapses. Other people hear the same allegation, see the "victim" faint, and they begin to get anxious themselves. They feel nauseous. They hyperventilate. They collapse, and before you know it everyone in the room is hyperventilating and collapsing. These symptoms, Wessely stresses, are perfectly genuine. It's just that they are manifestations of a threat that is wholly imagined. "This kind of thing is extremely common," he says, "and it's almost normal. It doesn't mean that you are mentally ill or crazy."
Mass hysteria comes in several forms. Mass motor hysteria, for example, involves specific physical movements: shaking, tremors, and convulsions. According to the sociologist Robert Bartholomew, motor hysteria often occurs in environments of strict emotional repression; it was common in medieval nunneries and in nineteenth-century European schools, and it is seen today in some Islamic cultures. What happened in Belgium, he says, is a fairly typical example of a more standard form of contagious anxiety, possibly heightened by the recent Belgian scare over dioxin-contaminated animal feed. The students' alarm over the rotten-egg odor of their Cokes, for example, is straight out of the hysteria textbooks. "The vast majority of these events are triggered by some abnormal but benign smell," Wessely said. "Something strange, like a weird odor coming from the air conditioning."
The fact that the outbreaks occurred in schools is also typical of hysteria cases. "The classic ones always involve schoolchildren," Wessely continued. "There is a famous British case involving hundreds of schoolgirls who collapsed during a 1980 Nottinghamshire jazz festival. They blamed it on a local farmer spraying pesticides." Bartholomew has just published a paper on a hundred and fifteen documented hysteria cases in schools over the past three hundred years. As anyone who has ever been to a rock concert knows, large numbers of adolescents in confined spaces seem to be particularly susceptible to mass hysteria. Those intent on pointing the finger at Coca-Cola in this sorry business ought to remember that. "We let the people of Belgium down," Douglas Ivester, the company's chairman, said in the midst of the crisis. Or perhaps it was the other way around.
The Science of the Sleeper
October 4, 1999
ANNALS OF MARKETING
How the Information Age
could blow away the blockbuster.
1.
In 1992, a sometime actress named Rebecca Wells published a novel called "Little Altars Everywhere" with a small, now defunct press in Seattle. Wells was an unknown author, and the press had no money for publicity. She had a friend, however, who spent that Thanksgiving with a friend who was a producer of National Public Radio's "All Things Considered." The producer read the book and passed it on to Linda Wertheimer, a host of the show, and she liked it so much that she put Wells on her program. That interview, in turn, was heard by a man who was listening to the radio in Blytheville, Arkansas, and whose wife, Mary Gay Shipley, ran the town bookstore. He bought the book and gave it to her; she loved it, and, with that, the strange and improbable rise of Rebecca Wells, best-selling author, began. Blytheville is a sleepy little town about an hour or so up the Mississippi from Memphis, and Mary Gay Shipley's bookstore--That Bookstore in Blytheville--sits between the Red Ball Barber Shop and Westbrook's shoe store on a meandering stretch of Main Street. The store is just one long room in a slightly shabby storefront, with creaky floors and big overhead fans and subject headings on the shelves marked with Post-it notes. Shipley's fiction section takes up about as much shelf space as a typical Barnes & Noble devotes to, say, homeopathic medicine. That's because Shipley thinks that a book buyer ought to be able to browse and read the jacket flap of everything that might catch her eye, without being overwhelmed by thousands of choices. Mostly, though, people come to Mary Gay Shipley's store in order to find out what Mary Gay thinks they ought to be reading, and in 1993 Mary Gay Shipley thought people ought to be reading "Little Altars Everywhere." She began ordering it by the dozen, which, Shipley says, "for us, is huge." She put it in the little rack out front where she lists her current favorites. She wrote about it in the newsletter she sends to her regular customers. "We could tell it was going to have a lot of word of mouth," she says. "It was the kind of book where you could say, 'You'll love it. Take it home.' " The No. 1 author at That Bookstore in Blytheville in 1993 was John Grisham, as was the case in nearly every bookstore in the country. But No. 2 was Rebecca Wells.
"Little Altars Everywhere" was not a best-seller. But there were pockets of devotees around the country--in Blytheville; at the Garden District Book Shop, in New Orleans; at Parkplace books, in Kirkland, Washington--and those pockets created a buzz that eventually reached Diane Reverand, an editor in New York. Reverand published Wells's next book, "Divine Secrets of the Ya-Ya Sisterhood," and when it hit the bookshelves the readers and booksellers of Blytheville, the Garden District, and Kirkland were ready. "When 'The Ya-Ya Sisterhood' came out, I met with an in-store sales rep from HarperCollins," Shipley said. She is a tall woman with graying hair and a quiet, dignified bearing. "I'm not real sure he knew what a hot book this was. When he came in the store, I just turned the page of the catalogue and said, 'I want one hundred copies,' and his jaw fell to the table, because I usually order four or two or one. And I said, 'I want her to come here! And if you go anywhere, tell people this woman sells in Blytheville!'"
Wells made the trip to Arkansas and read in the back of Shipley's store; the house was packed, and the women in the front row wore placards saying "Ya-Ya." She toured the country, and the crowds grew steadily bigger. "Before the numbers really showed it, I'd be signing books and there would be groups of women who would come together, six or seven, and they would have me sign anywhere between three and ten books," Wells recalls. "And then, after that, I started noticing mothers and daughters coming. Then I noticed that the crowds started to be three-generational--there would be teen-agers and sixth graders." "Ya-Ya" sold fifteen thousand copies in hardcover. The paperback sold thirty thousand copies in its first two months. Diane Reverand took out a single-column ad next to the contents page of The New Yorker--the first dollar she'd spent on advertising for the paperback--and sales doubled to sixty thousand in a month. It sold and sold, and by February of 1998, almost two years after the book was published, it reached the best-seller lists. There are now nearly three million copies in print. Rebecca Wells, needless to say, has a warm spot in her heart for people like Mary Gay Shipley. "Mary Gay is a legend," she says. "She just kept putting my books in people's hands."
2.
In the book business, as in the movie business, there are two kinds of hits: sleepers and blockbusters. John Grisham and Tom Clancy and Danielle Steel write blockbusters. Their books are announced with huge publicity campaigns. Within days of publication, they leap onto the best-seller lists. Sales start high--hundreds of thousands of copies in the first few weeks--and then taper off. People who buy or watch blockbusters have a clear sense of what they are going to get: a Danielle Steel novel is always--well, a Danielle Steel novel. Sleepers, on the other hand, are often unknown quantities. Sales start slowly and gradually build; publicity, at least early on, is often nonexistent. Sleepers come to your attention by a slow, serendipitous path: a friend who runs into a friend who sets up the interview that just happens to be heard by a guy married to a bookseller. Sleepers tend to emerge from the world of independent bookstores, because independent bookstores are the kinds of places where readers go to ask the question that launches all sleeper hits: Can you recommend a book to me? Shipley was plugging Terry Kay's "To Dance with the White Dog" long before it became a best-seller. She had Melinda Haynes lined up to do a reading at her store before Oprah tapped "Mother of Pearl" as one of her recommended books and it shot onto the best-seller lists. She read David Guterson's "Snow Falling on Cedars" in manuscript and went crazy for it. "I called the publisher, and they said, 'We think it's a regional book.' And I said, 'Write it down. "M.G.S. says this is an important book."'" All this makes it sound as if she has a sixth sense for books that will be successful, but that's not quite right. People like Mary Gay Shipley don't merely predict sleeper hits; they create sleeper hits.
Most of us, of course, don't have someone like Mary Gay Shipley in our lives, and with the decline of the independent bookstore in recent years the number of Shipleys out there creating sleeper hits has declined as well. The big chain bookstores that have taken over the bookselling business are blockbuster factories, since the sheer number of titles they offer can make browsing an intimidating proposition. As David Gernert, who is John Grisham's agent and editor, explains, "If you walk into a superstore, that's where being a brand makes so much more of a difference. There is so much more choice it's overwhelming. You see walls and walls of books. In that kind of environment, the reader is drawn to the known commodity. The brand-name author is now a safe haven." Between 1986 and 1996, the share of book sales represented by the thirty top-selling hardcover books in America nearly doubled.
The new dominance of the blockbuster is part of a familiar pattern. The same thing has happened in the movie business, where a handful of heavily promoted films featuring "bankable" stars now command the lion's share of the annual box-office. We live, as the economists Robert Frank and Philip Cook have argued, in a "winner-take-all society," which is another way of saying that we live in the age of the blockbuster. But what if there were a way around the blockbuster? What if there were a simple way to build your very own Mary Gay Shipley? This is the promise of a new technology called collaborative filtering, one of the most intriguing developments to come out of the Internet age.
3.
If you want a recommendation about what product to buy, you might want to consult an expert in the field. That's a function that magazines like Car and Driver and Sound & Vision perform. Another approach is to poll users or consumers of a particular product or service and tabulate their opinions. That's what the Zagat restaurant guides and consumer-ratings services like J. D. Power and Associates do. It's very helpful to hear what an "expert" audiophile has to say about the newest DVD player, or what the thousands of owners of the new Volkswagen Passat have to say about reliability and manufacturing defects. But when it comes to books or movies--what might be called "taste products"--these kinds of recommendations aren't nearly as useful. Few moviegoers, for example, rely on the advice of a single movie reviewer. Most of us gather opinions from a variety of sources--from reviewers whom we have agreed with in the past, from friends who have already seen the movie, or from the presence of certain actors or directors whom we already like--and do a kind of calculation in our heads. It's an imperfect procedure. You can find out a great deal about what various critics have to say. But they're strangers, and, to predict correctly whether you'll like something, the person making the recommendation really has to know something about you.
That's why Shipley is such a powerful force in touting new books. She has lived in Blytheville all her life and has run the bookstore there for twenty-three years, and so her customers know who she is. They trust her recommendations. At the same time, she knows who they are, so she knows how to match up the right book with the right person. For example, she really likes David Guterson's new novel, "East of the Mountains," but she's not about to recommend it to anyone. It's about a doctor who has cancer and plans his own death and, she says, "there are some people dealing with a death in their family for whom this is not the book to read right now." She had similar reservations about Charles Frazier's "Cold Mountain." "There were people I know who I didn't think would like it," Shipley said. "And I'd tell them that. It's a journey story. It's not what happens at the end that matters, and there are some people for whom that's just not satisfying. I don't want them to take it home, try to read it, not like it, then not go back to that writer." Shipley knows what her customers will like because she knows who they are.
Collaborative filtering is an attempt to approximate this kind of insider knowledge. It works as a kind of doppelgänger search engine. All of us have had the experience of meeting people and discovering that they appear to have the very same tastes we do--that they really love the same obscure foreign films that we love, or that they are fans of the same little-known novelist whom we are obsessed with. If such a person recommended a book to you, you'd take that recommendation seriously, because cultural tastes seem to run in patterns. If you and your doppelgänger love the same ten books, chances are you'll also like the eleventh book he likes. Collaborative filtering is simply a system that sifts through the opinions and preferences of thousands of people and systematically finds your doppelgänger--and then tells you what your doppelgänger's eleventh favorite book is.
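A minimal sketch of that doppelgänger search, in Python, under invented toy data (the reader names, titles, and ratings below are illustrations, not anything from MovieLens or Amazon): score how closely two readers agree on the books they have both rated, find the closest match, and hand back that match's favorite book that you haven't read yet.

# Toy sketch of the doppelganger search described above.
# All reader names, titles, and ratings are invented for illustration.

def similarity(mine, theirs):
    """Average agreement on the books both readers have rated (1-5 scale)."""
    shared = set(mine) & set(theirs)
    if not shared:
        return 0.0
    return sum(1.0 - abs(mine[b] - theirs[b]) / 4.0 for b in shared) / len(shared)

def recommend(my_ratings, everyone):
    """Find the most similar reader and return their best book I haven't rated."""
    best_match = max(everyone, key=lambda person: similarity(my_ratings, everyone[person]))
    unread = {b: r for b, r in everyone[best_match].items() if b not in my_ratings}
    return best_match, (max(unread, key=unread.get) if unread else None)

me = {"Cold Mountain": 5, "The Firm": 2, "Snow Falling on Cedars": 5}
others = {
    "reader_a": {"Cold Mountain": 5, "The Firm": 1, "Divine Secrets of the Ya-Ya Sisterhood": 5},
    "reader_b": {"The Firm": 5, "Cold Mountain": 2},
}
print(recommend(me, others))  # ('reader_a', 'Divine Secrets of the Ya-Ya Sisterhood')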
John Riedl, a University of Minnesota computer scientist who is one of the pioneers of this technology, has set up a Web site called MovieLens, which is a very elegant example of collaborative filtering at work. Everyone who logs on--and tens of thousands of people have already done so--is asked to rate a series of movies on a scale of 1 to 5, where 5 means "must see" and 1 means "awful." For example, I rated "Rushmore" as a 5, which meant that I was put into the group of people who loved "Rushmore." I then rated "Summer of Sam" as a 1, which put me into the somewhat smaller and more select group that both loved "Rushmore" and hated "Summer of Sam." Collaborative-filtering systems don't work all that well at first, because, obviously, in order to find someone's cultural counterparts you need to know a lot more about them than how they felt about two movies. Even after I had given the system seven opinions (including "Election," 4; "Notting Hill," 2; "The Sting," 4; and "Star Wars," 1), it was making mistakes. It thought I would love "Titanic" and "Zero Effect," and I disliked them both. But after I had plugged in about fifteen opinions--which Riedl says is probably the minimum--I began to notice that the rating that MovieLens predicted I would give a movie and the rating I actually gave it were nearly always, almost eerily, the same. The system had found a small group of people who feel exactly the same way I do about a wide range of popular movies.
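Here is a rough sketch of how a site like MovieLens could turn those neighbors into a predicted rating; the 1-to-5 scale comes from the article, but the weighting scheme and the sample data are assumptions for illustration, not MovieLens's actual method. The prediction is simply an average of the neighbors' ratings, weighted by how closely each neighbor's past ratings track yours, which is why the predictions sharpen as you feed the system more opinions.

# Toy sketch: predict my rating for one movie from people who rate like me.
# The data and the weighting scheme are illustrative assumptions.

def similarity(mine, theirs):
    shared = set(mine) & set(theirs)
    if not shared:
        return 0.0
    return sum(1.0 - abs(mine[m] - theirs[m]) / 4.0 for m in shared) / len(shared)

def predict_rating(my_ratings, everyone, movie):
    weighted, total = 0.0, 0.0
    for their_ratings in everyone.values():
        if movie not in their_ratings:
            continue
        w = similarity(my_ratings, their_ratings)
        weighted += w * their_ratings[movie]
        total += w
    return weighted / total if total else None

me = {"Rushmore": 5, "Summer of Sam": 1, "Election": 4, "Star Wars": 1}
others = {
    "neighbor_1": {"Rushmore": 5, "Election": 4, "Shall We Dance": 5},
    "neighbor_2": {"Star Wars": 5, "Shall We Dance": 2},
}
print(predict_rating(me, others, "Shall We Dance"))  # 5.0: neighbor_1 rates like me, neighbor_2 does not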
What makes this collaborative-filtering system different from those you may have encountered on Amazon.com or Barnesandnoble.com? In order to work well, collaborative filtering requires a fairly representative sample of your interests or purchases. But most of us use retailers like Amazon only for a small percentage of our purchases. For example, I buy the fiction I read at the Barnes & Noble around the corner from where I live. I buy most of my nonfiction in secondhand bookstores, and I use Amazon for gifts and for occasional work-related books that I need immediately, often for a specific and temporary purpose. That's why, bizarrely, Amazon currently recommends that I buy a number of books by the radical theorist Richard Bandler, none of which I have any desire to read. But if I were to buy a much bigger share of my books on-line, or if I "educated" the filter--as Amazon allows every customer to do--and told it what I think of its recommendations, it's easy to see how, over time, it could turn out to be a powerful tool.
In a new book, "Net Worth," John Hagel, an E-commerce consultant with McKinsey & Company, and his co-author, Marc Singer, suggest that we may soon see the rise of what they call "infomediaries," which are essentially brokers who will handle our preference information. Imagine, for example, that I had set up a company that collected and analyzed all your credit-card transactions. That information could be run through a collaborative filter, and the recommendations could be sold to retailers in exchange for discounts. Steve Larsen, the senior vice-president of marketing for Net Perceptions--a firm specializing in collaborative filtering which was started by Riedl and the former Microsoft executive Steven Snyder, among others--says that someday there might be a kiosk at your local video store where you could rate a dozen or so movies and have the computer generate recommendations for you from the movies the store has in stock. "Better yet, when I go there with my wife we put in my card and her card and say, 'Find us a movie we both like,'" he elaborates. "Or, even better yet, when we go with my fifteen-year-old daughter, 'Find us a movie all three of us like.'" Among marketers, the hope is that such computerized recommendations will increase demand. Right now, for example, thirty-five per cent of all people who enter a video store leave empty-handed, because they can't figure out what they want; the point of putting kiosks in those stores would be to lower that percentage. "It means that people might read more, or listen to music more, or watch videos more, because of the availability of an accurate and dependable and reliable method for them to learn about things that they might like," Snyder says.
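One way a kiosk could answer a request like "find us a movie all three of us like" is to predict a rating for each member of the group separately and then favor the title whose worst predicted rating is still high, so nobody is stuck with something they are likely to hate. The sketch below assumes invented family ratings and a stubbed-in prediction; a real system would plug in a collaborative-filtering prediction like the one above.

# Toy sketch of a "find us a movie we all like" kiosk query.
# The ratings, the stock list, and the scoring rule are illustrative assumptions.

def predicted_rating(person_ratings, movie, fallback=3.0):
    # Stand-in for a real collaborative-filtering prediction:
    # use the person's own rating if they have seen it, else a neutral guess.
    return person_ratings.get(movie, fallback)

def pick_for_group(group, in_stock):
    # Score each title by its lowest predicted rating within the group.
    def group_score(movie):
        return min(predicted_rating(r, movie) for r in group.values())
    return max(in_stock, key=group_score)

family = {
    "dad":      {"Movie A": 5, "Movie B": 2},
    "mom":      {"Movie A": 4, "Movie C": 5},
    "daughter": {"Movie B": 5, "Movie C": 2},
}
print(pick_for_group(family, ["Movie A", "Movie B", "Movie C"]))  # Movie A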
One of Net Perceptions' clients is SkyMall, which is a company that gathers selections from dozens of mail-order catalogues--from Hammacher Schlemmer and L. L. Bean to the Wine Enthusiast--and advertises them in the magazines that you see in the seat pockets of airplanes. SkyMall licensed the system both for their Web site and for their 800-number call center, where the software looks for your doppelgänger while you are calling in with your order, and a few additional recommendations pop up on the operator's screen. SkyMall's system is still in its infancy, but, in a test, the company found that it has increased the total sales per customer somewhere between fifteen and twenty-five per cent. What's remarkable about the SkyMall system is that it links products from many different categories. It's one thing, after all, to surmise that if someone likes "The Remains of the Day" he is also going to like "A Room with a View." But it's quite another to infer that if you liked a particular item from the Orvis catalogue there's a certain item from Reliable Home Office that you'll also be interested in. "Their experience has been absolutely hilarious," Larsen says. "One of the very first recommendations that came out of the engine was for a gentleman who was ordering a blue cloth shirt, a twenty-eight-dollar shirt. Our engine recommended a hundred-and-thirty-five-dollar cigar humidor--and he bought it! I don't think anybody put those two together before."
The really transformative potential of collaborative filtering, however, has to do with the way taste products--books, plays, movies, and the rest--can be marketed. Marketers now play an elaborate game of stereotyping. They create fixed sets of groups--middle-class-suburban, young-urban-professional, inner-city-working-class, rural-religious, and so on--and then find out enough about us to fit us into one of those groups. The collaborative-filtering process, on the other hand, starts with who we are, then derives our cultural "neighborhood" from those facts. And these groups aren't permanent. They change as we change. I have never seen a film by Luis Buñuel, and I have no plans to. I don't put myself in the group of people who like Buñuel. But if I were to see "That Obscure Object of Desire" tomorrow and love it, and enter my preference on MovieLens, the group of people they defined as "just like me" would immediately and subtly change.
A group at Berkeley headed by the computer scientist Ken Goldberg has, for instance, developed a collaborative-filtering system for jokes. If you log on to the site, known as Jester, you are given ten jokes to rate. (Q.: Did you hear about the dyslexic devil worshipper? A.: He sold his soul to Santa.) These jokes aren't meant to be especially funny; they're jokes that reliably differentiate one "sense of humor" from another. On the basis of the humor neighborhood you fall into, Jester gives you additional jokes that it thinks you'll like. Goldberg has found that when he analyzes the data from the site--and thirty-six thousand people so far have visited Jester--the resulting neighborhoods are strikingly amorphous. In other words, you don't find those thirty-six thousand people congregating into seven or eight basic humor groups--off-color, say, or juvenile, or literary. "What we'd like to see is nice little clusters," Goldberg says. "But, when you look at the results, what you see is something like a cloud with sort of bunches, and nothing that is nicely defined. It's kind of like looking into the night sky. It's very hard to identify the constellations." The better you understand someone's particular taste pattern--the deeper you probe into what he finds interesting or funny--the less predictable and orderly his preferences become.
Collaborative filtering underscores a lesson that, for the better part of history, humans have been stubbornly resistant to learning: if you want to understand what one person thinks or feels or likes or does it isn't enough to draw inferences from the general social or demographic category to which he belongs. You cannot tell, with any reasonable degree of certainty, whether someone will like "The Girl's Guide to Hunting and Fishing" by knowing that the person is a single twenty-eight-year-old woman who lives in Manhattan, any more than you can tell whether somebody will commit a crime knowing only that he's a twenty-eight-year-old African-American male who lives in the Bronx. Riedl has taken demographic data from the people who log on to MovieLens--such as their age and occupation and sex--but he has found that it hardly makes his predictions any more accurate. "What you tell us about what you like is far more predictive of what you will like in the future than anything else we've tried," he says. "It seems almost dumb to say it, but you tell that to marketers sometimes and they look at you puzzled."
None of this means that standard demographic data is useless. If you were trying to figure out how to market a coming-of-age movie, you'd be most interested in collaborative-filtering data from people below, say, the age of twenty-eight. Facts such as age and sex and place of residence are useful in sorting the kinds of information you get from a recommendation engine. But the central claim of the collaborative-filtering movement is that, head to head, the old demographic and "psychographic" data cannot compete with preference data. This is a potentially revolutionary argument. Traditionally, there has been almost no limit to the amount of information marketers have wanted about their customers: academic records, work experience, marital status, age, sex, race, Zip Code, credit records, focus-group sessions--everything has been relevant, because in trying to answer the question of what we want marketers have taken the long way around and tried to find out first who we are. Collaborative filtering shows that, in predicting consumer preferences, none of this information is all that important. In order to know what someone wants, what you really need to know is what they've wanted.
4.
How will this affect the so-called blockbuster complex? When a bookstore's sales are heavily driven by the recommendations of a particular person--a Mary Gay Shipley--sleepers, relatively speaking, do better and blockbusters do worse. If you were going to read only Clancy and Grisham and Steel, after all, why would you need to ask Shipley what to read? This is what David Gernert, Grisham's agent, meant when he said that in a Barnes & Noble superstore a brand like Grisham enjoys a "safe haven." It's a book you read when there is no one, like Shipley, with the credibility to tell you what else you ought to read. Gernert says that at this point in Grisham's career each of his novels follows the same general sales pattern. It rides high on the best-seller lists for the first few months, of course, but, after that, "his sales pick up at very specific times--notably, Father's Day and Mother's Day, and then it will sell well again for Christmas." That description makes it clear that Grisham's books are frequently bought as gifts. And that's because gifts are the trickiest of all purchases. They require a guess about what somebody else likes, and in conditions of uncertainty the logical decision is to buy the blockbuster, the known quantity.
Collaborative filtering is, in effect, anti-blockbuster. The more information the system has about you, the more narrow and exclusive its recommendations become. It's just like Shipley: it uses its knowledge about you to steer you toward choices you wouldn't normally know about. I gave MovieLens my opinions on fifteen very mainstream American movies. I'm a timid and unsophisticated moviegoer. I rarely see anything but very commercial Hollywood releases. It told me, in return, that I would love "C'est Arrivé Près de Chez Vous," an obscure 1992 Belgian comedy, and "Shall We Dance," the 1937 Fred and Ginger vehicle. In other words, among my moviegoing soul mates are a number of people who share my views on mainstream fare but who also have much greater familiarity with foreign and classic films. The system essentially put me in touch with people who share my tastes but who happen to know a good deal more about movies. Collaborative filtering gives voice to the expert in every preference neighborhood. A world where such customized recommendations were available would allow Shipley's well-read opinions to be known not just in Blytheville but wherever there are people who share her taste in books.
Collaborative filtering, in short, has the ability to reshape the book market. When customized recommendations are available, choices become more heterogeneous. Big bookstores lose their blockbuster bias, because customers now have a way of narrowing down their choices to the point where browsing becomes easy again. Of the top hundred best-selling books of the nineteen-nineties, there are only a handful that can accurately be termed sleepers--Robert James Waller's "The Bridges of Madison County," James Redfield's "The Celestine Prophecy," John Berendt's "Midnight in the Garden of Good and Evil," Charles Frazier's "Cold Mountain." Just six authors--John Grisham, Tom Clancy, Stephen King, Michael Crichton, Dean Koontz, and Danielle Steel--account for sixty-three of the books on the list. In a world more dependent on collaborative filtering, Grisham, Clancy, King, and Steel would still sell a lot of books. But you'd expect to see many more books like "Divine Secrets of the Ya-Ya Sisterhood"--many more new writers--make their way onto the best-seller list. And the gap between the very best-selling books and those in the middle would narrow. Collaborative filtering, Hagel says, "favors the smaller, the more talented, more quality products that may have a hard time getting visibility because they are not particularly good at marketing."
5.
In recent years, That Bookstore in Blytheville has become a mecca for fiction in the South. Prominent writers drop by all the time to give readings in the back, by the potbellied stove. John Grisham himself has been there nine times, beginning with his tour for "The Firm," which was the hit that turned him into a blockbuster author. Melinda Haynes, Bobbie Ann Mason, Roy Blount, Jr., Mary Higgins Clark, Billie Letts, Sandra Brown, Jill Conner Browne, and countless others have recently made the drive up from Memphis. Sometimes Shipley will host a supper for them after the reading, and send the proceeds from the event to a local literacy program.
There seems, in this era of mega-stores, something almost impossibly quaint about That Bookstore in Blytheville. The truth is, though, that the kind of personalized recommendation offered by Mary Gay Shipley represents the future of marketing, not its past. The phenomenal success in recent years of Oprah Winfrey's book club--which created one best-seller after another on the strength of its nominations--suggests that, in this age of virtually infinite choice, readers are starved for real advice, desperate for a recommendation from someone they know and who they feel knows them. "Certain people don't want to waste their time experimenting with new books, and the function we provide here is a filter," Shipley says, and as she speaks you can almost hear the makings of another sleeper on the horizon. "If we like something, we get behind it. I'm reading a book right now called 'Nissa's Place,' by Alexandria LaFaye. She's a woman I think we're going to be hearing more from."
Clicks and Mortar
December 6, 1999
ANNALS OF RETAIL
Don't believe the Internet hype:
the real E-commerce revolution happened off-line.
1.
At the turn of this century, a Missouri farmer named D. Ward King invented a device that came to be known, in his honor, as the King Road Drag. It consisted of two wooden rails that lay side by side about three feet apart, attached by a series of wooden braces. If you pulled the King Drag along a muddy road, it had the almost magical effect of smoothing out the ruts and molding the dirt into a slight crown, so that the next time it rained the water would drain off to the sides. In 1906, when King demonstrated his device to a group of farmers in Wellsville, Kansas, the locals went out and built a hundred King Drags of their own within the week, which makes sense, because if you had asked a farmer at the turn of the century what single invention could make his life easier he would probably have wanted something that improved the roads. They were, in the late nineteenth century, a disaster: of the country's two million miles of roads, fewer than a hundred and fifty thousand had been upgraded with gravel or oil. The rest were dirt. They turned into rivers of mud when it was raining, and hardened into an impassable sea of ruts when it was not. A trip to church or to go shopping was an exhausting ordeal for many farmers. At one point in the early part of this century, economists estimated that it cost more to haul a bushel of wheat along ten miles of American dirt road than it did to ship it across the ocean from New York to Liverpool.
The King Road Drag was a simple invention that had the effect of reducing the isolation of the American farmer, and soon that simple invention led to all kinds of dramatic changes. Ever since the Post Office was established, for example, farmers had to make the difficult trek into town to pick up their mail. In the eighteen-nineties, Congress pledged that mail would be delivered free to every farmer's home, but only so long as rural communities could demonstrate that their roads were good enough for a mailman to pass by every day--which was a Catch-22 neatly resolved by the King Road Drag. And once you had rural free delivery and good roads, something like parcel post became inevitable. Through the beginning of the century, all packages that weighed more than four pounds were carried by private-express services, which were unreliable and expensive and would, outside big cities, deliver only to a set of depots. But if the mail was being delivered every day to rural dwellers, why not have the mailman deliver packages, too? In 1912, Congress agreed, and with that the age of the mail-order house began: now a farmer could look through a catalogue that contained many thousands of products and have them delivered right to his door. Smaller companies, with limited resources, had a way to bypass the middleman and reach customers all over the country. You no longer needed to sell to the consumer through actual stores made of bricks and mortar. You could build a virtual store!
In the first fifteen years of this century, in other words, America underwent something of a revolution. Before rural free delivery, if you didn't live in a town--and most Americans didn't--it wasn't really practical to get a daily newspaper. It was only after daily delivery that the country became "wired," in the sense that if something happened in Washington or France or the Congo one evening, everyone would know about it by the next morning. In 1898, mailmen were delivering about eighteen thousand pieces of mail per rural route. Within five years, that number had more than doubled, and by 1929 it had topped a hundred thousand.
Here was the dawn of the modern consumer economy--an economy in which information moved freely around the country, in which retailers and consumers, buyers and sellers became truly connected for the first time. "You may go to an average store, spend valuable time and select from a limited stock at retail prices," the fall 1915 Sears, Roebuck catalogue boasted, "or have our Big Store of World Wide Stocks at Economy Prices come to you in this catalog--the Modern Way." By the turn of the century, the Sears catalogue had run to over a thousand pages, listing tens of thousands of items in twenty-four departments: music, buggies, stoves, carriage hardware, drugs, vehicles, shoes, notions, sewing machines, cloaks, sporting goods, dry goods, hardware, groceries, furniture and baby carriages, jewelry, optical goods, books, stereopticons, men's clothing, men's furnishings, bicycles, gramophones, and harnesses. Each page was a distinct site, offering a reader in-depth explanations and descriptions well beyond what he would expect if he went to a store, talked to a sales clerk, and personally examined a product. To find all those products, the company employed scores of human search engines--"missionaries" who, the historians Boris Emmet and John Jeuck write, were "said to travel constantly, inspecting the stocks of virtually all retail establishments in the country, conversing with the public at large to discover their needs and desires, and buying goods 'of all kinds and descriptions'" in order to post them on the World Wide Stock.
The catalogue, as economists have argued, represented a radical transformation in the marketing and distribution of consumer goods. But, of course, that transformation would not have been possible unless you had parcel post, and you couldn't have had parcel post unless you had rural free delivery, and you could not have had rural free delivery without good roads, and you would not have had good roads without D. Ward King. So what was the genuine revolution? Was it the World Wide Stock or was it the King Road Drag?
2.
We are now, it is said, in the midst of another business revolution. "This new economy represents a tectonic upheaval in our commonwealth, a far more turbulent reordering than mere digital hardware has produced," Kevin Kelly, a former executive editor of Wired, writes in his book "New Rules for the New Economy." In "Cyber Rules," the software entrepreneurs Thomas M. Siebel and Pat House compare the advent of the Internet to the invention of writing, the appearance of a metal currency in the eastern Mediterranean several thousand years ago, and the adoption of the Arabic zero. "Business," Bill Gates states flatly in the opening sentence of "Business @ the Speed of Thought," "is going to change more in the next ten years than it has in the last fifty."
The revolution of today, however, turns out to be as difficult to define as the revolution of a hundred years ago. Kelly, for example, writes that because of the Internet "the new economy is about communication, deep and wide." Communication, he maintains, "is not just a sector of the economy. Communication is the economy." But which is really key--how we communicate, or what we communicate? Gates, meanwhile, is preoccupied with the speed of interaction in the new economy. Going digital, he writes, will "shatter the old way of doing business" because it will permit almost instant communication. Yet why is the critical factor how quickly I communicate some decision or message to you--as opposed to how long it takes me to make that decision, or how long it takes you to act on it? Gates called his book "Business @ the Speed of Thought," but thought is a slow and messy thing. Computers do nothing to speed up our thought process; they only make it a lot faster to communicate our thoughts once we've had them. Gates should have called his book "Business @ the Speed of Typing." In "Growing Up Digital," Don Tapscott even goes so far as to claim that the rise of the Internet has created an entirely new personality among the young. N-Geners, as Tapscott dubs the generation, have a different set of assumptions about work than their parents have:
They thrive on collaboration, and many find the notion of a boss somewhat bizarre....They are driven to innovate and have a mindset of immediacy requiring fast results. They love hard work because working, learning, and playing are the same thing to them. They are creative in ways their parents could only imagine....Corporations who hire them should be prepared to have their windows and walls shaken.
Let's leave aside the fact that the qualities Tapscott ascribes to the Net Generation--energy, a "mindset of immediacy," creativity, a resistance to authority, and (of all things) sharp differences in outlook from their parents--could safely have been ascribed to every upcoming generation in history. What's interesting here is the blithe assumption, which runs through so much of the thinking and talking about the Internet, that this new way of exchanging information must be at the root of all changes now sweeping through our economy and culture. In these last few weeks before Christmas, as the country's magazines and airwaves become crowded with advertisements for the fledgling class of dot coms, we may be tempted to concur. But is it possible that, once again, we've been dazzled by the catalogues and forgotten the roads?
3.
The world's largest on-line apparel retailer is Lands' End, in Wisconsin. Lands' End began in 1963 as a traditional mail-order company. It mailed you its catalogue, and you mailed back your order along with a check. Then, in the mid-nineteen-eighties, Lands' End, like the rest of the industry, reinvented itself. It mailed you its catalogue, and you telephoned an 800 number with your order and paid with a credit card. Now Lands' End has moved on line. In the first half of this year, E-commerce sales accounted for ten per cent of Lands' End's total business, up two hundred and fifty per cent from last year. What has this move to the Web meant?
Lands' End has its headquarters in the tiny farming town of Dodgeville, about an hour's drive west of Madison, through the rolling Midwestern countryside. The main Lands' End campus is composed of half a dozen modern, low-slung buildings, clustered around a giant parking lot. In one of those buildings, there is a huge open room filled with hundreds of people sitting in front of computer terminals and wearing headsets. These are the people who take your orders. Since the bulk of Lands' End's business is still driven by the catalogue and the 800 number, most of those people are simply talking on the phone to telephone customers. But a growing percentage of the reps are now part of the company's Internet team, serving people who use the Lands' End Live feature on the company's Web site. Lands' End Live allows customers, with the click of a mouse, to start a live chat with a Lands' End representative or get a rep to call them at home, immediately.
On a recent fall day, a Lands' End Live user--let's call her Betty--was talking to one of the company's customer-service reps, a tall, red-haired woman named Darcia. Betty was on the Lands' End Web site to buy a pair of sweatpants for her young daughter, and had phoned to ask a few questions.
"What size did I order last year?" Betty asked. "I think I need one size bigger." Darcia looked up the record of Betty's purchase. Last year, she told Betty, she bought the same pants in big- kid's small.
"I'm thinking medium or large," Betty said. She couldn't decide.
"The medium is a ten or a twelve, really closer to a twelve," Darcia told her. "I'm thinking if you go to a large, it will throw you up to a sixteen, which is really big."
Betty agreed. She wanted the medium. But now she had a question about delivery. It was Thursday morning, and she needed the pants by Tuesday. Darcia told her that the order would go out on Friday morning, and with U.P.S. second-day air she would almost certainly get it by Tuesday. They briefly discussed spending an extra six dollars for the premium, next-day service, but Darcia talked Betty out of it. It was only an eighteen-dollar order, after all.
Betty hung up, her decision made, and completed her order on the Internet. Darcia started an on-line chat with a woman from the East Coast. Let's call her Carol. Carol wanted to buy the forty-nine-dollar attaché case but couldn't decide on a color. Darcia was partial to the dark olive, which she said was "a professional alternative to black." Carol seemed convinced, but she wanted the case monogrammed and there were eleven monogramming styles on the Web-site page.
"Can I have a personal suggestion?" she wrote.
"Sure," Darcia typed back. "Who is the case for?"
"A conservative psychiatrist," Carol replied.
Darcia suggested block initials, in black. Carol agreed, and sent the order in herself on the Internet. "All right," Darcia said, as she ended the chat. "She feels better." The exchange had taken twenty-three minutes.
Notice that in each case the customer filled out the actual order herself and sent it in to the Lands' End computer electronically--which is, of course, the great promise of E-commerce. But that didn't make the human element irrelevant. The customers still needed Darcia for advice on colors and styles, or for reassurance that their daughter was a medium and not a large. In each case, the sale was closed because that human interaction allayed the last-minute anxieties and doubts that so many of us have at the point of purchase. It's a mistake, in other words, to think that E-commerce will entirely automate the retail process. It just turns reps from order-takers into sales advisers.
"One of the big fallacies when the Internet came along was that you could get these huge savings by eliminating customer- service costs," Bill Bass, the head of E-commerce for Lands' End, says. "People thought the Internet was self-service, like a gas station. But there are some things that you cannot program a computer to provide. People will still have questions, and what you get are much higher-level questions. Like, 'Can you help me come up with a gift?' And they take longer."
Meanwhile, it turns out, Internet customers at Lands' End aren't much different from 800-number customers. Both groups average around a hundred dollars an order, and they have the same rate of returns. Call volume on the 800 numbers is highest on Mondays and Tuesdays, from ten in the morning until one in the afternoon. So is E-commerce volume. In the long term, of course, the hope is that the Web site will reduce dependence on the catalogue, and that would be a huge efficiency. Given that last year the company mailed two hundred and fifty million catalogues, costing about a dollar each, the potential savings could be enormous. And yet customers' orders on the Internet spike just after a new catalogue arrives at people's homes in exactly the same way that the 800-number business spikes just after the catalogue arrives. E-commerce users, it seems, need the same kind of visual, tangible prompting to use Lands' End as traditional customers. If Lands' End did all its business over the Internet, it would still have to send out something in the mail--a postcard or a bunch of fabric swatches or a slimmed-down catalogue. "We thought going into E-commerce it would be a different business," Tracy Schmit, an Internet analyst at the company, says. "But it's the same business, the same patterns, the same contacts. It's an extension of what we already do."
4.
Now consider what happens on what retailers call the "back end"--the customer-fulfillment side--of Lands' End's operations. Say you go to the company's Web site one afternoon and order a blue 32-16 oxford-cloth button-down shirt and a pair of size-9 Top-Siders. At midnight, the computer at Lands' End combines your order with all the other orders for the day: it lumps your shirt order with the hundred other orders, say, that came in for 32-16 blue oxford-cloth button-downs, and lumps your shoe order with the fifty other size-9 Top-Sider orders of the day. It then prints bar codes for every item, so each of those hundred shirts is assigned a sticker listing the location of blue oxford 32-16 shirts in the warehouse, the order that it belongs to, shipping information, and instructions for things like monogramming.
The next morning, someone known as a "picker" finds the hundred oxford-cloth shirts in that size, yours among them, and puts a sticker on each one, as does another picker in the shoe area with the fifty size-9 Top-Siders. Each piece of merchandise is placed on a yellow plastic tray along an extensive conveyor belt, and as the belt passes underneath a bar-code scanner the computer reads the label and assembles your order. The tray with your shirt on it circles the room until it is directly above a bin that has been temporarily assigned to you, and then tilts, sending the package sliding downward. Later, when your shoes come gliding along on the belt, the computer reads the bar code on the box and sends the shoe box tumbling into the same bin. Then the merchandise is packed and placed on another conveyor belt, and a bar-code scanner sorts the packages once again, sending the New York-bound packages to the New York-bound U.P.S. truck, the Detroit packages to the Detroit truck, and so on.
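A sketch of that overnight batching and bar-code sortation, with invented order data (the field names and routing logic here are assumptions; the real Lands' End system is obviously far more elaborate): the day's orders are regrouped by item so a picker can pull each style in one pass, each piece carries a label tying it back to its order, and the scanner routes every scanned piece into the bin assigned to that order.

# Toy sketch of overnight batch consolidation and bar-code sortation.
# Order data, field names, and routing logic are invented for illustration.
from collections import defaultdict

orders = [
    {"order_id": 1, "items": ["oxford 32-16 blue", "top-sider 9"]},
    {"order_id": 2, "items": ["oxford 32-16 blue"]},
    {"order_id": 3, "items": ["top-sider 9", "chinos 34-32"]},
]

# 1. Midnight batching: group the day's demand by item, so a picker pulls
#    every unit of one style in a single trip instead of one order at a time.
pick_lists = defaultdict(list)          # item -> order_ids that need it
for order in orders:
    for item in order["items"]:
        pick_lists[item].append(order["order_id"])

# 2. Each physical piece gets a label tying it back to its order.
labeled_pieces = [(item, oid) for item, ids in pick_lists.items() for oid in ids]

# 3. Sortation: as the scanner reads each label, the piece tips into the
#    bin temporarily assigned to its order.
bins = defaultdict(list)                # order_id -> contents of its bin
for item, oid in labeled_pieces:
    bins[oid].append(item)

for oid, contents in sorted(bins.items()):
    print(oid, contents)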
It's an extraordinary operation. When you stand in the middle of the Lands' End warehouse--while shirts and pants and sweaters and ties roll by at a rate that, at Christmas, can reach twenty-five thousand items an hour--you feel as if you're in Willy Wonka's chocolate factory. The warehouses are enormous buildings--as big, in all, as sixteen football fields--and the conveyor belts hang from the ceiling like giant pieces of industrial sculpture. Every so often, a belt lurches to a halt, and a little black scanner box reads the bar code and sends the package off again, directing it left or right or up or down, onto any number of separate sidings and overpasses. In the middle of one of the buildings, there is another huge room where thousands of pants, dangling from a jumbo-sized railing like a dry cleaner's rack, are sorted by color (so sewers don't have to change thread as often) and by style, then hemmed, pressed, bagged, and returned to the order-fulfillment chain--all within a day.
This system isn't unique to Lands' End. If you went to L. L. Bean or J.Crew or, for that matter, a housewares-catalogue company like Pottery Barn, you'd find the same kind of system. It's what all modern, automated warehouses look like, and it is as much a part of E-commerce as a Web site. In fact, it is the more difficult part of E-commerce. Consider the problem of the Christmas rush. Lands' End records something like thirty per cent of its sales during November and December. A well-supported Web site can easily handle those extra hits, but for the rest of the operation that surge in business represents a considerable strain. Lands' End, for example, aims to respond to every phone call or Lands' End Live query within twenty seconds, and to ship out every order within twenty-four hours of its receipt. In August, those goals are easily met. But, to maintain that level of service in November and December, Lands' End must hire an extra twenty-six hundred people, increasing its normal payroll by more than fifty per cent. Since unemployment in the Madison area is hovering around one per cent, this requires elaborate planning: the company charters buses to bring in students from a nearby college, and has made a deal in the past with a local cheese factory to borrow its workforce for the rush. Employees from other parts of the company are conscripted to help out as pickers, while others act as "runners" in the customer-service department, walking up and down the aisles and jumping into any seat made vacant by someone taking a break. Even the structure of the warehouse is driven, in large part, by the demands of the holiday season. Before the popularization of the bar code, in the early nineteen-eighties, Lands' End used what is called an "order picking" method. That meant that the picker got your ticket, then went to the shirt room and got your shirt, and the shoe room and got your shoes, then put your order together. If another shoe-and-shirt order came over next, she would have to go back to the shirts and back to the shoes all over again. A good picker under the old system could pick between a hundred and fifty and a hundred and seventy-five pieces an hour. The new technique, known as "batch picking," is so much more efficient that a good picker can now retrieve between six hundred and seven hundred pieces an hour. Without bar codes, if you placed an order in mid-December, you'd be hard pressed to get it by Christmas.
None of this is to minimize the significance of the Internet. Lands' End has a feature on its Web site which allows you to try clothes on a virtual image of yourself--a feature that is obviously not possible with a catalogue. The Web site can list all the company's merchandise, whereas a catalogue has space to list only a portion of the inventory. But how big a role does the Internet ultimately play in E-commerce? It doesn't much affect the cost of running a customer-service department. It reduces catalogue costs, but it doesn't eliminate traditional marketing, because you still have to remind people of your Web site. You still need to master batch picking. You still need the Willy Wonka warehouse. You still need dozens of sewers in the inseaming department, and deals with the local cheese factory, and buses to ship in students every November and December. The head of operations for Lands' End is a genial man in his fifties named Phil Schaecher, who works out of a panelled office decorated with paintings of ducks which overlooks the warehouse floor. When asked what he would do if he had to choose between the two great innovations of the past twenty years--the bar code, which has transformed the back end of his business, and the Internet, which is transforming the front end--Schaecher paused, for what seemed a long time. "I'd take the Internet," he said finally, toeing the line that all retailers follow these days. Then he smiled. "But of course if we lost bar codes I'd retire the next day."
5.
On a recent fall morning, a young woman named Charlene got a call from a shipping agent at a firm in Oak Creek, Wisconsin. Charlene is a dispatcher with a trucking company in Akron, Ohio, called Roberts Express. She sits in front of a computer with a telephone headset on, in a large crowded room filled with people in front of computers wearing headsets, not unlike the large crowded room at Lands' End. The shipping agent told Charlene that she had to get seven drums of paint to Muskegon, Michigan, as soon as possible. It was 11:25 a.m. Charlene told the agent she would call her back, and immediately typed those details into her computer, which relayed the message to the two-way-communications satellite that serves as the backbone for the Roberts transportation network. The Roberts satellite, in turn, "pinged" the fifteen hundred independent truckers that Roberts works with, and calculated how far each available vehicle was from the customer in Oak Creek. Those data were then analyzed by proprietary software, which sorted out the cost of the job and the distance between Muskegon and Oak Creek, and sifted through more than fifteen variables governing the optimal distribution of the fleet.
This much--the satellite relay and the probability calculation--took a matter of seconds. The trip, Charlene's screen told her, was two hundred and seventy-four miles and would cost seven hundred and twenty-six dollars. The computer also gave her twenty-three candidates for the run, ranked in order of preference. The first, Charlene realized, was ineligible, because federal regulations limit the number of hours drivers can spend on the road. The second, she found out, was being held for another job. The third, according to the satellite, was fifty miles away, which was too far. But the fourth, a husband-and-wife team named Jerry and Ann Love, seemed ideal. They were just nineteen miles from Oak Creek. "I've worked with them before," Charlene said. "They're really nice people." At eleven-twenty-seven, Charlene sent the Loves an E-mail message, via satellite, that would show up instantly on the computer screens Roberts installs in the cabs of all its contractors. According to Roberts' rules, they had ten minutes to respond. "I'm going to give them a minute or two," Charlene said. There was no answer, so she called the Loves on their cell phone. Ann Love answered. "We'll do that," she said. Charlene chatted with her for a moment and then, as an afterthought, E-mailed the Loves again: "Thank you!" It was eleven-thirty.
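A rough sketch of the dispatch triage described above, with invented trucks and cutoffs; the real Roberts software weighs more than fifteen variables, while this toy version models only availability, hours on the road, and distance from the pickup, then ranks whoever survives the filter.

# Toy sketch of dispatch candidate ranking. Trucks, thresholds, and the
# ordering rule are illustrative assumptions, not the Roberts system.

MAX_PICKUP_MILES = 40      # assumed cutoff for "close enough to the customer"
MAX_DRIVING_HOURS = 10     # stand-in for the federal hours-of-service limit

trucks = [
    {"name": "truck_a", "miles_away": 12, "hours_driven": 11, "held": False},
    {"name": "truck_b", "miles_away": 30, "hours_driven": 3,  "held": True},
    {"name": "truck_c", "miles_away": 50, "hours_driven": 2,  "held": False},
    {"name": "loves",   "miles_away": 19, "hours_driven": 4,  "held": False},
]

def eligible(truck):
    return (not truck["held"]
            and truck["hours_driven"] < MAX_DRIVING_HOURS
            and truck["miles_away"] <= MAX_PICKUP_MILES)

candidates = sorted((t for t in trucks if eligible(t)), key=lambda t: t["miles_away"])
for truck in candidates:
    print(truck["name"], truck["miles_away"], "miles from the pickup")
# The dispatcher offers the run to the top candidate and works down the list.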
Trucking companies didn't work this way twenty years ago. But Roberts uses its state-of-the-art communications and computer deployment to give the shipping business a new level of precision. If your pickup location is within twenty-five miles of one of the company's express centers--and Roberts has express centers in most major North American cities--Roberts will pick up a package of almost any size within ninety minutes, and it will do so twenty-four hours a day, seven days a week. If the cargo is located between twenty-six and fifty miles from an express center, it will be picked up within two hours. More than half of those deliveries will be made by midnight of the same day. Another twenty-five per cent will be made by eight o'clock the next morning. Ninety-six per cent of all Roberts deliveries are made within fifteen minutes of the delivery time promised when the order is placed. Because of its satellite system, the company knows precisely, within yards, where your order is at all times. The minute the computer tells her your truck is running fifteen minutes behind, Charlene or one of her colleagues will call you to work out some kind of solution. Roberts has been known to charter planes or send in Huey helicopters to rescue time-sensitive cargo stranded in traffic or in a truck that has broken down. The result is a truck-based system so efficient that Roberts estimates it can outperform air freight at distances of up to seven hundred or eight hundred miles.
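The pickup-window rule in that paragraph reduces to a simple distance test; the sketch below encodes just the two tiers the article quotes, and the behavior beyond fifty miles is left open because the article doesn't say.

# Toy sketch of the pickup-window rule quoted above. The function name and
# the handling of distances over fifty miles are assumptions.

def pickup_window_minutes(miles_from_express_center):
    if miles_from_express_center <= 25:
        return 90           # picked up within ninety minutes
    if miles_from_express_center <= 50:
        return 120          # picked up within two hours
    return None             # outside the quoted tiers; terms unspecified

print(pickup_window_minutes(19))   # 90
print(pickup_window_minutes(40))   # 120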
Roberts, of course, isn't the only company to reinvent the delivery business over the past twenty years. In the same period, Federal Express has put together, from scratch, a network of six hundred and forty-three planes, forty-three thousand five hundred vehicles, fourteen hundred service centers, thirty-four thousand drop boxes, and a hundred and forty-eight thousand employees--all coordinated by satellite links and organized around a series of huge, automated, bar-code-driven Willy Wonka warehouses. Federal Express was even a pioneer in the development of aircraft antifog navigational equipment: if it absolutely, positively has to get there overnight, the weather can't be allowed to get in the way.
E-commerce would be impossible without this extraordinary infrastructure. Would you care that you could order a new wardrobe with a few clicks of a mouse if the package took a couple of weeks to get to you? Lands' End has undergone three major changes over the past couple of decades. The first was the introduction of an 800 number, in 1978; the second was express delivery, in 1994; and the third was the introduction of a Web site, in 1995. The first two innovations cut the average transaction time--the time between the moment of ordering and the moment the goods are received--from three weeks to four days. The third innovation has cut the transaction time from four days to, well, four days.
It isn't just that E-commerce depends on express mail; there's a sense in which E-commerce is express mail. Right now, billions of dollars are being spent around the country on so-called "last-mile delivery systems." Companies such as Webvan, in San Francisco, or Kozmo.com, in New York, are putting together networks of trucks and delivery personnel which can reach almost any home in their area within an hour. What if Webvan or Kozmo were somehow integrated into a huge, national, Roberts-style network of connected trucks? And what if that network were in turn integrated into the operations of a direct merchant like Lands' End? There may soon come a time when a customer from Northampton could order some shirts on LandsEnd.com at the height of the Christmas rush, knowing that the retailer's computer could survey its stock, assess its warehouse capabilities, "ping" a network of thousands of trucks it has at its disposal, look up how many other orders are going to his neck of the woods, check in with his local Kozmo or Webvan, and tell him, right then and there, precisely what time it could deliver those shirts to him that evening or the next morning. It's not hard to imagine, under such a system, that Lands' End's sales would soar; the gap between the instant gratification of a real store and the delayed gratification of a virtual store would narrow even further. It would be a revolution of sorts, a revolution of satellites, probability models, people in headsets, cell phones, truckers, logistics experts, bar codes, deals with the local cheese factory, and--oh yes, the Internet.
The interesting question, of course, is why we persist in identifying the E-commerce boom as an Internet revolution. Part of the reason, perhaps, is simply the convenience of the word "Internet" as a shorthand for all the technological wizardry of the last few decades. But surely whom and what we choose to celebrate in any period of radical change says something about the things we value. This fall, for example, the Goodyear Tire & Rubber Company--a firm with sales of more than thirteen billion dollars--was dropped from the Dow Jones industrial average. After all, Goodyear runs factories, not Web sites. It is based in Akron, not in Silicon Valley. It is part of the highway, not the information highway. The manufacturing economy of the early twentieth century, from which Goodyear emerged, belonged to trade unions and blue-collar men. But ours is the first economic revolution in history that the educated classes have sought to claim as wholly their own, a revolution of Kevin Kelly's "communication" and Bill Gates's "thought"--the two activities for which the Net-Geners believe themselves to be uniquely qualified. Today's talkers and thinkers value the conception of ideas, not their fulfillment. They give credit to the catalogue, but not to the postman who delivered it, or to the road he travelled on. The new economy was supposed to erase all hierarchies. Instead, it has devised another one. On the front end, there are visionaries. On the back end, there are drones.
6.
One of the very first packages ever delivered by parcel post, in 1913, was an eight-pound crate of apples sent from New Jersey to President Wilson at the White House. The symbolism of that early delivery was deliberate. When the parcel post was established, the assumption was that it would be used by farmers as a way of sending their goods cheaply and directly to customers in the city. "Let us imagine that the Gotham family," one journalist wrote at the time, "immured in the city by the demands of Father Gotham's business, knew that twice a week during the summer they could get from Farmer Ruralis, forty miles out in the country, a hamper of fresh-killed poultry, green peas, string beans, asparagus, strawberries, lettuce, cherries, summer squash, and what not; that the 'sass' would be only a day from garden to table; that prices would be lower than market prices; that the cost of transportation would be only thirty-five cents in and, say, eleven cents for the empty hamper back again. Would the Gotham family be interested?"
The Post Office told rural mailmen to gather the names and addresses of all those farmers along their routes who wanted to sell their produce by mail. Those lists were given to city mailmen, who delivered them along their routes, so interested customers could get in contact with interested farmers directly. Because customers wanted to know what kind of produce each farmer had to sell, local postmasters began including merchandise information on their lists, essentially creating a farm-produce mail-order catalogue. A California merchant named David Lubin proposed a scheme whereby a farmer would pick up colored cards from the post office--white for eggs, pink for chickens, yellow for butter--mark each card with his prices, and mail the cards back. If he had three chickens that week for a dollar each, he would mail three pink cards to the post office. There they would be put in a pigeonhole with all the other pink cards. Customers could come by and comparison shop, pick out the cards they liked, write their address on these cards, and have the postal clerk mail them back to the farmer. It was a pre-digital eBay. The scheme was adopted in and around Sacramento, and Congress appropriated ten thousand dollars to try a similar version of it on a large scale.
At about the same time, an assistant Postmaster General, James Blakslee, had the bright idea of putting together a fleet of parcel-post trucks, which would pick up farm produce from designated spots along the main roads and ship it directly to town. Blakslee laid out four thousand miles of produce routes around the country, to be covered by fifteen hundred parcel-post trucks. In 1918, in the system's inaugural run, four thousand day-old chicks, two hundred pounds of honey, five hundred pounds of smoked sausage, five hundred pounds of butter, and eighteen thousand eggs were carried from Lancaster, Pennsylvania, to New York City, all for $31.60 in postage. New York's Secretary of State called it "an epoch in the history of the United States and the world."
Only, it wasn't. The Post Office had devised a wonderful way of communicating between farmer and customer. But there is more to a revolution than communication, and within a few years the farm-to-table movement, which started out with such high hopes, was dead. The problem was that Blakslee's trucks began to break down, which meant that the food on board spoiled. Eggs proved hard to package, and so they often arrived damaged. Butter went rancid. In the winter of 1919-20, Blakslee collected a huge number of orders for potatoes, but, as Wayne Fuller writes in his wonderful history of the era, "RFD: The Changing Face of Rural America," the potatoes that year "were scarce, and good ones even scarcer, and when Blakslee's men were able to buy them and attempted delivery, nothing but trouble followed. Some of the potatoes were spoiled to begin with; some froze in transit; prices varied, deliveries went astray, and customers complained loudly enough for Congress to hear. One harried official wrote Blakslee that he could 'fill the mails with complaints from people who have ordered potatoes from October to December.' . . . Some people had been waiting over four months, either to have the potatoes delivered or their money refunded."
Parcel post, in the end, turned out to be something entirely different from what was originally envisioned--a means not to move farm goods from country to town but to move consumer goods from town to country. That is the first lesson from the revolution of a hundred years ago, and it's one that should give pause to all those eager to pronounce on the significance of the Internet age: the nature of revolutions is such that you never really know what they mean until they are over. The other lesson, of course, is that coming up with a new way of connecting buyers and sellers is a very fine thing, but what we care about most of all is getting our potatoes.
John Rock's Error
March 10, 2000
ANNALS OF MEDICINE
What the co-inventor of the Pill didn't know about menstruation can endanger women's health.
1.
John Rock was christened in 1890 at the Church of the Immaculate Conception in Marlborough, Massachusetts, and married by Cardinal William O'Connell, of Boston. He had five children and nineteen grandchildren. A crucifix hung above his desk, and nearly every day of his adult life he attended the 7 a.m. Mass at St. Mary's in Brookline. Rock, his friends would say, was in love with his church. He was also one of the inventors of the birth-control pill, and it was his conviction that his faith and his vocation were perfectly compatible. To anyone who disagreed he would simply repeat the words spoken to him as a child by his home-town priest: "John, always stick to your conscience. Never let anyone else keep it for you. And I mean anyone else." Even when Monsignor Francis W. Carney, of Cleveland, called him a "moral rapist," and when Frederick Good, the longtime head of obstetrics at Boston City Hospital, went to Boston's Cardinal Richard Cushing to have Rock excommunicated, Rock was unmoved. "You should be afraid to meet your Maker," one angry woman wrote to him, soon after the Pill was approved. "My dear madam," Rock wrote back, "in my faith, we are taught that the Lord is with us always. When my time comes, there will be no need for introductions."
In the years immediately after the Pill was approved by the F.D.A., in 1960, Rock was everywhere. He appeared in interviews and documentaries on CBS and NBC, in Time, Newsweek, Life, The Saturday Evening Post. He toured the country tirelessly. He wrote a widely discussed book, "The Time Has Come: A Catholic Doctor's Proposals to End the Battle Over Birth Control," which was translated into French, German, and Dutch. Rock was six feet three and rail-thin, with impeccable manners; he held doors open for his patients and addressed them as "Mrs." or "Miss." His mere association with the Pill helped make it seem respectable. "He was a man of great dignity," Dr. Sheldon J. Segal, of the Population Council, recalls. "Even if the occasion called for an open collar, you'd never find him without an ascot. He had the shock of white hair to go along with that. And posture, straight as an arrow, even to his last year." At Harvard Medical School, he was a giant, teaching obstetrics for more than three decades. He was a pioneer in in-vitro fertilization and the freezing of sperm cells, and was the first to extract an intact fertilized egg. The Pill was his crowning achievement. His two collaborators, Gregory Pincus and Min-Chueh Chang, worked out the mechanism. He shepherded the drug through its clinical trials. "It was his name and his reputation that gave ultimate validity to the claims that the pill would protect women against unwanted pregnancy," Loretta McLaughlin writes in her marvellous 1982 biography of Rock. Not long before the Pill's approval, Rock travelled to Washington to testify before the F.D.A. about the drug's safety. The agency examiner, Pasquale DeFelice, was a Catholic obstetrician from Georgetown University, and at one point, the story goes, DeFelice suggested the unthinkable--that the Catholic Church would never approve of the birth-control pill. "I can still see Rock standing there, his face composed, his eyes riveted on DeFelice," a colleague recalled years later, "and then, in a voice that would congeal your soul, he said, 'Young man, don't you sell my church short.' "
In the end, of course, John Rock's church disappointed him. In 1968, in the encyclical "Humanae Vitae," Pope Paul VI outlawed oral contraceptives and all other "artificial" methods of birth control. The passion and urgency that animated the birth-control debates of the sixties are now a memory. John Rock still matters, though, for the simple reason that in the course of reconciling his church and his work he made an error. It was not a deliberate error. It became manifest only after his death, and through scientific advances he could not have anticipated. But because that mistake shaped the way he thought about the Pill--about what it was, and how it worked, and most of all what it meant--and because John Rock was one of those responsible for the way the Pill came into the world, his error has colored the way people have thought about contraception ever since.
John Rock believed that the Pill was a "natural" method of birth control. By that he didn't mean that it felt natural, because it obviously didn't for many women, particularly not in its earliest days, when the doses of hormone were many times as high as they are today. He meant that it worked by natural means. Women can get pregnant only during a certain interval each month, because after ovulation their bodies produce a surge of the hormone progesterone. Progesterone--one of a class of hormones known as progestin--prepares the uterus for implantation and stops the ovaries from releasing new eggs; it favors gestation. "It is progesterone, in the healthy woman, that prevents ovulation and establishes the pre- and post-menstrual 'safe' period," Rock wrote. When a woman is pregnant, her body produces a stream of progestin in part for the same reason, so that another egg can't be released and threaten the pregnancy already under way. Progestin, in other words, is nature's contraceptive. And what was the Pill? Progestin in tablet form. When a woman was on the Pill, of course, these hormones weren't coming in a sudden surge after ovulation and weren't limited to certain times in her cycle. They were being given in a steady dose, so that ovulation was permanently shut down. They were also being given with an additional dose of estrogen, which holds the endometrium together and--as we've come to learn--helps maintain other tissues as well. But to Rock, the timing and combination of hormones wasn't the issue. The key fact was that the Pill's ingredients duplicated what could be found in the body naturally. And in that naturalness he saw enormous theological significance.
In 1951, for example, Pope Pius XII had sanctioned the rhythm method for Catholics because he deemed it a "natural" method of regulating procreation: it didn't kill the sperm, like a spermicide, or frustrate the normal process of procreation, like a diaphragm, or mutilate the organs, like sterilization. Rock knew all about the rhythm method. In the nineteen-thirties, at the Free Hospital for Women, in Brookline, he had started the country's first rhythm clinic for educating Catholic couples in natural contraception. But how did the rhythm method work? It worked by limiting sex to the safe period that progestin created. And how did the Pill work? It worked by using progestin to extend the safe period to the entire month. It didn't mutilate the reproductive organs, or damage any natural process. "Indeed," Rock wrote, oral contraceptives "may be characterized as a 'pill-established safe period,' and would seem to carry the same moral implications" as the rhythm method. The Pill was, to Rock, no more than "an adjunct to nature."
In 1958, Pope Pius XII approved the Pill for Catholics, so long as its contraceptive effects were "indirect"--that is, so long as it was intended only as a remedy for conditions like painful menses or "a disease of the uterus." That ruling emboldened Rock still further. Short-term use of the Pill, he knew, could regulate the cycle of women whose periods had previously been unpredictable. Since a regular menstrual cycle was necessary for the successful use of the rhythm method--and since the rhythm method was sanctioned by the Church--shouldn't it be permissible for women with an irregular menstrual cycle to use the Pill in order to facilitate the use of rhythm? And if that was true why not take the logic one step further? As the federal judge John T. Noonan writes in "Contraception," his history of the Catholic position on birth control:
If it was lawful to suppress ovulation to achieve a regularity necessary for successfully sterile intercourse, why was it not lawful to suppress ovulation without appeal to rhythm? If pregnancy could be prevented by pill plus rhythm, why not by pill alone? In each case suppression of ovulation was used as a means. How was a moral difference made by the addition of rhythm?
These arguments, as arcane as they may seem, were central to the development of oral contraception. It was John Rock and Gregory Pincus who decided that the Pill ought to be taken over a four-week cycle--a woman would spend three weeks on the Pill and the fourth week off the drug (or on a placebo), to allow for menstruation. There was and is no medical reason for this. A typical woman of childbearing age has a menstrual cycle of around twenty-eight days, determined by the cascades of hormones released by her ovaries. As first estrogen and then a combination of estrogen and progestin flood the uterus, its lining becomes thick and swollen, preparing for the implantation of a fertilized egg. If the egg is not fertilized, hormone levels plunge and cause the lining--the endometrium--to be sloughed off in a menstrual bleed. When a woman is on the Pill, however, no egg is released, because the Pill suppresses ovulation. The fluxes of estrogen and progestin that cause the lining of the uterus to grow are dramatically reduced, because the Pill slows down the ovaries. Pincus and Rock knew that the effect of the Pill's hormones on the endometrium was so modest that women could conceivably go for months without having to menstruate. "In view of the ability of this compound to prevent menstrual bleeding as long as it is taken," Pincus acknowledged in 1958, "a cycle of any desired length could presumably be produced." But he and Rock decided to cut the hormones off after three weeks and trigger a menstrual period because they believed that women would find the continuation of their monthly bleeding reassuring. More to the point, if Rock wanted to demonstrate that the Pill was no more than a natural variant of the rhythm method, he couldn't very well do away with the monthly menses. Rhythm required "regularity," and so the Pill had to produce regularity as well.
It has often been said of the Pill that no other drug has ever been so instantly recognizable by its packaging: that small, round plastic dial pack. But what was the dial pack if not the physical embodiment of the twenty-eight-day cycle? It was, in the words of its inventor, meant to fit into a case "indistinguishable" from a woman's cosmetics compact, so that it might be carried "without giving a visual clue as to matters which are of no concern to others." Today, the Pill is still often sold in dial packs and taken in twenty-eight-day cycles. It remains, in other words, a drug shaped by the dictates of the Catholic Church--by John Rock's desire to make this new method of birth control seem as natural as possible. This was John Rock's error. He was consumed by the idea of the natural. But what he thought was natural wasn't so natural after all, and the Pill he ushered into the world turned out to be something other than what he thought it was. In John Rock's mind the dictates of religion and the principles of science got mixed up, and only now are we beginning to untangle them.
2.
In 1986, a young scientist named Beverly Strassmann travelled to Africa to live with the Dogon tribe of Mali. Her research site was the village of Sangui in the Sahel, about a hundred and twenty miles south of Timbuktu. The Sahel is thorn savannah, green in the rainy season and semi-arid the rest of the year. The Dogon grow millet, sorghum, and onions, raise livestock, and live in adobe houses on the Bandiagara escarpment. They use no contraception. Many of them have held on to their ancestral customs and religious beliefs. Dogon farmers, in many respects, live much as people of that region have lived since antiquity. Strassmann wanted to construct a precise reproductive profile of the women in the tribe, in order to understand what female biology might have been like in the millennia that preceded the modern age. In a way, Strassmann was trying to answer the same question about female biology that John Rock and the Catholic Church had struggled with in the early sixties: what is natural? Only, her sense of "natural" was not theological but evolutionary. In the era during which natural selection established the basic patterns of human biology--the natural history of our species--how often did women have children? How often did they menstruate? When did they reach puberty and menopause? What impact did breast-feeding have on ovulation? These questions had been studied before, but never so thoroughly that anthropologists felt they knew the answers with any certainty.
Strassmann, who teaches at the University of Michigan at Ann Arbor, is a slender, soft-spoken woman with red hair, and she recalls her time in Mali with a certain wry humor. The house she stayed in while in Sangui had been used as a shelter for sheep before she came and was turned into a pigsty after she left. A small brown snake lived in her latrine, and would curl up in a camouflaged coil on the seat she sat on while bathing. The villagers, she says, were of two minds: was it a deadly snake--Kere me jongolo, literally, "My bite cannot be healed"--or a harmless mouse snake? (It turned out to be the latter.) Once, one of her neighbors and best friends in the tribe roasted her a rat as a special treat. "I told him that white people aren't allowed to eat rat because rat is our totem," Strassmann says. "I can still see it. Bloated and charred. Stretched by its paws. Whiskers singed. To say nothing of the tail." Strassmann meant to live in Sangui for eighteen months, but her experiences there were so profound and exhilarating that she stayed for two and a half years. "I felt incredibly privileged," she says. "I just couldn't tear myself away."
Part of Strassmann's work focussed on the Dogon's practice of segregating menstruating women in special huts on the fringes of the village. In Sangui, there were two menstrual huts--dark, cramped, one-room adobe structures, with boards for beds. Each accommodated three women, and when the rooms were full, latecomers were forced to stay outside on the rocks. "It's not a place where people kick back and enjoy themselves," Strassmann says. "It's simply a nighttime hangout. They get there at dusk, and get up early in the morning and draw their water." Strassmann took urine samples from the women using the hut, to confirm that they were menstruating. Then she made a list of all the women in the village, and for her entire time in Mali--seven hundred and thirty-six consecutive nights--she kept track of everyone who visited the hut. Among the Dogon, she found, a woman, on average, has her first period at the age of sixteen and gives birth eight or nine times. From menarche, the onset of menstruation, to the age of twenty, she averages seven periods a year. Over the next decade and a half, from the age of twenty to the age of thirty-four, she spends so much time either pregnant or breast-feeding (which, among the Dogon, suppresses ovulation for an average of twenty months) that she averages only slightly more than one period per year. Then, from the age of thirty-five until menopause, at around fifty, as her fertility rapidly declines, she averages four menses a year. All told, Dogon women menstruate about a hundred times in their lives. (Those who survive early childhood typically live into their seventh or eighth decade.) By contrast, the average for contemporary Western women is somewhere between three hundred and fifty and four hundred times.
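A rough tally of Strassmann's own figures shows where that hundred comes from (reading "slightly more than one period per year" as roughly 1.2, an assumption for the sake of the arithmetic):

\[
\underbrace{4 \times 7}_{\text{ages 16--20}} \;+\; \underbrace{15 \times 1.2}_{\text{ages 20--34}} \;+\; \underbrace{15 \times 4}_{\text{ages 35--50}} \;\approx\; 28 + 18 + 60 \;\approx\; 106 \text{ lifetime menses.}
\]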
Strassmann's office is in the basement of a converted stable next to the Natural History Museum on the University of Michigan campus. Behind her desk is a row of battered filing cabinets, and as she was talking she turned and pulled out a series of yellowed charts. Each page listed, on the left, the first names and identification numbers of the Sangui women. Across the top was a time line, broken into thirty-day blocks. Every menses of every woman was marked with an X. In the village, Strassmann explained, there were two women who were sterile, and, because they couldn't get pregnant, they were regulars at the menstrual hut. She flipped through the pages until she found them. "Look, she had twenty-nine menses over two years, and the other had twenty-three." Next to each of their names was a solid line of x's. "Here's a woman approaching menopause," Strassmann went on, running her finger down the page. "She's cycling but is a little bit erratic. Here's another woman of prime childbearing age. Two periods. Then pregnant. I never saw her again at the menstrual hut. This woman here didn't go to the menstrual hut for twenty months after giving birth, because she was breast-feeding. Two periods. Got pregnant. Then she miscarried, had a few periods, then got pregnant again. This woman had three menses in the study period." There weren't a lot of x's on Strassmann's sheets. Most of the boxes were blank. She flipped back through her sheets to the two anomalous women who were menstruating every month. "If this were a menstrual chart of undergraduates here at the University of Michigan, all the rows would be like this."
Strassmann does not claim that her statistics apply to every preindustrial society. But she believes--and other anthropological work backs her up--that the number of lifetime menses isn't greatly affected by differences in diet or climate or method of subsistence (foraging versus agriculture, say). The more significant factors, Strassmann says, are things like the prevalence of wet-nursing or sterility. But over all she believes that the basic pattern of late menarche, many pregnancies, and long menstrual-free stretches caused by intensive breast-feeding was virtually universal up until the "demographic transition" of a hundred years ago from high to low fertility. In other words, what we think of as normal--frequent menses--is in evolutionary terms abnormal. "It's a pity that gynecologists think that women have to menstruate every month," Strassmann went on. "They just don't understand the real biology of menstruation."
To Strassmann and others in the field of evolutionary medicine, this shift from a hundred to four hundred lifetime menses is enormously significant. It means that women's bodies are being subjected to changes and stresses that they were not necessarily designed by evolution to handle. In a brilliant and provocative book, "Is Menstruation Obsolete?," Drs. Elsimar Coutinho and Sheldon J. Segal, two of the world's most prominent contraceptive researchers, argue that this recent move to what they call "incessant ovulation" has become a serious problem for women's health. It doesn't mean that women are always better off the less they menstruate. There are times--particularly in the context of certain medical conditions--when women ought to be concerned if they aren't menstruating: In obese women, a failure to menstruate can signal an increased risk of uterine cancer. In female athletes, a failure to menstruate can signal an increased risk of osteoporosis. But for most women, Coutinho and Segal say, incessant ovulation serves no purpose except to increase the occurrence of abdominal pain, mood shifts, migraines, endometriosis, fibroids, and anemia--the last of which, they point out, is "one of the most serious health problems in the world."
Most serious of all is the greatly increased risk of some cancers. Cancer, after all, occurs because as cells divide and reproduce they sometimes make mistakes that cripple the cells' defenses against runaway growth. That's one of the reasons that our risk of cancer generally increases as we age: our cells have more time to make mistakes. But this also means that any change promoting cell division has the potential to increase cancer risk, and ovulation appears to be one of those changes. Whenever a woman ovulates, an egg literally bursts through the walls of her ovaries. To heal that puncture, the cells of the ovary wall have to divide and reproduce. Every time a woman gets pregnant and bears a child, her lifetime risk of ovarian cancer drops ten per cent. Why? Possibly because, between nine months of pregnancy and the suppression of ovulation associated with breast-feeding, she stops ovulating for twelve months--and saves her ovarian walls from twelve bouts of cell division. The argument is similar for endometrial cancer. When a woman is menstruating, the estrogen that flows through her uterus stimulates the growth of the uterine lining, causing a flurry of potentially dangerous cell division. Women who do not menstruate frequently spare the endometrium that risk. Ovarian and endometrial cancer are characteristically modern diseases, consequences, in part, of a century in which women have come to menstruate four hundred times in a lifetime.
In this sense, the Pill really does have a "natural" effect. By blocking the release of new eggs, the progestin in oral contraceptives reduces the rounds of ovarian cell division. Progestin also counters the surges of estrogen in the endometrium, restraining cell division there. A woman who takes the Pill for ten years cuts her ovarian-cancer risk by around seventy per cent and her endometrial-cancer risk by around sixty per cent. But here "natural" means something different from what Rock meant. He assumed that the Pill was natural because it was an unobtrusive variant of the body's own processes. In fact, as more recent research suggests, the Pill is really only natural in so far as it's radical--rescuing the ovaries and endometrium from modernity. That Rock insisted on a twenty-eight-day cycle for his pill is evidence of just how deep his misunderstanding was: the real promise of the Pill was not that it could preserve the menstrual rhythms of the twentieth century but that it could disrupt them.
Today, a growing movement of reproductive specialists has begun to campaign loudly against the standard twenty-eight-day pill regimen. The drug company Organon has come out with a new oral contraceptive, called Mircette, that cuts the seven-day placebo interval to two days. Patricia Sulak, a medical researcher at Texas A. & M. University, has shown that most women can probably stay on the Pill, straight through, for six to twelve weeks before they experience breakthrough bleeding or spotting. More recently, Sulak has documented precisely what the cost of the Pill's monthly "off" week is. In a paper in the February issue of the journal Obstetrics and Gynecology, she and her colleagues documented something that will come as no surprise to most women on the Pill: during the placebo week, the number of users experiencing pelvic pain, bloating, and swelling more than triples, breast tenderness more than doubles, and headaches increase by almost fifty per cent. In other words, some women on the Pill continue to experience the kinds of side effects associated with normal menstruation. Sulak's paper is a short, dry, academic work, of the sort intended for a narrow professional audience. But it is impossible to read it without being struck by the consequences of John Rock's desire to please his church. In the past forty years, millions of women around the world have been given the Pill in such a way as to maximize their pain and suffering. And to what end? To pretend that the Pill was no more than a pharmaceutical version of the rhythm method?
3.
In 1980 and 1981, Malcolm Pike, a medical statistician at the University of Southern California, travelled to Japan for six months to study at the Atomic Bomb Casualties Commission. Pike wasn't interested in the effects of the bomb. He wanted to examine the medical records that the commission had been painstakingly assembling on the survivors of Hiroshima and Nagasaki. He was investigating a question that would ultimately do as much to complicate our understanding of the Pill as Strassmann's research would a decade later: why did Japanese women have breast-cancer rates six times lower than American women?
In the late forties, the World Health Organization began to collect and publish comparative health statistics from around the world, and the breast-cancer disparity between Japan and America had come to obsess cancer specialists. The obvious answer--that Japanese women were somehow genetically protected against breast cancer--didn't make sense, because once Japanese women moved to the United States they began to get breast cancer almost as often as American women did. As a result, many experts at the time assumed that the culprit had to be some unknown toxic chemical or virus unique to the West. Brian Henderson, a colleague of Pike's at U.S.C. and his regular collaborator, says that when he entered the field, in 1970, "the whole viral- and chemical-carcinogenesis idea was huge--it dominated the literature." As he recalls, "Breast cancer fell into this large, unknown box that said it was something to do with the environment--and that word 'environment' meant a lot of different things to a lot of different people. They might be talking about diet or smoking or pesticides."
Henderson and Pike, however, became fascinated by a number of statistical peculiarities. For one thing, the rate of increase in breast-cancer risk rises sharply throughout women's thirties and forties and then, at menopause, it starts to slow down. If a cancer is caused by some toxic outside agent, you'd expect that rate to rise steadily with each advancing year, as the number of mutations and genetic mistakes steadily accumulates. Breast cancer, by contrast, looked as if it were being driven by something specific to a woman's reproductive years. What was more, younger women who had had their ovaries removed had a markedly lower risk of breast cancer; when their bodies weren't producing estrogen and progestin every month, they got far fewer tumors. Pike and Henderson became convinced that breast cancer was linked to a process of cell division similar to that of ovarian and endometrial cancer. The female breast, after all, is just as sensitive to the level of hormones in a woman's body as the reproductive system. When the breast is exposed to estrogen, the cells of the terminal-duct lobular unit--where most breast cancer arises--undergo a flurry of division. And during the mid-to-late stage of the menstrual cycle, when the ovaries start producing large amounts of progestin, the pace of cell division in that region doubles.
It made intuitive sense, then, that a woman's risk of breast cancer would be linked to the amount of estrogen and progestin her breasts have been exposed to during her lifetime. How old a woman is at menarche should make a big difference, because the beginning of puberty results in a hormonal surge through a woman's body, and the breast cells of an adolescent appear to be highly susceptible to the errors that result in cancer. (For more complicated reasons, bearing children turns out to be protective against breast cancer, perhaps because in the last two trimesters of pregnancy the cells of the breast mature and become much more resistant to mutations.) How old a woman is at menopause should matter, and so should how much estrogen and progestin her ovaries actually produce, and even how much she weighs after menopause, because fat cells turn other hormones into estrogen.
Pike went to Hiroshima to test the cell-division theory. With other researchers at the medical archive, he looked first at the age when Japanese women got their period. A Japanese woman born at the turn of the century had her first period at sixteen and a half. American women born at the same time had their first period at fourteen. That difference alone, by their calculation, was sufficient to explain forty per cent of the gap between American and Japanese breast-cancer rates. "They had collected amazing records from the women of that area," Pike said. "You could follow precisely the change in age of menarche over the century. You could even see the effects of the Second World War. The age of menarche of Japanese girls went up right at that point because of poor nutrition and other hardships. And then it started to go back down after the war. That's what convinced me that the data were wonderful."
Pike, Henderson, and their colleagues then folded in the other risk factors. Age at menopause, age at first pregnancy, and number of children weren't sufficiently different between the two countries to matter. But weight was. The average post-menopausal Japanese woman weighed a hundred pounds; the average American woman weighed a hundred and forty-five pounds. That fact explained another twenty-five per cent of the difference. Finally, the researchers analyzed blood samples from women in rural Japan and China, and found that their ovaries--possibly because of their extremely low-fat diet--were producing about seventy-five per cent the amount of estrogen that American women were producing. Those three factors, added together, seemed to explain the breast-cancer gap. They also appeared to explain why the rates of breast cancer among Asian women began to increase when they came to America: on an American diet, they started to menstruate earlier, gained more weight, and produced more estrogen. The talk of chemicals and toxins and power lines and smog was set aside. "When people say that what we understand about breast cancer explains only a small amount of the problem, that it is somehow a mystery, it's absolute nonsense," Pike says flatly. He is a South African in his sixties, with graying hair and a salt-and-pepper beard. Along with Henderson, he is an eminent figure in cancer research, but no one would ever accuse him of being tentative in his pronouncements. "We understand breast cancer extraordinarily well. We understand it as well as we understand cigarettes and lung cancer."
What Pike discovered in Japan led him to think about the Pill, because a tablet that suppressed ovulation--and the monthly tides of estrogen and progestin that come with it--obviously had the potential to be a powerful anti-breast-cancer drug. But the breast was a little different from the reproductive organs. Progestin prevented ovarian cancer because it suppressed ovulation. It was good for preventing endometrial cancer because it countered the stimulating effects of estrogen. But in breast cells, Pike believed, progestin wasn't the solution; it was one of the hormones that caused cell division. This is one explanation for why, after years of studying the Pill, researchers have concluded that it has no effect one way or the other on breast cancer: whatever beneficial effect results from what the Pill does is cancelled out by how it does it. John Rock touted the fact that the Pill used progestin, because progestin was the body's own contraceptive. But Pike saw nothing "natural" about subjecting the breast to that heavy a dose of progestin. In his view, the amount of progestin and estrogen needed to make an effective contraceptive was much greater than the amount needed to keep the reproductive system healthy--and that excess was unnecessarily raising the risk of breast cancer. A truly natural Pill might be one that found a way to suppress ovulation without using progestin. Throughout the nineteen-eighties, Pike recalls, this was his obsession. "We were all trying to work out how the hell we could fix the Pill. We thought about it day and night."
4.
Pike's proposed solution is a class of drugs known as GnRHAs, which have been around for many years. GnRHAs disrupt the signals that the pituitary gland sends when it is attempting to order the manufacture of sex hormones. In effect, the drug is a circuit breaker. "We've got substantial experience with this drug," Pike says. Men suffering from prostate cancer are sometimes given a GnRHA to temporarily halt the production of testosterone, which can exacerbate their tumors. Girls suffering from what's called precocious puberty--puberty at seven or eight, or even younger--are sometimes given the drug to forestall sexual maturity. If you give GnRHA to women of childbearing age, it stops their ovaries from producing estrogen and progestin. If the conventional Pill works by convincing the body that it is, well, a little bit pregnant, Pike's pill would work by convincing the body that it was menopausal.
In the form Pike wants to use it, GnRHA will come in a clear glass bottle the size of a saltshaker, with a white plastic mister on top. It will be inhaled nasally. It breaks down in the body very quickly. A morning dose simply makes a woman menopausal for a while. Menopause, of course, has its risks. Women need estrogen to keep their hearts and bones strong. They also need progestin to keep the uterus healthy. So Pike intends to add back just enough of each hormone to solve these problems, but much less than women now receive on the Pill. Ideally, Pike says, the estrogen dose would be adjustable: women would try various levels until they found one that suited them. The progestin would come in four twelve-day stretches a year. When someone on Pike's regimen stopped the progestin, she would have one of four annual menses.
Pike and an oncologist named Darcy Spicer have joined forces with another oncologist, John Daniels, in a startup called Balance Pharmaceuticals. The firm operates out of a small white industrial strip mall next to the freeway in Santa Monica. One of the tenants is a paint store, another looks like some sort of export company. Balance's offices are housed in an oversized garage with a big overhead door and concrete floors. There is a tiny reception area, a little coffee table and a couch, and a warren of desks, bookshelves, filing cabinets, and computers. Balance is testing its formulation on a small group of women at high risk for breast cancer, and if the results continue to be encouraging, it will one day file for F.D.A. approval.
"When I met Darcy Spicer a couple of years ago," Pike said recently, as he sat at a conference table deep in the Balance garage, "he said, 'Why don't we just try it out? By taking mammograms, we should be able to see changes in the breasts of women on this drug, even if we add back a little estrogen to avoid side effects.' So we did a study, and we found that there were huge changes." Pike pulled out a paper he and Spicer had published in the Journal of the National Cancer Institute, showing breast X-rays of three young women. "These are the mammograms of the women before they start," he said. Amid the grainy black outlines of the breast were large white fibrous clumps--clumps that Pike and Spicer believe are indicators of the kind of relentless cell division that increases breast-cancer risk. Next to those x-rays were three mammograms of the same women taken after a year on the GnRHA regimen. The clumps were almost entirely gone. "This to us represents that we have actually stopped the activity inside the breasts," Pike went on. "White is a proxy for cell proliferation. We're slowing down the breast."
Pike stood up from the table and turned to a sketch pad on an easel behind him. He quickly wrote a series of numbers on the paper. "Suppose a woman reaches menarche at fifteen and menopause at fifty. That's thirty-five years of stimulating the breast. If you cut that time in half, you will change her risk not by half but by half raised to the power of 4.5." He was working with a statistical model he had developed to calculate breast-cancer risk. "That's one-twenty-third. Your risk of breast cancer will be one-twenty-third of what it would be otherwise. It won't be zero. You can't get to zero. If you use this for ten years, your risk will be cut by at least half. If you use it for five years, your risk will be cut by at least a third. It's as if your breast were to be five years younger, or ten years younger--forever." The regimen, he says, should also provide protection against ovarian cancer.
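The arithmetic behind those figures follows from the exponent Pike quotes: in his model, relative risk scales roughly with the years of breast stimulation raised to the power 4.5. Worked through with only the numbers he cites, and thirty-five years as the baseline (a sketch, not the model itself):

\[
\left(\tfrac{17.5}{35}\right)^{4.5} = (0.5)^{4.5} \approx 0.044 \approx \tfrac{1}{23}, \qquad
\left(\tfrac{25}{35}\right)^{4.5} \approx 0.22, \qquad
\left(\tfrac{30}{35}\right)^{4.5} \approx 0.50,
\]

which is why halving the years of exposure gives roughly one-twenty-third the risk, ten years on the regimen cuts it by well over half, and five years cuts it by at least a third.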
Pike gave the sense that he had made this little speech many times before, to colleagues, to his family and friends--and to investors. He knew by now how strange and unbelievable what he was saying sounded. Here he was, in a cold, cramped garage in the industrial section of Santa Monica, arguing that he knew how to save the lives of hundreds of thousands of women around the world. And he wanted to do that by making young women menopausal through a chemical regimen sniffed every morning out of a bottle. This was, to say the least, a bold idea. Could he strike the right balance between the hormone levels women need to stay healthy and those that ultimately make them sick? Was progestin really so important in breast cancer? There are cancer specialists who remain skeptical. And, most of all, what would women think? John Rock, at least, had lent the cause of birth control his Old World manners and distinguished white hair and appeals from theology; he took pains to make the Pill seem like the least radical of interventions--nature's contraceptive, something that could be slipped inside a woman's purse and pass without notice. Pike was going to take the whole forty-year mythology of "natural" and sweep it aside. "Women are going to think, I'm being manipulated here. And it's a perfectly reasonable thing to think." Pike's South African accent gets a little stronger as he becomes more animated. "But the modern way of living represents an extraordinary change in female biology. Women are going out and becoming lawyers, doctors, presidents of countries. They need to understand that what we are trying to do isn't abnormal. It's just as normal as when someone hundreds of years ago had menarche at seventeen and had five babies and had three hundred fewer menstrual cycles than most women have today. The world is not the world it was. And some of the risks that go with the benefits of a woman getting educated and not getting pregnant all the time are breast cancer and ovarian cancer, and we need to deal with it. I have three daughters. The earliest grandchild I had was when one of them was thirty-one. That's the way many women are now. They ovulate from twelve or thirteen until their early thirties. Twenty years of uninterrupted ovulation before their first child! That's a brand-new phenomenon!"
5.
John Rock's long battle on behalf of his birth-control pill forced the Church to take notice. In the spring of 1963, just after Rock's book was published, a meeting was held at the Vatican between high officials of the Catholic Church and Donald B. Straus, the chairman of Planned Parenthood. That summit was followed by another, on the campus of the University of Notre Dame. In the summer of 1964, on the eve of the feast of St. John the Baptist, Pope Paul VI announced that he would ask a committee of church officials to reëxamine the Vatican's position on contraception. The group met first at the Collegio San Jose, in Rome, and it was clear that a majority of the committee were in favor of approving the Pill. Committee reports leaked to the National Catholic Register confirmed that Rock's case appeared to be winning. Rock was elated. Newsweek put him on its cover, and ran a picture of the Pope inside. "Not since the Copernicans suggested in the sixteenth century that the sun was the center of the planetary system has the Roman Catholic Church found itself on such a perilous collision course with a new body of knowledge," the article concluded. Paul VI, however, was unmoved. He stalled, delaying a verdict for months, and then years. Some said he fell under the sway of conservative elements within the Vatican. In the interim, theologians began exposing the holes in Rock's arguments. The rhythm method " 'prevents' conception by abstinence, that is, by the non-performance of the conjugal act during the fertile period," the Catholic journal America concluded in a 1964 editorial. "The pill prevents conception by suppressing ovulation and by thus abolishing the fertile period. No amount of word juggling can make abstinence from sexual relations and the suppression of ovulation one and the same thing." On July 29, 1968, in the "Humanae Vitae" encyclical, the Pope broke his silence, declaring all "artificial" methods of contraception to be against the teachings of the Church.
In hindsight, it is possible to see the opportunity that Rock missed. If he had known what we know now and had talked about the Pill not as a contraceptive but as a cancer drug--not as a drug to prevent life but as one that would save life--the church might well have said yes. Hadn't Pius XII already approved the Pill for therapeutic purposes? Rock would only have had to think of the Pill as Pike thinks of it: as a drug whose contraceptive aspects are merely a means of attracting users, of getting, as Pike put it, "people who are young to take a lot of stuff they wouldn't otherwise take."
But Rock did not live long enough to understand how things might have been. What he witnessed, instead, was the terrible time at the end of the sixties when the Pill suddenly stood accused--wrongly--of causing blood clots, strokes, and heart attacks. Between the mid-seventies and the early eighties, the number of women in the United States using the Pill fell by half. Harvard Medical School, meanwhile, took over Rock's Reproductive Clinic and pushed him out. His Harvard pension paid him only seventy-five dollars a year. He had almost no money in the bank and had to sell his house in Brookline. In 1971, Rock left Boston and retreated to a farmhouse in the hills of New Hampshire. He swam in the stream behind the house. He listened to John Philip Sousa marches. In the evening, he would sit in the living room with a pitcher of martinis. In 1983, he gave his last public interview, and it was as if the memory of his achievements was now so painful that he had blotted it out.
He was asked what the most gratifying time of his life was. "Right now," the inventor of the Pill answered, incredibly. He was sitting by the fire in a crisp white shirt and tie, reading "The Origin," Irving Stone's fictional account of the life of Darwin. "It frequently occurs to me, gosh, what a lucky guy I am. I have no responsibilities, and I have everything I want. I take a dose of equanimity every twenty minutes. I will not be disturbed about things."
Once, John Rock had gone to seven-o'clock Mass every morning and kept a crucifix above his desk. His interviewer, the writer Sara Davidson, moved her chair closer to his and asked him whether he still believed in an afterlife.
"Of course I don't," Rock answered abruptly. Though he didn't explain why, his reasons aren't hard to imagine. The church could not square the requirements of its faith with the results of his science, and if the church couldn't reconcile them how could Rock be expected to? John Rock always stuck to his conscience, and in the end his conscience forced him away from the thing he loved most. This was not John Rock's error. Nor was it his church's. It was the fault of the haphazard nature of science, which all too often produces progress in advance of understanding. If the order of events in the discovery of what was natural had been reversed, his world, and our world, too, would have been a different place.
"Heaven and Hell, Rome, all the Church stuff--that's for the solace of the multitude," Rock said. He had only a year to live. "I was an ardent practicing Catholic for a long time, and I really believed it all then, you see."
The Young Garmentos
April 24, 2000
LETTER FROM LOS ANGELES
The T-shirt trade becomes a calling.
1.
Dov Charney started his T-shirt business, American Apparel, on the corner of Santa Fe Avenue and the 10 Freeway, a mile or so from downtown Los Angeles. Actually, his factory was built directly underneath the eastbound and westbound lanes, and the roof over the room where the cutters and sewers work was basically the freeway itself, so that the clicking and clacking of sewing machines mixed with the rumble of tractor trailers. It was not, as Dov was the first to admit, an ideal location, with the possible exception that it was just two blocks from the Playpen, the neighborhood strip bar, which made it awfully convenient whenever he decided to conduct a fitting. "Big companies tend to hire fitting models at a hundred bucks an hour," Dov explained recently as he headed over to the Playpen to test some of his new T-shirts. "But they only give you one look. At a strip bar, you get a cross-section of chicks. You've got big chicks, little chicks, big-assed chicks, little-assed chicks, chicks with big tits, and chicks with little tits. You couldn't ask for a better place to fit a shirt."
He had three of his staff with him, and half a dozen samples of his breakthrough Classic Girl line of "baby T's," in this case shirts with ribbed raglan three-quarter sleeves in lilac and pink. He walked quickly, leaning forward slightly, as if to improve his aerodynamics. Dov is thirty-one years old and has thick black hair and blue-tinted aviator glasses, and tends to dress in khakis and knit vintage shirts, with one of his own T-shirts as an undergarment. In front of the Playpen, Dov waved to the owner, a middle-aged Lebanese man in a red guayabera, and ushered his group into the gloom of the bar. At this hour--two o'clock in the afternoon--the Playpen was almost empty; just one girl gyrated for a customer, to what sounded like the music from "Ali Baba and the Forty Thieves." The situation was ideal, because it meant the rest of the girls had time to model.
The first to come over was Diana, dark-haired and buxom. She slipped out of a yellow mesh dress and pulled on one of Dov's baby T's. Dov examined her critically. He was concerned about the collar. The Classic Girl is supposed to have a snug fit, with none of the torquing and bowing that plague lesser shirts. But the prototype was bunching around the neck. Dov gestured to one of his colleagues. "Olin, look what's going on here. I think there's too much binding going into the machine." Diana turned around, and wiggled her behind playfully. Dov pulled the T-shirt tight. "I think it could be a little longer here," he said, pursing his lips. Baby T's, in their earlier incarnation, were short, in some cases above the belly button--something that Dov considers a mistake. The music was now deafening, and over a loudspeaker a "lap-dance promo" was being announced. Dov, oblivious, turned his attention to Mandy, a svelte, long-legged blonde in a black bikini. On her, Dov observed, the shirt did not fit so "emphatically" around the chest as it had on Diana. Dov looked Mandy up and down, tugging and pulling to get the shirt just right. "When you're doing a fitting, often the more oddly shaped girl will tell you a lot more," he said. By now, a crowd of strippers was gathering around him, presumably attracted by the novelty of being asked by a customer to put clothes on. But Dov had seen all he needed to. His life's great cause--which is to produce the world's finest T-shirt for between three and four dollars wholesale--had advanced another step. "What did I learn today?" he asked, as he strode out the door. "I learned that my sleeves are perfect. But I see a quality problem with the collar." He thought for a moment. "And I definitely have to add an inch to the garment."
2.
There is a town in upstate New York, just north and west of Albany, called Gloversville, so named because in the late nineteenth century and the early part of the twentieth century ninety-five per cent of the fine gloves sold in the United States were manufactured there. At one time, there were a hundred and sixteen glove factories in the town, employing twelve thousand people and turning out fifteen million dollars' worth of gloves a year. New glove start-ups appeared all the time, whenever some glove entrepreneur--some ambitious handschumacher--had a better idea about how to make a glove. A trade journal, Glovers Review, covered the industry's every step. Local firms--such as Jacob Adler & Co. and Louis Meyers & Sons and Elite Glove Co.--became nationally known brands. When the pogroms of Eastern Europe intensified, in the eighteen-eighties, the Jewish glove cutters of Warsaw--the finest leather artisans of nineteenth-century Europe--moved en masse to Gloversville, because Gloversville was where you went in those days if you cared about gloves.
It's hard to imagine anyone caring so deeply about gloves, and had we visited Gloversville in its prime most of us would have found it a narrow and provincial place. But if you truly know gloves and think about them and dream about them and, more important, if you are surrounded every day by a community of people who know and think and dream about gloves, a glove becomes more than a glove. In Gloversville, there was an elaborate social hierarchy. The handschumacher considered himself socially and intellectually superior to the schuster and the schneider--the shoemaker and the tailor. To cover the hands, after all, was the highest calling. (As the glover's joke goes, "Did you ever see anyone talk using his boots?") Within the glove world, in turn, the "makers"--the silkers, the closers, and the fourchetters, who sewed the gloves--were inferior to the "cutters," who first confronted the hide, and who advertised their status by going to work wearing white shirts and collars, bow ties or cravats, tigereye cufflinks, and carefully pressed suits. A skilled cutter could glance at a glove and see in it the answers to a hundred questions. Is the leather mocha, the most pliable of all skins, taken from the hide of long-black-haired Arabian sheep? Or is it South African capeskin, the easiest to handle? Is it kid from Spain, peccary from the wild pigs of Brazil and Mexico, chamois from Europe, or cabretta, from a Brazilian hairy sheep? Is the finish "grained"--showing the outside of the hide--or "velvet," meaning that the leather has been buffed? Is it sewn in a full-piqué stitch or a half-piqué, an osann or an overseam? Do the color and texture of the fourchette--the strip of leather that forms the sides of the fingers--match the adjoining leather? The lesson of Gloversville is that behind every ordinary object is a group of people to whom that object is anything but ordinary.
Dov Charney lives in his own, modern-day version of Gloversville. He is part of a world that cares about T-shirts every bit as much as the handschumachers cared about peccary and cabretta. It is impossible to talk about Dov, for example, without talking about his best friend, Rick Klotz, who runs a clothing company named Fresh Jive, about a mile and a half from Dov's factory. Rick, who is thirty-two, designs short-sleeve shirts and baggy pants and pullovers and vests and printed T-shirts with exquisite graphics (featuring everything from an obscure typographical scheme to the Black Panthers). In the eighties, Rick was a punker, at least until everyone else got short hair, at which point he grew his hair long. Later, in his Ted Nugent-and-TransAm phase, he had, he says, a "big, filthy mustache, like Cheech." Now he is perfectly bald, and drives a black custom-made late-model Cadillac Fleetwood Limited, with a VCR in the back, and, because he sits very low in the seat, and bobs up and down to very loud hip-hop as he drives, the effect, from the street, is slightly comic, like that of a Ping-Pong ball in choppy water. When Dov first came to Los Angeles, a few years ago, he crashed at Rick's apartment in Hollywood, and the two grew so close that Rick believes he and Dov were "separated at birth."
"If it wasn't for Rick, I wouldn't have been able to make it," Dov says. "I slept on his couch. I checked in for a few days, stayed for a year." This was after an initial foray that Dov had made into the T-shirt business, in South Carolina in the early nineties, failed. "When he lived with me, he was on the brink," Rick added. "Every day was the same. Go to sleep at two with the phone. Then wake up at six to call back East. One time, he was just crying and losing it. It was just so heavy. I was, like, 'Dude, what are you doing?'"
What do Rick and Dov have in common? It isn't a matter of personality. Dov says that sometimes when he's out with Rick he'll spot one of Rick's T-shirts, and he'll shout, "There's one of your T-shirts!" Rick will look down and away, embarrassed, because he's so acutely aware of how uncool that sounds. Dov couldn't care less. When he spots his own work, he can hardly contain himself. "I always say, 'Hey' "--Dov put on the accent of his native Montreal--"'where did you get that shirt?' Like, if I'm on the subway in New York City. I say, 'You want some more?' I take my bag and give them out for free. I'm excited about it. I could be watching TV at night, or I could be watching a porno, and, boom, there is my T-shirt. I've made millions of them. I always know it!"
What the two of them share is a certain sensibility. Rick grew up in the Valley and Dov grew up in Montreal, but it's as if they were born and raised in the same small town, where the T-shirt was something that you lived and died for. At dinner one recent night in L.A., Rick talked about how he met Dov, several years ago, at a big trade show in Las Vegas. "I'm at this party sitting out on the balcony. I see this guy dancing and he's--what's the word?" And here Rick did a kind of spastic gyration in his seat. "Imbecilic. He didn't care what anybody thought. And he catches me looking and goes like this." Rick made two pistols out of his fingers, and fired one hand after another. "I was, like, in love."
Dov seemed touched. "You know, I knew of Rick long before I ever met him. His T-shirt graphics are some of the most respected T-shirt graphics in the world. I swear to God."
But Rick was being modest again. "No, they're not."
"If you mention Fresh Jive in most industrialized countries to people that know what good graphics are on T-shirts, they're, like . . . " Dov made an appreciative noise. "I swear, it's like a connoisseur's wine."
"Maybe at one time," Rick murmured.
"He is an artist!" Dov went on, his voice rising. "His canvas is fabric!"
3.
On the day that he made his foray to the Playpen, Dov met with a fortyish man named Jhean. In the garment-manufacturing business in Los Angeles, the up-and-coming entrepreneurs are Persian and Korean. (Dov has a partner who is Korean.) The occasional throwback, like Dov, is Jewish. Jhean, however, is Haitian. He used to work in government, but now he is in the garment business, a career change of which Dov heartily approved. Jhean was wearing tight black pants, a red silk shirt open to mid-chest, and a gold chain. Dov put his arm around him affectionately. "Jhean is a crazy man," he announced, to no one in particular. "He was going to be one of my partners. We were going to get this whole Montreal Jewish-Korean-Haitian thing going." Jhean turned away, and Dov lowered his voice to a whisper. "Jhean has it in his blood, you know," he said, meaning a feel for T-shirts.
Dov led Jhean outside, and they sat on a bench, the sun peeking through at them between the off-ramp and the freeway lanes. Jhean handed Dov a men's Fruit of the Loom undershirt, size medium. It was the reason for Jhean's visit. "Who can do this for me?" he asked.
Dov took the shirt and unfolded it slowly. He held it up in front of his eyes, as a mother might hold a baby, and let out a soft whistle. "This is an unbelievable garment," he said. "Nobody has the machines to make it, except for two parties that I'm aware of. Fruit of the Loom. And Hanes. The shirt is a two-by-one rib. They've taken out one or two of the needles. It's a coarse yarn. And it's tubular, so there is no waste. This is one of the most efficient garments in the world. It comes off the tube like a sock."
Some T-shirts have two seams down each side: they are made with "open width" fabric, by sewing together the front and the back of the T-shirt. This T-shirt had no seams. It was cut from cotton fabric that had been knitted into a T-shirt-size tube, which is a trickier procedure but means less wasted fabric, lower sewing costs, and less of the twisting that can distort a garment.
Dov began to run his fingers along the bottom of the shirt, which had been not hemmed but overlocked--with a stitch--to save even more fabric. "This costs, with the right equipment, maybe a dollar. My cost is a dollar-thirty, a dollar-fifty. The finest stuff is two-fifty, two-sixty. If you can make this shirt, you can make millions. But you can't make this shirt. Hanes actually does this even better than Fruit of the Loom. They've got this dialled down." Jhean wondered if he could side-seam it, but Dov just shook his head. "If you side-seam it, you lose the whole energy."
You could tell that Dov was speaking as much to himself as to Jhean. He was saying that he couldn't reproduce a masterpiece like that undershirt, either. But there was no defeat in his voice, because he knew enough about T-shirts to realize that there is more than one way to make a perfect garment. Dov likes to point out that the average American owns twenty-five T-shirts--twenty-five!--and, even if you reckon, as he does, that of those only between four and seven are in regular rotation, that's still an enormous market.
The garment in question was either eighteen- or twenty-singles yarn, which is standard for T-shirts. But what if a T-shirt maker were to use thirty-singles yarn, knitted on a fine-gauge machine, which produces a thinner, more "fashion-forward" fabric? The Fruit of the Loom piece was open-end cotton, and open-end is coarse. Dov likes "ring-spun combed" yarn, which is much softer, and costs an extra eighty cents a pound. Softness also comes from the way the fabric is processed before cutting, and Dov is a stickler for that kind of detail. "I have a lot of secret ingredients," he says. "Just like K.F.C. There is the amount of yarn in one revolution, which determines the tightness. There's the spacing of the needle. Then there's the finishing. What kind of chemicals are you using in the finishing? We think this through. We've developed a neurosis about this." In his teens, Dov hooked up with a friend who was selling printed T's outside the Montreal Forum, and Dov's contribution was to provide American Hanes instead of the Canadian poly-cotton-blend Penmans. The Hanes, Dov says, was "creamier," and he contended that the Canadian T-shirt consumer deserved that extra creaminess. When he's inspecting rolls of fabric, Dov will sometimes break into the plastic package wrap and run his hand over the cotton, palm flat, and if you look behind his tinted aviators you'll see that his eyes have closed slightly. Once, he held two white swatches up to the light, in order to demonstrate how one had "erections"--little fibres that stood up straight on the fabric--and the other did not, and then he ran his hand ever so slightly across the surface of the swatch he liked, letting the fibres tickle his palm. "I'm particular," Dov explained. "Like in my underwear. I'm very committed to Hanes thirty-two. I've been wearing it for twelve years. I sleep in it. And if Hanes makes any adjustments I'm picking it up. I watch. They change their labels, they use different countries to make their shit, I know."
Dov was back inside his factory now, going from the room where all the sewers sit, stitching up T-shirts, to a passageway lined with big rolls of fabric. The fact that Jhean's Fruit of the Loom undershirt was of rib fabric launched him on one of his favorite topics, which was the fabric he personally helped rediscover--baby rib. Baby rib is rib in which the ridges are so close together and the cotton is so fine that it looks like standard T-shirt jersey, and Dov's breakthrough was to realize that because of the way it stretches and supports and feels it was perfect for girls. "See this, that's conventional rib." He pulled on a piece of white fabric, exposing wide ridges of cotton. "It's knitted on larger machines. And it's a larger, bulkier yarn. It's poor-quality cotton. But girls want softness. So, rather than take the cheap road, I've taken the higher road." Dov's baby rib uses finer cotton and tighter stitching, and the fit is tighter across the chest and shoulders, the way he believes a T-shirt ought to look. "There were a few influences," he said, reflecting on the creative process that brought him to baby rib. "I'm not sure which girlfriend, but we can name some." He ticked them off on his fingers. "There's Marcella, from Argentina. I met her in South Beach. She wore these little tops made in South America. And they were finer than the tops that girls were wearing in the States. I got such a boner looking at her in that T-shirt that I thought, This is doing something for me. We've got to explore this opportunity. This was four, five years ago. O.K., I broke up with her, and I started going out with this stripper, Julie, from South Carolina. She had a gorgeous body. She was all-American. And, you know, Julie looked so great in those little T-shirts. She put one on and it meant something."
Dov pulled out a single typewritten page, a draft of a "mission statement" he was preparing for the industry. This was for a new line of Standard American T-shirts he wanted to start making--thirty-singles, ring-spun, tubular shirts knit on custom-made Asian equipment. "Dear Client," it began:
During the last ten years major T-shirt makers such as Hanes and Fruit of the Loom have focused on being "heavier" and generously cut. Innovation and style have been put aside, and there has been a perpetual price war during the last four years. The issues are who can be cheaper, bigger or heavier. . . . Concerns about fit or issues of softness or stretch have been the last priority and have been barely considered. In order to create leadership we have reconstructed the T-shirt and have made a deviation from the traditional "Beefy-T" styled garment. We have redone the typical pattern. It is slightly more fitted--especially in the sleeve and armhole opening. . . . Yes the fabric is lighter, and we think that is a positive aspect of the garment. The garment has a stretch that is reminiscent of T-shirts from decades ago.
Dov was peering over my shoulder as I read. "We're going to kick everybody's ass," he announced. "The finest T-shirts are six dollars a piece wholesale. The shittiest shirts are like two dollars. We're going to come in at three and have the right stuff. I'm making the perfect fit. I'm going to manufacture this like gasoline."
If you ask Dov why he's going to these lengths, he'll tell you that it matters to him that Americans can buy an affordable and high-quality T-shirt. That's an admirable notion, but, of course, most of us don't really know what constitutes a high-quality T-shirt: we don't run our hands over a swatch of cotton and let the little fibres tickle our palm, or ruminate on the difference between side-seaming and tubularity. For that matter, few people who bought dress gloves in 1900 knew the difference between a full-piqué and a half-piqué stitch, between high-grade and merely medium-grade peccary. Producers, the economics textbooks tell us, are disciplined by the scrutiny of the marketplace. Yet what of commonplace articles such as T-shirts and gloves, about which most customers don't know enough or care enough to make fine discriminations? Discipline really comes not from customers but from other producers. And here again the economics textbooks steer us wrong, because they place too much emphasis on the role of formal competitors, the Gap or Hanes or the other big glove-maker in your niche. To be sure, Dov can occasionally be inspired by a truly exceptional garment like, say, a two-by-one ribbed undershirt from Fruit of the Loom. But in Gloversville the critical person is not so much the distant rival as the neighbor who is also a contractor, or the guy at the bar downtown who used to be in the business, or the friend at synagogue who is also an expert glove-maker--all of whom can look at your work with a practiced eye and shame you if it isn't right. Dov is motivated to produce a high-quality T-shirt at three dollars because that would mean something to Jhean and to Olin and, most of all, to Rick, whose T-shirt graphics are respected around the world. In Gloversville, the market is not an economic mechanism but--and this is the real power of a place like that--a social one.
"Everybody got so technically obsessed with reduced shrinkage," Dov went on, and by "everyone" he meant a group of people you could count on the fingers of one hand. "That was a big mistake for the industry because they took away the natural stretch property of a lot of the jersey. If you look at vintage shirts, they had a lot of stretch. Today, they don't. They are like these print boards. They are practically woven in comparison. I say fuck the shrinkage. I have a theory on width shrinkage on rib: I don't care. In fact, you put it on, it will come back." He was pacing back and forth and talking even more rapidly than usual. "I'm concerned about linear shrinkage. But, if it doesn't have any width shrinkage at all, I become concerned, too. I have a fabric I'm working on with a T-shirt engineer. It keeps having zero width shrinkage. That's not desirable!"
Dov stopped. He had spotted something out of the corner of his eye. It was one of his workers, a young man with a mustache and a goatee and slicked-back hair. He was wearing a black custom T, with two white stripes down the arms. Dov started walking toward him. "Oh, my God. You want to see something?" He reached out and flipped up the tag at the back of the cutter's shirt. "It's a Fresh Jive piece. I made it for Rick five years ago. Somehow this shirt just trickled back here." The sweet serendipity of it all brought a smile to his face.
4.
While Dov was perfecting his baby T's, Rick was holding a fashion shoot for his elegant new women's-wear line, Fresh Jive Domestics, which had been conceived by a young designer named Jessica. The shoot was at Rick's friend Deidre's house, a right-angled, white-stuccoed, shag-rugged modernist masterpiece under the Hollywood sign. Deidre rents it from the drummer of the seventies supergroup Bread. Madonna's old house is several hundred yards to the west of Deidre's place, and Aldous Huxley used to live a few hundred yards in the other direction, with the result that her block functions as a kind of architectural enactment of postwar Los Angeles intellectual life. For Rick's purposes, though, the house's main points of attraction were its fabulous details, like the little white Star Trek seats around the kitchen counter and the white baby grand in the window with the autographed Hugh Hefner photo and the feisty brown-haired spitz-collie named Sage barricaded in the kitchen. Rick had a box of disposable cameras, and as he shot the models other people joined in with the disposables, so that in the end Rick would be able to combine both sets of pictures in a brag book. It made for a slightly chaotic atmosphere--particularly since there were at least seven highly active cell phones in the room, each with a different ring, competing with the hip-hop on the stereo--and in the midst of it all Rick walked over to the baby grand and, with a mischievous look on his face, played the opening chords of Beethoven's "Pathétique" sonata.
Rick was talking about his plans to open a Fresh Jive store in Los Angeles. But he kept saying that it couldn't be on Melrose Avenue, where all the street-wear stores are. "Maybe that would be good for sales," he said. Then he shook his head. "No way."
Deidre, who was lounging next to the baby grand, started laughing. "You know what, Rick?" she said. "I think it's all about a Fresh Jive store without any Fresh Jive stuff in it."
It was a joke, but in some way not a joke, because that's the sort of thing that Rick might actually do. He's terrified by the conventional. At dinner the previous evening, for example, he and Dov had talked about a particular piece--the sports-style V-necked raglan custom T with stripes that Dov had spotted on the cutter. Rick introduced it years ago and then stopped making it when everyone else started making it, too.
"One of our biggest retailers takes me into this room last year," Rick explained. "It's full of custom T-shirts. He said, 'You started this, and everybody else took advantage of it. But you didn't go with it.' He was pissed off at me."
The businessman in Rick knew that he shouldn't have given up on the shirt so quickly, that he could have made a lot more money had he stayed and exploited the custom-T market. But he couldn't do that, because if he had been in that room with all the other custom T's he risked being known in his world as the guy who started the custom-T trend and then ran out of new ideas. Retail chains like J.C. Penney and Millers Outpost sometimes come to Rick and ask if they can carry Fresh Jive, or ask if he will sell them a big run of a popular piece, and he usually says no. He will allow his clothes to appear only in certain street-wear boutiques. His ambition is to grow three times as big as he is now--to maybe a thirty-million-dollar company--but no larger.
This is the sensibility of the artisan, and it isn't supposed to play much of a role anymore. We live in the age of the entrepreneur, who responds rationally to global pressures and customer demands in order to maximize profit. To the extent that we still talk of Gloversville--and the glove-making business there has long since faded away--we talk of it as a place that people need to leave behind. There was Lucius N. Littauer, for example, who, having made his fortune with Littauer Brothers Glove Co., in downtown Gloversville, went on to Congress, became a confidant of Presidents McKinley and Roosevelt, and then put up the money for what is now the Kennedy School of Government, at Harvard University. There was Samuel Goldwyn, the motion-picture magnate, who began his career as a cutter with Gloversville's Elite Glove Co. In 1912, he jumped into the movie business. He went to Hollywood. He rode horses and learned to play tennis and croquet. Like so many immigrant Jews in the movie industry, he enacted through his films a very public process of assimilation. This is the oldest of American stories: the heroic young man who leaves the small town to play on the big stage--who wants to be an entrepreneur, not an artisan. But the truth is that we always get the story wrong. It isn't that Littauer and Goldwyn left Gloversville to find the real culture, because the real culture comes from Gloversville, too; places like Washington and Hollywood persist and renew themselves only because Littauers and Goldwyns arrive from time to time, bringing with them a little piece of the real thing.
"The one paranoia Rick has is that, God forbid, he makes something that another company has," Dov said, at dinner with Rick that night.
Rick nodded. "In my personal life. Ask Dov. Every piece of clothing I own. Nobody else can have it."
Rick was wearing a pair of jeans and a plain white T-shirt, but if you looked closely you noticed that it wasn't just any jeans-and- T-shirt ensemble. The pants were an unusual denim chino, from Rick's Beggars Banquet collection. And the shirt?
"That is a very well-thought-out item," Dov said, gesturing toward Rick. "It's a purple-label BVD. It's no longer available. Size medium. Of all the shirts I've studied, this one has a phenomenal fit."He reached across the table and ran his fingers around the lower edge of the sleeve. Dov is a believer in a T-shirt that is snug on the biceps. "It's not the greatest fabric. But it shrinks perfectly. I actually gave him that shirt. I came back from one of my customers in New York City, on Grand Street, that happens to resell that particular garment."
It's all of a piece, in the end: the purple-label BVD, the custom-T that he designed but now won't touch. If in Dov's world the true competitive pressures are not economic but social, Rick's version of Gloversville is driven not by the marketplace but by personality--the particular, restless truculence of the sort of person who will give up almost anything and go to any lengths not to be like anyone else.
"We're doing this line of casual shoes," Rick said, during a rare lull in one of Dov's T-shirt soliloquies. "One is the Crip Slip. It's that corduroy slipper that the gang kids would always wear. The other is the Wino, which is that really cheap canvas slipper that you can buy at K mart for seven dollars and that the winos wear when they're, like, really hung over." His big new idea, Rick explained, was to bring out a line of complementary twelve-inch dolls in those characters. "We could have a guy with baggy pants and a pushcart," he went on. "You know, you pull down his pants and there's skid marks. And we have a full gangster for the Crip Slip."
Rick was so excited about the idea that he was still talking about it the next day at work. He was with a Fresh Jive designer named Jupiter--a skateboarder from Las Vegas of German, Welsh, Irish, French, Chinese, and Spanish extraction--and a guy named Matt, who wore on his chest a gold-plated, diamond-encrusted Star of David the size of a Peppermint Pattie. "The idea is that the doll would pump the shoe, and the shoe would pump the doll," Rick said. "The doll for the Crip Slip would be totally gangster. The handkerchief. The plaid shirt or the wife beater. A forty in his hand. Flashing signs. Wouldn't that be crazy?" And then Rick caught himself. "Omigod. The doll for the Crip Slip will have interchangeable hands, with different gang signs!"
Matt looked awestruck: "Ohhh, that'll be sick!"
"Wooooow." Jupiter dragged the word out, and shook his head slowly. "That's crazy!"
5.
A few days later, Dov drove down to San Diego for Action Sports Retail, a big trade show in the street-wear world. Dov makes the rounds of A.S.R. twice a year, walking up and down through the cavernous conference center, stopping at the booths of hundreds of T-shirt companies and persuading people to buy his shirts wholesale for their lines. This year, he was busy scouting locations for American Apparel's new factory, and so he arrived a day late, clutching a motorized mini-scooter. To his great irritation, he wasn't allowed to carry it in. "This is the most uncool show," he announced, after haggling fruitlessly with the guard at the gate.
But his mood lifted quickly. How could it not? This was A.S.R., and everyone was wearing T-shirts or selling T-shirts, and because this was a place where people knew their T-shirts a lot of those T-shirts were Dov's. He started down one of the aisles. He pointed to a booth on the left. "They use my T-shirts." Next to that booth was another small company. "They use my T-shirts, too." He was wearing khakis and New Balance sneakers and one of his men's T-shirts in baby rib (a controversial piece, because the binding on the collar was a mere half inch). On his back he had a huge orange pack full of catalogues and samples, and every time he spotted a potential customer he would pull the backpack off and rummage through it, and the contents would spill on the floor.
Dov spotted a young woman walking toward him in a baby T. "That's a competitor's shirt. I can tell right away. The spacing of the needle. The fabric is not baby rib." He high-fived someone in another booth. Another young woman, in another T-shirt booth, loomed up ahead. "That's my shirt right there. In the green. I even know the stock number." He turned to her: "You're the girl in the olive forty-three, sixty-six sleeveless V with one-inch binding."
She laughed, but Dov was already off again, plunging back into the fray. "I always have an insecurity that I can be crushed by a bigger business," he said. "Like, Fruit of the Loom decided to do baby T's, and I got a little scared. But then I saw their shirt, and I laughed, because they missed it." Do the suits over at Fruit of the Loom have the same feel for a shirt that Dov does? Were they inspired by Marcella of Argentina and Julie from South Carolina? Those guys were off somewhere in a suburban office park. They weren't in Gloversville. "It was horribly designed," Dov went on. "It was thick, open-end, eighteen-singles coarse rib. It's not the luxury that I offer. See the rib on that collar?" He pulled up the binding on the T-shirt of a friend standing next to him. "Look how thick and spacey it is. That's what they did. They missed the point." Somewhere a cell phone was ringing. A young woman walked past. "Hey!" Dov called out. "That's my T-shirt!"
The New-Boy Network
May 29, 2000
DEPT. OF HUMAN RESOURCES
What do job interviews really tell us?
1.
Nolan Myers grew up in Houston, the elder of two boys in a middle-class family. He went to Houston's High School for the Performing and Visual Arts and then Harvard, where he intended to major in History and Science. After discovering the joys of writing code, though, he switched to computer science. "Programming is one of those things you get involved in, and you just can't stop until you finish," Myers says. "You get involved in it, and all of a sudden you look at your watch and it's four in the morning! I love the elegance of it." Myers is short and slightly stocky and has pale-blue eyes. He smiles easily, and when he speaks he moves his hands and torso for emphasis. He plays in a klezmer band called the Charvard Chai Notes. He talks to his parents a lot. He gets B's and B-pluses.
This spring, in the last stretch of his senior year, Myers spent a lot of time interviewing for jobs with technology companies. He talked to a company named Trilogy, down in Texas, but he didn't think he would fit in. "One of Trilogy's subsidiaries put ads out in the paper saying that they were looking for the top tech students, and that they'd give them two hundred thousand dollars and a BMW," Myers said, shaking his head in disbelief. In another of his interviews, a recruiter asked him to solve a programming problem, and he made a stupid mistake and the recruiter pushed the answer back across the table to him, saying that his "solution" accomplished nothing. As he remembers the moment, Myers blushes. "I was so nervous. I thought, Hmm, that sucks!" The way he says that, though, makes it hard to believe that he really was nervous, or maybe what Nolan Myers calls nervous the rest of us call a tiny flutter in the stomach. Myers doesn't seem like the sort to get flustered. He's the kind of person you would call the night before the big test in seventh grade, when nothing made sense and you had begun to panic.
I like Nolan Myers. He will, I am convinced, be very good at whatever career he chooses. I say those two things even though I have spent no more than ninety minutes in his presence. We met only once, on a sunny afternoon in April at the Au Bon Pain in Harvard Square. He was wearing sneakers and khakis and a polo shirt, in a dark-green pattern. He had a big backpack, which he plopped on the floor beneath the table. I bought him an orange juice. He fished around in his wallet and came up with a dollar to try and repay me, which I refused. We sat by the window. Previously, we had talked for perhaps three minutes on the phone, setting up the interview. Then I E-mailed him, asking him how I would recognize him at Au Bon Pain. He sent me the following message, with what I'm convinced—again, on the basis of almost no evidence—to be typical Myers panache: "22ish, five foot seven, straight brown hair, very good-looking. :)." I have never talked to his father, his mother, or his little brother, or any of his professors. I have never seen him ecstatic or angry or depressed. I know nothing of his personal habits, his tastes, or his quirks. I cannot even tell you why I feel the way I do about him. He's good-looking and smart and articulate and funny, but not so good-looking and smart and articulate and funny that there is some obvious explanation for the conclusions I've drawn about him. I just like him, and I'm impressed by him, and if I were an employer looking for bright young college graduates, I'd hire him in a heartbeat.
I heard about Nolan Myers from Hadi Partovi, an executive with Tellme, a highly touted Silicon Valley startup offering Internet access through the telephone. If you were a computer-science major at M.I.T., Harvard, Stanford, Caltech, or the University of Waterloo this spring, looking for a job in software, Tellme was probably at the top of your list. Partovi and I talked in the conference room at Tellme's offices, just off the soaring, open floor where all the firm's programmers and marketers and executives sit, some of them with bunk beds built over their desks. (Tellme recently moved into an old printing plant—a low-slung office building with a huge warehouse attached—and, in accordance with new-economy logic, promptly turned the old offices into a warehouse and the old warehouse into offices.) Partovi is a handsome man of twenty-seven, with olive skin and short curly black hair, and throughout our entire interview he sat with his chair tilted precariously at a forty-five-degree angle. At the end of a long riff about how hard it is to find high-quality people, he blurted out one name: Nolan Myers. Then, from memory, he rattled off Myers's telephone number. He very much wanted Myers to come to Tellme.
Partovi had met Myers in January, during a recruiting trip to Harvard. "It was a heinous day," Partovi remembers. "I started at seven and went until nine. I'd walk one person out and walk the other in." The first fifteen minutes of every interview he spent talking about Tellme—its strategy, its goals, and its business. Then he gave everyone a short programming puzzle. For the rest of the hour-long meeting, Partovi asked questions. He remembers that Myers did well on the programming test, and after talking to him for thirty to forty minutes he became convinced that Myers had, as he puts it, "the right stuff." Partovi spent even less time with Myers than I did. He didn't talk to Myers's family, or see him ecstatic or angry or depressed, either. He knew that Myers had spent last summer as an intern at Microsoft and was about to graduate from an Ivy League school. But virtually everyone recruited by a place like Tellme has graduated from an élite university, and the Microsoft summer-internship program has more than six hundred people in it. Partovi didn't even know why he liked Myers so much. He just did. "It was very much a gut call," he says.
This wasn't so very different from the experience Nolan Myers had with Steve Ballmer, the C.E.O. of Microsoft. Earlier this year, Myers attended a party for former Microsoft interns called Gradbash. Ballmer gave a speech there, and at the end of his remarks Myers raised his hand. "He was talking a lot about aligning the company in certain directions," Myers told me, "and I asked him about how that influences his ability to make bets on other directions. Are they still going to make small bets?" Afterward, a Microsoft recruiter came up to Myers and said, "Steve wants your E-mail address." Myers gave it to him, and soon he and Ballmer were E-mailing. Ballmer, it seems, badly wanted Myers to come to Microsoft. "He did research on me," Myers says. "He knew which group I was interviewing with, and knew a lot about me personally. He sent me an E-mail saying that he'd love to have me come to Microsoft, and if I had any questions I should contact him. So I sent him a response, saying thank you. After I visited Tellme, I sent him an E-mail saying I was interested in Tellme, here were the reasons, that I wasn't sure yet, and if he had anything to say I said I'd love to talk to him. I gave him my number. So he called, and after playing phone tag we talked—about career trajectory, how Microsoft would influence my career, what he thought of Tellme. I was extremely impressed with him, and he seemed very genuinely interested in me."
What convinced Ballmer he wanted Myers? A glimpse! He caught a little slice of Nolan Myers in action and—just like that—the C.E.O. of a four-hundred-billion-dollar company was calling a college senior in his dorm room. Ballmer somehow knew he liked Myers, the same way Hadi Partovi knew, and the same way I knew after our little chat at Au Bon Pain. But what did we know? What could we know? By any reasonable measure, surely none of us knew Nolan Myers at all.
It is a truism of the new economy that the ultimate success of any enterprise lies with the quality of the people it hires. At many technology companies, employees are asked to all but live at the office, in conditions of intimacy that would have been unthinkable a generation ago. The artifacts of the prototypical Silicon Valley office—the videogames, the espresso bar, the bunk beds, the basketball hoops—are the elements of the rec room, not the workplace. And in the rec room you want to play only with your friends. But how do you find out who your friends are? Today, recruiters canvass the country for résumés. They analyze employment histories and their competitors' staff listings. They call references, and then do what I did with Nolan Myers: sit down with a perfect stranger for an hour and a half and attempt to draw conclusions about that stranger's intelligence and personality. The job interview has become one of the central conventions of the modern economy. But what, exactly, can you know about a stranger after sitting down and talking with him for an hour?
2.
Some years ago, an experimental psychologist at Harvard University, Nalini Ambady, together with Robert Rosenthal, set out to examine the nonverbal aspects of good teaching. As the basis of her research, she used videotapes of teaching fellows which had been made during a training program at Harvard. Her plan was to have outside observers look at the tapes with the sound off and rate the effectiveness of the teachers by their expressions and physical cues. Ambady wanted to have at least a minute of film to work with. When she looked at the tapes, though, there was really only about ten seconds when the teachers were shown apart from the students. "I didn't want students in the frame, because obviously it would bias the ratings," Ambady says. "So I went to my adviser, and I said, 'This isn't going to work.'"
But it did. The observers, presented with a ten-second silent video clip, had no difficulty rating the teachers on a fifteen-item checklist of personality traits. In fact, when Ambady cut the clips back to five seconds, the ratings were the same. They were even the same when she showed her raters just two seconds of videotape. That sounds unbelievable unless you actually watch Ambady's teacher clips, as I did, and realize that the eight seconds that distinguish the longest clips from the shortest are superfluous: anything beyond the first flash of insight is unnecessary. When we make a snap judgment, it is made in a snap. It's also, very clearly, a judgment: we get a feeling that we have no difficulty articulating.
Ambady's next step led to an even more remarkable conclusion. She compared those snap judgments of teacher effectiveness with evaluations made, after a full semester of classes, by students of the same teachers. The correlation between the two, she found, was astoundingly high. A person watching a two-second silent video clip of a teacher he has never met will reach conclusions about how good that teacher is that are very similar to those of a student who sits in the teacher's class for an entire semester.
Recently, a comparable experiment was conducted by Frank Bernieri, a psychologist at the University of Toledo. Bernieri, working with one of his graduate students, Neha Gada-Jain, selected two people to act as interviewers, and trained them for six weeks in the proper procedures and techniques of giving an effective job interview. The two then interviewed ninety-eight volunteers, of various ages and backgrounds. The interviews lasted between fifteen and twenty minutes, and afterward each interviewer filled out a six-page, five-part evaluation of the person he'd just talked to. Originally, the intention of the study was to find out whether applicants who had been coached in certain nonverbal behaviors designed to ingratiate themselves with their interviewers—like mimicking the interviewers' physical gestures or posture—would get better ratings than applicants who behaved normally. As it turns out, they didn't. But then another of Bernieri's students, an undergraduate named Tricia Prickett, decided that she wanted to use the interview videotapes and the evaluations that had been collected to test out the adage that "the handshake is everything."
"She took fifteen seconds of videotape showing the applicant as he or she knocks on the door, comes in, shakes the hand of the interviewer, sits down, and the interviewer welcomes the person," Bernieri explained. Then, like Ambady, Prickett got a series of strangers to rate the applicants based on the handshake clip, using the same criteria that the interviewers had used. Once more, against all expectations, the ratings were very similar to those of the interviewers. "On nine out of the eleven traits the applicants were being judged on, the observers significantly predicted the outcome of the interview," Bernieri says. "The strength of the correlations was extraordinary."
This research takes Ambady's conclusions one step further. In the Toledo experiment, the interviewers were trained in the art of interviewing. They weren't dashing off a teacher evaluation on their way out the door. They were filling out a formal, detailed questionnaire, of the sort designed to give the most thorough and unbiased account of an interview. And still their ratings weren't all that different from those of people off the street who saw just the greeting.
This is why Hadi Partovi, Steve Ballmer, and I all agreed on Nolan Myers. Apparently, human beings don't need to know someone in order to believe that they know someone. Nor does it make that much difference, apparently, that Partovi reached his conclusion after putting Myers through the wringer for an hour, I reached mine after ninety minutes of amiable conversation at Au Bon Pain, and Ballmer reached his after watching and listening as Myers asked a question.
Bernieri and Ambady believe that the power of first impressions suggests that human beings have a particular kind of prerational ability for making searching judgments about others. In Ambady's teacher experiments, when she asked her observers to perform a potentially distracting cognitive task—like memorizing a set of numbers—while watching the tapes, their judgments of teacher effectiveness were unchanged. But when she instructed her observers to think hard about their ratings before they made them, their accuracy suffered substantially. Thinking only gets in the way. "The brain structures that are involved here are very primitive," Ambady speculates. "All of these affective reactions are probably governed by the lower brain structures." What we are picking up in that first instant would seem to be something quite basic about a person's character, because what we conclude after two seconds is pretty much the same as what we conclude after twenty minutes or, indeed, an entire semester. "Maybe you can tell immediately whether someone is extroverted, or gauge the person's ability to communicate," Bernieri says. "Maybe these clues or cues are immediately accessible and apparent." Bernieri and Ambady are talking about the existence of a powerful form of human intuition. In a way, that's comforting, because it suggests that we can meet a perfect stranger and immediately pick up on something important about him. It means that I shouldn't be concerned that I can't explain why I like Nolan Myers, because, if such judgments are made without thinking, then surely they defy explanation.
But there's a troubling suggestion here as well. I believe that Nolan Myers is an accomplished and likable person. But I have no idea from our brief encounter how honest he is, or whether he is self-centered, or whether he works best by himself or in a group, or any number of other fundamental traits. That people who simply see the handshake arrive at the same conclusions as people who conduct a full interview also implies, perhaps, that those initial impressions matter too much—that they color all the other impressions that we gather over time.
For example, I asked Myers if he felt nervous about the prospect of leaving school for the workplace, which seemed like a reasonable question, since I remember how anxious I was before my first job. Would the hours scare him? Oh no, he replied, he was already working between eighty and a hundred hours a week at school. "Are there things that you think you aren't good at, which make you worry?" I continued.
His reply was sharp: "Are there things that I'm not good at, or things that I can't learn? I think that's the real question. There are a lot of things I don't know anything about, but I feel comfortable that given the right environment and the right encouragement I can do well at." In my notes, next to that reply, I wrote "Great answer!" and I can remember at the time feeling the little thrill you experience as an interviewer when someone's behavior conforms with your expectations. Because I had decided, right off, that I liked him, what I heard in his answer was toughness and confidence. Had I decided early on that I didn't like Nolan Myers, I would have heard in that reply arrogance and bluster. The first impression becomes a self-fulfilling prophecy: we hear what we expect to hear. The interview is hopelessly biased in favor of the nice.
3.
When Ballmer and Partovi and I met Nolan Myers, we made a prediction. We looked at the way he behaved in our presence—at the way he talked and acted and seemed to think—and drew conclusions about how he would behave in other situations. I had decided, remember, that Myers was the kind of person you called the night before the big test in seventh grade. Was I right to make that kind of generalization?
This is a question that social psychologists have looked at closely. In the late nineteen-twenties, in a famous study, the psychologist Theodore Newcomb analyzed extroversion among adolescent boys at a summer camp. He found that how talkative a boy was in one setting—say, lunch—was highly predictive of how talkative that boy would be in the same setting in the future. A boy who was curious at lunch on Monday was likely to be curious at lunch on Tuesday. But his behavior in one setting told you almost nothing about how he would behave in a different setting: from how someone behaved at lunch, you couldn't predict how he would behave during, say, afternoon playtime. In a more recent study, of conscientiousness among students at Carleton College, the researchers Walter Mischel, Neil Lutsky, and Philip K. Peake showed that how neat a student's assignments were or how punctual he was told you almost nothing about how often he attended class or how neat his room or his personal appearance was. How we behave at any one time, evidently, has less to do with some immutable inner compass than with the particulars of our situation.
This conclusion, obviously, is at odds with our intuition. Most of the time, we assume that people display the same character traits in different situations. We habitually underestimate the large role that context plays in people's behavior. In the Newcomb summer-camp experiment, for example, the results showing how little consistency there was from one setting to another in talkativeness, curiosity, and gregariousness were tabulated from observations made and recorded by camp counsellors on the spot. But when, at the end of the summer, those same counsellors were asked to give their final impressions of the kids, they remembered the children's behavior as being highly consistent.
"The basis of the illusion is that we are somehow confident that we are getting what is there, that we are able to read off a person's disposition," Richard Nisbett, a psychologist at the University of Michigan, says. "When you have an interview with someone and have an hour with them, you don't conceptualize that as taking a sample of a person's behavior, let alone a possibly biased sample, which is what it is. What you think is that you are seeing a hologram, a small and fuzzy image but still the whole person."
Then Nisbett mentioned his frequent collaborator, Lee Ross, who teaches psychology at Stanford. "There was one term when he was teaching statistics and one term he was teaching a course with a lot of humanistic psychology. He gets his teacher evaluations. The first referred to him as cold, rigid, remote, finicky, and uptight. And the second described this wonderful warmhearted guy who was so deeply concerned with questions of community and getting students to grow. It was Jekyll and Hyde. In both cases, the students thought they were seeing the real Lee Ross."
Psychologists call this tendency—to fixate on supposedly stable character traits and overlook the influence of context—the Fundamental Attribution Error, and if you combine this error with what we know about snap judgments the interview becomes an even more problematic encounter. Not only had I let my first impressions color the information I gathered about Myers, but I had also assumed that the way he behaved with me in an interview setting was indicative of the way he would always behave. It isn't that the interview is useless; what I learned about Myers—that he and I get along well—is something I could never have got from a résumé or by talking to his references. It's just that our conversation turns out to have been less useful, and potentially more misleading, than I had supposed. That most basic of human rituals—the conversation with a stranger—turns out to be a minefield.
4.
Not long after I met with Nolan Myers, I talked with a human-resources consultant from Pasadena named Justin Menkes. Menkes's job is to figure out how to extract meaning from face-to-face encounters, and with that in mind he agreed to spend an hour interviewing me the way he thinks interviewing ought to be done. It felt, going in, not unlike a visit to a shrink, except that instead of having months, if not years, to work things out, Menkes was set upon stripping away my secrets in one session. Consider, he told me, a commonly asked question like "Describe a few situations in which your work was criticized. How did you handle the criticism?" The problem, Menkes said, is that it's much too obvious what the interviewee is supposed to say. "There was a situation where I was working on a project, and I didn't do as well as I could have," he said, adopting a mock-sincere singsong. "My boss gave me some constructive criticism. And I redid the project. It hurt. Yet we worked it out." The same is true of the question "What would your friends say about you?"—to which the correct answer (preferably preceded by a pause, as if to suggest that it had never dawned on you that someone would ask such a question) is "My guess is that they would call me a people person—either that or a hard worker."
Myers and I had talked about obvious questions, too. "What is your greatest weakness?" I asked him. He answered, "I tried to work on a project my freshman year, a children's festival. I was trying to start a festival as a benefit here in Boston. And I had a number of guys working with me. I started getting concerned with the scope of the project we were working on—how much responsibility we had, getting things done. I really put the brakes on, but in retrospect I really think we could have done it and done a great job."
Then Myers grinned and said, as an aside, "Do I truly think that is a fault? Honestly, no." And, of course, he's right. All I'd really asked him was whether he could describe a personal strength as if it were a weakness, and, in answering as he did, he had merely demonstrated his knowledge of the unwritten rules of the interview.
But, Menkes said, what if those questions were rephrased so that the answers weren't obvious? For example: "At your weekly team meetings, your boss unexpectedly begins aggressively critiquing your performance on a current project. What do you do?"
I felt a twinge of anxiety. What would I do? I remembered a terrible boss I'd had years ago. "I'd probably be upset," I said. "But I doubt I'd say anything. I'd probably just walk away." Menkes gave no indication whether he was concerned or pleased by that answer. He simply pointed out that another person might well have said something like "I'd go and see my boss later in private, and confront him about why he embarrassed me in front of my team." I was saying that I would probably handle criticism—even inappropriate criticism—from a superior with stoicism; in the second case, the applicant was saying he or she would adopt a more confrontational style. Or, at least, we were telling the interviewer that the workplace demands either stoicism or confrontation—and to Menkes these are revealing and pertinent pieces of information.
Menkes moved on to another area—handling stress. A typical question in this area is something like "Tell me about a time when you had to do several things at once. How did you handle the situation? How did you decide what to do first?" Menkes says this is also too easy. "I just had to be very organized," he began again in his mock-sincere singsong. "I had to multitask. I had to prioritize and delegate appropriately. I checked in frequently with my boss." Here's how Menkes rephrased it: "You're in a situation where you have two very important responsibilities that both have a deadline that is impossible to meet. You cannot accomplish both. How do you handle that situation?"
"Well," I said, "I would look at the two and decide what I was best at, and then go to my boss and say, 'It's better that I do one well than both poorly,' and we'd figure out who else could do the other task."
Menkes immediately seized on a telling detail in my answer. I was interested in what job I would do best. But isn't the key issue what job the company most needed to have done? With that comment, I had revealed something valuable: that in a time of work-related crisis I start from a self-centered consideration. "Perhaps you are a bit of a solo practitioner," Menkes said diplomatically. "That's an essential bit of information."
Menkes deliberately wasn't drawing any broad conclusions. If we are not people who are shy or talkative or outspoken but people who are shy in some contexts, talkative in other situations, and outspoken in still other areas, then what it means to know someone is to catalogue and appreciate all those variations. Menkes was trying to begin that process of cataloguing. This interviewing technique is known as "structured interviewing," and in studies by industrial psychologists it has been shown to be the only kind of interviewing that has any success at all in predicting performance in the workplace. In the structured interviews, the format is fairly rigid. Each applicant is treated in precisely the same manner. The questions are scripted. The interviewers are carefully trained, and each applicant is rated on a series of predetermined scales.
What is interesting about the structured interview is how narrow its objectives are. When I interviewed Nolan Myers I was groping for some kind of global sense of who he was; Menkes seemed entirely uninterested in arriving at that same general sense of me—he seemed to realize how foolish that expectation was for an hour-long interview. The structured interview works precisely because it isn't really an interview; it isn't about getting to know someone, in a traditional sense. It's as much concerned with rejecting information as it is with collecting it.
Not surprisingly, interview specialists have found it extraordinarily difficult to persuade most employers to adopt the structured interview. It just doesn't feel right. For most of us, hiring someone is essentially a romantic process, in which the job interview functions as a desexualized version of a date. We are looking for someone with whom we have a certain chemistry, even if the coupling that results ends in tears and the pursuer and the pursued turn out to have nothing in common. We want the unlimited promise of a love affair. The structured interview, by contrast, seems to offer only the dry logic and practicality of an arranged marriage.
5.
Nolan Myers agonized over which job to take. He spent half an hour on the phone with Steve Ballmer, and Ballmer was very persuasive. "He gave me very, very good advice," Myers says of his conversations with the Microsoft C.E.O. "He felt that I should go to the place that excited me the most and that I thought would be best for my career. He offered to be my mentor." Myers says he talked to his parents every day about what to do. In February, he flew out to California and spent a Saturday going from one Tellme executive to another, asking and answering questions. "Basically, I had three things I was looking for. One was long-term goals for the company. Where did they see themselves in five years? Second, what position would I be playing in the company?" He stopped and burst out laughing. "And I forget what the third one is." In March, Myers committed to Tellme.
Will Nolan Myers succeed at Tellme? I think so, although I honestly have no idea. It's a harder question to answer now than it would have been thirty or forty years ago. If this were 1965, Nolan Myers would have gone to work at I.B.M. and worn a blue suit and sat in a small office and kept his head down, and the particulars of his personality would not have mattered so much. It was not so important that I.B.M. understood who you were before it hired you, because you understood what I.B.M. was. If you walked through the door at Armonk or at a branch office in Illinois, you knew what you had to be and how you were supposed to act. But to walk through the soaring, open offices of Tellme, with the bunk beds over the desks, is to be struck by how much more demanding the culture of Silicon Valley is. Nolan Myers will not be provided with a social script, that blue suit and organization chart. Tellme, like any technology startup these days, wants its employees to be part of a fluid team, to be flexible and innovative, to work with shifting groups in the absence of hierarchy and bureaucracy, and in that environment, where the workplace doubles as the rec room, the particulars of your personality matter a great deal.
This is part of the new economy's appeal, because Tellme's soaring warehouse is a more productive and enjoyable place to work than the little office boxes of the old I.B.M. But the danger here is that we will be led astray in judging these newly important particulars of character. If we let personability—some indefinable, prerational intuition, magnified by the Fundamental Attribution Error—bias the hiring process today, then all we will have done is replace the old-boy network, where you hired your nephew, with the new-boy network, where you hire whoever impressed you most when you shook his hand. Social progress, unless we're careful, can merely be the means by which we replace the obviously arbitrary with the not so obviously arbitrary.
Myers has spent much of the past year helping to teach Introduction to Computer Science. He realized, he says, that one of the reasons that students were taking the course was that they wanted to get jobs in the software industry. "I decided that, having gone through all this interviewing, I had developed some expertise, and I would like to share that. There is a real skill and art in presenting yourself to potential employers. And so what we did in this class was talk about the kinds of things that employers are looking for—what are they looking for in terms of personality. One of the most important things is that you have to come across as being confident in what you are doing and in who you are. How do you do that? Speak clearly and smile." As he said that, Nolan Myers smiled. "For a lot of people, that's a very hard skill to learn. But for some reason I seem to understand it intuitively."
The New-Boy Network
May 29, 2000
DEPT. OF HUMAN RESOURCES
What do job interviews really tell us?
1.
Nolan Myers grew up in Houston, the elder of two boys in a middle-class family. He went to Houston's High School for the Performing and Visual Arts and then Harvard, where he intended to major in History and Science. After discovering the joys of writing code, though, he switched to computer science. "Programming is one of those things you get involved in, and you just can't stop until you finish," Myers says. "You get involved in it, and all of a sudden you look at your watch and it's four in the morning! I love the elegance of it." Myers is short and slightly stocky and has pale-blue eyes. He smiles easily, and when he speaks he moves his hands and torso for emphasis. He plays in a klezmer band called the Charvard Chai Notes. He talks to his parents a lot. He gets B's and B-pluses.
This spring, in the last stretch of his senior year, Myers spent a lot of time interviewing for jobs with technology companies. He talked to a company named Trilogy, down in Texas, but he didn't think he would fit in. "One of Trilogy's subsidiaries put ads out in the paper saying that they were looking for the top tech students, and that they'd give them two hundred thousand dollars and a BMW," Myers said, shaking his head in disbelief. In another of his interviews, a recruiter asked him to solve a programming problem, and he made a stupid mistake and the recruiter pushed the answer back across the table to him, saying that his "solution" accomplished nothing. As he remembers the moment, Myers blushes. "I was so nervous. I thought, Hmm, that sucks!" The way he says that, though, makes it hard to believe that he really was nervous, or maybe what Nolan Myers calls nervous the rest of us call a tiny flutter in the stomach. Myers doesn't seem like the sort to get flustered. He's the kind of person you would call the night before the big test in seventh grade, when nothing made sense and you had begun to panic.
I like Nolan Myers. He will, I am convinced, be very good at whatever career he chooses. I say those two things even though I have spent no more than ninety minutes in his presence. We met only once, on a sunny afternoon in April at the Au Bon Pain in Harvard Square. He was wearing sneakers and khakis and a polo shirt, in a dark-green pattern. He had a big backpack, which he plopped on the floor beneath the table. I bought him an orange juice. He fished around in his wallet and came up with a dollar to try and repay me, which I refused. We sat by the window. Previously, we had talked for perhaps three minutes on the phone, setting up the interview. Then I E-mailed him, asking him how I would recognize him at Au Bon Pain. He sent me the following message, with what I'm convinced—again, on the basis of almost no evidence—to be typical Myers panache: "22ish, five foot seven, straight brown hair, very good-looking. :)." I have never talked to his father, his mother, or his little brother, or any of his professors. I have never seen him ecstatic or angry or depressed. I know nothing of his personal habits, his tastes, or his quirks. I cannot even tell you why I feel the way I do about him. He's good-looking and smart and articulate and funny, but not so good-looking and smart and articulate and funny that there is some obvious explanation for the conclusions I've drawn about him. I just like him, and I'm impressed by him, and if I were an employer looking for bright young college graduates, I'd hire him in a heartbeat.
I heard about Nolan Myers from Hadi Partovi, an executive with Tellme, a highly touted Silicon Valley startup offering Internet access through the telephone. If you were a computer-science major at M.I.T., Harvard, Stanford, Caltech, or the University of Waterloo this spring, looking for a job in software, Tellme was probably at the top of your list. Partovi and I talked in the conference room at Tellme's offices, just off the soaring, open floor where all the firm's programmers and marketers and executives sit, some of them with bunk beds built over their desks. (Tellme recently moved into an old printing plant—a low-slung office building with a huge warehouse attached—and, in accordance with new-economy logic, promptly turned the old offices into a warehouse and the old warehouse into offices.) Partovi is a handsome man of twenty-seven, with olive skin and short curly black hair, and throughout our entire interview he sat with his chair tilted precariously at a forty-five-degree angle. At the end of a long riff about how hard it is to find high-quality people, he blurted out one name: Nolan Myers. Then, from memory, he rattled off Myers's telephone number. He very much wanted Myers to come to Tellme.
Partovi had met Myers in January, during a recruiting trip to Harvard. "It was a heinous day," Partovi remembers. "I started at seven and went until nine. I'd walk one person out and walk the other in." The first fifteen minutes of every interview he spent talking about Tellme—its strategy, its goals, and its business. Then he gave everyone a short programming puzzle. For the rest of the hour-long meeting, Partovi asked questions. He remembers that Myers did well on the programming test, and after talking to him for thirty to forty minutes he became convinced that Myers had, as he puts it, "the right stuff." Partovi spent even less time with Myers than I did. He didn't talk to Myers's family, or see him ecstatic or angry or depressed, either. He knew that Myers had spent last summer as an intern at Microsoft and was about to graduate from an Ivy League school. But virtually everyone recruited by a place like Tellme has graduated from an élite university, and the Microsoft summer-internship program has more than six hundred people in it. Partovi didn't even know why he liked Myers so much. He just did. "It was very much a gut call," he says.
This wasn't so very different from the experience Nolan Myers had with Steve Ballmer, the C.E.O. of Microsoft. Earlier this year, Myers attended a party for former Microsoft interns called Gradbash. Ballmer gave a speech there, and at the end of his remarks Myers raised his hand. "He was talking a lot about aligning the company in certain directions," Myers told me, "and I asked him about how that influences his ability to make bets on other directions. Are they still going to make small bets?" Afterward, a Microsoft recruiter came up to Myers and said, "Steve wants your E-mail address." Myers gave it to him, and soon he and Ballmer were E-mailing. Ballmer, it seems, badly wanted Myers to come to Microsoft. "He did research on me," Myers says. "He knew which group I was interviewing with, and knew a lot about me personally. He sent me an E-mail saying that he'd love to have me come to Microsoft, and if I had any questions I should contact him. So I sent him a response, saying thank you. After I visited Tellme, I sent him an E-mail saying I was interested in Tellme, here were the reasons, that I wasn't sure yet, and if he had anything to say I said I'd love to talk to him. I gave him my number. So he called, and after playing phone tag we talked—about career trajectory, how Microsoft would influence my career, what he thought of Tellme. I was extremely impressed with him, and he seemed very genuinely interested in me."
What convinced Ballmer he wanted Myers? A glimpse! He caught a little slice of Nolan Myers in action and—just like that—the C.E.O. of a four-hundred-billion-dollar company was calling a college senior in his dorm room. Ballmer somehow knew he liked Myers, the same way Hadi Partovi knew, and the same way I knew after our little chat at Au Bon Pain. But what did we know? What could we know? By any reasonable measure, surely none of us knew Nolan Myers at all.
It is a truism of the new economy that the ultimate success of any enterprise lies with the quality of the people it hires. At many technology companies, employees are asked to all but live at the office, in conditions of intimacy that would have been unthinkable a generation ago. The artifacts of the prototypical Silicon Valley office—the videogames, the espresso bar, the bunk beds, the basketball hoops—are the elements of the rec room, not the workplace. And in the rec room you want to play only with your friends. But how do you find out who your friends are? Today, recruiters canvass the country for résumés. They analyze employment histories and their competitors' staff listings. They call references, and then do what I did with Nolan Myers: sit down with a perfect stranger for an hour and a half and attempt to draw conclusions about that stranger's intelligence and personality. The job interview has become one of the central conventions of the modern economy. But what, exactly, can you know about a stranger after sitting down and talking with him for an hour?
2.
Some years ago, an experimental psychologist at Harvard University, Nalini Ambady, together with Robert Rosenthal, set out to examine the nonverbal aspects of good teaching. As the basis of her research, she used videotapes of teaching fellows which had been made during a training program at Harvard. Her plan was to have outside observers look at the tapes with the sound off and rate the effectiveness of the teachers by their expressions and physical cues. Ambady wanted to have at least a minute of film to work with. When she looked at the tapes, though, there was really only about ten seconds when the teachers were shown apart from the students. "I didn't want students in the frame, because obviously it would bias the ratings," Ambady says. "So I went to my adviser, and I said, 'This isn't going to work.'"
But it did. The observers, presented with a ten-second silent video clip, had no difficulty rating the teachers on a fifteen-item checklist of personality traits. In fact, when Ambady cut the clips back to five seconds, the ratings were the same. They were even the same when she showed her raters just two seconds of videotape. That sounds unbelievable unless you actually watch Ambady's teacher clips, as I did, and realize that the eight seconds that distinguish the longest clips from the shortest are superfluous: anything beyond the first flash of insight is unnecessary. When we make a snap judgment, it is made in a snap. It's also, very clearly, a judgment: we get a feeling that we have no difficulty articulating.
Ambady's next step led to an even more remarkable conclusion. She compared those snap judgments of teacher effectiveness with evaluations made, after a full semester of classes, by students of the same teachers. The correlation between the two, she found, was astoundingly high. A person watching a two-second silent video clip of a teacher he has never met will reach conclusions about how good that teacher is that are very similar to those of a student who sits in the teacher's class for an entire semester.
Recently, a comparable experiment was conducted by Frank Bernieri, a psychologist at the University of Toledo. Bernieri, working with one of his graduate students, Neha Gada-Jain, selected two people to act as interviewers, and trained them for six weeks in the proper procedures and techniques of giving an effective job interview. The two then interviewed ninety-eight volunteers, of various ages and backgrounds. The interviews lasted between fifteen and twenty minutes, and afterward each interviewer filled out a six-page, five-part evaluation of the person he'd just talked to. Originally, the intention of the study was to find out whether applicants who had been coached in certain nonverbal behaviors designed to ingratiate themselves with their interviewers—like mimicking the interviewers' physical gestures or posture—would get better ratings than applicants who behaved normally. As it turns out, they didn't. But then another of Bernieri's students, an undergraduate named Tricia Prickett, decided that she wanted to use the interview videotapes and the evaluations that had been collected to test out the adage that "the handshake is everything."
"She took fifteen seconds of videotape showing the applicant as he or she knocks on the door, comes in, shakes the hand of the interviewer, sits down, and the interviewer welcomes the person," Bernieri explained. Then, like Ambady, Prickett got a series of strangers to rate the applicants based on the handshake clip, using the same criteria that the interviewers had used. Once more, against all expectations, the ratings were very similar to those of the interviewers. "On nine out of the eleven traits the applicants were being judged on, the observers significantly predicted the outcome of the interview," Bernieri says. "The strength of the correlations was extraordinary."
This research takes Ambady's conclusions one step further. In the Toledo experiment, the interviewers were trained in the art of interviewing. They weren't dashing off a teacher evaluation on their way out the door. They were filling out a formal, detailed questionnaire, of the sort designed to give the most thorough and unbiased account of an interview. And still their ratings weren't all that different from those of people off the street who saw just the greeting.
This is why Hadi Partovi, Steve Ballmer, and I all agreed on Nolan Myers. Apparently, human beings don't need to know someone in order to believe that they know someone. Nor does it make that much difference, apparently, that Partovi reached his conclusion after putting Myers through the wringer for an hour, I reached mine after ninety minutes of amiable conversation at Au Bon Pain, and Ballmer reached his after watching and listening as Myers asked a question.
Bernieri and Ambady believe that the power of first impressions suggests that human beings have a particular kind of prerational ability for making searching judgments about others. In Ambady's teacher experiments, when she asked her observers to perform a potentially distracting cognitive task—like memorizing a set of numbers—while watching the tapes, their judgments of teacher effectiveness were unchanged. But when she instructed her observers to think hard about their ratings before they made them, their accuracy suffered substantially. Thinking only gets in the way. "The brain structures that are involved here are very primitive," Ambady speculates. "All of these affective reactions are probably governed by the lower brain structures." What we are picking up in that first instant would seem to be something quite basic about a person's character, because what we conclude after two seconds is pretty much the same as what we conclude after twenty minutes or, indeed, an entire semester. "Maybe you can tell immediately whether someone is extroverted, or gauge the person's ability to communicate," Bernieri says. "Maybe these clues or cues are immediately accessible and apparent." Bernieri and Ambady are talking about the existence of a powerful form of human intuition. In a way, that's comforting, because it suggests that we can meet a perfect stranger and immediately pick up on something important about him. It means that I shouldn't be concerned that I can't explain why I like Nolan Myers, because, if such judgments are made without thinking, then surely they defy explanation.
But there's a troubling suggestion here as well. I believe that Nolan Myers is an accomplished and likable person. But I have no idea from our brief encounter how honest he is, or whether he is self-centered, or whether he works best by himself or in a group, or any number of other fundamental traits. That people who simply see the handshake arrive at the same conclusions as people who conduct a full interview also implies, perhaps, that those initial impressions matter too much—that they color all the other impressions that we gather over time.
For example, I asked Myers if he felt nervous about the prospect of leaving school for the workplace, which seemed like a reasonable question, since I remember how anxious I was before my first job. Would the hours scare him? Oh no, he replied, he was already working between eighty and a hundred hours a week at school. "Are there things that you think you aren't good at, which make you worry?" I continued.
His reply was sharp: "Are there things that I'm not good at, or things that I can't learn? I think that's the real question. There are a lot of things I don't know anything about, but I feel comfortable that given the right environment and the right encouragement I can do well at." In my notes, next to that reply, I wrote "Great answer!" and I can remember at the time feeling the little thrill you experience as an interviewer when someone's behavior conforms with your expectations. Because I had decided, right off, that I liked him, what I heard in his answer was toughness and confidence. Had I decided early on that I didn't like Nolan Myers, I would have heard in that reply arrogance and bluster. The first impression becomes a self-fulfilling prophecy: we hear what we expect to hear. The interview is hopelessly biased in favor of the nice.
3.
When Ballmer and Partovi and I met Nolan Myers, we made a prediction. We looked at the way he behaved in our presence—at the way he talked and acted and seemed to think—and drew conclusions about how he would behave in other situations. I had decided, remember, that Myers was the kind of person you called the night before the big test in seventh grade. Was I right to make that kind of generalization?
This is a question that social psychologists have looked at closely. In the late nineteen-twenties, in a famous study, the psychologist Theodore Newcomb analyzed extroversion among adolescent boys at a summer camp. He found that how talkative a boy was in one setting—say, lunch—was highly predictive of how talkative that boy would be in the same setting in the future. A boy who was curious at lunch on Monday was likely to be curious at lunch on Tuesday. But his behavior in one setting told you almost nothing about how he would behave in a different setting: from how someone behaved at lunch, you couldn't predict how he would behave during, say, afternoon playtime. In a more recent study, of conscientiousness among students at Carleton College, the researchers Walter Mischel, Neil Lutsky, and Philip K. Peake showed that how neat a student's assignments were or how punctual he was told you almost nothing about how often he attended class or how neat his room or his personal appearance was. How we behave at any one time, evidently, has less to do with some immutable inner compass than with the particulars of our situation.
This conclusion, obviously, is at odds with our intuition. Most of the time, we assume that people display the same character traits in different situations. We habitually underestimate the large role that context plays in people's behavior. In the Newcomb summer-camp experiment, for example, the results showing how little consistency there was from one setting to another in talkativeness, curiosity, and gregariousness were tabulated from observations made and recorded by camp counsellors on the spot. But when, at the end of the summer, those same counsellors were asked to give their final impressions of the kids, they remembered the children's behavior as being highly consistent.
"The basis of the illusion is that we are somehow confident that we are getting what is there, that we are able to read off a person's disposition," Richard Nisbett, a psychologist at the University of Michigan, says. "When you have an interview with someone and have an hour with them, you don't conceptualize that as taking a sample of a person's behavior, let alone a possibly biased sample, which is what it is. What you think is that you are seeing a hologram, a small and fuzzy image but still the whole person."
Then Nisbett mentioned his frequent collaborator, Lee Ross, who teaches psychology at Stanford. "There was one term when he was teaching statistics and one term he was teaching a course with a lot of humanistic psychology. He gets his teacher evaluations. The first referred to him as cold, rigid, remote, finicky, and uptight. And the second described this wonderful warmhearted guy who was so deeply concerned with questions of community and getting students to grow. It was Jekyll and Hyde. In both cases, the students thought they were seeing the real Lee Ross."
Psychologists call this tendency—to fixate on supposedly stable character traits and overlook the influence of context—the Fundamental Attribution Error, and if you combine this error with what we know about snap judgments the interview becomes an even more problematic encounter. Not only had I let my first impressions color the information I gathered about Myers, but I had also assumed that the way he behaved with me in an interview setting was indicative of the way he would always behave. It isn't that the interview is useless; what I learned about Myers—that he and I get along well—is something I could never have got from a résumé or by talking to his references. It's just that our conversation turns out to have been less useful, and potentially more misleading, than I had supposed. That most basic of human rituals—the conversation with a stranger—turns out to be a minefield.
4.
Not long after I met with Nolan Myers, I talked with a human-resources consultant from Pasadena named Justin Menkes. Menkes's job is to figure out how to extract meaning from face-to-face encounters, and with that in mind he agreed to spend an hour interviewing me the way he thinks interviewing ought to be done. It felt, going in, not unlike a visit to a shrink, except that instead of having months, if not years, to work things out, Menkes was set upon stripping away my secrets in one session. Consider, he told me, a commonly asked question like "Describe a few situations in which your work was criticized. How did you handle the criticism?" The problem, Menkes said, is that it's much too obvious what the interviewee is supposed to say. "There was a situation where I was working on a project, and I didn't do as well as I could have," he said, adopting a mock-sincere singsong. "My boss gave me some constructive criticism. And I redid the project. It hurt. Yet we worked it out." The same is true of the question "What would your friends say about you?"—to which the correct answer (preferably preceded by a pause, as if to suggest that it had never dawned on you that someone would ask such a question) is "My guess is that they would call me a people person—either that or a hard worker."
Myers and I had talked about obvious questions, too. "What is your greatest weakness?" I asked him. He answered, "I tried to work on a project my freshman year, a children's festival. I was trying to start a festival as a benefit here in Boston. And I had a number of guys working with me. I started getting concerned with the scope of the project we were working on—how much responsibility we had, getting things done. I really put the brakes on, but in retrospect I really think we could have done it and done a great job."
Then Myers grinned and said, as an aside, "Do I truly think that is a fault? Honestly, no." And, of course, he's right. All I'd really asked him was whether he could describe a personal strength as if it were a weakness, and, in answering as he did, he had merely demonstrated his knowledge of the unwritten rules of the interview.
But, Menkes said, what if those questions were rephrased so that the answers weren't obvious? For example: "At your weekly team meetings, your boss unexpectedly begins aggressively critiquing your performance on a current project. What do you do?"
I felt a twinge of anxiety. What would I do? I remembered a terrible boss I'd had years ago. "I'd probably be upset," I said. "But I doubt I'd say anything. I'd probably just walk away." Menkes gave no indication whether he was concerned or pleased by that answer. He simply pointed out that another person might well have said something like "I'd go and see my boss later in private, and confront him about why he embarrassed me in front of my team." I was saying that I would probably handle criticism—even inappropriate criticism—from a superior with stoicism; in the second case, the applicant was saying he or she would adopt a more confrontational style. Or, at least, we were telling the interviewer that the workplace demands either stoicism or confrontation—and to Menkes these are revealing and pertinent pieces of information.
Menkes moved on to another area—handling stress. A typical question in this area is something like "Tell me about a time when you had to do several things at once. How did you handle the situation? How did you decide what to do first?" Menkes says this is also too easy. "I just had to be very organized," he began again in his mock-sincere singsong. "I had to multitask. I had to prioritize and delegate appropriately. I checked in frequently with my boss." Here's how Menkes rephrased it: "You're in a situation where you have two very important responsibilities that both have a deadline that is impossible to meet. You cannot accomplish both. How do you handle that situation?"
"Well," I said, "I would look at the two and decide what I was best at, and then go to my boss and say, 'It's better that I do one well than both poorly,' and we'd figure out who else could do the other task."
Menkes immediately seized on a telling detail in my answer. I was interested in what job I would do best. But isn't the key issue what job the company most needed to have done? With that comment, I had revealed something valuable: that in a time of work-related crisis I start from a self-centered consideration. "Perhaps you are a bit of a solo practitioner," Menkes said diplomatically. "That's an essential bit of information."
Menkes deliberately wasn't drawing any broad conclusions. If we are not people who are shy or talkative or outspoken but people who are shy in some contexts, talkative in other situations, and outspoken in still other areas, then what it means to know someone is to catalogue and appreciate all those variations. Menkes was trying to begin that process of cataloguing. This interviewing technique is known as "structured interviewing," and in studies by industrial psychologists it has been shown to be the only kind of interviewing that has any success at all in predicting performance in the workplace. In the structured interviews, the format is fairly rigid. Each applicant is treated in precisely the same manner. The questions are scripted. The interviewers are carefully trained, and each applicant is rated on a series of predetermined scales.
What is interesting about the structured interview is how narrow its objectives are. When I interviewed Nolan Myers I was groping for some kind of global sense of who he was; Menkes seemed entirely uninterested in arriving at that same general sense of me—he seemed to realize how foolish that expectation was for an hour-long interview. The structured interview works precisely because it isn't really an interview; it isn't about getting to know someone, in a traditional sense. It's as much concerned with rejecting information as it is with collecting it.
Not surprisingly, interview specialists have found it extraordinarily difficult to persuade most employers to adopt the structured interview. It just doesn't feel right. For most of us, hiring someone is essentially a romantic process, in which the job interview functions as a desexualized version of a date. We are looking for someone with whom we have a certain chemistry, even if the coupling that results ends in tears and the pursuer and the pursued turn out to have nothing in common. We want the unlimited promise of a love affair. The structured interview, by contrast, seems to offer only the dry logic and practicality of an arranged marriage.
5.
Nolan Myers agonized over which job to take. He spent half an hour on the phone with Steve Ballmer, and Ballmer was very persuasive. "He gave me very, very good advice," Myers says of his conversations with the Microsoft C.E.O. "He felt that I should go to the place that excited me the most and that I thought would be best for my career. He offered to be my mentor." Myers says he talked to his parents every day about what to do. In February, he flew out to California and spent a Saturday going from one Tellme executive to another, asking and answering questions. "Basically, I had three things I was looking for. One was long-term goals for the company. Where did they see themselves in five years? Second, what position would I be playing in the company?" He stopped and burst out laughing. "And I forget what the third one is." In March, Myers committed to Tellme.
Will Nolan Myers succeed at Tellme? I think so, although I honestly have no idea. It's a harder question to answer now than it would have been thirty or forty years ago. If this were 1965, Nolan Myers would have gone to work at I.B.M. and worn a blue suit and sat in a small office and kept his head down, and the particulars of his personality would not have mattered so much. It was not so important that I.B.M. understood who you were before it hired you, because you understood what I.B.M. was. If you walked through the door at Armonk or at a branch office in Illinois, you knew what you had to be and how you were supposed to act. But to walk through the soaring, open offices of Tellme, with the bunk beds over the desks, is to be struck by how much more demanding the culture of Silicon Valley is. Nolan Myers will not be provided with a social script, that blue suit and organization chart. Tellme, like any technology startup these days, wants its employees to be part of a fluid team, to be flexible and innovative, to work with shifting groups in the absence of hierarchy and bureaucracy, and in that environment, where the workplace doubles as the rec room, the particulars of your personality matter a great deal.
This is part of the new economy's appeal, because Tellme's soaring warehouse is a more productive and enjoyable place to work than the little office boxes of the old I.B.M. But the danger here is that we will be led astray in judging these newly important particulars of character. If we let personability—some indefinable, prerational intuition, magnified by the Fundamental Attribution Error—bias the hiring process today, then all we will have done is replace the old-boy network, where you hired your nephew, with the new-boy network, where you hire whoever impressed you most when you shook his hand. Social progress, unless we're careful, can merely be the means by which we replace the obviously arbitrary with the not so obviously arbitrary.
Myers has spent much of the past year helping to teach Introduction to Computer Science. He realized, he says, that one of the reasons that students were taking the course was that they wanted to get jobs in the software industry. "I decided that, having gone through all this interviewing, I had developed some expertise, and I would like to share that. There is a real skill and art in presenting yourself to potential employers. And so what we did in this class was talk about the kinds of things that employers are looking for—what are they looking for in terms of personality. One of the most important things is that you have to come across as being confident in what you are doing and in who you are. How do you do that? Speak clearly and smile." As he said that, Nolan Myers smiled. "For a lot of people, that's a very hard skill to learn. But for some reason I seem to understand it intuitively."
Cheap and Easy
July 10, 2000
COMMENT
Every now and again in politics, there is a moment that captures the temper of the times, and our moment may have come this budget season in Washington. The Centers for Disease Control asked Congress if, for an extra fifteen million dollars in C.D.C. funding, it would like to wipe out syphilis from the United States by 2005. And Congress said no.
The request was not a political ploy to get a bigger budget. Syphilis is an epidemic that, for reasons no one quite understands, runs in cycles, and, after peaking in 1990, the disease is now at its lowest level in United States history. It has retreated to a handful of areas across the country: just twenty-five counties account for half of all cases. In other words, syphilis is very close to that critical point faced by many epidemics, when even the slightest push could tip them into oblivion. That's why the C.D.C. has asked for the extra fifteen million dollars--to supply that final push.
This was all patiently explained to Congress last year as the epidemic first neared its lowest ebb. The C.D.C. proposed the most prosaic and straightforward of public-health efforts--an aggressive regimen of free diagnosis and treatment. The drug of choice? Penicillin, the same antibiotic that has been so successful in fighting syphilis for the past half century. Congress wasn't interested. This year, the C.D.C. made its case again, and again the public-health budgets that emerged from the House and the Senate left the agency well short of the necessary funding. Next year, unfortunately, the moment when syphilis can be easily eliminated will have passed. The disease will have begun its cyclical return, moving out of the familiar, well-defined neighborhoods where it is now sequestered, and presenting a much more formidable target for public-health officials. "If you miss the timing, there is a point when it is no longer feasible to move to elimination," says Judy Wasserheit, who is the head of the C.D.C.'s syphilis-prevention effort. "We're already pushing the limits of that time frame."
Exactly why, in a period of fiscal plenty, Congress cannot find the money for an anti-syphilis campaign is a bit puzzling. The disease plays a major role in the transmission of H.I.V., increasing infection rates between two- and five-fold. It often irreparably harms children born to those who are infected. And it is extremely expensive. Even with the rates as low as they are now, syphilis costs the country two hundred and fourteen million dollars a year. Congress has the opportunity to make history by eliminating a disease that has plagued the West for centuries. Why isn't it taking it?
The truth is, this is the price we pay for the ways in which disease has become steadily politicized. The great insight of the AIDS movement--later picked up by groups concerned about breast cancer and prostate cancer--was that a community afflicted with a specific medical problem could take its case directly to Capitol Hill, bypassing the medical establishment entirely. This has dramatically increased the resources available for medical research. But it has also given Congress an excuse to treat public health as another form of interest-group politics, in which the most deserving constituencies are those which shout the loudest. In fact, when it comes to illness and disease the most deserving constituencies are often those who cannot shout at all. That syphilis is a sexually transmitted disease primarily affecting very poor African-Americans only makes things worse--sex, race, and poverty being words that the present Congress has difficulty pronouncing individually, let alone in combination.
The last time America came so tantalizingly close to the elimination of syphilis was during the mid-fifties, after the introduction of penicillin. "Are Venereal Diseases disappearing?" the American Journal of Syphilis asked in 1951; four years later, the journal itself had disappeared. Such was the certainty that the era of syphilis was ending that the big debate in the public-health field was ethical rather than medical--namely, how the removal of the threat of venereal disease would affect sexual behavior.
As Dr. John Stokes, one of the leading experts of his day on sexually transmitted diseases, wrote, "It is a reasonable question, whether by eliminating disease, without commensurate attention to the development of human idealism, self-control, and responsibility in the sexual life, we are not bringing mankind to its fall instead of fulfillment." Stokes assumed that syphilis would soon vanish, and that we ought to worry about the morality of those who could have got the disease but now wouldn't. As it turns out, he had it backward. Syphilis is still with us. And we ought to worry instead about the morality of those who could have eliminated the disease but chose not to.
The Art of Failure
August 21 & 28, 2000
PERFORMANCE STUDIES
Why some people choke and others panic
There was a moment, in the third and deciding set of the 1993 Wimbledon final, when Jana Novotna seemed invincible. She was leading 4-1 and serving at 40-30, meaning that she was one point from winning the game, and just five points from the most coveted championship in tennis. She had just hit a backhand to her opponent, Steffi Graf, that skimmed the net and landed so abruptly on the far side of the court that Graf could only watch, in flat-footed frustration. The stands at Center Court were packed. The Duke and Duchess of Kent were in their customary place in the royal box. Novotna was in white, poised and confident, her blond hair held back with a headband--and then something happened. She served the ball straight into the net. She stopped and steadied herself for the second serve--the toss, the arch of the back--but this time it was worse. Her swing seemed halfhearted, all arm and no legs and torso. Double fault. On the next point, she was slow to react to a high shot by Graf, and badly missed on a forehand volley. At game point, she hit an overhead straight into the net. Instead of 5-1, it was now 4-2. Graf to serve: an easy victory, 4-3. Novotna to serve. She wasn't tossing the ball high enough. Her head was down. Her movements had slowed markedly. She double-faulted once, twice, three times. Pulled wide by a Graf forehand, Novotna inexplicably hit a low, flat shot directly at Graf, instead of a high crosscourt forehand that would have given her time to get back into position: 4-4. Did she suddenly realize how terrifyingly close she was to victory? Did she remember that she had never won a major tournament before? Did she look across the net and see Steffi Graf--Steffi Graf!--the greatest player of her generation?
On the baseline, awaiting Graf's serve, Novotna was now visibly agitated, rocking back and forth, jumping up and down. She talked to herself under her breath. Her eyes darted around the court. Graf took the game at love; Novotna, moving as if in slow motion, did not win a single point: 5-4, Graf. On the sidelines, Novotna wiped her racquet and her face with a towel, and then each finger individually. It was her turn to serve. She missed a routine volley wide, shook her head, talked to herself. She missed her first serve, made the second, then, in the resulting rally, mis-hit a backhand so badly that it sailed off her racquet as if launched into flight. Novotna was unrecognizable, not an élite tennis player but a beginner again. She was crumbling under pressure, but exactly why was as baffling to her as it was to all those looking on. Isn't pressure supposed to bring out the best in us? We try harder. We concentrate harder. We get a boost of adrenaline. We care more about how well we perform. So what was happening to her?
At championship point, Novotna hit a low, cautious, and shallow lob to Graf. Graf answered with an unreturnable overhead smash, and, mercifully, it was over. Stunned, Novotna moved to the net. Graf kissed her twice. At the awards ceremony, the Duchess of Kent handed Novotna the runner-up's trophy, a small silver plate, and whispered something in her ear, and what Novotna had done finally caught up with her. There she was, sweaty and exhausted, looming over the delicate white-haired Duchess in her pearl necklace. The Duchess reached up and pulled her head down onto her shoulder, and Novotna started to sob.
Human beings sometimes falter under pressure. Pilots crash and divers drown. Under the glare of competition, basketball players cannot find the basket and golfers cannot find the pin. When that happens, we say variously that people have "panicked" or, to use the sports colloquialism, "choked." But what do those words mean? Both are pejoratives. To choke or panic is considered to be as bad as to quit. But are all forms of failure equal? And what do the forms in which we fail say about who we are and how we think? We live in an age obsessed with success, with documenting the myriad ways by which talented people overcome challenges and obstacles. There is as much to be learned, though, from documenting the myriad ways in which talented people sometimes fail.
"Choking" sounds like a vague and all-encompassing term, yet it describes a very specific kind of failure. For example, psychologists often use a primitive video game to test motor skills. They'll sit you in front of a computer with a screen that shows four boxes in a row, and a keyboard that has four corresponding buttons in a row. One at a time, x's start to appear in the boxes on the screen, and you are told that every time this happens you are to push the key corresponding to the box. According to Daniel Willingham, a psychologist at the University of Virginia, if you're told ahead of time about the pattern in which those x's will appear, your reaction time in hitting the right key will improve dramatically. You'll play the game very carefully for a few rounds, until you've learned the sequence, and then you'll get faster and faster. Willingham calls this "explicit learning." But suppose you're not told that the x's appear in a regular sequence, and even after playing the game for a while you're not aware that there is a pattern. You'll still get faster: you'll learn the sequence unconsciously. Willingham calls that "implicit learning"--learning that takes place outside of awareness. These two learning systems are quite separate, based in different parts of the brain. Willingham says that when you are first taught something--say, how to hit a backhand or an overhead forehand--you think it through in a very deliberate, mechanical manner. But as you get better the implicit system takes over: you start to hit a backhand fluidly, without thinking. The basal ganglia, where implicit learning partially resides, are concerned with force and timing, and when that system kicks in you begin to develop touch and accuracy, the ability to hit a drop shot or place a serve at a hundred miles per hour. "This is something that is going to happen gradually," Willingham says. "You hit several thousand forehands, after a while you may still be attending to it. But not very much. In the end, you don't really notice what your hand is doing at all."
Under conditions of stress, however, the explicit system sometimes takes over. That's what it means to choke. When Jana Novotna faltered at Wimbledon, it was because she began thinking about her shots again. She lost her fluidity, her touch. She double-faulted on her serves and mis-hit her overheads, the shots that demand the greatest sensitivity in force and timing. She seemed like a different person--playing with the slow, cautious deliberation of a beginner--because, in a sense, she was a beginner again: she was relying on a learning system that she hadn't used to hit serves and overhead forehands and volleys since she was first taught tennis, as a child. The same thing has happened to Chuck Knoblauch, the New York Yankees' second baseman, who inexplicably has had trouble throwing the ball to first base. Under the stress of playing in front of forty thousand fans at Yankee Stadium, Knoblauch finds himself reverting to explicit mode, throwing like a Little Leaguer again.
Panic is something else altogether. Consider the following account of a scuba-diving accident, recounted to me by Ephimia Morphew, a human-factors specialist at NASA: "It was an open-water certification dive, Monterey Bay, California, about ten years ago. I was nineteen. I'd been diving for two weeks. This was my first time in the open ocean without the instructor. Just my buddy and I. We had to go about forty feet down, to the bottom of the ocean, and do an exercise where we took our regulators out of our mouth, picked up a spare one that we had on our vest, and practiced breathing out of the spare. My buddy did hers. Then it was my turn. I removed my regulator. I lifted up my secondary regulator. I put it in my mouth, exhaled, to clear the lines, and then I inhaled, and, to my surprise, it was water. I inhaled water. Then the hose that connected that mouthpiece to my tank, my air source, came unlatched and air from the hose came exploding into my face.
"Right away, my hand reached out for my partner's air supply, as if I was going to rip it out. It was without thought. It was a physiological response. My eyes are seeing my hand do something irresponsible. I'm fighting with myself. Don't do it. Then I searched my mind for what I could do. And nothing came to mind. All I could remember was one thing: If you can't take care of yourself, let your buddy take care of you. I let my hand fall back to my side, and I just stood there."
This is a textbook example of panic. In that moment, Morphew stopped thinking. She forgot that she had another source of air, one that worked perfectly well and that, moments before, she had taken out of her mouth. She forgot that her partner had a working air supply as well, which could easily be shared, and she forgot that grabbing her partner's regulator would imperil both of them. All she had was her most basic instinct: get air. Stress wipes out short-term memory. People with lots of experience tend not to panic, because when the stress suppresses their short-term memory they still have some residue of experience to draw on. But what did a novice like Morphew have? I searched my mind for what I could do. And nothing came to mind.
Panic also causes what psychologists call perceptual narrowing. In one study, from the early seventies, a group of subjects were asked to perform a visual acuity task while undergoing what they thought was a sixty-foot dive in a pressure chamber. At the same time, they were asked to push a button whenever they saw a small light flash on and off in their peripheral vision. The subjects in the pressure chamber had much higher heart rates than the control group, indicating that they were under stress. That stress didn't affect their accuracy at the visual-acuity task, but they were only half as good as the control group at picking up the peripheral light. "You tend to focus or obsess on one thing," Morphew says. "There's a famous airplane example, where the landing light went off, and the pilots had no way of knowing if the landing gear was down. The pilots were so focussed on that light that no one noticed the autopilot had been disengaged, and they crashed the plane." Morphew reached for her buddy's air supply because it was the only air supply she could see.
Panic, in this sense, is the opposite of choking. Choking is about thinking too much. Panic is about thinking too little. Choking is about loss of instinct. Panic is reversion to instinct. They may look the same, but they are worlds apart.
Why does this distinction matter? In some instances, it doesn't much. If you lose a close tennis match, it's of little moment whether you choked or panicked; either way, you lost. But there are clearly cases when how failure happens is central to understanding why failure happens.
Take the plane crash in which John F. Kennedy, Jr., was killed last summer. The details of the flight are well known. On a Friday evening last July, Kennedy took off with his wife and sister-in-law for Martha's Vineyard. The night was hazy, and Kennedy flew along the Connecticut coastline, using the trail of lights below him as a guide. At Westerly, Rhode Island, he left the shoreline, heading straight out over Rhode Island Sound, and at that point, apparently disoriented by the darkness and haze, he began a series of curious maneuvers: He banked his plane to the right, farther out into the ocean, and then to the left. He climbed and descended. He sped up and slowed down. Just a few miles from his destination, Kennedy lost control of the plane, and it crashed into the ocean.
Kennedy's mistake, in technical terms, was that he failed to keep his wings level. That was critical, because when a plane banks to one side it begins to turn and its wings lose some of their vertical lift. Left unchecked, this process accelerates. The angle of the bank increases, the turn gets sharper and sharper, and the plane starts to dive toward the ground in an ever-narrowing corkscrew. Pilots call this the graveyard spiral. And why didn't Kennedy stop the dive? Because, in times of low visibility and high stress, keeping your wings level--indeed, even knowing whether you are in a graveyard spiral--turns out to be surprisingly difficult. Kennedy failed under pressure.
Had Kennedy been flying during the day or with a clear moon, he would have been fine. If you are the pilot, looking straight ahead from the cockpit, the angle of your wings will be obvious from the straight line of the horizon in front of you. But when it's dark outside the horizon disappears. There is no external measure of the plane's bank. On the ground, we know whether we are level even when it's dark, because of the motion-sensing mechanisms in the inner ear. In a spiral dive, though, the effect of the plane's G-force on the inner ear means that the pilot feels perfectly level even if his plane is not. Similarly, when you are in a jetliner that is banking at thirty degrees after takeoff, the book on your neighbor's lap does not slide into your lap, nor will a pen on the floor roll toward the "down" side of the plane. The physics of flying is such that an airplane in the midst of a turn always feels perfectly level to someone inside the cabin.
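A rough back-of-the-envelope sketch (mine, not the article's) of why the inner ear gets no warning: in a coordinated bank at angle $\phi$, the lift $L$ tilts with the wings, and the only force the occupants feel is that lift, pressing them straight down into their seats along the airplane's own vertical axis, with no sideways component to sense. Meanwhile only the vertical part of the lift is holding the airplane up:

$$\text{vertical lift} = L\cos\phi \;<\; W \quad \text{whenever } L \approx W \text{ and } \phi > 0,$$

where $W$ is the airplane's weight. So the airplane descends and the turn keeps tightening, while the felt load stays near the ordinary one g of level flight.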
This is a difficult notion, and to understand it I went flying with William Langewiesche, the author of a superb book on flying, "Inside the Sky." We met at San Jose Airport, in the jet center where the Silicon Valley billionaires keep their private planes. Langewiesche is a rugged man in his forties, deeply tanned, and handsome in the way that pilots (at least since the movie "The Right Stuff") are supposed to be. We took off at dusk, heading out toward Monterey Bay, until we had left the lights of the coast behind and night had erased the horizon. Langewiesche let the plane bank gently to the left. He took his hands off the stick. The sky told me nothing now, so I concentrated on the instruments. The nose of the plane was dropping. The gyroscope told me that we were banking, first fifteen, then thirty, then forty-five degrees. "We're in a spiral dive," Langewiesche said calmly. Our airspeed was steadily accelerating, from a hundred and eighty to a hundred and ninety to two hundred knots. The needle on the altimeter was moving down. The plane was dropping like a stone, at three thousand feet per minute. I could hear, faintly, a slight increase in the hum of the engine, and the wind noise as we picked up speed. But if Langewiesche and I had been talking I would have caught none of that. Had the cabin been unpressurized, my ears might have popped, particularly as we went into the steep part of the dive. But beyond that? Nothing at all. In a spiral dive, the G-load--the force of inertia--is normal. As Langewiesche puts it, the plane likes to spiral-dive. The total time elapsed since we started diving was no more than six or seven seconds. Suddenly, Langewiesche straightened the wings and pulled back on the stick to get the nose of the plane up, breaking out of the dive. Only now did I feel the full force of the G-load, pushing me back in my seat. "You feel no G-load in a bank," Langewiesche said. "There's nothing more confusing for the uninitiated."
I asked Langewiesche how much longer we could have fallen. "Within five seconds, we would have exceeded the limits of the airplane," he replied, by which he meant that the force of trying to pull out of the dive would have broken the plane into pieces. I looked away from the instruments and asked Langewiesche to spiral-dive again, this time without telling me. I sat and waited. I was about to tell Langewiesche that he could start diving anytime, when, suddenly, I was thrown back in my chair. "We just lost a thousand feet," he said.
This inability to sense, experientially, what your plane is doing is what makes night flying so stressful. And this was the stress that Kennedy must have felt when he turned out across the water at Westerly, leaving the guiding lights of the Connecticut coastline behind him. A pilot who flew into Nantucket that night told the National Transportation Safety Board that when he descended over Martha's Vineyard he looked down and there was "nothing to see. There was no horizon and no light.... I thought the island might [have] suffered a power failure." Kennedy was now blind, in every sense, and he must have known the danger he was in. He had very little experience in flying strictly by instruments. Most of the time when he had flown up to the Vineyard the horizon or lights had still been visible. That strange, final sequence of maneuvers was Kennedy's frantic search for a clearing in the haze. He was trying to pick up the lights of Martha's Vineyard, to restore the lost horizon. Between the lines of the National Transportation Safety Board's report on the crash, you can almost feel his desperation:
About 2138 the target began a right turn in a southerly direction. About 30 seconds later, the target stopped its descent at 2200 feet and began a climb that lasted another 30 seconds. During this period of time, the target stopped the turn, and the airspeed decreased to about 153 KIAS. About 2139, the target leveled off at 2500 feet and flew in a southeasterly direction. About 50 seconds later, the target entered a left turn and climbed to 2600 feet. As the target continued in the left turn, it began a descent that reached a rate of about 900 fpm.
But was he choking or panicking? Here the distinction between those two states is critical. Had he choked, he would have reverted to the mode of explicit learning. His movements in the cockpit would have become markedly slower and less fluid. He would have gone back to the mechanical, self-conscious application of the lessons he had first received as a pilot--and that might have been a good thing. Kennedy needed to think, to concentrate on his instruments, to break away from the instinctive flying that served him when he had a visible horizon.
But instead, from all appearances, he panicked. At the moment when he needed to remember the lessons he had been taught about instrument flying, his mind--like Morphew's when she was underwater--must have gone blank. Instead of reviewing the instruments, he seems to have been focussed on one question: Where are the lights of Martha's Vineyard? His gyroscope and his other instruments may well have become as invisible as the peripheral lights in the underwater-panic experiments. He had fallen back on his instincts--on the way the plane felt--and in the dark, of course, instinct can tell you nothing. The N.T.S.B. report says that the last time the Piper's wings were level was seven seconds past 9:40, and the plane hit the water at about 9:41, so the critical period here was less than sixty seconds. At twenty-five seconds past the minute, the plane was tilted at an angle greater than forty-five degrees. Inside the cockpit it would have felt normal. At some point, Kennedy must have heard the rising wind outside, or the roar of the engine as it picked up speed. Again, relying on instinct, he might have pulled back on the stick, trying to raise the nose of the plane. But pulling back on the stick without first levelling the wings only makes the spiral tighter and the problem worse. It's also possible that Kennedy did nothing at all, and that he was frozen at the controls, still frantically searching for the lights of the Vineyard, when his plane hit the water. Sometimes pilots don't even try to make it out of a spiral dive. Langewiesche calls that "one G all the way down."
What happened to Kennedy that night illustrates a second major difference between panicking and choking. Panicking is conventional failure, of the sort we tacitly understand. Kennedy panicked because he didn't know enough about instrument flying. If he'd had another year in the air, he might not have panicked, and that fits with what we believe--that performance ought to improve with experience, and that pressure is an obstacle that the diligent can overcome. But choking makes little intuitive sense. Novotna's problem wasn't lack of diligence; she was as superbly conditioned and schooled as anyone on the tennis tour. And what did experience do for her? In 1995, in the third round of the French Open, Novotna choked even more spectacularly than she had against Graf, losing to Chanda Rubin after surrendering a 5-0 lead in the third set. There seems little doubt that part of the reason for her collapse against Rubin was her collapse against Graf--that the second failure built on the first, making it possible for her to be up 5-0 in the third set and yet entertain the thought I can still lose. If panicking is conventional failure, choking is paradoxical failure.
Claude Steele, a psychologist at Stanford University, and his colleagues have done a number of experiments in recent years looking at how certain groups perform under pressure, and their findings go to the heart of what is so strange about choking. Steele and Joshua Aronson found that when they gave a group of Stanford undergraduates a standardized test and told them that it was a measure of their intellectual ability, the white students did much better than their black counterparts. But when the same test was presented simply as an abstract laboratory tool, with no relevance to ability, the scores of blacks and whites were virtually identical. Steele and Aronson attribute this disparity to what they call "stereotype threat": when black students are put into a situation where they are directly confronted with a stereotype about their group--in this case, one having to do with intelligence--the resulting pressure causes their performance to suffer.
Steele and others have found stereotype threat at work in any situation where groups are depicted in negative ways. Give a group of qualified women a math test and tell them it will measure their quantitative ability and they'll do much worse than equally skilled men will; present the same test simply as a research tool and they'll do just as well as the men. Or consider a handful of experiments conducted by one of Steele's former graduate students, Julio Garcia, a professor at Tufts University. Garcia gathered together a group of white, athletic students and had a white instructor lead them through a series of physical tests: to jump as high as they could, to do a standing broad jump, and to see how many pushups they could do in twenty seconds. The instructor then asked them to do the tests a second time, and, as you'd expect, Garcia found that the students did a little better on each of the tasks the second time around. Then Garcia ran a second group of students through the tests, this time replacing the instructor between the first and second trials with an African-American. Now the white students ceased to improve on their vertical leaps. He did the experiment again, only this time he replaced the white instructor with a black instructor who was much taller and heavier than the previous black instructor. In this trial, the white students actually jumped less high than they had the first time around. Their performance on the pushups, though, was unchanged in each of the conditions. There is no stereotype, after all, that suggests that whites can't do as many pushups as blacks. The task that was affected was the vertical leap, because of what our culture says: white men can't jump.
It doesn't come as news, of course, that black students aren't as good at test-taking as white students, or that white students aren't as good at jumping as black students. The problem is that we've always assumed that this kind of failure under pressure is panic. What is it we tell underperforming athletes and students? The same thing we tell novice pilots or scuba divers: to work harder, to buckle down, to take the tests of their ability more seriously. But Steele says that when you look at the way black or female students perform under stereotype threat you don't see the wild guessing of a panicked test taker. "What you tend to see is carefulness and second-guessing," he explains. "When you go and interview them, you have the sense that when they are in the stereotype-threat condition they say to themselves, 'Look, I'm going to be careful here. I'm not going to mess things up.' Then, after having decided to take that strategy, they calm down and go through the test. But that's not the way to succeed on a standardized test. The more you do that, the more you will get away from the intuitions that help you, the quick processing. They think they did well, and they are trying to do well. But they are not." This is choking, not panicking. Garcia's athletes and Steele's students are like Novotna, not Kennedy. They failed because they were good at what they did: only those who care about how well they perform ever feel the pressure of stereotype threat. The usual prescription for failure--to work harder and take the test more seriously--would only make their problems worse.
That is a hard lesson to grasp, but harder still is the fact that choking requires us to concern ourselves less with the performer and more with the situation in which the performance occurs. Novotna herself could do nothing to prevent her collapse against Graf. The only thing that could have saved her is if--at that critical moment in the third set--the television cameras had been turned off, the Duke and Duchess had gone home, and the spectators had been told to wait outside. In sports, of course, you can't do that. Choking is a central part of the drama of athletic competition, because the spectators have to be there--and the ability to overcome the pressure of the spectators is part of what it means to be a champion. But the same ruthless inflexibility need not govern the rest of our lives. We have to learn that sometimes a poor performance reflects not the innate ability of the performer but the complexion of the audience; and that sometimes a poor test score is the sign not of a poor student but of a good one.
Through the first three rounds of the 1996 Masters golf tournament, Greg Norman held a seemingly insurmountable lead over his nearest rival, the Englishman Nick Faldo. He was the best player in the world. His nickname was the Shark. He didn't saunter down the fairways; he stalked the course, blond and broad-shouldered, his caddy behind him, struggling to keep up. But then came the ninth hole on the tournament's final day. Norman was paired with Faldo, and the two hit their first shots well. They were now facing the green. In front of the pin, there was a steep slope, so that any ball hit short would come rolling back down the hill into oblivion. Faldo shot first, and the ball landed safely long, well past the cup.
Norman was next. He stood over the ball. "The one thing you guard against here is short," the announcer said, stating the obvious. Norman swung and then froze, his club in midair, following the ball in flight. It was short. Norman watched, stone-faced, as the ball rolled thirty yards back down the hill, and with that error something inside of him broke.
At the tenth hole, he hooked the ball to the left, hit his third shot well past the cup, and missed a makable putt. At eleven, Norman had a three-and-a-half-foot putt for par--the kind he had been making all week. He shook out his hands and legs before grasping the club, trying to relax. He missed: his third straight bogey. At twelve, Norman hit the ball straight into the water. At thirteen, he hit it into a patch of pine needles. At sixteen, his movements were so mechanical and out of synch that, when he swung, his hips spun out ahead of his body and the ball sailed into another pond. At that, he took his club and made a frustrated scythelike motion through the grass, because what had been obvious for twenty minutes was now official: he had fumbled away the chance of a lifetime.
Faldo had begun the day six strokes behind Norman. By the time the two started their slow walk to the eighteenth hole, through the throng of spectators, Faldo had a four-stroke lead. But he took those final steps quietly, giving only the smallest of nods, keeping his head low. He understood what had happened on the greens and fairways that day. And he was bound by the particular etiquette of choking, the understanding that what he had earned was something less than a victory and what Norman had suffered was something less than a defeat.
When it was all over, Faldo wrapped his arms around Norman. "I don't know what to say--I just want to give you a hug," he whispered, and then he said the only thing you can say to a choker: "I feel horrible about what happened. I'm so sorry." With that, the two men began to cry.
The Pitchman
October 30, 2000
ANNALS OF ENTERPRISE
Ron Popeil and the conquest of the American kitchen.
The extraordinary story of the Ronco Showtime Rotisserie & BBQ begins with Nathan Morris, the son of the shoemaker and cantor Kidders Morris, who came over from the Old Country in the eighteen-eighties, and settled in Asbury Park, New Jersey. Nathan Morris was a pitchman. He worked the boardwalk and the five-and-dimes and county fairs up and down the Atlantic coast, selling kitchen gadgets made by Acme Metal, out of Newark. In the early forties, Nathan set up N. K. Morris Manufacturing--turning out the KwiKi-Pi and the Morris Metric Slicer--and perhaps because it was the Depression and job prospects were dim, or perhaps because Nathan Morris made such a compelling case for his new profession, one by one the members of his family followed him into the business. His sons Lester Morris and Arnold (the Knife) Morris became his pitchmen. He set up his brother-in-law Irving Rosenbloom, who was to make a fortune on Long Island in plastic goods, including a hand grater of such excellence that Nathan paid homage to it with his own Dutch Kitchen Shredder Grater. He partnered with his brother Al, whose own sons worked the boardwalk, alongside a gangly Irishman by the name of Ed McMahon. Then, one summer just before the war, Nathan took on as an apprentice his nephew Samuel Jacob Popeil. S.J., as he was known, was so inspired by his uncle Nathan that he went on to found Popeil Brothers, based in Chicago, and brought the world the Dial-O-Matic, the Chop-O-Matic, and the Veg-O-Matic. S. J. Popeil had two sons. The elder was Jerry, who died young. The younger is familiar to anyone who has ever watched an infomercial on late-night television. His name is Ron Popeil.
In the postwar years, many people made the kitchen their life's work. There were the Klinghoffers of New York (one of whom, Leon, died tragically in 1985, during the Achille Lauro incident, when he was pushed overboard in his wheelchair by Palestinian terrorists). They made the Roto-Broil 400, back in the fifties, an early rotisserie for the home, which was pitched by Lester Morris. There was Lewis Salton, who escaped the Nazis with an English stamp from his father's collection and parlayed it into an appliance factory in the Bronx. He brought the world the Salton Hotray--a sort of precursor to the microwave--and today Salton, Inc., sells the George Foreman Grill.
But no rival quite matched the Morris-Popeil clan. They were the first family of the American kitchen. They married beautiful women and made fortunes and stole ideas from one another and lay awake at night thinking of a way to chop an onion so that the only tears you shed were tears of joy. They believed that it was a mistake to separate product development from marketing, as most of their contemporaries did, because to them the two were indistinguishable: the object that sold best was the one that sold itself. They were spirited, brilliant men. And Ron Popeil was the most brilliant and spirited of them all. He was the family's Joseph, exiled to the wilderness by his father only to come back and make more money than the rest of the family combined. He was a pioneer in taking the secrets of the boardwalk pitchmen to the television screen. And, of all the kitchen gadgets in the Morris-Popeil pantheon, nothing has ever been quite so ingenious in its design, or so broad in its appeal, or so perfectly representative of the Morris-Popeil belief in the interrelation of the pitch and the object being pitched, as the Ronco Showtime Rotisserie & BBQ, the countertop oven that can be bought for four payments of $39.95 and may be, dollar for dollar, the finest kitchen appliance ever made.
A Rotisserie Is Born
Ron Popeil is a handsome man, thick through the chest and shoulders, with a leonine head and striking, over-size features. He is in his mid-sixties, and lives in Beverly Hills, halfway up Coldwater Canyon, in a sprawling bungalow with a stand of avocado trees and a vegetable garden out back. In his habits he is, by Beverly Hills standards, old school. He carries his own bags. He has been known to eat at Denny's. He wears T-shirts and sweatpants. As often as twice a day, he can be found buying poultry or fish or meat at one of the local grocery stores--in particular, Costco, which he favors because the chickens there are ninety-nine cents a pound, as opposed to a dollar forty-nine at standard supermarkets. Whatever he buys, he brings back to his kitchen, a vast room overlooking the canyon, with an array of industrial appliances, a collection of fifteen hundred bottles of olive oil, and, in the corner, an oil painting of him, his fourth wife, Robin (a former Frederick's of Hollywood model), and their baby daughter, Contessa. On paper, Popeil owns a company called Ronco Inventions, which has two hundred employees and a couple of warehouses in Chatsworth, California, but the heart of Ronco is really Ron working out of his house, and many of the key players are really just friends of Ron's who work out of their houses, too, and who gather in Ron's kitchen when, every now and again, Ron cooks a soup and wants to talk things over.
In the last thirty years, Ron has invented a succession of kitchen gadgets, among them the Ronco Electric Food Dehydrator and the Popeil Automatic Pasta and Sausage Maker, which featured a thrust bearing made of the same material used in bulletproof glass. He works steadily, guided by flashes of inspiration. This past August, for instance, he suddenly realized what product should follow the Showtime Rotisserie. He and his right-hand man, Alan Backus, had been working on a bread-and-batter machine, which would take up to ten pounds of chicken wings or scallops or shrimp or fish fillets and do all the work--combining the eggs, the flour, the breadcrumbs--in a few minutes, without dirtying either the cook's hands or the machine. "Alan goes to Korea, where we have some big orders coming through," Ron explained recently over lunch--a hamburger, medium-well, with fries--in the V.I.P. booth by the door in the Polo Lounge, at the Beverly Hills Hotel. "I call Alan on the phone. I wake him up. It was two in the morning there. And these are my exact words: `Stop. Do not pursue the bread-and-batter machine. I will pick it up later. This other project needs to come first.' " The other project, his inspiration, was a device capable of smoking meats indoors without creating odors that can suffuse the air and permeate furniture. Ron had a version of the indoor smoker on his porch--"a Rube Goldberg kind of thing" that he'd worked on a year earlier--and, on a whim, he cooked a chicken in it. "That chicken was so good that I said to myself"--and with his left hand Ron began to pound on the table--"This is the best chicken sandwich I have ever had in my life." He turned to me: "How many times have you had a smoked-turkey sandwich? Maybe you have a smoked-turkey or a smoked-chicken sandwich once every six months. Once! How many times have you had smoked salmon? Aah. More. I'm going to say you come across smoked salmon as an hors d'oeuvre or an entrée once every three months. Baby-back ribs? Depends on which restaurant you order ribs at. Smoked sausage, same thing. You touch on smoked food"--he leaned in and poked my arm for emphasis--"but I know one thing, Malcolm. You don't have a smoker."
The idea for the Showtime came about in the same way. Ron was at Costco about four years ago when he suddenly realized that there was a long line of customers waiting to buy chickens from the in-store rotisserie ovens. They touched on rotisserie chicken, but Ron knew one thing: they did not have a rotisserie oven. Ron went home and called Backus. Together, they bought a glass aquarium, a motor, a heating element, a spit rod, and a handful of other spare parts, and began tinkering. Ron wanted something big enough for a fifteen-pound turkey but small enough to fit into the space between the base of an average kitchen cupboard and the countertop. He didn't want a thermostat, because thermostats break, and the constant clicking on and off of the heat prevents the even, crispy browning that he felt was essential. And the spit rod had to rotate on the horizontal axis, not the vertical axis, because if you cooked a chicken or a side of beef on the vertical axis the top would dry out and the juices would drain to the bottom. Roderick Dorman, Ron's patent attorney, says that when he went over to Coldwater Canyon he often saw five or six prototypes on the kitchen counter, lined up in a row. Ron would have a chicken in each of them, so that he could compare the consistency of the flesh and the browning of the skin, and wonder if, say, there was a way to rotate a shish kebab as it approached the heating element so that the inner side of the kebab would get as brown as the outer part. By the time Ron finished, the Showtime prompted no fewer than two dozen patent applications. It was equipped with the most powerful motor in its class. It had a drip tray coated with a nonstick ceramic, which was easily cleaned, and the oven would still work even after it had been dropped on a concrete or stone surface ten times in succession, from a distance of three feet. To Ron, there was no question that it made the best chicken he had ever had in his life.
It was then that Ron filmed a television infomercial for the Showtime, twenty-eight minutes and thirty seconds in length. It was shot live before a studio audience, and aired for the first time on August 8, 1998. It has run ever since, often in the wee hours of the morning, or on obscure cable stations, alongside the get-rich schemes and the "Three's Company" reruns. The response to it has been such that within the next three years total sales of the Showtime should exceed a billion dollars. Ron Popeil didn't use a single focus group. He had no market researchers, R. & D. teams, public-relations advisers, Madison Avenue advertising companies, or business consultants. He did what the Morrises and the Popeils had been doing for most of the century, and what all the experts said couldn't be done in the modern economy. He dreamed up something new in his kitchen and went out and pitched it himself.
Pitchmen
Nathan Morris, Ron Popeil's great-uncle, looked a lot like Cary Grant. He wore a straw boater. He played the ukulele, drove a convertible, and composed melodies for the piano. He ran his business out of a low-slung, whitewashed building on Ridge Avenue, near Asbury Park, with a little annex in the back where he did pioneering work with Teflon. He had certain eccentricities, such as a phobia he developed about travelling beyond Asbury Park without the presence of a doctor. He feuded with his brother Al, who subsequently left in a huff for Atlantic City, and then with his nephew S. J. Popeil, whom Nathan considered insufficiently grateful for the start he had given him in the kitchen-gadget business. That second feud led to a climactic legal showdown over S. J. Popeil's Chop-O-Matic, a food preparer with a pleated, W-shaped blade rotated by a special clutch mechanism. The Chop-O-Matic was ideal for making coleslaw and chopped liver, and when Morris introduced a strikingly similar product, called the Roto-Chop, S. J. Popeil sued his uncle for patent infringement. (As it happened, the Chop-O-Matic itself seemed to have been inspired by the Blitzhacker, from Switzerland, and S.J. later lost a patent judgment to the Swiss.)
The two squared off in Trenton, in May of 1958, in a courtroom jammed with Morrises and Popeils. When the trial opened, Nathan Morris was on the stand, being cross-examined by his nephew's attorneys, who were out to show him that he was no more than a huckster and a copycat. At a key point in the questioning, the judge suddenly burst in. "He took the index finger of his right hand and he pointed it at Morris," Jack Dominik, Popeil's longtime patent lawyer, recalls, "and as long as I live I will never forget what he said. `I know you! You're a pitchman! I've seen you on the boardwalk!' And Morris pointed his index finger back at the judge and shouted, `No! I'm a manufacturer. I'm a dignified manufacturer, and I work with the most eminent of counsel!' " (Nathan Morris, according to Dominik, was the kind of man who referred to everyone he worked with as eminent.) "At that moment," Dominik goes on, "Uncle Nat's face was getting red and the judge's was getting redder, so a recess was called." What happened later that day is best described in Dominik's unpublished manuscript, "The Inventions of Samuel Joseph Popeil by Jack E. Dominik--His Patent Lawyer." Nathan Morris had a sudden heart attack, and S.J. was guilt-stricken. "Sobbing ensued," Dominik writes. "Remorse set in. The next day, the case was settled. Thereafter, Uncle Nat's recovery from his previous day's heart attack was nothing short of a miracle."
Nathan Morris was a performer, like so many of his relatives, and pitching was, first and foremost, a performance. It's said that Nathan's nephew Archie (the Pitchman's Pitchman) Morris once sold, over a long afternoon, gadget after gadget to a well-dressed man. At the end of the day, Archie watched the man walk away, stop and peer into his bag, and then dump the whole lot into a nearby garbage can. The Morrises were that good. "My cousins could sell you an empty box," Ron says.
The last of the Morrises to be active in the pitching business is Arnold (the Knife) Morris, so named because of his extraordinary skill with the Sharpcut, the forerunner of the Ginsu. He is in his early seventies, a cheerful, impish man with a round face and a few wisps of white hair, and a trademark move whereby, after cutting a tomato into neat, regular slices, he deftly lines the pieces up in an even row against the flat edge of the blade. Today, he lives in Ocean Township, a few miles from Asbury Park, with Phyllis, his wife of twenty-nine years, whom he refers to (with the same irresistible conviction that he might use to describe, say, the Feather Touch Knife) as "the prettiest girl in Asbury Park." One morning recently, he sat in his study and launched into a pitch for the Dial-O-Matic, a slicer produced by S. J. Popeil some forty years ago.
"Come on over, folks. I'm going to show you the most amazing slicing machine you have ever seen in your life," he began. Phyllis, sitting nearby, beamed with pride. He picked up a package of barbecue spices, which Ron Popeil sells alongside his Showtime Rotisserie, and used it as a prop. "Take a look at this!" He held it in the air as if he were holding up a Tiffany vase. He talked about the machine's prowess at cutting potatoes, then onions, then tomatoes. His voice, a marvellous instrument inflected with the rhythms of the Jersey Shore, took on a singsong quality: "How many cut tomatoes like this? You stab it. You jab it. The juices run down your elbow. With the Dial-O-Matic, you do it a little differently. You put it in the machine and you wiggle"--he mimed fixing the tomato to the bed of the machine. "The tomato! Lady! The tomato! The more you wiggle, the more you get. The tomato! Lady! Every slice comes out perfectly, not a seed out of place. But the thing I love my Dial-O-Matic for is coleslaw. My mother-in-law used to take her cabbage and do this." He made a series of wild stabs at an imaginary cabbage. "I thought she was going to commit suicide. Oh, boy, did I pray--that she wouldn't slip! Don't get me wrong. I love my mother-in-law. It's her daughter I can't figure out. You take the cabbage. Cut it in half. Coleslaw, hot slaw. Pot slaw. Liberty slaw. It comes out like shredded wheat . . ."
It was a vaudeville monologue, except that Arnold wasn't merely entertaining; he was selling. "You can take a pitchman and make a great actor out of him, but you cannot take an actor and always make a great pitchman out of him," he says. The pitchman must make you applaud and take out your money. He must be able to execute what in pitchman's parlance is called "the turn"--the perilous, crucial moment where he goes from entertainer to businessman. If, out of a crowd of fifty, twenty-five people come forward to buy, the true pitchman sells to only twenty of them. To the remaining five, he says, "Wait! There's something else I want to show you!" Then he starts his pitch again, with slight variations, and the remaining four or five become the inner core of the next crowd, hemmed in by the people around them, and so eager to pay their money and be on their way that they start the selling frenzy all over again. The turn requires the management of expectation. That's why Arnold always kept a pineapple tantalizingly perched on his stand. "For forty years, I've been promising to show people how to cut the pineapple, and I've never cut it once," he says. "It got to the point where a pitchman friend of mine went out and bought himself a plastic pineapple. Why would you cut the pineapple? It cost a couple bucks. And if you cut it they'd leave." Arnold says that he once hired some guys to pitch a vegetable slicer for him at a fair in Danbury, Connecticut, and became so annoyed at their lackadaisical attitude that he took over the demonstration himself. They were, he says, waiting for him to fail: he had never worked that particular slicer before and, sure enough, he was massacring the vegetables. Still, in a single pitch he took in two hundred dollars. "Their eyes popped out of their heads," Arnold recalls. "They said, `We don't understand it. You don't even know how to work the damn machine.' I said, `But I know how to do one thing better than you.' They said, `What's that?' I said, `I know how to ask for the money.' And that's the secret to the whole damn business."
Ron Popeil started pitching his father's kitchen gadgets at the Maxwell Street flea market in Chicago, in the mid-fifties. He was thirteen. Every morning, he would arrive at the market at five and prepare fifty pounds each of onions, cabbages, and carrots, and a hundred pounds of potatoes. He sold from six in the morning until four in the afternoon, bringing in as much as five hundred dollars a day. In his late teens, he started doing the state- and county-fair circuit, and then he scored a prime spot in the Woolworth's at State and Washington, in the Loop, which at the time was the top-grossing Woolworth's store in the country. He was making more than the manager of the store, selling the Chop-O-Matic and the Dial-O-Matic. He dined at the Pump Room and wore a Rolex and rented hundred-and-fifty-dollar-a-night hotel suites. In pictures from the period, he is beautiful, with thick dark hair and blue-green eyes and sensuous lips, and, several years later, when he moved his office to 919 Michigan Avenue, he was called the Paul Newman of the Playboy Building. Mel Korey, a friend of Ron's from college and his first business partner, remembers the time he went to see Ron pitch the Chop-O-Matic at the State Street Woolworth's. "He was mesmerizing," Korey says. "There were secretaries who would take their lunch break at Woolworth's to watch him because he was so good-looking. He would go into the turn, and people would just come running." Several years ago, Ron's friend Steve Wynn, the founder of the Mirage resorts, went to visit Michael Milken in prison. They were near a television, and happened to catch one of Ron's infomercials just as he was doing the countdown, a routine taken straight from the boardwalk, where he says, "You're not going to spend two hundred dollars, not a hundred and eighty dollars, not one-seventy, not one-sixty . . ." It's a standard pitchman's gimmick: it sounds dramatic only because the starting price is set way up high. But something about the way Ron did it was irresistible. As he got lower and lower, Wynn and Milken--who probably know as much about profit margins as anyone in America--cried out in unison, "Stop, Ron! Stop!"
Was Ron the best? The only attempt to settle the question definitively was made some forty years ago, when Ron and Arnold were working a knife set at the Eastern States Exposition, in West Springfield, Massachusetts. A third man, Frosty Wishon, who was a legend in his own right, was there, too. "Frosty was a well-dressed, articulate individual and a good salesman," Ron says. "But he thought he was the best. So I said, `Well, guys, we've got a ten-day show, eleven, maybe twelve hours a day. We'll each do a rotation, and we'll compare how much we sell.'" In Morris-Popeil lore, this is known as "the shoot-out," and no one has ever forgotten the outcome. Ron beat Arnold, but only by a whisker--no more than a few hundred dollars. Frosty Wishon, meanwhile, sold only half as much as either of his rivals. "You have no idea the pressure Frosty was under," Ron continues. "He came up to me at the end of the show and said, `Ron, I will never work with you again as long as I live.' "
No doubt Frosty Wishon was a charming and persuasive person, but he assumed that this was enough--that the rules of pitching were the same as the rules of celebrity endorsement. When Michael Jordan pitches McDonald's hamburgers, Michael Jordan is the star. But when Ron Popeil or Arnold Morris pitched, say, the Chop-O-Matic, his gift was to make the Chop-O-Matic the star. It was, after all, an innovation. It represented a different way of dicing onions and chopping liver: it required consumers to rethink the way they went about their business in the kitchen. Like most great innovations, it was disruptive. And how do you persuade people to disrupt their lives? Not merely by ingratiation or sincerity, and not by being famous or beautiful. You have to explain the invention to customers--not once or twice but three or four times, with a different twist each time. You have to show them exactly how it works and why it works, and make them follow your hands as you chop liver with it, and then tell them precisely how it fits into their routine, and, finally, sell them on the paradoxical fact that, revolutionary as the gadget is, it's not at all hard to use.
Thirty years ago, the videocassette recorder came on the market, and it was a disruptive product, too: it was supposed to make it possible to tape a television show so that no one would ever again be chained to the prime-time schedule. Yet, as ubiquitous as the VCR became, it was seldom put to that purpose. That's because the VCR was never pitched: no one ever explained the gadget to American consumers--not once or twice but three or four times--and no one showed them exactly how it worked or how it would fit into their routine, and no pair of hands guided them through every step of the process. All the VCR-makers did was hand over the box with a smile and a pat on the back, tossing in an instruction manual for good measure. Any pitchman could have told you that wasn't going to do it.
Once, when I was over at Ron's house in Coldwater Canyon, sitting on one of the high stools in his kitchen, he showed me what real pitching is all about. He was talking about how he had just had dinner with the actor Ron Silver, who is playing Ron's friend Robert Shapiro in a new movie about the O. J. Simpson trial. "They shave the back of Ron Silver's head so that he's got a bald spot, because, you know, Bob Shapiro's got a bald spot back there, too," Ron said. "So I say to him, `You've gotta get GLH.' " GLH, one of Ron's earlier products, is an aerosol spray designed to thicken the hair and cover up bald spots. "I told him, `It will make you look good. When you've got to do the scene, you shampoo it out.' "
At this point, the average salesman would have stopped. The story was an aside, no more. We had been discussing the Showtime Rotisserie, and on the counter behind us was a Showtime cooking a chicken and next to it a Showtime cooking baby-back ribs, and on the table in front of him Ron's pasta maker was working, and he was frying some garlic so that we could have a little lunch. But now that he had told me about GLH it was unthinkable that he would not also show me its wonders. He walked quickly over to a table at the other side of the room, talking as he went. "People always ask me, `Ron, where did you get that name GLH?' I made it up. Great-Looking Hair." He picked up a can. "We make it in nine different colors. This is silver-black." He picked up a hand mirror and angled it above his head so that he could see his bald spot. "Now, the first thing I'll do is spray it where I don't need it." He shook the can and began spraying the crown of his head, talking all the while. "Then I'll go to the area itself." He pointed to his bald spot. "Right here. O.K. Now I'll let that dry. Brushing is fifty per cent of the way it's going to look." He began brushing vigorously, and suddenly Ron Popeil had what looked like a complete head of hair. "Wow," I said. Ron glowed. "And you tell me `Wow.' That's what everyone says. `Wow.' That's what people say who use it. `Wow.' If you go outside"--he grabbed me by the arm and pulled me out onto the deck--"if you are in bright sunlight or daylight, you cannot tell that I have a big bald spot in the back of my head. It really looks like hair, but it's not hair. It's quite a product. It's incredible. Any shampoo will take it out. You know who would be a great candidate for this? Al Gore. You want to see how it feels?" Ron inclined the back of his head toward me. I had said, "Wow," and had looked at his hair inside and outside, but the pitchman in Ron Popeil wasn't satisfied. I had to feel the back of his head. I did. It felt just like real hair.
The Tinkerer
Ron Popeil inherited more than the pitching tradition of Nathan Morris. He was very much the son of S. J. Popeil, and that fact, too, goes a long way toward explaining the success of the Showtime Rotisserie. S.J. had a ten-room apartment high in the Drake Towers, near the top of Chicago's Magnificent Mile. He had a chauffeured Cadillac limousine with a car phone, a rarity in those days, which he delighted in showing off (as in "I'm calling you from the car"). He wore three-piece suits and loved to play the piano. He smoked cigars and scowled a lot and made funny little grunting noises as he talked. He kept his money in T-bills. His philosophy was expressed in a series of epigrams: To his attorney, "If they push you far enough, sue"; to his son, "It's not how much you spend, it's how much you make." And, to a designer who expressed doubts about the utility of one of his greatest hits, the Pocket Fisherman, "It's not for using; it's for giving." In 1974, S.J.'s second wife, Eloise, decided to have him killed, so she hired two hit men--one of whom, aptly, went by the name of Mr. Peeler. At the time, she was living at the Popeil estate in Newport Beach with her two daughters and her boyfriend, a thirty-seven-year-old machinist. When, at Eloise's trial, S.J. was questioned about the machinist, he replied, "I was kind of happy to have him take her off my hands." That was vintage S.J. But eleven months later, after Eloise got out of prison, S.J. married her again. That was vintage S.J., too. As a former colleague of his puts it, "He was a strange bird."
S. J. Popeil was a tinkerer. In the middle of the night, he would wake up and make frantic sketches on a pad he kept on his bedside table. He would disappear into his kitchen for hours and make a huge mess, and come out with a faraway look on his face. He loved standing behind his machinists, peering over their shoulders while they were assembling one of his prototypes. In the late forties and early fifties, he worked almost exclusively in plastic, reinterpreting kitchen basics with a subtle, modernist flair. "Popeil Brothers made these beautiful plastic flour sifters," Tim Samuelson, a curator at the Chicago Historical Society and a leading authority on the Popeil legacy, says. "They would use contrasting colors, or a combination of opaque plastic with a translucent swirl plastic." Samuelson became fascinated with all things Popeil after he acquired an original Popeil Brothers doughnut maker, in red-and-white plastic, which he felt "had beautiful lines"; to this day, in the kitchen of his Hyde Park high-rise, he uses the Chop-O-Matic in the preparation of salad ingredients. "There was always a little twist to what he did," Samuelson goes on. "Take the Popeil automatic egg turner. It looks like a regular spatula, but if you squeeze the handle the blade turns just enough to flip a fried egg."
Walter Herbst, a designer whose firm worked with Popeil Brothers for many years, says that S.J.'s modus operandi was to "come up with a holistic theme. He'd arrive in the morning with it. It would be something like"--Herbst assumes S.J.'s gruff voice--" 'We need a better way to shred cabbage.' It was a passion, an absolute goddam passion. One morning, he must have been eating grapefruit, because he comes to work and calls me and says, 'We need a better way to cut grapefruit!' " The idea they came up with was a double-bladed paring knife, with the blades separated by a fraction of an inch so that both sides of the grapefruit membrane could be cut simultaneously. "There was a little grocery store a few blocks away," Herbst says. "So S.J. sends the chauffeur out for grapefruit. How many? Six. Well, over the period of a couple of weeks, six turns to twelve and twelve turns to twenty, until we were cutting thirty to forty grapefruits a day. I don't know if that little grocery store ever knew what happened."
S. J. Popeil's finest invention was undoubtedly the Veg-O-Matic, which came on the market in 1960 and was essentially a food processor, a Cuisinart without the motor. The heart of the gadget was a series of slender, sharp blades strung like guitar strings across two Teflon-coated metal rings, which were made in Woodstock, Illinois, from 364 Alcoa, a special grade of aluminum. When the rings were aligned on top of each other so that the blades ran parallel, a potato or an onion pushed through would come out in perfect slices. If the top ring was rotated, the blades formed a crosshatch, and a potato or an onion pushed through would come out diced. The rings were housed in a handsome plastic assembly, with a plunger to push the vegetables through the blades. Technically, the Veg-O-Matic was a triumph: the method of creating blades strong enough to withstand the assault of vegetables received a U.S. patent. But from a marketing perspective it posed a problem. S.J.'s products had hitherto been sold by pitchmen armed with a mound of vegetables meant to carry them through a day's worth of demonstrations. But the Veg-O-Matic was too good. In a single minute, according to the calculations of Popeil Brothers, it could produce a hundred and twenty egg wedges, three hundred cucumber slices, eleven hundred and fifty potato shoestrings, or three thousand onion dices. It could go through what used to be a day's worth of vegetables in a matter of minutes. The pitchman could no longer afford to pitch to just a hundred people at a time; he had to pitch to a hundred thousand. The Veg-O-Matic needed to be sold on television, and one of the very first pitchmen to grasp this fact was Ron Popeil.
In the summer of 1964, just after the Veg-O-Matic was introduced, Mel Korey joined forces with Ron Popeil in a company called Ronco. They shot a commercial for the Veg-O-Matic for five hundred dollars, a straightforward pitch shrunk to two minutes, and set out from Chicago for the surrounding towns of the Midwest. They cold-called local department stores and persuaded them to carry the Veg-O-Matic on guaranteed sale, which meant that whatever the stores didn't sell could be returned. Then they visited the local television station and bought a two- or three-week run of the cheapest airtime they could find, praying that it would be enough to drive traffic to the store. "We got Veg-O-Matics wholesale for $3.42," Korey says. "They retailed for $9.95, and we sold them to the stores for $7.46, which meant that we had four dollars to play with. If I spent a hundred dollars on television, I had to sell twenty-five Veg-O-Matics to break even." It was clear, in those days, that you could use television to sell kitchen products if you were Procter & Gamble. It wasn't so clear that this would work if you were Mel Korey and Ron Popeil, two pitchmen barely out of their teens selling a combination slicer-dicer that no one had ever heard of. They were taking a wild gamble, and, to their amazement, it paid off. "They had a store in Butte, Montana--Hennessy's," Korey goes on, thinking back to those first improbable years. "Back then, people there were still wearing peacoats. The city was mostly bars. It had just a few three-story buildings. There were twenty-seven thousand people, and one TV station. I had the Veg-O-Matic, and I go to the store, and they said, 'We'll take a case. We don't have a lot of traffic here.' I go to the TV station and the place is a dump. The only salesperson was going blind and deaf. So I do a schedule. For five weeks, I spend three hundred and fifty dollars. I figure if I sell a hundred and seventy-four machines--six cases--I'm happy. I go back to Chicago, and I walk into the office one morning and the phone is ringing. They said, 'We sold out. You've got to fly us another six cases of Veg-O-Matics.' The next week, on Monday, the phone rings. It's Butte again: 'We've got a hundred and fifty oversold.' I fly him another six cases. Every few days after that, whenever the phone rang we'd look at each other and say, 'Butte, Montana.' " Even today, thirty years later, Korey can scarcely believe it. "How many homes in total in that town? Maybe several thousand? We ended up selling two thousand five hundred Veg-O-Matics in five weeks!"
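Korey's back-of-the-envelope arithmetic is easy to verify. What follows is a minimal, purely illustrative sketch in Python using the dollar figures from his account; the function and variable names are hypothetical, not anything Ronco ever used.

    # A sketch of Mel Korey's break-even arithmetic for the Veg-O-Matic
    # television buys. Figures come from his account above; the names are
    # hypothetical, for illustration only.
    import math

    def breakeven_units(wholesale_cost, price_to_stores, ad_spend):
        # The spread between the store price and the wholesale cost is the
        # money available to cover airtime ("four dollars to play with").
        margin_per_unit = price_to_stores - wholesale_cost
        return math.ceil(ad_spend / margin_per_unit)

    # Bought wholesale at $3.42 and sold to the stores at $7.46: a hundred
    # dollars of airtime has to move about twenty-five machines to break even.
    print(breakeven_units(3.42, 7.46, 100.00))  # -> 25

On those margins, the Butte numbers Korey remembers suggest the gamble paid for itself many times over.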
Why did the Veg-O-Matic sell so well? Doubtless, Americans were eager for a better way of slicing vegetables. But it was more than that: the Veg-O-Matic represented a perfect marriage between the medium (television) and the message (the gadget). The Veg-O-Matic was, in the relevant sense, utterly transparent. You took the potato and you pushed it through the Teflon-coated rings and--voilà!--you had French fries. There were no buttons being pressed, no hidden and intimidating gears: you could show-and-tell the Veg-O-Matic in a two-minute spot and allay everyone's fears about a daunting new technology. More specifically, you could train the camera on the machine and compel viewers to pay total attention to the product you were selling. TV allowed you to do even more effectively what the best pitchmen strove to do in live demonstrations--make the product the star.
This was a lesson Ron Popeil never forgot. In his infomercial for the Showtime Rotisserie, he opens not with himself but with a series of shots of meat and poultry, glistening almost obscenely as they rotate in the Showtime. A voice-over describes each shot: a "delicious six-pound chicken," a "succulent whole duckling," a "mouthwatering pork-loin roast . . ." Only then do we meet Ron, in a sports coat and jeans. He explains the problems of conventional barbecues, how messy and unpleasant they are. He bangs a hammer against the door of the Showtime, to demonstrate its strength. He deftly trusses a chicken, impales it on the patented two-pronged Showtime spit rod, and puts it into the oven. Then he repeats the process with a pair of chickens, salmon steaks garnished with lemon and dill, and a rib roast. All the time, the camera is on his hands, which are in constant motion, manipulating the Showtime apparatus gracefully, with his calming voice leading viewers through every step: "All I'm going to do here is slide it through like this. It goes in very easily. I'll match it up over here. What I'd like to do is take some herbs and spices here. All I'll do is slide it back. Raise up my glass door here. I'll turn it to a little over an hour. . . . Just set it and forget it."
Why does this work so well? Because the Showtime--like the Veg-O-Matic before it--was designed to be the star. From the very beginning, Ron insisted that the entire door be a clear pane of glass, and that it slant back to let in the maximum amount of light, so that the chicken or the turkey or the baby-back ribs turning inside would be visible at all times. Alan Backus says that after the first version of the Showtime came out Ron began obsessing over the quality and evenness of the browning and became convinced that the rotation speed of the spit wasn't quite right. The original machine moved at four revolutions per minute. Ron set up a comparison test in his kitchen, cooking chicken after chicken at varying speeds until he determined that the optimal speed of rotation was actually six r.p.m. One can imagine a bright-eyed M.B.A. clutching a sheaf of focus-group reports and arguing that Ronco was really selling convenience and healthful living, and that it was foolish to spend hundreds of thousands of dollars retooling production in search of a more even golden brown. But Ron understood that the perfect brown is important for the same reason that the slanted glass door is important: because in every respect the design of the product must support the transparency and effectiveness of its performance during a demonstration--the better it looks onstage, the easier it is for the pitchman to go into the turn and ask for the money.
If Ron had been the one to introduce the VCR, in other words, he would not simply have sold it in an infomercial. He would also have changed the VCR itself, so that it made sense in an infomercial. The clock, for example, wouldn't be digital. (The haplessly blinking unset clock has, of course, become a symbol of frustration.) The tape wouldn't be inserted behind a hidden door--it would be out in plain view, just like the chicken in the rotisserie, so that if it was recording you could see the spools turn. The controls wouldn't be discreet buttons; they would be large, and they would make a reassuring click as they were pushed up and down, and each step of the taping process would be identified with a big, obvious numeral so that you could set it and forget it. And would it be a slender black, low-profile box? Of course not. Ours is a culture in which the term "black box" is synonymous with incomprehensibility. Ron's VCR would be in red-and-white plastic, both opaque and translucent swirl, or maybe 364 Alcoa aluminum, painted in some bold primary color, and it would sit on top of the television, not below it, so that when your neighbor or your friend came over he would spot it immediately and say, "Wow, you have one of those Ronco Tape-O-Matics!"
A Real Piece of Work
Ron Popeil did not have a happy childhood. "I remember baking a potato. It must have been when I was four or five years old," he told me. We were in his kitchen, and had just sampled some baby-back ribs from the Showtime. It had taken some time to draw the memories out of him, because he is not one to dwell on the past. "I couldn't get that baked potato into my stomach fast enough, because I was so hungry." Ron is normally in constant motion, moving his hands, chopping food, bustling back and forth. But now he was still. His parents split up when he was very young. S.J. went off to Chicago. His mother disappeared. He and his older brother, Jerry, were shipped off to a boarding school in upstate New York. "I remember seeing my mother on one occasion. I don't remember seeing my father, ever, until I moved to Chicago, at thirteen. When I was in the boarding school, the thing I remember was a Sunday when the parents visited the children, and my parents never came. Even knowing that they weren't going to show up, I walked out to the perimeter and looked out over the farmland, and there was this road." He made an undulating motion with his hand to suggest a road stretching off into the distance. "I remember standing on the road crying, looking for the movement of a car miles away, hoping that it was my mother and father. And they never came. That's all I remember about boarding school." Ron remained perfectly still. "I don't remember ever having a birthday party in my life. I remember that my grandparents took us out and we moved to Florida. My grandfather used to tie me down in bed--my hands, my wrists, and my feet. Why? Because I had a habit of turning over on my stomach and bumping my head either up and down or side to side. Why? How? I don't know the answers. But I was spread-eagle, on my back, and if I was able to twist over and do it my grandfather would wake up at night and come in and beat the hell out of me." Ron stopped, and then added, "I never liked him. I never knew my mother or her parents or any of that family. That's it. Not an awful lot to remember. Obviously, other things took place. But they have been erased."
When Ron came to Chicago, at thirteen, with his grandparents, he was put to work in the Popeil Brothers factory--but only on the weekends, when his father wasn't there. "Canned salmon and white bread for lunch, that was the diet," he recalls. "Did I live with my father? Never. I lived with my grandparents." When he became a pitchman, his father gave him just one advantage: he extended his son credit. Mel Korey says that he once drove Ron home from college and dropped him off at his father's apartment. "He had a key to the apartment, and when he walked in his dad was in bed already. His dad said, 'Is that you, Ron?' And Ron said, 'Yeah.' And his dad never came out. And by the next morning Ron still hadn't seen him." Later, when Ron went into business for himself, he was persona non grata around Popeil Brothers. "Ronnie was never allowed in the place after that," one of S.J.'s former associates recalls. "He was never let in the front door. He was never allowed to be part of anything." "My father," Ron says simply, "was all business. I didn't know him personally."
Here is a man who constructed his life in the image of his father--who went into the same business, who applied the same relentless attention to the workings of the kitchen, who got his start by selling his father's own products--and where was his father? "You know, they could have done wonders together," Korey says, shaking his head. "I remember one time we talked with K-tel about joining forces, and they said that we would be a war machine--that was their word. Well, Ron and his dad, they could have been a war machine." For all that, it is hard to find in Ron even a trace of bitterness. Once, I asked him, "Who are your inspirations?" The first name came easily: his good friend Steve Wynn. He was silent for a moment, and then he added, "My father." Despite everything, Ron clearly found in his father's example a tradition of irresistible value. And what did Ron do with that tradition? He transcended it. He created the Showtime, which is indisputably a better gadget, dollar for dollar, than the Morris Metric Slicer, the Dutch Kitchen Shredder Grater, the Chop-O-Matic, and the Veg-O-Matic combined.
When I was in Ocean Township, visiting Arnold Morris, he took me to the local Jewish cemetery, Chesed Shel Ames, on a small hilltop just outside town. We drove slowly through the town's poorer sections in Arnold's white Mercedes. It was a rainy day. At the cemetery, a man stood out front in an undershirt, drinking a beer. We entered through a little rusty gate. "This is where it all starts," Arnold said, by which he meant that everyone--the whole spirited, squabbling clan--was buried here. We walked up and down the rows until we found, off in a corner, the Morris headstones. There was Nathan Morris, of the straw boater and the opportune heart attack, and next to him his wife, Betty. A few rows over was the family patriarch, Kidders Morris, and his wife, and a few rows from there Irving Rosenbloom, who made a fortune in plastic goods out on Long Island. Then all the Popeils, in tidy rows: Ron's grandfather Isadore, who was as mean as a snake, and his wife, Mary; S.J., who turned a cold shoulder to his own son; Ron's brother, Jerry, who died young. Ron was from them, but he was not of them. Arnold walked slowly among the tombstones, the rain dancing off his baseball cap, and then he said something that seemed perfectly right. "You know, I'll bet you you'll never find Ronnie here."
On the Air
One Saturday night a few weeks ago, Ron Popeil arrived at the headquarters of the television shopping network QVC, a vast gleaming complex nestled in the woods of suburban Philadelphia. Ron is a regular on QVC. He supplements his infomercials with occasional appearances on the network, and, for twenty-four hours beginning that midnight, QVC had granted him eight live slots, starting with a special "Ronco" hour between midnight and 1 a.m. Ron was travelling with his daughter Shannon, who had got her start in the business selling the Ronco Electric Food Dehydrator on the fair circuit, and the plan was that the two of them would alternate throughout the day. They were pitching a Digital Jog Dial version of the Showtime, in black, available for one day only, at a "special value" of $129.72.
In the studio, Ron had set up eighteen Digital Jog Dial Showtimes on five wood-panelled gurneys. From Los Angeles, he had sent, via Federal Express, dozens of Styrofoam containers with enough meat for each of the day's airings: eight fifteen-pound turkeys, seventy-two hamburgers, eight legs of lamb, eight ducks, thirty-odd chickens, two dozen or so Rock Cornish game hens, and on and on, supplementing them with garnishes, trout, and some sausage bought that morning at three Philadelphia-area supermarkets. QVC's target was thirty-seven thousand machines, meaning that it hoped to gross about $4.5 million during the twenty-four hours--a huge day, even by the network's standards. Ron seemed tense. He barked at the team of QVC producers and cameramen bustling around the room. He fussed over the hero plates--the ready-made dinners that he would use to showcase meat taken straight from the oven. "Guys, this is impossible," he said, peering at a tray of mashed potatoes and gravy. "The level of gravy must be higher." He was limping a little. "You know, there's a lot of pressure on you," he said wearily. " 'How did Ron do? Is he still the best?' "
With just a few minutes to go, Ron ducked into the greenroom next to the studio to put GLH in his hair: a few aerosol bursts, followed by vigorous brushing. "Where is God right now?" his co-host, Rick Domeier, yelled out, looking around theatrically for his guest star. "Is God backstage?" Ron then appeared, resplendent in a chef's coat, and the cameras began to roll. He sliced open a leg of lamb. He played with the dial of the new digital Showtime. He admired the crispy, succulent skin of the duck. He discussed the virtues of the new food-warming feature--where the machine would rotate at low heat for up to four hours after the meat was cooked in order to keep the juices moving--and, all the while, bantered so convincingly with viewers calling in on the testimonial line that it was as if he were back mesmerizing the secretaries in the Woolworth's at State and Washington.
In the greenroom, there were two computer monitors. The first displayed a line graph charting the number of calls that came in at any given second. The second was an electronic ledger showing the total sales up to that point. As Ron took flight, one by one, people left the studio to gather around the computers. Shannon Popeil came first. It was 12:40 a.m. In the studio, Ron was slicing onions with one of his father's Dial-O-Matics. She looked at the second monitor and gave a little gasp. Forty minutes in, and Ron had already passed seven hundred thousand dollars. A QVC manager walked in. It was 12:48 a.m., and Ron was roaring on: $837,650. "It can't be!" he cried out. "That's unbelievable!" Two QVC producers came over. One of them pointed at the first monitor, which was graphing the call volume. "Jump," he called out. "Jump!" There were only a few minutes left. Ron was extolling the virtues of the oven one final time, and, sure enough, the line began to take a sharp turn upward, as all over America viewers took out their wallets. The numbers on the second screen began to change in a blur of recalculation--rising in increments of $129.72 plus shipping and taxes. "You know, we're going to hit a million dollars, just on the first hour," one of the QVC guys said, and there was awe in his voice. It was one thing to talk about how Ron was the best there ever was, after all, but quite another to see proof of it, before your very eyes. At that moment, on the other side of the room, the door opened, and a man appeared, stooped and drawn but with a smile on his face. It was Ron Popeil, who invented a better rotisserie in his kitchen and went out and pitched it himself. There was a hush, and then the whole room stood up and cheered.
Dept. of Useful Things
November 27, 2000
Out of the Frying Pan, Into the Voting Booth
My parents have an electric stove in their kitchen made by a company called Moffat. It has four burners on top and a raised panel that runs across the back with a set of knobs on it, and the way the panel is laid out has always been a bone of contention in our family. The knobs for the two left-hand burners are on the left side of the back panel, stacked one on top of the other, with the top knob controlling the back burner and the bottom knob controlling the front burner--same thing on the right-hand side. My mother finds this logical. But not my father. Every time he looks at the stove, he gets confused and thinks that the top knob controls the front burner.
Does this mean that my mother is more rational than my father? I don't think so. It simply means that any time you create a visual guide to an action that isn't intuitive--that requires some kind of interpretation or physical contortion--you're going to baffle some people. From the perspective of "usability" researchers, my father has fallen victim to an ill-designed interface. People who pop the trunk of their car when they mean to pop the gas-tank lid are experiencing the same kind of confusion, as the singer John Denver did, apparently, when he died in an airplane crash a few years ago. Denver was flying a new, experimental plane, and may not have realized how little fuel he had, since the fuel gauge wasn't linear, the way you'd expect it to be. When the line on that sort of gauge registers one-quarter, for example, it doesn't mean that the twenty-six-gallon tank is a quarter full; it means that the tank has less than five gallons left.
Then, there's the question of voting. Susan King Roth, an associate professor of visual communication at Ohio State University, did an experiment recently with voting machines and found that a surprising number of the people in her study didn't vote on the issues section of the ballot. Why? Because the issues proposals were at the top of the ballot, sixty-seven inches from the floor, and the eye height of the average American woman is sixty inches. Some people in the study simply couldn't see the proposals.
The Florida butterfly ballot may be the textbook example of what can go wrong when design isn't intuitive. The usability expert Kevin Fox has identified three "cognitive paths" that could have led voters to misunderstand the butterfly layout: gestalt grouping, linear visual search, and numeric mapping--all of which point out that the way the butterfly ballot invites itself to be read does not match the way it invites voters to act. In the language of usability studies, there is an incompatibility between input and output. In a sense, it's just like the problem with the Moffat stove. My father hasn't burned down the house yet. But sometimes he puts a pot on the back burner and turns on the front burner. If he's used the stove twenty thousand times in his life, it's a reasonable guess that he's made this mistake maybe a few hundred times. In the grand scheme of things, that's not a very high percentage. Then again, sometimes a few hundred mistakes can turn out to be awfully important.
Designs For Working
December 11, 2000
DEPT. OF HUMAN RESOURCES
Why your bosses want to turn your
new office into Greenwich Village.
1.
In the early nineteen-sixties, Jane Jacobs lived on Hudson Street, in Greenwich Village, near the intersection of Eighth Avenue and Bleecker Street. It was then, as now, a charming district of nineteenth-century tenements and town houses, bars and shops, laid out over an irregular grid, and Jacobs loved the neighborhood. In her 1961 masterpiece, "The Death and Life of Great American Cities," she rhapsodized about the White Horse Tavern down the block, home to Irish longshoremen and writers and intellectuals--a place where, on a winter's night, as "the doors open, a solid wave of conversation and animation surges out and hits you." Her Hudson Street had Mr. Slube, at the cigar store, and Mr. Lacey, the locksmith, and Bernie, the candy-store owner, who, in the course of a typical day, supervised the children crossing the street, lent an umbrella or a dollar to a customer, held on to some keys or packages for people in the neighborhood, and "lectured two youngsters who asked for cigarettes." The street had "bundles and packages, zigzagging from the drug store to the fruit stand and back over to the butcher's," and "teenagers, all dressed up, are pausing to ask if their slips show or their collars look right." It was, she said, an urban ballet.
The miracle of Hudson Street, according to Jacobs, was created by the particular configuration of the streets and buildings of the neighborhood. Jacobs argued that when a neighborhood is oriented toward the street, when sidewalks are used for socializing and play and commerce, the users of that street are transformed by the resulting stimulation: they form relationships and casual contacts they would never have otherwise. The West Village, she pointed out, was blessed with a mixture of houses and apartments and shops and offices and industry, which meant that there were always people "outdoors on different schedules and... in the place for different purposes." It had short blocks, and short blocks create the greatest variety in foot traffic. It had lots of old buildings, and old buildings have the low rents that permit individualized and creative uses. And, most of all, it had people, cheek by jowl, from every conceivable walk of life. Sparsely populated suburbs may look appealing, she said, but without an active sidewalk life, without the frequent, serendipitous interactions of many different people, "there is no public acquaintanceship, no foundation of public trust, no cross-connections with the necessary people--and no practice or ease in applying the most ordinary techniques of city public life at lowly levels."
Jane Jacobs did not win the battle she set out to fight. The West Village remains an anomaly. Most developers did not want to build the kind of community Jacobs talked about, and most Americans didn't want to live in one. To reread "Death and Life" today, however, is to be struck by how the intervening years have given her arguments a new and unexpected relevance. Who, after all, has a direct interest in creating diverse, vital spaces that foster creativity and serendipity? Employers do. On the fortieth anniversary of its publication, "Death and Life" has been reborn as a primer on workplace design.
The parallels between neighborhoods and offices are striking. There was a time, for instance, when companies put their most valued employees in palatial offices, with potted plants in the corner, and secretaries out front, guarding access. Those offices were suburbs--gated communities, in fact--and many companies came to realize that if their best employees were isolated in suburbs they would be deprived of public acquaintanceship, the foundations of public trust, and cross-connections with the necessary people. In the eighties and early nineties, the fashion in corporate America was to follow what designers called "universal planning"--rows of identical cubicles, which resembled nothing so much as a Levittown. Today, universal planning has fallen out of favor, for the same reason that the postwar suburbs like Levittown did: to thrive, an office space must have a diversity of uses--it must have the workplace equivalent of houses and apartments and shops and industry.
If you visit the technology companies of Silicon Valley, or the media companies of Manhattan, or any of the firms that self-consciously identify themselves with the New Economy, you'll find that secluded private offices have been replaced by busy public spaces, open-plan areas without walls, executives next to the newest hires. The hush of the traditional office has been supplanted by something much closer to the noisy, bustling ballet of Hudson Street. Forty years ago, people lived in neighborhoods like the West Village and went to work in the equivalent of suburbs. Now, in one of the odd reversals that mark the current economy, they live in suburbs and, increasingly, go to work in the equivalent of the West Village.
2.
The office used to be imagined as a place where employees punch clocks and bosses roam the halls like high-school principals, looking for miscreants. But when employees sit chained to their desks, quietly and industriously going about their business, an office is not functioning as it should. That's because innovation--the heart of the knowledge economy--is fundamentally social. Ideas arise as much out of casual conversations as they do out of formal meetings. More precisely, as one study after another has demonstrated, the best ideas in any workplace arise out of casual contacts among different groups within the same company. If you are designing widgets for Acme.com, for instance, it is unlikely that a breakthrough idea is going to come from someone else on the widget team: after all, the other team members are as blinkered by the day-to-day demands of dealing with the existing product as you are. Someone from outside Acme.com--your old engineering professor, or a guy you used to work with at Apex.com--isn't going to be that helpful, either. A person like that doesn't know enough about Acme's widgets to have a truly useful idea. The most useful insights are likely to come from someone in customer service, who hears firsthand what widget customers have to say, or from someone in marketing, who has wrestled with the problem of how to explain widgets to new users, or from someone who used to work on widgets a few years back and whose work on another Acme product has given him a fresh perspective. Innovation comes from the interactions of people at a comfortable distance from one another, neither too close nor too far. This is why--quite apart from the matter of logistics and efficiency--companies have offices to begin with. They go to the trouble of gathering their employees under one roof because they want the widget designers to bump into the people in marketing and the people in customer service and the guy who moved to another department a few years back.
The catch is that getting people in an office to bump into people from another department is not so easy as it looks. In the sixties and seventies, a researcher at M.I.T. named Thomas Allen conducted a decade-long study of the way in which engineers communicated in research-and-development laboratories. Allen found that the likelihood that any two people will communicate drops off dramatically as the distance between their desks increases: we are four times as likely to communicate with someone who sits six feet away from us as we are with someone who sits sixty feet away. And people seated more than seventy-five feet apart hardly talk at all.
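Allen's numbers describe a steep decay curve. The Python snippet below is a rough illustration only: the power-law shape is an assumption of mine, not Allen's, and it is pinned to nothing more than the two figures quoted above (four times as likely at six feet as at sixty).

import math

# Two figures from Allen's study, as quoted above: desks six feet apart are about
# four times as likely to produce a conversation as desks sixty feet apart.
d_near, d_far, ratio = 6.0, 60.0, 4.0

# Assume, purely for illustration, a power-law fall-off p(d) proportional to d**(-k);
# the two data points then fix the exponent k (about 0.6).
k = math.log(ratio) / math.log(d_far / d_near)

def relative_likelihood(distance_ft, reference_ft=6.0):
    # Likelihood of communication relative to a colleague six feet away.
    return (reference_ft / distance_ft) ** k

for d in (6, 30, 60, 75):
    print(f"{d:>2} ft apart: {relative_likelihood(d):.2f}")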
Allen's second finding was even more disturbing. When the engineers weren't talking to those in their immediate vicinity, many of them spent their time talking to people outside their company--to their old computer-science professor or the guy they used to work with at Apple. He concluded that it was actually easier to make the outside call than to walk across the room. If you constantly ask for advice or guidance from people inside your organization, after all, you risk losing prestige. Your colleagues might think you are incompetent. The people you keep asking for advice might get annoyed at you. Calling an outsider avoids these problems. "The engineer can easily excuse his lack of knowledge by pretending to be an 'expert in something else' who needs some help in 'broadening into this new area,'" Allen wrote. He did his study in the days before E-mail and the Internet, but the advent of digital communication has made these problems worse. Allen's engineers were far too willing to go outside the company for advice and new ideas. E-mail makes it even easier to talk to people outside the company.
The task of the office, then, is to invite a particular kind of social interaction--the casual, nonthreatening encounter that makes it easy for relative strangers to talk to each other. Offices need the sort of social milieu that Jane Jacobs found on the sidewalks of the West Village. "It is possible in a city street neighborhood to know all kinds of people without unwelcome entanglements, without boredom, necessity for excuses, explanations, fears of giving offense, embarrassments respecting impositions or commitments, and all such paraphernalia of obligations which can accompany less limited relationships," Jacobs wrote. If you substitute "office" for "city street neighborhood," that sentence becomes the perfect statement of what the modern employer wants from the workplace.
3.
Imagine a classic big-city office tower, with a floor plate of a hundred and eighty feet by a hundred and eighty feet. The center part of every floor is given over to the guts of the building: elevators, bathrooms, electrical and plumbing systems. Around the core are cubicles and interior offices, for support staff and lower management. And around the edges of the floor, against the windows, are rows of offices for senior staff, each room perhaps two hundred or two hundred and fifty square feet. The best research about office communication tells us that there is almost no worse way to lay out an office. The executive in one corner office will seldom bump into any other executive in a corner office. Indeed, stringing the exterior offices out along the windows guarantees that there will be very few people within the critical sixty-foot radius of those offices. To maximize the amount of contact among employees, you really ought to put the most valuable staff members in the center of the room, where the highest number of people can be within their orbit. Or, even better, put all places where people tend to congregate--the public areas--in the center, so they can draw from as many disparate parts of the company as possible. Is it any wonder that creative firms often prefer loft-style buildings, which have usable centers?
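The geometry behind that claim is easy to check. Here is a minimal Python sketch, assuming for illustration a uniform spread of desks across the hundred-and-eighty-foot floor plate described above, comparing how much of the floor lies within the critical sixty-foot radius of a corner office and of a desk in the middle of the room.

import math

FLOOR_FT = 180.0   # floor plate from the example above
RADIUS_FT = 60.0   # the critical communication radius

def floor_coverage(x, y, samples=300):
    # Fraction of the floor within RADIUS_FT of the point (x, y), by grid sampling.
    inside = 0
    for i in range(samples):
        for j in range(samples):
            px = (i + 0.5) * FLOOR_FT / samples
            py = (j + 0.5) * FLOOR_FT / samples
            if math.hypot(px - x, py - y) <= RADIUS_FT:
                inside += 1
    return inside / (samples * samples)

print("corner office:      ", round(floor_coverage(0.0, 0.0), 2))      # roughly 0.09 of the floor
print("center of the floor:", round(floor_coverage(90.0, 90.0), 2))    # roughly 0.35 of the floor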
Another way to increase communication is to have as few private offices as possible. The idea is to exchange private space for public space, just as in the West Village, where residents agree to live in tiny apartments in exchange for a wealth of nearby cafés and stores and bars and parks. The West Village forces its residents outdoors. Few people, for example, have a washer and dryer in their apartment, and so even laundry is necessarily a social event: you have to take your clothes to the laundromat down the street. In the office equivalent, designers force employees to move around, too. They build in "functional inefficiencies"; they put kitchens and copiers and printers and libraries in places that can be reached only by a circuitous journey.
A more direct approach is to create an office so flexible that the kinds of people who need to spontaneously interact can actually be brought together. For example, the Ford Motor Company, along with a group of researchers from the University of Michigan, recently conducted a pilot project on the effectiveness of "war rooms" in software development. Previously, someone inside the company who needed a new piece of software written would have a series of meetings with the company's programmers, and the client and the programmers would send messages back and forth. In the war-room study, the company moved the client, the programmers, and a manager into a dedicated conference room, and made them stay there until the project was done. Using the war room cut the software-development time by two-thirds, in part because there was far less time wasted on formal meetings or calls outside the building: the people who ought to have been bumping into each other were now sitting next to each other.
Two years ago, the advertising agency TBWA\Chiat\Day moved into new offices in Los Angeles, out near the airport. In the preceding years, the firm had been engaged in a radical, and in some ways disastrous, experiment with a "nonterritorial" office: no one had a desk or any office equipment of his own. It was a scheme that courted failure by neglecting all the ways in which an office is a sort of neighborhood. By contrast, the new office is an almost perfect embodiment of Jacobsian principles of community. The agency is in a huge old warehouse, three stories high and the size of three football fields. It is informally known as Advertising City, and that's what it is: a kind of artfully constructed urban neighborhood. The floor is bisected by a central corridor called Main Street, and in the center of the room is an open space, with café tables and a stand of ficus trees, called Central Park. There's a basketball court, a game room, and a bar. Most of the employees are in snug workstations known as nests, and the nests are grouped together in neighborhoods that radiate from Main Street like Paris arrondissements. The top executives are situated in the middle of the room. The desk belonging to the chairman and creative director of the company looks out on Central Park. The offices of the chief financial officer and the media director abut the basketball court. Sprinkled throughout the building are meeting rooms and project areas and plenty of nooks where employees can closet themselves when they need to. A small part of the building is elevated above the main floor on a mezzanine, and if you stand there and watch the people wander about with their portable phones, and sit and chat in Central Park, and play basketball in the gym, and you feel on your shoulders the sun from the skylights and listen to the gentle buzz of human activity, it is quite possible to forget that you are looking at an office.
4.
In "The Death and Life of Great American Cities," Jacobs wrote of the importance of what she called "public characters"--people who have the social position and skills to orchestrate the movement of information and the creation of bonds of trust:
A public character is anyone who is in frequent contact with a wide circle of people and who is sufficiently interested to make himself a public character....The director of a settlement on New York's Lower East Side, as an example, makes a regular round of stores. He learns from the cleaner who does his suits about the presence of dope pushers in the neighborhood. He learns from the grocer that the Dragons are working up to something and need attention. He learns from the candy store that two girls are agitating the Sportsmen toward a rumble. One of his most important information spots is an unused breadbox on Rivington Street.... A message spoken there for any teen-ager within many blocks will reach his ears unerringly and surprisingly quickly, and the opposite flow along the grapevine similarly brings news quickly in to the breadbox.
A vital community, in Jacobs's view, required more than the appropriate physical environment. It also required a certain kind of person, who could bind together the varied elements of street life. Offices are no different. In fact, as office designers have attempted to create more vital workplaces, they have become increasingly interested in identifying and encouraging public characters.
One of the pioneers in this way of analyzing offices is Karen Stephenson, a business-school professor and anthropologist who runs a New York-based consulting company called Netform. Stephenson studies social networks. She goes into a company--her clients include J.P. Morgan, the Los Angeles Police Department, T.R.W., and I.B.M.--and distributes a questionnaire to its employees, asking about which people they have contact with. Whom do you like to spend time with? Whom do you talk to about new ideas? Where do you go to get expert advice? Every name in the company becomes a dot on a graph, and Stephenson draws lines between all those who have regular contact with each other. Stephenson likens her graphs to X-rays, and her role to that of a radiologist. What she's depicting is the firm's invisible inner mechanisms, the relationships and networks and patterns of trust that arise as people work together over time, and that are hidden beneath the organization chart. Once, for example, Stephenson was doing an "X-ray" of a Head Start organization. The agency was mostly female, and when Stephenson analyzed her networks she found that new hires and male staffers were profoundly isolated, communicating with the rest of the organization through only a handful of women. "I looked at tenure in the organization, office ties, demographic data. I couldn't see what tied the women together, and why the men were talking only to these women," Stephenson recalls. "Nor could the president of the organization. She gave me a couple of ideas. She said, 'Sorry I can't figure it out.' Finally, she asked me to read the names again, and I could hear her stop, and she said, 'My God, I know what it is. All those women are smokers.'" The X-ray revealed that the men--locked out of the formal power structure of the organization--were trying to gain access and influence by hanging out in the smoking area with some of the more senior women.
What Stephenson's X-rays do best, though, is tell you who the public characters are. In every network, there are always one or two people who have connections to many more people than anyone else. Stephenson calls them "hubs," and on her charts lines radiate out from them like spokes on a wheel. (Bernie the candy-store owner, on Jacobs's Hudson Street, was a hub.) A few people are also what Stephenson calls "gatekeepers": they control access to critical people, and link together a strategic few disparate groups. Finally, if you analyze the graphs, there are always people who seem to have lots of indirect links to other people--who are part of all sorts of networks without necessarily being in the center of them. Stephenson calls those people "pulsetakers." (In Silicon Valley-speak, the person in a sea of cubicles who pops his or her head up over the partition every time something interesting is going on is called a prairie dog: prairie dogs are pulsetakers.)
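Hubs and gatekeepers map neatly onto two standard measures from network analysis. The Python sketch below is an illustration only: the names and contacts are invented, and degree and betweenness centrality are stand-ins for whatever scoring Stephenson actually uses.

import networkx as nx

# An invented office contact network, purely for illustration.
contacts = [
    ("Ann", "Bernie"), ("Bob", "Bernie"), ("Carla", "Bernie"), ("Dev", "Bernie"),
    ("Ann", "Bob"), ("Carla", "Dev"),                        # a tight-knit group around Bernie
    ("Frank", "Gita"), ("Frank", "Hal"), ("Gita", "Hal"),    # a second group
    ("Carla", "Edna"), ("Edna", "Frank"),                    # Edna links the two groups
]
g = nx.Graph(contacts)

# A hub talks to many more people than anyone else: the highest number of contacts.
degree = dict(g.degree())
hub = max(degree, key=degree.get)                    # Bernie, in this toy graph

# A gatekeeper sits on the paths that connect otherwise separate groups:
# the highest betweenness centrality.
betweenness = nx.betweenness_centrality(g)
gatekeeper = max(betweenness, key=betweenness.get)   # Carla, who carries the bridge traffic

print("hub:", hub, "gatekeeper:", gatekeeper)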
5.
In the past year, Stephenson has embarked on a partnership with Steelcase, the world's largest manufacturer of office furniture, in order to use her techniques in the design of offices. Traditionally, office designers would tell a company what furniture should go where. Stephenson and her partners at Steelcase propose to tell a company what people should go where, too. At Steelcase, they call this "floor-casting."
One of the first projects for the group is the executive level at Steelcase's headquarters, a five-story building in Grand Rapids, Michigan. The executive level, on the fourth floor, is a large, open room filled with small workstations. (Jim Hackett, the head of the company, occupies what Steelcase calls a Personal Harbor, a black, freestanding metal module that may be--at seven feet by eight--the smallest office of a Fortune 500 C.E.O.) One afternoon recently, Stephenson pulled out a laptop and demonstrated how she had mapped the communication networks of the leadership group onto a seating chart of the fourth floor. The dots and swirls are strangely compelling--abstract representations of something real and immediate. One executive, close to Hackett, was inundated with lines from every direction. "He's a hub, a gatekeeper, and a pulsetaker across all sorts of different dimensions," Stephenson said. "What that tells you is that he is very strategic. If there is no succession planning around that person, you have got a huge risk to the knowledge base of this company. If he's in a plane accident, there goes your knowledge." She pointed to another part of the floor plan, with its own thick overlay of lines. "That's sales and marketing. They have a pocket of real innovation here. The guy who runs it is very good, very smart." But then she pointed to the lines connecting that department with other departments. "They're all coming into this one place," she said, and she showed how all the lines coming out of marketing converged on one senior executive. "There's very little path redundancy. In human systems, you need redundancy, you need communication across multiple paths." What concerned Stephenson wasn't just the lack of redundancy but the fact that, in her lingo, many of the paths were "unconfirmed": they went only one way. People in marketing were saying that they communicated with the senior management, but there weren't as many lines going in the other direction. The sales-and-marketing team, she explained, had somehow become isolated from senior management. They couldn't get their voices heard when it came to innovation--and that fact, she said, ought to be a big consideration when it comes time to redo the office. "If you ask the guy who heads sales and marketing who he wants to sit next to, he'll pick out all the people he trusts," she said. "But do you sit him with those people? No. What you want to do is put people who don't trust each other near each other. Not necessarily next to each other, because they get too close. But close enough so that when you pop your head up, you get to see people, they are in your path, and all of a sudden you build an inviting space where they can hang out, kitchens and things like that. Maybe they need to take a hub in an innovation network and place the person with a pulsetaker in an expert network--to get that knowledge indirectly communicated to a lot of people."
The work of translating Stephenson's insights onto a new floor plan is being done in a small conference room--a war room--on the second floor of Steelcase headquarters. The group consists of a few key people from different parts of the firm, such as human resources, design, technology, and space-planning research. The walls of the room are cluttered with diagrams and pictures and calculations and huge, blown-up versions of Stephenson's X-rays. Team members stress that what they are doing is experimental. They don't know yet how directly they want to translate findings from the communications networks to office plans. After all, you don't want to have to redo the entire office every time someone leaves or joins the company. But it's clear that there are some very simple principles from the study of public characters which ought to drive the design process. "You want to place hubs at the center," Joyce Bromberg, the director of space planning, says. "These are the ones other people go to in order to get information. Give them an environment that allows access. But there are also going to be times that they need to have control--so give them a place where they can get away. Gatekeepers represent the fit between groups. They transmit ideas. They are brokers, so you might want to put them at the perimeter, and give them front porches"--areas adjoining the workspace where you might put little tables and chairs. "Maybe they could have swinging doors with white boards, to better transmit information. As for pulsetakers, they are the roamers. Rather than give them one fixed work location, you might give them a series of touchdown spots--where you want them to stop and talk. You want to enable their meandering."
One of the other team members was a tall, thoughtful man named Frank Graziano. He had a series of pencil drawings--with circles representing workstations of all the people whose minds, as he put it, he wanted to make "explicit." He said that he had done the plan the night before. "I think we can thread innovation through the floor," he went on, and with a pen drew a red line that wound its way through the maze of desks. It was his Hudson Street.
6.
"The Death and Life of Great American Cities" was a controversial book, largely because there was always a whiff of paternalism in Jacobs's vision of what city life ought to be. Chelsea--the neighborhood directly to the north of her beloved West Village--had "mixtures and types of buildings and densities of dwelling units per acre... almost identical with those of Greenwich Village," she noted. But its long-predicted renaissance would never happen, she maintained, because of the "barriers of long, self-isolating blocks." She hated Chatham Village, a planned "garden city" development in Pittsburgh. It was a picturesque green enclave, but it suffered, in Jacobs's analysis, from a lack of sidewalk life. She wasn't concerned that some people might not want an active street life in their neighborhood; that what she saw as the "self-isolating blocks" of Chelsea others would see as a welcome respite from the bustle of the city, or that Chatham Village would appeal to some people precisely because one did not encounter on its sidewalks a "solid wave of conversation and animation." Jacobs felt that city dwellers belonged in environments like the West Village, whether they realized it or not.
The new workplace designers are making the same calculation, of course. The point of the new offices is to compel us to behave and socialize in ways that we otherwise would not--to overcome our initial inclination to be office suburbanites. But, in all the studies of the new workplaces, the reservations that employees have about a more social environment tend to diminish once they try it. Human behavior, after all, is shaped by context, but how it is shaped--and whether we'll be happy with the result--we can understand only with experience. Jane Jacobs knew the virtues of the West Village because she lived there. What she couldn't know was that her ideas about community would ultimately make more sense in the workplace. From time to time, social critics have bemoaned the falling rates of community participation in American life, but they have made the same mistake. The reason Americans are content to bowl alone (or, for that matter, not bowl at all) is that, increasingly, they receive all the social support they need--all the serendipitous interactions that serve to make them happy and productive--from nine to five.
The Trouble with Fries
March 5, 2001
ANNALS OF EATING
Fast food is killing us. Can it be fixed?
1.
In 1954, a man named Ray Kroc, who made his living selling the five-spindle Multimixer milkshake machine, began hearing about a hamburger stand in San Bernardino, California. This particular restaurant, he was told, had no fewer than eight of his machines in operation, meaning that it could make forty shakes simultaneously. Kroc was astounded. He flew from Chicago to Los Angeles, and drove to San Bernardino, sixty miles away, where he found a small octagonal building on a corner lot. He sat in his car and watched as the workers showed up for the morning shift. They were in starched white shirts and paper hats, and moved with a purposeful discipline. As lunchtime approached, customers began streaming into the parking lot, lining up for bags of hamburgers. Kroc approached a strawberry blonde in a yellow convertible.
"How often do you come here?" he asked.
"Anytime I am in the neighborhood," she replied, and, Kroc would say later, "it was not her sex appeal but the obvious relish with which she devoured the hamburger that made my pulse begin to hammer with excitement." He came back the next morning, and this time set up inside the kitchen, watching the griddle man, the food preparers, and, above all, the French-fry operation, because it was the French fries that truly captured his imagination. They were made from top-quality oblong Idaho russets, eight ounces apiece, deep-fried to a golden brown, and salted with a shaker that, as he put it, kept going like a Salvation Army girl's tambourine. They were crispy on the outside and buttery soft on the inside, and that day Kroc had a vision of a chain of restaurants, just like the one in San Bernardino, selling golden fries from one end of the country to the other. He asked the two brothers who owned the hamburger stand if he could buy their franchise rights. They said yes. Their names were Mac and Dick McDonald.
Ray Kroc was the great visionary of American fast food, the one who brought the lessons of the manufacturing world to the restaurant business. Before the fifties, it was impossible, in most American towns, to buy fries of consistent quality. Ray Kroc was the man who changed that. "The french fry," he once wrote, "would become almost sacrosanct for me, its preparation a ritual to be followed religiously." A potato that has too great a percentage of water--and potatoes, even the standard Idaho russet burbank, vary widely in their water content--will come out soggy at the end of the frying process. It was Kroc, back in the fifties, who sent out field men, armed with hydrometers, to make sure that all his suppliers were producing potatoes in the optimal solids range of twenty to twenty-three per cent. Freshly harvested potatoes, furthermore, are rich in sugars, and if you slice them up and deep-fry them the sugars will caramelize and brown the outside of the fry long before the inside is cooked. To make a crisp French fry, a potato has to be stored at a warm temperature for several weeks in order to convert those sugars to starch. Here Kroc led the way as well, mastering the art of "curing" potatoes by storing them under a giant fan in the basement of his first restaurant, outside Chicago.
Perhaps his most enduring achievement, though, was the so-called potato computer--developed for McDonald's by a former electrical engineer for Motorola named Louis Martino--which precisely calibrated the optimal cooking time for a batch of fries. (The key: when a batch of cold raw potatoes is dumped into a vat of cooking oil, the temperature of the fat will drop and then slowly rise. Once the oil has risen three degrees, the fries are ready.) Previously, making high-quality French fries had been an art. The potato computer, the hydrometer, and the curing bins made it a science. By the time Kroc was finished, he had figured out how to turn potatoes into an inexpensive snack that would always be hot, salty, flavorful, and crisp, no matter where or when you bought it.
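The rule in that parenthesis amounts to a very small control loop. The Python sketch below simply restates the logic as described; read_oil_temp is a hypothetical placeholder for a thermometer, not anything Martino's device actually exposed.

import time

def fries_are_done(read_oil_temp, poll_seconds=1.0):
    # The rule described above: dropping cold potatoes into the oil makes the
    # temperature fall; track the bottom of that dip, and declare the batch done
    # once the oil has climbed three degrees back up from it.
    lowest = read_oil_temp()
    while True:
        time.sleep(poll_seconds)
        temp = read_oil_temp()
        lowest = min(lowest, temp)
        if temp >= lowest + 3.0:
            return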
This was the first fast-food revolution--the mass production of food that had reliable mass appeal. But today, as the McDonald's franchise approaches its fiftieth anniversary, it is clear that fast food needs a second revolution. As many Americans now die every year from obesity-related illnesses--heart disease and complications of diabetes--as from smoking, and the fast-food toll grows heavier every year. In the fine new book "Fast Food Nation," the journalist Eric Schlosser writes of McDonald's and Burger King in the tone usually reserved for chemical companies, sweatshops, and arms dealers, and, as shocking as that seems at first, it is perfectly appropriate. Ray Kroc's French fries are killing us. Can fast food be fixed?
2.
Fast-food French fries are made from a baking potato like an Idaho russet, or any other variety that is mealy, or starchy, rather than waxy. The potatoes are harvested, cured, washed, peeled, sliced, and then blanched--cooked enough so that the insides have a fluffy texture but not so much that the fry gets soft and breaks. Blanching is followed by drying, and drying by a thirty-second deep fry, to give the potatoes a crisp shell. Then the fries are frozen until the moment of service, when they are deep-fried again, this time for somewhere around three minutes. Depending on the fast-food chain involved, there are other steps interspersed in this process. McDonald's fries, for example, are briefly dipped in a sugar solution, which gives them their golden-brown color; Burger King fries are dipped in a starch batter, which is what gives those fries their distinctive hard shell and audible crunch. But the result is similar. The potato that is first harvested in the field is roughly eighty per cent water. The process of creating a French fry consists, essentially, of removing as much of that water as possible--through blanching, drying, and deep-frying--and replacing it with fat.
Elisabeth Rozin, in her book "The Primal Cheeseburger," points out that the idea of enriching carbohydrates with fat is nothing new. It's a standard part of the cuisine of almost every culture. Bread is buttered; macaroni comes with cheese; dumplings are fried; potatoes are scalloped, baked with milk and cheese, cooked in the dripping of roasting meat, mixed with mayonnaise in a salad, or pan-fried in butterfat as latkes. But, as Rozin argues, deep-frying is in many ways the ideal method of adding fat to carbohydrates. If you put butter on a mashed potato, for instance, the result is texturally unexciting: it simply creates a mush. Pan-frying results in uneven browning and crispness. But when a potato is deep-fried the heat of the oil turns the water inside the potato into steam, which causes the hard granules of starch inside the potato to swell and soften: that's why the inside of the fry is fluffy and light. At the same time, the outward migration of the steam limits the amount of oil that seeps into the interior, preventing the fry from getting greasy and concentrating the oil on the surface, where it turns the outer layer of the potato brown and crisp. "What we have with the french fry," Rozin writes, "is a near perfect enactment of the enriching of a starch food with oil or fat."
This is the trouble with the French fry. The fact that it is cooked in fat makes it unhealthy. But the contrast that deep-frying creates between its interior and its exterior--between the golden shell and the pillowy whiteness beneath--is what makes it so irresistible. The average American now eats a staggering thirty pounds of French fries a year, up from four pounds when Ray Kroc was first figuring out how to mass-produce a crisp fry. Meanwhile, fries themselves have become less healthful. Ray Kroc, in the early days of McDonald's, was a fan of a hot-dog stand on the North Side of Chicago called Sam's, which used what was then called the Chicago method of cooking fries. Sam's cooked its fries in animal fat, and Kroc followed suit, prescribing for his franchises a specially formulated beef tallow called Formula 47 (in reference to the forty-seven-cent McDonald's "All-American meal" of the era: fifteen-cent hamburger, twelve-cent fries, twenty-cent shake). Among aficionados, there is general agreement that those early McDonald's fries were the finest mass-market fries ever made: the beef tallow gave them an unsurpassed rich, buttery taste. But in 1990, in the face of public concern about the health risks of cholesterol in animal-based cooking oil, McDonald's and the other major fast-food houses switched to vegetable oil. That wasn't an improvement, however. In the course of making vegetable oil suitable for deep frying, it is subjected to a chemical process called hydrogenation, which creates a new substance called a trans unsaturated fat. In the hierarchy of fats, polyunsaturated fats--the kind found in regular vegetable oils--are the good kind; they lower your cholesterol. Saturated fats are the bad kind. But trans fats are worse: they wreak havoc with the body's ability to regulate cholesterol.
According to a recent study involving some eighty thousand women, for every five-per-cent increase in the amount of saturated fats that a woman consumes, her risk of heart disease increases by seventeen per cent. But only a two-per-cent increase in trans fats will increase her heart-disease risk by ninety-three per cent. Walter Willett, an epidemiologist at Harvard--who helped design the study--estimates that the consumption of trans fats in the United States probably causes about thirty thousand premature deaths a year.
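Put the two figures on the same footing and the gap is stark. The arithmetic below reads the study's numbers linearly, per percentage point of intake, purely for illustration.

# Figures quoted from the study above, read linearly for illustration only.
saturated_risk_per_point = 17 / 5   # +3.4% heart-disease risk per point of saturated fat
trans_risk_per_point = 93 / 2       # +46.5% risk per point of trans fat

print(round(trans_risk_per_point / saturated_risk_per_point, 1))   # about 13.7x worse, point for point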
McDonald's and the other fast-food houses aren't the only purveyors of trans fats, of course; trans fats are in crackers and potato chips and cookies and any number of other processed foods. Still, a lot of us get a great deal of our trans fats from French fries, and to read the medical evidence on trans fats is to wonder at the odd selectivity of the outrage that consumers and the legal profession direct at corporate behavior. McDonald's and Burger King and Wendy's have switched to a product, without disclosing its risks, that may cost human lives. What is the difference between this and the kind of thing over which consumers sue companies every day?
3.
The French-fry problem ought to have a simple solution: cook fries in oil that isn't so dangerous. Oils that are rich in monounsaturated fats, like canola oil, aren't nearly as bad for you as saturated fats, and are generally stable enough for deep-frying. It's also possible to "fix" animal fats so that they aren't so problematic. For example, K. C. Hayes, a nutritionist at Brandeis University, has helped develop an oil called Appetize. It's largely beef tallow, which gives it a big taste advantage over vegetable shortening, and makes it stable enough for deep-frying. But it has been processed to remove the cholesterol, and has been blended with pure corn oil, in a combination that Hayes says removes much of the heart-disease risk.
Perhaps the most elegant solution would be for McDonald's and the other chains to cook their fries in something like Olestra, a fat substitute developed by Procter & Gamble. Ordinary fats are built out of a molecular structure known as a triglyceride: it's a microscopic tree, with a trunk made of glycerol and three branches made of fatty acids. Our bodies can't absorb triglycerides, so in the digestive process each of the branches is broken off by enzymes and absorbed separately. In the production of Olestra, the glycerol trunk of a fat is replaced with a sugar, which has room for not three but eight fatty acids. And our enzymes are unable to break down a fat tree with eight branches--so the Olestra molecule can't be absorbed by the body at all. "Olestra" is as much a process as a compound: you can create an "Olestra" version of any given fat. Potato chips, for instance, tend to be fried in cottonseed oil, because of its distinctively clean taste. Frito-Lay's no-fat Wow! chips are made with an Olestra version of cottonseed oil, which behaves just like regular cottonseed oil except that it's never digested. A regular serving of potato chips has a hundred and fifty calories, ninety of which are fat calories from the cooking oil. A serving of Wow! chips has seventy-five calories and no fat. If Procter & Gamble were to seek F.D.A. approval for the use of Olestra in commercial deep-frying (which it has not yet done), it could make an Olestra version of the old McDonald's Formula 47, which would deliver every nuance of the old buttery, meaty tallow at a fraction of the calories.
Olestra, it must be said, does have some drawbacks--in particular, a reputation for what is delicately called "gastrointestinal distress." The F.D.A. has required all Olestra products to carry a somewhat daunting label saying that they may cause "cramping and loose stools." Not surprisingly, sales have been disappointing, and Olestra has never won the full acceptance of the nutrition community. Most of this concern, however, appears to be overstated. Procter & Gamble has done randomized, double-blind studies--one of which involved more than three thousand people over six weeks--and found that people eating typical amounts of Olestra-based chips don't have significantly more gastrointestinal problems than people eating normal chips. Diarrhea is such a common problem in America--nearly a third of adults have at least one episode each month--that even F.D.A. regulators now appear to be convinced that in many of the complaints they received Olestra was unfairly blamed for a problem that was probably caused by something else. The agency has promised Procter & Gamble that the warning label will be reviewed.
Perhaps the best way to put the Olestra controversy into perspective is to compare it to fibre. Fibre is vegetable matter that goes right through you: it's not absorbed by the gastrointestinal tract. Nutritionists tell us to eat it because it helps us lose weight and it lowers cholesterol--even though if you eat too many baked beans or too many bowls of oat bran you will suffer the consequences. Do we put warning labels on boxes of oat bran? No, because the benefits of fibre clearly outweigh its drawbacks. Research has suggested that Olestra, like fibre, helps people lose weight and lowers cholesterol; too much Olestra, like too much fibre, may cause problems. (Actually, too much Olestra may not be as troublesome as too much bran. According to Procter & Gamble, eating a large amount of Olestra--forty grams--causes no more problems than eating a small bowl--twenty grams--of wheat bran.) If we had Olestra fries, then, they shouldn't be eaten for breakfast, lunch, and dinner. In fact, fast-food houses probably shouldn't use hundred-per-cent Olestra; they should cook their fries in a blend, using the Olestra to displace the most dangerous trans and saturated fats. But these are minor details. The point is that it is entirely possible, right now, to make a delicious French fry that does not carry with it a death sentence. A French fry can be much more than a delivery vehicle for fat.
4.
Is it really that simple, though? Consider the cautionary tale of the efforts of a group of food scientists at Auburn University, in Alabama, more than a decade ago to come up with a better hamburger. The Auburn team wanted to create a leaner beef that tasted as good as regular ground beef. They couldn't just remove the fat, because that would leave the meat dry and mealy. They wanted to replace the fat. "If you look at ground beef, it contains moisture, fat, and protein," says Dale Huffman, one of the scientists who spearheaded the Auburn project. "Protein is relatively constant in all beef, at about twenty per cent. The traditional McDonald's ground beef is around twenty per cent fat. The remainder is water. So you have an inverse ratio of water and fat. If you reduce fat, you need to increase water." The goal of the Auburn scientists was to cut about two-thirds of the fat from normal ground beef, which meant that they needed to find something to add to the beef that would hold an equivalent amount of water--and continue to retain that water even as the beef was being grilled. Their choice? Seaweed, or, more precisely, carrageenan. "It's been in use for centuries," Huffman explains. "It's the stuff that keeps the suspension in chocolate milk--otherwise the chocolate would settle at the bottom. It has tremendous water-holding ability. There's a loose bond between the carrageenan and the moisture." They also selected some basic flavor enhancers, designed to make up for the lost fat "taste." The result was a beef patty that was roughly three-quarters water, twenty per cent protein, five per cent or so fat, and a quarter of a per cent seaweed. They called it AU Lean.
It didn't take the Auburn scientists long to realize that they had created something special. They installed a test kitchen in their laboratory, got hold of a McDonald's grill, and began doing blind taste comparisons of AU Lean burgers and traditional twenty-per-cent-fat burgers. Time after time, the AU Lean burgers won. Next, they took their invention into the field. They recruited a hundred families and supplied them with three kinds of ground beef for home cooking over consecutive three-week intervals--regular "market" ground beef with twenty per cent fat, ground beef with five per cent fat, and AU Lean. The families were asked to rate the different kinds of beef, without knowing which was which. Again, the AU Lean won hands down--trumping the other two on "likability," "tenderness," "flavorfulness," and "juiciness."
What the Auburn team showed was that, even though people love the taste and feel of fat--and naturally gravitate toward high-fat food--they can be fooled into thinking that there is a lot of fat in something when there isn't. Adam Drewnowski, a nutritionist at the University of Washington, has found a similar effect with cookies. He did blind taste tests of normal and reduced-calorie brownies, biscotti, and chocolate-chip, oatmeal, and peanut-butter cookies. If you cut the sugar content of any of those cookies by twenty-five per cent, he found, people like the cookies much less. But if you cut the fat by twenty-five per cent they barely notice. "People are very finely attuned to how much sugar there is in a liquid or a solid," Drewnowski says. "For fat, there's no sensory break point. Fat comes in so many guises and so many textures it is very difficult to perceive how much is there." This doesn't mean we are oblivious of fat levels, of course. Huffman says that when his group tried to lower the fat in AU Lean below five per cent, people didn't like it anymore. But, within the relatively broad range of between five and twenty-five per cent, you can add water and some flavoring and most people can't tell the difference.
What's more, people appear to be more sensitive to the volume of food they consume than to its calorie content. Barbara Rolls, a nutritionist at Penn State, has demonstrated this principle with satiety studies. She feeds one group of people a high-volume snack and another group a low-volume snack. Even though the two snacks have the same calorie count, she finds that people who eat the high-volume snack feel more satisfied. "People tend to eat a constant weight or volume of food in a given day, not a constant portion of calories," she says. Eating AU Lean, in short, isn't going to leave you with a craving for more calories; you'll feel just as full.
For anyone looking to improve the quality of fast food, all this is heartening news. It means that you should be able to put low-fat cheese and low-fat mayonnaise in a Big Mac without anyone's complaining. It also means that there's no particular reason to use twenty-per-cent-fat ground beef in a fast-food burger. In 1990, using just this argument, the Auburn team suggested to McDonald's that it make a Big Mac out of AU Lean. Shortly thereafter, McDonald's came out with the McLean Deluxe. Other fast-food houses scrambled to follow suit. Nutritionists were delighted. And fast food appeared on the verge of a revolution.
Only, it wasn't. The McLean was a flop, and four years later it was off the market. What happened? Part of the problem appears to have been that McDonald's rushed the burger to market before many of the production kinks had been worked out. More important, though, was the psychological handicap the burger faced. People liked AU Lean in blind taste tests because they didn't know it was AU Lean; they were fooled into thinking it was regular ground beef. But nobody was fooled when it came to the McLean Deluxe. It was sold as the healthy choice--and who goes to McDonald's for health food?
Leann Birch, a developmental psychologist at Penn State, has looked at the impact of these sorts of expectations on children. In one experiment, she took a large group of kids and fed them a big lunch. Then she turned them loose in a room with lots of junk food. "What we see is that some kids eat almost nothing," she says. "But other kids really chow down, and one of the things that predicts how much they eat is the extent to which parents have restricted their access to high-fat, high-sugar food in the past: the more the kids have been restricted, the more they eat." Birch explains the results two ways. First, restricting food makes kids think not in terms of their own hunger but in terms of the presence and absence of food. As she puts it, "The kid is essentially saying, 'If the food's here I better get it while I can, whether or not I'm hungry.' We see these five-year-old kids eating as much as four hundred calories." Birch's second finding, though, is more important. Because the children on restricted diets had been told that junk food was bad for them, they clearly thought that it had to taste good. When it comes to junk food, we seem to follow an implicit script that powerfully biases the way we feel about food. We like fries not in spite of the fact that they're unhealthy but because of it.
That is sobering news for those interested in improving the American diet. For years, the nutrition movement in this country has made transparency one of its principal goals: it has assumed that the best way to help people improve their diets is to tell them precisely what's in their food, to label certain foods good and certain foods bad. But transparency can backfire, because sometimes nothing is more deadly for our taste buds than the knowledge that what we are eating is good for us. McDonald's should never have called its new offering the McLean Deluxe, in other words. They should have called it the Burger Supreme or the Monster Burger, and then buried the news about reduced calories and fat in the tiniest type on the remotest corner of their Web site. And if we were to cook fries in some high-tech, healthful cooking oil--whether Olestrized beef tallow or something else with a minimum of trans and saturated fats--the worst thing we could do would be to market them as healthy fries. They will not taste nearly as good if we do. They have to be marketed as better fries, as Classic Fries, as fries that bring back the rich tallowy taste of the original McDonald's.
What, after all, was Ray Kroc's biggest triumph? A case could be made for the field men with their hydrometers, or the potato-curing techniques, or the potato computer, which turned the making of French fries from an art into a science. But we should not forget Ronald McDonald, the clown who made the McDonald's name irresistible to legions of small children. Kroc understood that taste comprises not merely the food on our plate but also the associations and assumptions and prejudices we bring to the table--that half the battle in making kids happy with their meal was calling what they were eating a Happy Meal. The marketing of healthful fast food will require the same degree of subtlety and sophistication. The nutrition movement keeps looking for a crusader--someone who will bring about better public education and tougher government regulations. But we need much more than that. We need another Ray Kroc.
Wrong Turn
June 11, 2001
A REPORTER AT LARGE
How the fight to make America's
highways safer went off course.
I. BANG
Every two miles, the average driver makes four hundred observations, forty decisions, and one mistake. Once every five hundred miles, one of those mistakes leads to a near collision, and once every sixty-one thousand miles one of those mistakes leads to a crash. When people drive, in other words, mistakes are endemic and accidents inevitable, and that is the first and simplest explanation for what happened to Robert Day on the morning of Saturday, April 9, 1994. He was driving a 1980 Jeep Wagoneer from his home, outside Philadelphia, to spend a day working on train engines in Winslow Township, New Jersey. He was forty-four years old, and made his living as an editor for the Chilton Book Company. His ten-year-old son was next to him, in the passenger seat. It was a bright, beautiful spring day. Visibility was perfect, and the roadway was dry, although one of the many peculiarities of car crashes is that they happen more often under ideal road conditions than in bad weather. Day's route took him down the Atlantic City Expressway to Fleming Pike, a two-lane country road that winds around a sharp curve and intersects, about a mile later, with Egg Harbor Road. In that final stretch of Fleming Pike, there is a scattering of houses and a fairly thick stand of trees on either side of the road, obscuring all sight lines to the left and right. As he approached the intersection, then, Day could not have seen a blue-and-gray 1993 Ford Aerostar minivan travelling between forty and fifty miles per hour southbound on Egg Harbor, nor a white 1984 Mazda 626 travelling at approximately fifty miles per hour in the other direction. Nor, apparently, did he see the stop sign at the corner, or the sign a tenth of a mile before that, warning of the intersection ahead. Day's son, in the confusing aftermath of the accident, told police that he was certain his father had come to a stop at the corner. But the accident's principal witness says he never saw any brake lights on the Wagoneer, and, besides, there is no way that the Jeep could have done the damage that it did from a standing start. Perhaps Day was distracted. The witness says that Day's turn signal had been on since he left the expressway. Perhaps he was looking away and looked back at the road at the wrong time, since there is an area, a few hundred yards before Egg Harbor Road, just on the near side of a little ridge, where the trees and houses make it look as if Fleming Pike ran without interruption well off into the distance. We will never know, and in any case it does not matter much. Day merely did what all of us do every time we get in a car: he made a mistake. It's just that he was unlucky enough that his mistake led him directly into the path of two other cars.
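Those opening figures imply a ratio worth making explicit: if a driver makes a mistake every two miles, a near collision every five hundred miles means roughly one mistake in two hundred and fifty goes wrong, and a crash every sixty-one thousand miles means only about one mistake in thirty thousand ends in a crash. A minimal back-of-the-envelope sketch, in Python, using only the figures quoted above (the variable names are illustrative; the numbers are the article's, not new data):

# Back-of-the-envelope arithmetic from the figures quoted above; no new data.
miles_per_mistake = 2             # one mistake every two miles
miles_per_near_collision = 500    # one near collision every five hundred miles
miles_per_crash = 61000           # one crash every sixty-one thousand miles

mistakes_per_near_collision = miles_per_near_collision / miles_per_mistake   # 250
mistakes_per_crash = miles_per_crash / miles_per_mistake                     # 30,500
print(mistakes_per_near_collision, mistakes_per_crash)

That is the arithmetic behind the claim that mistakes are endemic and accidents inevitable: almost every error is harmless, but a driver makes so many of them that, sooner or later, one is not.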
The driver of the Ford Aerostar was Stephen Capoferri, then thirty-nine. He worked in the warehouse of Whitehall Laboratories, in southern New Jersey. He had just had breakfast with his parents and was on his way to the bank. The driver of the Mazda was Elizabeth Wolfrum. She was twenty-four. She worked as the manager of a liquor store. Her eighteen-year-old sister, Julie, was in the passenger seat; a two-year-old girl was in the back seat. Because of the vegetation on either side of Fleming Pike, Capoferri did not see Day's vehicle until it was just eighty-five feet from the point of impact, and if we assume that Day was travelling at forty miles per hour, or fifty-nine feet per second, that means that Capoferri had about 1.5 seconds to react. That is scarcely enough time. The average adult needs about that long simply to translate an observation ("That car is going awfully fast") into an action ("I ought to hit my brake"). Capoferri hit Day broadside, at a slight angle, the right passenger side of the Aerostar taking most of the impact. The Jeep was pushed sidewise, but it kept going forward, pulling off the grille and hood of the Aerostar, and sending it into a two-hundred-and-seventy-degree counterclockwise spin. As the Jeep lurched across the intersection, it slammed into the side of Wolfrum's Mazda. The cars slapped together, and then skidded together across the intersection, ending on the grass on the far, southeastern corner. According to documents filed by Elizabeth Wolfrum's lawyers, Wolfrum suffered eighteen injuries, including a ruptured spleen, multiple liver lacerations, brain damage, and fractures to the legs, ribs, ankles, and nose. Julie Wolfrum was partially ejected from the Mazda and her face hit the ground. She subsequently underwent seventeen separate surgical procedures and remained in intensive care for forty-four days. In post-crash photographs, their car looks as if it had been dropped head first from an airplane. Robert Day suffered massive internal injuries and was pronounced dead two hours later, at West Jersey Hospital. His son was bruised and shaken up. Capoferri walked away largely unscathed.
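The 1.5-second figure is straightforward arithmetic from the distance and speed given above; a minimal sketch, in Python, assuming the forty-mile-per-hour estimate is right (the variable names are illustrative):

# Reaction-time arithmetic from the figures above: 85 feet of visibility, 40 m.p.h.
distance_to_impact_ft = 85                  # how far Day's Jeep was from the point of impact
speed_mph = 40                              # Day's assumed speed
speed_fps = speed_mph * 5280 / 3600         # about 58.7 feet per second
time_to_react_s = distance_to_impact_ft / speed_fps
print(round(speed_fps, 1), round(time_to_react_s, 2))   # roughly 1.45 seconds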
"Once the impact occurred, I did a spin," he remembers. "I don't recall doing that. I may have blacked out. It couldn't have been for very long. I wanted to get out. I was trying to judge how I was. I was having a little trouble breathing. But I knew I could walk. My senses were gradually coming back to normal. I'm pretty sure I went to Day's vehicle first. I went to the driver's side. He was semi-conscious. He had blood coming out of his mouth. I tried to keep him awake. His son was in the passenger seat. He had no injuries. He said, 'Is my father O.K.?' I seem to remember looking in the Mazda. My first impression was that they were dead, because the driver's side of the vehicle was very badly smashed in. I think they needed the 'jaws of life' to get them out. There was a little girl in the back. She was crying."
Capoferri has long black hair and a beard and the build of a wrestler. He is a thoughtful man who chooses his words carefully. As he talked, he was driving his Taurus back toward the scene of the accident, and he was apologetic that he could not recall more details of those moments leading up to the accident. But what is there to remember? In the popular imagination--fuelled by the car crashes of Hollywood movies, with their special effects and complicated stunts--an accident is a protracted sequence, played out in slow motion, over many frames. It is not that way in real life. The time that elapsed between the collision of Capoferri and Day and Day and Wolfrum was probably no more than twenty-five milliseconds, faster than the blinking of an eye, and the time that elapsed between the moment Capoferri struck Day and the moment his van came to rest, two hundred and seventy degrees later, was probably no more than a second. Capoferri said that a friend of his, who lived right on the corner where the accident happened, told him later that all the crashing and spinning and skidding sounded like a single, sharp explosion--bang!
II. THE PASSIVE APPROACH
In the middle part of the last century, a man named William Haddon changed forever the way Americans think about car accidents. Haddon was, by training, a medical doctor and an epidemiologist and, by temperament, a New Englander--tall and reed-thin, with a crewcut, a starched white shirt, and a bow tie. He was exacting and cerebral, and so sensitive to criticism that it was said of him that he could be "blistered by moonbeams." He would not eat mayonnaise, or anything else subject to bacterial contamination. He hated lawyers, which was ironic, because it was lawyers who became his biggest disciples. Haddon was discovered by Daniel Patrick Moynihan, when Moynihan was working for Averell Harriman, then the Democratic governor of New York State. It was 1958. Moynihan was chairing a meeting on traffic safety, in Albany's old state-executive-office chambers, and a young man at the back of the room kept asking pointed questions. "What's your name?" Moynihan eventually asked, certain he had collared a Republican spy. "Haddon, sir," the young man answered. He was just out of the Harvard School of Public Health, and convinced that what the field of traffic safety needed was the rigor of epidemiology. Haddon asked Moynihan what data he was using. Moynihan shrugged. He wasn't using any data at all.
Haddon and Moynihan went across the street to Yezzi's, a local watering hole, and Moynihan fell under Haddon's spell. The orthodoxy of that time held that safety was about reducing accidents--educating drivers, training them, making them slow down. To Haddon, this approach made no sense. His goal was to reduce the injuries that accidents caused. In particular, he did not believe in safety measures that depended on changing the behavior of the driver, since he considered the driver unreliable, hard to educate, and prone to error. Haddon believed the best safety measures were passive. "He was a gentle man," Moynihan recalls. "Quiet, without being mum. He never forgot that what we were talking about were children with their heads smashed and broken bodies and dead people."
Several years later, Moynihan was working for President Johnson in the Department of Labor, and hired a young lawyer out of Harvard named Ralph Nader to work on traffic-safety issues. Nader, too, was a devotee of Haddon's ideas, and he converted a young congressional aide named Joan Claybrook. In 1959, Moynihan wrote an enormously influential article, articulating Haddon's principles, called "Epidemic on the Highways." In 1965, Nader wrote his own homage to the Haddon philosophy, "Unsafe at Any Speed," which became a best-seller, and in 1966 the Haddon crusade swept Washington. In the House and the Senate, there were packed hearings on legislation to create a federal regulatory agency for traffic safety. Moynihan and Haddon testified, as did a liability lawyer from South Carolina, in white shoes and a white suit, and a Teamsters official, Jimmy Hoffa, whom Claybrook remembers as a "fabulous" witness. It used to be that, during a frontal crash, steering columns in cars were pushed back through the passenger compartment, potentially impaling the driver. The advocates argued that columns should collapse inward on impact. Instrument panels ought to be padded, they said, and knobs shouldn't stick out, where they might cause injury. Doors ought to have strengthened side-impact beams. Roofs should be strong enough to withstand a rollover. Seats should have head restraints to protect against neck injuries. Windshields ought to be glazed, so that if you hit them with your head at high speed your face wasn't cut to ribbons. The bill sailed through both houses of Congress, and a regulatory body, which eventually became the National Highway Traffic Safety Administration, was established. Haddon was made its commissioner, Claybrook his special assistant. "I remember a Senate hearing we had with Warren Magnuson," Nader recalls. "He was listening to a pediatrician who was one of our allies, Seymour Charles, from New Jersey, and Charles was showing how there were two cars that collided, and one had a collapsible steering column and one didn't, and one driver walked away, the other was killed. And, just like that, Magnuson caught on. 'You mean,' he said, 'you can have had a crash without an injury?' That's it! A crash without an injury. That idea was very powerful."
There is no question that the improvements in auto design which Haddon and his disciples pushed for saved countless lives. They changed the way cars were built, and put safety on the national agenda. What they did not do, however, is make American highways the safest in the world. In fact--and this is the puzzling thing about the Haddon crusade--the opposite happened. United States auto-fatality rates were the lowest in the world before Haddon came along. But, since the late nineteen-seventies, just as the original set of N.H.T.S.A. safety standards were having their biggest impact, America's safety record has fallen to eleventh place. According to calculations by Leonard Evans, a longtime General Motors researcher and one of the world's leading experts on traffic safety, if American traffic fatalities had declined at the same rate as Canada's or Australia's between 1979 and 1997, there would have been somewhere in the vicinity of a hundred and sixty thousand fewer traffic deaths in that span.
This is not to suggest, of course, that Haddon's crusade is responsible for a hundred and sixty thousand highway deaths. Traffic safety is the most complex of phenomena--fatality rates can be measured in many ways, and reflect a hundred different variables--and in this period there were numerous factors that distinguished the United States from places like Canada and Australia, including different trends in drunk driving. Nor is it to say that the Haddonites had anything but the highest motives. Still, Evans's figures raise a number of troubling questions. Haddon and Nader and Claybrook told us, after all, that the best way to combat the epidemic on the highways was to shift attention from the driver to the vehicle. No other country pursued the passive strategy as vigorously, and no other country had such high expectations for its success. But America's slipping record on auto safety suggests that somewhere in the logic of that approach there was a mistake. And, if so, it necessarily changes the way we think about car crashes like the one that happened seven years ago on the corner of Fleming Pike and Egg Harbor Road.
"I think that the philosophical argument behind the passive approach is a strong one," Evans says. A physicist by training, he is a compact, spry man in his sixties, with a trace in his voice of his native Northern Ireland. On the walls of his office in suburban Detroit is a lifetime of awards and certifications from safety researchers, but, like many technical types, he is embittered by how hard it has been to make his voice heard in the safety debates of the past thirty years. "Either you can persuade people to boil their own water because there is a typhoid epidemic or you can put chlorine in the water," he went on. "And the second, passive solution is obviously preferred to the first, because there is no way you can persuade everyone to act in a prudent way. But starting from that philosophical principle and then ignoring reality is a recipe for disaster. And that's what happened. Why?" Here Evans nearly leaped out of his chair. "Because there isn't any chlorine for traffic crashes."
III. THE FIRST COLLISION
Robert Day's crash was not the accident of a young man. He was hit from the side, and adolescents and young adults usually have side-impact crashes when their cars slide off the road into a fixed object like a tree, often at reckless speeds. Older people tend to have side-impact crashes at normal speeds, in intersections, and as the result of error, not negligence. In fact, Day's crash was not merely typical in form; it was the result of a common type of driver error. He didn't see something he was supposed to see.
His mistake is, on one level, difficult to understand. There was a sign, clearly visible from the roadway, telling him of an intersection ahead, and then another, in bright red, telling him to stop. How could he have missed them both? From what we know of human perception, though, this kind of mistake happens all the time. Imagine, for instance, that you were asked to look at the shape of a cross, briefly displayed on a computer screen, and report on which arm of the cross was longer. After you did this a few times, another object, like a word or a small colored square--what psychologists call a critical stimulus--flashes next to the cross on the screen, right in front of your eyes. Would you see the critical stimulus? Most of us would say yes. Intuitively, we believe that we "see" everything in our field of vision--particularly things right in front of us--and that the difference between the things we pay attention to and the things we don't is simply that the things we focus on are the things we become aware of. But when experiments to test this assumption were conducted recently by Arien Mack, a psychologist at the New School, in New York, she found, to her surprise, that a significant portion of her observers didn't see the second object at all: it was directly in their field of vision, and yet, because their attention was focussed on the cross, they were oblivious of it. Mack calls this phenomenon "inattentional blindness."
Daniel Simons, a professor of psychology at Harvard, has done a more dramatic set of experiments, following on the same idea. He and a colleague, Christopher Chabris, recently made a video of two teams of basketball players, one team in white shirts and the other in black, each player in constant motion as two basketballs are passed back and forth. Observers were asked to count the number of passes completed by the members of the white team. After about forty-five seconds of passes, a woman in a gorilla suit walks into the middle of the group, stands in front of the camera, beats her chest vigorously, and then walks away. "Fifty per cent of the people missed the gorilla," Simons says. "We got the most striking reactions. We'd ask people, 'Did you see anyone walking across the screen?' They'd say no. Anything at all? No. Eventually, we'd ask them, 'Did you notice the gorilla?' And they'd say, 'The what?'" Simons's experiment is one of those psychological studies which are impossible to believe in the abstract: if you look at the video (called "Gorillas in Our Midst") when you know what's coming, the woman in the gorilla suit is inescapable. How could anyone miss that? But people do. In recent years, there has been much scientific research on the fallibility of memory--on the fact that eyewitnesses, for example, often distort or omit critical details when they recall what they saw. But the new research points to something that is even more troubling: it isn't just that our memory of what we see is selective; it's that seeing itself is selective.
This is a common problem in driving. Talking on a cell phone and trying to drive, for instance, is not unlike trying to count passes in a basketball game and simultaneously keep track of wandering animals. "When you get into a phone conversation, it's different from the normal way we have evolved to interact," David Strayer, a professor of psychology at the University of Utah, says. "Normally, conversation is face to face. There are all kinds of cues. But when you are on the phone you strip that away. It's virtual reality. You attend to that virtual reality, and shut down processing of the here and now." Strayer has done tests of people who were driving and talking on phones, and found that they remember far fewer things than those driving without phones. Their field of view shrinks. In one experiment, he flashed red and green lights at people while they were driving, and those on the phone missed twice as many lights as the others, and responded far more slowly to those lights they did see. "We tend to find the biggest deficits in unexpected events, a child darting onto the road, a light changing," Strayer says. "Someone going into your lane. That's what you don't see. There is a part of driving that is automatic and routine. There is a second part of driving that is completely unpredictable, and that is the part that requires attention." This is what Simons found with his gorilla, and it is the scariest part of inattentional blindness. People allow themselves to be distracted while driving because they think that they will still be able to pay attention to anomalies. But it is precisely those anomalous things, those deviations from the expected script, which they won't see.
Marc Green, a psychologist with an accident-consulting firm in Toronto, once worked on a case where a woman hit a bicyclist with her car. "She was pulling into a gas station," Green says. "It was five o'clock in the morning. She'd done that almost every day for a year. She looks to the left, and then she hears a thud. There's a bicyclist on the ground. She'd looked down that sidewalk nearly every day for a year and never seen anybody. She adaptively learned to ignore what was on that sidewalk because it was useless information. She may actually have turned her eyes toward him and failed to see him." Green says that, once you understand why the woman failed to see the bicyclist, the crash comes to seem almost inevitable.
It's the same conclusion that Haddon reached, and that formed the basis for his conviction that Americans were spending too much time worrying about what happened before an accident and not enough time worrying about what happened during and after an accident. Sometimes crashes happen because people do stupid things that they shouldn't have done--like drink or speed or talk on their cell phone. But sometimes people do stupid things that they cannot help, and it makes no sense to construct a safety program that does not recognize human fallibility. Just imagine, for example, that you're driving down a country road. The radio is playing. You're talking to your son, next to you. There is a highway crossing up ahead, but you can't see it, nor can you see any cars on the roadway, because of a stand of trees on both sides of the road. Maybe you look away from the road, for a moment, to change the dial on the radio, or something catches your eye outside, and when you glance back it happens to be at the very moment when a trick of geography makes it look as if your road stretched without interruption well off into the distance. Suddenly, up ahead, right in front of your eyes looms a bright-red anomalous stop sign--as out of place in the momentary mental universe that you have constructed for yourself as a gorilla in a basketball game--and, precisely because it is so anomalous, it doesn't register. Then--bang! How do you prevent an accident like that?
IV. THE SECOND COLLISION
One day in 1968, a group of engineers from the Cleveland-based auto-parts manufacturer Eaton, Yale & Towne went to Washington, D.C., to see William Haddon. They carried with them a secret prototype of what they called the People Saver. It was a nylon air cushion that inflated on impact, and the instant Haddon saw it he was smitten. "Oh, he was ecstatic, just ecstatic," Claybrook recalls. "I think it was one of the most exciting moments of his life."
The air bag had been invented in the early fifties by a man named John Hetrick, who became convinced, after running his car into a ditch, that drivers and passengers would be much safer if they could be protected by some kind of air cushion. But how could one inflate it in the first few milliseconds of a crash? As he pondered the problem, Hetrick remembered a freak accident that had happened during the war, when he was in the Navy working in a torpedo-maintenance shop. Torpedoes carry a charge of compressed air, and one day a torpedo covered in canvas accidentally released its charge. All at once, Hetrick recalled years later, the canvas "shot up into the air, quicker than you could blink an eye." Thus was the idea for the air bag born.
In its earliest incarnation, the air bag was a crude device; one preliminary test inadvertently killed a baboon, and there were widespread worries about the safety of detonating what was essentially a small bomb inside a car. (Indeed, as a result of numerous injuries to children and small adults, air bags have now been substantially depowered.) But to Haddon the People Saver was the embodiment of everything he believed in--it was the chlorine in the water, and it solved a problem that had been vexing him for years. The Haddonites had always insisted that what was generally called a crash was actually two separate events. The first collision was the initial contact between two automobiles, and in order to prevent the dangerous intrusion of one car into the passenger compartment of another, they argued, cars ought to be built with a protective metal cage around the front and back seats. The second collision, though, was even more important. That was the collision between the occupants of a car and the inside of their own vehicle. If the driver and his passengers were to survive the abrupt impact of a crash, they needed a second safety system, which carefully and gradually decelerated their bodies. The logical choice for that task was seat belts, but Haddon, with his background in public health, didn't trust safety measures that depended on an individual's active cooperation. "The biggest problem we had back then was that only about twelve per cent of the public used seat belts," Claybrook says. "They were terribly designed, and people didn't use them." With the air bag, there was no decision to make. The Haddonites called it a "technological vaccine," and attacked its doubters in Detroit for showing "an absence of moral and ethical leadership." The air bag, they vowed, was going to replace the seat belt. In "Unsafe at Any Speed," Nader wrote:
The seat belt should have been introduced in the twenties and rendered obsolete by the early fifties, for it is only the first step toward a more rational passenger restraint system which modern technology could develop and perfect for mass production. Such a system ideally would not rely on the active participation of the passenger to take effect; it would be the superior passive safety design which would come into use only when needed, and without active participation of the occupant. . . . Protection like this could be achieved by a kind of inflatable air bag restraint which would be actuated to envelop a passenger before a crash.
For the next twenty years, Haddon, Nader, and Claybrook were consumed by the battle to force a reluctant Detroit to make the air bag mandatory equipment. There were lawsuits, and heated debates, and bureaucratic infighting. The automakers, mindful of cost and other concerns, argued that the emphasis ought to be on seat belts. But, to the Haddonites, Detroit was hopelessly in the grip of the old paradigm on auto safety. His opponents, Haddon wrote, with typical hauteur, were like "Malinowski's natives in their approaches to the hazards out the reef which they did not understand." Their attitudes were "redolent of the extranatural, supernatural and the pre-scientific." In 1991, the Haddonites won. That year, a law was passed requiring air bags in every new car by the end of the decade. It sounded like a great victory. But was it?
V. HADDON'S MISTAKE
When Stephen Capoferri's Aerostar hit Robert Day's Jeep Wagoneer, Capoferri's seat belt lay loose across his hips and chest. His shoulder belt probably had about two inches of slack. At impact, his car decelerated, but Capoferri's body kept moving forward, and within thirty milliseconds the slack in his seat belts was gone. In the language of engineers, he "loaded" his restraints. Under the force of Capoferri's onrushing weight, his belts began to stretch--the fabric giving by as much as six inches. As his shoulder belt grew taut, it dug into his chest, compressing it by another two inches, and if you had seen Capoferri at the moment of maximum forward trajectory his shoulder belt around his chest would have looked like a rubber band around a balloon. Simultaneously, within those first few milliseconds, his air bag exploded and rose to meet him at more than a hundred miles per hour. Forty to fifty milliseconds after impact, it had enveloped his face, neck, and upper chest. A fraction of a second later, the bag deflated. Capoferri was thrown back against his seat. Total time elapsed: one hundred milliseconds.
Would Capoferri have lived without an air bag? Probably. He would have stretched his seat belt so far that his head would have hit the steering wheel. But his belts would have slowed him down enough that he might only have broken his nose or cut his forehead or suffered a mild concussion. The other way around, however, with an air bag but not a seat belt, his fate would have been much more uncertain. In the absence of seat belts, air bags work best when one car hits another squarely, so that the driver pitches forward directly into the path of the oncoming bag. But Capoferri hit Day at a slight angle. The front-passenger side of the Aerostar sustained more damage than the driver's side, which means that without his belts holding him in place he would have been thrown away from the air bag off to the side, toward the rearview mirror or perhaps even the front-passenger "A" pillar. Capoferri's air bag protected him only because he was wearing his seat belt. Car-crash statistics show this to be the rule. Wearing a seat belt cuts your chances of dying in an accident by forty-three per cent. If you add the protection of an air bag, your fatality risk is cut by forty-seven per cent. But an air bag by itself reduces the risk of dying in an accident by just thirteen per cent.
That the effectiveness of an air bag depended on the use of a seat belt was a concept that the Haddonites, in those early days, never properly understood. They wanted the air bag to replace the seat belt when in fact it was capable only of supplementing it, and they clung to that belief, even in the face of mounting evidence to the contrary. Don Huelke, a longtime safety researcher at the University of Michigan, remembers being on an N.H.T.S.A. advisory committee in the early nineteen-seventies, when people at the agency were trying to come up with statistics for the public on the value of air bags. "Their estimates were that something like twenty-eight thousand people a year could be saved by the air bags," he recalls, "and then someone pointed out to them that there weren't that many driver fatalities in frontal crashes in a year. It was kind of like 'Oops.' So the estimates were reduced." In 1977, Claybrook became the head of N.H.T.S.A. and renewed the push for air bags. The agency's estimate now was that air bags would cut a driver's risk of dying in a crash by forty per cent--a more modest but still implausible figure. "In 1973, there was a study in the open literature, performed at G.M., that estimated that the air bag would reduce the fatality risk to an unbelted driver by eighteen per cent," Leonard Evans says. "N.H.T.S.A. had this information and dismissed it. Why? Because it was from the automobile industry."
The truth is that even today it is seat belts, not air bags, that are providing the most important new safety advances. Had Capoferri been driving a late-model Ford minivan, for example, his seat belt would have had what is called a pretensioner: a tiny explosive device that would have taken the slack out of the belt just after the moment of impact. Without the pretensioner, Stephen Kozak, an engineer at Ford, explains, "you start to accelerate before you hit the belt. You get the clothesline effect." With it, Capoferri's deceleration would have been a bit more gradual. At the same time, belts are now being designed which cut down on chest compression. Capoferri's chest wall was pushed in two inches, and had he been a much older man, with less resilient bones and cartilage, that two-inch compression might have been enough to fracture three or four ribs. So belts now "pay out" extra webbing after a certain point: as Capoferri stretched forward, his belt would have been lengthened by several inches, relieving the pressure on his chest. The next stage in seat-belt design is probably to offer car buyers the option of what is called a four-point belt--two shoulder belts that run down the chest, like suspenders attached to a lap belt. Ford showed a four-point prototype at the auto shows this spring, and early estimates are that it might cut fatality risk by another ten per cent--which would make seat belts roughly five times more effective in saving lives than air bags by themselves. "The best solution is to provide automatic protection, including air bags, as baseline protection for everyone, with seat belts as a supplement for those who will use them," Haddon wrote in 1984. In putting air bags first and seat belts second, he had things backward.
Robert Day suffered a very different kind of accident from Stephen Capoferri's: he was hit from the side, and the physics of a side-impact crash are not nearly so forgiving. Imagine, for instance, that you punched a brick wall as hard as you could. If your fist was bare, you'd break your hand. If you had a glove with two inches of padding, your hand would sting. If you had a glove with six inches of padding, you might not feel much of anything. The more energy-absorbing material--the more space--you can put between your body and the wall, the better off you are. An automobile accident is no different. Capoferri lived, in part, because he had lots of space between himself and Day's Wagoneer. Cars have steel rails connecting the passenger compartment with the bumper, and each of those rails is engineered with what are called convolutions--accordionlike folds designed to absorb, slowly and evenly, the impact of a collision. Capoferri's van was engineered with twenty-seven inches of crumple room, and at the speed he was travelling he probably used about twenty-one inches of that. But Day had four inches, no more, between his body and the door, and perhaps another five to six inches in the door itself. Capoferri hit the wall with a boxing glove. Day punched it with his bare hand.
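The boxing-glove analogy is ordinary stopping-distance physics: for a given change in speed, the average deceleration a body experiences is inversely proportional to the distance over which it stops. Here is a minimal sketch, in Python, assuming constant deceleration and, purely for comparison, the same twenty-five-mile-per-hour delta V for both men (Day's actual delta V is not given in the article; the crumple distances are the ones quoted above, and the function name is illustrative):

# Average deceleration over a stopping distance: a = v^2 / (2*d), assuming constant deceleration.
def avg_decel_in_g(speed_mph, stop_distance_inches):
    v = speed_mph * 5280 / 3600 * 12         # speed in inches per second
    a = v ** 2 / (2 * stop_distance_inches)  # average deceleration, inches per second squared
    return a / 386.1                         # express as multiples of gravity

print(avg_decel_in_g(25, 21))   # Capoferri: about 21 inches of crumple used, roughly 12 g
print(avg_decel_in_g(25, 9))    # Day: perhaps 9 or 10 inches of door and body space, roughly 28 g

Twenty-one inches of crumple is the boxing glove; nine is the bare hand.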
Day's problems were compounded by the fact that he was not wearing his seat belt. The right-front fender of Capoferri's Aerostar struck his Wagoneer squarely on the driver's door, pushing the Jeep sidewise, and if Day had been belted he would have moved with his vehicle, away from the onrushing Aerostar. But he wasn't, and so the Jeep moved out from under him: within fifteen milliseconds, the four inches of space between his body and the side of the Jeep was gone. The impact of the Aerostar slammed the driver's door against his ribs and spleen.
Day could easily have been ejected from his vehicle at that point. The impact of Capoferri's van shattered the glass in Day's door, and a Wagoneer, like most sports-utility vehicles, has a low belt line--meaning that the side windows are so large that with the glass gone there's a hole big enough for an unrestrained body to fly through. This is what it means to be "thrown clear" of a crash, although when that phrase is used in the popular literature it is sometimes said as if it were a good thing, when of course to be "thrown clear" of a crash is merely to be thrown into some other hard and even more lethal object, like the pavement or a tree or another car. Day, for whatever reason, was not thrown clear, and in that narrow sense he was lucky. This advantage, however, amounted to little. Day's door was driven into him like a sledgehammer.
Would a front air bag have saved Robert Day? Not at all. He wasn't moving forward into the steering wheel. He was moving sidewise into the door. Some cars now have additional air bags that are intended to protect the head as it hits the top of the door frame in a side-impact crash. But Day didn't die of head injuries. He died of abdominal injuries. Conceivably, a side-impact bag might have offered his abdomen some slight protection. But Day's best chance of surviving the accident would have been to wear his seat belt. It would have held him in place in those first few milliseconds of impact. It would have preserved some part of the space separating him from the door, diminishing the impact of the Aerostar. Day made two mistakes that morning, then, the second of which was not buckling up. But this is a point on which the Haddonites were in error as well, because the companion to their obsession with air bags was the equally false belief that encouraging drivers to wear their seat belts was a largely futile endeavor.
In the early nineteen-seventies, just at the moment when Haddon and Claybrook were pushing hardest for air bags, the Australian state of Victoria passed the world's first mandatory seat-belt legislation, and the law was an immediate success. With an aggressive public-education campaign, rates of seat-belt use jumped from twenty to eighty per cent. During the next several years, Canada, New Zealand, Germany, France, and others followed suit. But a similar movement in the United States in the early seventies stalled. James Gregory, who headed the N.H.T.S.A. during the Ford years, says that if Nader had advocated mandatory belt laws they might have carried the day. But Nader, then at the height of his fame and influence, didn't think that belt laws would work in this country. "You push mandatory belts, you might get a very adverse reaction," Nader says today of his thinking back then. "Mindless reaction. And how many tickets do you give out a day? What about back seats? At what point do you require a seat belt for small kids? And it's administratively difficult when people cross state lines. That's why I always focussed on the passive. We have a libertarian streak that Europe doesn't have." Richard Peet, a congressional staffer who helped draft legislation in Congress giving states financial incentives to pass belt laws, founded an organization in the early seventies to promote belt-wearing. "After I did that, some of the people who worked for Nader's organization went after me, saying that I was selling out the air-bag movement," Peet recalls. "That pissed me off. I thought the safety movement was the safety movement and we were all working together for common aims." In "Auto Safety," a history of auto-safety regulation, John Graham, of the Harvard School of Public Health, writes of Claybrook's time at the N.H.T.S.A.:
Her lack of aggressive leadership on safety belt use was a major source of irritation among belt use advocates, auto industry officials, and officials from state safety programs. They saw her pessimistic attitudes as a self-fulfilling prophecy. One of Claybrook's aides at N.H.T.S.A. who worked with state agencies acknowledged: "It is fair to say that Claybrook never made a dedicated effort to get mandatory belt-use laws." Another aide offered the following explanation of her philosophy: "Joan didn't do much on mandatory belt use because her primary interests were in vehicle regulation. She was fond of saying 'it is easier to get twenty auto companies to do something than to get 200 million Americans to do something.' "
Claybrook says that while at the N.H.T.S.A. she mailed a letter to all the state governors encouraging them to pass mandatory seat-belt legislation, and "not one governor would help us." It is clear that she had low expectations for her efforts. Even as late as 1984, Claybrook was still insisting that trying to encourage seat-belt use was a fool's errand. "It is not likely that mandatory seat belt usage laws will be either enacted or found acceptable to the public in large numbers," Claybrook wrote. "There is massive public resistance to adult safety belt usage." In the very year her words were published, however, a coalition of medical groups finally managed to pass the country's first mandatory seat-belt law, in New York, and the results were dramatic. One state after another soon did likewise, and public opinion about belts underwent what the pollster Gary Lawrence has called "one of the most phenomenal shifts in attitudes ever measured." Americans, it turned out, did not have a cultural aversion to seat belts. They just needed some encouragement. "It's not a big Freudian thing whether you buckle up or not," says B. J. Campbell, a former safety researcher at the University of North Carolina, who was one of the veterans of the seat-belt movement. "It's just a habit, and either you're in the habit of doing it or you're not."
Today, belt-wearing rates in the United States are just over seventy per cent, and every year they inch up a little more. But if the seat-belt campaign had begun in the nineteen-seventies, instead of the nineteen-eighties, the use rate in this country would be higher right now, and in the intervening years an awful lot of car accidents might have turned out differently, including one at the intersection of Egg Harbor Road and Fleming Pike.
VI. CRASH TEST
William Haddon died in 1985, of kidney disease, at the age of fifty-eight. From the time he left government until his death, he headed an influential research group called the Insurance Institute for Highway Safety.
Joan Claybrook left the N.H.T.S.A. in 1980 and went on to run Ralph Nader's advocacy group Public Citizen, where she has been a powerful voice on auto safety ever since. In an interview this spring, Claybrook listed the things that she would do if she were back as the country's traffic-safety czar. "I'd issue a rollover standard, and have a thirty-miles-per-hour test for air bags," she said. "Upgrade the seating structure. Integrate the head restraint better. Upgrade the tire-safety standard. Provide much more consumer information. And also do more crash testing, whether it's rollover or offset crash testing and rear-crash testing." The most effective way to reduce automobile fatalities, she went on, would be to focus on rollovers--lowering the center of gravity in S.U.V.s, strengthening doors and roofs. In the course of outlining her agenda, Claybrook did not once mention the words "seat belt."
Ralph Nader, for his part, spends a great deal of time speaking at college campuses about political activism. He remains a distinctive figure, tall and slightly stooped, with a bundle of papers under his arm. His interests have widened in recent years, but he is still passionate about his first crusade. "Haddon was all business--never made a joke, didn't tolerate fools easily," Nader said not long ago, when he was asked about the early days. He has a deep, rumbling press-conference voice, and speaks in sentence fragments, punctuated with long pauses. "Very dedicated. He influenced us all." The auto-safety campaign, he went on, "was a spectacular success of the federal-government mission. When the regulations were allowed, they worked. And it worked because it deals with technology rather than human behavior." Nader had just been speaking in Detroit, at Wayne State University, and was on the plane back to Washington, D.C. He was folded into his seat, his knees butting up against the tray table in front of him, and from time to time he looked enviously over at the people stretching their legs in the exit row. Did he have any regrets? Yes, he said. He wished that back in 1966 he had succeeded in keeping the criminal-penalties provision in the auto-safety bill that Congress passed that summer. "That would have gone right to the executive suite," he said.
There were things, he admitted, that had puzzled him over the years. He couldn't believe the strides that had been made against drunk driving. "You've got to hand it to MADD. It took me by surprise. The drunk-driving culture is deeply embedded. I thought it was too ingrained." And then there was what had happened with seat belts. "Use rates are up sharply," he said. "They're a lot higher than I thought they would be. I thought it would be very hard to hit fifty per cent. The most unlikely people now buckle up." He shook his head, marvelling. He had always been a belt user, and recommends belts to others, but who knew they would catch on?
Other safety activists, who had seen what had happened to driver behavior in Europe and Australia in the seventies, weren't so surprised, of course. But Nader was never the kind of activist who had great faith in the people whose lives he was trying to protect. He and the other Haddonites were sworn to a theory that said that the way to prevent typhoid is to chlorinate the water, even though there are clearly instances where chlorine will not do the trick. This is the blindness of ideology. It is what happens when public policy is conducted by those who cannot conceive that human beings will do willingly what is in their own interest. What was the truly poignant thing about Robert Day, after all? Not just that he was a click away from saving his own life but that his son, sitting right next to him, was wearing his seat belt. In the Days' Jeep Wagoneer, a fight that experts assumed was futile was already half won.
One day this spring, a team of engineers at Ford conducted a crash test on a 2003 Mercury. This was at Ford's test facility in Dearborn, a long, rectangular white steel structure, bisected by a five-hundred-and-fifty-foot runway. Ford crashes as many as two cars a day there, ramming them with specially designed sleds or dragging them down the runway with a cable into a twenty-foot cube of concrete. Along the side of the track were the twisted hulks of previous experiments: a Ford Focus wagon up on blocks; a mangled BMW S.U.V. that had been crashed, out of competitive curiosity, the previous week; a Ford Explorer that looked as though it had been thrown into a blender. In a room at the back, there were fifty or sixty crash-test dummies, propped up on tables and chairs, in a dozen or more configurations--some in Converse sneakers, some in patent-leather shoes, some without feet and legs at all, each one covered with multiple electronic sensors, all designed to measure the kinds of injuries possible in a crash.
The severity of any accident is measured not by the speed of the car at the moment of impact but by what is known as the delta V--the difference between how fast a car is going at the moment of impact and how fast it is moving after the accident. Capoferri's delta V was about twenty-five miles per hour, seven miles per hour higher than the accident average. The delta V of the Mercury test, though, was to be thirty-five miles per hour, which is the equivalent of hitting an identical parked car at seventy miles per hour. The occupants were two adult-size dummies in orange shorts. Their faces were covered in wet paint, red above the upper jaw and blue below it, to mark where their faces hit on the air bag. The back seat carried a full cargo of computers and video cameras. A series of yellow lights began flashing. An engineer stood to the side, holding an abort button. Then a bank of stage lights came on, directly above the point of impact. Sixteen video cameras began rolling. A voice came over a loudspeaker, counting down: five, four, three... There was a blur as the Mercury swept by--then bang, as the car hit the barrier and the dual front air bags exploded. A plastic light bracket skittered across the floor, and the long warehouse was suddenly still.
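The equivalence quoted above--a thirty-five-mile-per-hour barrier test standing in for hitting an identical parked car at seventy--follows from conservation of momentum: when a moving car strikes an identical stationary one and the two move off together, each ends up at half the closing speed, so the striking car's delta V is half of seventy. A minimal sketch, in Python, under the idealized assumption of equal masses and a perfectly plastic collision (the function name is illustrative):

# Delta V when a moving car hits an identical parked car and the two move off together.
def delta_v_equal_mass(closing_speed_mph):
    final_speed = closing_speed_mph / 2       # momentum: m*v = (2m)*v_final
    return closing_speed_mph - final_speed    # the striking car slows from v to v/2

print(delta_v_equal_mass(70))   # 35 m.p.h., the delta V of the Mercury barrier test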
It was a moment of extraordinary violence, yet it was also strangely compelling. This was performance art, an abstract and ritualized rendering of reality, given in a concrete-and-steel gallery. The front end of the Mercury was perfectly compressed; the car was thirty inches shorter than it had been a moment before. The windshield was untouched. The "A" pillars and roofline were intact. The passenger cabin was whole. In the dead center of the deflated air bags, right where they were supposed to be, were perfect blue-and-red paint imprints of the dummies' faces.
But it was only a performance, and that was the hard thing to remember. In the real world, people rarely have perfectly square frontal collisions, sitting ramrod straight and ideally positioned; people rarely have accidents that so perfectly showcase the minor talents of the air bag. A crash test is beautiful. In the sequence we have all seen over and over in automobile commercials, the dummy rises magically to meet the swelling cushion, always in slow motion, the bang replaced by Mozart, and on those theatrical terms the dowdy fabric strips of the seat belt cannot compete with the billowing folds of the air bag. This is the image that seduced William Haddon when the men from Eaton, Yale showed him the People Saver so many years ago, and the image that warped auto safety for twenty long years. But real accidents are seldom like this. They are ugly and complicated, shaped by the messy geometries of the everyday world and by the infinite variety of human frailty. A man looks away from the road at the wrong time. He does not see what he ought to see. Another man does not have time to react. The two cars collide, but at a slight angle. There is a two-hundred-and-seventy-degree spin. There is skidding and banging. A belt presses deep into one man's chest--and that saves his life. The other man's unrestrained body smashes against the car door--and that kills him.
"They left pretty early, about eight, nine in the morning," Susan Day, Robert Day's widow, recalls. "I was at home when the hospital called. I went to see my son first. He was pretty much O.K., had a lot of bruising. Then they came in and said, 'Your husband didn't make it.'"
The Mosquito Killer
July 2, 2001
ANNALS OF PUBLIC HEALTH
Millions of people owe their lives to Fred Soper.
Why isn't he a hero?
1.
In the late nineteen-thirties, a chemist who worked for the J.R. Geigy company, in Switzerland, began experimenting with an odorless white crystalline powder called dichloro-diphenyl-trichloroethane. The chemist, Paul Müller, wanted to find a way to protect woollens against moths, and his research technique was to coat the inside of a glass box with whatever chemical he was testing, and then fill it with houseflies. To his dismay, the flies seemed unaffected by the new powder. But, in one of those chance decisions on which scientific discovery so often turns, he continued his experiment overnight--and in the morning all the flies were dead. He emptied the box, and put in a fresh batch of flies. By the next morning, they, too, were dead. He added more flies, and then a handful of other insects. They all died. He scrubbed the box with an acetone solvent, and repeated the experiment with a number of closely related compounds that he had been working with. The flies kept dying. Now he was excited: had he come up with a whole line of potent new insecticides? As it turned out, he hadn't. The new candidate chemicals were actually useless. To his amazement, what was killing the flies in the box were scant traces of the first compound, dichloro-diphenyl-trichloroethane--or, as it would come to be known, DDT.
In 1942, Geigy sent a hundred kilograms of the miracle powder to its New York office. The package lay around, undisturbed, until another chemist, Victor Froelicher, happened to translate the extraordinary claims for DDT into English, and then passed on a sample to the Department of Agriculture, which in turn passed it on to its entomology research station, in Orlando, Florida. The Orlando laboratory had been charged by the Army to develop new pesticides, because the military, by this point in the war, was desperate for a better way to protect its troops against insect-borne disease. Typhus--the lethal fever spread by lice--had killed millions of people during and after the First World War and was lurking throughout the war zones. Worse, in almost every theatre of operations, malaria-carrying mosquitoes were causing havoc. As Robert Rice recounted in this magazine almost fifty years ago, the First Marine Division had to be pulled from combat in 1942 and sent to Melbourne to recuperate because, out of seventeen thousand men, ten thousand were incapacitated with malarial headaches, fevers, and chills. Malaria hit eighty-five per cent of the men holding onto Bataan. In fact, at any one time in the early stages of the war, according to General Douglas MacArthur, two-thirds of his troops in the South Pacific were sick with malaria. Unless something was done, MacArthur complained to the malariologist Paul Russell, it was going to be "a long war." Thousands of candidate insecticides were tested at Orlando, and DDT was by far the best.
To gauge a chemical's potential against insects, the Orlando researchers filled a sleeve with lice and a candidate insecticide, slipped the sleeve over a subject's arm, and taped it down at both ends. After twenty-four hours, the dead lice were removed and fresh lice were added. A single application of DDT turned out to kill lice for a month, almost four times longer than the next-best insecticide. As Rice described it, researchers filled twelve beakers with mosquito larvae, and placed descending amounts of DDT in each receptacle--with the last beaker DDT free. The idea was to see how much chemical was needed to kill the mosquitoes. The mosquito larvae in every beaker died. Why? Because just the few specks of chemical that floated through the air and happened to land in the last beaker while the experiment was being set up were enough to kill the mosquitoes. Quickly, a field test was scheduled. Two duck ponds were chosen, several miles apart. One was treated with DDT. One was not. Spraying was done on a day when the wind could not carry the DDT from the treated to the untreated pond. The mosquito larvae in the first pond soon died. But a week later mosquito larvae in the untreated pond also died: when ducks from the first pond visited the second pond, there was enough DDT residue on their feathers to kill mosquitoes there as well.
The new compound was administered to rabbits and cats. Rice tells how human volunteers slathered themselves with it, and sat in vaults for hours, inhaling the fumes. Tests were done to see how best to apply it. "It was put in solution or suspension, depending on what we were trying to do," Geoffrey Jeffery, who worked on DDT at the Tennessee Valley Authority, recalls. "Sometimes we'd use some sort of petroleum-based carrier, even diesel oil, or add water to a paste or concentration and apply it on the wall with a Hudson sprayer." Under conditions of great secrecy, factories were set up, to manufacture the new chemical by the ton. It was rushed to every Allied theatre. In Naples, in 1944, the Army averted a catastrophic typhus epidemic by "dusting" more than a million people with DDT powder. The Army Air Force built DDT "bombs," attaching six-hundred-and-twenty-five-gallon tanks to the underside of the wings of B-25s and C-47s, and began spraying Pacific beachheads in advance of troop arrivals. In Saipan, invading marines were overtaken by dengue, a debilitating fever borne by the Aedes variety of mosquito. Five hundred men were falling sick every day, each incapacitated for four to five weeks. The medical officer called in a DDT air strike that saturated the surrounding twenty-five square miles with nearly nine thousand gallons of five-per-cent DDT solution. The dengue passed. The marines took Saipan.
It is hard to overestimate the impact that DDT's early success had on the world of public health. In the nineteen-forties, there was still malaria in the American South. There was malaria throughout Europe, Asia, and the Caribbean. In India alone, malaria killed eight hundred thousand people a year. When, in 1920, William Gorgas, the man who cleansed the Panama Canal Zone of malaria, fell mortally ill during a trip through England, he was knighted on his deathbed by King George V and given an official state funeral at St. Paul's Cathedral--and this for an American who just happened to be in town when he died. That is what it meant to be a malaria fighter in the first half of the last century. And now there was a chemical--the first successful synthetic pesticide--that seemed to have an almost magical ability to kill mosquitoes. In 1948, Müller won the Nobel Prize for his work with DDT, and over the next twenty years his discovery became the centerpiece of the most ambitious public-health campaign in history.
Today, of course, DDT is a symbol of all that is dangerous about man's attempts to interfere with nature. Rachel Carson, in her landmark 1962 book, "Silent Spring," wrote memorably of the chemical's environmental consequences, how its unusual persistence and toxicity had laid waste to wildlife and aquatic ecosystems. Only two countries--India and China--continue to manufacture the substance, and only a few dozen more still use it. In May, at the Stockholm Convention on Persistent Organic Pollutants, more than ninety countries signed a treaty, placing DDT on a restricted-use list, and asking all those still using the chemical to develop plans for phasing it out entirely. On the eve of its burial, however--and at a time when the threat of insect-borne disease around the world seems to be resurgent--it is worth remembering that people once felt very differently about DDT, and that between the end of the Second World War and the beginning of the nineteen-sixties it was considered not a dangerous pollutant but a lifesaver. The chief proponent of that view was a largely forgotten man named Fred Soper, who ranks as one of the unsung heroes of the twentieth century. With DDT as his weapon, Soper almost saved the world from one of its most lethal afflictions. Had he succeeded, we would not today be writing DDT's obituary. We would view it in the same heroic light as penicillin and the polio vaccine.
2.
Fred Soper was a physically imposing man. He wore a suit, it was said, like a uniform. His hair was swept straight back from his forehead. His eyes were narrow. He had large wire-rimmed glasses, and a fastidiously maintained David Niven mustache. Soper was born in Kansas in 1893, received a doctorate from the Johns Hopkins School of Public Health, and spent the better part of his career working for the Rockefeller Foundation, which in the years before the Second World War--before the establishment of the United Nations and the World Health Organization--functioned as the world's unofficial public-health directorate, using its enormous resources to fight everything from yellow fever in Colombia to hookworm in Thailand.
In those years, malaria warriors fell into one of two camps. The first held that the real enemy was the malaria parasite--the protozoan that mosquitoes pick up from the blood of an infected person and transmit to others. The best way to break the chain of infection, this group argued, was to treat the sick with antimalarial drugs, to kill the protozoan so there was nothing for mosquitoes to transmit. The second camp held, to the contrary, that the mosquito was the real enemy, since people would not get malaria in the first place if there were no mosquitoes around to bite them. Soper belonged to the latter group, and his special contribution was to raise the killing of mosquitoes to an art. Gorgas, Soper's legendary predecessor, said that in order to fight malaria you had to learn to think like a mosquito. Soper disagreed. Fighting malaria, he said, had very little to do with the intricacies of science and biology. The key was learning to think like the men he hired to go door-to-door and stream-to-stream, killing mosquitoes. His method relied on motivation, discipline, organization, and zeal, and on an understanding of human nature. Fred Soper was the General Patton of entomology.
While working in South America in 1930, Soper had enforced a rigorous protocol for inspecting houses for mosquito infestation, which involved checking cisterns and climbing along roof gutters. (He pushed himself so hard perfecting the system in the field that he lost twenty-seven pounds in three months.) He would map an area to be cleansed of mosquitoes, give each house a number, and then assign each number to a sector. A sector, in turn, would be assigned to an inspector, armed with the crude pesticides then available; the inspector's schedule for each day was planned to the minute, in advance, and his work double-checked by a supervisor. If a supervisor found a mosquito that the inspector had missed, he received a bonus. And if the supervisor found that the inspector had deviated by more than ten minutes from his preassigned schedule the inspector was docked a day's pay. Once, in the state of Rio de Janeiro, a large ammunition dump--the Niterói Arsenal--blew up. Soper, it was said, heard the explosion in his office, checked the location of the arsenal on one of his maps, verified by the master schedule that an inspector was at the dump at the time of the accident, and immediately sent condolences and a check to the widow. The next day, the inspector showed up for work, and Soper fired him on the spot--for being alive. Soper, in one memorable description, "seemed equally capable of browbeating man or mosquito." He did not engage in small talk. In 1973, at Soper's eightieth-birthday party, a former colleague recounted how much weight he had lost working for Soper; another told a story of how Soper looked at him uncomprehendingly when he asked to go home to visit his ailing wife; a third spoke of Soper's betting prowess. "He was very cold and very formal," remembers Andrew Spielman, a senior investigator in tropical disease at the Harvard School of Public Health and the author, with Michael D'Antonio, of the marvellous new book "Mosquito: A Natural History of Our Most Persistent and Deadly Foe." "He always wore a suit and tie. With that thin little mustache and big long upper lip, he scared the hell out of me."
One of Soper's greatest early victories came in Brazil, in the late nineteen-thirties, when he took on a particularly vicious strain of mosquito known as Anopheles gambiae. There are about twenty-five hundred species of mosquito in the world, each with its own habits and idiosyncrasies--some like running water, some like standing water, some bite around the ankles, some bite on the arms, some bite indoors, some bite outdoors--but only mosquitoes of the genus Anopheles are capable of carrying the human malaria parasite. And, of the sixty species of Anopheles that can transmit malaria, gambiae is the variety best adapted to spreading the disease. In California, there is a strain of Anopheles known as freeborni, which is capable of delivering a larger dose of malaria parasite than gambiae ever could. But freeborni is not a good malaria vector, because it prefers animals to people. Gambiae, by contrast, bites humans ninety-five per cent of the time. It has long legs and yellow-and-black spotted wings. It likes to breed in muddy pools of water, even in a water-filled footprint. And, unlike many mosquitoes, it is long-lived, meaning that once it has picked up the malaria parasite it can spread the protozoan to many others. Gambiae gathers in neighborhoods in the evenings, slips into houses at dusk, bites quietly and efficiently during the night, digests its "blood meal" while resting on the walls of the house, and then slips away in the morning. In epidemiology, there is a concept known as the "basic reproduction number," or BRN, which refers to the number of people one person can infect with a contagious disease. The number for H.I.V., which is relatively difficult to transmit, is just above one. For measles, the BRN is between twelve and fourteen. But with a vector like gambiae in the picture the BRN for malaria can be more than a hundred, meaning that just one malarious person can be solely responsible for making a hundred additional people sick. The short answer to the question of why malaria is such an overwhelming problem in Africa is that gambiae is an African mosquito.
In March, 1930, a Rockefeller Foundation entomologist named Raymond Shannon was walking across tidal flats to the Potengi River, in Natal, Brazil, when he noticed, to his astonishment, two thousand gambiae larvae in a pool of water, thousands of miles from their homeland. Less than a kilometre away was a port where French destroyers brought mail across the Atlantic from Africa, and Shannon guessed that the mosquito larvae had come over, fairly recently, aboard one of the mail ships. He notified Soper, who was his boss, and Soper told Brazilian officials to open the dykes damming the tidal flats, because salt water from the ocean would destroy the gambiae breeding spots. The government refused. Over the next few years, there were a number of small yet worrisome outbreaks of malaria, followed by a few years of drought, which kept the problem in check. Then, in 1938, the worst malaria epidemic in the history of the Americas broke out. Gambiae had spread a hundred and fifty miles along the coast and inland, infecting a hundred thousand people and killing as many as twenty thousand. Soper was called in. This was several years before the arrival of DDT, so he brought with him the only tools malariologists had in those years: diesel oil and an arsenic-based mixture called Paris green, both of which were spread on the pools of water where gambiae larvae bred; and pyrethrum, a natural pesticide made from a variety of chrysanthemum, which was used to fumigate buildings. Four thousand men were put at his disposal. He drew maps and divided up his troops. The men wore uniforms, and carried flags to mark where they were working, and they left detailed written records of their actions, to be reviewed later by supervisors. When Soper discovered twelve gambiae in a car leaving an infected area, he set up thirty de-insectization posts along the roads, spraying the interiors of cars and trucks; seven more posts on the rail lines; and defumigation posts at the ports and airports. In Soper's personal notes, now housed at the National Library of Medicine, in Bethesda, there is a cue card, on which is typed a quotation from a veteran of the Rockefeller Foundation's efforts, in the early twentieth century, to eradicate hookworm. "Experience proved that the best way to popularize a movement so foreign to the customs of the people . . . was to prosecute it as though it were the only thing in the universe left undone." It is not hard to imagine the card tacked above Soper's desk in Rio for inspiration: his goal was not merely to cripple the population of gambiae, since that would simply mean that they would return, to kill again. His goal was to eliminate gambiae from every inch of the region of Brazil that they had colonized--an area covering some eighteen thousand square miles. It was an impossible task. Soper did it in twenty-two months.
3.
While DDT was being tested in Orlando, Soper was in North Africa with the United States Typhus Commission, charged with preventing the kind of louse-spread typhus epidemics that were so devastating during the First World War. His tool of choice was a delousing powder called MYL. Lice live in the folds of clothing, and a previous technique had been to treat the clothing after people had disrobed. But that was clearly not feasible in Muslim cities like Cairo and Algiers, nor was it practical for large-scale use. So Soper devised a new technique. He had people tie their garments at the ankles and wrists, and then he put the powder inside a dust gun, of the sort used in gardening, and blew it down the collar, creating a balloon effect. "We were in Algiers, waiting for Patton to get through Sicily," Thomas Aitken, an entomologist who worked with Soper in those years, remembers. "We were dusting people out in the countryside. This particular day, a little old Arab man, only about so high, came along with his donkey and stopped to talk to us. We told him what we were doing, and we dusted him. The next day, he comes by again and says that that had been the first time in his life that he had ever been able to sleep through the night."
In December of 1943, the typhus team was dispatched to Naples, where in the wake of the departing German Army the beginnings of a typhus epidemic had been detected. The rituals of Cairo were repeated, only this time the typhus fighters, instead of relying on MYL (which easily lost its potency), were using DDT. Men with dusters careered through the narrow cobblestoned streets of the town, amid the wreckage of the war, delousing the apartment buildings of typhus victims. Neapolitans were dusted as they came out of the railway stations in the morning, and dusted in the streets, and dusted in the crowded grottoes that served as bomb shelters beneath the city streets. In the first month, more than 1.3 million people were dusted, saving countless lives.
Soper's diary records a growing fascination with this new weapon. July 25, 1943: "Lunch with L.L. Williams and Justin Andrews. L.L. reports that he has ordered 10,000 lbs of Neocid [DDT] and that Barber reports it to be far superior to [Paris Green] for mosquitoes." February 25, 1944: "Knipling visits laboratory. Malaria results [for DDT] ARE FANTASTIC." When Rome fell, in mid-1944, Soper declared that he wanted to test DDT in Sardinia, the most malarious part of Italy. In 1947, he got his wish. He pulled out his old organization charts from Brazil. The island--a rocky, mountainous region the size of New Hampshire, with few roads--was mapped and divided up hierarchically, the smallest unit being the area that could be covered by a sprayer in a week. Thirty-three thousand people were hired. More than two hundred and eighty-six tons of DDT were acquired. Three hundred and thirty-seven thousand buildings were sprayed. The target Anopheles was labranchiae, which flourishes not just in open water but also in the thick weeds that surround the streams and ponds and marshes of Sardinia. Vegetation was cut back, and a hundred thousand acres of swampland were drained. Labranchiae larvae were painstakingly collected and counted and shipped to a central laboratory, where precise records were kept of the status of the target vector. In 1946, before the campaign started, there were seventy-five thousand malaria cases on the island. In 1951, after the campaign finished, there were nine.
"The locals regarded this as the best thing that had ever happened to them," Thomas Aitken says. He had signed on with the Rockefeller Foundation after the war, and was one of the leaders of the Sardinian effort. "The fact that malaria was gone was welcome," he went on. "But also the DDT got rid of the houseflies. Sardinian houses were made of stone. The wires for the lights ran along the walls near the ceiling. And if you looked up at the wires they were black with housefly droppings from over the years. And suddenly the flies disappeared." Five years ago, Aitken says, he was invited back to Sardinia for a celebration to mark the forty-fifth anniversary of malaria's eradication from the island. "There was a big meeting at our hotel. The public was invited, as well as a whole bunch of island and city officials, the mayor of Cagliari, and representatives of the Italian government. We all sat on a dais, at the side of the room, and I gave a speech there, in Italian, and when I finished everybody got up and clapped their hands and was shouting. It was very embarrassing. I started crying. I couldn't help it. Just reminiscing now . . ."
Aitken is a handsome, courtly man of eighty-eight, lean and patrician in appearance. He lives outside New Haven, in an apartment filled with art and furniture from his time in Sardinia. As he thought back to those years, there were tears in his eyes, and at that moment it was possible to appreciate the excitement that gripped malariologists in the wake of the Second World War. The old-school mosquito men called themselves mud-hen malariologists, because they did their job in swamps and ditches and stagnant pools of water. Paris green and pyrethrum were crude insecticides that had to be applied repeatedly; pyrethrum killed only those mosquitoes that happened to be in the room when you were spraying. But here, seemingly, was a clean, pure, perfectly modern weapon. You could spray a tiny amount on a wall, and that single application would kill virtually every mosquito landing on that surface for the next six months. Who needed a standing army of inspectors anymore? Who needed to slog through swamps? This was an age of heroics in medicine. Sabin and Salk were working on polio vaccines with an eye to driving that disease to extinction. Penicillin was brand new, and so effective that epidemiologists were dreaming of an America without venereal disease. The extinction of smallpox, that oldest of scourges, seemed possible. All the things that we find sinister about DDT today--the fact that it killed everything it touched, and kept on killing everything it touched--were precisely what made it so inspiring at the time. "The public-health service didn't pay us a lot," says McWilson Warren, who spent the early part of his career fighting malaria in the Malaysian jungle. "So why were we there? Because there was something so wonderful about being involved with people who thought they were doing something more important than themselves." In the middle of the war, Soper had gone to Egypt, and warned the government that it had an incipient invasion of gambiae. The government ignored him, and the next year the country was hit with an epidemic that left more than a hundred thousand dead. In his diary, Soper wrote of his subsequent trip to Egypt, "In the afternoon to the Palace where Mr. Jacobs presents me to His Majesty King Faruk. The King says that he is sorry to know that measures I suggested last year were not taken at that time." Soper had triumphed over gambiae in Brazil, driven lice from Cairo and Naples, and had a weapon, DDT, that seemed like a gift from God--and now kings were apologizing to him. Soper started to dream big: Why not try to drive malaria from the entire world?
4.
Fred Soper's big idea came to be known as the Global Malaria Eradication Programme. In the early nineteen-fifties, Soper had been instrumental in getting the Brazilian malariologist Marcolino Candau--whom he had hired during the anti-gambiae campaign of the nineteen-thirties--elected as director-general of the World Health Organization, and, in 1955, with Candau's help, Soper pushed through a program calling on all member nations to begin a rigorous assault on any malaria within their borders. Congress was lobbied, and John Kennedy, then a senator, became an enthusiastic backer. Beginning in 1958, the United States government pledged the equivalent of billions in today's dollars for malaria eradication--one of the biggest commitments that a single country has ever made to international health. The appeal of the eradication strategy was its precision. The idea was not to kill every Anopheles mosquito in a given area, as Soper had done with gambiae in Brazil. That was unnecessary. The idea was to use DDT to kill only those mosquitoes which were directly connected to the spread of malaria--only those which had just picked up the malaria parasite from an infected person and were about to fly off and infect someone else. When DDT is used for this purpose, Spielman writes in "Mosquito," "it is applied close to where people sleep, on the inside walls of houses. After biting, the mosquitoes generally fly to the nearest vertical surface and remain standing there for about an hour, anus down, while they drain the water from their gut contents and excrete it in a copious, pink-tinged stream. If the surfaces the mosquitoes repair to are coated by a poison that is soluble in the wax that covers all insects' bodies, the mosquitoes will acquire a lethal dose." Soper pointed out that people who get malaria, and survive, generally clear their bodies of the parasite after three years. If you could use spraying to create a hiatus during which minimal transmission occurred--and during which anyone carrying the parasite had a chance to defeat it--you could potentially eradicate malaria. You could stop spraying and welcome the mosquitoes back, because there would be no more malaria around for them to transmit. Soper was under no illusions about how difficult this task would be. But, according to his calculations, it was technically possible, if he and his team achieved eighty-per-cent coverage--if they sprayed eight out of every ten houses in infected areas.
Beginning in the late fifties, DDT was shipped out by the ton. Training institutes were opened. In India alone, a hundred and fifty thousand people were hired. By 1960, sixty-six nations had signed up. "What we all had was a handheld pressure sprayer of three-gallon capacity," Jesse Hobbs, who helped run the eradication effort in Jamaica in the early sixties, recalls. "Generally, we used a formulation that was water wettable, meaning you had powder you mixed with water. Then you pressurized the tank. The squad chief would usually have notified the household some days before. The instructions were to take the pictures off the wall, pull everything away from the wall. Take the food and eating utensils out of the house. The spray man would spray with an up-and-down movement--at a certain speed, according to a pattern. You started at a certain point and sprayed the walls and ceiling, then went outside to spray the eaves of the roof. A spray man could cover ten to twelve houses a day. You were using about two hundred milligrams per square foot of DDT, which isn't very much, and it was formulated in a way that you could see where you sprayed. When it dried, it left a deposit, like chalk. It had a bit of a chlorine smell. It's not perfume. It's kind of like swimming-pool water. People were told to wait half an hour for the spray to dry, then they could go back." The results were dramatic. In Taiwan, much of the Caribbean, the Balkans, parts of northern Africa, the northern region of Australia, and a large swath of the South Pacific, malaria was eliminated. Sri Lanka saw its cases drop to about a dozen every year. In India, where malaria infected an estimated seventy-five million and killed eight hundred thousand every year, fatalities had dropped to zero by the early sixties. Between 1945 and 1965, DDT saved millions--even tens of millions--of lives around the world, perhaps more than any other man-made drug or chemical before or since.
What DDT could not do, however, was eradicate malaria entirely. How could you effectively spray eighty per cent of homes in the Amazonian jungle, where communities are spread over hundreds of thousands of highly treacherous acres? Sub-Saharan Africa, the most malarious place on earth, presented such a daunting logistical challenge that the eradication campaign never really got under way there. And, even in countries that seemed highly amenable to spraying, problems arose. "The rich had houses that they didn't want to be sprayed, and they were giving bribes," says Socrates Litsios, who was a scientist with the W.H.O. for many years and is now a historian of the period. "The inspectors would try to double their spraying in the morning so they wouldn't have to carry around the heavy tanks all day, and as a result houses in the afternoon would get less coverage. And there were many instances of corruption with insecticides, because they were worth so much on the black market. People would apply diluted sprays even when they knew they were worthless." Typical of the logistical difficulties is what happened to the campaign in Malaysia. In Malaysian villages, the roofs of the houses were a thatch of palm fronds called atap. They were expensive to construct, and usually lasted five years. But within two years of DDT spraying the roofs started to fall down. As it happened, the atap is eaten by caterpillar larvae, which in turn are normally kept in check by parasitic wasps. But the DDT repelled the wasps, leaving the larvae free to devour the atap. "Then the Malaysians started to complain about bedbugs, and it turns out what normally happens is that ants like to eat bedbug larvae," McWilson Warren said. "But the ants were being killed by the DDT and the bedbugs weren't--they were pretty resistant to it. So now you had a bedbug problem." He went on, "The DDT spray teams would go into villages, and no one would be at home and the doors would be locked and you couldn't spray the house. And, understand, for that campaign to work almost every house had to be sprayed. You had to have eighty-per-cent coverage. I remember there was a malaria meeting in '62 in Saigon, and the Malaysians were saying that they could not eradicate malaria. It was not possible. And everyone was arguing with them, and they were saying, 'Look, it's not going to work.' And if Malaysia couldn't do it--and Malaysia was one of the most sophisticated places in the region--who could?"
At the same time, in certain areas DDT began to lose its potency. DDT kills by attacking a mosquito's nervous system, affecting the nerve cells so that they keep firing and the insect goes into a spasm, lurching, shuddering, and twitching before it dies. But in every population of mosquitoes there are a handful with a random genetic mutation that renders DDT nontoxic--that prevents it from binding to nerve endings. When mass spraying starts, those genetic outliers are too rare to matter. But, as time goes on, they are the only mosquitoes still breeding, and entire new generations of insects become resistant. In Greece, in the late nineteen-forties, for example, a malariologist noticed Anopheles sacharovi mosquitoes flying around a room that had been sprayed with DDT. In time, resistance began to emerge in areas where spraying was heaviest. To the malaria warriors, it was a shock. "Why should they have known?" Janet Hemingway, an expert in DDT resistance at the University of Wales in Cardiff, says. "It was the first synthetic insecticide. They just assumed that it would keep on working, and that the insects couldn't do much about it." Soper and the malariologist Paul Russell, who was his great ally, responded by pushing for an all-out war on malaria. We had to use DDT, they argued, or lose it. "If countries, due to lack of funds, have to proceed slowly, resistance is almost certain to appear and eradication will become economically impossible," Russell wrote in a 1956 report. "TIME IS OF THE ESSENCE because DDT resistance has appeared in six or seven years." But, with the administrative and logistical problems posed by the goal of eighty-per-cent coverage, that deadline proved impossible to meet.
5.
In 1963, the money from Congress ran out. Countries that had been told they could wipe out malaria in four years--and had diverted much of their health budgets to that effort--grew disillusioned as the years dragged on and eradication never materialized. Soon, they put their money back into areas that seemed equally pressing, like maternal and child health. Spraying programs were scaled back. In those countries where the disease had not been completely eliminated, malaria rates began to inch upward. In 1969, the World Health Organization formally abandoned global eradication, and in the ensuing years it proved impossible to muster any great enthusiasm from donors to fund antimalaria efforts. The W.H.O. now recommends that countries treat the disease largely through the health-care system--through elimination of the parasite--but many anti-malarial drugs are no longer effective. In the past thirty years, there have been outbreaks in India, Sri Lanka, Brazil, and South Korea, among other places. "Our troubles with mosquitoes are getting worse," Spielman concludes in "Mosquito," "making more people sick and claiming more lives, millions of lives, every year."
For Soper, the unravelling of his dream was pure torture. In 1959, he toured Asia to check on the eradication campaigns of Thailand, the Philippines, Ceylon, and India, and came back appalled at what he had seen. Again and again, he found, countries were executing his strategy improperly. They weren't spraying for long enough. They didn't realize that unless malaria was ground into submission it would come roaring back. But what could he do? He had prevailed against gambiae in Brazil in the nineteen-thirties because he had been in charge; he had worked with the country's dictator to make it illegal to prevent an inspector from entering a house, and illegal to prevent the inspector from treating any open container of water. Jesse Hobbs tells of running into Soper one day in Trinidad, after driving all day in an open jeep through the tropical heat. Soper drove up in a car and asked Hobbs to get in; Hobbs demurred, gesturing at his sweaty shirt. "Son," Soper responded, "we used to go out in a day like this in Brazil and if we found a sector chief whose shirt was not wet we'd fire him." Killing mosquitoes, Soper always said, was not a matter of knowledge and academic understanding; it was a matter of administration and discipline. "He used to say that if you have a democracy you can't have eradication," Litsios says. "When Soper was looking for a job at Johns Hopkins--this would have been '46--he told a friend that 'they turned me down because they said I was a fascist.'" Johns Hopkins was right, of course: he was a fascist--a disease fascist--because he believed a malaria warrior had to be. But now roofs were falling down in Malaysia, and inspectors were taking bribes, and local health officials did not understand the basic principles of eradication--and his critics had the audacity to blame his ideas, rather than their own weakness.
It was in this same period that Rachel Carson published "Silent Spring," taking aim at the environmental consequences of DDT. "The world has heard much of the triumphant war against disease through the control of insect vectors of infection," she wrote, alluding to the efforts of men like Soper, "but it has heard little of the other side of the story--the defeats, the short-lived triumphs that now strongly support the alarming view that the insect enemy has been made actually stronger by our efforts." There had already been "warnings," she wrote, of the problems created by pesticides:
On Nissan Island in the South Pacific, for example, spraying had been carried on intensively during the Second World War, but was stopped when hostilities came to an end. Soon swarms of a malaria-carrying mosquito reinvaded the island. All of its predators had been killed off and there had not been time for new populations to become established. The way was therefore clear for a tremendous population explosion. Marshall Laird, who had described this incident, compares chemical control to a treadmill; once we have set foot on it we are unable to stop for fear of the consequences.
It is hard to read that passage and not feel the heat of Soper's indignation. He was familiar with "Silent Spring"--everyone in the malaria world was--and what was Carson saying? Of course the mosquitoes came back when DDT spraying stopped. The question was whether the mosquitoes were gone long enough to disrupt the cycle of malaria transmission. The whole point of eradication, to his mind, was that it got you off the treadmill: DDT was so effective that if you used it properly you could stop spraying and not fear the consequences. Hadn't that happened in places like Taiwan and Jamaica and Sardinia?
"Silent Spring" was concerned principally with the indiscriminate use of DDT for agricultural purposes; in the nineteen-fifties, it was being sprayed like water in the Western countryside, in an attempt to control pests like the gypsy moth and the spruce budworm. Not all of Carson's concerns about the health effects of DDT have stood the test of time--it has yet to be conclusively linked to human illness--but her larger point was justified: DDT was being used without concern for its environmental consequences. It must have galled Soper, however, to see how Carson effectively lumped the malaria warriors with those who used DDT for economic gain. Nowhere in "Silent Spring" did Carson acknowledge that the chemical she was excoriating as a menace had, in the two previous decades, been used by malariologists to save somewhere in the vicinity of ten million lives. Nor did she make it clear how judiciously the public-health community was using the chemical. By the late fifties, health experts weren't drenching fields and streams and poisoning groundwater and killing fish. They were leaving a microscopic film on the inside walls of houses; spraying every house in a country the size of Guyana, for example, requires no more DDT in a year than a large cotton farm does. Carson quoted a housewife from Hinsdale, Illinois, who wrote about the damage left by several years of DDT spraying against bark beetles: "The town is almost devoid of robins and starlings; chickadees have not been on my shelf for two years, and this year the cardinals are gone too; the nesting population in the neighborhood seems to consist of one dove pair and perhaps one catbird family. . . . 'Will they ever come back?' [the children]ask, and I do not have the answer." Carson then quoted a bird-lover from Alabama:"There was not a sound of the song of a bird. It was eerie, terrifying. What was man doing to our perfect and beautiful world?" But to Soper the world was neither perfect nor beautiful, and the question of what man could do to nature was less critical than what nature, unimpeded, could do to man. Here, from a well-thumbed page inserted in Soper's diaries, is a description of a town in Egypt during that country's gambiae invasion of 1943--a village in the grip of its own, very different, unnatural silence:
Most houses are without roofs. They are just a square of dirty earth. In those courtyards and behind the doors of these hovels were found whole families lying on the floor; some were just too weakened by illness to get up and others were lying doubled up shaking from head to foot with their teeth chattering and their violently trembling hands trying in vain to draw some dirty rags around them for warmth. They were in the middle of the malaria crisis. There was illness in every house. There was hardly a house which had not had its dead and those who were left were living skeletons, their old clothing in rags, their limbs swollen from undernourishment and too weak to go into the fields to work or even to get food.
It must have seemed to Soper that the ground had shifted beneath his feet--that the absolutes that governed his life, that countenanced even the most extreme of measures in the fight against disease, had suddenly and bewilderingly been set aside. "I was on several groups who evaluated malaria-eradication programs in some of the Central American countries and elsewhere," Geoffrey Jeffery recalls. "Several times we came back with the answer that with the present technology and effort it wasn't going to work. Well, that didn't suit Soper very much. He harangued us. We shouldn't be saying things like that!" Wilbur Downs, a physician who worked for the Rockefeller Foundation in Mexico in the fifties, used to tell of a meeting with Soper and officials of the Mexican government about the eradication of malaria in that country. Soper had come down from Washington, and amid excited talk of ending malaria forever Downs pointed out that there were serious obstacles to eradication--among them the hastened decomposition and absorption of DDT by the clays forming adobe walls. It was all too much for Soper. This was the kind of talk that was impeding eradication--the doubting, the equivocation, the incompetence, the elevation of songbirds over human life. In the middle of the meeting, Soper--ramrod straight, eyes afire--strode over to Downs, put both his hands around his neck, and began to shake.
6.
Fred Soper ran up against the great moral of the late twentieth century--that even the best-intentioned efforts have perverse consequences, that benefits are inevitably offset by risks. This was the lesson of "Silent Spring," and it was the lesson, too, that malariologists would take from the experience with global eradication. DDT, Spielman argues, ought to be used as selectively as possible, to quell major outbreaks. "They should have had a strong rule against spraying the same villages again and again," he says. "But that went against their doctrine. They wanted eighty-per-cent coverage. They wanted eight out of ten houses year after year after year, and that's a sure formula for resistance." Soper and Russell once argued about whether, in addition to house spraying, malaria fighters should continue to drain swamps. Russell said yes; Soper said no, that it would be an unnecessary distraction. Russell was right: it made no sense to use only one weapon against malaria. Spielman points out that malaria transmission in sub-Saharan Africa is powerfully affected by the fact that so many people live in mud huts. The walls of that kind of house need to be constantly replastered, and to do that villagers dig mud holes around their huts. But a mud hole is a prime breeding spot for gambiae. If economic aid were directed at helping villagers build houses out of brick, Spielman argues, malaria could be dealt a blow. Similarly, the Princeton University malariologist Burton Singer says that since the forties it has been well known that mosquito larvae that hatch in rice fields--a major breeding site in southeast Asia--can be killed if the water level in the fields is intermittently drained, a practice that has the additional effect of raising rice yields. Are these perfect measures? No. But, under the right circumstances, they are sustainable. In a speech Soper presented on eradication, he quoted Louis Pasteur: "It is within the power of man to rid himself of every parasitic disease." The key phrase, for Soper, was "within the power." Soper believed that the responsibility of the public-health professional was to make an obligation out of what was possible. He never understood that concessions had to be made to what was practical. "This is the fundamental difference between those of us in public health who have an epidemiological perspective, and people, like Soper, with more of a medical approach," Spielman says. "We deal with populations over time, populations of individuals. They deal with individuals at a moment in time. Their best outcome is total elimination of the condition in the shortest possible period. Our first goal is to cause no outbreaks, no epidemics, to manage, to contain the infection." Bringing the absolutist attitudes of medicine to a malarious village, Spielman says, "is a good way to do a bad thing." The Fred Soper that we needed, in retrospect, was a man of more modest ambitions.
But, of course, Fred Soper with modest ambitions would not be Fred Soper; his epic achievements arose from his fanaticism, his absolutism, his commitment to saving as many lives as possible in the shortest period of time. For all the talk of his misplaced ambition, there are few people in history to whom so many owe their lives. The Global Malaria Eradication Programme helped eliminate the disease from the developed world, and from many parts of the developing world. In a number of cases where the disease returned, it came back at a lower level than it had been in the prewar years, and even in those places where eradication made little headway the campaign sometimes left in place a public infrastructure that had not existed before. The problem was that Soper had raised expectations too high. He had said that the only acceptable outcome for Global Eradication was global eradication, and when that did not happen he was judged--and, most important, he judged himself--a failure. But isn't the urgency Soper felt just what is lacking in the reasonableness of our contemporary attitude--in our caution and thoughtfulness and restraint? In the wake of the failure of eradication, it was popular to say that truly effective malaria control would have to await the development of a public-health infrastructure in poorer countries. Soper's response was, invariably: What about now? In a letter to a friend, he snapped, "The delay in handling malaria until it can be done by local health units is needlessly sacrificing the generation now living." There is something to admire in that attitude; it is hard to look at the devastation wrought by H.I.V. and malaria and countless other diseases in the Third World and not conclude that what we need, more than anything, is someone who will marshal the troops, send them house to house, monitor their every movement, direct their every success, and, should a day of indifference leave their shirts unsullied, send them packing. Toward the end of his life, Soper, who died in 1975, met with an old colleague, M. A. Farid, with whom he had fought gambiae in Egypt years before. "How do things go?" Soper began. "Bad!" Farid replied, for this was in the years when everyone had turned against Soper's vision. "Who will be our ally?" Soper asked. And Farid said simply, "Malaria," and Soper, he remembered, almost hugged him, because it was clear what Farid meant: Someday, when DDT is dead and buried, and the West wakes up to a world engulfed by malaria, we will think back on Fred Soper and wish we had another to take his place.
The Critics
July 16, 2001
"SUPER FRIENDS"
Sumner Redstone and the
rules of the corporate memoir.
1.
In the early nineteen-nineties, Sumner Redstone, the head of Viacom, wanted to merge his company with Paramount Communications. The problem was that the chairman of Paramount, Martin Davis, was being difficult. As Redstone recounts in his new autobiography, "A Passion to Win" (Simon & Schuster; $26), he and Davis would meet at, say, a charitable function, and Davis would make it sound as if the deal were imminent. Then, abruptly, he would back away. According to Redstone, Davis was a ditherer, a complicated and emotionally cold man who couldn't bear to part with his company. Yet, somehow in the course of their dealings, Redstone writes, he and Davis developed a "mutual respect and fond friendship." They became, he says a page later, "friends" who "enjoyed each other's company and were developing a close working rapport," and who had, he tells us two pages after that, "a great affection for each other." The turning point in the talks comes when Davis and Redstone have dinner in a dining room at Morgan Stanley, and Redstone is once more struck by how Davis "had a genuine affection for me." When the two have dinner again, this time at Redstone's suite in the Carlyle Hotel, Davis looks out over the spectacular lights of nighttime Manhattan and says, "You know, Sumner, when this deal gets done, they'll build a big statue of you in the middle of Central Park and I'll be forgotten." "No, Martin," Redstone replies. "They'll build statues of both of us and I will be looking up to you in admiration." Davis laughs. "It was just the right touch," Redstone reports, and one can almost imagine him at that point throwing a brawny forearm around Davis's shoulders and giving him a manly squeeze.
2.
"A Passion to Win," which Redstone wrote with Peter Knobler, is an account of a man's rise to the top of a multibillion-dollar media empire. It is the tale of the complex negotiations, blinding flashes of insight, and lengthy dinners at exclusive Manhattan hotels which created the colossus that is Viacom. But mostly it is a story about the value of friendship, about how very, very powerful tycoons like Redstone have the surprising ability to transcend their own egos and see each other, eye to eye, as human beings.
For instance, Gerald Levin, the head of Time Warner, might look like a rival of Redstone's. Not at all. He is, Redstone tells us, "a very close friend." Redstone says that he and Sherry Lansing, who heads Paramount Pictures, a division of Viacom, "are not just business associates, we are extremely close friends." So, too, with Geraldine Laybourne, who used to head Viacom's Nickelodeon division. "We were not just business associates," he writes, in the plainspoken manner that is his trademark. "We were friends." The singer Tony Bennett was one of Redstone's idols for years, and then one day Redstone's employees threw him a surprise birthday party and there was Bennett, who had come thousands of miles to sing a song. "Now," Redstone says proudly, "he is my friend." The producer Bob Evans? "A good friend." Aaron Spelling? "One of my closest friends." Bill and Hillary? "I have come to know and like the Clintons." Redstone's great friend Martin Davis warned him once about Barry Diller: "Don't trust him. He's got too big an ego." But Redstone disagreed. "Barry Diller and I were extremely friendly," he says. Ted Kennedy he met years ago, at a get-together of business executives. Everyone else was flattering Kennedy. Not so Redstone. A true friend is never disingenuous. As he recalls, he said to Kennedy,
"Look, I don't want to disagree with everybody, but, Senator, the problem is that you believe . . . that you can solve any problem by just throwing money at it. It doesn't work that way."
Conversation ceased, glances were exchanged. Everyone was appalled. Then Senator Kennedy said: "Sumner's right." . . . After that, Senator Kennedy called me regularly when he came to Boston and we developed a lasting friendship.
You might think that Redstone simply becomes friends with anyone he meets who is rich or famous. This is not the case. Once, Redstone was invited to dinner at the office of Robert Maxwell, the British press baron. It was no small matter, since Redstone is one of those moguls for whom dinner has enormous symbolic importance--it is the crucible in which friendships are forged. But Maxwell didn't show up until halfway through the meal: not a friend. On another occasion, Redstone hires a highly touted executive away from the retailing giant Wal-Mart to run his Blockbuster division, and then learns that the man is eating dinner alone in his hotel dining room and isn't inviting his fellow-executives to join him. Dinner alone? Redstone was worried. That was not friendly behavior. The executive, needless to say, was not long for Viacom.
What Redstone likes most in a friend is someone who reminds him of himself. "I respected Malone for having started with nothing and rising to become chairman of the very successful Tele-Communications, Inc.," Redstone writes of John Malone, the billionaire cable titan. "I had admired Kerkorian's success over the years," he says of Kirk Kerkorian, the billionaire corporate raider. "He started with nothing, and I have a special affection for people who start with nothing and create empires. . . . Today Kirk Kerkorian and I are friends." (They are so friendly, in fact, that they recently had a meal together at Spago.) Of his first meeting with John Antioco, whom Redstone would later hire to run Blockbuster--replacing the executive who ate dinner alone--Redstone writes, "We hit it off immediately. . . . He had come from humble beginnings, which I empathized with." Soon, the two men are dining together. Look at my life, Redstone seems to marvel again and again in "A Passion to Win." You think you see a hard-nosed mogul, selflessly wringing the last dollar out of megadeals on behalf of his shareholders. But inside that mogul beats a heart of warmth and compassion. Ever wonder how Redstone was able to pull off his recent mammoth merger with CBS? He happens to have lunch with the CBS chief, Mel Karmazin, in an exclusive Viacom corporate dining room, and discovers that he and Karmazin are kindred spirits. "Both of us had started with nothing," Redstone writes, "and ended up in control of major corporations." Can you believe it?
3.
In 1984, Lee Iacocca, the chairman of the resurgent Chrysler Corporation, published his autobiography, "Iacocca," with the writer William Novak. It was a charming book, in which Iacocca came across as a homespun, no-nonsense man of the people, and it sold more copies than any other business book in history. This was good news for Iacocca, because it made him a household name. But it was bad news for the rest of us, because it meant that an entire class of C.E.O.s promptly signed up ghostwriters and turned out memoirs designed to portray themselves as homespun, no-nonsense men of the people.
"Iacocca" began with a brief, dramatic prologue, in which he described his last day at Ford, where he had worked his entire life. He had just been fired by Henry Ford II, and it was a time of great personal crisis. "Before I left the house," he wrote, establishing the conflict between him and Henry Ford that would serve as the narrative engine of the book, "I kissed my wife, Mary, and my two daughters, Kathi and Lia. . . . Even today, their pain is what stays with me. It's like the lioness and her cubs. If the hunter knows what's good for him, he will leave the little ones alone. Henry Ford made my kids suffer, and for that I'll never forgive him." Now every C.E.O. book begins with a short, dramatic prologue, in which the author describes a day of great personal crisis that is intended to serve as the narrative engine of the book. In "Work in Progress," by Michael Eisner, Disney's C.E.O., it's the day he suffered chest pains at the Herb Allen conference in Sun Valley: "I spent much of dinner at Herb Allen's talking to Tom Brokaw, the NBC anchorman, who told me a long story about fly-fishing with his friend Robert Redford. . . . The pain in my arms returned." In "A Passion to Win," it's the day Redstone clung to a ledge during a fire at the Copley Plaza Hotel, in Boston, eventually suffering third-degree burns over forty-five per cent of his body: "The pain was excruciating but I refused to let go. That way was death."
Iacocca followed the dramatic prologue with a chapter on his humble origins. It opens, "Nicola Iacocca, my father, arrived in this country in 1902 at the age of twelve--poor, alone, and scared." Now every C.E.O. has humble origins. Then Iacocca spoke of an early mentor, a gruff, no-nonsense man who instilled lessons that guide him still. His name was Charlie Beacham, and he was "the kind of guy you'd charge up the hill for even though you knew very well you could get killed in the process. He had the rare gift of being tough and generous at the same time." Sure enough, everywhere now there are gruff, no-nonsense men instilling lessons that guide C.E.O.s to this day. ("Nobbe, who was in his sixties, was a stern disciplinarian and a tough guy who didn't take crap from anyone," writes the former Scott Paper and Sunbeam C.E.O. Al Dunlap, in his book "Mean Business." "He was always chewing me out. . . . Still, Nobbe rapidly won my undying respect and admiration because he wore his bastardness like a well-earned badge of honor.")
The legacy of "Iacocca" wouldn't matter so much if most C.E.O.s were, in fact, homespun men of the people who had gruff mentors, humble beginnings, and searing personal crises that shaped their lives and careers. But they aren't. Redstone's attempt to play the humble-beginnings card, for instance, is compromised by the fact that he didn't exactly have humble beginnings. Although his earliest years were spent in a tenement, his family's fortunes rapidly improved. He went to Harvard and Harvard Law School. His father was a highly successful businessman, and it was his father's company that served as the basis for the Viacom empire. (Just why Redstone continues to think that he comes from nothing, under the circumstances, is an interesting case study in the psychology of success: perhaps, if you are worth many billions, an upper-middle-class upbringing simply feels like nothing.) Eisner's personal crisis ends with him driving himself to Cedars-Sinai Hospital in Los Angeles--one of the best hospitals in the world--where he is immediately met by not one but two cardiologists, who take him to a third cardiologist, who tells Eisner that the procedure he is about to undergo, an angiogram, a common surgical procedure, is ninety-eight per cent safe. In "On the Firing Line," the former Apple C.E.O. Gil Amelio's day of personal crisis is triggered merely by walking down the halls of his new company: "In each of the offices near mine toiled some key executive I was just coming to know, wrestling with problems that would only gradually be revealed to me. I wondered what caged alligators they would let loose at me on some future date." Dunlap, meanwhile, tells us that one of his first acts as C.E.O. of Scott Paper was, in an orgy of unpretentiousness, to throw out the bookshelves in his office and replace them with Aboriginal paintings from Australia: "To me, the paintings made a lot more sense. They showed people who had to survive by their wits, people who couldn't call out for room service." Among Dunlap's gruff mentors was the Australian multimillionaire Kerry Packer, and one day, while playing tennis with Packer, Dunlap has a personal crisis. He pops a tendon. Packer rushes over, picks him up, and carries him to a lounge chair. "This was not only a wealthy man and a man who had political power, this was a physically powerful man," Dunlap reports. "In the end," he adds, taking Iacocca's lioness and Amelio's alligators to the next level, "Kerry and I split because we were just too similar. We were like two strong-willed, dominant animals who hunted together and brought down the biggest prey, but, when not hunting, fought each other." It is hard to read passages like these and not shudder at the prospect of the General Electric chairman Jack Welch's upcoming memoir, for which Warner Books paid a seven-milliondollar advance. Who will be tapped as the gruff mentor? What was Welch's career-altering personal crisis? What wild-animal metaphors will he employ? ("As I looked around the room, I felt like a young wildebeest being surveyed by a group of older and larger--but less nimble--wildebeests, whose superior market share and penetration of the herd were no match for my greater hunger, born of my impoverished middle-class upbringing in the jungles of suburban Boston.")
The shame of it is that many of these books could have been fascinating. Scattered throughout Eisner's "Work in Progress," for example, are numerous hints about how wonderfully weird and obsessive Eisner is. He hears that Universal is thinking of building a rival theme park four miles from Disney in Orlando, and he and his assistant climb the fence at the Universal construction site at three in the morning to check on its progress. He sneaks into performances of the musical "Beauty and the Beast" in Los Angeles at least a dozen times, and when the stage version of "The Lion King" has its first tryout, in Minneapolis, he flies there from Los Angeles half a dozen times during the course of one summer to give his "notes." When he is thinking of building Euro Disney, outside Paris, he is told that it takes half an hour to travel by Métro from the Arc de Triomphe to the end of the line, six miles from the Disney site. Eisner gets on the Métro to see for himself. He sets his watch. It takes twenty-five minutes.
By the end of the book, the truth is spilling out from under the breezy façade: Eisner is a compulsive, detail-oriented control freak. That fact, of course, says a lot about why Disney is successful. But you cannot escape the sense, while reading "Work in Progress," that you weren't supposed to reach that conclusion--that the bit about climbing the fence was supposed to be evidence of Eisner's boyish enthusiasm, and the bit about seeing "Beauty and the Beast" a dozen times was supposed to make it look as if he just loved the theatre. This is the sorry state of C.E.O. memoirs in the post-Iacocca era. It's only when they fail at their intended task that they truly succeed.
4.
"A Passion to Win" ought to have been a terrific book, because Redstone has a terrific story to tell. He graduated from Boston Latin High School with the highest grade-point average in the school's three-hundred-year history. During the war, he was a cryptographer, part of the team that successfully cracked Japanese military and diplomatic codes. After the war, he had a brilliant career as a litigator, arguing a case before the Supreme Court. The mobster Bugsy Siegel once offered him a job. Then Redstone took over his father's business, and, through a series of breathtaking acquisitions--Viacom, Paramount Communications, Blockbuster, and then CBS--turned himself, in the space of twenty years, into one of the richest men in the world.
What links all these successes, it becomes clear, is a very particular and subtle intelligence. Here, in one of the book's best passages, is Redstone's description of his dealings with Wayne Huizenga, the man from whom he bought Blockbuster. Huizenga, he writes, put together his empire by buying out local video stores around the country:
He and his Blockbuster associates would swoop in on some video guy who saw money for his store dangling from Huizenga's pockets. When negotiations came to an impasse, rather than say, "We have a problem with the proposal," and make a counteroffer, he would say, "Sorry we couldn't do a deal. Good luck to you," shake the guy's hand, pull on the leather coat and head for the elevator.
Seeing the deal about to fall apart, the video operator, who only moments before was seeing dollar signs, would run after him. "Wait, don't go. Come back. Let's talk about it." Huizenga hadn't hit the down button. He had been waiting. That's how he got his concessions.
When Redstone was negotiating for Blockbuster, Huizenga pulled the same stunt. It would be 2 a.m., Redstone says, and Huizenga would put on his coat and head for the exit. But Redstone was wise to him:
Huizenga would get to the elevator and no one would run after him. One time he waited there for fifteen minutes before it dawned on him that we weren't going to chase him. He got to his car. Nothing.
He would soon find some excuse to call--he left papers in our office--waiting for us to say, "Why don't you come back." Still, nothing. Once he was literally on his plane, perhaps even circling the neighborhood, when he phoned and said he had to be back in New York for a Merrill Lynch dinner anyway and maybe we could get together.
Redstone has a great intuitive grasp of people. He understood immediately that Huizenga was simply a bully. This kind of insight is hardly rare among people who make their living at the negotiating table. It's the skill of the poker player. But poker is a game of manipulation and exploitation--and Redstone doesn't seem to manipulate or exploit. He persuades and seduces: he would concede that your straight flush beat his three of a kind, but then, over a very long dinner at Spago, he would develop such a rapport with you that you'd willingly split the pot with him. It's no accident that, of Paramount's many suitors, Redstone won the day, because he realized that what Martin Davis needed was the assurance of friendship: he needed to hear about the two statues in Central Park, one gazing in admiration at the other. Redstone's peculiar gift also explains why he seems to have ended up as "friends" with so many of the people with whom he's done business. In Redstone's eyes, these people really are his friends. At the moment when he looked into Davis's eyes that night at the Carlyle, he absolutely believed they had a special bond--and, more important, he made Davis believe it, too. Redstone's heart happily follows his mind, and that's a powerful gift for someone whose success depends on the serial seduction of takeover targets.
Most of us, needless to say, don't think of friendships this way. Our hearts don't always follow our minds; they go off in crazy directions, and we develop loyalties that make no functional sense. But there's little of that fuzziness in Redstone's world, and perhaps that's why "A Passion to Win" is sometimes so chilling. A picture runs in the Post, Redstone tells us, that shows him walking down a street in Paris with "a beautiful woman." Phyllis, his wife of fifty-two years, files for divorce. The news hits him "like a bullet," he says. "I could not believe that she wanted to end it." It takes him only a few sentences, though, to recover from his wounds. "Of course, divorce settlement or no, my interest in Viacom's parent company, National Amusements, had been structured in such a way that events in Phyllis's and my personal life would not affect the ownership, control or management of Viacom," he assures us. Redstone says that he considered Frank Biondi, his longtime deputy at Viacom, "my friend." But one day he decides to get rid of Biondi, and immediately the gibes and cheap shots appear. Biondi is lazy. Biondi cannot negotiate deals. Biondi is not C.E.O. material. "Frank took the news calmly, almost as if he expected it," Redstone writes of the firing. "But I was shocked to learn that the first person he called was not his wife, but his lawyer to determine his rights under his contract. We were prepared to honor his contract to the fullest, so that was not an issue, but I found this implicit statement of his priorities to be revealing." What kind of person says this about a friend? Redstone aligns his passions with his interests, and when his interests change, so do his friendships.
At the very end of "A Passion to Win," Redstone recounts Viacom's merger with CBS. The deal meant that the network's C.E.O., Mel Karmazin, would come aboard as chief operating officer of Viacom. But that in turn meant that two of Redstone's most trusted executives, Tom Dooley and Philippe Dauman, would have to give up their posts as deputy chairmen. Redstone says that he was "shocked" when he was told this. Dooley and Dauman were not just business associates; they were his "close friends." Redstone says that he could not accept this, that there was "no way" he could agree to the deal if it meant losing his deputies. At this point, though, we simply don't believe him--we don't believe that someone as smart as Redstone wouldn't have realized this going into the deal with CBS, and we don't believe that Redstone's entirely instrumental friendships could possibly stand in the way of his getting bigger and richer. "A Passion to Win" would have told us much more about Redstone, and about business, if it had confronted this fact and tried to make sense of it. But Redstone is a supremely unself-conscious man, and that trait, which has served him so well in the business world, is fatal in an author. Karmazin comes. Dauman and Dooley go. Redstone moves blithely on to make new best friends.
Java Man
July 30, 2001
A CRITIC AT LARGE
How caffeine created the modern world.
1.
The original Coca-Cola was a late-nineteenth-century concoction known as Pemberton's French Wine Coca, a mixture of alcohol, the caffeine-rich kola nut, and coca, the raw ingredient of cocaine. In the face of social pressure, first the wine and then the coca were removed, leaving the more banal modern beverage in its place: carbonated, caffeinated sugar water with less kick to it than a cup of coffee. But is that the way we think of Coke? Not at all. In the nineteen-thirties, a commercial artist named Haddon Sundblom had the bright idea of posing a portly retired friend of his in a red Santa Claus suit with a Coke in his hand, and plastering the image on billboards and advertisements across the country. Coke, magically, was reborn as caffeine for children, caffeine without any of the weighty adult connotations of coffee and tea. It was--as the ads with Sundblom's Santa put it--"the pause that refreshes." It added life. It could teach the world to sing.
One of the things that have always made drugs so powerful is their cultural adaptability, their way of acquiring meanings beyond their pharmacology. We think of marijuana, for example, as a drug of lethargy, of disaffection. But in Colombia, the historian David T. Courtwright points out in "Forces of Habit" (Harvard; $24.95), "peasants boast that cannabis helps them to quita el cansancio or reduce fatigue; increase their fuerza and ánimo, force and spirit; and become incansable, tireless." In Germany right after the Second World War, cigarettes briefly and suddenly became the equivalent of crack cocaine. "Up to a point, the majority of the habitual smokers preferred to do without food even under extreme conditions of nutrition rather than to forgo tobacco," according to one account of the period. "Many housewives... bartered fat and sugar for cigarettes." Even a drug as demonized as opium has been seen in a more favorable light. In the eighteen-thirties, Franklin Delano Roosevelt's grandfather Warren Delano II made the family fortune exporting the drug to China, and Delano was able to sugarcoat his activities so plausibly that no one ever accused his grandson of being the scion of a drug lord. And yet, as Bennett Alan Weinberg and Bonnie K. Bealer remind us in their marvellous new book "The World of Caffeine" (Routledge; $27.50), there is no drug quite as effortlessly adaptable as caffeine, the Zelig of chemical stimulants.
At one moment, in one form, it is the drug of choice of café intellectuals and artists; in another, of housewives; in another, of Zen monks; and, in yet another, of children enthralled by a fat man who slides down chimneys. King Gustav III, who ruled Sweden in the latter half of the eighteenth century, was so convinced of the particular perils of coffee over all other forms of caffeine that he devised an elaborate experiment. A convicted murderer was sentenced to drink cup after cup of coffee until he died, with another murderer sentenced to a lifetime of tea drinking, as a control. (Unfortunately, the two doctors in charge of the study died before anyone else did; then Gustav was murdered; and finally the tea drinker died, at eighty-three, of old age--leaving the original murderer alone with his espresso, and leaving coffee's supposed toxicity in some doubt.) Later, the various forms of caffeine began to be divided up along sociological lines. Wolfgang Schivelbusch, in his book "Tastes of Paradise," argues that, in the eighteenth century, coffee symbolized the rising middle classes, whereas its great caffeinated rival in those years--cocoa, or, as it was known at the time, chocolate--was the drink of the aristocracy. "Goethe, who used art as a means to lift himself out of his middle class background into the aristocracy, and who as a member of a courtly society maintained a sense of aristocratic calm even in the midst of immense productivity, made a cult of chocolate, and avoided coffee," Schivelbusch writes. "Balzac, who despite his sentimental allegiance to the monarchy, lived and labored for the literary marketplace and for it alone, became one of the most excessive coffee-drinkers in history. Here we see two fundamentally different working styles and means of stimulation--fundamentally different psychologies and physiologies." Today, of course, the chief cultural distinction is between coffee and tea, which, according to a list drawn up by Weinberg and Bealer, have come to represent almost entirely opposite sensibilities:
Coffee Aspect      Tea Aspect
Male               Female
Boisterous         Decorous
Indulgence         Temperance
Hardheaded         Romantic
Topology           Geometry
Heidegger          Carnap
Beethoven          Mozart
Libertarian        Statist
Promiscuous        Pure
That the American Revolution began with the symbolic rejection of tea in Boston Harbor, in other words, makes perfect sense. Real revolutionaries would naturally prefer coffee. By contrast, the freedom fighters of Canada, a hundred years later, were most definitely tea drinkers. And where was Canada's autonomy won? Not on the blood-soaked fields of Lexington and Concord but in the genteel drawing rooms of Westminster, over a nice cup of Darjeeling and small, triangular cucumber sandwiches.
2.
All this is a bit puzzling. We don't fetishize the difference between salmon eaters and tuna eaters, or people who like their eggs sunny-side up and those who like them scrambled. So why invest so much importance in the way people prefer their caffeine? A cup of coffee has somewhere between a hundred and two hundred and fifty milligrams of caffeine; black tea brewed for four minutes has between forty and a hundred milligrams. But the disparity disappears if you consider that many tea drinkers drink from a pot, and have more than one cup. Caffeine is caffeine. "The more it is pondered," Weinberg and Bealer write, "the more paradoxical this duality within the culture of caffeine appears. After all, both coffee and tea are aromatic infusions of vegetable matter, served hot or cold in similar quantities; both are often mixed with cream or sugar; both are universally available in virtually any grocery or restaurant in civilized society; and both contain the identical psychoactive alkaloid stimulant, caffeine."
It would seem to make more sense to draw distinctions based on the way caffeine is metabolized rather than on the way it is served. Caffeine, whether it is in coffee or tea or a soft drink, moves easily from the stomach and intestines into the bloodstream, and from there to the organs, and before long has penetrated almost every cell of the body. This is the reason that caffeine is such a wonderful stimulant. Most substances can't cross the blood-brain barrier, which is the body's defensive mechanism, preventing viruses or toxins from entering the central nervous system. Caffeine does so easily. Within an hour or so, it reaches its peak concentration in the brain, and there it does a number of things--principally, blocking the action of adenosine, the neuromodulator that makes you sleepy, lowers your blood pressure, and slows down your heartbeat. Then, as quickly as it builds up in your brain and tissues, caffeine is gone--which is why it's so safe. (Caffeine in ordinary quantities has never been conclusively linked to serious illness.)
But how quickly it washes away differs dramatically from person to person. A two-hundred-pound man who drinks a cup of coffee with a hundred milligrams of caffeine will have a maximum caffeine concentration of one milligram per kilogram of body weight. A hundred-pound woman having the same cup of coffee will reach a caffeine concentration of two milligrams per kilogram of body weight, or twice as high. In addition, when women are on the Pill, the rate at which they clear caffeine from their bodies slows considerably. (Some of the side effects experienced by women on the Pill may in fact be caffeine jitters caused by their sudden inability to tolerate as much coffee as they could before.) Pregnancy reduces a woman's ability to process caffeine still further. The half-life of caffeine in an adult is roughly three and a half hours. In a pregnant woman, it's eighteen hours. (Even a four-month-old child processes caffeine more efficiently.) An average man and woman sitting down for a cup of coffee are thus not pharmaceutical equals: in effect, the woman is under the influence of a vastly more powerful drug. Given these differences, you'd think that, instead of contrasting the caffeine cultures of tea and coffee, we'd contrast the caffeine cultures of men and women.
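To make the comparison concrete, here is a minimal back-of-the-envelope sketch of the arithmetic the passage describes, using only the figures quoted above (a hundred-milligram cup, a roughly three-and-a-half-hour half-life for a typical adult, eighteen hours during pregnancy). The function names and the simple exponential-decay model are illustrative assumptions, not anything from the article.

# Back-of-the-envelope caffeine arithmetic (illustrative only; the figures
# come from the passage above, and the names are hypothetical).

LB_PER_KG = 2.20462

def peak_dose_mg_per_kg(caffeine_mg, body_weight_lb):
    """Approximate peak concentration, in mg of caffeine per kg of body weight."""
    return caffeine_mg / (body_weight_lb / LB_PER_KG)

def remaining_mg(initial_mg, hours_elapsed, half_life_hours):
    """Simple exponential decay: caffeine left after a given number of hours."""
    return initial_mg * 0.5 ** (hours_elapsed / half_life_hours)

if __name__ == "__main__":
    # A 200-lb man and a 100-lb woman drinking the same 100-mg cup of coffee.
    print(round(peak_dose_mg_per_kg(100, 200), 2))   # ~1.1 mg/kg
    print(round(peak_dose_mg_per_kg(100, 100), 2))   # ~2.2 mg/kg

    # Caffeine left after 12 hours: ordinary adult (3.5-hour half-life)
    # versus a pregnant woman (18-hour half-life, per the passage).
    print(round(remaining_mg(100, 12, 3.5), 1))       # ~9.3 mg
    print(round(remaining_mg(100, 12, 18.0), 1))      # ~63.0 mg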
3.
But we don't, and with good reason. To parse caffeine along gender lines does not do justice to its capacity to insinuate itself into every aspect of our lives, not merely to influence culture but even to create it. Take coffee's reputation as the "thinker's" drink. This dates from eighteenth-century Europe, where coffeehouses played a major role in the egalitarian, inclusionary spirit that was then sweeping the continent. They sprang up first in London, so alarming Charles II that in 1676 he tried to ban them. It didn't work. By 1700, there were hundreds of coffeehouses in London, their subversive spirit best captured by a couplet from a comedy of the period: "In a coffeehouse just now among the rabble / I bluntly asked, which is the treason table." The movement then spread to Paris, and by the end of the eighteenth century coffeehouses numbered in the hundreds--most famously, the Café de la Régence, near the Palais Royal, which counted among its customers Robespierre, Napoleon, Voltaire, Victor Hugo, Théophile Gautier, Rousseau, and the Duke of Richelieu. Previously, when men had gathered together to talk in public places, they had done so in bars, which drew from specific socioeconomic niches and, because of the alcohol they served, created a specific kind of talk. The new coffeehouses, by contrast, drew from many different classes and trades, and they served a stimulant, not a depressant. "It is not extravagant to claim that it was in these gathering spots that the art of conversation became the basis of a new literary style and that a new ideal of general education in letters was born," Weinberg and Bealer write.
It is worth noting, as well, that in the original coffeehouses nearly everyone smoked, and nicotine also has a distinctive physiological effect. It moderates mood and extends attention, and, more important, it doubles the rate of caffeine metabolism: it allows you to drink twice as much coffee as you could otherwise. In other words, the original coffeehouse was a place where men of all types could sit all day; the tobacco they smoked made it possible to drink coffee all day; and the coffee they drank inspired them to talk all day. Out of this came the Enlightenment. (The next time we so perfectly married pharmacology and place, we got Joan Baez.)
In time, caffeine moved from the café to the home. In America, coffee triumphed because of the country's proximity to the new Caribbean and Latin American coffee plantations, and the fact that throughout the nineteenth century duties were negligible. Beginning in the eighteen-twenties, Courtwright tells us, Brazil "unleashed a flood of slave-produced coffee. American per capita consumption, three pounds per year in 1830, rose to eight pounds by 1859."
What this flood of caffeine did, according to Weinberg and Bealer, was to abet the process of industrialization--to help "large numbers of people to coordinate their work schedules by giving them the energy to start work at a given time and continue it as long as necessary." Until the eighteenth century, it must be remembered, many Westerners drank beer almost continuously, even beginning their day with something called "beer soup." (Bealer and Weinberg helpfully provide the following eighteenth-century German recipe: "Heat the beer in a saucepan; in a separate small pot beat a couple of eggs. Add a chunk of butter to the hot beer. Stir in some cool beer to cool it, then pour over the eggs. Add a bit of salt, and finally mix all the ingredients together, whisking it well to keep it from curdling.") Now they began each day with a strong cup of coffee. One way to explain the industrial revolution is as the inevitable consequence of a world where people suddenly preferred being jittery to being drunk. In the modern world, there was no other way to keep up. That's what Edison meant when he said that genius was ninety-nine per cent perspiration and one per cent inspiration. In the old paradigm, working with your mind had been associated with leisure. It was only the poor who worked hard. (The quintessential pre-industrial narrative of inspiration belonged to Archimedes, who made his discovery, let's not forget, while taking a bath.) But Edison was saying that the old class distinctions no longer held true--that in the industrialized world there was as much toil associated with the life of the mind as there had once been with the travails of the body.
In the twentieth century, the professions transformed themselves accordingly: medicine turned the residency process into an ordeal of sleeplessness, the legal profession borrowed a page from the manufacturing floor and made its practitioners fill out time cards like union men. Intellectual heroics became a matter of endurance. "The pace of computation was hectic," James Gleick writes of the Manhattan Project in "Genius," his biography of the physicist Richard Feynman. "Feynman's day began at 8:30 and ended fifteen hours later. Sometimes he could not leave the computing center at all. He worked through for thirty-one hours once and the next day found that an error minutes after he went to bed had stalled the whole team. The routine allowed just a few breaks." Did Feynman's achievements reflect a greater natural talent than his less productive forebears had? Or did he just drink a lot more coffee? Paul Hoffman, in "The Man Who Loved Only Numbers," writes of the legendary twentieth-century mathematician Paul Erdös that "he put in nineteen-hour days, keeping himself fortified with 10 to 20 milligrams of Benzedrine or Ritalin, strong espresso and caffeine tablets. 'A mathematician,' Erdös was fond of saying, 'is a machine for turning coffee into theorems.'" Once, a friend bet Erdös five hundred dollars that he could not quit amphetamines for a month. Erdös took the bet and won, but, during his time of abstinence, he found himself incapable of doing any serious work. "You've set mathematics back a month," he told his friend when he collected, and immediately returned to his pills.
Erdös's unadulterated self was less real and less familiar to him than his adulterated self, and that is a condition that holds, more or less, for the rest of society as well. Part of what it means to be human in the modern age is that we have come to construct our emotional and cognitive states not merely from the inside out--with thought and intention--but from the outside in, with chemical additives. The modern personality is, in this sense, a synthetic creation: skillfully regulated and medicated and dosed with caffeine so that we can always be awake and alert and focussed when we need to be. On a bet, no doubt, we could walk away from caffeine if we had to. But what would be the point? The lawyers wouldn't make their billable hours. The young doctors would fall behind in their training. The physicists might still be stuck out in the New Mexico desert. We'd set the world back a month.
4.
That the modern personality is synthetic is, of course, a disquieting notion. When we talk of synthetic personality--or of constructing new selves through chemical means--we think of hard drugs, not caffeine. Timothy Leary used to make such claims about LSD, and the reason his revolution never took flight was that most of us found the concept of tuning in, turning on, and dropping out to be a bit creepy. Here was this shaman, this visionary--and yet, if his consciousness was so great, why was he so intent on altering it? More important, what exactly were we supposed to be tuning in to? We were given hints, with psychedelic colors and deep readings of "Lucy in the Sky with Diamonds," but that was never enough. If we are to re-create ourselves, we would like to know what we will become.
Caffeine is the best and most useful of our drugs because in every one of its forms it can answer that question precisely. It is a stimulant that blocks the action of adenosine, and comes in a multitude of guises, each with a ready-made story attached, a mixture of history and superstition and whimsy which infuses the daily ritual of adenosine blocking with meaning and purpose. Put caffeine in a red can and it becomes refreshing fun. Brew it in a teapot and it becomes romantic and decorous. Extract it from little brown beans and, magically, it is hardheaded and potent. "There was a little known Russian émigré, Trotsky by name, who during World War I was in the habit of playing chess in Vienna's Café Central every evening," Bealer and Weinberg write, in one of the book's many fascinating café yarns:
A typical Russian refugee, who talked too much but seemed utterly harmless, indeed, a pathetic figure in the eyes of the Viennese. One day in 1917 an official of the Austrian Foreign Ministry rushed into the minister's room, panting and excited, and told his chief, "Your excellency . . . Your excellency . . . Revolution has broken out in Russia." The minister, less excitable and less credulous than his official, rejected such a wild claim and retorted calmly, "Go away . . . Russia is not a land where revolutions break out. Besides, who on earth would make a revolution in Russia? Perhaps Herr Trotsky from the Café Central?"
The minister should have known better. Give a man enough coffee and he's capable of anything.
Drugstore Athlete
September 10, 2001
THE SPORTING SCENE
To beat the competition, first you have to beat the drug test.
1.
At the age of twelve, Christiane Knacke-Sommer was plucked from a small town in Saxony to train with the elite SC Dynamo swim club, in East Berlin. After two years of steady progress, she was given regular injections and daily doses of small baby-blue pills, which she was required to take in the presence of a trainer. Within weeks, her arms and shoulders began to thicken. She developed severe acne. Her pubic hair began to spread over her abdomen. Her libido soared out of control. Her voice turned gruff. And her performance in the pool began to improve dramatically, culminating in a bronze medal in the hundred-metre butterfly at the 1980 Moscow Olympics. But then the Wall fell and the truth emerged about those little blue pills. In a new book about the East German sports establishment, "Faust's Gold," Steven Ungerleider recounts the moment in 1998 when Knacke-Sommer testified in Berlin at the trial of her former coaches and doctors:
"Did defendant Gläser or defendant Binus ever tell you that the blue pills were the anabolic steroid known as Oral-Turinabol?" the prosecutor asked. "They told us they were vitamin tablets," Christiane said, "just like they served all the girls with meals." "Did defendant Binus ever tell you the injection he gave was Depot-Turinabol?" "Never," Christiane said, staring at Binus until the slight, middle-aged man looked away. "He said the shots were another kind of vitamin." "He never said he was injecting you with the male hormone testosterone?" the prosecutor persisted. "Neither he nor Herr Gläser ever mentioned Oral-Turinabol or Depot-Turinabol," Christiane said firmly. "Did you take these drugs voluntarily?" the prosecutor asked in a kindly tone. "I was fifteen years old when the pills started," she replied, beginning to lose her composure. "The training motto at the pool was, 'You eat the pills, or you die.' It was forbidden to refuse."
As her testimony ended, Knacke-Sommer pointed at the two defendants and shouted, "They destroyed my body and my mind!" Then she rose and threw her Olympic medal to the floor.
Anabolic steroids have been used to enhance athletic performance since the early sixties, when an American physician gave the drugs to three weight lifters, who promptly jumped from mediocrity to world records. But no one ever took the use of illegal drugs quite so far as the East Germans. In a military hospital outside the former East Berlin, in 1991, investigators discovered a ten-volume archive meticulously detailing every national athletic achievement from the mid-sixties to the fall of the Berlin Wall, each entry annotated with the name of the drug and the dosage given to the athlete. An average teen-age girl naturally produces somewhere around half a milligram of testosterone a day. The East German sports authorities routinely prescribed steroids to young adolescent girls in doses of up to thirty-five milligrams a day. As the investigation progressed, former female athletes, who still had masculinized physiques and voices, came forward with tales of deformed babies, inexplicable tumors, liver dysfunction, internal bleeding, and depression. German prosecutors handed down hundreds of indictments of former coaches, doctors, and sports officials, and won numerous convictions. It was the kind of spectacle that one would have thought would shock the sporting world. Yet it didn't. In a measure of how much the use of drugs in competitive sports has changed in the past quarter century, the trials caused barely a ripple.
Today, coaches no longer have to coerce athletes into taking drugs. Athletes take them willingly. The drugs themselves are used in smaller doses and in creative combinations, leaving few telltale physical signs, and drug testers concede that it is virtually impossible to catch all the cheaters, or even, at times, to do much more than guess when cheating is taking place. Among the athletes, meanwhile, there is growing uncertainty about what exactly is wrong with doping. When the cyclist Lance Armstrong asserted last year, after his second consecutive Tour de France victory, that he was drug-free, some doubters wondered whether he was lying, and others simply assumed he was, and wondered why he had to. The moral clarity of the East German scandal -- with its coercive coaches, damaged athletes, and corrupted competitions--has given way to shades of gray. In today's climate, the most telling moment of the East German scandal was not Knacke-Sommer's outburst. It was when one of the system's former top officials, at the beginning of his trial, shrugged and quoted Brecht: "Competitive sport begins where healthy sport ends."
2.
Perhaps the best example of how murky the drug issue has become is the case of Ben Johnson, the Canadian sprinter who won the one hundred metres at the Seoul Olympics, in 1988. Johnson set a new world record, then failed a post-race drug test and was promptly stripped of his gold medal and suspended from international competition. No athlete of Johnson's calibre has ever been exposed so dramatically, but his disgrace was not quite the victory for clean competition that it appeared to be.
Johnson was part of a group of world-class sprinters based in Toronto in the nineteen-seventies and eighties and trained by a brilliant coach named Charlie Francis. Francis was driven and ambitious, eager to give his athletes the same opportunities as their competitors from the United States and Eastern Europe, and in 1979 he began discussing steroids with one of his prize sprinters, Angella Taylor. Francis felt that Taylor had the potential that year to run the two hundred metres in close to 22.90 seconds, a time that would put her within striking distance of the two best sprinters in the world, Evelyn Ashford, of the United States, and Marita Koch, of East Germany. But, seemingly out of nowhere, Ashford suddenly improved her two-hundred-metre time by six-tenths of a second. Then Koch ran what Francis calls, in his autobiography, "Speed Trap," a "science fictional" 21.71. In the sprints, individual improvements are usually measured in hundredths of a second; athletes, once they have reached their early twenties, typically improve their performance in small, steady increments, as experience and strength increase. But these were quantum leaps, and to Francis the explanation was obvious. "Angella wasn't losing ground because of a talent gap," he writes; "she was losing because of a drug gap, and it was widening by the day." (In the case of Koch, at least, he was right. In the East German archives, investigators found a letter from Koch to the director of research at V.E.B. Jenapharm, an East German pharmaceutical house, in which she complained, "My drugs were not as potent as the ones that were given to my opponent Bärbel Eckert, who kept beating me." In East Germany, Ungerleider writes, this particular complaint was known as "dope-envy.") Later, Francis says, he was confronted at a track meet by Brian Oldfield, then one of the world's best shot-putters:
"When are you going to start getting serious?" he demanded. "When are you going to tell your guys the facts of life?" I asked him how he could tell they weren't already using steroids. He replied that the muscle density just wasn't there. "Your guys will never be able to compete against the Americans--their careers will be over," he persisted.
Among world-class athletes, the lure of steroids is not that they magically transform performance--no drug can do that--but that they make it possible to train harder. An aging baseball star, for instance, may realize that what he needs to hit a lot more home runs is to double the intensity of his weight training. Ordinarily, this might actually hurt his performance. "When you're under that kind of physical stress," Charles Yesalis, an epidemiologist at Pennsylvania State University, says, "your body releases corticosteroids, and when your body starts making those hormones at inappropriate times it blocks testosterone. And instead of being anabolic--instead of building muscle--corticosteroids are catabolic. They break down muscle. That's clearly something an athlete doesn't want." Taking steroids counteracts the impact of corticosteroids and helps the body bounce back faster. If that home-run hitter was taking testosterone or an anabolic steroid, he'd have a better chance of handling the extra weight training.
It was this extra training that Francis and his sprinters felt they needed to reach the top. Angella Taylor was the first to start taking steroids. Ben Johnson followed in 1981, when he was twenty years old, beginning with a daily dose of five milligrams of the steroid Dianabol, in three-week on-and-off cycles. Over time, that protocol grew more complex. In 1984, Taylor visited a Los Angeles doctor, Robert Kerr, who was famous for his willingness to provide athletes with pharmacological assistance. He suggested that the Canadians use human growth hormone, the pituitary extract that promotes lean muscle and that had become, in Francis's words, "the rage in elite track circles." Kerr also recommended three additional substances, all of which were believed to promote the body's production of growth hormone: the amino acids arginine and ornithine and the dopamine precursor L-dopa. "I would later learn," Francis writes, "that one group of American women was using three times as much growth hormone as Kerr had suggested, in addition to 15 milligrams per day of Dianabol, another 15 milligrams of Anavar, large amounts of testosterone, and thyroxine, the synthetic thyroid hormone used by athletes to speed the metabolism and keep people lean." But the Canadians stuck to their initial regimen, making only a few changes: Vitamin B12, a non-steroidal muscle builder called inosine, and occasional shots of testosterone were added; Dianabol was dropped in favor of a newer steroid called Furazabol; and L-dopa, which turned out to cause stiffness, was replaced with the blood-pressure drug Dixarit.
Going into the Seoul Olympics, then, Johnson was a walking pharmacy. But--and this is the great irony of his case--none of the drugs that were part of his formal pharmaceutical protocol resulted in his failed drug test. He had already reaped the benefit of the steroids in intense workouts leading up to the games, and had stopped Furazabol and testosterone long enough in advance that all traces of both supplements should have disappeared from his system by the time of his race--a process he sped up by taking the diuretic Moduret. Human growth hormone wasn't--and still isn't--detectable by a drug test, and arginine, ornithine, and Dixarit were legal. Johnson should have been clean. The most striking (and unintentionally hilarious) moment in "Speed Trap" comes when Francis describes his bewilderment at being informed that his star runner had failed a drug test--for the anabolic steroid stanozolol. "I was floored," Francis writes:
To my knowledge, Ben had never injected stanozolol. He occasionally used Winstrol, an oral version of the drug, but for no more than a few days at a time, since it tended to make him stiff. He'd always discontinued the tablets at least six weeks before a meet, well beyond the accepted "clearance time." . . . After seven years of using steroids, Ben knew what he was doing. It was inconceivable to me that he might take stanozolol on his own and jeopardize the most important race of his life.
Francis suggests that Johnson's urine sample might have been deliberately contaminated by a rival, a charge that is less preposterous than it sounds. Documents from the East German archive show, for example, that in international competitions security was so lax that urine samples were sometimes switched, stolen from a "clean" athlete, or simply "borrowed" from a noncompetitor. "The pure urine would either be infused by a catheter into the competitor's bladder (a rather painful procedure) or be held in condoms until it was time to give a specimen to the drug control lab," Ungerleider writes. (The top East German sports official Manfred Höppner was once in charge of urine samples at an international weight-lifting competition. When he realized that several of his weight lifters would not pass the test, he broke open the seal of their specimens, poured out the contents, and, Ungerleider notes, "took a nice long leak of pure urine into them.") It is also possible that Johnson's test was simply botched. Two years later, in 1990, track and field's governing body claimed that Butch Reynolds, the world's four-hundred-metre record holder, had tested positive for the steroid nandrolone, and suspended him for two years. It did so despite the fact that half of his urine-sample data had been misplaced, that the testing equipment had failed during analysis of the other half of his sample, and that the lab technician who did the test identified Sample H6 as positive--and Reynolds's sample was numbered H5. Reynolds lost the prime years of his career.
We may never know what really happened with Johnson's assay, and perhaps it doesn't much matter. He was a doper. But clearly this was something less than a victory for drug enforcement. Here was a man using human growth hormone, Dixarit, inosine, testosterone, and Furazabol, and the only substance that the testers could find in him was stanozolol--which may have been the only illegal drug that he hadn't used. Nor is it encouraging that Johnson was the only prominent athlete caught for drug use in Seoul. It is hard to believe, for instance, that the sprinter Florence Griffith Joyner, the star of the Seoul games, was clean. Before 1988, her best times in the hundred metres and the two hundred metres were, respectively, 10.96 and 21.96. In 1988, a suddenly huskier FloJo ran 10.49 and 21.34, times that no runner since has even come close to equalling. In other words, at the age of twenty-eight--when most athletes are beginning their decline--Griffith Joyner transformed herself in one season from a career-long better-than-average sprinter to the fastest female sprinter in history. Of course, FloJo never failed a drug test. But what does that prove? FloJo went on to make a fortune as a corporate spokeswoman. Johnson's suspension cost him an estimated twenty-five million dollars in lost endorsements. The real lesson of the Seoul Olympics may simply have been that Johnson was a very unlucky man.
3.
The basic problem with drug testing is that testers are always one step behind athletes. It can take years for sports authorities to figure out what drugs athletes are using, and even longer to devise effective means of detecting them. Anabolic steroids weren't banned by the International Olympic Committee until 1975, almost a decade after the East Germans started using them. In 1996, at the Atlanta Olympics, five athletes tested positive for what we now know to be the drug Bromantan, but they weren't suspended, because no one knew at the time what Bromantan was. (It turned out to be a Russian-made psycho-stimulant.) Human growth hormone, meanwhile, has been around for twenty years, and testers still haven't figured out how to detect it.
Perhaps the best example of the difficulties of drug testing is testosterone. It has been used by athletes to enhance performance since the fifties, and the International Olympic Committee announced that it would crack down on testosterone supplements in the early nineteen-eighties. This didn't mean that the I.O.C. was going to test for testosterone directly, though, because the testosterone that athletes were getting from a needle or a pill was largely indistinguishable from the testosterone they produce naturally. What was proposed, instead, was to compare the level of testosterone in urine with the level of another hormone, epitestosterone, to determine what's called the T/E ratio. For most people, under normal circumstances, that ratio is 1:1, and so the theory was that if testers found a lot more testosterone than epitestosterone it would be a sign that the athlete was cheating. Since a small number of people have naturally high levels of testosterone, the I.O.C. avoided the risk of falsely accusing anyone by setting the legal limit at 6:1.
Did this stop testosterone use? Not at all. Through much of the eighties and nineties, most sports organizations conducted their drug testing only at major competitions. Athletes taking testosterone would simply do what Johnson did, and taper off their use in the days or weeks prior to those events. So sports authorities began randomly showing up at athletes' houses or training sites and demanding urine samples. To this, dopers responded by taking extra doses of epitestosterone with their testosterone, so their T/E would remain in balance. Testers, in turn, began treating elevated epitestosterone levels as suspicious, too. But that still left athletes with the claim that they were among the few with naturally elevated testosterone. Testers, then, were forced to take multiple urine samples, measuring an athlete's T/E ratio over several weeks. Someone with a naturally elevated T/E ratio will have fairly consistent ratios from week to week. Someone who is doping will have telltale spikes--times immediately after taking shots or pills when the level of the hormone in his blood soars. Did all these precautions mean that cheating stopped? Of course not. Athletes have now switched from injection to transdermal testosterone patches, which administer a continuous low-level dose of the hormone, smoothing over the old, incriminating spikes. The patch has another advantage: once you take it off, your testosterone level will drop rapidly, returning to normal, depending on the dose and the person, in as little as an hour. "It's the peaks that get you caught," says Don Catlin, who runs the U.C.L.A. Olympic Analytical Laboratory. "If you took a pill this morning and an unannounced test comes this afternoon, you'd better have a bottle of epitestosterone handy. But, if you are on the patch and you know your own pharmacokinetics, all you have to do is pull it off." In other words, if you know how long it takes for you to get back under the legal limit and successfully stall the test for that period, you can probably pass the test. And if you don't want to take that chance, you can just keep your testosterone below 6:1, which, by the way, still provides a whopping performance benefit. "The bottom line is that only careless and stupid people ever get caught in drug tests," Charles Yesalis says. "The élite athletes can hire top medical and scientific people to make sure nothing bad happens, and you can't catch them."
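For readers who want the screening logic spelled out, here is a minimal Python sketch of the two ideas described above: the single-sample 6:1 cutoff, and the longitudinal check for telltale spikes. The 6:1 threshold comes from the passage; the function names, the spike heuristic, and the numbers in the example runs are illustrative assumptions, not any testing authority's actual procedure.

# A minimal sketch of the T/E screening logic described above (not any real
# lab's protocol): a single-sample check against the 6:1 limit, plus the
# longitudinal idea of flagging spikes across repeated samples.

T_E_LIMIT = 6.0  # legal testosterone/epitestosterone ratio, per the passage

def single_sample_flagged(testosterone, epitestosterone):
    """Flag a single urine sample whose T/E ratio exceeds the 6:1 limit."""
    return (testosterone / epitestosterone) > T_E_LIMIT

def longitudinal_flagged(ratios, spike_factor=2.0):
    """Flag a series of weekly T/E ratios that shows a telltale spike:
    any reading far above the athlete's own typical (median-ish) level."""
    baseline = sorted(ratios)[len(ratios) // 2]
    return any(r > spike_factor * baseline for r in ratios)

if __name__ == "__main__":
    # Naturally elevated but consistent ratios: not flagged longitudinally.
    print(longitudinal_flagged([5.5, 5.8, 5.6, 5.7]))   # False
    # A ratio that spikes right after a shot or pill: flagged.
    print(longitudinal_flagged([1.1, 1.0, 5.9, 1.2]))   # True
    print(single_sample_flagged(18.0, 1.0))             # True (an 18:1 sample)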
4.
But here is where the doping issue starts to get complicated, for there's a case to be made that what looks like failure really isn't--that regulating aggressive doping, the way the 6:1 standard does, is a better idea than trying to prohibit drug use. Take the example of erythropoietin, or EPO. EPO is a hormone released by your kidneys that stimulates the production of red blood cells, the body's oxygen carriers. A man-made version of the hormone is given to those with suppressed red-blood-cell counts, like patients undergoing kidney dialysis or chemotherapy. But over the past decade it has also become the drug of choice for endurance athletes, because its ability to increase the amount of oxygen that the blood can carry to the muscles has the effect of postponing fatigue. "The studies that have attempted to estimate EPO's importance say it's worth about a three-, four-, or five-per-cent advantage, which is huge," Catlin says. EPO also has the advantage of being a copy of a naturally occurring substance, so it's very hard to tell if someone has been injecting it. (A cynic would say that this had something to do with the spate of remarkable times in endurance races during that period.)
So how should we test for EPO? One approach, which was used in the late nineties by the International Cycling Union, is a test much like the T/E ratio for testosterone. The percentage of your total blood volume which is taken up by red blood cells is known as your hematocrit. The average adult male has a hematocrit of between thirty-eight and forty-four per cent. Since 1995, the cycling authorities have declared that any rider who had a hematocrit above fifty per cent would be suspended--a deliberately generous standard (like the T/E ratio) meant to avoid falsely accusing someone with a naturally high hematocrit. The hematocrit rule also had the benefit of protecting athletes' health. If you take too much EPO, the profusion of red blood cells makes the blood sluggish and heavy, placing enormous stress on the heart. In the late eighties, at least fifteen professional cyclists died from suspected EPO overdoses. A fifty-per-cent hematocrit limit is below the point at which EPO becomes dangerous.
But, like the T/E standard, the hematocrit standard had a perverse effect: it set the legal limit so high that it actually encouraged cyclists to titrate their drug use up to that limit. After all, if you are riding for three weeks through the mountains of France and Spain, there's a big difference between a hematocrit of forty-four per cent and one of 49.9 per cent. This is why Lance Armstrong faced so many hostile questions about EPO from the European press--and why eyebrows were raised at his five-year relationship with an Italian doctor who was thought to be an expert on performance-enhancing drugs. If Armstrong had, say, a hematocrit of forty-four per cent, the thinking went, why wouldn't he have raised it to 49.9, particularly since the rules (at least, in 2000) implicitly allowed him to do so? And, if he didn't, how on earth did he win?
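A minimal sketch of the hematocrit rule as described here, assuming nothing beyond the numbers in the passage (a thirty-eight-to-forty-four-per-cent normal range and a fifty-per-cent ceiling); it simply shows why a rider boosted to 49.9 per cent passes while one at 50.5 does not. The names are hypothetical, and this is not the cycling union's actual procedure.

# Illustrative sketch of the blunt hematocrit threshold discussed above.

HEMATOCRIT_LIMIT = 50.0       # per cent, from the passage
NORMAL_RANGE = (38.0, 44.0)   # typical adult male, from the passage

def rider_suspended(hematocrit_percent):
    """Apply the threshold test: above 50 per cent means suspension."""
    return hematocrit_percent > HEMATOCRIT_LIMIT

if __name__ == "__main__":
    print(rider_suspended(44.0))   # False: within the normal range
    print(rider_suspended(49.9))   # False: boosted right up to the limit, still legal
    print(rider_suspended(50.5))   # True: over the ceiling, suspended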
The problems with hematocrit testing have inspired a second strategy, which was used on a limited basis at the Sydney Olympics and this summer's World Track and Field Championships. This test measures a number of physiological markers of EPO use, including the presence of reticulocytes, which are the immature red blood cells produced in large numbers by EPO injections. If you have a lot more reticulocytes than normal, then there's a good chance you've used EPO recently. The blood work is followed by a confirmatory urinalysis. The test has its weaknesses. It's really only useful in picking up EPO used in the previous week or so, whereas the benefits of taking the substance persist for a month. But there's no question that, if random EPO testing were done aggressively in the weeks leading to a major competition, it would substantially reduce cheating.
On paper, this second strategy sounds like a better system. But there's a perverse effect here as well. By discouraging EPO use, the test is simply pushing savvy athletes toward synthetic compounds called hemoglobin-based oxygen carriers, which serve much the same purpose as EPO but for which there is no test at the moment. "I recently read off a list of these new blood-oxygen expanders to a group of toxicologists, and none had heard of any of them," Yesalis says. "That's how fast things are moving." The attempt to prevent EPO use actually promotes inequity: it gives an enormous advantage to those athletes with the means to keep up with the next wave of pharmacology. By contrast, the hematocrit limit, though more permissive, creates a kind of pharmaceutical parity. The same is true of the T/E limit. At the 1986 world swimming championships, the East German Kristin Otto set a world record in the hundred-metre freestyle, with an extraordinary display of power in the final leg of the race. According to East German records, on the day of her race Otto had a T/E ratio of 18:1. Testing can prevent that kind of aggressive doping; it can insure no one goes above 6:1. That is a less than perfect outcome, of course, but international sports is not a perfect world. It is a place where Ben Johnson is disgraced and FloJo runs free, where Butch Reynolds is barred for two years and East German coaches pee into cups--and where athletes without access to the cutting edge of medicine are condemned to second place. Since drug testers cannot protect the purity of sport, the very least they can do is to make sure that no athlete can cheat more than any other.
5.
The first man to break the four-minute mile was the Englishman Roger Bannister, on a windswept cinder track at Oxford, nearly fifty years ago. Bannister is in his early seventies now, and one day last summer he returned to the site of his historic race along with the current world-record holder in the mile, Morocco's Hicham El Guerrouj. The two men chatted and compared notes and posed for photographs. "I feel as if I am looking at my mirror image," Bannister said, indicating El Guerrouj's similarly tall, high-waisted frame. It was a polite gesture, an attempt to suggest that he and El Guerrouj were part of the same athletic lineage. But, as both men surely knew, nothing could be further from the truth.
Bannister was a medical student when he broke the four-minute mile in 1954. He did not have time to train every day, and when he did he squeezed in his running on his hour-long midday break at the hospital. He had no coach or trainer or entourage, only a group of running partners who called themselves "the Paddington lunch time club." In a typical workout, they might run ten consecutive quarter miles--ten laps--with perhaps two minutes of recovery between each repetition, then gobble down lunch and hurry back to work. Today, that training session would be considered barely adequate for a high-school miler. A month or so before his historic mile, Bannister took a few days off to go hiking in Scotland. Five days before he broke the four-minute barrier, he stopped running entirely, in order to rest. The day before the race, he slipped and fell on his hip while working in the hospital. Then he ran the most famous race in the history of track and field. Bannister was what runners admiringly call an "animal," a natural.
El Guerrouj, by contrast, trains five hours a day, in two two-and-a-half-hour sessions. He probably has a team of half a dozen people working with him: at the very least, a masseur, a doctor, a coach, an agent, and a nutritionist. He is not in medical school. He does not go hiking in rocky terrain before major track meets. When Bannister told him, last summer, how he had prepared for his four-minute mile, El Guerrouj was stunned. "For me, a rest day is perhaps when I train in the morning and spend the afternoon at the cinema," he said. El Guerrouj certainly has more than his share of natural ability, but his achievements are a reflection of much more than that: of the fact that he is better coached and better prepared than his opponents, that he trains harder and more intelligently, that he has found a way to stay injury free, and that he can recover so quickly from one day of five-hour workouts that he can follow it, the next day, with another five-hour workout.
Of these two paradigms, we have always been much more comfortable with the first: we want the relation between talent and achievement to be transparent, and we worry about the way ability is now so aggressively managed and augmented. Steroids bother us because they violate the honesty of effort: they permit an athlete to train too hard, beyond what seems reasonable. EPO fails the same test. For years, athletes underwent high-altitude training sessions, which had the same effect as EPO--promoting the manufacture of additional red blood cells. This was considered acceptable, while EPO is not, because we like to distinguish between those advantages which are natural or earned and those which come out of a vial.
Even as we assert this distinction on the playing field, though, we defy it in our own lives. We have come to prefer a world where the distractable take Ritalin, the depressed take Prozac, and the unattractive get cosmetic surgery to a world ruled, arbitrarily, by those fortunate few who were born focussed, happy, and beautiful. Cosmetic surgery is not "earned" beauty, but then natural beauty isn't earned, either. One of the principal contributions of the late twentieth century was the moral deregulation of social competition--the insistence that advantages derived from artificial and extraordinary intervention are no less legitimate than the advantages of nature. All that athletes want, for better or worse, is the chance to play by those same rules.
Operation Rescue
September 17, 2001
COMMENT
One of the most striking aspects of the automobile industry is the precision with which it makes calculations of life and death. The head restraint on the back of a car seat has been determined to reduce an occupant's risk of dying in an accident by 0.36 per cent. The steel beams in a car's side doors cut fatalities by 1.7 per cent. The use of a seat belt in a right-front collision reduces the chances of a front-seat passenger's being killed through ejection by fourteen per cent, with a margin of error of plus or minus one per cent. When auto engineers discuss these numbers, they use detailed charts and draw curves on quadrille paper, understanding that it is through the exact and dispassionate measurement of fatality effects and the resulting technical tinkering that human lives are saved. They could wax philosophical about the sanctity of life, but what would that accomplish? Sometimes progress in matters of social policy occurs when the moralizers step back and the tinkerers step forward. In the face of the right-to-life debate in the country and show trials like the Bush Administration's recent handling of the stem-cell controversy, it's worth wondering what would happen if those involved in that debate were to learn the same lesson.
Suppose, for example, that, instead of focussing on the legality of abortion, we focussed on the number of abortions in this country. That's the kind of thing that tinkerers do: they focus not on the formal status of social phenomena but on their prevalence. And the prevalence of abortion in this country is striking. In 1995, for example, American adolescents terminated pregnancies at a rate roughly a third greater than their Canadian, English, and Swedish counterparts, around triple that of French teen-agers, and six times that of Dutch and Italian adolescents.
This is not because abortions are more readily available in America. The European countries with the lowest abortion rates are almost all places where abortions are easier to get than they are in the United States. And it's not because pregnant European teen-agers are more likely to carry a child to term than Americans. (If anything, the opposite is true.) Nor is it because American teen-agers have more sex than Europeans: sexual behavior, in the two places, appears to be much the same. American teen-agers have more abortions because they get pregnant more than anyone else: they simply don't use enough birth control.
Bringing the numbers down is by no means an insurmountable problem. Many Western European countries managed to reduce birth rates among teen-agers by more than seventy per cent between 1970 and 1995, and reproductive-health specialists say that there's no reason we couldn't follow suit. Since the early nineteen-seventies, for instance, the federal Title X program has funded thousands of family-planning clinics around the country, and in the past twenty years the program has been responsible for preventing an estimated nine million abortions. It could easily be expanded. There is also solid evidence that a comprehensive, national sex-education curriculum could help to reduce unintended pregnancies still further. If these steps succeeded in bringing our teen-age-pregnancy rates into line with those in Canada and England, the number of abortions in this country could drop by about five hundred thousand a year. For those who believe that a fetus is a human being, this is like saying that if we could find a few hundred million dollars, and face the fact that, yes, teen-agers have sex, we could save the equivalent of the population of Arizona within a decade.
But this is not, unfortunately, the way things are viewed in Washington. Since the eighties, Title X has been under constant attack. Taking inflation into account, its level of funding is now about sixty per cent lower than it was twenty years ago, and the Bush Administration's budget appropriation does little to correct that shortfall. As for sex education, the President's stated preference is that a curriculum instructing teen-agers to abstain from sex be given parity with forms of sex education that mention the option of contraception. The chief distinguishing feature of abstinence-only programs is that there's no solid evidence that they do any good. The right's squeamishness about sex has turned America into the abortion capital of the West.
But, then, this is the same movement that considered Ronald Reagan to be an ally and Bill Clinton a foe. And what does the record actually show? In the eight years of President Reagan's Administration, there was an average of 1.6 million abortions a year; by the end of President Clinton's first term, when the White House was much more favorably disposed toward the kinds of policies that are now anathema in Washington, that annual figure had dropped by more than two hundred thousand. A tinkerer would look at those numbers and wonder whether we need a new definition of "pro-life."
Safety in the Skies
October 1, 2001
ANNALS OF AVIATION
How far can airline safety go?
1.
On November 24, 1971, a man in a dark suit, white shirt, and sunglasses bought a ticket in the name of Dan Cooper on the 2:50 P.M. Northwest Orient flight from Portland to Seattle. Once aboard the plane, he passed a note to a flight attendant. He was carrying a bomb, he said, and he wanted two hundred thousand dollars, four parachutes, and "no funny stuff." In Seattle, the passengers and flight attendants were allowed to leave, and the F.B.I. handed over the parachutes and the money in used twenty-dollar bills. Cooper then told the pilot to fly slowly at ten thousand feet in the direction of Nevada, and not long after takeoff, somewhere over southwest Washington, he gathered up the ransom, lowered the plane's back stairs, and parachuted into the night.
In the aftermath of Cooper's leap, "para-jacking," as it was known, became an epidemic in American skies. Of the thirty-one hijackings in the United States the following year, nineteen were attempts at Cooper-style extortion, and in fifteen of those cases the hijackers demanded parachutes so that they, too, could leap to freedom. It was a crime wave unlike any America had seen, and in response Boeing installed a special latch on its 727 model which prevented the tail stairs from being lowered in flight. The latch was known as the Cooper Vane, and it seemed, at the time, to be an effective response to the reign of terror in the skies. Of course, it was not. The Cooper Vane just forced hijackers to come up with ideas other than parachuting out of planes.
This is the great paradox of law enforcement. The better we are at preventing and solving the crimes before us, the more audacious criminals become. Put alarms and improved locks on cars, and criminals turn to the more dangerous sport of carjacking. Put guards and bulletproof screens in banks, and bank robbery gets taken over by high-tech hackers. In the face of resistance, crime falls in frequency but rises in severity, and few events better illustrate this tradeoff than the hijackings of September 11th. The way in which those four planes were commandeered that Tuesday did not simply reflect a failure of our security measures; it reflected their success. When you get very good at cracking down on ordinary hijacking -- when you lock the stairs at the back of the aircraft with a Cooper Vane -- what you are left with is extraordinary hijacking.
2.
The first serious push for airport security began in late 1972, in the wake of a bizarre hijacking of a DC-9 flight out of Birmingham, Alabama. A group of three men -- one an escaped convict and two awaiting trial for rape -- demanded a ransom of ten million dollars and had the pilot circle the Oak Ridge, Tennessee, nuclear facility for five hours, threatening to crash the plane if their demands were not met. Until that point, security at airports had been minimal, but, as the director of the Federal Aviation Administration said at the time, "The Oak Ridge odyssey has cleared the air." In December of that year, the airlines were given sixty days to post armed security officers at passenger-boarding checkpoints. On January 5, 1973, all passengers and all carry-on luggage were required by law to be screened, and X-ray machines and metal detectors began to be installed in airports.
For a time, the number of hijackings dropped significantly. But it soon became clear that the battle to make flying safer was only beginning. In the 1985 hijacking of TWA Flight 847 out of Athens -- which lasted seventeen days -- terrorists bypassed the X-ray machines and the metal detectors by using members of the cleaning staff to stash guns and grenades in a washroom of the plane. In response, the airlines started to require background checks and accreditation of ground crews. In 1986, El Al security officers at London's Heathrow Airport found ten pounds of high explosives in the luggage of an unwitting and pregnant Irish girl, which had been placed there by her Palestinian boyfriend. Now all passengers are asked if they packed their bags themselves. In a string of bombings in the mid-eighties, terrorists began checking explosives-filled bags onto planes without boarding the planes themselves. Airlines responded by introducing "bag matching" on international flights -- stipulating that no luggage can be loaded on a plane unless its owner is on board as well. As an additional safety measure, the airlines started X-raying and searching checked bags for explosives. But in the 1988 bombing of Pan Am Flight 103 over Lockerbie, Scotland, terrorists beat that system by hiding plastic explosives inside a radio. As a result, the airlines have now largely switched to using CT scanners, a variant of the kind used in medical care, which take a three-dimensional picture of the interior of every piece of luggage and screen it with pattern-recognition software. The days when someone could stroll onto a plane with a bag full of explosives are long gone.
3.
These are the security obstacles that confront terrorists planning an attack on an airline. They can't bomb an international flight with a checked bag, because they know that there is a good chance the bag will be intercepted. They can't check the bag and run, because the bomb will never get on board. And they can't hijack the plane with a gun, because there is no sure way of getting that weapon on board. The contemporary hijacker, in other words, must either be capable of devising a weapon that can get past security or be willing to go down with the plane. Most terrorists have neither the cleverness to meet the first criterion nor the audacity to meet the second, which is why the total number of hijackings has been falling for the past thirty years. During the nineties, in fact, the number of civil aviation "incidents" worldwide -- hijackings, bombings, shootings, attacks, and so forth -- dropped by more than seventy per cent. But this is where the law-enforcement paradox comes in: Even as the number of terrorist acts has diminished, the number of people killed in hijackings and bombings has steadily increased. And, despite all the improvements in airport security, the percentage of terrorist hijackings foiled by airport security in the years between 1987 and 1996 was at its lowest point in thirty years. Airport-security measures have simply chased out the amateurs and left the clever and the audacious. "A look at the history of attacks on commercial aviation reveals that new terrorist methods of attack have virtually never been foreseen by security authorities," the Israeli terrorism expert Ariel Merari writes, in the recent book "Aviation Terrorism and Security."
The security system was caught by surprise when an airliner was first hijacked for political extortion; it was unprepared when an airliner was attacked on the tarmac by a terrorist team firing automatic weapons; when terrorists, who arrived as passengers, collected their luggage from the conveyer belt, took out weapons from their suitcases, and strafed the crowd in the arrivals hall; when a parcel bomb sent by mail exploded in an airliner's cargo hold in mid-flight; when a bomb was brought on board by an unwitting passenger. . . . The history of attacks on aviation is the chronicle of a cat-and-mouse game, where the cat is busy blocking old holes and the mouse always succeeds in finding new ones.
And no hole was bigger than the one found on September 11th.
4.
What the attackers understood was the structural weakness of the passenger-gate security checkpoint, particularly when it came to the detection of knives. Hand-luggage checkpoints use X-ray machines, which do a good job of picking out a large, dense, and predictable object like a gun. Now imagine looking at a photograph of a knife. From the side, the shape is unmistakable. But if the blade edge is directly facing the camera, what you'll see is just a thin line. "If you stand the knife on its edge, it could be anything," says Harry Martz, who directs the Center for Nondestructive Characterization at Lawrence Livermore Laboratories. "It could be a steel ruler. Then you put in computers, hair dryers, pens, clothes hangers, and it makes it even more difficult to pick up the pattern."
The challenge of detecting something like a knife blade is made harder still by the psychological demands on X-ray operators. What they are looking for -- weapons -- is called the "signal," and a well-documented principle of human-factors research is that as the "signal rate" declines, detection accuracy declines as well. If there was a gun in every second bag, for instance, you could expect the signals to be detected with almost perfect accuracy: the X-ray operator would be on his toes. But guns are almost never found in bags, which means that the vigilance of the operator inevitably falters. This is a significant problem in many fields, from nuclear-plant inspection to quality-control in manufacturing plants -- where the job of catching defects on, say, a car becomes harder and harder as cars become better made. "I've studied this in people who look for cracks in the rotor disks of airplane engines," says Colin Drury, a human-factors specialist at the University of Buffalo. "Remember the DC-10 crash at Sioux City? That was a rotor disk. Well, the probability of that kind of crack happening is incredibly small. Most inspectors won't see one in their lifetime, so it's very difficult to remain alert to that." The F.A.A. periodically plants weapons in baggage to see whether they are detected. But it's not clear what effect that kind of test has on vigilance. In the wake of the September attacks, some commentators called for increased training for X-ray security operators. Yet the problem is not just a lack of expertise; it is the paucity of signals. "Better training is only going to get you so far," explains Douglas Harris, chairman of Anacapa Sciences, a California-based human-factors firm. "If it now takes a day to teach people the techniques they need, adding another day isn't going to make much difference."
A sophisticated terrorist wanting to smuggle knives on board, in other words, has a good shot at "gaming" the X-ray machine by packing his bags cleverly and exploiting the limitations of the operator. If he chooses, he can also beat the metal detector by concealing on his person knives made of ceramic or plastic, which wouldn't trip the alarm. The knife strategy has its drawbacks, of course. It's an open question how long a group of terrorists armed only with knives can hold off a cabin full of passengers. But if all they need is to make a short flight from Boston to downtown Manhattan, knives would suffice.
5.
Can we close the loopholes that led to the September 11th attack? Logistically, an all-encompassing security system is probably impossible. A new safety protocol that adds thirty seconds to the check-in time of every passenger would add more than three hours to the preparation time for a 747, assuming that there are no additional checkpoints. Reforms that further encumber the country's already overstressed air-traffic system are hardly reforms; they are self-inflicted wounds. People have suggested that we station armed federal marshals on more flights. This could be an obstacle for some terrorists but an opportunity for others, who could overcome a marshal to gain possession of a firearm.
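The arithmetic behind that claim is easy to check. Here is a minimal back-of-the-envelope sketch; the figure of four hundred passengers is an assumption about a typically configured 747, not a number given in the article.

    # Back-of-the-envelope check: extra check-in time for a fully loaded 747.
    # The 400-passenger figure is an assumed typical configuration.
    passengers = 400
    extra_seconds_per_passenger = 30

    extra_hours = passengers * extra_seconds_per_passenger / 3600
    print(f"Added preparation time: about {extra_hours:.1f} hours")  # roughly 3.3 hours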
What we ought to do is beef up security for a small percentage of passengers deemed to be high-risk. The airlines already have in place a screening technology of this sort, known as CAPPS -- Computer-Assisted Passenger Prescreening System. When a ticket is purchased on a domestic flight in the United States, the passenger is rated according to approximately forty pieces of data. Though the parameters are classified, they appear to include the traveller's address, credit history, and destination; whether he or she is travelling alone; whether the ticket was paid for in cash; how long before the departure it was bought; and whether it is one way. (A recent review by the Department of Justice affirmed that the criteria are not discriminatory on the basis of ethnicity.) A sixty-eight-year-old male who lives on Park Avenue, has a fifty-thousand-dollar limit on his credit card, and has flown on the Washington-New York shuttle twice a week for the past eight years, for instance, is never going to get flagged by the CAPPS system. Probably no more than a handful of people per domestic flight ever are, but those few have their checked luggage treated with the kind of scrutiny that, until this month, was reserved for international flights. Their bags are screened for explosives and held until the passengers are actually on board. It would be an easy step to use the CAPPS ratings at the gate as well. Those dubbed high-risk could have their hand luggage scrutinized by the slower but much more comprehensive CT scanner, which would make hiding knives or other weapons in hand luggage all but impossible.
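To make the idea of prescreening concrete, here is a purely hypothetical sketch of rule-based risk scoring. The criteria, weights, and threshold below are invented for illustration only; the real CAPPS parameters are classified and are not reproduced here.

    # Hypothetical illustration of rule-based passenger prescreening.
    # None of these criteria, weights, or thresholds reflect the actual CAPPS rules.

    def risk_score(passenger: dict) -> int:
        score = 0
        if passenger.get("paid_cash"):
            score += 2
        if passenger.get("one_way_ticket"):
            score += 2
        if passenger.get("days_before_departure", 30) <= 1:   # bought at the last minute
            score += 1
        if passenger.get("frequent_flyer_years", 0) >= 5:     # long, stable travel history
            score -= 2
        return score

    FLAG_THRESHOLD = 3  # invented cutoff: only a handful of passengers per flight exceed it

    shuttle_regular = {"paid_cash": False, "one_way_ticket": False,
                       "days_before_departure": 14, "frequent_flyer_years": 8}
    unknown_traveller = {"paid_cash": True, "one_way_ticket": True,
                         "days_before_departure": 1, "frequent_flyer_years": 0}

    for p in (shuttle_regular, unknown_traveller):
        flagged = risk_score(p) >= FLAG_THRESHOLD
        print(risk_score(p), "flagged" if flagged else "not flagged")

The design point is simply that a score built from many small signals lets the system concentrate its slowest, most thorough screening on a few passengers rather than on everyone.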
At the same time, high-risk passengers could be asked to undergo an electronic strip search known as a body scan. In a conventional X-ray, the rays pass through the body, leaving an imprint on a detector on the other side. In a body scanner, the X-rays are much weaker, penetrating clothes but not the body, so they bounce back and leave an imprint of whatever lies on the surface of the skin. A body scanner would have picked up a ceramic knife in an instant. Focussing on a smaller group of high-risk people would have the additional benefit of improving the detection accuracy of the security staff: it would raise the signal rate.
We may never know, of course, whether an expanded CAPPS system would have flagged the September 11th terrorists, but certainly those who planned the attack would have had to take that possibility seriously. The chief distinction between American and Israeli airport defense, at the moment, is that the American system focusses on technological examination of the baggage while the Israeli system focusses on personal interrogation and assessment of the passenger -- which has resulted in El Al's having an almost unblemished record against bombings and hijackings over the past twenty years. Wider use of CAPPS profiling would correct that shortcoming, and narrow still further the options available for any would-be terrorist. But we shouldn't delude ourselves that these steps will end hijackings, any more than the Cooper Vane did thirty years ago. Better law enforcement doesn't eliminate crime. It forces the criminals who remain to come up with something else. And, as we have just been reminded, that something else, all too frequently, is something worse.
The Scourge You Know
October 29, 2001
CONTAGIONS
If you are wondering what to worry about when it comes to biological weapons, you should concern yourself, first of all, with things that are easy to deliver. Biological agents are really dangerous only when they can reach lots of people, and very few bioweapons can easily do that. In 1990, members of Japan's Aum Shinrikyo cult drove around the Parliament buildings in Tokyo in an automobile rigged to disseminate botulinum toxin. It didn't work. The same group also tried, repeatedly, to release anthrax from a rooftop, and that didn't work, either. It's simply too complicated to make anthrax in the fine, "mist" form that is the most lethal. And the spores are destroyed so quickly by sunlight that any kind of mass administration of anthrax is extremely difficult.
A much scarier biological weapon would be something contagious: something a few infected people could spread, unwittingly, in ever widening and more terrifying circles. Even with a contagious agent, though, you don't really have to worry about pathogens that are what scientists call stable--that are easy to identify and that don't change from place to place or year to year--because those kinds of biological agents are easy to defend against. That's why you shouldn't worry quite so much about smallpox. Deadly as it is, smallpox is so well understood that the vaccine is readily made and extraordinarily effective, and works for decades. If we wanted to, we could all be inoculated against smallpox in a matter of years.
What you really should worry about, then, is something that is highly contagious and highly unstable, a biological agent that kills lots of people and isn't easy to treat, that mutates so rapidly that each new bout of terror requires a brand-new vaccine. What you should worry about, in other words, is the influenza virus.
If there is an irony to America's current frenzy over anthrax and biological warfare--the paralyzed mailrooms, the endless talk-show discussions, the hoarding of antibiotics, and the closed halls of Congress--it is that it has occurred right at the beginning of the flu season, the time each year when the democracies of the West are routinely visited by one of the most deadly of all biological agents. This year, around twenty thousand Americans will die of the flu, and if this is one of those years, like 1957 or 1968, when we experience an influenza pandemic, that number may hit fifty thousand. The victims will primarily be the very old and the very young, although there will be a significant number of otherwise healthy young adults among them, including many pregnant women. All will die horrible deaths, racked by raging fevers, infections, headaches, chills, and sweats. And the afflicted, as they suffer, will pass their illness on to others, creating a wave of sickness that will cost the country billions of dollars. Influenza "quietly kills tens of thousands of people every year," Edwin Kilbourne, a research professor at New York Medical College and one of the country's leading flu experts, says. "And those who don't die are incapacitated for weeks. It mounts a silent and pervasive assault."
That we have chosen to worry more about anthrax than about the flu is hardly surprising. The novel is always scarier than the familiar, and the flu virus, as far as we know, isn't being sent through the mail by terrorists. But it is a strange kind of public-health policy that concerns itself more with the provenance of illness than with its consequences; and the consequences of the flu, year in, year out, dwarf everything but the most alarmist bioterror scenarios. If even a fraction of the energy and effort now being marshalled against anthrax were directed instead at the flu, we could save thousands of lives. Kilbourne estimates that at least half the deaths each year from the flu are probably preventable: vaccination rates among those most at risk under the age of fifty are a shameful twenty-three per cent, and for asthmatic children, who are also at high risk, the vaccination rate is ten per cent. And vaccination has been shown to save money: the costs of hospitalization for those who get sick far exceed the costs of inoculating everyone else. Why, under the circumstances, this country hasn't mounted an aggressive flu-vaccination program is a question that Congress might want to consider, when it returns to its newly fumigated, anthrax-free chambers. Not all threats to health and happiness come from terrorists in faraway countries. Many are the result of what, through simple indifference, we do to ourselves.
Smaller
November 26, 2001
ANNALS OF TECHNOLOGY
The disposable diaper and the meaning of progress.
1.
The best way to explore the mystery of the Huggies Ultratrim disposable diaper is to unfold it and then cut it in half, widthwise, across what is known as the diaper's chassis. At Kimberly-Clark's Lakeview plant, in Neenah, Wisconsin, where virtually all the Huggies in the Midwest are made, there is a quality-control specialist who does this all day long, culling diapers from the production line, pinning them up against a lightboard, and carefully dismembering them with a pair of scissors. There is someone else who does a "visual cull," randomly picking out Huggies and turning them over to check for flaws. But a surface examination tells you little. A diaper is not like a computer that makes satisfying burbling noises from time to time, hinting at great inner complexity. It feels like papery underwear wrapped around a thin roll of Cottonelle. But peel away the soft fabric on the top side of the diaper, the liner, which receives what those in the trade delicately refer to as the "insult." You'll find a layer of what's called polyfilm, which is thinner than a strip of Scotch tape. This layer is one of the reasons the garment stays dry: it has pores that are large enough to let air flow in, so the diaper can breathe, but small enough to keep water from flowing out, so the diaper doesn't leak.
Or run your hands along that liner. It feels like cloth. In fact, the people at Kimberly-Clark make the liner out of a special form of plastic, a polyresin. But they don't melt the plastic into a sheet, as one would for a plastic bag. They spin the resin into individual fibres, and then use the fibres to create a kind of microscopic funnel, channelling the insult toward the long, thick rectangular pad that runs down the center of the chassis, known as the absorbent core. A typical insult arrives at a rate of seven millilitres a second, and might total seventy millilitres of fluid. The liner can clear that insult in less than twenty seconds. The core can hold three or more of those insults, with a chance of leakage in the single digits. The baby's skin will remain almost perfectly dry, and that is critical, because prolonged contact between the baby and the insult (in particular, ammonium hydroxide, a breakdown product of urine) is what causes diaper rash. And all this will be accomplished by a throwaway garment measuring, in the newborn size, just seven by thirteen inches. This is the mystery of the modern disposable diaper: how does something so small do so much?
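Those figures imply a simple timing budget, which can be checked directly from the numbers given in the passage.

    # Timing budget implied by the figures in the passage.
    insult_ml = 70            # a typical insult, in millilitres
    arrival_rate_ml_s = 7     # arrival rate, millilitres per second
    clear_time_s = 20         # the liner clears the insult in under this many seconds

    arrival_time_s = insult_ml / arrival_rate_ml_s
    print(f"Insult arrives over about {arrival_time_s:.0f} s; liner clears it in under {clear_time_s} s")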
2.
Thirty-seven years ago, the Silicon Valley pioneer Gordon Moore made a famous prediction. The number of transistors that engineers could fit onto a microchip, he said, would double every two years. It seemed like a foolhardy claim: it was not clear that you could keep making transistors smaller and smaller indefinitely. It also wasn't clear that it would make sense to do so. Most of the time when we make things smaller, after all, we pay a price. A smaller car is cheaper and more fuel-efficient, and easier to park and maneuver, but it will never be as safe as a larger car. In the nineteen-fifties and sixties, the transistor radio was all the rage; it could fit inside your pocket and run on a handful of batteries. But, because it was so small, the sound was terrible, and virtually all the other mini-electronics turn out to be similarly imperfect. Tiny cell phones are hard to dial. Tiny televisions are hard to watch. In making an object smaller, we typically compromise its performance. The remarkable thing about chips, though, was that there was no drawback: if you could fit more and more transistors onto a microchip, then instead of using ten or twenty or a hundred microchips for a task you could use just one. This meant, in turn, that you could fit microchips in all kinds of places (such as cellular phones and laptops) that you couldn't before, and, because you were using one chip and not a hundred, computer power could be had at a fraction of the price, and because chips were now everywhere and in such demand they became even cheaper to make--and so on and so on. Moore's Law, as it came to be called, describes that rare case in which there is no trade-off between size and performance. Microchips are what might be termed a perfect innovation.
In the past twenty years, diapers have got smaller and smaller, too. In the early eighties, they were three times bulkier than they are now, thicker and substantially wider in the crotch. But in the mid-eighties Huggies and Procter & Gamble's Pampers were reduced in bulk by fifty per cent; in the mid-nineties they shrank by a third or so; and in the next few years they may shrink still more. It seems reasonable that there should have been a downside to this, just as there was to the shrinking of cars and radios: how could you reduce the amount of padding in a diaper and not, in some way, compromise its ability to handle an insult? Yet, as diapers got smaller, they got better, and that fact elevates the diaper above nearly all the thousands of other products on the supermarket shelf.
Kimberly-Clark's Lakeview plant is a huge facility, just down the freeway from Green Bay. Inside, it is as immaculate as a hospital operating room. The walls and floors have been scrubbed white. The stainless-steel machinery gleams. The employees are dressed in dark-blue pants, starched light-blue button-down shirts, and tissue-paper caps. There are rows of machines in the plant, each costing more than fifteen million dollars--a dizzying combination of conveyor belts and whirling gears and chutes stretching as long as a city block and creating such a din that everyone on the factory floor wears headsets and communicates by radio. Computers monitor a million data points along the way, insuring that each of those components is precisely cut and attached according to principles and processes and materials protected, on the Huggies Ultratrim alone, by hundreds of patents. At the end of the line, the Huggies come gliding out of the machine, stacked upright, one after another in an endless row, looking like exquisitely formed slices of white bread in a toast rack. For years, because of Moore's Law, we have considered the microchip the embodiment of the technological age. But if the diaper is also a perfect innovation, doesn't it deserve a place beside the chip?
3.
The modern disposable diaper was invented twice, first by Victor Mills and then by Carlyle Harmon and Billy Gene Harper. Mills worked for Procter & Gamble, and he was a legend. Ivory soap used to be made in an expensive and time-consuming batch-by-batch method. Mills figured out a simpler, continuous process. Duncan Hines cake mixes used to have a problem blending flour, sugar, and shortening in a consistent mixture. Mills introduced the machines used for milling soap, which ground the ingredients much more finely than before, and the result was New, Improved Duncan Hines cake mix. Ever wonder why Pringles, unlike other potato chips, are all exactly the same shape? Because they are made like soap: the potato is ground into a slurry, then pressed, baked, and wrapped--and that was Victor Mills's idea, too.
In 1957, Procter & Gamble bought the Charmin Paper Company, of Green Bay, Wisconsin, and Mills was told to think of new products for the paper business. Since he was a grandfather--and had always hated washing diapers--he thought of a disposable diaper. "One of the early researchers told me that among the first things they did was go out to a toy store and buy one of those Betsy Wetsy-type dolls, where you put water in the mouth and it comes out the other end," Ed Rider, the head of the archives department at Procter & Gamble, says. "They brought it back to the lab, hooked up its legs on a treadmill to make it walk, and tested diapers on it." The end result was Pampers, which were launched in Peoria, in 1961. The diaper had a simple rectangular shape. Its liner, which lay against the baby's skin, was made of rayon. The outside material was plastic. In between were multiple layers of crêped tissue. The diaper was attached with pins and featured what was known as a Z fold, meaning that the edges of the inner side were pleated, to provide a better fit around the legs.
In 1968, Kimberly-Clark brought out Kimbies, which took the rectangular diaper and shaped it to more closely fit a baby's body. In 1976, Procter & Gamble brought out Luvs, which elasticized the leg openings to prevent leakage. But diapers still adhered to the basic Millsian notion of an absorbent core made out of paper--and that was a problem. When paper gets wet, the fluid soaks right through, which makes diaper rash worse. And if you put any kind of pressure on paper--if you squeeze it, or sit on it--it will surrender some of the water it has absorbed, which creates further difficulties, because a baby, in the usual course of squirming and crawling and walking, might place as much as five kilopascals of pressure on the absorbent core of a diaper. Diaper-makers tried to address this shortcoming by moving from crêped tissue to what they called fluff, which was basically finely shredded cellulose. Then they began to compensate for paper's failing by adding more and more of it, until diapers became huge. But they now had Moore's Law in reverse: in order to get better, they had to get bigger--and bigger still wasn't very good.
Carlyle Harmon worked for Johnson & Johnson and Billy Gene Harper worked for Dow Chemical, and they had a solution. In 1966, each filed separate but virtually identical patent applications, proposing that the best way to solve the diaper puzzle was with a peculiar polymer that came in the form of little pepperlike flakes and had the remarkable ability to absorb up to three hundred times its weight in water.
In the Dow patent, Harper and his team described how they sprinkled two grams of the superabsorbent polymer between two twenty-inch-square sheets of nylon broadcloth, and then quilted the nylon layers together. The makeshift diaper was "thereafter put into use in personal management of a baby of approximately 6 months age." After four hours, the diaper was removed. It now weighed a hundred and twenty grams, meaning the flakes had soaked up sixty times their weight in urine.
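The sixty-to-one figure follows directly from the weights reported in the patent, if the dry weight of the nylon broadcloth itself is treated as negligible (an assumption made here for the sake of the arithmetic).

    # Absorption ratio implied by the Dow patent figures.
    polymer_g = 2          # grams of superabsorbent sprinkled between the nylon sheets
    wet_diaper_g = 120     # weight of the makeshift diaper after four hours of use

    # Assumes the dry nylon's own weight is small enough to ignore.
    absorbed_g = wet_diaper_g - polymer_g
    print(f"Absorbed roughly {absorbed_g / polymer_g:.0f} times the polymer's weight")  # about sixty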
Harper and Harmon argued that it was quite unnecessary to solve the paper problem by stuffing the core of the diaper with thicker and thicker rolls of shredded pulp. Just a handful of superabsorbent polymer would do the job. Thus was the modern diaper born. Since the mid-eighties, Kimberly-Clark and Procter & Gamble have made diapers the Harper and Harmon way, pulling out paper and replacing it with superabsorbent polymer. The old, paper-filled diaper could hold, at most, two hundred and seventy-five millilitres of fluid, or a little more than a cup. Today, a diaper full of superabsorbent polymer can handle as much as five hundred millilitres, almost twice that. The chief characteristic of the Mills diaper was its simplicity: the insult fell directly into the core. But the presence of the polymer has made the diaper far more complex. The polymer, for instance, takes longer than paper to fully absorb an insult. So another component was added, the acquisition layer, between the liner and the core. The acquisition layer acts like blotting paper, holding the insult while the core slowly does its work, and distributing the fluid over its full length.
Diaper researchers sometimes perform what is called a re-wet test, where they pour a hundred millilitres of fluid onto the surface of a diaper and then apply a piece of filter paper to the diaper liner with five kilopascals of pressure--the average load a baby would apply to a diaper during ordinary use. In a contemporary superabsorbent diaper, like a Huggies or a Pampers, the filter paper will come away untouched after one insult. After two insults, there might be 0.1 millilitres of fluid on the paper. After three insults, the diaper will surrender, at most, only two millilitres of moisture--which is to say that, with the aid of superabsorbents, a pair of Huggies or Pampers can effortlessly hold, even under pressure, a baby's entire night's work.
The heir to the legacy of Billy Gene Harper at Dow Chemical is Fredric Buchholz, who works in Midland, Michigan, a small town two hours northwest of Detroit, where Dow has its headquarters. His laboratory is in the middle of the sprawling chemical works, a mile or two away from corporate headquarters, in a low, unassuming brick building. "We still don't understand perfectly how these polymers work," Buchholz said on a recent fall afternoon. What we do know, he said, is that superabsorbent polymers appear, on a microscopic level, to be like a tightly bundled fisherman's net. In the presence of water, that net doesn't break apart into thousands of pieces and dissolve, like sugar. Rather, it just unravels, the way a net would open up if you shook it out, and as it does the water gets stuck in the webbing. That ability to hold huge amounts of water, he said, could make superabsorbent polymers useful in fire fighting or irrigation, because slightly gelled water is more likely to stay where it's needed. There are superabsorbents mixed in with the sealant on the walls of the Chunnel between England and France, so if water leaks in the polymer will absorb the water and plug the hole.
Right now, one of the major challenges facing diaper technology, Buchholz said, is that urine is salty, and salt impairs the unravelling of the netting: superabsorbents can handle only a tenth as much salt water as fresh water. "One idea is to remove the salt from urine. Maybe you could have a purifying screen," he said. If the molecular structure of the superabsorbent were optimized, he went on, its absorptive capacity could increase by another five hundred per cent. "Superabsorbents could go from absorbing three hundred times their weight to absorbing fifteen hundred times their weight. We could have just one perfect particle of super-absorbent in a diaper. If you are going to dream, why not make the diaper as thin as a pair of underwear?"
Buchholz was in his laboratory, and he held up a small plastic cup filled with a few tablespoons of superabsorbent flakes, each not much larger than a grain of salt. "It's just a granular material, totally nontoxic," he said. "This is about two grams." He walked over to the sink and filled a large beaker with tap water, and poured the contents of the beaker into the jar of superabsorbent. At first, nothing happened. The amounts were so disproportionate that it looked as if the water would simply engulf the flakes. But, slowly and steadily, the water began to thicken. "Look," Buchholz said. "It's becoming soupy." Sure enough, little beads of gel were forming. Nothing else was happening: there was no gas given off, no burbling or sizzling as the chemical process took place. The superabsorbent polymer was simply swallowing up the water, and within minutes the contents of the cup had thickened into what looked like slightly lumpy, spongy pudding. Buchholz picked up the jar and tilted it, to show that nothing at all was coming out. He pushed and prodded the mass with his finger. The water had disappeared. To soak up that much liquid, the Victor Mills diaper would have needed a thick bundle of paper towelling. Buchholz had used a few tablespoons of superabsorbent flakes. Superabsorbent was not merely better; it was smaller.
4.
Why does it matter that the diaper got so small? It seems a trivial thing, chiefly a matter of convenience to the parent taking a bag of diapers home from the supermarket. But it turns out that size matters a great deal. There's a reason that there are now "new, improved concentrated" versions of laundry detergent, and that some cereals now come in smaller boxes. Smallness is one of those changes that send ripples through the whole economy. The old disposable diapers, for example, created a transportation problem. Tractor-trailers are prohibited by law from weighing more than eighty thousand pounds when loaded. That's why a truck carrying something heavy and compact like bottled water or Campbell's soup is "full," when the truck itself is still half empty. But the diaper of the eighties was what is known as a "high cube" item. It was bulky and not very heavy, meaning that a diaper truck was full before it reached its weight limit. By cutting the size of a diaper in half, companies could fit twice as many diapers on a truck, and cut transportation expenses in half. They could also cut the amount of warehouse space and labor they needed in half. And companies could begin to rethink their manufacturing operations. "Distribution costs used to force you to have plants in lots of places," Dudley Lehman, who heads the Kimberly-Clark diaper business, says. "As that becomes less and less of an issue, you say, 'Do I really need all my plants?' In the United States, it used to take eight. Now it takes five." (Kimberly-Clark didn't close any plants. But other manufacturers did, and here, perhaps, is a partial explanation for the great wave of corporate restructuring that swept across America in the late eighties and early nineties: firms could downsize their workforce because they had downsized their products.) And, because using five plants to make diapers is more efficient than using eight, it became possible to improve diapers without raising diaper prices--which is important, because the sheer number of diapers parents have to buy makes it a price-sensitive product. Until recently, diapers were fastened with little pieces of tape, and if the person changing the diapers got lotion or powder on her fingers the tape wouldn't work. A hook-and-loop, Velcro-like fastener doesn't have this problem. But it was years before the hook-and-loop fastener was incorporated into the diaper chassis: until over-all manufacturing costs were reduced, it was just too expensive.
Most important, though, is how size affects the way diapers are sold. The shelves along the aisles of a supermarket are divided into increments of four feet, and the space devoted to a given product category is almost always a multiple of that. Diapers, for example, might be presented as a twenty-foot set. But when diapers were at their bulkiest the space reserved for them was never enough. "You could only get a limited number on the shelf," says Sue Klug, the president of Catalina Marketing Solutions and a former executive for Albertson's and Safeway. "Say you only had six bags. Someone comes in and buys a few, and then someone else comes in and buys a few more. Now you're out of stock until someone reworks the shelf, which in some supermarkets might be a day or two." Out-of-stock rates are already a huge problem in the retail business. At any given time, only about ninety-two per cent of the products that a store is supposed to be carrying are actually on the shelf--which, if you consider that the average supermarket has thirty-five thousand items, works out to twenty-eight hundred products that are simply not there. (For a highly efficient retailer like Wal-Mart, in-stock rates might be as high as ninety-nine per cent; for a struggling firm, they might be in the low eighties.) But, for a fast-moving, bulky item like diapers, the problem of restocking was much worse. Supermarkets could have allocated more shelf space to diapers, of course, but diapers aren't a particularly profitable category for retailers--profit margins are about half what they are for the grocery department. So retailers would much rather give more shelf space to a growing and lucrative category like bottled water. "It's all a trade-off," Klug says. "If you expand diapers four feet, you've got to give up four feet of something else." The only way diaper-makers could insure that their products would actually be on the shelves was to make the products smaller, so they could fit twelve bags into the space of six. And if you can fit twelve bags on a shelf, you can introduce different kinds of diapers. You can add pull-ups and premium diapers and low-cost private-label diapers, all of which give parents more options.
"We cut the cost of trucking in half," says Ralph Drayer, who was in charge of logistics for Procter & Gamble for many years and now runs his own supply-chain consultancy in Cincinnati. "We cut the cost of storage in half. We cut handling in half, and we cut the cost of the store shelf in half, which is probably the most expensive space in the whole chain." Everything in the diaper world, from plant closings and trucking routes to product improvements and consumer choice and convenience, turns, in the end, on the fact that Harmon and Harper's absorbent core was smaller than Victor Mills's.
The shame of it, though, is that Harmon and Harper have never been properly celebrated for their accomplishment. Victor Mills is the famous one. When he died, he was given a Times obituary, in which he was called "the father of disposable diapers." When Carlyle Harmon died, seven months earlier, he got four hundred words in Utah's Deseret News, stressing his contributions to the Mormon Church. We tend to credit those who create an idea, not those who perfect it, forgetting that it is often only in the perfection of an idea that true progress occurs. Putting sixty-four transistors on a chip allowed people to dream of the future. Putting four million transistors on a chip actually gave them the future. The diaper is no different. The paper diaper changed parenting. But a diaper that could hold four insults without leakage, keep a baby's skin dry, clear an insult in twenty seconds flat, and would nearly always be in stock, even if you arrived at the supermarket at eight o'clock in the evening--and that would keep getting better at all those things, year in and year out--was another thing altogether. This was more than a good idea. This was something like perfection.
Examined Life
December 17, 2001
A CRITIC AT LARGE
What Stanley H. Kaplan taught us about the SAT
1.
Once, in fourth grade, Stanley Kaplan got a B-plus on his report card and was so stunned that he wandered aimlessly around the neighborhood, ashamed to show his mother. This was in Brooklyn, on Avenue K in Flatbush, between the wars. Kaplan's father, Julius, was from Slutsk, in Belorussia, and ran a plumbing and heating business. His mother, Ericka, ninety pounds and four feet eight, was the granddaughter of the chief rabbi of the synagogue of Prague, and Stanley loved to sit next to her on the front porch, immersed in his schoolbooks while his friends were off playing stickball. Stanley Kaplan had Mrs. Holman for fifth grade, and when she quizzed the class on math equations, he would shout out the answers. If other students were having problems, Stanley would take out pencil and paper and pull them aside. He would offer them a dime, sometimes, if they would just sit and listen. In high school, he would take over algebra class, and the other kids, passing him in the hall, would call him Teach. One classmate, Aimee Rubin, was having so much trouble with math that she was in danger of being dropped from the National Honor Society. Kaplan offered to help her, and she scored a ninety-five on her next exam. He tutored a troubled eleven-year-old named Bob Linker, and Bob Linker ended up a successful businessman. In Kaplan's sophomore year at City College, he got a C in biology and was so certain that there had been a mistake that he marched in to see the professor and proved that his true grade, an A, had accidentally been switched with that of another, not quite so studious, Stanley Kaplan. Thereafter, he became Stanley H. Kaplan, and when people asked him what the "H" stood for, he would say "Higher scores!" or, with a sly wink, "Preparation!" He graduated Phi Beta Kappa and hung a shingle outside his parents' house on Avenue K, "Stanley H. Kaplan Educational Center," and started tutoring kids in the basement. In 1946, a high-school junior named Elizabeth, from Coney Island, came to him for help on an exam he was unfamiliar with. It was called the Scholastic Aptitude Test, and from that moment forward the business of getting into college in America was never quite the same.
The S.A.T., at that point, was just beginning to go into widespread use. Unlike existing academic exams, it was intended to measure innate ability--not what a student had learned but what a student was capable of learning--and it stated clearly in the instructions that "cramming or last-minute reviewing" was pointless. Kaplan was puzzled. In Flatbush you always studied for tests. He gave Elizabeth pages of math problems and reading-comprehension drills. He grilled her over and over, doing what the S.A.T. said should not be done. And what happened? On test day, she found the S.A.T. "a piece of cake," and promptly told all her friends, and her friends told their friends, and soon word of Stanley H. Kaplan had spread throughout Brooklyn.
A few years later, Kaplan married Rita Gwirtzman, who had grown up a mile away, and in 1951 they moved to a two-story brick-and-stucco house on Bedford Avenue, a block from his alma mater, James Madison High School. He renovated his basement, dividing it into classrooms. When the basement got too crowded, he rented a podiatrist's office near King's Highway, at the Brighton Beach subway stop. In the nineteen-seventies, he went national, setting up educational programs throughout the country, creating an S.A.T.-preparation industry that soon became crowded with tutoring companies and study manuals. Kaplan has now written a memoir, "Test Pilot" (Simon & Schuster; $19), which has as its subtitle "How I Broke Testing Barriers for Millions of Students and Caused a Sonic Boom in the Business of Education." That actually understates his importance. Stanley Kaplan changed the rules of the game.
2.
The S.A.T. is now seventy-five years old, and it is in trouble. Earlier this year, the University of California--the nation's largest public-university system--stunned the educational world by proposing a move toward a "holistic" admissions system, which would mean abandoning its heavy reliance on standardized-test scores. The school backed up its proposal with a devastating statistical analysis, arguing that the S.A.T. is virtually useless as a tool for making admissions decisions.
The report focussed on what is called predictive validity, a statistical measure of how well a high-school student's performance in any given test or program predicts his or her performance as a college freshman. If you wanted to, for instance, you could calculate the predictive validity of prowess at Scrabble, or the number of books a student reads in his senior year, or, more obviously, high-school grades. What the Educational Testing Service (which creates the S.A.T.) and the College Board (which oversees it) have always argued is that most performance measures are so subjective and unreliable that only by adding aptitude-test scores into the admissions equation can a college be sure it is picking the right students.
This is what the U.C. study disputed. It compared the predictive validity of three numbers: a student's high-school G.P.A., his or her score on the S.A.T. (or, as it is formally known, the S.A.T. I), and his or her score on what is known as the S.A.T. II, which is a so-called achievement test, aimed at gauging mastery of specific areas of the high-school curriculum. Drawing on the transcripts of seventy-eight thousand University of California freshmen from 1996 through 1999, the report found that, over all, the most useful statistic in predicting freshman grades was the S.A.T. II, which explained sixteen per cent of the "variance" (which is another measure of predictive validity). The second most useful was high-school G.P.A., at 15.4 per cent. The S.A.T. was the least useful, at 13.3 per cent. Combining high-school G.P.A. and the S.A.T. II explained 22.2 per cent of the variance in freshman grades. Adding in S.A.T. I scores increased that number by only 0.1 per cent. Nor was the S.A.T. better at what one would have thought was its strong suit: identifying high-potential students from bad schools. In fact, the study found that achievement tests were ten times more useful than the S.A.T. in predicting the success of students from similar backgrounds. "Achievement tests are fairer to students because they measure accomplishment rather than promise," Richard Atkinson, the president of the University of California, told a conference on college admissions last month. "They can be used to improve performance; they are less vulnerable to charges of cultural or socioeconomic bias; and they are more appropriate for schools because they set clear curricular guidelines and clarify what is important for students to learn. Most important, they tell students that a college education is within the reach of anyone with the talent and determination to succeed."
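The statistic at issue, "variance explained," is simply the R-squared of a regression of freshman grades on the predictors, and the marginal value of the S.A.T. I is how much R-squared rises when it is added to a model that already includes G.P.A. and the S.A.T. II. The sketch below illustrates that comparison on synthetic data; the numbers it produces are invented for illustration and are not the U.C. figures, only the method mirrors the analysis described above.

    # Sketch of the "variance explained" comparison, on synthetic data.
    # All numbers produced here are illustrative; they are not the U.C. results.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    ability = rng.normal(size=n)                      # unobserved student quality
    gpa   = ability + rng.normal(scale=1.5, size=n)   # high-school G.P.A.
    sat2  = ability + rng.normal(scale=1.4, size=n)   # achievement test (S.A.T. II)
    sat1  = ability + rng.normal(scale=1.6, size=n)   # aptitude test (S.A.T. I)
    frosh = ability + rng.normal(scale=2.0, size=n)   # freshman grades

    def r_squared(y, *predictors):
        X = np.column_stack(predictors + (np.ones_like(y),))   # include an intercept
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1 - resid.var() / y.var()

    base = r_squared(frosh, gpa, sat2)
    full = r_squared(frosh, gpa, sat2, sat1)
    print(f"G.P.A. + S.A.T. II explain {base:.3f} of the variance")
    print(f"Adding the S.A.T. I raises that by only {full - base:.4f}")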
This argument has been made before, of course. The S.A.T. has been under attack, for one reason or another, since its inception. But what is happening now is different. The University of California is one of the largest single customers of the S.A.T. It was the U.C. system's decision, in 1968, to adopt the S.A.T. that affirmed the test's national prominence in the first place. If U.C. defects from the S.A.T., it is not hard to imagine it being followed by a stampede of other colleges. Seventy-five years ago, the S.A.T. was instituted because we were more interested, as a society, in what a student was capable of learning than in what he had already learned. Now, apparently, we have changed our minds, and few people bear more responsibility for that shift than Stanley H. Kaplan.
3.
From the moment he set up shop on Avenue K, Stanley Kaplan was a pariah in the educational world. Once, in 1956, he went to a meeting for parents and teachers at a local high school to discuss the upcoming S.A.T., and one of the teachers leading the meeting pointed his finger at Kaplan and shouted, "I refuse to continue until THAT MAN leaves the room." When Kaplan claimed that his students routinely improved their scores by a hundred points or more, he was denounced by the testing establishment as a "quack" and "the cram king" and a "snake oil salesman." At the Educational Testing Service, "it was a cherished assumption that the S.A.T. was uncoachable," Nicholas Lemann writes in his history of the S.A.T., "The Big Test":
The whole idea of psychometrics was that mental tests are a measurement of a psychical property of the brain, analogous to taking a blood sample. By definition, the test-taker could not affect the result. More particularly, E.T.S.'s main point of pride about the S.A.T. was its extremely high test-retest reliability, one of the best that any standardized test had ever achieved. . . . So confident of the S.A.T.'s reliability was E.T.S. that the basic technique it developed for catching cheaters was simply to compare first and second scores, and to mount an investigation in the case of any very large increase. E.T.S. was sure that substantially increasing one's score could be accomplished only by nefarious means.
But Kaplan wasn't cheating. His great contribution was to prove that the S.A.T. was eminently coachable--that whatever it was that the test was measuring was less like a blood sample than like a heart rate, a vital sign that could be altered through the right exercises. In those days, for instance, the test was a secret. Students walking in to take the S.A.T. were often in a state of terrified ignorance about what to expect. (It wasn't until the early eighties that the E.T.S. was forced to release copies of old test questions to the public.) So Kaplan would have "Thank Goodness It's Over" pizza parties after each S.A.T. As his students talked about the questions they had faced, he and his staff would listen and take notes, trying to get a sense of how better to structure their coaching. "Every night I stayed up past midnight writing new questions and study materials," he writes. "I spent hours trying to understand the design of the test, trying to think like the test makers, anticipating the types of questions my students would face." His notes were typed up the next day, cranked out on a Gestetner machine, hung to dry in the office, then snatched off the line and given to waiting students. If students knew what the S.A.T. was like, he reasoned, they would be more confident. They could skip the instructions and save time. They could learn how to pace themselves. They would guess more intelligently. (For a question with five choices, a right answer is worth one point but a wrong answer results in minus one-quarter of a point--which is why students were always warned that guessing was penalized. In reality, of course, if a student can eliminate even one obviously wrong possibility from the list of choices, guessing becomes an intelligent strategy.) The S.A.T. was a test devised by a particular institution, by a particular kind of person, operating from a particular mind-set. It had an ideology, and Kaplan realized that anyone who understood that ideology would have a tremendous advantage.
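The guessing arithmetic is easy to make explicit: with five choices, a blind guess has an expected value of zero, and eliminating even one wrong choice makes the expected value positive. A quick check of those two cases:

    # Expected value of guessing on a five-choice S.A.T. question:
    # +1 for a right answer, -1/4 for a wrong one.
    def expected_value(choices_remaining: int) -> float:
        p_right = 1 / choices_remaining
        return p_right * 1.0 + (1 - p_right) * (-0.25)

    print(expected_value(5))  # 0.0    -- blind guessing neither helps nor hurts
    print(expected_value(4))  # 0.0625 -- eliminating one choice makes guessing pay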
Critics of the S.A.T. have long made a kind of parlor game of seeing how many questions on the reading-comprehension section (where a passage is followed by a series of multiple-choice questions about its meaning) can be answered without reading the passage. David Owen, in the anti-S.A.T. account "None of the Above," gives the following example, adapted from an actual S.A.T. exam:
1.
The main idea of the passage is that:
A) a constricted view of [this novel] is natural and acceptable
B) a novel should not depict a vanished society
C) a good novel is an intellectual rather than an emotional experience
D) many readers have seen only the comedy [in this novel]
E) [this novel] should be read with sensitivity and an open mind
If you've never seen an S.A.T. before, it might be difficult to guess the right answer. But if, through practice and exposure, you have managed to assimilate the ideology of the S.A.T.--the kind of decent, middlebrow earnestness that permeates the test--it's possible to develop a kind of gut feeling for the right answer, the confidence to predict, in the pressure and rush of examination time, what the S.A.T. is looking for. A is suspiciously postmodern. B is far too dogmatic. C is something that you would never say to an eager, college-bound student. Is it D? Perhaps, but D seems too small a point. It's probably E--and, sure enough, it is.
With that in mind, try this question:
2.
The author of [this passage] implies that a work of art is properly judged on the basis of its:
A) universality of human experience truthfully recorded
B) popularity and critical acclaim in its own age
C) openness to varied interpretations, including seemingly contradictory ones
D) avoidance of political and social issues of minor importance
E) continued popularity through different eras and with different societies
Is it any surprise that the answer is A? Bob Schaeffer, the public education director of the anti-test group FairTest, says that when he got a copy of the latest version of the S.A.T. the first thing he did was try the reading comprehension section blind. He got twelve out of thirteen questions right. The math portion of the S.A.T. is perhaps a better example of how coachable the test can be. Here is another question, cited by Owen, from an old S.A.T.:
In how many different color combinations can 3 balls be painted if each ball is painted one color and there are 3 colors available? (Order is not considered; e.g. red, blue, red is considered the same combination as red, red, blue.)
A) 4
B) 6
C) 9
D) 10
E) 27
This was, Owen points out, the twenty-fifth question in a twenty-five-question math section. S.A.T.s--like virtually all standardized tests--rank their math questions from easiest to hardest. If the hardest questions came first, the theory goes, weaker students would be so intimidated as they began the test that they might throw up their hands in despair. So this is a "hard" question. The second thing to understand about the S.A.T. is that it only really works if good students get the hard questions right and poor students get the hard questions wrong. If anyone can guess or blunder his way into the right answer to a hard question, then the test isn't doing its job. So this is the second clue: the answer to this question must not be something that an average student might blunder into answering correctly. With these two facts in mind, Owen says, don't focus on the question. Just look at the numbers: there are three balls and three colors. The average student is most likely to guess by doing one of three things--adding three and three, multiplying three times three, or, if he is feeling more adventurous, multiplying three by three by three. So six, nine, and twenty-seven are out. That leaves four and ten. Now, he says, read the problem. It can't be four, since anyone can think of more than four combinations. The correct answer must be D, 10.
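For the record, the answer can also be derived rather than reverse-engineered: choosing colors for three indistinguishable balls from three available colors is a combinations-with-repetition problem, C(3 + 3 - 1, 3) = C(5, 3) = 10. A brute-force check confirms it:

    # Counting color combinations for 3 balls and 3 colors, order ignored.
    from itertools import combinations_with_replacement
    from math import comb

    colors = ["red", "blue", "green"]
    print(len(list(combinations_with_replacement(colors, 3))))  # 10, by enumeration
    print(comb(3 + 3 - 1, 3))                                   # 10, the closed-form count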
Does being able to answer that question mean that a student has a greater "aptitude" for math? Of course not. It just means that he had a clever teacher. Kaplan once determined that the testmakers were fond of geometric problems involving the Pythagorean theorem. So an entire generation of Kaplan students were taught "boo, boo, boo, square root of two," to help them remember how the Pythagorean formula applies to an isosceles right triangle. "It was usually not lack of ability," Kaplan writes, "but poor study habits, inadequate instruction or a combination of the two that jeopardized students' performance." The S.A.T. was not an aptitude test at all.
4.
In proving that the S.A.T. was coachable, Stanley Kaplan did something else, which was of even greater importance. He undermined the use of aptitude tests as a means of social engineering. In the years immediately before and after the First World War, for instance, the country's élite colleges faced what became known as "the Jewish problem." They were being inundated with the children of Eastern European Jewish immigrants. These students came from the lower middle class and they disrupted the genteel Wasp sensibility that had been so much a part of the Ivy League tradition. They were guilty of "underliving and overworking." In the words of one writer, they "worked far into each night [and] their lessons next morning were letter perfect." They were "socially untrained," one Harvard professor wrote, "and their bodily habits are not good." But how could a college keep Jews out? Columbia University had a policy that the New York State Regents Examinations--the statewide curriculum-based high-school-graduation examination--could be used as the basis for admission, and the plain truth was that Jews did extraordinarily well on the Regents Exams. One solution was simply to put a quota on the number of Jews, which is what Harvard explored. The other idea, which Columbia followed, was to require applicants to take an aptitude test. According to Herbert Hawkes, the dean of Columbia College during this period, because the typical Jewish student was simply a "grind," who excelled on the Regents Exams because he worked so hard, a test of innate intelligence would put him back in his place. "We have not eliminated boys because they were Jews and do not propose to do so," Hawkes wrote in 1918:
We have honestly attempted to eliminate the lowest grade of applicant and it turns out that a good many of the low grade men are New York City Jews. It is a fact that boys of foreign parentage who have no background in many cases attempt to educate themselves beyond their intelligence. Their accomplishment is over 100% of their ability on account of their tremendous energy and ambition. I do not believe however that a College would do well to admit too many men of low mentality who have ambition but not brains.
Today, Hawkes's anti-Semitism seems absurd, but he was by no means the last person to look to aptitude tests as a means of separating ambition from brains. The great selling point of the S.A.T. has always been that it promises to reveal whether the high-school senior with a 3.0 G.P.A. is someone who could have done much better if he had been properly educated or someone who is already at the limit of his abilities. We want to know that information because, like Hawkes, we prefer naturals to grinds: we think that people who achieve based on vast reserves of innate ability are somehow more promising and more worthy than those who simply work hard.
But is this distinction real? Some years ago, a group headed by the British psychologist John Sloboda conducted a study of musical talent. The group looked at two hundred and fifty-six young musicians, between the ages of ten and sixteen, drawn from élite music academies and public-school music programs alike. They interviewed all the students and their parents and recorded how each student did in England's national music-examination system, which, the researchers felt, gave them a relatively objective measure of musical ability. "What we found was that the best predictor of where you were on that scale was the number of hours practiced," Sloboda says. This is, if you think about it, a little hard to believe. We conceive musical ability to be a "talent"--people have an aptitude for music--and so it would make sense that some number of students could excel at the music exam without practicing very much. Yet Sloboda couldn't find any. The kids who scored the best on the test were, on average, practicing eight hundred per cent more than the kids at the bottom. "People have this idea that there are those who learn better than others, can get further on less effort," Sloboda says. "On average, our data refuted that. Whether you're a dropout or at the best school, where you end up can be predicted by how much you practice."
Sloboda found another striking similarity among the "musical" children. They all had parents who were unusually invested in their musical education. It wasn't necessarily the case that the parents were themselves musicians or musically inclined. It was simply that they wanted their children to be that way. "The parents of the high achievers did things that most parents just don't do," he said. "They didn't simply drop their child at the door of the teacher. They went into the practice room. They took notes on what the teacher said, and when they got home they would say, Remember when your teacher said do this and that. There was a huge amount of time and motivational investment by the parents." Does this mean that there is no such thing as musical talent? Of course not. Most of those hardworking children with pushy parents aren't going to turn out to be Itzhak Perlmans; some will be second violinists in their community orchestra. The point is that when it comes to a relatively well-defined and structured task--like playing an instrument or taking an exam--how hard you work and how supportive your parents are have a lot more to do with success than we ordinarily imagine. Ability cannot be separated from effort. The testmakers never understood that, which is why they thought they could weed out the grinds. But educators increasingly do, and that is why college admissions are now in such upheaval. The Texas state-university system, for example, has, since 1997, automatically admitted any student who places in the top ten per cent of his or her high-school class--regardless of S.A.T. score. Critics of the policy said that it would open the door to students from marginal schools whose S.A.T. scores would normally have been too low for admission to the University of Texas--and that is exactly what happened. But so what? The "top ten percenters," as they are known, may have lower S.A.T. scores, but they get excellent grades. In fact, their college G.P.A.s are the equal of students who scored two hundred to three hundred points higher on the S.A.T. In other words, the determination and hard work that propel someone to the top of his high-school class--even in cases where that high school is impoverished--are more important to succeeding in college (and, for that matter, in life) than whatever abstract quality the S.A.T. purports to measure. The importance of the Texas experience cannot be overstated. Here, at last, is an intelligent alternative to affirmative action, a way to find successful minority students without sacrificing academic performance. But we would never have got this far without Stanley Kaplan--without someone first coming along and puncturing the mystique of the S.A.T. "Acquiring test-taking skills is the same as learning to play the piano or ride a bicycle," Kaplan writes. "It requires practice, practice, practice. Repetition breeds familiarity. Familiarity breeds confidence." In this, as in so many things, the grind was the natural.
To read Kaplan's memoir is to be struck by what a representative figure he was in the postwar sociological miracle that was Jewish Brooklyn. This is the lower-middle-class, second- and third-generation immigrant world, stretching from Prospect Park to Sheepshead Bay, that ended up peopling the upper reaches of American professional life. Thousands of students from those neighborhoods made their way through Kaplan's classroom in the fifties and sixties, many along what Kaplan calls the "heavily traveled path" from Brooklyn to Cornell, Yale, and the University of Michigan. Kaplan writes of one student who increased his score by three hundred and forty points, and ended up with a Ph.D. and a position as a scientist at Xerox. "Debbie" improved her S.A.T. by five hundred points, got into the University of Chicago, and earned a Ph.D. in clinical psychology. Arthur Levine, the president of Teachers College at Columbia University, raised his S.A.T.s by two hundred and eighty-two points, "making it possible," he writes on the book's jacket, "for me to attend a better university than I ever would have imagined." Charles Schumer, the senior senator from New York, studied while he worked the mimeograph machine in Kaplan's office, and ended up with close to a perfect sixteen hundred.
These students faced a system designed to thwart the hard worker, and what did they do? They got together with their pushy parents and outworked it. Kaplan says that he knew a "strapping athlete who became physically ill before taking the S.A.T. because his mother was so demanding." There was the mother who called him to say, "Mr. Kaplan, I think I'm going to commit suicide. My son made only a 1000 on the S.A.T." "One mother wanted her straight-A son to have an extra edge, so she brought him to my basement for years for private tutoring in basic subjects," Kaplan recalls. "He was extremely bright and today is one of the country's most successful ophthalmologists." Another student was "so nervous that his mother accompanied him to class armed with a supply of terry-cloth towels. She stood outside the classroom and when he emerged from our class sessions dripping in sweat, she wiped him dry and then nudged him back into the classroom." Then, of course, there was the formidable four-foot-eight figure of Ericka Kaplan, granddaughter of the chief rabbi of the synagogue of Prague. "My mother was a perfectionist whether she was keeping the company books or setting the dinner table," Kaplan writes, still in her thrall today. "She was my best cheerleader, the reason I performed so well, and I constantly strove to please her." What chance did even the most artfully constructed S.A.T. have against the mothers of Brooklyn?
5.
Stanley Kaplan graduated No. 2 in his class at City College, and won the school's Award for Excellence in Natural Sciences. He wanted to be a doctor, and he applied to five medical schools, confident that he would be accepted. To his shock, he was rejected by every single one. Medical schools did not take public colleges like City College seriously. More important, in the forties there was a limit to how many Jews they were willing to accept. "The term meritocracy--or success based on merit rather than heritage, wealth, or social status--wasn't even coined yet," Kaplan writes, "and the methods of selecting students based on talent, not privilege, were still evolving."
That's why Stanley Kaplan was always pained by those who thought that what went on in his basement was somehow subversive. He loved the S.A.T. He thought that the test gave people like him the best chance of overcoming discrimination. As he saw it, he was simply giving the middle-class students of Brooklyn the same shot at a bright future that their counterparts in the private schools of Manhattan had. In 1983, after years of hostility, the College Board invited him to speak at its annual convention. It was one of the highlights of Kaplan's life. "Never, in my wildest dreams," he began, "did I ever think I'd be speaking to you here today."
The truth is, however, that Stanley Kaplan was wrong. What he did in his basement was subversive. The S.A.T. was designed as an abstract intellectual tool. It never occurred to its makers that aptitude was a social matter: that what people were capable of was affected by what they knew, and what they knew was affected by what they were taught, and what they were taught was affected by the industry of their teachers and parents. And if what the S.A.T. was measuring, in no small part, was the industry of teachers and parents, then what did it mean? Stanley Kaplan may have loved the S.A.T. But when he stood up and recited "boo, boo, boo, square root of two," he killed it.
The Social Life of Paper
March 25, 2002
BOOKS
Looking for method in the mess.
1.
On a busy day, a typical air-traffic controller might be in charge of as many as twenty-five airplanes at a time--some ascending, some descending, each at a different altitude and travelling at a different speed. He peers at a large, monochromatic radar console, tracking the movement of tiny tagged blips moving slowly across the screen. He talks to the sector where a plane is headed, and talks to the pilots passing through his sector, and talks to the other controllers about any new traffic on the horizon. And, as a controller juggles all those planes overhead, he scribbles notes on little pieces of paper, moving them around on his desk as he does. Air-traffic control depends on computers and radar. It also depends, heavily, on paper and ink.
When people talk about the need to modernize the American air-traffic-control system, this is, in large part, what they are referring to. Whenever a plane takes off, the basic data about the flight -- the type of plane, the radar I.D. number, the requested altitude, the destination -- are printed out on a stiff piece of paper, perhaps one and a half by six and a half inches, known as a flight strip. And as the plane passes through each sector of the airspace the controller jots down, using a kind of shorthand, everything new that is happening to the plane -- its speed, say, and where it's heading, clearances from ground control, holding instructions, comments on the pilot. It's a method that dates back to the days before radar, and it drives critics of the air-traffic-control system crazy. Why, in this day and age, are planes being handled like breakfast orders in a roadside diner?
This is one of the great puzzles of the modern workplace. Computer technology was supposed to replace paper. But that hasn't happened. Every country in the Western world uses more paper today, on a per-capita basis, than it did ten years ago. The consumption of uncoated free-sheet paper, for instance -- the most common kind of office paper -- rose almost fifteen per cent in the United States between 1995 and 2000. This is generally taken as evidence of how hard it is to eradicate old, wasteful habits and of how stubbornly resistant we are to the efficiencies offered by computerization. A number of cognitive psychologists and ergonomics experts, however, don't agree. Paper has persisted, they argue, for very good reasons: when it comes to performing certain kinds of cognitive tasks, paper has many advantages over computers. The dismay people feel at the sight of a messy desk -- or the spectacle of air-traffic controllers tracking flights through notes scribbled on paper strips -- arises from a fundamental confusion about the role that paper plays in our lives.
2.
The case for paper is made most eloquently in "The Myth of the Paperless Office" (M.I.T.; $24.95), by two social scientists, Abigail Sellen and Richard Harper. They begin their book with an account of a study they conducted at the International Monetary Fund, in Washington, D.C. Economists at the I.M.F. spend most of their time writing reports on complicated economic questions, work that would seem to be perfectly suited to sitting in front of a computer. Nonetheless, the I.M.F. is awash in paper, and Sellen and Harper wanted to find out why. Their answer is that the business of writing reports -- at least at the I.M.F. -- is an intensely collaborative process, involving the professional judgments and contributions of many people. The economists bring drafts of reports to conference rooms, spread out the relevant pages, and negotiate changes with one another. They go back to their offices and jot down comments in the margin, taking advantage of the freedom offered by the informality of the handwritten note. Then they deliver the annotated draft to the author in person, taking him, page by page, through the suggested changes. At the end of the process, the author spreads out all the pages with comments on his desk and starts to enter them on the computer -- moving the pages around as he works, organizing and reorganizing, saving and discarding.
Without paper, this kind of collaborative, iterative work process would be much more difficult. According to Sellen and Harper, paper has a unique set of "affordances" -- that is, qualities that permit specific kinds of uses. Paper is tangible: we can pick up a document, flip through it, read little bits here and there, and quickly get a sense of it. (In another study on reading habits, Sellen and Harper observed that in the workplace, people almost never read a document sequentially, from beginning to end, the way they would read a novel.) Paper is spatially flexible, meaning that we can spread it out and arrange it in the way that suits us best. And it's tailorable: we can easily annotate it, and scribble on it as we read, without altering the original text. Digital documents, of course, have their own affordances. They can be easily searched, shared, stored, accessed remotely, and linked to other relevant material. But they lack the affordances that really matter to a group of people working together on a report. Sellen and Harper write:
Because paper is a physical embodiment of information, actions performed in relation to paper are, to a large extent, made visible to one's colleagues. Reviewers sitting around a desk could tell whether a colleague was turning toward or away from a report; whether she was flicking through it or setting it aside. Contrast this with watching someone across a desk looking at a document on a laptop. What are they looking at? Where in the document are they? Are they really reading their e-mail? Knowing these things is important because they help a group coördinate its discussions and reach a shared understanding of what is being discussed.
3.
Paper enables a certain kind of thinking. Picture, for instance, the top of your desk. Chances are that you have a keyboard and a computer screen off to one side, and a clear space roughly eighteen inches square in front of your chair. What covers the rest of the desktop is probably piles -- piles of papers, journals, magazines, binders, postcards, videotapes, and all the other artifacts of the knowledge economy. The piles look like a mess, but they aren't. When a group at Apple Computer studied piling behavior several years ago, they found that even the most disorderly piles usually make perfect sense to the piler, and that office workers could hold forth in great detail about the precise history and meaning of their piles. The pile closest to the cleared, eighteen-inch-square working area, for example, generally represents the most urgent business, and within that pile the most important document of all is likely to be at the top. Piles are living, breathing archives. Over time, they get broken down and resorted, sometimes chronologically and sometimes thematically and sometimes chronologically and thematically; clues about certain documents may be physically embedded in the file by, say, stacking a certain piece of paper at an angle or inserting dividers into the stack.
But why do we pile documents instead of filing them? Because piles represent the process of active, ongoing thinking. The psychologist Alison Kidd, whose research Sellen and Harper refer to extensively, argues that "knowledge workers" use the physical space of the desktop to hold "ideas which they cannot yet categorize or even decide how they might use." The messy desk is not necessarily a sign of disorganization. It may be a sign of complexity: those who deal with many unresolved ideas simultaneously cannot sort and file the papers on their desks, because they haven't yet sorted and filed the ideas in their head. Kidd writes that many of the people she talked to use the papers on their desks as contextual cues to "recover a complex set of threads without difficulty and delay" when they come in on a Monday morning, or after their work has been interrupted by a phone call. What we see when we look at the piles on our desks is, in a sense, the contents of our brains.
Sellen and Harper arrived at similar findings when they did some consulting work with a chocolate manufacturer. The people in the firm they were most interested in were the buyers -- the staff who handled the company's relationships with its venders, from cocoa and sugar manufacturers to advertisers. The buyers kept folders (containing contracts, correspondence, meeting notes, and so forth) on every supplier they had dealings with. The company wanted to move the information in those documents online, to save space and money, and make it easier for everyone in the firm to have access to it. That sounds like an eminently rational thing to do. But when Sellen and Harper looked at the folders they discovered that they contained all kinds of idiosyncratic material -- advertising paraphernalia, printouts of e-mails, presentation notes, and letters -- much of which had been annotated in the margins with thoughts and amendments and, they write, "perhaps most important, comments about problems and issues with a supplier's performance not intended for the supplier's eyes." The information in each folder was organized -- if it was organized at all -- according to the whims of the particular buyer. Whenever other people wanted to look at a document, they generally had to be walked through it by the buyer who "owned" it, because it simply wouldn't make sense otherwise. The much advertised advantage of digitizing documents -- that they could be made available to anyone, at any time -- was illusory: documents cannot speak for themselves. "All of this emphasized that most of what constituted a buyer's expertise resulted from involvement with the buyer's own suppliers through a long history of phone calls and meetings," Sellen and Harper write:
The correspondence, notes, and other documents such discussions would produce formed a significant part of the documents buyers kept. These materials therefore supported rather than constituted the expertise of the buyers. In other words, the knowledge existed not so much in the documents as in the heads of the people who owned them -- in their memories of what the documents were, in their knowledge of the history of that supplier relationship, and in the recollections that were prompted whenever they went through the files.
4.
This idea that paper facilitates a highly specialized cognitive and social process is a far cry from the way we have historically thought about the stuff. Paper first began to proliferate in the workplace in the late nineteenth century as part of the move toward "systematic management." To cope with the complexity of the industrial economy, managers were instituting company-wide policies and demanding monthly, weekly, or even daily updates from their subordinates. Thus was born the monthly sales report, and the office manual and the internal company newsletter. The typewriter took off in the eighteen-eighties, making it possible to create documents in a fraction of the time it had previously taken, and that was followed closely by the advent of carbon paper, which meant that a typist could create ten copies of that document simultaneously. If you were, say, a railroad company, then you would now have a secretary at the company headquarters type up a schedule every week, setting out what train was travelling in what direction at what time, because in the mid-nineteenth century collisions were a terrible problem. Then the secretary would make ten carbon copies of that schedule and send them out to the stations along your railway line. Paper was important not to facilitate creative collaboration and thought but as an instrument of control.
Perhaps no one embodied this notion more than the turn-of-the-century reformer Melvil Dewey. Dewey has largely been forgotten by history, perhaps because he was such a nasty fellow -- an outspoken racist and anti-Semite -- but in his day he dominated America's thinking about the workplace. He invented the Dewey decimal system, which revolutionized the organization of libraries. He was an ardent advocate of shorthand and of the metric system, and was so obsessed with time-saving and simplification that he changed his first name from Melville to the more logical Melvil. (He also pushed for the adoption of "catalog" in place of "catalogue," and of "thruway" to describe major highways, a usage that survives to this day in New York State). Dewey's principal business was something called the Library Bureau, which was essentially the Office Depot of his day, selling card catalogues, cabinets, office chairs and tables, pre-printed business forms, and, most important, filing cabinets. Previously, businessmen had stored their documents in cumbersome cases, or folded and labelled the pieces of paper and stuck them in the pigeonholes of the secretary desks so common in the Victorian era. What Dewey proposed was essentially an enlarged version of a card catalogue, where paper documents hung vertically in long drawers.
The vertical file was a stunning accomplishment. In those efficiency-obsessed days, it prompted books and articles and debates and ended up winning a gold medal at the 1893 World's Fair, because it so neatly addressed the threat of disorder posed by the proliferation of paper. What good was that railroad schedule, after all, if it was lost on someone's desk? Now a railroad could buy one of Dewey's vertical filing cabinets, and put the schedule under "S," where everyone could find it. In "Scrolling Forward: Making Sense of Documents in the Digital Age" (Arcade; $24.95), the computer scientist David M. Levy argues that Dewey was the anti-Walt Whitman, and that his vision of regularizing and standardizing life ended up being just as big a component of the American psyche as Whitman's appeal to embrace the world just as it is. That seems absolutely right. The fact is, the thought of all those memos and reports and manuals made Dewey anxious, and that anxiety has never really gone away, even in the face of evidence that paper is no longer something to be anxious about.
When Thomas Edison invented the phonograph, for example, how did he imagine it would be used? As a dictation device that a businessman could pass around the office in place of a paper memo. In 1945, the computer pioneer Vannevar Bush imagined what he called a "memex" -- a mechanized library and filing cabinet, on which an office worker would store all his relevant information without the need for paper files at all. So, too, with the information-technology wizards who have descended on the workplace in recent years. Instead of a real desktop, they have offered us the computer desktop, where cookie-cutter icons run in orderly rows across a soothing background, implicitly promising to bring order to the chaos of our offices.
Sellen and Harper include in their book a photograph of an office piled high with stacks of paper. The occupant of the office -- a researcher in Xerox's European research facility -- was considered neither ineffective nor inefficient. Quite the contrary: he was, they tell us, legendary in being able to find any document in his office very quickly. But the managers of the laboratory were uncomfortable with his office because of what it said about their laboratory. They were, after all, an organization looking to develop digital workplace solutions. "They wanted to show that this was a workplace reaching out to the future rather than being trapped in an inefficient past," Sellen and Harper write. "Yet, if this individual's office was anything to go by, the reality was that this workplace of the future was full of paper." Whenever senior colleagues came by the office, then, the man with the messy desk was instructed to put his papers in boxes and hide them under the stairs. The irony is, of course, that it was not the researcher who was trapped in an inefficient past but the managers. They were captives of the nineteenth-century notion that paper was most useful when it was put away. They were channelling Melvil Dewey. But this is a different era. In the tasks that face modern knowledge workers, paper is most useful out in the open, where it can be shuffled and sorted and annotated and spread out. The mark of the contemporary office is not the file. It's the pile.
5.
Air-traffic controllers are quintessential knowledge workers. They perform a rarefied version of the task faced by the economists at the I.M.F. when they sit down at the computer with the comments and drafts of five other people spread around them, or the manager when she gets to her office on Monday morning, looks at the piles of papers on her desk, and tries to make sense of all the things she has to do in the coming week. When an air-traffic controller looks at his radar, he sees a two-dimensional picture of where the planes in his sector are. But what he needs to know is where his planes will be. He has to be able to take the evidence from radar, what he hears from the pilots and other controllers, and what he has written down on the flight strips in front of him, and construct a three-dimensional "picture" of all the planes in his sector. Psychologists call the ability to create that mental picture "situation awareness." "Situation awareness operates on three levels," says Mica Endsley, the president of S.A. Technologies, in Georgia, and perhaps the country's leading expert on the subject. "One is perceiving. Second is understanding what the information means -- analogous to reading comprehension. That's where you or I would have problems. We'd see the blips on the screen, and it wouldn't mean anything to us. The highest level, though, is projection -- the ability to predict which aircraft are coming in and when. You've got to be able to look into the future, probably by as much as five minutes."
Psychologists believe that those so-called flight strips play a major role in helping controllers achieve this situation awareness. Recently, for example, Wendy Mackay, a computer scientist now working in Paris, spent several months at an air-traffic-control facility near Orly Airport, in Paris. The French air-traffic-control system is virtually identical to the American system. One controller, the radar controller, is responsible for the radar. He has a partner, the planning controller, whose job is to alert the radar controller to incoming traffic, and what Mackay observed was how beautifully the strips enable efficient interaction between these two people. The planning controller, for instance, overhears what his partner is saying on the radio, and watches him annotate strips. If he has a new strip, he might keep it just out of his partner's visual field until it is relevant. "She [the planner] moves it into his peripheral view if the strip should be dealt with soon, but not immediately," Mackay writes. "If the problem is urgent, she will physically move it into his focal view, placing the strip on top of the stripboard or, rarely, inserting it."
Those strips moving in and out of the peripheral view of the controller serve as cognitive cues, which the controller uses to help keep the "picture" of his sector clear in his head. When taking over a control position, controllers touch and rearrange the strips in front of them. When they are given a new strip, they are forced mentally to register a new flight and the new traffic situation. By writing on the strips, they can off-load information, keeping their minds free to attend to other matters. The controller's flight strips are like the piles of paper on a desk: they are the physical manifestations of what goes on inside his head. Is it any wonder that the modernization of the air-traffic-control system has taken so long? No one wants to do anything that might disrupt that critical mental process.
This is, of course, a difficult conclusion for us to accept. Like the managers of the office-technology lab, we have in our heads the notion that an air-traffic-control center ought to be a pristine and gleaming place, full of the latest electronic gadgetry. We think of all those flight strips as cluttering and confusing the work of the office, and we fret about where all that paper will go. But, as Sellen and Harper point out, we needn't worry. It is only if paper's usefulness is in the information written directly on it that it must be stored. If its usefulness lies in the promotion of ongoing creative thinking, then, once that thinking is finished, the paper becomes superfluous. The solution to our paper problem, they write, is not to use less paper but to keep less paper. Why bother filing at all? Everything we know about the workplace suggests that few if any knowledge workers ever refer to documents again once they have filed them away, which should come as no surprise, since paper is a lousy way to archive information. It's too hard to search and it takes up too much space. Besides, we all have the best filing system ever invented, right there on our desks -- the personal computer. That is the irony of the P.C.: the workplace problem that it solves is the nineteenth-century anxiety. It's a better filing cabinet than the original vertical file, and if Dewey were alive today, he'd no doubt be working very happily in an information-technology department somewhere. The problem that paper solves, by contrast, is the problem that most concerns us today, which is how to support knowledge work. In fretting over paper, we have been tripped up by a historical accident of innovation, confused by the assumption that the most important invention is always the most recent. Had the computer come first -- and paper second -- no one would raise an eyebrow at the flight strips cluttering our air-traffic-control centers.
Blowing Up
April 22 & 29, 2002
DEPARTMENT OF FINANCE
How Nassim Taleb turned the inevitability of disaster into an investment strategy
1.
One day in 1996, a Wall Street trader named Nassim Nicholas Taleb went to see Victor Niederhoffer. Victor Niederhoffer was one of the most successful money managers in the country. He lived and worked out of a thirteen-acre compound in Fairfield County, Connecticut, and when Taleb drove up that day from his home in Larchmont he had to give his name at the gate, and then make his way down a long, curving driveway. Niederhoffer had a squash court and a tennis court and a swimming pool and a colossal, faux-alpine mansion in which virtually every square inch of space was covered with eighteenth- and nineteenth-century American folk art. In those days, he played tennis regularly with the billionaire financier George Soros. He had just written a best-selling book, "The Education of a Speculator," dedicated to his father, Artie Niederhoffer, a police officer from Coney Island. He had a huge and eclectic library and a seemingly insatiable desire for knowledge. When Niederhoffer went to Harvard as an undergraduate, he showed up for the very first squash practice and announced that he would someday be the best in that sport; and, sure enough, he soon beat the legendary Shariff Khan to win the U.S. Open squash championship. That was the kind of man Niederhoffer was. He had heard of Taleb's growing reputation in the esoteric field of options trading, and summoned him to Connecticut. Taleb was in awe.
"He didn't talk much, so I observed him," Taleb recalls. "I spent seven hours watching him trade. Everyone else in his office was in his twenties, and he was in his fifties, and he had the most energy of them all. Then, after the markets closed, he went out to hit a thousand backhands on the tennis court." Taleb is Greek-Orthodox Lebanese and his first language was French, and in his pronunciation the name Niederhoffer comes out as the slightly more exotic Nieder hoffer. "Here was a guy living in a mansion with thousands of books, and that was my dream as a child," Taleb went on. "He was part chevalier, part scholar. My respect for him was intense." There was just one problem, however, and it is the key to understanding the strange path that Nassim Taleb has chosen, and the position he now holds as Wall Street's principal dissident. Despite his envy and admiration, he did not want to be Victor Niederhoffer -- not then, not now, and not even for a moment in between. For when he looked around him, at the books and the tennis court and the folk art on the walls -- when he contemplated the countless millions that Niederhoffer had made over the years -- he could not escape the thought that it might all have been the result of sheer, dumb luck.
Taleb knew how heretical that thought was. Wall Street was dedicated to the principle that when it came to playing the markets there was such a thing as expertise, that skill and insight mattered in investing just as skill and insight mattered in surgery and golf and flying fighter jets. Those who had the foresight to grasp the role that software would play in the modern world bought Microsoft in 1985, and made a fortune. Those who understood the psychology of investment bubbles sold their tech stocks at the end of 1999 and escaped the Nasdaq crash. Warren Buffett was known as the "sage of Omaha" because it seemed incontrovertible that if you started with nothing and ended up with billions then you had to be smarter than everyone else: Buffett was successful for a reason. Yet how could you know, Taleb wondered, whether that reason was responsible for someone's success, or simply a rationalization invented after the fact? George Soros seemed to be successful for a reason, too. He used to say that he followed something called "the theory of reflexivity." But then, later, Soros wrote that in most situations his theory "is so feeble that it can be safely ignored." An old trading partner of Taleb's, a man named Jean-Manuel Rozan, once spent an entire afternoon arguing about the stock market with Soros. Soros was vehemently bearish, and he had an elaborate theory to explain why, which turned out to be entirely wrong. The stock market boomed. Two years later, Rozan ran into Soros at a tennis tournament. "Do you remember our conversation?" Rozan asked. "I recall it very well," Soros replied. "I changed my mind, and made an absolute fortune." He changed his mind! The truest thing about Soros seemed to be what his son Robert had once said:
My father will sit down and give you theories to explain why he does this or that. But I remember seeing it as a kid and thinking, Jesus Christ, at least half of this is bullshit. I mean, you know the reason he changes his position on the market or whatever is because his back starts killing him. It has nothing to do with reason. He literally goes into a spasm, and it's this early warning sign.
For Taleb, then, the question why someone was a success in the financial marketplace was vexing. Taleb could do the arithmetic in his head. Suppose that there were ten thousand investment managers out there, which is not an outlandish number, and that every year half of them, entirely by chance, made money and half of them, entirely by chance, lost money. And suppose that every year the losers were tossed out, and the game replayed with those who remained. At the end of five years, there would be three hundred and thirteen people who had made money in every one of those years, and after ten years there would be nine people who had made money every single year in a row, all out of pure luck. Niederhoffer, like Buffett and Soros, was a brilliant man. He had a Ph.D. in economics from the University of Chicago. He had pioneered the idea that through close mathematical analysis of patterns in the market an investor could identify profitable anomalies. But who was to say that he wasn't one of those lucky nine? And who was to say that in the eleventh year Niederhoffer would be one of the unlucky ones, who suddenly lost it all, who suddenly, as they say on Wall Street, "blew up"?
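Taleb's mental arithmetic is easy to reproduce. The sketch below simply restates the paragraph's coin-flip logic in Python; the numbers are the article's, and the code is only an illustration:

```python
# If success were pure chance, half of the surviving managers would "win" each year.
managers = 10_000
print(managers / 2**5)    # 312.5  -- roughly the three hundred and thirteen five-year streaks
print(managers / 2**10)   # ~9.8   -- roughly the nine who make money ten years running
```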
Taleb remembered his childhood in Lebanon and watching his country turn, as he puts it, from "paradise to hell" in six months. His family once owned vast tracts of land in northern Lebanon. All of that was gone. He remembered his grandfather, the former Deputy Prime Minister of Lebanon and the son of a Deputy Prime Minister of Lebanon and a man of great personal dignity, living out his days in a dowdy apartment in Athens. That was the problem with a world in which there was so much uncertainty about why things ended up the way they did: you never knew whether one day your luck would turn and it would all be washed away.
So here is what Taleb took from Niederhoffer. He saw that Niederhoffer was a serious athlete, and he decided that he would be, too. He would bicycle to work and exercise in the gym. Niederhoffer was a staunch empiricist, who turned to Taleb that day in Connecticut and said to him sternly, "Everything that can be tested must be tested," and so when Taleb started his own hedge fund, a few years later, he called it Empirica. But that is where it stopped. Nassim Taleb decided that he could not pursue an investment strategy that had any chance of blowing up.
2.
Nassim Taleb is a tall, muscular man in his early forties, with a salt-and-pepper beard and a balding head. His eyebrows are heavy and his nose is long. His skin has the olive hue of the Levant. He is a man of moods, and when his world turns dark the eyebrows come together and the eyes narrow and it is as if he were giving off an electrical charge. It is said, by some of his friends, that he looks like Salman Rushdie, although at his office his staff have pinned to the bulletin board a photograph of a mullah they swear is Taleb's long-lost twin, while Taleb himself maintains, wholly implausibly, that he resembles Sean Connery. He lives in a four-bedroom Tudor with twenty-six Russian Orthodox icons, nineteen Roman heads, and four thousand books, and he rises at dawn to spend an hour writing. He is the author of two books, the first a technical and highly regarded work on derivatives, and the second a treatise entitled "Fooled by Randomness," which was published last year and is to conventional Wall Street wisdom approximately what Martin Luther's ninety-five theses were to the Catholic Church. Some afternoons, he drives into the city and attends a philosophy lecture at City University. During the school year, in the evenings, he teaches a graduate course in finance at New York University, after which he can often be found at the bar at Odeon Café in Tribeca, holding forth, say, on the finer points of stochastic volatility or his veneration of the Greek poet C. P. Cavafy.
Taleb runs Empirica Capital out of an anonymous, concrete office park somewhere in the woods outside Greenwich, Connecticut. His offices consist, principally, of a trading floor about the size of a Manhattan studio apartment. Taleb sits in one corner, in front of a laptop, surrounded by the rest of his team -- Mark Spitznagel, the chief trader, another trader named Danny Tosto, a programmer named Winn Martin, and a graduate student named Pallop Angsupun. Mark Spitznagel is perhaps thirty. Winn, Danny, and Pallop look as if they belonged in high school. The room has an overstuffed bookshelf in one corner, and a television muted and tuned to CNBC. There are two ancient Greek heads, one next to Taleb's computer and the other, somewhat bafflingly, on the floor, next to the door, as if it were being set out for the trash. There is almost nothing on the walls, except for a slightly battered poster for an exhibition of Greek artifacts, the snapshot of the mullah, and a small pen-and-ink drawing of the patron saint of Empirica Capital, the philosopher Karl Popper.
On a recent spring morning, the staff of Empirica were concerned with solving a thorny problem, having to do with the square root of n, where n is a given number of random observations, and what relation n might have to a speculator's confidence in his estimations. Taleb was up at a whiteboard by the door, his marker squeaking furiously as he scribbled possible solutions. Spitznagel and Pallop looked on intently. Spitznagel is blond and from the Midwest and does yoga: in contrast to Taleb, he exudes a certain laconic levelheadedness. In a bar, Taleb would pick a fight. Spitznagel would break it up. Pallop is of Thai extraction and is doing a Ph.D. in financial mathematics at Princeton. He has longish black hair, and a slightly quizzical air. "Pallop is very lazy," Taleb will remark, to no one in particular, several times over the course of the day, although this is said with such affection that it suggests that "laziness," in the Talebian nomenclature, is a synonym for genius. Pallop's computer was untouched and he often turned his chair around, so that he faced completely away from his desk. He was reading a book by the cognitive psychologists Amos Tversky and Daniel Kahneman, whose arguments, he said a bit disappointedly, were "not really quantifiable." The three argued back and forth about the solution. It appeared that Taleb might be wrong, but before the matter could be resolved the markets opened. Taleb returned to his desk and began to bicker with Spitznagel about what exactly would be put on the company boom box. Spitznagel plays the piano and the French horn and has appointed himself the Empirica d.j. He wanted to play Mahler, and Taleb does not like Mahler. "Mahler is not good for volatility," Taleb complained. "Bach is good. St. Matthew's Passion!" Taleb gestured toward Spitznagel, who was wearing a gray woollen turtleneck. "Look at him. He wants to be like von Karajan, like someone who wants to live in a castle. Technically superior to the rest of us. No chitchatting. Top skier. That's Mark!" As Spitznagel rolled his eyes, a man whom Taleb refers to, somewhat mysteriously, as Dr. Wu wandered in. Dr. Wu works for another hedge fund, down the hall, and is said to be brilliant. He is thin and squints through black-rimmed glasses. He was asked his opinion on the square root of n but declined to answer. "Dr. Wu comes here for intellectual kicks and to borrow books and to talk music with Mark," Taleb explained after their visitor had drifted away. He added darkly, "Dr. Wu is a Mahlerian."
Empirica follows a very particular investment strategy. It trades options, which is to say that it deals not in stocks and bonds but with bets on stocks and bonds. Imagine, for example, that General Motors stock is trading at fifty dollars, and imagine that you are a major investor on Wall Street. An options trader comes up to you with a proposition. What if, within the next three months, he decides to sell you a share of G.M. at forty-five dollars? How much would you charge for agreeing to buy it at that price? You would look at the history of G.M. and see that in a three-month period it has rarely dropped ten per cent, and obviously the trader is only going to make you buy his G.M. at forty-five dollars if the stock drops below that point. So you say you'll make that promise, or sell that option, for a relatively small fee, say, a dime. You are betting on the high probability that G.M. stock will stay relatively calm over the next three months, and if you are right you'll pocket the dime as pure profit. The trader, on the other hand, is betting on the unlikely event that G.M. stock will drop a lot, and if that happens his profits are potentially huge. If the trader bought a million options from you at a dime each and G.M. drops to thirty-five dollars, he'll buy a million shares at thirty-five dollars and turn around and force you to buy them at forty-five dollars, making himself suddenly very rich and you substantially poorer.
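The arithmetic of that trade is worth spelling out. Here is a rough sketch using the article's numbers (a forty-five-dollar strike, a dime of premium, a million options); the framing of the payoff function is mine:

```python
strike = 45.0        # the price at which you have promised to buy G.M.
premium = 0.10       # the dime you collect per option
options = 1_000_000

def seller_pnl(gm_price_at_expiry: float) -> float:
    # You keep the dime; if G.M. finishes below the strike, you must pay
    # forty-five dollars a share for stock worth only the market price.
    loss_per_share = max(strike - gm_price_at_expiry, 0.0)
    return options * (premium - loss_per_share)

print(seller_pnl(50.0))   # +100,000   -- G.M. stays calm and you pocket the dimes
print(seller_pnl(35.0))   # -9,900,000 -- the rare plunge that makes the trader rich
```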
That particular transaction is called, in the argot of Wall Street, an "out-of-the-money option." But an option can be configured in a vast number of ways. You could sell the trader a G.M. option at thirty dollars, or, if you wanted to bet against G.M. stock going up, you could sell a G.M. option at sixty dollars. You could sell or buy options on bonds, on the S. & P. index, on foreign currencies or on mortgages, or on the relationship among any number of financial instruments of your choice; you can bet on the market booming, or the market crashing, or the market staying the same. Options allow investors to gamble heavily and turn one dollar into ten. They also allow investors to hedge their risk. The reason your pension fund may not be wiped out in the next crash is that it has protected itself by buying options. What drives the options game is the notion that the risks represented by all of these bets can be quantified; that by looking at the past behavior of G.M. you can figure out the exact chance of G.M. hitting forty-five dollars in the next three months, and whether at a dollar that option is a good or a bad investment. The process is a lot like the way insurance companies analyze actuarial statistics in order to figure out how much to charge for a life-insurance premium, and to make those calculations every investment bank has, on staff, a team of Ph.D.s, physicists from Russia, applied mathematicians from China, computer scientists from India. On Wall Street, those Ph.D.s are called "quants."
Nassim Taleb and his team at Empirica are quants. But they reject the quant orthodoxy, because they don't believe that things like the stock market behave in the way that physical phenomena like mortality statistics do. Physical events, whether death rates or poker games, are the predictable function of a limited and stable set of factors, and tend to follow what statisticians call a "normal distribution," a bell curve. But do the ups and downs of the market follow a bell curve? The economist Eugene Fama once studied stock prices and pointed out that if they followed a normal distribution you'd expect a really big jump, what he specified as a movement five standard deviations from the mean, once every seven thousand years. In fact, jumps of that magnitude happen in the stock market every three or four years, because investors don't behave with any kind of statistical orderliness. They change their mind. They do stupid things. They copy each other. They panic. Fama concluded that if you charted the ups and downs of the stock market the graph would have a "fat tail," meaning that at the upper and lower ends of the distribution there would be many more outlying events than statisticians used to modelling the physical world would have imagined.
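Fama's "once every seven thousand years" figure falls out of the bell curve itself. A quick sketch, assuming roughly 252 trading days a year (a detail the article doesn't specify):

```python
import math

# Probability, under a normal distribution, that a daily move lands at least
# five standard deviations from the mean, in either direction.
p = math.erfc(5 / math.sqrt(2))      # about 5.7e-7
days_between = 1 / p                 # about 1.7 million trading days
print(days_between / 252)            # about 6,900 years -- Fama's "once every seven thousand years"
```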
In the summer of 1997, Taleb predicted that hedge funds like Long Term Capital Management were headed for trouble, because they did not understand this notion of fat tails. Just a year later, L.T.C.M. sold an extraordinary number of options, because its computer models told it that the markets ought to be calming down. And what happened? The Russian government defaulted on its bonds; the markets went crazy; and in a matter of weeks L.T.C.M. was finished. Spitznagel, Taleb's head trader, says that he recently heard one of the former top executives of L.T.C.M. give a lecture in which he defended the gamble that the fund had made. "What he said was, Look, when I drive home every night in the fall I see all these leaves scattered around the base of the trees," Spitznagel recounts. "There is a statistical distribution that governs the way they fall, and I can be pretty accurate in figuring out what that distribution is going to be. But one day I came home and the leaves were in little piles. Does that falsify my theory that there are statistical rules governing how leaves fall? No. It was a man-made event." In other words, the Russians, by defaulting on their bonds, did something that they were not supposed to do, a once-in-a-lifetime, rule-breaking event. But this, to Taleb, is just the point: in the markets, unlike in the physical universe, the rules of the game can be changed. Central banks can decide to default on government-backed securities.
One of Taleb's earliest Wall Street mentors was a short-tempered Frenchman named Jean-Patrice, who dressed like a peacock and had an almost neurotic obsession with risk. Jean-Patrice would call Taleb from Regine's at three in the morning, or take a meeting in a Paris nightclub, sipping champagne and surrounded by scantily clad women, and once Jean-Patrice asked Taleb what would happen to his positions if a plane crashed into his building. Taleb was young then and brushed him aside. It seemed absurd. But nothing, Taleb soon realized, is absurd. Taleb likes to quote David Hume: "No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion." Because L.T.C.M. had never seen a black swan in Russia, it thought no Russian black swans existed. Taleb, by contrast, has constructed a trading philosophy predicated entirely on the existence of black swans -- on the possibility of some random, unexpected event sweeping the markets. He never sells options, then. He only buys them. He's never the one who can lose a great deal of money if G.M. stock suddenly plunges. Nor does he ever bet on the market moving in one direction or another. That would require Taleb to assume that he understands the market, and he doesn't. He hasn't Warren Buffett's confidence. So he buys options on both sides, on the possibility of the market moving both up and down. And he doesn't bet on minor fluctuations in the market. Why bother? If everyone else is vastly underestimating the possibility of rare events, then an option on G.M. at, say, forty dollars is going to be undervalued. So Taleb buys out-of-the-money options by the truckload. He buys them for hundreds of different stocks, and if they expire before he gets to use them he simply buys more. Taleb doesn't even invest in stocks, not for Empirica and not for his own personal account. Buying a stock, unlike buying an option, is a gamble that the future will represent an improved version of the past. And who knows whether that will be true? So all of Taleb's personal wealth, and the hundreds of millions that Empirica has in reserve, is in Treasury bills. Few on Wall Street have taken the practice of buying options to such extremes. But if anything completely out of the ordinary happens to the stock market, if some random event sends a jolt through all of Wall Street and pushes G.M. to, say, twenty dollars, Nassim Taleb will not end up in a dowdy apartment in Athens. He will be rich.
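In payoff terms, the posture described here looks like what an options trader would call a long strangle -- cheap out-of-the-money puts and calls bought on both sides -- though the article never uses the term, and the numbers below are invented for illustration:

```python
put_strike, call_strike, premium = 40.0, 60.0, 0.10   # hypothetical strikes and per-option cost

def buyer_pnl(gm_price_at_expiry: float) -> float:
    # Taleb's side of the trade: pay the premiums, collect only if the stock
    # makes a violent move in either direction.
    put_payoff = max(put_strike - gm_price_at_expiry, 0.0)
    call_payoff = max(gm_price_at_expiry - call_strike, 0.0)
    return put_payoff + call_payoff - 2 * premium

print(buyer_pnl(50.0))   # -0.20 -- the usual day: both options expire worthless
print(buyer_pnl(20.0))   # 19.80 -- the jolt the whole strategy is waiting for
```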
Not long ago, Taleb went to a dinner in a French restaurant just north of Wall Street. The people at the dinner were all quants: men with bulging pockets and open-collared shirts and the serene and slightly detached air of those who daydream in numbers. Taleb sat at the end of the table, drinking pastis and discussing French literature. There was a chess grand master at the table, with a shock of white hair, who had once been one of Anatoly Karpov's teachers, and another man who over the course of his career had worked, in order, at Stanford University, Exxon, Los Alamos National Laboratory, Morgan Stanley, and a boutique French investment bank. They talked about mathematics and chess and fretted about one of their party who had not yet arrived and who had the reputation, as one of the quants worriedly said, of "not being able to find the bathroom." When the check came, it was given to a man who worked in risk management at a big Wall Street bank, and he stared at it for a long time, with a slight mixture of perplexity and amusement, as if he could not remember what it was like to deal with a mathematical problem of such banality. The men at the table were in a business that was formally about mathematics but was really about epistemology, because to sell or to buy an option requires each party to confront the question of what it is he truly knows. Taleb buys options because he is certain that, at root, he knows nothing, or, more precisely, that other people believe they know more than they do. But there were plenty of people around that table who sold options, who thought that if you were smart enough to set the price of the option properly you could win so many of those one-dollar bets on General Motors that, even if the stock ever did dip below forty-five dollars, you'd still come out far ahead. They believe that the world is a place where, at the end of the day, leaves fall more or less in a predictable pattern.
The distinction between these two sides is the divide that emerged between Taleb and Niederhoffer all those years ago in Connecticut. Niederhoffer's hero is the nineteenth-century scientist Francis Galton. Niederhoffer called his eldest daughter Galt, and there is a full-length portrait of Galton in his library. Galton was a statistician and a social scientist (and a geneticist and a meteorologist), and if he was your hero you believed that by marshalling empirical evidence, by aggregating data points, you could learn whatever it was you needed to know. Taleb's hero, on the other hand, is Karl Popper, who said that you could not know with any certainty that a proposition was true; you could only know that it was not true. Taleb makes much of what he learned from Niederhoffer, but Niederhoffer insists that his example was wasted on Taleb. "In one of his cases, Rumpole of the Bailey talked about being tried by the bishop who doesn't believe in God," Niederhoffer says. "Nassim is the empiricist who doesn't believe in empiricism." What is it that you claim to learn from experience, if you believe that experience cannot be trusted? Today, Niederhoffer makes a lot of his money selling options, and more often than not the person who he sells those options to is Nassim Taleb. If one of them is up a dollar one day, in other words, that dollar is likely to have come from the other. The teacher and pupil have become predator and prey.
3.
Years ago, Nassim Taleb worked at the investment bank First Boston, and one of the things that puzzled him was what he saw as the mindless industry of the trading floor. A trader was supposed to come in every morning and buy and sell things, and on the basis of how much money he made buying and selling he was given a bonus. If he went too many weeks without showing a profit, his peers would start to look at him funny, and if he went too many months without showing a profit he would be gone. The traders were often well educated, and wore Savile Row suits and Ferragamo ties. They dove into the markets with a frantic urgency. They read the Wall Street Journal closely and gathered around the television to catch breaking news. "The Fed did this, the Prime Minister of Spain did that," Taleb recalls. "The Italian Finance Minister says there will be no competitive devaluation, this number is higher than expected, Abby Cohen just said this." It was a scene that Taleb did not understand.
"He was always so conceptual about what he was doing," says Howard Savery, who was Taleb?s assistant at the French bank Indosuez in the nineteen-eighties. "He used to drive our floor trader (his name was Tim) crazy. Floor traders are used to precision: "Sell a hundred futures at eighty-seven." Nassim would pick up the phone and say, "Tim, sell some." And Tim would say, "How many?" And he would say, "Oh, a social amount." It was like saying, "I don't have a number in mind, I just know I want to sell." There would be these heated arguments in French, screaming arguments. Then everyone would go out to dinner and have fun. Nassim and his group had this attitude that we're not interested in knowing what the new trade number is. When everyone else was leaning over their desks, listening closely to the latest figures, Nassim would make a big scene of walking out of the room."
At Empirica, then, there are no Wall Street Journals to be found. There is very little active trading, because the options that the fund owns are selected by computer. Most of those options will be useful only if the market does something dramatic, and, of course, on most days the market doesn't. So the job of Taleb and his team is to wait and to think. They analyze the company's trading policies, back-test various strategies, and construct ever-more sophisticated computer models of options pricing. Danny, in the corner, occasionally types things into the computer. Pallop looks dreamily off into the distance. Spitznagel takes calls from traders, and toggles back and forth between screens on his computer. Taleb answers e-mails and calls one of the firm's brokers in Chicago, affecting, as he does, the kind of Brooklyn accent that people from Brooklyn would have if they were actually from northern Lebanon: "Howyoudoin?" It is closer to a classroom than to a trading floor.
"Pallop, did you introspect?" Taleb calls out as he wanders back in from lunch. Pallop is asked what his Ph.D. is about. "Pretty much this," he says, waving a languid hand around the room.
"It looks like we will have to write it for him," Taleb chimes in, "because Pollop is very lazy."
What Empirica has done is to invert the traditional psychology of investing. You and I, if we invest conventionally in the market, have a fairly large chance of making a small amount of money in a given day from dividends or interest or the general upward trend of the market. We have almost no chance of making a large amount of money in one day, and there is a very small, but real, possibility that if the market collapses we could blow up. We accept that distribution of risks because, for fundamental reasons, it feels right. In the book that Pallop was reading by Kahneman and Tversky, for example, there is a description of a simple experiment, where a group of people were told to imagine that they had three hundred dollars. They were then given a choice between (a) receiving another hundred dollars or (b) tossing a coin, where if they won they got two hundred dollars and if they lost they got nothing. Most of us, it turns out, prefer (a) to (b). But then Kahneman and Tversky did a second experiment. They told people to imagine that they had five hundred dollars, and then asked them if they would rather (c) give up a hundred dollars or (d) toss a coin and pay two hundred dollars if they lost and nothing at all if they won. Most of us now prefer (d) to (c). What is interesting about those four choices is that, from a probabilistic standpoint, they are identical. They all yield an expected outcome of four hundred dollars. Nonetheless, we have strong preferences among them. Why? Because we're more willing to gamble when it comes to losses, but are risk averse when it comes to our gains. That's why we like small daily winnings in the stock market, even if that requires that we risk losing everything in a crash.
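The claim that the four choices are equivalent is easy to verify; the sketch below simply computes the expected final amount for each one, using the dollar figures from the experiment.

    # Expected final amount for each of the four Kahneman-Tversky choices described above.
    choices = {
        "(a) start with $300, receive $100": 300 + 100,
        "(b) start with $300, coin flip for $200 or nothing": 300 + 0.5 * 200,
        "(c) start with $500, give up $100": 500 - 100,
        "(d) start with $500, coin flip to lose $200 or nothing": 500 - 0.5 * 200,
    }
    for label, expected_value in choices.items():
        print(label, "->", expected_value)  # each one works out to 400

What differs among them is not the expected outcome but the spread of outcomes, and whether the gamble is framed as a gain or as a loss.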
At Empirica, by contrast, every day brings a small but real possibility that they'll make a huge amount of money in a day; no chance that they'll blow up; and a very large possibility that they'll lose a small amount of money. All those dollar, and fifty-cent, and nickel options that Empirica has accumulated, few of which will ever be used, soon begin to add up. By looking at a particular column on the computer screens showing Empirica's positions, anyone at the firm can tell you precisely how much money Empirica has lost or made so far that day. At 11:30 A.M., for instance, they had recovered just twenty-eight per cent of the money they had spent that day on options. By 12:30, they had recovered forty per cent, meaning that the day was not yet half over and Empirica was already in the red to the tune of several hundred thousand dollars. The day before that, it had made back eighty-five per cent of its money; the day before that, forty-eight per cent; the day before that, sixty-five per cent; and the day before that also sixty-five per cent; and, in fact -- with a few notable exceptions, like the few days when the market reopened after September 11th -- Empirica has done nothing but lose money since last April. "We cannot blow up, we can only bleed to death," Taleb says, and bleeding to death, absorbing the pain of steady losses, is precisely what human beings are hardwired to avoid. "Say you've got a guy who is long on Russian bonds," Savery says. "He's making money every day. One day, lightning strikes and he loses five times what he made. Still, on three hundred and sixty-four out of three hundred and sixty-five days he was very happily making money. It's much harder to be the other guy, the guy losing money three hundred and sixty-four days out of three hundred and sixty-five, because you start questioning yourself. Am I ever going to make it back? Am I really right? What if it takes ten years? Will I even be sane ten years from now?" What the normal trader gets from his daily winnings is feedback, the pleasing illusion of progress. At Empirica, there is no feedback. "It's like you're playing the piano for ten years and you still can't play chopsticks," Spitznagel says, "and the only thing you have to keep you going is the belief that one day you'll wake up and play like Rachmaninoff." Was it easy knowing that Niederhoffer -- who represented everything they thought was wrong -- was out there getting rich while they were bleeding away? Of course it wasn't. If you watched Taleb closely that day, you could see the little ways in which the steady drip of losses takes a toll. He glanced a bit too much at the Bloomberg. He leaned forward a bit too often to see the daily loss count. He succumbs to an array of superstitious tics. If the going is good, he parks in the same space every day; he turned against Mahler because he associates Mahler with the last year's long dry spell. "Nassim says all the time that he needs me there, and I believe him," Spitznagel says. He is there to remind Taleb that there is a point to waiting, to help Taleb resist the very human impulse to abandon everything and stanch the pain of losing. "Mark is my cop," Taleb says. So is Pallop: he is there to remind Taleb that Empirica has the intellectual edge.
"The key is not having the ideas but having the recipe to deal with your ideas," Taleb says. "We don't need moralizing. We need a set of tricks." His trick is a protocol that stipulates precisely what has to be done in every situation. "We built the protocol, and the reason we did was to tell the guys, Don't listen to me, listen to the protocol. Now, I have the right to change the protocol, but there is a protocol to changing the protocol. We have to be hard on ourselves to do what we do. The bias we see in Niederhoffer we see in ourselves." At the quant dinner, Taleb devoured his roll, and as the busboy came around with more rolls Taleb shouted out "No, no!" and blocked his plate. It was a never-ending struggle, this battle between head and heart. When the waiter came around with wine, he hastily covered the glass with his hand. When the time came to order, he asked for steak frites -- without the frites, please! -- and then immediately tried to hedge his choice by negotiating with the person next to him for a fraction of his frites.
The psychologist Walter Mischel has done a series of experiments where he puts a young child in a room and places two cookies in front of him, one small and one large. The child is told that if he wants the small cookie he need only ring a bell and the experimenter will come back into the room and give it to him. If he wants the better treat, though, he has to wait until the experimenter returns on his own, which might be anytime in the next twenty minutes. Mischel has videotapes of six-year-olds, sitting in the room by themselves, staring at the cookies, trying to persuade themselves to wait. One girl starts to sing to herself. She whispers what seems to be the instructions -- that she can have the big cookie if she can only wait. She closes her eyes. Then she turns her back on the cookies. Another little boy swings his legs violently back and forth, and then picks up the bell and examines it, trying to do anything but think about the cookie he could get by ringing it. The tapes document the beginnings of discipline and self-control -- the techniques we learn to keep our impulses in check -- and to watch all the children desperately distracting themselves is to experience the shock of recognition: that's Nassim Taleb!
There is something else as well that helps to explain Taleb's resolve -- more than the tics and the systems and the self-denying ordinances. It happened a year or so before he went to see Niederhoffer. Taleb had been working as a trader at the Chicago Mercantile Exchange, and developed a persistently hoarse throat. At first, he thought nothing of it: a hoarse throat was an occupational hazard of spending every day in the pit. Finally, when he moved back to New York, he went to see a doctor, in one of those Upper East Side prewar buildings with a glamorous façade. Taleb sat in the office, staring out at the plain brick of the courtyard, reading the medical diplomas on the wall over and over, waiting and waiting for the verdict. The doctor returned and spoke in a low, grave voice: "I got the pathology report. It's not as bad as it sounds . . ." But, of course, it was: he had throat cancer. Taleb's mind shut down. He left the office. It was raining outside. He walked and walked and ended up at a medical library. There he read frantically about his disease, the rainwater forming a puddle under his feet. It made no sense. Throat cancer was the disease of someone who has spent a lifetime smoking heavily. But Taleb was young, and he barely smoked at all. His risk of getting throat cancer was something like one in a hundred thousand, almost unimaginably small. He was a black swan! The cancer is now beaten, but the memory of it is also Taleb's secret, because once you have been a black swan -- not just seen one, but lived and faced death as one -- it becomes easier to imagine another on the horizon.
As the day came to an end, Taleb and his team turned their attention once again to the problem of the square root of n. Taleb was back at the whiteboard. Spitznagel was looking on. Pallop was idly peeling a banana. Outside, the sun was beginning to settle behind the trees. "You do a conversion to p1 and p2," Taleb said. His marker was once again squeaking across the whiteboard. "We say we have a Gaussian distribution, and you have the market switching from a low-volume regime to a high-volume. P21. P22. You have your eigenvalue." He frowned and stared at his handiwork. The markets were now closed. Empirica had lost money, which meant that somewhere off in the woods of Connecticut Niederhoffer had no doubt made money. That hurt, but if you steeled yourself, and thought about the problem at hand, and kept in mind that someday the market would do something utterly unexpected because in the world we live in something utterly unexpected always happens, then the hurt was not so bad. Taleb eyed his equations on the whiteboard, and arched an eyebrow. It was a very difficult problem. "Where is Dr. Wu? Should we call in Dr. Wu?"
4.
A year after Nassim Taleb came to visit him, Victor Niederhoffer blew up. He sold a very large number of options on the S. & P. index, taking millions of dollars from other traders in exchange for promising to buy a basket of stocks from them at current prices, if the market ever fell. It was an unhedged bet, or what was called on Wall Street a "naked put," meaning that he bet everything on one outcome: he bet in favor of the large probability of making a small amount of money, and against the small probability of losing a large amount of money -- and he lost. On October 27, 1997, the market plummeted eight per cent, and all of the many, many people who had bought those options from Niederhoffer came calling all at once, demanding that he buy back their stocks at pre-crash prices. He ran through a hundred and thirty million dollars -- his cash reserves, his savings, his other stocks -- and when his broker came and asked for still more he didn't have it. In a day, one of the most successful hedge funds in America was wiped out. Niederhoffer had to shut down his firm. He had to mortgage his house. He had to borrow money from his children. He had to call Sotheby's and sell his prized silver collection -- the massive nineteenth-century Brazilian "sculptural group of victory" made for the Visconde De Figueirdeo, the massive silver bowl designed in 1887 by Tiffany & Company for the James Gordon Bennett Cup yacht race, and on and on. He stayed away from the auction. He couldn't bear to watch.
"It was one of the worst things that has ever happened to me in my life, right up there with the death of those closest to me," Niederhoffer said recently. It was a Saturday in March, and he was in the library of his enormous house. Two weary-looking dogs wandered in and out. He is a tall man, an athlete, thick through the upper body and trunk, with a long, imposing face and baleful, hooded eyes. He was shoeless. One collar on his shirt was twisted inward, and he looked away as he talked. "I let down my friends. I lost my business. I was a major money manager. Now I pretty much have had to start from ground zero." He paused. "Five years have passed. The beaver builds a dam. The river washes it away, so he tries to build a better foundation, and I think I have. But I'm always mindful of the possibility of more failures." In the distance, there was a knock on the door. It was a man named Milton Bond, an artist who had come to present Niederhoffer with a painting he had done of Moby Dick ramming the Pequod. It was in the folk-art style that Niederhoffer likes so much, and he went to meet Bond in the foyer, kneeling down in front of the painting as Bond unwrapped it. Niederhoffer has other paintings of the Pequod in his house, and paintings of the Essex, the ship on which Melville's story was based. In his office, on a prominent wall, is a painting of the Titanic. They were, he said, his way of staying humble. "One of the reasons I've paid lots of attention to the Essex is that it turns out that the captain of the Essex, as soon as he got back to Nantucket, was given another job," Niederhoffer said. "They thought he did a good job in getting back after the ship was rammed. The captain was asked, `How could people give you another ship?' And he said, `I guess on the theory that lightning doesn't strike twice.' It was a fairly random thing. But then he was given the other ship, and that one foundered, too. Got stuck in the ice. At that time, he was a lost man. He wouldn't even let them save him. They had to forcibly remove him from the ship. He spent the rest of his life as a janitor in Nantucket. He became what on Wall Street they call a ghost." Niederhoffer was back in his study now, his lanky body stretched out, his feet up on the table, his eyes a little rheumy. "You see? I can't afford to fail a second time. Then I'll be a total washout. That's the significance of the Pequod."
A month or so before he blew up, Taleb had dinner with Niederhoffer at a restaurant in Westport, and Niederhoffer told him that he had been selling naked puts. You can imagine the two of them across the table from each other, Niederhoffer explaining that his bet was an acceptable risk, that the odds of the market going down so heavily that he would be wiped out were minuscule, and Taleb listening and shaking his head, and thinking about black swans. "I was depressed when I left him," Taleb said. "Here is a guy who goes out and hits a thousand backhands. He plays chess like his life depends on it. Here is a guy who, whatever he wakes up in the morning and decides to do, he does better than anyone else. I was talking to my hero . . ." This was the reason Taleb didn't want to be Niederhoffer when Niederhoffer was at his height -- the reason he didn't want the silver and the house and the tennis matches with George Soros. He could see all too clearly where it all might end up. In his mind's eye, he could envision Niederhoffer borrowing money from his children, and selling off his silver, and talking in a hollow voice about letting down his friends, and Taleb did not know if he had the strength to live with that possibility. Unlike Niederhoffer, Taleb never thought he was invincible. You couldn't if you had watched your homeland blow up, and had been the one person in a hundred thousand who gets throat cancer, and so for Taleb there was never any alternative to the painful process of insuring himself against catastrophe.
This kind of caution does not seem heroic, of course. It seems like the joyless prudence of the accountant and the Sunday-school teacher. The truth is that we are drawn to the Niederhoffers of this world because we are all, at heart, like Niederhoffer: we associate the willingness to risk great failure -- and the ability to climb back from catastrophe -- with courage. But in this we are wrong. That is the lesson of Taleb and Niederhoffer, and also the lesson of our volatile times. There is more courage and heroism in defying the human impulse, in taking the purposeful and painful steps to prepare for the unimaginable.
Last fall, Niederhoffer sold a large number of options, betting that the markets would be quiet, and they were, until out of nowhere two planes crashed into the World Trade Center. "I was exposed. It was nip and tuck." Niederhoffer shook his head, because there was no way to have anticipated September 11th. "That was a totally unexpected event."
Personality Plus
September 20, 2004
ANNALS OF PSYCHOLOGY
Employers love personality tests.
But what do they really reveal?
1.
When Alexander (Sandy) Nininger was twenty-three, and newly commissioned as a lieutenant in the United States Army, he was sent to the South Pacific to serve with the 57th Infantry of the Philippine Scouts. It was January, 1942. The Japanese had just seized Philippine ports at Vigan, Legazpi, Lamon Bay, and Lingayen, and forced the American and Philippine forces to retreat into Bataan, a rugged peninsula on the South China Sea. There, besieged and outnumbered, the Americans set to work building a defensive line, digging foxholes and constructing dikes and clearing underbrush to provide unobstructed sight lines for rifles and machine guns. Nininger's men were on the line's right flank. They labored day and night. The heat and the mosquitoes were nearly unbearable.
Quiet by nature, Nininger was tall and slender, with wavy blond hair. As Franklin M. Reck recounts in "Beyond the Call of Duty," Nininger had graduated near the top of his class at West Point, where he chaired the lecture-and-entertainment committee. He had spent many hours with a friend, discussing everything from history to the theory of relativity. He loved the theatre. In the evenings, he could often be found sitting by the fireplace in the living room of his commanding officer, sipping tea and listening to Tchaikovsky. As a boy, he once saw his father kill a hawk and had been repulsed. When he went into active service, he wrote a friend to say that he had no feelings of hate, and did not think he could ever kill anyone out of hatred. He had none of the swagger of the natural warrior. He worked hard and had a strong sense of duty.
In the second week of January, the Japanese attacked, slipping hundreds of snipers through the American lines, climbing into trees, turning the battlefield into what Reck calls a "gigantic possum hunt." On the morning of January 12th, Nininger went to his commanding officer. He wanted, he said, to be assigned to another company, one that was in the thick of the action, so he could go hunting for Japanese snipers.
He took several grenades and ammunition belts, slung a Garand rifle over his shoulder, and grabbed a submachine gun. Starting at the point where the fighting was heaviest—near the position of the battalion's K Company—he crawled through the jungle and shot a Japanese soldier out of a tree. He shot and killed snipers. He threw grenades into enemy positions. He was wounded in the leg, but he kept going, clearing out Japanese positions for the other members of K Company, behind him. He soon ran out of grenades and switched to his rifle, and then, when he ran out of ammunition, used only his bayonet. He was wounded a second time, but when a medic crawled toward him to help bring him back behind the lines Nininger waved him off. He saw a Japanese bunker up ahead. As he leaped out of a shell hole, he was spun around by a bullet to the shoulder, but he kept charging at the bunker, where a Japanese officer and two enlisted men were dug in. He dispatched one soldier with a double thrust of his bayonet, clubbed down the other, and bayoneted the officer. Then, with outstretched arms, he collapsed face down. For his heroism, Nininger was posthumously awarded the Medal of Honor, the first American soldier so decorated in the Second World War.
2.
Suppose that you were a senior Army officer in the early days of the Second World War and were trying to put together a crack team of fearless and ferocious fighters. Sandy Nininger, it now appears, had exactly the right kind of personality for that assignment, but is there any way you could have known this beforehand? It clearly wouldn't have helped to ask Nininger if he was fearless and ferocious, because he didn't know that he was fearless and ferocious. Nor would it have worked to talk to people who spent time with him. His friend would have told you only that Nininger was quiet and thoughtful and loved the theatre, and his commanding officer would have talked about the evenings of tea and Tchaikovsky. With the exception, perhaps, of the Scarlet Pimpernel, a love of music, theatre, and long afternoons in front of a teapot is not a known predictor of great valor. What you need is some kind of sophisticated psychological instrument, capable of getting to the heart of his personality.
Over the course of the past century, psychology has been consumed with the search for this kind of magical instrument. Hermann Rorschach proposed that great meaning lay in the way that people described inkblots. The creators of the Minnesota Multiphasic Personality Inventory believed in the revelatory power of true-false items such as "I have never had any black, tarry-looking bowel movements" or "If the money were right, I would like to work for a circus or a carnival." Today, Annie Murphy Paul tells us in her fascinating new book, "Cult of Personality," that there are twenty-five hundred kinds of personality tests. Testing is a four-hundred-million-dollar-a-year industry. A hefty percentage of American corporations use personality tests as part of the hiring and promotion process. The tests figure in custody battles and in sentencing and parole decisions. "Yet despite their prevalence—and the importance of the matters they are called upon to decide—personality tests have received surprisingly little scrutiny," Paul writes. We can call in the psychologists. We can give Sandy Nininger a battery of tests. But will any of it help?
One of the most popular personality tests in the world is the Myers-Briggs Type Indicator (M.B.T.I.), a psychological-assessment system based on Carl Jung's notion that people make sense of the world through a series of psychological frames. Some people are extroverts, some are introverts. Some process information through logical thought. Some are directed by their feelings. Some make sense of the world through intuitive leaps. Others collect data through their senses. To these three categories—(I)ntroversion/(E)xtroversion, i(N)tuition/(S)ensing, (T)hinking/(F)eeling—the Myers-Briggs test adds a fourth: (J)udging/(P)erceiving. Judgers "like to live in a planned, orderly way, seeking to regulate and manage their lives," according to an M.B.T.I. guide, whereas Perceivers "like to live in a flexible, spontaneous way, seeking to experience and understand life, rather than control it." The M.B.T.I. asks the test-taker to answer a series of "forced-choice" questions, where one choice identifies you as belonging to one of these paired traits. The basic test takes twenty minutes, and at the end you are presented with a precise, multidimensional summary of your personality: your type might be INTJ or ESFP, or some other combination. Two and a half million Americans a year take the Myers-Briggs. Eighty-nine companies out of the Fortune 100 make use of it, for things like hiring or training sessions to help employees "understand" themselves or their colleagues. Annie Murphy Paul says that at the eminent consulting firm McKinsey, " 'associates' often know their colleagues' four-letter M.B.T.I. types by heart," the way they might know their own weight or (this being McKinsey) their S.A.T. scores.
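Mechanically, the scoring behind that four-letter type is simple: each forced-choice answer counts toward one pole of its pair, and whichever pole collects more answers supplies that letter. The sketch below is a schematic of that idea only; the actual M.B.T.I. items and scoring key are proprietary, and the answer tallies shown are invented.

    # Schematic of how four forced-choice dimensions yield a four-letter type.
    # The letter pairs come from the article; the answer counts are made up.
    pairs = [("E", "I"), ("S", "N"), ("T", "F"), ("J", "P")]

    def four_letter_type(tallies):
        # tallies: one (first-pole count, second-pole count) pair per dimension
        return "".join(first if a >= b else second
                       for (first, second), (a, b) in zip(pairs, tallies))

    print(four_letter_type([(3, 9), (2, 10), (8, 4), (11, 1)]))  # prints "INTJ"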
It is tempting to think, then, that we could figure out the Myers-Briggs type that corresponds best to commando work, and then test to see whether Sandy Nininger fits the profile. Unfortunately, the notion of personality type is not nearly as straightforward as it appears. For example, the Myers-Briggs poses a series of items grouped around the issue of whether you—the test-taker—are someone who likes to plan your day or evening beforehand or someone who prefers to be spontaneous. The idea is obviously to determine whether you belong to the Judger or Perceiver camp, but the basic question here is surprisingly hard to answer. I think I'm someone who likes to be spontaneous. On the other hand, I have embarked on too many spontaneous evenings that ended up with my friends and me standing on the sidewalk, looking at each other and wondering what to do next. So I guess I'm a spontaneous person who recognizes that life usually goes more smoothly if I plan first, or, rather, I'm a person who prefers to be spontaneous only if there's someone around me who isn't. Does that make me spontaneous or not? I'm not sure. I suppose it means that I'm somewhere in the middle.
This is the first problem with the Myers-Briggs. It assumes that we are either one thing or another—Intuitive or Sensing, Introverted or Extroverted. But personality doesn't fit into neat binary categories: we fall somewhere along a continuum.
Here's another question: Would you rather work under a boss (or a teacher) who is good-natured but often inconsistent, or sharp-tongued but always logical?
On the Myers-Briggs, this is one of a series of questions intended to establish whether you are a Thinker or a Feeler. But I'm not sure I know how to answer this one, either. I once had a good-natured boss whose inconsistency bothered me, because he exerted a great deal of day-to-day control over my work. Then I had a boss who was quite consistent and very sharp-tongued—but at that point I was in a job where day-to-day dealings with my boss were minimal, so his sharp tongue didn't matter that much. So what do I want in a boss? As far as I can tell, the only plausible answer is: It depends. The Myers-Briggs assumes that who we are is consistent from one situation to another. But surely what we want in a boss, and how we behave toward our boss, is affected by what kind of job we have.
This is the gist of the now famous critique that the psychologist Walter Mischel has made of personality testing. One of Mischel's studies involved watching children interact with one another at a summer camp. Aggressiveness was among the traits that he was interested in, so he watched the children in five different situations: how they behaved when approached by a peer, when teased by a peer, when praised by an adult, when punished by an adult, and when warned by an adult. He found that how aggressively a child responded in one of those situations wasn't a good predictor of how that same child responded in another situation. Just because a boy was aggressive in the face of being teased by another boy didn't mean that he would be aggressive in the face of being warned by an adult. On the other hand, if a child responded aggressively to being teased by a peer one day, it was a pretty good indicator that he'd respond aggressively to being teased by a peer the next day. We have a personality in the sense that we have a consistent pattern of behavior. But that pattern is complex and that personality is contingent: it represents an interaction between our internal disposition and tendencies and the situations that we find ourselves in.
It's not surprising, then, that the Myers-Briggs has a large problem with consistency: according to some studies, more than half of those who take the test a second time end up with a different score than when they took it the first time. Since personality is continuous, not dichotomous, clearly some people who are borderline Introverts or Feelers one week slide over to Extroversion or Thinking the next week. And since personality is contingent, not stable, how we answer is affected by which circumstances are foremost in our minds when we take the test. If I happen to remember my first boss, then I come out as a Thinker. If my mind is on my second boss, I come out as a Feeler. When I took the Myers-Briggs, I scored as an INTJ. But, if the odds are that I'm going to be something else if I take the test again, what good is it?
Once, for fun, a friend and I devised our own personality test. Like the M.B.T.I., it has four dimensions. The first is Canine/Feline. In romantic relationships, are you the pursuer, who runs happily to the door, tail wagging? Or are you the pursued? The second is More/Different. Is it your intellectual style to gather and master as much information as you can or to make imaginative use of a discrete amount of information? The third is Insider/Outsider. Do you get along with your parents or do you define yourself outside your relationship with your mother and father? And, finally, there is Nibbler/Gobbler. Do you work steadily, in small increments, or do everything at once, in a big gulp? I'm quite pleased with the personality inventory we devised. It directly touches on four aspects of life and temperament—romance, cognition, family, and work style—that are only hinted at by Myers-Briggs. And it can be completed in under a minute, nineteen minutes faster than Myers-Briggs, an advantage not to be dismissed in today's fast-paced business environment. Of course, the four traits it measures are utterly arbitrary, based on what my friend and I came up with over the course of a phone call. But then again, surely all universal dichotomous typing systems are arbitrary.
Where did the Myers-Briggs come from, after all? As Paul tells us, it began with a housewife from Washington, D.C., named Katharine Briggs, at the turn of the last century. Briggs had a daughter, Isabel, an only child for whom (as one relative put it) she did "everything but breathe." When Isabel was still in her teens, Katharine wrote a book-length manuscript about her daughter's remarkable childhood, calling her a "genius" and "a little Shakespeare." When Isabel went off to Swarthmore College, in 1915, the two exchanged letters nearly every day. Then, one day, Isabel brought home her college boyfriend and announced that they were to be married. His name was Clarence (Chief) Myers. He was tall and handsome and studying to be a lawyer, and he could not have been more different from the Briggs women. Katharine and Isabel were bold and imaginative and intuitive. Myers was practical and logical and detail-oriented. Katharine could not understand her future son-in-law. "When the blissful young couple returned to Swarthmore," Paul writes, "Katharine retreated to her study, intent on 'figuring out Chief.'" She began to read widely in psychology and philosophy. Then, in 1923, she came across the first English translation of Carl Jung's "Psychological Types." "This is it!" Katharine told her daughter. Paul recounts, "In a dramatic display of conviction she burned all her own research and adopted Jung's book as her 'Bible,' as she gushed in a letter to the man himself. His system explained it all: Lyman [Katharine's husband], Katharine, Isabel, and Chief were introverts; the two men were thinkers, while the women were feelers; and of course the Briggses were intuitives, while Chief was a senser." Encouraged by her mother, Isabel—who was living in Swarthmore and writing mystery novels—devised a paper-and-pencil test to help people identify which of the Jungian categories they belonged to, and then spent the rest of her life tirelessly and brilliantly promoting her creation.
The problem, as Paul points out, is that Myers and her mother did not actually understand Jung at all. Jung didn't believe that types were easily identifiable, and he didn't believe that people could be permanently slotted into one category or another. "Every individual is an exception to the rule," he wrote; to "stick labels on people at first sight," in his view, was "nothing but a childish parlor game." Why is a parlor game based on my desire to entertain my friends any less valid than a parlor game based on Katharine Briggs's obsession with her son-in-law?
3.
The problems with the Myers-Briggs suggest that we need a test that is responsive to the complexity and variability of the human personality. And that is why, not long ago, I found myself in the office of a psychologist from New Jersey named Lon Gieser. He is among the country's leading experts on what is called the Thematic Apperception Test (T.A.T.), an assessment tool developed in the nineteen-thirties by Henry Murray, one of the most influential psychologists of the twentieth century.
I sat in a chair facing Gieser, as if I were his patient. He had in his hand two dozen or so pictures—mostly black-and-white drawings—on legal-sized cards, all of which had been chosen by Murray years before. "These pictures present a series of scenes," Gieser said to me. "What I want you to do with each scene is tell a story with a beginning, a middle, and an end." He handed me the first card. It was of a young boy looking at a violin. I had imagined, as Gieser was describing the test to me, that it would be hard to come up with stories to match the pictures. As I quickly discovered, though, the exercise was relatively effortless: the stories just tumbled out.
"This is a young boy," I began. "His parents want him to take up the violin, and they've been encouraging him. I think he is uncertain whether he wants to be a violin player, and maybe even resents the imposition of having to play this instrument, which doesn't seem to have any appeal for him. He's not excited or thrilled about this. He'd rather be somewhere else. He's just sitting there looking at it, and dreading having to fulfill this parental obligation."
I continued in that vein for a few more minutes. Gieser gave me another card, this one of a muscular man clinging to a rope and looking off into the distance. "He's climbing up, not climbing down," I said, and went on:
It's out in public. It's some kind of big square, in Europe, and there is some kind of spectacle going on. It's the seventeenth or eighteenth century. The King is coming by in a carriage, and this man is shimmying up, so he can see over everyone else and get a better view of the King. I don't get the sense that he's any kind of highborn person. I think he aspires to be more than he is. And he's kind of getting a glimpse of the King as a way of giving himself a sense of what he could be, or what his own future could be like.
We went on like this for the better part of an hour, as I responded to twelve cards—each of people in various kinds of ambiguous situations. One picture showed a woman slumped on the ground, with some small object next to her; another showed an attractive couple in a kind of angry embrace, apparently having an argument. (I said that the fight they were having was staged, that each was simply playing a role.) As I talked, Gieser took notes. Later, he called me and gave me his impressions. "What came out was the way you deal with emotion," he said. "Even when you recognized the emotion, you distanced yourself from it. The underlying motive is this desire to avoid conflict. The other thing is that when there are opportunities to go to someone else and work stuff out, your character is always going off alone. There is a real avoidance of emotion and dealing with other people, and everyone goes to their own corners and works things out on their own."
How could Gieser make such a confident reading of my personality after listening to me for such a short time? I was baffled by this, at first, because I felt that I had told a series of random and idiosyncratic stories. When I listened to the tape I had made of the session, though, I saw what Gieser had picked up on: my stories were exceedingly repetitive in just the way that he had identified. The final card that Gieser gave me was blank, and he asked me to imagine my own picture and tell a story about it. For some reason, what came to mind was Andrew Wyeth's famous painting "Christina's World," of a woman alone in a field, her hair being blown by the wind. She was from the city, I said, and had come home to see her family in the country: "I think she is taking a walk. She is pondering some piece of important news. She has gone off from the rest of the people to think about it." Only later did I realize that in the actual painting the woman is not strolling through the field. She is crawling, desperately, on her hands and knees. How obvious could my aversion to strong emotion be?
The T.A.T. has a number of cards that are used to assess achievement—that is, how interested someone is in getting ahead and succeeding in life. One is the card of the man on the rope; another is the boy looking at his violin. Gieser, in listening to my stories, concluded that I was very low in achievement:
Some people say this kid is dreaming about being a great violinist, and he's going to make it. With you, it wasn't what he wanted to do at all. His parents were making him do it. With the rope climbing, some people do this Tarzan thing. They climb the pole and get to the top and feel this great achievement. You have him going up the rope—and why is he feeling the pleasure? Because he's seeing the King. He's still a nobody in the public square, looking at the King.
Now, this is a little strange. I consider myself quite ambitious. On a questionnaire, if you asked me to rank how important getting ahead and being successful was to me, I'd check the "very important" box. But Gieser is suggesting that the T.A.T. allowed him to glimpse another dimension of my personality.
This idea—that our personality can hold contradictory elements—is at the heart of "Strangers to Ourselves," by the social psychologist Timothy D. Wilson. He is one of the discipline's most prominent researchers, and his book is what popular psychology ought to be (and rarely is): thoughtful, beautifully written, and full of unexpected insights. Wilson's interest is in what he calls the "adaptive unconscious" (not to be confused with the Freudian unconscious). The adaptive unconscious, in Wilson's description, is a big computer in our brain which sits below the surface and evaluates, filters, and looks for patterns in the mountain of data that come in through our senses. That system, Wilson argues, has a personality: it has a set of patterns and responses and tendencies that are laid down by our genes and our early-childhood experiences. These patterns are stable and hard to change, and we are only dimly aware of them. On top of that, in his schema we have another personality: it's the conscious identity that we create for ourselves with the choices we make, the stories we tell about ourselves, and the formal reasons we come up with to explain our motives and feelings. Yet this "constructed self" has no particular connection with the personality of our adaptive unconscious. In fact, they could easily be at odds. Wilson writes:
The adaptive unconscious is more likely to influence people's uncontrolled, implicit responses, whereas the constructed self is more likely to influence people's deliberative, explicit responses. For example, the quick, spontaneous decision of whether to argue with a co-worker is likely to be under the control of one's nonconscious needs for power and affiliation. A more thoughtful decision about whether to invite a co-worker over for dinner is more likely to be under the control of one's conscious, self-attributed motives.
When Gieser said that he thought I was low in achievement, then, he presumably saw in my stories an unconscious ambivalence toward success. The T.A.T., he believes, allowed him to go beyond the way I viewed myself and arrive at a reading with greater depth and nuance.
Even if he's right, though, does this help us pick commandos? I'm not so sure. Clearly, underneath Sandy Nininger's peaceful façade there was another Nininger capable of great bravery and ferocity, and a T.A.T. of Nininger might have given us a glimpse of that part of who he was. But let's not forget that he volunteered for the front lines: he made a conscious decision to put himself in the heat of the action. What we really need is an understanding of how those two sides of his personality interact in critical situations. When is Sandy Nininger's commitment to peacefulness more, or less, important than some unconscious ferocity? The other problem with the T.A.T., of course, is that it's a subjective instrument. You could say that my story about the man climbing the rope is evidence that I'm low in achievement or you could say that it shows a strong desire for social mobility. The climber wants to look down—not up—at the King in order to get a sense "of what he could be." You could say that my interpretation that the couple's fighting was staged was evidence of my aversion to strong emotion. Or you could say that it was evidence of my delight in deception and role-playing. This isn't to question Gieser's skill or experience as a diagnostician. The T.A.T. is supposed to do no more than identify themes and problem areas, and I'm sure Gieser would be happy to put me on the couch for a year to explore those themes and see which of his initial hypotheses had any validity. But the reason employers want a magical instrument for measuring personality is that they don't have a year to work through the ambiguities. They need an answer now.
4.
A larger limitation of both Myers-Briggs and the T.A.T. is that they are indirect. Tests of this kind require us first to identify a personality trait that corresponds to the behavior we're interested in, and then to figure out how to measure that trait—but by then we're two steps removed from what we're after. And each of those steps represents an opportunity for error and distortion. Shouldn't we try, instead, to test directly for the behavior we're interested in? This is the idea that lies behind what's known as the Assessment Center, and the leading practitioner of this approach is a company called Development Dimensions International, or D.D.I.
Companies trying to evaluate job applicants send them to D.D.I.'s headquarters, outside Pittsburgh, where they spend the day role-playing as business executives. When I contacted D.D.I., I was told that I was going to be Terry Turner, the head of the robotics division of a company called Global Solutions.
I arrived early in the morning, and was led to an office. On the desk was a computer, a phone, and a tape recorder. In the corner of the room was a video camera, and on my desk was an agenda for the day. I had a long telephone conversation with a business partner from France. There were labor difficulties at an overseas plant. A new product—a robot for the home—had run into a series of technical glitches. I answered e-mails. I prepared and recorded a talk for a product-launch meeting. I gave a live interview to a local television reporter. In the afternoon, I met with another senior Global Solutions manager, and presented a strategic plan for the future of the robotics division. It was a long, demanding day at the office, and when I left, a team of D.D.I. specialists combed through copies of my e-mails, the audiotapes of my phone calls and my speech, and the videotapes of my interviews, and analyzed me across four dimensions: interpersonal skills, leadership skills, business-management skills, and personal attributes. A few weeks later, I was given my report. Some of it was positive: I was a quick learner. I had good ideas. I expressed myself well, and—I was relieved to hear—wrote clearly. But, as the assessment of my performance made plain, I was something less than top management material:
Although you did a remarkable job addressing matters, you tended to handle issues from a fairly lofty perch, pitching good ideas somewhat unilaterally while lobbing supporting rationale down to the team below. . . . Had you brought your team closer to decisions by vesting them with greater accountability, responsibility and decision-making authority, they would have undoubtedly felt more engaged, satisfied and valued. . . . In a somewhat similar vein, but on a slightly more interpersonal level, while you seemed to recognize the value of collaboration and building positive working relationships with people, you tended to take a purely businesslike approach to forging partnerships. You spoke of win/win solutions from a business perspective and your rationale for partnering and collaboration seemed to be based solely on business logic. Additionally, at times you did not respond to some of the softer, subtler cues that spoke to people's real frustrations, more personal feelings, or true point of view.
Ouch! Of course, when the D.D.I. analysts said that I did not respond to "some of the softer, subtler cues that spoke to people's real frustrations, more personal feelings, or true point of view," they didn't mean that I was an insensitive person. They meant that I was insensitive in the role of manager. The T.A.T. and M.B.T.I. aimed to make global assessments of the different aspects of my personality. My day as Terry Turner was meant to find out only what I'm like when I'm the head of the robotics division of Global Solutions. That's an important difference. It respects the role of situation and contingency in personality. It sidesteps the difficulty of integrating my unconscious self with my constructed self by looking at the way that my various selves interact in the real world. Most important, it offers the hope that with experience and attention I can construct a more appropriate executive "self." The Assessment Center is probably the best method that employers have for evaluating personality.
But could an Assessment Center help us identify the Sandy Niningers of the world? The center makes a behavioral prediction, and, as solid and specific as that prediction is, people are least predictable at those critical moments when prediction would be most valuable. The answer to the question of whether my Terry Turner would be a good executive is, once again: It depends. It depends on what kind of company Global Solutions is, and on what kind of respect my co-workers have for me, and on how quickly I manage to correct my shortcomings, and on all kinds of other things that cannot be anticipated. The quality of being a good manager is, in the end, as irreducible as the quality of being a good friend. We think that a friend has to be loyal and nice and interesting—and that's certainly a good start. But people whom we don't find loyal, nice, or interesting have friends, too, because loyalty, niceness, and interestingness are emergent traits. They arise out of the interaction of two people, and all we really mean when we say that someone is interesting or nice is that they are interesting or nice to us.
All these difficulties do not mean that we should give up on the task of trying to understand and categorize one another. We could certainly send Sandy Nininger to an Assessment Center, and find out whether, in a make-believe battle, he plays the role of commando with verve and discipline. We could talk to his friends and discover his love of music and theatre. We could find out how he responded to the picture of the man on a rope. We could sit him down and have him do the Myers-Briggs and dutifully note that he is an Introverted, Intuitive, Thinking Judger, and, for good measure, take an extra minute to run him through my own favorite personality inventory and type him as a Canine, Different, Insider Gobbler. We will know all kinds of things about him then. His personnel file will be as thick as a phone book, and we can consult our findings whenever we make decisions about his future. We just have to acknowledge that his file will tell us little about the thing we're most interested in. For that, we have to join him in the jungles of Bataan.
Getting Over It
November 8, 2004
ANNALS OF PSYCHOLOGY
The Man in the Gray Flannel Suit put
the war behind him. Why can't we?
1.
When Tom Rath, the hero of Sloan Wilson's 1955 novel "The Man in the Gray Flannel Suit," comes home to Connecticut each day from his job in Manhattan, his wife mixes him a Martini. If he misses the train, he'll duck into the bar at Grand Central Terminal and have a highball, or perhaps a Scotch. On Sunday mornings, Rath and his wife lie around drinking Martinis. Once, Rath takes a tumbler of Martinis to bed, and after finishing it drifts off to sleep. Then his wife wakes him up in the middle of the night, wanting to talk. "I will if you get me a drink," he says. She comes back with a glass half full of ice and gin. "On Greentree Avenue cocktail parties started at seven-thirty, when the men came home from New York, and they usually continued without any dinner until three or four o'clock in the morning," Wilson writes of the tidy neighborhood in Westport where Rath and countless other young, middle-class families live. "Somewhere around nine-thirty in the evening, Martinis and Manhattans would give way to highballs, but the formality of eating anything but hors d'oeuvres in-between had been entirely omitted."
"The Man in the Gray Flannel Suit" is about a public-relations specialist who lives in the suburbs, works for a media company in midtown, and worries about money, job security, and educating his children. It was an enormous best-seller. Gregory Peck played Tom Rath in the Hollywood version, and today, on the eve of the fiftieth anniversary of the book's publication, many of the themes the novel addresses seem strikingly contemporary. But in other ways "The Man in the Gray Flannel Suit" is utterly dated. The details are all wrong. Tom Rath, despite an introspective streak, is supposed to be a figure of middle-class normalcy. But by our standards he and almost everyone else in the novel look like alcoholics. The book is supposed to be an argument for the importance of family over career. But Rath's three children—the objects of his sacrifice—are so absent from the narrative and from Rath's consciousness that these days he'd be called an absentee father.
The most discordant note, though, is struck by the account of Rath's experience in the Second World War. He had, it becomes clear, a terrible war. As a paratrooper in Europe, he and his close friend Hank Mahoney find themselves trapped—starving and freezing—behind enemy lines, and end up killing two German sentries in order to take their sheepskin coats. But Rath doesn't quite kill one of them, and Mahoney urges him to finish the job:
Tom had knelt beside the sentry. He had not thought it would be difficult, but the tendons of the boy's neck had proved tough, and suddenly the sentry had started to sit up. In a rage Tom had plunged the knife repeatedly into his throat, ramming it home with all his strength until he had almost severed the head from the body.
At the end of the war, Rath and Mahoney are transferred to the Pacific theatre for the invasion of the island of Karkow. There Rath throws a hand grenade and inadvertently kills his friend. He crawls over to Hank's body, calling out his name. "Tom had put his hand under Mahoney's arm and turned him over," Wilson writes. "Mahoney's entire chest had been torn away, leaving the naked lungs and splintered ribs exposed."
Rath picks up the body and runs back toward his own men, dodging enemy fire. Coming upon a group of Japanese firing from a cave, he props the body up, crawls within fifteen feet of the machine gun, tosses in two grenades, and then finishes off the lone survivor with a knife. He takes Hank's body into a bombed-out pillbox and tries to resuscitate his friend's corpse. The medics tell him that Hank has been dead for hours. He won't listen. In a daze, he runs with the body toward the sea.
Wilson's description of Mahoney's death is as brutal and moving a description of the madness of combat as can be found in postwar fiction. But what happens to Rath as a result of that day in Karkow? Not much. It does not destroy him, or leave him permanently traumatized. The part of Rath's war experience that leaves him truly guilt-ridden is the adulterous affair that he has with a woman named Maria while waiting for redeployment orders in Rome. In the elevator of his midtown office, he runs into a friend who knew Maria, and learns that he fathered a son. He obsessively goes over and over the affair in his mind, trying to square his feeling toward Maria with his love for his wife, and his marriage is fully restored only when he confesses to the existence of his Italian child. Killing his best friend, by contrast, is something that comes up and then gets tucked away. As Rath sat on the beach, and Mahoney's body was finally taken away, Wilson writes:
A major, coming to squat beside him, said, "Some of these goddamn sailors got heads. They went ashore and got Jap heads, and they tried to boil them in the galley to get the skulls for souvenirs."
Tom had shrugged and said nothing. The fact that he had been too quick to throw a hand grenade and had killed Mahoney, the fact that some young sailors had wanted skulls for souvenirs, and the fact that a few hundred men had lost their lives to take the island of Karkow—all these facts were simply incomprehensible and had to be forgotten. That, he had decided, was the final truth of the war, and he had greeted it with relief, greeted it eagerly, the simple fact that it was incomprehensible and had to be forgotten. Things just happen, he had decided; they happen and they happen again, and anybody who tries to make sense out of it goes out of his mind.
You couldn't write that scene today, at least not without irony. No soldier, according to our contemporary understanding, could ever shrug off an experience like that. Today, it is Rath's affair with Maria that would be rationalized and explained away. He was a soldier, after all, in the midst of war. Who knew if he would ever see his wife again? Tim O'Brien's best-selling 1994 novel "In the Lake of the Woods" has a narrative structure almost identical to that of "The Man in the Gray Flannel Suit." O'Brien's hero, John Wade, is present at a massacre of civilians in the Vietnamese village of Thuan Yen. He kills a fellow-soldier—a man he loved like a brother. And, just like Rath, Wade sits down at the end of the long afternoon of the worst day of his war and tries to wish the memory away:
And then later still, snagged in the sunlight, he gave himself over to forgetfulness. "Go away," he murmured. He waited a moment, then said it again, firmly, much louder, and the little village began to vanish inside its own rosy glow. Here, he reasoned, was the most majestic trick of all. In the months and years ahead, John Wade would remember Thuan Yen the way chemical nightmares are remembered, impossible combinations, impossible events, and over time the impossibility itself would become the richest and deepest and most profound memory.
This could not have happened. Therefore it did not.
Already he felt better.
But John Wade cannot forget. That's the point of O'Brien's book. "The Man in the Gray Flannel Suit" ends with Tom Rath stronger, and his marriage renewed. Wade falls apart, and when he returns home to the woman he left behind he wakes up screaming in his sleep. By the end of the novel, the past has come back and destroyed Wade, and one reason for the book's power is the inevitability of that disaster. This is the difference between a novel written in the middle of the last century and a novel written at the end of the century. Somehow in the intervening decades our understanding of what it means to experience a traumatic event has changed. We believe in John Wade now, not Tom Rath, and half a century after the publication of "The Man in the Gray Flannel Suit" it's worth wondering whether we've got it right.
2.
Several years ago, three psychologists—Bruce Rind, Robert Bauserman, and Philip Tromovitch—published an article on childhood sexual abuse in Psychological Bulletin, one of academic psychology's most prestigious journals. It was what psychologists call a meta-analysis. The three researchers collected fifty-nine studies that had been conducted over the years on the long-term psychological effects of childhood sexual abuse (C.S.A.), and combined the data, in order to get the most definitive and statistically powerful result possible.
What most studies of sexual abuse show is that if you gauge the psychological health of young adults—typically college students—using various measures of mental health (alcohol problems, depression, anxiety, eating disorders, obsessive-compulsive symptoms, social adjustment, sleeping problems, suicidal thoughts and behavior, and so on), those with a history of childhood sexual abuse will have more problems across the board than those who weren't abused. That makes intuitive sense. But Rind and his colleagues wanted to answer that question more specifically: how much worse off were the sexually abused? The fifty-nine studies were run through a series of sophisticated statistical tests. Studies from different times and places were put on the same scale. The results were surprising. The difference between the psychological health of those who had been abused and those who hadn't, they found, was marginal. It was two-tenths of a standard deviation. "That's like the difference between someone with an I.Q. of 100 and someone with an I.Q. of 97," Rind says. "Ninety-seven is statistically different from 100. But it's a trivial difference."
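(A side note on the arithmetic behind that comparison, assuming the conventional IQ scale, on which scores have a mean of 100 and a standard deviation of 15 points: an effect of two-tenths of a standard deviation works out to roughly three IQ points.)

\[
d = 0.2, \qquad \sigma_{\mathrm{IQ}} = 15, \qquad \Delta = d \times \sigma_{\mathrm{IQ}} = 0.2 \times 15 = 3 \ \text{points}, \qquad 100 - 3 = 97.
\]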
Then Rind and his colleagues went one step further. A significant percentage of people who were sexually abused as children grew up in families with a host of other problems, like violence, neglect, and verbal abuse. So, to the extent that the sexually abused were damaged, what caused the damage—the sexual abuse, or the violence and neglect that so often accompanied the abuse? The data suggested that it was the latter, and, if you account for such factors, that two-tenths of a standard deviation shrinks even more. "The real gap is probably smaller than 100 and 97," Rind says. "It might be 98, or maybe it's 99." The studies analyzed by Rind and his colleagues show that some victims of sexual abuse don't even regard themselves, in retrospect, as victims. Among the male college students surveyed, for instance, Rind and his colleagues found that "37 percent viewed their C.S.A. experiences as positive at the time they occurred," while forty-two per cent viewed them as positive when reflecting back on them.
The Rind article was published in the summer of 1998, and almost immediately it was denounced by conservative groups and lambasted in the media. Laura Schlessinger—a popular radio talk-show host known as Dr. Laura—called it "junk science." In Washington, Representative Matt Salmon called it "the Emancipation Proclamation for pedophiles," while Representative Tom DeLay accused it of "normalizing pedophilia." They held a press conference at which they demanded that the American Psychological Association censure the paper. In July of 1999, a year after its publication, both the House and the Senate overwhelmingly passed resolutions condemning the analysis. Few articles in the history of academic psychology have created such a stir.
But why? It's not as if the authors said that C.S.A. was a good thing. They just suggested that it didn't cause as many problems as we'd thought—and the question of whether C.S.A. is morally wrong doesn't hinge on its long-term consequences. Nor did the study say that sexual abuse was harmless. On average, the researchers concluded, the long-term damage is small. But that average is made up of cases where the damage is hard to find (like C.S.A. involving adolescent boys) and cases where the damage is quite significant (like father-daughter incest). Rind was trying to help psychologists focus on what was truly harmful. And, when it came to the effects of things like physical abuse and neglect, he and his colleagues sounded the alarm. "What happens in physical abuse is that it doesn't happen once," Rind says. "It happens time and time again. And, when it comes to neglect, the research shows that is the most noxious factor of all—worse than physical abuse. Why? Because it's not practiced for one week. It's a persistent thing. It's a permanent feature of the parent-child relationship. These are the kinds of things that cause problems in adulthood."
All Rind and his colleagues were saying was that sexual abuse is often something that people can eventually get over, and one of the reasons that the Rind study was so unacceptable is that we no longer think that traumatic experiences are things we can get over. We believe that the child who is molested by an uncle or a priest, on two or three furtive occasions, has to be permanently scarred by the experience—just as the soldier who accidentally kills his best friend must do more than sit down on the beach and decide that sometimes things just "happen."
In a recent history of the Rind controversy, the psychologist Scott Lilienfeld pointed out that when we find out that something we thought was very dangerous actually isn't that dangerous after all we usually regard what we've learned as good news. To him, the controversy was a paradox, and he is quite right. This attachment we have to John Wade over Tom Rath is not merely a preference for one kind of war narrative over another. It is a shift in perception so profound that the United States Congress could be presented with evidence of the unexpected strength and resilience of the human spirit and reject it without a single dissenting vote.
3.
In "The Man in the Gray Flannel Suit," Tom Rath works for Ralph Hopkins, who is the president of the United Broadcasting Company. Hopkins has decided that he wants to play a civic role in the issue of mental health, and Rath's job is to write his speeches and handle public relations connected to the project. "It all started when a group of doctors called on me a few months ago," Hopkins tells Rath, when he hires him for the job. "They apparently felt that there is too little public understanding of the whole question of mental illness, and that a campaign like the fight against cancer or polio is needed." Again and again, in the novel, the topic of mental health surfaces. Rath's father, we learn, suffered a nervous breakdown after serving in the trenches of the First World War, and died in what may well have been a suicide. His grandmother, whose death sets the book's plot in motion, wanders in and out of lucidity at the end of her life. Hopkins, in a hilarious scene, recalls his unsatisfactory experience with a psychiatrist. To Wilson's readers, this preoccupation would not have seemed out of place. In 1955, the population of New York State's twenty-seven psychiatric hospitals was nearly ninety-four thousand. (Today, largely because of anti-psychotic drugs, it is less than six thousand.) It was impossible to drive any distance from Manhattan and not be confronted with ominous, hulking reminders of psychiatric distress: the enormous complex across the Triborough Bridge, on Wards Island; Sagamore and Pilgrim Hospitals, on Long Island; Creedmoor, in Queens. Mental health mattered to the reader of the nineteen-fifties, in a way that, say, aids mattered in the novels of the late nineteen-eighties.
But Wilson draws a very clear line between the struggles of the Raths and the plight of those suffering from actual mental illness. At one point, for example, Rath's wife, Betsy, wonders why nothing is fun anymore:
It probably would take a psychiatrist to answer that. Maybe Tom and I both ought to visit one, she thought. What's the matter? the psychiatrist would say, and I would reply, I don't know—nothing seems to be much fun any more. All of a sudden the music stopped, and it didn't start again. Is that strange, or does it happen to everyone about the time when youth starts to go?
The psychiatrist would have an explanation, Betsy thought, but I don't want to hear it. People rely too much on explanations these days, and not enough on courage and action. . . . Tom has a good job, and he'll get his enthusiasm back, be a success at it. Everything's going to be fine. It does no good to wallow in night thoughts. In God we trust, and that's that.
This is not denial, much as it may sound like it. Betsy Rath is not saying that her husband doesn't have problems. She's just saying that, in all likelihood, Tom will get over his problems. This is precisely the idea that lies at the heart of the Rind meta-analysis. Once you've separated out the small number of seriously damaged people—the victims of father-daughter incest, or of prolonged neglect and physical abuse—the balance of C.S.A. survivors are pretty much going to be fine. The same is true, it turns out, of other kinds of trauma. The Columbia University psychologist George Bonanno, for instance, followed a large number of men and women who had recently lost a spouse. "In the bereavement area, the assumption has been that when people lose a loved one there is a kind of unitary process that everybody must go through," Bonanno says. "That process has been called grief work. The grief must be processed. It must be examined. It must be fully understood, then finished. It was the same kind of assumption that dominated the trauma world. The idea was that everybody exposed to these kinds of events will have to go through the same kind of process if they are to recover. And if you don't do this, if you have somehow inhibited or buried the experience, the assumption was that you would pay in the long run."
Instead, Bonanno found a wide range of responses. Some people went through a long and painful grieving process; others a period of debilitating depression. But by far the most common response was resilience: the majority of those who had just suffered from one of the most painful experiences of their lives never lapsed into serious depression, experienced a relatively brief period of grief symptoms, and soon returned to normal functioning. These people were not necessarily the hardiest or the healthiest. They just managed, by one means or another, to muddle through.
"Most people just plain cope well," Bonanno says. "The vast majority of people get over traumatic events, and get over them remarkably well. Only a small subset—five to fifteen per cent—struggle in a way that says they need help."
What these patterns of resilience suggest is that human beings are naturally endowed with a kind of psychological immune system, which keeps us in balance and overcomes wild swings to either end of the emotional spectrum. Most of us aren't resilient just in the wake of bad experiences, after all. We're also resilient in the wake of wonderful experiences; the joy of a really good meal, or winning a tennis match, or getting praised by a boss doesn't last that long, either. "One function of emotions is to signal to people quickly which things in their environments are dangerous and should be avoided and which are positive and should be approached," Timothy Wilson, a psychologist at the University of Virginia, has said. "People have very fast emotional reactions to events that serve as signals, informing them what to do. A problem with prolonged emotional reactions to past events is that it might be more difficult for these signals to get through. If people are still in a state of bliss over yesterday's success, today's dangers and hazards might be more difficult to recognize." (Wilson, incidentally, is Sloan Wilson's nephew.)
Wilson and his longtime collaborator, Daniel T. Gilbert, argue that a distinctive feature of this resilience is that people don't realize that they possess it. People are bad at forecasting their emotions—at appreciating how well, under most circumstances, they will recover. Not long ago, for instance, Gilbert, Wilson, and two other researchers—Carey Morewedge and Jane Risen—asked passengers at a subway station in Cambridge, Massachusetts, how much regret they thought they would feel if they arrived on the platform just as a train was pulling away. Then they approached passengers who really had arrived just as their train was leaving, and asked them how they felt. They found that the predictions of how bad it would feel to have just barely missed a train were on average greater than reports of how it actually felt to watch the train pull away. We suffer from what Wilson and Gilbert call an impact bias: we always assume that our emotional states will last much longer than they do. We forget that other experiences will compete for our attention and emotions. We forget that our psychological immune system will kick in and take away the sting of adversity. "When I talk about our research, I say to people, 'I'm not telling you that bad things don't hurt,'" Gilbert says. "Of course they do. It would be perverse to say that having a child or a spouse die is not a big deal. All I'm saying is that the reality doesn't meet the expectation."
This is the difference between our own era and the one of half a century ago—between "The Man in the Gray Flannel Suit" and "In the Lake of the Woods." Sloan Wilson's book came from a time and a culture that had the confidence and wisdom to understand this truth. "I love you more than I can tell," Rath says to his wife at the end of the novel. It's an ending that no one would write today, but only because we have become blind to the fact that the past—in all but the worst of cases—sooner or later fades away. Betsy turns back to her husband:
"I want you to be able to talk to me about the war. It might help us to understand each other. Did you really kill seventeen men?"
"Yes."
"Do you want to talk about it now?"
"No. It's not that I want to and can't—it's just that I'd rather think about the future. About getting a new car and driving up to Vermont with you tomorrow."
"That will be fun. It's not an insane world. At least, our part of it doesn't have to be."
Something Borrowed
November 22, 2004
ANNALS OF CULTURE
Should a charge of plagiarism ruin your life?
1.
One day this spring, a psychiatrist named Dorothy Lewis got a call from her friend Betty, who works in New York City. Betty had just seen a Broadway play called "Frozen," written by the British playwright Bryony Lavery. "She said, 'Somehow it reminded me of you. You really ought to see it,'" Lewis recalled. Lewis asked Betty what the play was about, and Betty said that one of the characters was a psychiatrist who studied serial killers. "And I told her, 'I need to see that as much as I need to go to the moon.'"
Lewis has studied serial killers for the past twenty-five years. With her collaborator, the neurologist Jonathan Pincus, she has published a great many research papers, showing that serial killers tend to suffer from predictable patterns of psychological, physical, and neurological dysfunction: that they were almost all the victims of harrowing physical and sexual abuse as children, and that almost all of them have suffered some kind of brain injury or mental illness. In 1998, she published a memoir of her life and work entitled "Guilty by Reason of Insanity." She was the last person to visit Ted Bundy before he went to the electric chair. Few people in the world have spent as much time thinking about serial killers as Dorothy Lewis, so when her friend Betty told her that she needed to see "Frozen" it struck her as a busman's holiday.
But the calls kept coming. "Frozen" was winning raves on Broadway, and it had been nominated for a Tony. Whenever someone who knew Dorothy Lewis saw it, they would tell her that she really ought to see it, too. In June, she got a call from a woman at the theatre where "Frozen" was playing. "She said she'd heard that I work in this field, and that I see murderers, and she was wondering if I would do a talk-back after the show," Lewis said. "I had done that once before, and it was a delight, so I said sure. And I said, would you please send me the script, because I wanted to read the play."
The script came, and Lewis sat down to read it. Early in the play, something caught her eye, a phrase: "it was one of those days." One of the murderers Lewis had written about in her book had used that same expression. But she thought it was just a coincidence. "Then, there's a scene of a woman on an airplane, typing away to her friend. Her name is Agnetha Gottmundsdottir. I read that she's writing to her colleague, a neurologist called David Nabkus. And with that I realized that more was going on, and I realized as well why all these people had been telling me to see the play."
Lewis began underlining line after line. She had worked at New York University School of Medicine. The psychiatrist in "Frozen" worked at New York School of Medicine. Lewis and Pincus did a study of brain injuries among fifteen death-row inmates. Gottmundsdottir and Nabkus did a study of brain injuries among fifteen death-row inmates. Once, while Lewis was examining the serial killer Joseph Franklin, he sniffed her, in a grotesque, sexual way. Gottmundsdottir is sniffed by the play's serial killer, Ralph. Once, while Lewis was examining Ted Bundy, she kissed him on the cheek. Gottmundsdottir, in some productions of "Frozen," kisses Ralph. "The whole thing was right there," Lewis went on. "I was sitting at home reading the play, and I realized that it was I. I felt robbed and violated in some peculiar way. It was as if someone had stolen—I don't believe in the soul, but, if there was such a thing, it was as if someone had stolen my essence."
Lewis never did the talk-back. She hired a lawyer. And she came down from New Haven to see "Frozen." "In my book," she said, "I talk about where I rush out of the house with my black carry-on, and I have two black pocketbooks, and the play opens with her"—Agnetha—"with one big black bag and a carry-on, rushing out to do a lecture." Lewis had written about biting her sister on the stomach as a child. Onstage, Agnetha fantasized out loud about attacking a stewardess on an airplane and "biting out her throat." After the play was over, the cast came onstage and took questions from the audience. "Somebody in the audience said, 'Where did Bryony Lavery get the idea for the psychiatrist?'" Lewis recounted. "And one of the cast members, the male lead, said, 'Oh, she said that she read it in an English medical magazine.'" Lewis is a tiny woman, with enormous, childlike eyes, and they were wide open now with the memory. "I wouldn't have cared if she did a play about a shrink who's interested in the frontal lobe and the limbic system. That's out there to do. I see things week after week on television, on 'Law & Order' or 'C.S.I.,' and I see that they are using material that Jonathan and I brought to light. And it's wonderful. That would have been acceptable. But she did more than that. She took things about my own life, and that is the part that made me feel violated."
At the request of her lawyer, Lewis sat down and made up a chart detailing what she felt were the questionable parts of Lavery's play. The chart was fifteen pages long. The first part was devoted to thematic similarities between "Frozen" and Lewis's book "Guilty by Reason of Insanity." The other, more damning section listed twelve instances of almost verbatim similarities—totalling perhaps six hundred and seventy-five words—between passages from "Frozen" and passages from a 1997 magazine profile of Lewis. The profile was called "Damaged." It appeared in the February 24, 1997, issue of The New Yorker. It was written by me.
2.
Words belong to the person who wrote them. There are few simpler ethical notions than this one, particularly as society directs more and more energy and resources toward the creation of intellectual property. In the past thirty years, copyright laws have been strengthened. Courts have become more willing to grant intellectual-property protections. Fighting piracy has become an obsession with Hollywood and the recording industry, and, in the worlds of academia and publishing, plagiarism has gone from being bad literary manners to something much closer to a crime. When, two years ago, Doris Kearns Goodwin was found to have lifted passages from several other historians, she was asked to resign from the board of the Pulitzer Prize committee. And why not? If she had robbed a bank, she would have been fired the next day.
I'd worked on "Damaged" through the fall of 1996. I would visit Dorothy Lewis in her office at Bellevue Hospital, and watch the videotapes of her interviews with serial killers. At one point, I met up with her in Missouri. Lewis was testifying at the trial of Joseph Franklin, who claims responsibility for shooting, among others, the civil-rights leader Vernon Jordan and the pornographer Larry Flynt. In the trial, a videotape was shown of an interview that Franklin once gave to a television station. He was asked whether he felt any remorse. I wrote:
"I can't say that I do," he said. He paused again, then added, "The only thing I'm sorry about is that it's not legal."
"What's not legal?"
Franklin answered as if he'd been asked the time of day: "Killing Jews."
That exchange, almost to the word, was reproduced in "Frozen."
Lewis, the article continued, didn't feel that Franklin was fully responsible for his actions. She viewed him as a victim of neurological dysfunction and childhood physical abuse. "The difference between a crime of evil and a crime of illness," I wrote, "is the difference between a sin and a symptom." That line was in "Frozen," too—not once but twice. I faxed Bryony Lavery a letter:
I am happy to be the source of inspiration for other writers, and had you asked for my permission to quote—even liberally—from my piece, I would have been delighted to oblige. But to lift material, without my approval, is theft.
Almost as soon as I'd sent the letter, though, I began to have second thoughts. The truth was that, although I said I'd been robbed, I didn't feel that way. Nor did I feel particularly angry. One of the first things I had said to a friend after hearing about the echoes of my article in "Frozen" was that this was the only way I was ever going to get to Broadway—and I was only half joking. On some level, I considered Lavery's borrowing to be a compliment. A savvier writer would have changed all those references to Lewis, and rewritten the quotes from me, so that their origin was no longer recognizable. But how would I have been better off if Lavery had disguised the source of her inspiration?
Dorothy Lewis, for her part, was understandably upset. She was considering a lawsuit. And, to increase her odds of success, she asked me to assign her the copyright to my article. I agreed, but then I changed my mind. Lewis had told me that she "wanted her life back." Yet in order to get her life back, it appeared, she first had to acquire it from me. That seemed a little strange.
Then I got a copy of the script for "Frozen." I found it breathtaking. I realize that this isn't supposed to be a relevant consideration. And yet it was: instead of feeling that my words had been taken from me, I felt that they had become part of some grander cause. In late September, the story broke. The Times, the Observer in England, and the Associated Press all ran stories about Lavery's alleged plagiarism, and the articles were picked up by newspapers around the world. Bryony Lavery had seen one of my articles, responded to what she read, and used it as she constructed a work of art. And now her reputation was in tatters. Something about that didn't seem right.
3.
In 1992, the Beastie Boys released a song called "Pass the Mic," which begins with a six-second sample taken from the 1976 composition "Choir," by the jazz flutist James Newton. The sample was an exercise in what is called multiphonics, where the flutist "overblows" into the instrument while simultaneously singing in a falsetto. In the case of "Choir," Newton played a C on the flute, then sang C, D-flat, C—and the distortion of the overblown C, combined with his vocalizing, created a surprisingly complex and haunting sound. In "Pass the Mic," the Beastie Boys repeated the Newton sample more than forty times. The effect was riveting.
In the world of music, copyrighted works fall into two categories—the recorded performance and the composition underlying that performance. If you write a rap song, and want to sample the chorus from Billy Joel's "Piano Man," you first have to get permission from the record label to use the "Piano Man" recording, and then get permission from Billy Joel (or whoever owns his music) to use the underlying composition. In the case of "Pass the Mic," the Beastie Boys got the first kind of permission—the rights to use the recording of "Choir"—but not the second. Newton sued, and he lost—and the reason he lost serves as a useful introduction to how to think about intellectual property.
At issue in the case wasn't the distinctiveness of Newton's performance. The Beastie Boys, everyone agreed, had properly licensed Newton's performance when they paid the copyright recording fee. And there was no question about whether they had copied the underlying music to the sample. At issue was simply whether the Beastie Boys were required to ask for that secondary permission: was the composition underneath those six seconds so distinctive and original that Newton could be said to own it? The court said that it wasn't.
The chief expert witness for the Beastie Boys in the "Choir" case was Lawrence Ferrara, who is a professor of music at New York University, and when I asked him to explain the court's ruling he walked over to the piano in the corner of his office and played those three notes: C, D-flat, C. "That's it!" he shouted. "There ain't nothing else! That's what was used. You know what this is? It's no more than a mordent, a turn. It's been done thousands upon thousands of times. No one can say they own that."
Ferrara then played the most famous four-note sequence in classical music, the opening of Beethoven's Fifth: G, G, G, E-flat. This was unmistakably Beethoven. But was it original? "That's a harder case," Ferrara said. "Actually, though, other composers wrote that. Beethoven himself wrote that in a piano sonata, and you can find figures like that in composers who predate Beethoven. It's one thing if you're talking about da-da-da dummm, da-da-da dummm—those notes, with those durations. But just the four pitches, G, G, G, E-flat? Nobody owns those."
Ferrara once served as an expert witness for Andrew Lloyd Webber, who was being sued by Ray Repp, a composer of Catholic folk music. Repp said that the opening few bars of Lloyd Webber's 1984 "Phantom Song," from "The Phantom of the Opera," bore an overwhelming resemblance to his composition "Till You," written six years earlier, in 1978. As Ferrara told the story, he sat down at the piano again and played the beginning of both songs, one after the other; sure enough, they sounded strikingly similar. "Here's Lloyd Webber," he said, calling out each note as he played it. "Here's Repp. Same sequence. The only difference is that Andrew writes a perfect fourth and Repp writes a sixth."
But Ferrara wasn't quite finished. "I said, let me have everything Andrew Lloyd Webber wrote prior to 1978—'Jesus Christ Superstar,' 'Joseph,' 'Evita.'" He combed through every score, and in "Joseph and the Amazing Technicolor Dreamcoat" he found what he was looking for. "It's the song 'Benjamin Calypso.'" Ferrara started playing it. It was immediately familiar. "It's the first phrase of 'Phantom Song.' It's even using the same notes. But wait—it gets better. Here's 'Close Every Door,' from a 1969 concert performance of 'Joseph.'" Ferrara is a dapper, animated man, with a thin, well-manicured mustache, and thinking about the Lloyd Webber case was almost enough to make him jump up and down. He began to play again. It was the second phrase of "Phantom." "The first half of 'Phantom' is in 'Benjamin Calypso.' The second half is in 'Close Every Door.' They are identical. On the button. In the case of the first theme, in fact, 'Benjamin Calypso' is closer to the first half of the theme at issue than the plaintiff's song. Lloyd Webber writes something in 1984, and he borrows from himself."
In the "Choir" case, the Beastie Boys' copying didn't amount to theft because it was too trivial. In the "Phantom" case, what Lloyd Webber was alleged to have copied didn't amount to theft because the material in question wasn't original to his accuser. Under copyright law, what matters is not that you copied someone else's work. What matters is what you copied, and how much you copied. Intellectual-property doctrine isn't a straightforward application of the ethical principle "Thou shalt not steal." At its core is the notion that there are certain situations where you can steal. The protections of copyright, for instance, are time-limited; once something passes into the public domain, anyone can copy it without restriction. Or suppose that you invented a cure for breast cancer in your basement lab. Any patent you received would protect your intellectual property for twenty years, but after that anyone could take your invention. You get an initial monopoly on your creation because we want to provide economic incentives for people to invent things like cancer drugs. But everyone gets to steal your breast-cancer cure—after a decent interval—because it is also in society's interest to let as many people as possible copy your invention; only then can others learn from it, and build on it, and come up with better and cheaper alternatives. This balance between the protecting and the limiting of intellectual property is, in fact, enshrined in the Constitution: "Congress shall have the power to promote the Progress of Science and useful Arts, by securing for limited"—note that specification, limited—"Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."
4.
So is it true that words belong to the person who wrote them, just as other kinds of property belong to their owners? Actually, no. As the Stanford law professor Lawrence Lessig argues in his new book "Free Culture":
In ordinary language, to call a copyright a "property" right is a bit misleading, for the property of copyright is an odd kind of property. . . . I understand what I am taking when I take the picnic table you put in your backyard. I am taking a thing, the picnic table, and after I take it, you don't have it. But what am I taking when I take the good idea you had to put a picnic table in the backyard—by, for example, going to Sears, buying a table, and putting it in my backyard? What is the thing that I am taking then?
The point is not just about the thingness of picnic tables versus ideas, though that is an important difference. The point instead is that in the ordinary case—indeed, in practically every case except for a narrow range of exceptions—ideas released to the world are free. I don't take anything from you when I copy the way you dress—though I might seem weird if I do it every day. . . . Instead, as Thomas Jefferson said (and this is especially true when I copy the way someone dresses), "He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me."
Lessig argues that, when it comes to drawing this line between private interests and public interests in intellectual property, the courts and Congress have, in recent years, swung much too far in the direction of private interests. He writes, for instance, about the fight by some developing countries to get access to inexpensive versions of Western drugs through what is called "parallel importation"—buying drugs from another developing country that has been licensed to produce patented medicines. The move would save countless lives. But it has been opposed by the United States not on the ground that it would cut into the profits of Western pharmaceutical companies (they don't sell that many patented drugs in developing countries anyway) but on the ground that it violates the sanctity of intellectual property. "We as a culture have lost this sense of balance," Lessig writes. "A certain property fundamentalism, having no connection to our tradition, now reigns in this culture."
Even what Lessig decries as intellectual-property extremism, however, acknowledges that intellectual property has its limits. The United States didn't say that developing countries could never get access to cheap versions of American drugs. It said only that they would have to wait until the patents on those drugs expired. The arguments that Lessig has with the hard-core proponents of intellectual property are almost all arguments about where and when the line should be drawn between the right to copy and the right to protection from copying, not whether a line should be drawn.
But plagiarism is different, and that's what's so strange about it. The ethical rules that govern when it's acceptable for one writer to copy another are even more extreme than the most extreme position of the intellectual-property crowd: when it comes to literature, we have somehow decided that copying is never acceptable. Not long ago, the Harvard law professor Laurence Tribe was accused of lifting material from the historian Henry Abraham for his 1985 book, "God Save This Honorable Court." What did the charge amount to? In an exposé that appeared in the conservative publication The Weekly Standard, Joseph Bottum produced a number of examples of close paraphrasing, but his smoking gun was this one borrowed sentence: "Taft publicly pronounced Pitney to be a 'weak member' of the Court to whom he could not assign cases." That's it. Nineteen words.
Not long after I learned about "Frozen," I went to see a friend of mine who works in the music industry. We sat in his living room on the Upper East Side, facing each other in easy chairs, as he worked his way through a mountain of CDs. He played "Angel," by the reggae singer Shaggy, and then "The Joker," by the Steve Miller Band, and told me to listen very carefully to the similarity in bass lines. He played Led Zeppelin's "Whole Lotta Love" and then Muddy Waters's "You Need Love," to show the extent to which Led Zeppelin had mined the blues for inspiration. He played "Twice My Age," by Shabba Ranks and Krystal, and then the saccharine seventies pop standard "Seasons in the Sun," until I could hear the echoes of the second song in the first. He played "Last Christmas," by Wham!, followed by Barry Manilow's "Can't Smile Without You" to explain why Manilow might have been startled when he first heard that song, and then "Joanna," by Kool and the Gang, because, in a different way, "Last Christmas" was an homage to Kool and the Gang as well. "That sound you hear in Nirvana," my friend said at one point, "that soft and then loud, kind of exploding thing, a lot of that was inspired by the Pixies. Yet Kurt Cobain"—Nirvana's lead singer and songwriter—"was such a genius that he managed to make it his own. And 'Smells Like Teen Spirit'?"—here he was referring to perhaps the best-known Nirvana song. "That's Boston's 'More Than a Feeling.'" He began to hum the riff of the Boston hit, and said, "The first time I heard 'Teen Spirit,' I said, 'That guitar lick is from "More Than a Feeling."' But it was different—it was urgent and brilliant and new."
He played another CD. It was Rod Stewart's "Do Ya Think I'm Sexy," a huge hit from the nineteen-seventies. The chorus has a distinctive, catchy hook—the kind of tune that millions of Americans probably hummed in the shower the year it came out. Then he put on "Taj Mahal," by the Brazilian artist Jorge Ben Jor, which was recorded several years before the Rod Stewart song. In his twenties, my friend was a d.j. at various downtown clubs, and at some point he'd become interested in world music. "I caught it back then," he said. A small, sly smile spread across his face. The opening bars of "Taj Mahal" were very South American, a world away from what we had just listened to. And then I heard it. It was so obvious and unambiguous that I laughed out loud; virtually note for note, it was the hook from "Do Ya Think I'm Sexy." It was possible that Rod Stewart had independently come up with that riff, because resemblance is not proof of influence. It was also possible that he'd been in Brazil, listened to some local music, and liked what he heard.
My friend had hundreds of these examples. We could have sat in his living room playing at musical genealogy for hours. Did the examples upset him? Of course not, because he knew enough about music to know that these patterns of influence—cribbing, tweaking, transforming—were at the very heart of the creative process. True, copying could go too far. There were times when one artist was simply replicating the work of another, and to let that pass inhibited true creativity. But it was equally dangerous to be overly vigilant in policing creative expression, because if Led Zeppelin hadn't been free to mine the blues for inspiration we wouldn't have got "Whole Lotta Love," and if Kurt Cobain couldn't listen to "More Than a Feeling" and pick out and transform the part he really liked we wouldn't have "Smells Like Teen Spirit"—and, in the evolution of rock, "Smells Like Teen Spirit" was a real step forward from "More Than a Feeling." A successful music executive has to understand the distinction between borrowing that is transformative and borrowing that is merely derivative, and that distinction, I realized, was what was missing from the discussion of Bryony Lavery's borrowings. Yes, she had copied my work. But no one was asking why she had copied it, or what she had copied, or whether her copying served some larger purpose.
5.
Bryony Lavery came to see me in early October. It was a beautiful Saturday afternoon, and we met at my apartment. She is in her fifties, with short tousled blond hair and pale-blue eyes, and was wearing jeans and a loose green shirt and clogs. There was something rugged and raw about her. In the Times the previous day, the theatre critic Ben Brantley had not been kind to her new play, "Last Easter." This was supposed to be her moment of triumph. "Frozen" had been nominated for a Tony. "Last Easter" had opened Off Broadway. And now? She sat down heavily at my kitchen table. "I've had the absolute gamut of emotions," she said, playing nervously with her hands as she spoke, as if she needed a cigarette. "I think when one's working, one works between absolute confidence and absolute doubt, and I got a huge dollop of each. I was terribly confident that I could write well after 'Frozen,' and then this opened a chasm of doubt." She looked up at me. "I'm terribly sorry," she said.
Lavery began to explain: "What happens when I write is that I find that I'm somehow zoning on a number of things. I find that I've cut things out of newspapers because the story or something in them is interesting to me, and seems to me to have a place onstage. Then it starts coagulating. It's like the soup starts thickening. And then a story, which is also a structure, starts emerging. I'd been reading thrillers like 'The Silence of the Lambs,' about fiendishly clever serial killers. I'd also seen a documentary of the victims of the Yorkshire killers, Myra Hindley and Ian Brady, who were called the Moors Murderers. They spirited away several children. It seemed to me that killing somehow wasn't fiendishly clever. It was the opposite of clever. It was as banal and stupid and destructive as it could be. There are these interviews with the survivors, and what struck me was that they appeared to be frozen in time. And one of them said, 'If that man was out now, I'm a forgiving man but I couldn't forgive him. I'd kill him.' That's in 'Frozen.' I was thinking about that. Then my mother went into hospital for a very simple operation, and the surgeon punctured her womb, and therefore her intestine, and she got peritonitis and died."
When Lavery started talking about her mother, she stopped, and had to collect herself. "She was seventy-four, and what occurred to me is that I utterly forgave him. I thought it was an honest mistake. I'm very sorry it happened to my mother, but it's an honest mistake." Lavery's feelings confused her, though, because she could think of people in her own life whom she had held grudges against for years, for the most trivial of reasons. "In a lot of ways, 'Frozen' was an attempt to understand the nature of forgiveness," she said.
Lavery settled, in the end, on a play with three characters. The first is a serial killer named Ralph, who kidnaps and murders a young girl. The second is the murdered girl's mother, Nancy. The third is a psychiatrist from New York, Agnetha, who goes to England to examine Ralph. In the course of the play, the three lives slowly intersect—and the characters gradually change and become "unfrozen" as they come to terms with the idea of forgiveness. For the character of Ralph, Lavery says that she drew on a book about a serial killer titled "The Murder of Childhood," by Ray Wyre and Tim Tate. For the character of Nancy, she drew on an article written in the Guardian by a woman named Marian Partington, whose sister had been murdered by the serial killers Frederick and Rosemary West. And, for the character of Agnetha, Lavery drew on a reprint of my article that she had read in a British publication. "I wanted a scientist who would understand," Lavery said—a scientist who could explain how it was possible to forgive a man who had killed your daughter, who could explain that a serial killing was not a crime of evil but a crime of illness. "I wanted it to be accurate," she added.
So why didn't she credit me and Lewis? How could she have been so meticulous about accuracy but not about attribution? Lavery didn't have an answer. "I thought it was O.K. to use it," she said with an embarrassed shrug. "It never occurred to me to ask you. I thought it was news."
She was aware of how hopelessly inadequate that sounded, and when she went on to say that my article had been in a big folder of source material that she had used in the writing of the play, and that the folder had got lost during the play's initial run, in Birmingham, she was aware of how inadequate that sounded, too.
But then Lavery began to talk about Marian Partington, her other important inspiration, and her story became more complicated. While she was writing "Frozen," Lavery said, she wrote to Partington to inform her of how much she was relying on Partington's experiences. And when "Frozen" opened in London she and Partington met and talked. In reading through articles on Lavery in the British press, I found this, from the Guardian two years ago, long before the accusations of plagiarism surfaced:
Lavery is aware of the debt she owes to Partington's writing and is eager to acknowledge it.
"I always mention it, because I am aware of the enormous debt that I owe to the generosity of Marian Partington's piece . . . . You have to be hugely careful when writing something like this, because it touches on people's shattered lives and you wouldn't want them to come across it unawares."
Lavery wasn't indifferent to other people's intellectual property, then; she was just indifferent to my intellectual property. That's because, in her eyes, what she took from me was different. It was, as she put it, "news." She copied my description of Dorothy Lewis's collaborator, Jonathan Pincus, conducting a neurological examination. She copied the description of the disruptive neurological effects of prolonged periods of high stress. She copied my transcription of the television interview with Franklin. She reproduced a quote that I had taken from a study of abused children, and she copied a quotation from Lewis on the nature of evil. She didn't copy my musings, or conclusions, or structure. She lifted sentences like "It is the function of the cortex—and, in particular, those parts of the cortex beneath the forehead, known as the frontal lobes—to modify the impulses that surge up from within the brain, to provide judgment, to organize behavior and decision-making, to learn and adhere to rules of everyday life." It is difficult to have pride of authorship in a sentence like that. My guess is that it's a reworked version of something I read in a textbook. Lavery knew that failing to credit Partington would have been wrong. Borrowing the personal story of a woman whose sister was murdered by a serial killer matters because that story has real emotional value to its owner. As Lavery put it, it touches on someone's shattered life. Are boilerplate descriptions of physiological functions in the same league?
It also matters how Lavery chose to use my words. Borrowing crosses the line when it is used for a derivative work. It's one thing if you're writing a history of the Kennedys, like Doris Kearns Goodwin, and borrow, without attribution, from another history of the Kennedys. But Lavery wasn't writing another profile of Dorothy Lewis. She was writing a play about something entirely new—about what would happen if a mother met the man who killed her daughter. And she used my descriptions of Lewis's work and the outline of Lewis's life as a building block in making that confrontation plausible. Isn't that the way creativity is supposed to work? Old words in the service of a new idea aren't the problem. What inhibits creativity is new words in the service of an old idea.
And this is the second problem with plagiarism. It is not merely extremist. It has also become disconnected from the broader question of what does and does not inhibit creativity. We accept the right of one writer to engage in a full-scale knockoff of another—think how many serial-killer novels have been cloned from "The Silence of the Lambs." Yet, when Kathy Acker incorporated parts of a Harold Robbins sex scene verbatim in a satiric novel, she was denounced as a plagiarist (and threatened with a lawsuit). When I worked at a newspaper, we were routinely dispatched to "match" a story from the Times: to do a new version of someone else's idea. But had we "matched" any of the Times' words—even the most banal of phrases—it could have been a firing offense. The ethics of plagiarism have turned into the narcissism of small differences: because journalism cannot own up to its heavily derivative nature, it must enforce originality on the level of the sentence.
Dorothy Lewis says that one of the things that hurt her most about "Frozen" was that Agnetha turns out to have had an affair with her collaborator, David Nabkus. Lewis feared that people would think she had had an affair with her collaborator, Jonathan Pincus. "That's slander," Lewis told me. "I'm recognizable in that. Enough people have called me and said, 'Dorothy, it's about you,' and if everything up to that point is true, then the affair becomes true in the mind. So that is another reason that I feel violated. If you are going to take the life of somebody, and make them absolutely identifiable, you don't create an affair, and you certainly don't have that as a climax of the play."
It is easy to understand how shocking it must have been for Lewis to sit in the audience and see her "character" admit to that indiscretion. But the truth is that Lavery has every right to create an affair for Agnetha, because Agnetha is not Dorothy Lewis. She is a fictional character, drawn from Lewis's life but endowed with a completely imaginary set of circumstances and actions. In real life, Lewis kissed Ted Bundy on the cheek, and in some versions of "Frozen" Agnetha kisses Ralph. But Lewis kissed Bundy only because he kissed her first, and there's a big difference between responding to a kiss from a killer and initiating one. When we first see Agnetha, she's rushing out of the house and thinking murderous thoughts on the airplane. Dorothy Lewis also charges out of her house and thinks murderous thoughts. But the dramatic function of that scene is to make us think, in that moment, that Agnetha is crazy. And the one inescapable fact about Lewis is that she is not crazy: she has helped get people to rethink their notions of criminality because of her unshakable command of herself and her work. Lewis is upset not just about how Lavery copied her life story, in other words, but about how Lavery changed her life story. She's not merely upset about plagiarism. She's upset about art—about the use of old words in the service of a new idea—and her feelings are perfectly understandable, because the alterations of art can be every bit as unsettling and hurtful as the thievery of plagiarism. It's just that art is not a breach of ethics.
When I read the original reviews of "Frozen," I noticed that time and again critics would use, without attribution, some version of the sentence "The difference between a crime of evil and a crime of illness is the difference between a sin and a symptom." That's my phrase, of course. I wrote it. Lavery borrowed it from me, and now the critics were borrowing it from her. The plagiarist was being plagiarized. In this case, there is no "art" defense: nothing new was being done with that line. And this was not "news." Yet do I really own "sins and symptoms"? There is a quote by Gandhi, it turns out, using the same two words, and I'm sure that if I were to plow through the body of English literature I would find the path littered with crimes of evil and crimes of illness. The central fact about the "Phantom" case is that Ray Repp, if he was borrowing from Andrew Lloyd Webber, certainly didn't realize it, and Andrew Lloyd Webber didn't realize that he was borrowing from himself. Creative property, Lessig reminds us, has many lives—the newspaper arrives at our door, it becomes part of the archive of human knowledge, then it wraps fish. And, by the time ideas pass into their third and fourth lives, we lose track of where they came from, and we lose control of where they are going. The final dishonesty of the plagiarism fundamentalists is to encourage us to pretend that these chains of influence and evolution do not exist, and that a writer's words have a virgin birth and an eternal life. I suppose that I could get upset about what happened to my words. I could also simply acknowledge that I had a good, long ride with that line—and let it go.
"It's been absolutely bloody, really, because it attacks my own notion of my character," Lavery said, sitting at my kitchen table. A bouquet of flowers she had brought were on the counter behind her. "It feels absolutely terrible. I've had to go through the pain for being careless. I'd like to repair what happened, and I don't know how to do that. I just didn't think I was doing the wrong thing . . . and then the article comes out in the New York Times and every continent in the world." There was a long silence. She was heartbroken. But, more than that, she was confused, because she didn't understand how six hundred and seventy-five rather ordinary words could bring the walls tumbling down. "It's been horrible and bloody." She began to cry. "I'm still composting what happened. It will be for a purpose . . . whatever that purpose is.
December 13, 2004
ANNALS OF TECHNOLOGY
Mammography, air power, and the limits of looking.
1.
At the beginning of the first Gulf War, the United States Air Force dispatched two squadrons of F-15E Strike Eagle fighter jets to find and destroy the Scud missiles that Iraq was firing at Israel. The rockets were being launched, mostly at night, from the backs of modified flatbed tractor-trailers, moving stealthily around a four-hundred-square-mile "Scud box" in the western desert. The plan was for the fighter jets to patrol the box from sunset to sunrise. When a Scud was launched, it would light up the night sky. An F-15E pilot would fly toward the launch point, follow the roads that crisscrossed the desert, and then locate the target using a state-of-the-art, $4.6-million device called a LANTIRN navigation and targeting pod, capable of taking a high-resolution infrared photograph of a four-and-a-half-mile swath below the plane. How hard could it be to pick up a hulking tractor-trailer in the middle of an empty desert?
Almost immediately, reports of Scud kills began to come back from the field. The Desert Storm commanders were elated. "I remember going out to Nellis Air Force Base after the war," Barry Watts, a former Air Force colonel, says. "They did a big static display, and they had all the Air Force jets that flew in Desert Storm, and they had little placards in front of them, with a box score, explaining what this plane did and that plane did in the war. And, when you added up how many Scud launchers they claimed each got, the total was about a hundred." Air Force officials were not guessing at the number of Scud launchers hit; as far as they were concerned, they knew. They had a four-million-dollar camera, which took a nearly perfect picture, and there are few cultural reflexes more deeply ingrained than the idea that a picture has the weight of truth. "That photography not only does not, but cannot lie, is a matter of belief, an article of faith," Charles Rosen and Henri Zerner have written. "We tend to trust the camera more than our own eyes." Thus was victory declared in the Scud hunt--until hostilities ended and the Air Force appointed a team to determine the effectiveness of the air campaigns in Desert Storm. The actual number of definite Scud kills, the team said, was zero.
The problem was that the pilots were operating at night, when depth perception is impaired. LANTIRN could see in the dark, but the camera worked only when it was pointed in the right place, and the right place wasn't obvious. Meanwhile, the pilot had only about five minutes to find his quarry, because after launch the Iraqis would immediately hide in one of the many culverts underneath the highway between Baghdad and Jordan, and the screen the pilot was using to scan all that desert measured just six inches by six inches. "It was like driving down an interstate looking through a soda straw," Major General Mike DeCuir, who flew numerous Scud-hunt missions throughout the war, recalled. Nor was it clear what a Scud launcher looked like on that screen. "We had an intelligence photo of one on the ground. But you had to imagine what it would look like on a black-and-white screen from twenty thousand feet up and five or more miles away," DeCuir went on. "With the resolution we had at the time, you could tell something was a big truck and that it had wheels, but at that altitude it was hard to tell much more than that." The postwar analysis indicated that a number of the targets the pilots had hit were actually decoys, constructed by the Iraqis from old trucks and spare missile parts. Others were tanker trucks transporting oil on the highway to Jordan. A tanker truck, after all, is a tractor-trailer hauling a long, shiny cylindrical object, and, from twenty thousand feet up at four hundred miles an hour on a six-by-six-inch screen, a long, shiny cylindrical object can look a lot like a missile. "It's a problem we've always had," Watts, who served on the team that did the Gulf War analysis, said. "It's night out. You think you've got something on the sensor. You roll out your weapons. Bombs go off. It's really hard to tell what you did."
You can build a high-tech camera, capable of taking pictures in the middle of the night, in other words, but the system works only if the camera is pointed in the right place, and even then the pictures are not self-explanatory. They need to be interpreted, and the human task of interpretation is often a bigger obstacle than the technical task of picture-taking. This was the lesson of the Scud hunt: pictures promise to clarify but often confuse. The Zapruder film intensified rather than dispelled the controversy surrounding John F. Kennedy's assassination. The videotape of the beating of Rodney King led to widespread uproar about police brutality; it also served as the basis for a jury's decision to acquit the officers charged with the assault. Perhaps nowhere have these issues been so apparent, however, as in the arena of mammography. Radiologists developed state-of-the-art X-ray cameras and used them to scan women's breasts for tumors, reasoning that, if you can take a nearly perfect picture, you can find and destroy tumors before they go on to do serious damage. Yet there remains a great deal of confusion about the benefits of mammography. Is it possible that we place too much faith in pictures?
2.
The head of breast imaging at Memorial Sloan-Kettering Cancer Center, in New York City, is a physician named David Dershaw, a youthful man in his fifties, who bears a striking resemblance to the actor Kevin Spacey. One morning not long ago, he sat down in his office at the back of the Sloan-Kettering Building and tried to explain how to read a mammogram.
Dershaw began by putting an X-ray on a light box behind his desk. "Cancer shows up as one of two patterns," he said. "You look for lumps and bumps, and you look for calcium. And, if you find it, you have to make a determination: is it acceptable, or is it a pattern that might be due to cancer?" He pointed at the X-ray. "This woman has cancer. She has these tiny little calcifications. Can you see them? Can you see how small they are?" He took out a magnifying glass and placed it over a series of white flecks; as a cancer grows, it produces calcium deposits. "That's the stuff we are looking for," he said.
Then Dershaw added a series of slides to the light box and began to explain all the varieties that those white flecks came in. Some calcium deposits are oval and lucent. "They're called eggshell calcifications," Dershaw said. "And they're basically benign." Another kind of calcium runs like a railway track on either side of the breast's many blood vessels--that's benign, too. "Then there's calcium that's thick and heavy and looks like popcorn," Dershaw went on. "That's just dead tissue. That's benign. There's another calcification that's little sacs of calcium floating in liquid. It's called 'milk of calcium.' That's another kind of calcium that's always benign." He put a new set of slides against the light. "Then we have calcium that looks like this--irregular. All of these are of different density and different sizes and different configurations. Those are usually benign, but sometimes they are due to cancer. Remember you saw those railway tracks? This is calcium laid down inside a tube as well, but you can see that the outside of the tube is irregular. That's cancer." Dershaw's explanations were beginning to be confusing. "There are certain calcifications in benign tissues that are always benign," he said. "There are certain kinds that are always associated with cancer. But those are the ends of the spectrum, and the vast amount of calcium is somewhere in the middle. And making that differentiation, between whether the calcium is acceptable or not, is not clear-cut."
The same is true of lumps. Some lumps are simply benign clumps of cells. You can tell they are benign because the walls of the mass look round and smooth; in a cancer, cells proliferate so wildly that the walls of the tumor tend to be ragged and to intrude into the surrounding tissue. But sometimes benign lumps resemble tumors, and sometimes tumors look a lot like benign lumps. And sometimes you have lots of masses that, taken individually, would be suspicious but are so pervasive that the reasonable conclusion is that this is just how the woman's breast looks. "If you have a CAT scan of the chest, the heart always looks like the heart, the aorta always looks like the aorta," Dershaw said. "So when there's a lump in the middle of that, it's clearly abnormal. Looking at a mammogram is conceptually different from looking at images elsewhere in the body. Everything else has anatomy--anatomy that essentially looks the same from one person to the next. But we don't have that kind of standardized information on the breast. The most difficult decision I think anybody needs to make when we're confronted with a patient is: Is this person normal? And we have to decide that without a pattern that is reasonably stable from individual to individual, and sometimes even without a pattern that is the same from the left side to the right."
Dershaw was saying that mammography doesn't fit our normal expectations of pictures. In the days before the invention of photography, for instance, a horse in motion was represented in drawings and paintings according to the convention of ventre à terre, or "belly to the ground." Horses were drawn with their front legs extended beyond their heads, and their hind legs stretched straight back, because that was the way, in the blur of movement, a horse seemed to gallop. Then, in the eighteen-seventies, came Eadweard Muybridge, with his famous sequential photographs of a galloping horse, and that was the end of ventre à terre. Now we knew how a horse galloped. The photograph promised that we would now be able to capture reality itself.
The situation with mammography is different. The way in which we ordinarily speak about calcium and lumps is clear and unambiguous. But the picture demonstrates how blurry those seemingly distinct categories actually are. Joann Elmore, a physician and epidemiologist at the University of Washington Harborview Medical Center, once asked ten board-certified radiologists to look at a hundred and fifty mammograms--of which twenty-seven had come from women who developed breast cancer, and a hundred and twenty-three from women who were known to have been healthy. One radiologist caught eighty-five per cent of the cancers the first time around. Another caught only thirty-seven per cent. One looked at the same X-rays and saw suspicious masses in seventy-eight per cent of the cases. Another doctor saw "focal asymmetric density" in half of the cancer cases; yet another saw no "focal asymmetric density" at all. There was one particularly perplexing mammogram that three radiologists thought was normal, two thought was abnormal but probably benign, four couldn't make up their minds about, and one was convinced was cancer. (The patient was fine.) Some of these differences are a matter of skill, and there is good evidence that with more rigorous training and experience radiologists can become better at reading breast X-rays. But so much of what can be seen on an X-ray falls into a gray area that interpreting a mammogram is also, in part, a matter of temperament. Some radiologists see something ambiguous and are comfortable calling it normal. Others see something ambiguous and get suspicious.
Does that mean radiologists ought to be as suspicious as possible? You might think so, but caution simply creates another kind of problem. The radiologist in the Elmore study who caught the most cancers also recommended immediate workups--a biopsy, an ultrasound, or additional X-rays--on sixty-four per cent of the women who didn't have cancer. In the real world, a radiologist who needlessly subjected such an extraordinary percentage of healthy patients to the time, expense, anxiety, and discomfort of biopsies and further testing would find himself seriously out of step with his profession. Mammography is not a form of medical treatment, where doctors are justified in going to heroic lengths on behalf of their patients. Mammography is a form of medical screening: it is supposed to exclude the healthy, so that more time and attention can be given to the sick. If screening doesn't screen, it ceases to be useful.
Gilbert Welch, a medical-outcomes expert at Dartmouth, has pointed out that, given current breast-cancer mortality rates, nine out of every thousand sixty-year-old women will die of breast cancer in the next ten years. If every one of those women had a mammogram every year, that number would fall to six. The radiologist seeing those thousand women, in other words, would read ten thousand X-rays over a decade in order to save three lives--and that's using the most generous possible estimate of mammography's effectiveness. The reason a radiologist is required to assume that the overwhelming number of ambiguous things are normal, in other words, is that the overwhelming number of ambiguous things really are normal. Radiologists are, in this sense, a lot like baggage screeners at airports. The chances are that the dark mass in the middle of the suitcase isn't a bomb, because you've seen a thousand dark masses like it in suitcases before, and none of those were bombs--and if you flagged every suitcase with something ambiguous in it no one would ever make his flight. But that doesn't mean, of course, that it isn't a bomb. All you have to go on is what it looks like on the X-ray screen--and the screen seldom gives you quite enough information.
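Welch's arithmetic can be spelled out directly. The following is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above; the variable names and the per-life-saved ratio are mine, not Welch's.

women = 1000                  # sixty-year-old women followed for a decade
deaths_unscreened = 9         # expected breast-cancer deaths with no mammography
deaths_screened = 6           # expected deaths if every woman has an annual mammogram
years = 10

x_rays_read = women * years                        # 10,000 mammograms over the decade
lives_saved = deaths_unscreened - deaths_screened  # 3 lives

print(x_rays_read, "X-rays read to save", lives_saved, "lives")
print("roughly", x_rays_read // lives_saved, "readings per life saved")

Run as written, this prints ten thousand readings for three lives saved, on the order of three thousand films per life, which is why the default reading of any one ambiguous film has to be "normal."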
3.
Dershaw picked up a new X-ray and put it on the light box. It belonged to a forty-eight-year-old woman. Mammograms indicate density in the breast: the denser the tissue is, the more the X-rays are absorbed, creating the variations in black and white that make up the picture. Fat hardly absorbs the beam at all, so it shows up as black. Breast tissue, particularly the thick breast tissue of younger women, shows up on an X-ray as shades of light gray or white. This woman's breasts consisted of fat at the back of the breast and more dense, glandular tissue toward the front, so the back of the X-ray was black, with what looked like a large, dense white cloud behind the nipple. Clearly visible, in the black, fatty portion of the left breast, was a white spot. "Now, that looks like a cancer, that little smudgy, irregular, infiltrative thing," Dershaw said. "It's about five millimetres across." He looked at the X-ray for a moment. This was mammography at its best: a clear picture of a problem that needed to be fixed. Then he took a pen and pointed to the thick cloud just to the right of the tumor. The cloud and the tumor were exactly the same color. "That cancer only shows up because it's in the fatty part of the breast," he said. "If you take that cancer and put it in the dense part of the breast, you'd never see it, because the whiteness of the mass is the same as the whiteness of normal tissue. If the tumor was over there, it could be four times as big and we still wouldn't see it."
What's more, mammography is especially likely to miss the tumors that do the most harm. A team led by the research pathologist Peggy Porter analyzed four hundred and twenty-nine breast cancers that had been diagnosed over five years at the Group Health Cooperative of Puget Sound. Of those, two hundred and seventy-nine were picked up by mammography, and the bulk of them were detected very early, at what is called Stage One. (Cancer is classified into four stages, according to how far the tumor has spread from its original position.) Most of the tumors were small, less than two centimetres. Pathologists grade a tumor's aggression according to such measures as the "mitotic count"--the rate at which the cells are dividing--and the screen-detected tumors were graded "low" in almost seventy per cent of the cases. These were the kinds of cancers that could probably be treated successfully. "Most tumors develop very, very slowly, and those tend to lay down calcium deposits--and what mammograms are doing is picking up those calcifications," Leslie Laufman, a hematologist-oncologist in Ohio, who served on a recent National Institutes of Health breast-cancer advisory panel, said. "Almost by definition, mammograms are picking up slow-growing tumors."
A hundred and fifty cancers in Porter's study, however, were missed by mammography. Some of these were tumors the mammogram couldn't see--that were, for instance, hiding in the dense part of the breast. The majority, though, simply didn't exist at the time of the mammogram. These cancers were found in women who had had regular mammograms, and who were legitimately told that they showed no sign of cancer on their last visit. In the interval between X-rays, however, either they or their doctor had manually discovered a lump in their breast, and these "interval" cancers were twice as likely to be in Stage Three and three times as likely to have high mitotic counts; twenty-eight per cent had spread to the lymph nodes, as opposed to eighteen per cent of the screen-detected cancers. These tumors were so aggressive that they had gone from undetectable to detectable in the interval between two mammograms.
The problem of interval tumors explains why the overwhelming majority of breast-cancer experts insist that women in the critical fifty-to-sixty-nine age group get regular mammograms. In Porter's study, the women were X-rayed at intervals as great as every three years, and that created a window large enough for interval cancers to emerge. Interval cancers also explain why many breast-cancer experts believe that mammograms must be supplemented by regular and thorough clinical breast exams. ("Thorough" is defined as palpation of the area from the collarbone to the bottom of the rib cage, one dime-size area at a time, at three levels of pressure--just below the skin, the mid-breast, and up against the chest wall--by a specially trained practitioner for a period not less than five minutes per breast.) In a major study of mammography's effectiveness--one of a pair of Canadian trials conducted in the nineteen-eighties--women who were given regular, thorough breast exams but no mammograms were compared with those who had thorough breast exams and regular mammograms, and no difference was found in the death rates from breast cancer between the two groups. The Canadian studies are controversial, and some breast-cancer experts are convinced that they may have understated the benefits of mammography. But there is no denying the basic lessons of the Canadian trials: that a skilled pair of fingertips can find out an extraordinary amount about the health of a breast, and that we should not automatically value what we see in a picture over what we learn from our other senses.
"The finger has hundreds of sensors per square centimetre," says Mark Goldstein, a sensory psychophysicist who co-founded MammaCare, a company devoted to training nurses and physicians in the art of the clinical exam. "There is nothing in science or technology that has even come close to the sensitivity of the human finger with respect to the range of stimuli it can pick up. It's a brilliant instrument. But we simply don't trust our tactile sense as much as our visual sense."
4.
On August 17, 1943, two hundred B-17 bombers from the United States Eighth Air Force set out from England for the German city of Schweinfurt. Two months later, two hundred and twenty-eight B-17s set out to strike Schweinfurt a second time. The raids were two of the heaviest bombing missions of the war, and the Allied experience at Schweinfurt is an example of a more subtle--but in some cases more serious--problem with the picture paradigm.
The Schweinfurt raids grew out of the United States military's commitment to bombing accuracy. As Stephen Budiansky writes in his wonderful recent book "Air Power," the chief lesson of aerial bombardment in the First World War was that hitting a target from eight or ten thousand feet was a prohibitively difficult task. In the thick of battle, the bombardier had to adjust for the speed of the plane, the speed and direction of the prevailing winds, and the pitching and rolling of the plane, all while keeping the bombsight level with the ground. It was an impossible task, requiring complex trigonometric calculations. For a variety of reasons, including the technical challenges, the British simply abandoned the quest for precision: in both the First World War and the Second, the British military pursued a strategy of "morale" or "area" bombing, in which bombs were simply dropped, indiscriminately, on urban areas, with the intention of killing, dispossessing, and dispiriting the German civilian population.
But the American military believed that the problem of bombing accuracy was solvable, and a big part of the solution was something called the Norden bombsight. This breakthrough was the work of a solitary, cantankerous genius named Carl Norden, who operated out of a factory in New York City. Norden built a fifty-pound mechanical computer called the Mark XV, which used gears and wheels and gyroscopes to calculate airspeed, altitude, and crosswinds in order to determine the correct bomb-release point. The Mark XV, Norden's business partner boasted, could put a bomb in a pickle barrel from twenty thousand feet. The United States spent $1.5 billion developing it, which, as Budiansky points out, was more than half the amount that was spent building the atomic bomb. "At air bases, the Nordens were kept under lock and key in secure vaults, escorted to their planes by armed guards, and shrouded in a canvas cover until after takeoff," Budiansky recounts. The American military, convinced that its bombers could now hit whatever they could see, developed a strategic approach to bombing, identifying and selectively destroying targets that were critical to the Nazi war effort. In early 1943, General Henry (Hap) Arnold--the head of the Army Air Forces--assembled a group of prominent civilians to analyze the German economy and recommend critical targets. The Advisory Committee on Bombardment, as it was called, determined that the United States should target Germany's ball-bearing factories, since ball bearings were critical to the manufacture of airplanes. And the center of the German ball-bearing industry was Schweinfurt. Allied losses from the two raids were staggering. Thirty-six B-17s were shot down in the August attack, sixty-two bombers were shot down in the October raid, and between the two operations a further hundred and thirty-eight planes were badly damaged. Yet, with the war in the balance, this was considered worth the price. When the damage reports came in, Arnold exulted, "Now we have got Schweinfurt!" He was wrong.
The problem was not, as in the case of the Scud hunt, that the target could not be found, or that what was thought to be the target was actually something else. The B-17s, aided by their Norden Mark XVs, hit the ball-bearing factories hard. The problem was that the picture Air Force officers had of their target didn't tell them what they really needed to know. The Germans, it emerged, had ample stockpiles of ball bearings. They also had no difficulty increasing their imports from Sweden and Switzerland, and, through a few simple design changes, they were able to greatly reduce their need for ball bearings in aircraft production. What's more, although the factory buildings were badly damaged by the bombing, the machinery inside wasn't. Ball-bearing equipment turned out to be surprisingly hardy. "As it was, not a tank, plane, or other piece of weaponry failed to be produced because of lack of ball bearings," Albert Speer, the Nazi production chief, wrote after the war. Seeing a problem and understanding it, then, are two different things.
In recent years, with the rise of highly accurate long-distance weaponry, the Schweinfurt problem has become even more acute. If you can aim at and hit the kitchen at the back of a house, after all, you don't have to bomb the whole building. So your bomb can be two hundred pounds rather than a thousand. That means, in turn, that you can fit five times as many bombs on a single plane and hit five times as many targets in a single sortie, which sounds good--except that now you need to get intelligence on five times as many targets. And that intelligence has to be five times more specific, because if the target is in the bedroom and not the kitchen, you've missed him.
This is the issue that the United States command faced in the most recent Iraq war. Early in the campaign, the military mounted a series of air strikes against specific targets, where Saddam Hussein or other senior Baathist officials were thought to be hiding. There were fifty of these so-called "decapitation" attempts, each taking advantage of the fact that modern-day G.P.S.-guided bombs can be delivered from a fighter to within thirteen metres of their intended target. The strikes were dazzling in their precision. In one case, a restaurant was levelled. In another, a bomb burrowed down into a basement. But, in the end, every single strike failed. "The issue isn't accuracy," Watts, who has written extensively on the limitations of high-tech weaponry, says. "The issue is the quality of targeting information. The amount of information we need has gone up an order of magnitude or two in the last decade."
5.
Mammography has a Schweinfurt problem as well. Nowhere is that more evident than in the case of the breast lesion known as ductal carcinoma in situ, or DCIS, which shows up as a cluster of calcifications inside the ducts that carry milk to the nipple. It's a tumor that hasn't spread beyond those ducts, and it is so tiny that without mammography few women with DCIS would ever know they had it. In the past couple of decades, as more and more people have received regular breast X-rays and the resolution of mammography has increased, diagnoses of DCIS have soared. About fifty thousand new cases are now found every year in the United States, and virtually every DCIS lesion detected by mammography is promptly removed. But what has the targeting and destruction of DCIS meant for the battle against breast cancer? You'd expect that if we've been catching fifty thousand early-stage cancers every year, we should be seeing a corresponding decrease in the number of late-stage invasive cancers. It's not clear whether we have. During the past twenty years, the incidence of invasive breast cancer has continued to rise by the same small, steady increment every year.
In 1987, pathologists in Denmark performed a series of autopsies of women in their forties who had not been known to have breast cancer when they died of other causes. The pathologists looked at an average of two hundred and seventy-five samples of breast tissue in each case, and found some evidence of cancer--usually DCIS--in nearly forty per cent of the women. Since breast cancer accounts for less than four per cent of female deaths, clearly the overwhelming majority of these women, had they lived longer, would never have died of breast cancer. "To me, that indicates that these kinds of genetic changes happen really frequently, and that they can happen without having an impact on women's health," Karla Kerlikowske, a breast-cancer expert at the University of California at San Francisco, says. "The body has this whole mechanism to repair things, and maybe that's what happened with these tumors." Gilbert Welch, the medical-outcomes expert, thinks that we fail to understand the hit-or-miss nature of cancerous growth, and assume it to be a process that, in the absence of intervention, will eventually kill us. "A pathologist from the International Agency for Research on Cancer once told me that the biggest mistake we ever made was attaching the word 'carcinoma' to DCIS," Welch says. "The minute carcinoma got linked to it, it all of a sudden drove doctors to recommend therapy, because what was implied was that this was a lesion that would inexorably progress to invasive cancer. But we know that that's not always the case."
In some percentage of cases, however, DCIS does progress to something more serious. Some studies suggest that this happens very infrequently. Others suggest that it happens frequently enough to be of major concern. There is no definitive answer, and it's all but impossible to tell, simply by looking at a mammogram, whether a given DCIS tumor is among those lesions which will grow out from the duct or part of the majority that will never amount to anything. That's why some doctors feel that we have no choice but to treat every DCIS as life-threatening, and in thirty per cent of cases that means a mastectomy, and in another thirty-five per cent it means a lumpectomy and radiation. Would taking a better picture solve the problem? Not really, because the problem is that you don't know for sure what you're seeing, and as pictures have become better we have put ourselves in a position where we see more and more things that we don't know how to interpret. When it comes to DCIS, the mammogram delivers information without true understanding. "Almost half a million women have been diagnosed and treated for DCIS since the early nineteen-eighties--a diagnosis virtually unknown before then," Welch writes in his new book, "Should I Be Tested for Cancer?," a brilliant account of the statistical and medical uncertainties surrounding cancer screening. "This increase is the direct result of looking harder--in this case with 'better' mammography equipment. But I think you can see why it is a diagnosis that some women might reasonably prefer not to know about."
6.
The disturbing thing about DCIS, of course, is that our approach to this tumor seems like a textbook example of how the battle against cancer is supposed to work. Use a powerful camera. Take a detailed picture. Spot the tumor as early as possible. Treat it immediately and aggressively. The campaign to promote regular mammograms has used this early-detection argument with great success, because it makes intuitive sense. The danger posed by a tumor is represented visually. Large is bad; small is better--less likely to have metastasized. But here, too, tumors defy our visual intuitions.
According to Donald Berry, who is the chairman of the Department of Biostatistics and Applied Mathematics at M. D. Anderson Cancer Center, in Houston, a woman's risk of death increases only by about ten per cent for every additional centimetre in tumor length. "Suppose there is a tumor size above which the tumor is lethal, and below which it's not," Berry says. "The problem is that the threshold varies. When we find a tumor, we don't know whether it has metastasized already. And we don't know whether it's tumor size that drives the metastatic process or whether all you need is a few million cells to start sloughing off to other parts of the body. We do observe that it's worse to have a bigger tumor. But not amazingly worse. The relationship is not as great as you'd think."
In a recent genetic analysis of breast-cancer tumors, scientists selected women with breast cancer who had been followed for many years, and divided them into two groups--those whose cancer had gone into remission, and those whose cancer spread to the rest of their body. Then the scientists went back to the earliest moment that each cancer became apparent, and analyzed thousands of genes in order to determine whether it was possible to predict, at that moment, who was going to do well and who wasn't. Early detection presumes that it isn't possible to make that prediction: a tumor is removed before it becomes truly dangerous. But scientists discovered that even with tumors in the one-centimetre range--the range in which cancer is first picked up by a mammogram--the fate of the cancer seems already to have been set. "What we found is that there is biology that you can glean from the tumor, at the time you take it out, that is strongly predictive of whether or not it will go on to metastasize," Stephen Friend, a member of the gene-expression team at Merck, says. "We like to think of a small tumor as an innocent. The reality is that in that innocent lump are a lot of behaviors that spell a potential poor or good prognosis."
The good news here is that it might eventually be possible to screen breast cancers on a genetic level, using other kinds of tests--even blood tests--to look for the biological traces of those genes. This might also help with the chronic problem of overtreatment in breast cancer. If we can single out that small percentage of women whose tumors will metastasize, we can spare the rest the usual regimen of surgery, radiation, and chemotherapy. Gene-signature research is one of a number of reasons that many scientists are optimistic about the fight against breast cancer. But it is an advance that has nothing to do with taking more pictures, or taking better pictures. It has to do with going beyond the picture.
Under the circumstances, it is not hard to understand why mammography draws so much controversy. The picture promises certainty, and it cannot deliver on that promise. Even after forty years of research, there remains widespread disagreement over how much benefit women in the critical fifty-to-sixty-nine age bracket receive from breast X-rays, and further disagreement about whether there is enough evidence to justify regular mammography in women under fifty and over seventy. Is there any way to resolve the disagreement? Donald Berry says that there probably isn't--that a clinical trial that could definitively answer the question of mammography's precise benefits would have to be so large (involving more than five hundred thousand women) and so expensive (costing billions of dollars) as to be impractical. The resulting confusion has turned radiologists who do mammograms into one of the chief targets of malpractice litigation. "The problem is that mammographers--radiology groups--do hundreds of thousands of these mammograms, giving women the illusion that these things work and they are good, and if a lump is found and in most cases if it is found early, they tell women they have the probability of a higher survival rate," says E. Clay Parker, a Florida plaintiff's attorney, who recently won a $5.1 million judgment against an Orlando radiologist. "But then, when it comes to defending themselves, they tell you that the reality is that it doesn't make a difference when you find it. So you scratch your head and say, 'Well, why do you do mammography, then?'"
The answer is that mammograms do not have to be infallible to save lives. A modest estimate of mammography's benefit is that it reduces the risk of dying from breast cancer by about ten per cent--which works out, for the average woman in her fifties, to be about three extra days of life, or, to put it another way, a health benefit on a par with wearing a helmet on a ten-hour bicycle trip. That is not a trivial benefit. Multiplied across the millions of adult women in the United States, it amounts to thousands of lives saved every year, and, in combination with a medical regimen that includes radiation, surgery, and new and promising drugs, it has helped brighten the prognosis for women with breast cancer. Mammography isn't as good as we'd like it to be. But we are still better off than we would be without it.
"There is increasingly an understanding among those of us who do this a lot that our efforts to sell mammography may have been over-vigorous," Dershaw said, "and that although we didn't intend to, the perception may have been that mammography accomplishes even more than it does." He was looking, as he spoke, at the mammogram of the woman whose tumor would have been invisible had it been a few centimetres to the right. Did looking at an X-ray like that make him nervous? Dershaw shook his head. "You have to respect the limitations of the technology," he said. "My job with the mammogram isn't to find what I can't find with a mammogram. It's to find what I can find with a mammogram. If I'm not going to accept that, then I shouldn't be reading mammograms."
7.
In February of last year, just before the start of the Iraq war, Secretary of State Colin Powell went before the United Nations to declare that Iraq was in defiance of international law. He presented transcripts of telephone conversations between senior Iraqi military officials, purportedly discussing attempts to conceal weapons of mass destruction. He told of eyewitness accounts of mobile biological-weapons facilities. And, most persuasively, he presented a series of images--carefully annotated, high-resolution satellite photographs of what he said was the Iraqi chemical-munitions facility at Taji.
"Let me say a word about satellite images before I show a couple," Powell began. "The photos that I am about to show you are sometimes hard for the average person to interpret, hard for me. The painstaking work of photo analysis takes experts with years and years of experience, poring for hours and hours over light tables. But as I show you these images, I will try to capture and explain what they mean, what they indicate, to our imagery specialists." The first photograph was dated November 10, 2002, just three months earlier, and years after the Iraqis were supposed to have rid themselves of all weapons of mass destruction. "Let me give you a closer look," Powell said as he flipped to a closeup of the first photograph. It showed a rectangular building, with a vehicle parked next to it. "Look at the image on the left. On the left is a closeup of one of the four chemical bunkers. The two arrows indicate the presence of sure signs that the bunkers are storing chemical munitions. The arrow at the top that says 'Security' points to a facility that is a signature item for this kind of bunker. Inside that facility are special guards and special equipment to monitor any leakage that might come out of the bunker." Then he moved to the vehicle next to the building. It was, he said, another signature item. "It's a decontamination vehicle in case something goes wrong. . . . It is moving around those four and it moves as needed to move as people are working in the different bunkers."
Powell's analysis assumed, of course, that you could tell from the picture what kind of truck it was. But pictures of trucks, taken from above, are not always as clear as we would like; sometimes trucks hauling oil tanks look just like trucks hauling Scud launchers, and, while a picture is a good start, if you really want to know what you're looking at you probably need more than a picture. I looked at the photographs with Patrick Eddington, who for many years was an imagery analyst with the C.I.A. Eddington examined them closely. "They're trying to say that those are decontamination vehicles," he told me. He had a photo up on his laptop, and he peered closer to get a better look. "But the resolution is sufficient for me to say that I don't think it is--and I don't see any other decontamination vehicles down there that I would recognize." The standard decontamination vehicle was a Soviet-made box-body van, Eddington said. This truck was too long. For a second opinion, Eddington recommended Ray McGovern, a twenty-seven-year C.I.A. analyst, who had been one of George H. W. Bush's personal intelligence briefers when he was Vice-President. "If you're an expert, you can tell one hell of a lot from pictures like this," McGovern said. He'd heard another interpretation. "I think," he said, "that it's a fire truck."
The Vanishing
January 15, 2005
BOOKS
In "Collapse," Jared Diamond
shows how societies destroy themselves.
1.
A thousand years ago, a group of Vikings led by Erik the Red set sail from Norway for the vast Arctic landmass west of Scandinavia which came to be known as Greenland. It was largely uninhabitable—a forbidding expanse of snow and ice. But along the southwestern coast there were two deep fjords protected from the harsh winds and saltwater spray of the North Atlantic Ocean, and as the Norse sailed upriver they saw grassy slopes flowering with buttercups, dandelions, and bluebells, and thick forests of willow and birch and alder. Two colonies were formed, three hundred miles apart, known as the Eastern and Western Settlements. The Norse raised sheep, goats, and cattle. They turned the grassy slopes into pastureland. They hunted seal and caribou. They built a string of parish churches and a magnificent cathedral, the remains of which are still standing. They traded actively with mainland Europe, and tithed regularly to the Roman Catholic Church. The Norse colonies in Greenland were law-abiding, economically viable, fully integrated communities, numbering at their peak five thousand people. They lasted for four hundred and fifty years—and then they vanished.
The story of the Eastern and Western Settlements of Greenland is told in Jared Diamond's "Collapse: How Societies Choose to Fail or Succeed" (Viking; $29.95). Diamond teaches geography at U.C.L.A. and is well known for his best-seller "Guns, Germs, and Steel," which won a Pulitzer Prize. In "Guns, Germs, and Steel," Diamond looked at environmental and structural factors to explain why Western societies came to dominate the world. In "Collapse," he continues that approach, only this time he looks at history's losers—like the Easter Islanders, the Anasazi of the American Southwest, the Mayans, and the modern-day Rwandans. We live in an era preoccupied with the way that ideology and culture and politics and economics help shape the course of history. But Diamond isn't particularly interested in any of those things—or, at least, he's interested in them only insofar as they bear on what to him is the far more important question, which is a society's relationship to its climate and geography and resources and neighbors. "Collapse" is a book about the most prosaic elements of the earth's ecosystem—soil, trees, and water—because societies fail, in Diamond's view, when they mismanage those environmental factors.
There was nothing wrong with the social organization of the Greenland settlements. The Norse built a functioning reproduction of the predominant northern-European civic model of the time—devout, structured, and reasonably orderly. In 1408, right before the end, records from the Eastern Settlement dutifully report that Thorstein Olafsson married Sigrid Bjornsdotter in Hvalsey Church on September 14th of that year, with Brand Halldorstson, Thord Jorundarson, Thorbjorn Bardarson, and Jon Jonsson as witnesses, following the proclamation of the wedding banns on three consecutive Sundays.
The problem with the settlements, Diamond argues, was that the Norse thought that Greenland really was green; they treated it as if it were the verdant farmland of southern Norway. They cleared the land to create meadows for their cows, and to grow hay to feed their livestock through the long winter. They chopped down the forests for fuel, and for the construction of wooden objects. To make houses warm enough for the winter, they built their homes out of six-foot-thick slabs of turf, which meant that a typical home consumed about ten acres of grassland.
But Greenland's ecosystem was too fragile to withstand that kind of pressure. The short, cool growing season meant that plants developed slowly, which in turn meant that topsoil layers were shallow and lacking in soil constituents, like organic humus and clay, that hold moisture and keep soil resilient in the face of strong winds. "The sequence of soil erosion in Greenland begins with cutting or burning the cover of trees and shrubs, which are more effective at holding soil than is grass," he writes. "With the trees and shrubs gone, livestock, especially sheep and goats, graze down the grass, which regenerates only slowly in Greenland's climate. Once the grass cover is broken and the soil is exposed, soil is carried away especially by the strong winds, and also by pounding from occasionally heavy rains, to the point where the topsoil can be removed for a distance of miles from an entire valley." Without adequate pastureland, the summer hay yields shrank; without adequate supplies of hay, keeping livestock through the long winter got harder. And, without adequate supplies of wood, getting fuel for the winter became increasingly difficult.
The Norse needed to reduce their reliance on livestock—particularly cows, which consumed an enormous amount of agricultural resources. But cows were a sign of high status; to northern Europeans, beef was a prized food. They needed to copy the Inuit practice of burning seal blubber for heat and light in the winter, and to learn from the Inuit the difficult art of hunting ringed seals, which were the most reliably plentiful source of food available in the winter. But the Norse had contempt for the Inuit—they called them skraelings, "wretches"—and preferred to practice their own brand of European agriculture. In the summer, when the Norse should have been sending ships on lumber-gathering missions to Labrador, in order to relieve the pressure on their own forestlands, they instead sent boats and men to the coast to hunt for walrus. Walrus tusks, after all, had great trade value. In return for those tusks, the Norse were able to acquire, among other things, church bells, stained-glass windows, bronze candlesticks, Communion wine, linen, silk, silver, churchmen's robes, and jewelry to adorn their massive cathedral at Gardar, with its three-ton sandstone building blocks and eighty-foot bell tower. In the end, the Norse starved to death.
2.
Diamond's argument stands in sharp contrast to the conventional explanations for a society's collapse. Usually, we look for some kind of cataclysmic event. The aboriginal civilization of the Americas was decimated by the sudden arrival of smallpox. European Jewry was destroyed by Nazism. Similarly, the disappearance of the Norse settlements is usually blamed on the Little Ice Age, which descended on Greenland in the early fourteen-hundreds, ending several centuries of relative warmth. (One archeologist refers to this as the "It got too cold, and they died" argument.) What all these explanations have in common is the idea that civilizations are destroyed by forces outside their control, by acts of God.
But look, Diamond says, at Easter Island. Once, it was home to a thriving culture that produced the enormous stone statues that continue to inspire awe. It was home to dozens of species of trees, which created and protected an ecosystem fertile enough to support as many as thirty thousand people. Today, it's a barren and largely empty outcropping of volcanic rock. What happened? Did a rare plant virus wipe out the island's forest cover? Not at all. The Easter Islanders chopped their trees down, one by one, until they were all gone. "I have often asked myself, 'What did the Easter Islander who cut down the last palm tree say while he was doing it?'" Diamond writes, and that, of course, is what is so troubling about the conclusions of "Collapse." Those trees were felled by rational actors—who must have suspected that the destruction of this resource would result in the destruction of their civilization. The lesson of "Collapse" is that societies, as often as not, aren't murdered. They commit suicide: they slit their wrists and then, in the course of many decades, stand by passively and watch themselves bleed to death.
This doesn't mean that acts of God don't play a role. It did get colder in Greenland in the early fourteen-hundreds. But it didn't get so cold that the island became uninhabitable. The Inuit survived long after the Norse died out, and the Norse had all kinds of advantages, including a more diverse food supply, iron tools, and ready access to Europe. The problem was that the Norse simply couldn't adapt to the country's changing environmental conditions. Diamond writes, for instance, of the fact that nobody can find fish remains in Norse archeological sites. One scientist sifted through tons of debris from the Vatnahverfi farm and found only three fish bones; another researcher analyzed thirty-five thousand bones from the garbage of another Norse farm and found two fish bones. How can this be? Greenland is a fisherman's dream: Diamond describes running into a Danish tourist in Greenland who had just caught two Arctic char in a shallow pool with her bare hands. "Every archaeologist who comes to excavate in Greenland . . . starts out with his or her own idea about where all those missing fish bones might be hiding," he writes. "Could the Norse have strictly confined their munching on fish to within a few feet of the shoreline, at sites now underwater because of land subsidence? Could they have faithfully saved all their fish bones for fertilizer, fuel, or feeding to cows?" It seems unlikely. There are no fish bones in Norse archeological remains, Diamond concludes, for the simple reason that the Norse didn't eat fish. For one reason or another, they had a cultural taboo against it.
Given the difficulty that the Norse had in putting food on the table, this was insane. Eating fish would have substantially reduced the ecological demands of the Norse settlements. The Norse would have needed fewer livestock and less pastureland. Fishing is not nearly as labor-intensive as raising cattle or hunting caribou, so eating fish would have freed time and energy for other activities. It would have diversified their diet.
Why did the Norse choose not to eat fish? Because they weren't thinking about their biological survival. They were thinking about their cultural survival. Food taboos are one of the idiosyncrasies that define a community. Not eating fish served the same function as building lavish churches, and doggedly replicating the untenable agricultural practices of their land of origin. It was part of what it meant to be Norse, and if you are going to establish a community in a harsh and forbidding environment all those little idiosyncrasies which define and cement a culture are of paramount importance. "The Norse were undone by the same social glue that had enabled them to master Greenland's difficulties," Diamond writes. "The values to which people cling most stubbornly under inappropriate conditions are those values that were previously the source of their greatest triumphs over adversity." He goes on:
To us in our secular modern society, the predicament in which the Greenlanders found themselves is difficult to fathom. To them, however, concerned with their social survival as much as their biological survival, it was out of the question to invest less in churches, to imitate or intermarry with the Inuit, and thereby to face an eternity in Hell just in order to survive another winter on Earth.
Diamond's distinction between social and biological survival is a critical one, because too often we blur the two, or assume that biological survival is contingent on the strength of our civilizational values. That was the lesson taken from the two world wars and the nuclear age that followed: we would survive as a species only if we learned to get along and resolve our disputes peacefully. The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.
Diamond points out that the Easter Islanders did not practice, so far as we know, a uniquely pathological version of South Pacific culture. Other societies, on other islands in the Pacific, chopped down trees and farmed and raised livestock just as the Easter Islanders did. What doomed the Easter Islanders was the interaction between what they did and where they were. Diamond and a colleague, Barry Rollet, identified nine physical factors that contributed to the likelihood of deforestation—including latitude, average rainfall, aerial-ash fallout, proximity to Central Asia's dust plume, size, and so on—and Easter Island ranked at the high-risk end of nearly every variable. "The reason for Easter's unusually severe degree of deforestation isn't that those seemingly nice people really were unusually bad or improvident," he concludes. "Instead, they had the misfortune to be living in one of the most fragile environments, at the highest risk for deforestation, of any Pacific people." The problem wasn't the Easter Islanders. It was Easter Island.
In the second half of "Collapse," Diamond turns his attention to modern examples, and one of his case studies is the recent genocide in Rwanda. What happened in Rwanda is commonly described as an ethnic struggle between the majority Hutu and the historically dominant, wealthier Tutsi, and it is understood in those terms because that is how we have come to explain much of modern conflict: Serb and Croat, Jew and Arab, Muslim and Christian. The world is a cauldron of cultural antagonism. It's an explanation that clearly exasperates Diamond. The Hutu didn't just kill the Tutsi, he points out. The Hutu also killed other Hutu. Why? Look at the land: steep hills farmed right up to the crests, without any protective terracing; rivers thick with mud from erosion; extreme deforestation leading to irregular rainfall and famine; staggeringly high population densities; the exhaustion of the topsoil; falling per-capita food production. This was a society on the brink of ecological disaster, and if there is anything that is clear from the study of such societies it is that they inevitably descend into genocidal chaos. In "Collapse," Diamond quite convincingly defends himself against the charge of environmental determinism. His discussions are always nuanced, and he gives political and ideological factors their due. The real issue is how, in coming to terms with the uncertainties and hostilities of the world, the rest of us have turned ourselves into cultural determinists.
3.
For the past thirty years, Oregon has had one of the strictest sets of land-use regulations in the nation, requiring new development to be clustered in and around existing urban development. The laws meant that Oregon has done perhaps the best job in the nation in limiting suburban sprawl, and protecting coastal lands and estuaries. But this November Oregon's voters passed a ballot referendum, known as Measure 37, that rolled back many of those protections. Specifically, Measure 37 said that anyone who could show that the value of his land was affected by regulations implemented since its purchase was entitled to compensation from the state. If the state declined to pay, the property owner would be exempted from the regulations.
To call Measure 37—and similar referendums that have been passed recently in other states—intellectually incoherent is to put it mildly. It might be that the reason your hundred-acre farm on a pristine hillside is worth millions to a developer is that it's on a pristine hillside: if everyone on that hillside could subdivide, and sell out to Target and Wal-Mart, then nobody's plot would be worth millions anymore. Will the voters of Oregon then pass Measure 38, allowing them to sue the state for compensation over damage to property values caused by Measure 37?
It is hard to read "Collapse," though, and not have an additional reaction to Measure 37. Supporters of the law spoke entirely in the language of political ideology. To them, the measure was a defense of property rights, preventing the state from unconstitutional "takings." If you replaced the term "property rights" with "First Amendment rights," this would have been indistinguishable from an argument over, say, whether charitable groups ought to be able to canvass in malls, or whether cities can control the advertising they sell on the sides of public buses. As a society, we do a very good job with these kinds of debates: we give everyone a hearing, and pass laws, and make compromises, and square our conclusions with our constitutional heritage—and in the Oregon debate the quality of the theoretical argument was impressively high.
The thing that got lost in the debate, however, was the land. In a rapidly growing state like Oregon, what, precisely, are the state's ecological strengths and vulnerabilities? What impact will changed land-use priorities have on water and soil and cropland and forest? One can imagine Diamond writing about the Measure 37 debate, and he wouldn't be very impressed by how seriously Oregonians wrestled with the problem of squaring their land-use rules with their values, because to him a society's environmental birthright is not best discussed in those terms. Rivers and streams and forests and soil are a biological resource. They are a tangible, finite thing, and societies collapse when they get so consumed with addressing the fine points of their history and culture and deeply held beliefs—with making sure that Thorstein Olafsson and Sigrid Bjornsdotter are married before the right number of witnesses following the announcement of wedding banns on the right number of Sundays—that they forget that the pastureland is shrinking and the forest cover is gone.
When archeologists looked through the ruins of the Western Settlement, they found plenty of the big wooden objects that were so valuable in Greenland—crucifixes, bowls, furniture, doors, roof timbers—which meant that the end came too quickly for anyone to do any scavenging. And, when the archeologists looked at the animal bones left in the debris, they found the bones of newborn calves, meaning that the Norse, in that final winter, had given up on the future. They found toe bones from cows, equal to the number of cow spaces in the barn, meaning that the Norse ate their cattle down to the hoofs, and they found the bones of dogs covered with knife marks, meaning that, in the end, they had to eat their pets. But not fish bones, of course. Right up until they starved to death, the Norse never lost sight of what they stood for.
Brain Candy
May 16, 2005
The Critics: Books
Is pop culture dumbing us down or smartening us up?
1.
Twenty years ago, a political philosopher named James Flynn uncovered a curious fact. Americans—at least, as measured by I.Q. tests—were getting smarter. This fact had been obscured for years, because the people who give I.Q. tests continually recalibrate the scoring system to keep the average at 100. But if you took out the recalibration, Flynn found, I.Q. scores showed a steady upward trajectory, rising by about three points per decade, which means that a person whose I.Q. placed him in the top ten per cent of the American population in 1920 would today fall in the bottom third. Some of that effect, no doubt, is a simple by-product of economic progress: in the surge of prosperity during the middle part of the last century, people in the West became better fed, better educated, and more familiar with things like I.Q. tests. But, even as that wave of change has subsided, test scores have continued to rise—not just in America but all over the developed world. What's more, the increases have not been confined to children who go to enriched day-care centers and private schools. The middle part of the curve—the people who have supposedly been suffering from a deteriorating public-school system and a steady diet of lowest-common-denominator television and mindless pop music—has increased just as much. What on earth is happening? In the wonderfully entertaining "Everything Bad Is Good for You" (Riverhead; $23.95), Steven Johnson proposes that what is making us smarter is precisely what we thought was making us dumber: popular culture.
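The arithmetic behind that top-ten-per-cent-to-bottom-third claim is worth seeing. Here is a rough sketch in Python, under assumptions of my own rather than Flynn's or Johnson's: an I.Q. scale with a mean of 100 and a standard deviation of 15, and about eight and a half decades of three-point-per-decade gains between 1920 and the time Johnson was writing.

from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

cutoff_1920 = iq.inv_cdf(0.90)   # score needed for the top ten per cent: about 119
drift = 3 * 8.5                  # assumed cumulative gain since 1920: about 25 points
today = cutoff_1920 - drift      # the same raw performance, rescored against today's average

print(f"1920 top-ten-per-cent cutoff: {cutoff_1920:.0f}")
print(f"equivalent standing today: about the {iq.cdf(today) * 100:.0f}th percentile")

The result lands around the thirty-fourth percentile, roughly the bottom third, which is what Flynn's recalibrated scores imply.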
Johnson is the former editor of the online magazine Feed and the author of a number of books on science and technology. There is a pleasing eclecticism to his thinking. He is as happy analyzing "Finding Nemo" as he is dissecting the intricacies of a piece of software, and he's perfectly capable of using Nietzsche's notion of eternal recurrence to discuss the new creative rules of television shows. Johnson wants to understand popular culture—not in the postmodern, academic sense of wondering what "The Dukes of Hazzard" tells us about Southern male alienation but in the very practical sense of wondering what watching something like "The Dukes of Hazzard" does to the way our minds work.
As Johnson points out, television is very different now from what it was thirty years ago. It's harder. A typical episode of "Starsky and Hutch," in the nineteen-seventies, followed an essentially linear path: two characters, engaged in a single story line, moving toward a decisive conclusion. To watch an episode of "Dallas" today is to be stunned by its glacial pace—by the arduous attempts to establish social relationships, by the excruciating simplicity of the plotline, by how obvious it was. A single episode of "The Sopranos," by contrast, might follow five narrative threads, involving a dozen characters who weave in and out of the plot. Modern television also requires the viewer to do a lot of what Johnson calls "filling in," as in a "Seinfeld" episode that subtly parodies the Kennedy assassination conspiracists, or a typical "Simpsons" episode, which may contain numerous allusions to politics or cinema or pop culture. The extraordinary amount of money now being made in the television aftermarket—DVD sales and syndication—means that the creators of television shows now have an incentive to make programming that can sustain two or three or four viewings. Even reality shows like "Survivor," Johnson argues, engage the viewer in a way that television rarely has in the past:
When we watch these shows, the part of our brain that monitors the emotional lives of the people around us—the part that tracks subtle shifts in intonation and gesture and facial expression—scrutinizes the action on the screen, looking for clues. . . . The phrase "Monday-morning quarterbacking" was coined to describe the engaged feeling spectators have in relation to games as opposed to stories. We absorb stories, but we second-guess games. Reality programming has brought that second-guessing to prime time, only the game in question revolves around social dexterity rather than the physical kind.
How can the greater cognitive demands that television makes on us now, he wonders, not matter?
Johnson develops the same argument about video games. Most of the people who denounce video games, he says, haven't actually played them—at least, not recently. Twenty years ago, games like Tetris or Pac-Man were simple exercises in motor coördination and pattern recognition. Today's games belong to another realm. Johnson points out that one of the "walk-throughs" for "Grand Theft Auto III"—that is, the informal guides that break down the games and help players navigate their complexities—is fifty-three thousand words long, about the length of his book. The contemporary video game involves a fully realized imaginary world, dense with detail and levels of complexity.
Indeed, video games are not games in the sense of those pastimes—like Monopoly or gin rummy or chess—which most of us grew up with. They don't have a set of unambiguous rules that have to be learned and then followed during the course of play. This is why many of us find modern video games baffling: we're not used to being in a situation where we have to figure out what to do. We think we only have to learn how to press the buttons faster. But these games withhold critical information from the player. Players have to explore and sort through hypotheses in order to make sense of the game's environment, which is why a modern video game can take forty hours to complete. Far from being engines of instant gratification, as they are often described, video games are actually, Johnson writes, "all about delayed gratification—sometimes so long delayed that you wonder if the gratification is ever going to show."
At the same time, players are required to manage a dizzying array of information and options. The game presents the player with a series of puzzles, and you can't succeed at the game simply by solving the puzzles one at a time. You have to craft a longer-term strategy, in order to juggle and coördinate competing interests. In denigrating the video game, Johnson argues, we have confused it with other phenomena in teen-age life, like multitasking—simultaneously e-mailing and listening to music and talking on the telephone and surfing the Internet. Playing a video game is, in fact, an exercise in "constructing the proper hierarchy of tasks and moving through the tasks in the correct sequence," he writes. "It's about finding order and meaning in the world, and making decisions that help create that order."
2.
It doesn't seem right, of course, that watching "24" or playing a video game could be as important cognitively as reading a book. Isn't the extraordinary success of the "Harry Potter" novels better news for the culture than the equivalent success of "Grand Theft Auto III"? Johnson's response is to imagine what cultural critics might have said had video games been invented hundreds of years ago, and only recently had something called the book been marketed aggressively to children:
Reading books chronically understimulates the senses. Unlike the longstanding tradition of gameplaying—which engages the child in a vivid, three-dimensional world filled with moving images and musical sound-scapes, navigated and controlled with complex muscular movements—books are simply a barren string of words on the page. . . .
Books are also tragically isolating. While games have for many years engaged the young in complex social relationships with their peers, building and exploring worlds together, books force the child to sequester him or herself in a quiet space, shut off from interaction with other children. . . .
But perhaps the most dangerous property of these books is the fact that they follow a fixed linear path. You can't control their narratives in any fashion—you simply sit back and have the story dictated to you. . . . This risks instilling a general passivity in our children, making them feel as though they're powerless to change their circumstances. Reading is not an active, participatory process; it's a submissive one.
He's joking, of course, but only in part. The point is that books and video games represent two very different kinds of learning. When you read a biology textbook, the content of what you read is what matters. Reading is a form of explicit learning. When you play a video game, the value is in how it makes you think. Video games are an example of collateral learning, which is no less important.
Being "smart" involves facility in both kinds of thinking—the kind of fluid problem solving that matters in things like video games and I.Q. tests, but also the kind of crystallized knowledge that comes from explicit learning. If Johnson's book has a flaw, it is that he sometimes speaks of our culture being "smarter" when he's really referring just to that fluid problem-solving facility. When it comes to the other kind of intelligence, it is not clear at all what kind of progress we are making, as anyone who has read, say, the Gettysburg Address alongside any Presidential speech from the past twenty years can attest. The real question is what the right balance of these two forms of intelligence might look like. "Everything Bad Is Good for You" doesn't answer that question. But Johnson does something nearly as important, which is to remind us that we shouldn't fall into the trap of thinking that explicit learning is the only kind of learning that matters.
In recent years, for example, a number of elementary schools have phased out or reduced recess and replaced it with extra math or English instruction. This is the triumph of the explicit over the collateral. After all, recess is "play" for a ten-year-old in precisely the sense that Johnson describes video games as play for an adolescent: an unstructured environment that requires the child actively to intervene, to look for the hidden logic, to find order and meaning in chaos.
One of the ongoing debates in the educational community, similarly, is over the value of homework. Meta-analysis of hundreds of studies done on the effects of homework shows that the evidence supporting the practice is, at best, modest. Homework seems to be most useful in high school and for subjects like math. At the elementary-school level, homework seems to be of marginal or no academic value. Its effect on discipline and personal responsibility is unproved. And the causal relation between high-school homework and achievement is unclear: it hasn't been firmly established whether spending more time on homework in high school makes you a better student or whether better students, finding homework more pleasurable, spend more time doing it. So why, as a society, are we so enamored of homework? Perhaps because we have so little faith in the value of the things that children would otherwise be doing with their time. They could go out for a walk, and get some exercise; they could spend time with their peers, and reap the rewards of friendship. Or, Johnson suggests, they could be playing a video game, and giving their minds a rigorous workout.
The Moral Hazard Myth
August 29, 2005
Dept. of Public Policy
The bad idea behind our failed health-care system.
1.
Tooth decay begins, typically, when debris becomes trapped between the teeth and along the ridges and in the grooves of the molars. The food rots. It becomes colonized with bacteria. The bacteria feeds off sugars in the mouth and forms an acid that begins to eat away at the enamel of the teeth. Slowly, the bacteria works its way through to the dentin, the inner structure, and from there the cavity begins to blossom three-dimensionally, spreading inward and sideways. When the decay reaches the pulp tissue, the blood vessels, and the nerves that serve the tooth, the pain starts—an insistent throbbing. The tooth turns brown. It begins to lose its hard structure, to the point where a dentist can reach into a cavity with a hand instrument and scoop out the decay. At the base of the tooth, the bacteria mineralizes into tartar, which begins to irritate the gums. They become puffy and bright red and start to recede, leaving more and more of the tooth's root exposed. When the infection works its way down to the bone, the structure holding the tooth in begins to collapse altogether.
Several years ago, two Harvard researchers, Susan Starr Sered and Rushika Fernandopulle, set out to interview people without health-care coverage for a book they were writing, "Uninsured in America." They talked to as many kinds of people as they could find, collecting stories of untreated depression and struggling single mothers and chronically injured laborers—and the most common complaint they heard was about teeth. Gina, a hairdresser in Idaho, whose husband worked as a freight manager at a chain store, had "a peculiar mannerism of keeping her mouth closed even when speaking." It turned out that she hadn't been able to afford dental care for three years, and one of her front teeth was rotting. Daniel, a construction worker, pulled out his bad teeth with pliers. Then, there was Loretta, who worked nights at a university research center in Mississippi, and was missing most of her teeth. "They'll break off after a while, and then you just grab a hold of them, and they work their way out," she explained to Sered and Fernandopulle. "It hurts so bad, because the tooth aches. Then it's a relief just to get it out of there. The hole closes up itself anyway. So it's so much better."
People without health insurance have bad teeth because, if you're paying for everything out of your own pocket, going to the dentist for a checkup seems like a luxury. It isn't, of course. The loss of teeth makes eating fresh fruits and vegetables difficult, and a diet heavy in soft, processed foods exacerbates more serious health problems, like diabetes. The pain of tooth decay leads many people to use alcohol as a salve. And those struggling to get ahead in the job market quickly find that the unsightliness of bad teeth, and the self-consciousness that results, can become a major barrier. If your teeth are bad, you're not going to get a job as a receptionist, say, or a cashier. You're going to be put in the back somewhere, far from the public eye. What Loretta, Gina, and Daniel understand, the two authors tell us, is that bad teeth have come to be seen as a marker of "poor parenting, low educational achievement and slow or faulty intellectual development." They are an outward marker of caste. "Almost every time we asked interviewees what their first priority would be if the president established universal health coverage tomorrow," Sered and Fernandopulle write, "the immediate answer was 'my teeth.' "
The U.S. health-care system, according to "Uninsured in America," has created a group of people who increasingly look different from others and suffer in ways that others do not. The leading cause of personal bankruptcy in the United States is unpaid medical bills. Half of the uninsured owe money to hospitals, and a third are being pursued by collection agencies. Children without health insurance are less likely to receive medical attention for serious injuries, for recurrent ear infections, or for asthma. Lung-cancer patients without insurance are less likely to receive surgery, chemotherapy, or radiation treatment. Heart-attack victims without health insurance are less likely to receive angioplasty. People with pneumonia who don't have health insurance are less likely to receive X rays or consultations. The death rate in any given year for someone without health insurance is twenty-five per cent higher than for someone with insurance. Because the uninsured are sicker than the rest of us, they can't get better jobs, and because they can't get better jobs they can't afford health insurance, and because they can't afford health insurance they get even sicker. John, the manager of a bar in Idaho, tells Sered and Fernandopulle that as a result of various workplace injuries over the years he takes eight ibuprofen, waits two hours, then takes eight more—and tries to cadge as much prescription pain medication as he can from friends. "There are times when I should've gone to the doctor, but I couldn't afford to go because I don't have insurance," he says. "Like when my back messed up, I should've gone. If I had insurance, I would've went, because I know I could get treatment, but when you can't afford it you don't go. Because the harder the hole you get into in terms of bills, then you'll never get out. So you just say, 'I can deal with the pain.' "
2.
One of the great mysteries of political life in the United States is why Americans are so devoted to their health-care system. Six times in the past century—during the First World War, during the Depression, during the Truman and Johnson Administrations, in the Senate in the nineteen-seventies, and during the Clinton years—efforts have been made to introduce some kind of universal health insurance, and each time the efforts have been rejected. Instead, the United States has opted for a makeshift system of increasing complexity and dysfunction. Americans spend $5,267 per capita on health care every year, almost two and a half times the industrialized world's median of $2,193; the extra spending comes to hundreds of billions of dollars a year. What does that extra spending buy us? Americans have fewer doctors per capita than most Western countries. We go to the doctor less than people in other Western countries. We get admitted to the hospital less frequently than people in other Western countries. We are less satisfied with our health care than our counterparts in other countries. American life expectancy is lower than the Western average. Childhood-immunization rates in the United States are lower than average. Infant-mortality rates are in the nineteenth percentile of industrialized nations. Doctors here perform more high-end medical procedures, such as coronary angioplasties, than in other countries, but most of the wealthier Western countries have more CT scanners than the United States does, and Switzerland, Japan, Austria, and Finland all have more MRI machines per capita. Nor is our system more efficient. The United States spends more than a thousand dollars per capita per year—or close to four hundred billion dollars—on health-care-related paperwork and administration, whereas Canada, for example, spends only about three hundred dollars per capita. And, of course, every other country in the industrialized world insures all its citizens; despite those extra hundreds of billions of dollars we spend each year, we leave forty-five million people without any insurance. A country that displays an almost ruthless commitment to efficiency and performance in every aspect of its economy—a country that switched to Japanese cars the moment they were more reliable, and to Chinese T-shirts the moment they were five cents cheaper—has loyally stuck with a health-care system that leaves its citizenry pulling out their teeth with pliers.
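The magnitudes in that paragraph are easy to verify with a line or two of arithmetic. The per-capita figures are the ones quoted above; the U.S. population of roughly 295 million (circa 2005) is my own assumption for illustration.

# Per-capita figures quoted above; the population figure is an assumption.
us_per_capita = 5_267                 # dollars per year
industrialized_median = 2_193
population = 295_000_000              # rough U.S. population circa 2005

ratio = us_per_capita / industrialized_median                    # about 2.4
extra_total = (us_per_capita - industrialized_median) * population

print(f"{ratio:.1f}x the industrialized median")
print(f"extra spending: roughly ${extra_total / 1e9:.0f} billion a year")

# The paperwork figure works the same way: "more than a thousand dollars per
# capita" times the population gives a floor near $300 billion, in line with
# the "close to four hundred billion dollars" quoted above.
print(f"administrative floor: about ${1_000 * population / 1e9:.0f} billion a year")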
America's health-care mess is, in part, simply an accident of history. The fact that there have been six attempts at universal health coverage in the last century suggests that there has long been support for the idea. But politics has always got in the way. In both Europe and the United States, for example, the push for health insurance was led, in large part, by organized labor. But in Europe the unions worked through the political system, fighting for coverage for all citizens. From the start, health insurance in Europe was public and universal, and that created powerful political support for any attempt to expand benefits. In the United States, by contrast, the unions worked through the collective-bargaining system and, as a result, could win health benefits only for their own members. Health insurance here has always been private and selective, and every attempt to expand benefits has resulted in a paralyzing political battle over who would be added to insurance rolls and who ought to pay for those additions.
Policy is driven by more than politics, however. It is equally driven by ideas, and in the past few decades a particular idea has taken hold among prominent American economists which has also been a powerful impediment to the expansion of health insurance. The idea is known as "moral hazard." Health economists in other Western nations do not share this obsession. Nor do most Americans. But moral hazard has profoundly shaped the way think tanks formulate policy and the way experts argue and the way health insurers structure their plans and the way legislation and regulations have been written. The health-care mess isn't merely the unintentional result of political dysfunction, in other words. It is also the deliberate consequence of the way in which American policymakers have come to think about insurance.
"Moral hazard" is the term economists use to describe the fact that insurance can change the behavior of the person being insured. If your office gives you and your co-workers all the free Pepsi you want—if your employer, in effect, offers universal Pepsi insurance—you'll drink more Pepsi than you would have otherwise. If you have a no-deductible fire-insurance policy, you may be a little less diligent in clearing the brush away from your house. The savings-and-loan crisis of the nineteen-eighties was created, in large part, by the fact that the federal government insured savings deposits of up to a hundred thousand dollars, and so the newly deregulated S. & L.s made far riskier investments than they would have otherwise. Insurance can have the paradoxical effect of producing risky and wasteful behavior. Economists spend a great deal of time thinking about such moral hazard for good reason. Insurance is an attempt to make human life safer and more secure. But, if those efforts can backfire and produce riskier behavior, providing insurance becomes a much more complicated and problematic endeavor.
In 1968, the economist Mark Pauly argued that moral hazard played an enormous role in medicine, and, as John Nyman writes in his book "The Theory of the Demand for Health Insurance," Pauly's paper has become the "single most influential article in the health economics literature." Nyman, an economist at the University of Minnesota, says that the fear of moral hazard lies behind the thicket of co-payments and deductibles and utilization reviews which characterizes the American health-insurance system. Fear of moral hazard, Nyman writes, also explains "the general lack of enthusiasm by U.S. health economists for the expansion of health insurance coverage (for example, national health insurance or expanded Medicare benefits) in the U.S."
What Nyman is saying is that when your insurance company requires that you make a twenty-dollar co-payment for a visit to the doctor, or when your plan includes an annual five-hundred-dollar or thousand-dollar deductible, it's not simply an attempt to get you to pick up a larger share of your health costs. It is an attempt to make your use of the health-care system more efficient. Making you responsible for a share of the costs, the argument runs, will reduce moral hazard: you'll no longer grab one of those free Pepsis when you aren't really thirsty. That's also why Nyman says that the notion of moral hazard is behind the "lack of enthusiasm" for expansion of health insurance. If you think of insurance as producing wasteful consumption of medical services, then the fact that there are forty-five million Americans without health insurance is no longer an immediate cause for alarm. After all, it's not as if the uninsured never go to the doctor. They spend, on average, $934 a year on medical care. A moral-hazard theorist would say that they go to the doctor when they really have to. Those of us with private insurance, by contrast, consume $2,347 worth of health care a year. If a lot of that extra $1,413 is waste, then maybe the uninsured person is the truly efficient consumer of health care.
The moral-hazard argument makes sense, however, only if we consume health care in the same way that we consume other consumer goods, and to economists like Nyman this assumption is plainly absurd. We go to the doctor grudgingly, only because we're sick. "Moral hazard is overblown," the Princeton economist Uwe Reinhardt says. "You always hear that the demand for health care is unlimited. This is just not true. People who are very well insured, who are very rich, do you see them check into the hospital because it's free? Do people really like to go to the doctor? Do they check into the hospital instead of playing golf?"
For that matter, when you have to pay for your own health care, does your consumption really become more efficient? In the late nineteen-seventies, the RAND Corporation did an extensive study on the question, randomly assigning families to health plans with co-payment levels at zero per cent, twenty-five per cent, fifty per cent, or ninety-five per cent, up to six thousand dollars. As you might expect, the more that people were asked to chip in for their health care the less care they used. The problem was that they cut back equally on both frivolous care and useful care. Poor people in the high-deductible group with hypertension, for instance, didn't do nearly as good a job of controlling their blood pressure as those in other groups, resulting in a ten-per-cent increase in the likelihood of death. As a recent Commonwealth Fund study concluded, cost sharing is "a blunt instrument." Of course it is: how should the average consumer be expected to know beforehand what care is frivolous and what care is useful? I just went to the dermatologist to get moles checked for skin cancer. If I had had to pay a hundred per cent, or even fifty per cent, of the cost of the visit, I might not have gone. Would that have been a wise decision? I have no idea. But if one of those moles really is cancerous, that simple, inexpensive visit could save the health-care system tens of thousands of dollars (not to mention saving me a great deal of heartbreak). The focus on moral hazard suggests that the changes we make in our behavior when we have insurance are nearly always wasteful. Yet, when it comes to health care, many of the things we do only because we have insurance—like getting our moles checked, or getting our teeth cleaned regularly, or getting a mammogram or engaging in other routine preventive care—are anything but wasteful and inefficient. In fact, they are behaviors that could end up saving the health-care system a good deal of money.
Sered and Fernandopulle tell the story of Steve, a factory worker from northern Idaho, with a "grotesque-looking left hand—what looks like a bone sticks out the side." When he was younger, he broke his hand. "The doctor wanted to operate on it," he recalls. "And because I didn't have insurance, well, I was like 'I ain't gonna have it operated on.' The doctor said, 'Well, I can wrap it for you with an Ace bandage.' I said, 'Ahh, let's do that, then.' " Steve uses less health care than he would if he had insurance, but that's not because he has defeated the scourge of moral hazard. It's because instead of getting a broken bone fixed he put a bandage on it.
3.
At the center of the Bush Administration's plan to address the health-insurance mess are Health Savings Accounts, and Health Savings Accounts are exactly what you would come up with if you were concerned, above all else, with minimizing moral hazard. The logic behind them was laid out in the 2004 Economic Report of the President. Americans, the report argues, have too much health insurance: typical plans cover things that they shouldn't, creating the problem of overconsumption. Several paragraphs are then devoted to explaining the theory of moral hazard. The report turns to the subject of the uninsured, concluding that they fall into several groups. Some are foreigners who may be covered by their countries of origin. Some are people who could be covered by Medicaid but aren't or aren't admitting that they are. Finally, a large number "remain uninsured as a matter of choice." The report continues, "Researchers believe that as many as one-quarter of those without health insurance had coverage available through an employer but declined the coverage.... Still others may remain uninsured because they are young and healthy and do not see the need for insurance." In other words, those with health insurance are overinsured and their behavior is distorted by moral hazard. Those without health insurance use their own money to make decisions about insurance based on an assessment of their needs. The insured are wasteful. The uninsured are prudent. So what's the solution? Make the insured a little bit more like the uninsured.
Under the Health Savings Accounts system, consumers are asked to pay for routine health care with their own money—several thousand dollars of which can be put into a tax-free account. To handle their catastrophic expenses, they then purchase a basic health-insurance package with, say, a thousand-dollar annual deductible. As President Bush explained recently, "Health Savings Accounts all aim at empowering people to make decisions for themselves, owning their own health-care plan, and at the same time bringing some demand control into the cost of health care."
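A small sketch makes the structure concrete. The thousand-dollar deductible is the figure used above; the cost scenarios are hypothetical, and premiums, co-insurance, and the tax treatment of the account are all ignored.

def hsa_consumer_share(annual_costs: float, deductible: float = 1_000) -> float:
    """Under the arrangement sketched above, the consumer pays routine costs
    out of pocket (or from the tax-free account) up to the deductible; the
    catastrophic policy covers everything beyond it."""
    return min(annual_costs, deductible)

for costs in (200, 900, 5_000, 50_000):   # hypothetical scenarios
    print(f"medical costs ${costs:>6,} -> consumer pays ${hsa_consumer_share(costs):,.0f}")

All of the "demand control" falls on the first thousand dollars of spending, which is exactly the routine, discretionary care that the moral-hazard argument worries about.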
The country described in the President's report is a very different place from the country described in "Uninsured in America." Sered and Fernandopulle look at the billions we spend on medical care and wonder why Americans have so little insurance. The President's report considers the same situation and worries that we have too much. Sered and Fernandopulle see the lack of insurance as a problem of poverty; a third of the uninsured, after all, have incomes below the federal poverty line. In the section on the uninsured in the President's report, the word "poverty" is never used. In the Administration's view, people are offered insurance but "decline the coverage" as "a matter of choice." The uninsured in Sered and Fernandopulle's book decline coverage, but only because they can't afford it. Gina, for instance, works for a beauty salon that offers her a bare-bones health-insurance plan with a thousand-dollar deductible for two hundred dollars a month. What's her total income? Nine hundred dollars a month. She could "choose" to accept health insurance, but only if she chose to stop buying food or paying the rent.
The biggest difference between the two accounts, though, has to do with how each views the function of insurance. Gina, Steve, and Loretta are ill, and need insurance to cover the costs of getting better. In their eyes, insurance is meant to help equalize financial risk between the healthy and the sick. In the insurance business, this model of coverage is known as "social insurance," and historically it was the way health coverage was conceived. If you were sixty and had heart disease and diabetes, you didn't pay substantially more for coverage than a perfectly healthy twenty-five-year-old. Under social insurance, the twenty-five-year-old agrees to pay thousands of dollars in premiums even though he didn't go to the doctor at all in the previous year, because he wants to make sure that someone else will subsidize his health care if he ever comes down with heart disease or diabetes. Canada and Germany and Japan and all the other industrialized nations with universal health care follow the social-insurance model. Medicare, too, is based on the social-insurance model, and, when Americans with Medicare report themselves to be happier with virtually every aspect of their insurance coverage than people with private insurance (as they do, repeatedly and overwhelmingly), they are referring to the social aspect of their insurance. They aren't getting better care. But they are getting something just as valuable: the security of being insulated against the financial shock of serious illness.
There is another way to organize insurance, however, and that is to make it actuarial. Car insurance, for instance, is actuarial. How much you pay is in large part a function of your individual situation and history: someone who drives a sports car and has received twenty speeding tickets in the past two years pays a much higher annual premium than a soccer mom with a minivan. In recent years, the private insurance industry in the United States has been moving toward the actuarial model, with profound consequences. The triumph of the actuarial model over the social-insurance model is the reason that companies unlucky enough to employ older, high-cost employees—like United Airlines—have run into such financial difficulty. It's the reason that automakers are increasingly moving their operations to Canada. It's the reason that small businesses that have one or two employees with serious illnesses suddenly face unmanageably high health-insurance premiums, and it's the reason that, in many states, people suffering from a potentially high-cost medical condition can't get anyone to insure them at all.
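The contrast between the two pricing models can be reduced to a toy calculation. The expected-cost figures below are invented purely for illustration; real underwriting is far more elaborate.

# Hypothetical expected annual medical costs, for illustration only.
expected_costs = {
    "healthy twenty-five-year-old": 500,
    "sixty-year-old with heart disease and diabetes": 9_500,
}

# Social insurance: everyone pays the pooled average, so the healthy
# subsidize the sick.
pooled_premium = sum(expected_costs.values()) / len(expected_costs)

# Actuarial insurance: each premium tracks the individual's own expected cost.
for person, cost in expected_costs.items():
    print(f"{person}: social ${pooled_premium:,.0f} vs. actuarial ${cost:,.0f}")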
Health Savings Accounts represent the final, irrevocable step in the actuarial direction. If you are preoccupied with moral hazard, then you want people to pay for care with their own money, and, when you do that, the sick inevitably end up paying more than the healthy. And when you make people choose an insurance plan that fits their individual needs, those with significant medical problems will choose expensive health plans that cover lots of things, while those with few health problems will choose cheaper, bare-bones plans. The more expensive the comprehensive plans become, and the less expensive the bare-bones plans become, the more the very sick will cluster together at one end of the insurance spectrum, and the more the well will cluster together at the low-cost end. The days when the healthy twenty-five-year-old subsidizes the sixty-year-old with heart disease or diabetes are coming to an end. "The main effect of putting more of it on the consumer is to reduce the social redistributive element of insurance," the Stanford economist Victor Fuchs says. Health Savings Accounts are not a variant of universal health care. In their governing assumptions, they are the antithesis of universal health care.
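The clustering Fuchs describes can be sketched as a simple iteration: each year the comprehensive plan's premium is set to the average cost of the people who chose it the year before, and each person then picks whichever option is cheaper for them. Every number below is invented; this is a cartoon of the dynamic, not an actuarial model.

# Hypothetical population: expected annual medical costs, for illustration.
people = [500] * 80 + [9_500] * 20        # mostly healthy, a few very sick

comprehensive = set(range(len(people)))   # start with everyone in one pool
bare_bones_premium = 300                  # flat plan that covers almost nothing
premium_history = []

for year in range(5):
    enrolled = [people[i] for i in comprehensive] or [0]
    comp_premium = sum(enrolled) / len(enrolled)   # average cost of enrollees
    premium_history.append(round(comp_premium))

    # Each person keeps the comprehensive plan only if it beats going
    # bare-bones and paying their own expected costs.
    comprehensive = {i for i, cost in enumerate(people)
                     if comp_premium < bare_bones_premium + cost}

print(premium_history)   # e.g. [2300, 9500, 9500, 9500, 9500]: the pool unravels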
The issue about what to do with the health-care system is sometimes presented as a technical argument about the merits of one kind of coverage over another or as an ideological argument about socialized versus private medicine. It is, instead, about a few very simple questions. Do you think that this kind of redistribution of risk is a good idea? Do you think that people whose genes predispose them to depression or cancer, or whose poverty complicates asthma or diabetes, or who get hit by a drunk driver, or who have to keep their mouths closed because their teeth are rotting ought to bear a greater share of the costs of their health care than those of us who are lucky enough to escape such misfortunes? In the rest of the industrialized world, it is assumed that the more equally and widely the burdens of illness are shared, the better off the population as a whole is likely to be. The reason the United States has forty-five million people without coverage is that its health-care policy is in the hands of people who disagree, and who regard health insurance not as the solution but as the problem.
The Bakeoff
September 5, 2005
Annals of Technology
Project Delta aims to create the perfect cookie.
1.
Steve Gundrum launched Project Delta at a small dinner last fall at Il Fornaio, in Burlingame, just down the road from the San Francisco Airport. It wasn't the first time he'd been to Il Fornaio, and he made his selection quickly, with just a glance at the menu; he is the sort of person who might have thought about his choice in advance — maybe even that morning, while shaving. He would have posed it to himself as a question — Ravioli alla Lucana?—and turned it over in his mind, assembling and disassembling the dish, ingredient by ingredient, as if it were a model airplane. Did the Pecorino pepato really belong? What if you dropped the basil? What would the ravioli taste like if you froze it, along with the ricotta and the Parmesan, and tried to sell it in the supermarket? And then what would you do about the fennel?
Gundrum is short and round. He has dark hair and a mustache and speaks with the flattened vowels of the upper Midwest. He is voluble and excitable and doggedly unpretentious, to the point that your best chance of seeing him in a suit is probably Halloween. He runs Mattson, one of the country's foremost food research-and-development firms, which is situated in a low-slung concrete-and-glass building in a nondescript office park in Silicon Valley. Gundrum's office is a spare, windowless room near the rear, and all day long white-coated technicians come to him with prototypes in little bowls, or on skewers, or in Tupperware containers. His job is to taste and advise, and the most common words out of his mouth are "I have an idea." Just that afternoon, Gundrum had ruled on the reformulation of a popular spinach dip (which had an unfortunate tendency to smell like lawn clippings) and examined the latest iteration of a low-carb kettle corn for evidence of rhythmic munching (the metronomic hand-to-mouth cycle that lies at the heart of any successful snack experience). Mattson created the shelf-stable Mrs. Fields Chocolate Chip Cookie, the new Boca Burger products for Kraft Foods, Orville Redenbacher's Butter Toffee Popcorn Clusters, and so many other products that it is impossible to walk down the aisle of a supermarket and not be surrounded by evidence of the company's handiwork.
That evening, Gundrum had invited two of his senior colleagues at Mattson — Samson Hsia and Carol Borba — to dinner, along with Steven Addis, who runs a prominent branding firm in the Bay Area. They sat around an oblong table off to one side of the dining room, with the sun streaming in the window, and Gundrum informed them that he intended to reinvent the cookie, to make something both nutritious and as "indulgent" as the premium cookies on the supermarket shelf. "We want to delight people," he said. "We don't want some ultra-high-nutrition power bar, where you have to rationalize your consumption." He said it again: "We want to delight people."
As everyone at the table knew, a healthful, good-tasting cookie is something of a contradiction. A cookie represents the combination of three unhealthful ingredients—sugar, white flour, and shortening. The sugar adds sweetness, bulk, and texture: along with baking powder, it produces the tiny cell structures that make baked goods light and fluffy. The fat helps carry the flavor. If you want a big hit of vanilla, or that chocolate taste that really blooms in the nasal cavities, you need fat. It also keeps the strands of gluten in the flour from getting too tightly bound together, so that the cookie stays chewable. The flour, of course, gives the batter its structure, and, with the sugar, provides the base for the browning reaction that occurs during baking. You could replace the standard white flour with wheat flour, which is higher in fibre, but fibre adds grittiness. Over the years, there have been many attempts to resolve these contradictions — from Snackwells and diet Oreos to the dry, grainy hockey pucks that pass for cookies in health-food stores — but in every case flavor or fluffiness or tenderness has been compromised. Steve Gundrum was undeterred. He told his colleagues that he wanted Project Delta to create the world's greatest cookie. He wanted to do it in six months. He wanted to enlist the biggest players in the American food industry. And how would he come up with this wonder cookie? The old-fashioned way. He wanted to hold a bakeoff.
2.
The standard protocol for inventing something in the food industry is called the matrix model. There is a department for product development, which comes up with a new idea, and a department for process development, which figures out how to realize it, and then, down the line, departments for packing, quality assurance, regulatory affairs, chemistry, microbiology, and so on. In a conventional bakeoff, Gundrum would have pitted three identical matrixes against one another and compared the results. But he wasn't satisfied with the unexamined assumption behind the conventional bakeoff — that there was just one way of inventing something new.
Gundrum had a particular interest, as it happened, in software. He had read widely about it, and once, when he ran into Steve Jobs at an Apple store in the Valley, chatted with him for forty-five minutes on technical matters relating to the Apple operating system. He saw little difference between what he did for a living and what the software engineers in the surrounding hills of Silicon Valley did. "Lines of code are no different from a recipe," he explains. "It's the same thing. You add a little salt, and it tastes better. You write a little piece of code, and it makes the software work faster." But in the software world, Gundrum knew, there were ongoing debates about the best way to come up with new code.
On the one hand, there was the "open source" movement. Its patron saint was Linus Torvalds, the Finnish hacker who decided to build a free version of Unix, the hugely complicated operating system that runs many of the world's large computers. Torvalds created the basic implementation of his version, which he called Linux, posted it online, and invited people to contribute to its development. Over the years, thousands of programmers had helped, and Linux was now considered as good as proprietary versions of Unix. "Given enough eyeballs, all bugs are shallow" was the Linux mantra: a thousand people working for an hour each can do a better job writing and fixing code than a single person working for a thousand hours, because the chances are that among those thousand people you can find precisely the right expert for every problem that comes up.
On the other hand, there was the "extreme programming" movement, known as XP, which was led by a legendary programmer named Kent Beck. He called for breaking a problem into the smallest possible increments, and proceeding as simply and modestly as possible. He thought that programmers should work in pairs, two to a computer, passing the keyboard back and forth. Between Beck and Torvald were countless other people, arguing for slightly different variations. But everyone in the software world agreed that trying to get people to be as creative as possible was, as often as not, a social problem: it depended not just on who was on the team but on how the team was organized.
"I remember once I was working with a printing company in Chicago," Beck says. "The people there were having a terrible problem with their technology. I got there, and I saw that the senior people had these corner offices, and they were working separately and doing things separately that they had trouble integrating later on. So I said, 'Find a space where you can work together.' So they found a corner of the machine room. It was a raised floor, ice cold. They just loved it. They would go there five hours a day, making lots of progress. I flew home. They hired me for my technical expertise. And I told them to rearrange the office furniture, and that was the most valuable thing I could offer them."
It seemed to Gundrum that people in the food world had a great deal to learn from all this. They had become adept at solving what he called "science projects" — problems that required straightforward, linear applications of expensive German machinery and armies of white-coated people with advanced degrees in engineering. Cool Whip was a good example: a product processed so exquisitely — with air bubbles of such fantastic uniformity and stability — that it remains structurally sound for months, at high elevation and at low elevation, frozen and thawed and then refrozen. But coming up with a healthy cookie, which required finessing the inherent contradictions posed by sugar, flour, and shortening, was the kind of problem that the food industry had more trouble with. Gundrum recalled one brainstorming session that a client of his, a major food company, had convened. "This is no joke," he said. "They played a tape where it sounded like the wind was blowing and the birds were chirping. And they posed us out on a dance floor, and we had to hold our arms out like we were trees and close our eyes, and the ideas were supposed to grow like fruits off the limbs of the trees. Next to me was the head of R. & D., and he looked at me and said: 'What the hell are we doing here?'"
For Project Delta, Gundrum decreed that there would be three teams, each representing a different methodology of invention. He had read Kent Beck's writings, and decided that the first would be the XP team. He enlisted two of Mattson's brightest young associates — Peter Dea and Dan Howell. Dea is a food scientist, who worked as a confectioner before coming to Mattson. He is tall and spare, with short dark hair. "Peter is really good at hitting the high note," Gundrum said. "If a product needs to have a particular flavor profile, he's really good at getting that one dimension and getting it right." Howell is a culinarian, goateed and talkative, a man of enthusiasms who uses high-end Mattson equipment to make an exceptional cup of espresso every afternoon. He started his career as a barista at Starbucks, and then realized that his vocation lay elsewhere. "A customer said to me, 'What do you want to be doing? Because you clearly don't want to be here,'" Howell said. "I told him, 'I want to be sitting in a room working on a better non-fat pudding.' "
The second team was headed by Barb Stuckey, an executive vice-president of marketing at Mattson and one of the firm's stars. She is slender and sleek, with short blond hair. She tends to think out loud, and, because she thinks quickly, she ends up talking quickly, too, in nervous, brilliant bursts. Stuckey, Gundrum decided, would represent "managed" research and development—a traditional hierarchical team, as opposed to a partnership like Dea and Howell's. She would work with Doug Berg, who runs one of Mattson's product-development teams. Stuckey would draw the big picture. Berg would serve as sounding board and project director. His team would execute their conceptions.
Then Gundrum was at a technology conference in California and heard the software pioneer Mitch Kapor talking about the open-source revolution. Afterward, Gundrum approached Kapor. "I said to Mitch, 'What do you think? Can I apply this—some of the same principles—outside of software and bring it to the food industry?'" Gundrum recounted. "He stopped and said, 'Why the hell not!'" So Gundrum invited an élite group of food-industry bakers and scientists to collaborate online. They would be the third team. He signed up a senior person from Mars, Inc., someone from R. & D. at Kraft, the marketing manager for Nestlé Toll House refrigerated/frozen cookie dough, a senior director of R. & D. at Birds Eye Foods, the head of the innovation program for Kellogg's Morning Foods, the director of seasoning at McCormick, a cookie maven formerly at Keebler, and six more high-level specialists. Mattson's innovation manager, Carol Borba, who began her career as a line cook at Bouley, in Manhattan, was given the role of project manager. Two Mattson staffers were assigned to carry out the group's recommendations. This was the Dream Team. It is quite possible that this was the most talented group of people ever to work together in the history of the food industry.
Soon after the launch of Project Delta, Steve Gundrum and his colleague Samson Hsia were standing around, talking about the current products in the supermarket which they particularly admire. "I like the Uncrustable line from Smuckers," Hsia said. "It's a frozen sandwich without any crust. It eats very well. You can put it in a lunchbox frozen, and it will be unfrozen by lunchtime." Hsia is a trim, silver-haired man who is said to know as much about emulsions as anyone in the business. "There's something else," he said, suddenly. "We just saw it last week. It's made by Jennie-O. It's turkey in a bag." This was a turkey that was seasoned, plumped with brine, and sold in a heat-resistant plastic bag: the customer simply has to place it in the oven. Hsia began to stride toward the Mattson kitchens, because he realized they actually had a Jennie-O turkey in the back. Gundrum followed, the two men weaving their way through the maze of corridors that make up the Mattson offices. They came to a large freezer. Gundrum pulled out a bright-colored bag. Inside was a second, clear bag, and inside that bag was a twelve-pound turkey. "This is one of my favorite innovations of the last year," Gundrum said, as Hsia nodded happily. "There is material science involved. There is food science involved. There is positioning involved. You can take this thing, throw it in your oven, and people will be blown away. It's that good. If I was Butterball, I'd be terrified."
Jennie-O had taken something old and made it new. But where had that idea come from? Was it a team? A committee? A lone turkey genius? Those of us whose only interaction with such innovations is at the point of sale have a naïve faith in human creativity; we suppose that a world capable of coming up with turkey in a bag is capable of coming up with the next big thing as well—a healthy cookie, a faster computer chip, an automobile engine that gets a hundred miles to the gallon. But if you're the one responsible for those bright new ideas there is no such certainty. You come up with one great idea, and the process is so miraculous that all you do is puzzle over how on earth you ever did it, and worry whether you'll ever be able to do it again.
3.
The Mattson kitchens are a series of large, connecting rooms, running along the back of the building. There is a pilot plant in one corner — containing a mini version of the equipment that, say, Heinz would use to make canned soup, a soft-serve ice-cream machine, an industrial-strength pasta-maker, a colloid mill for making oil-and-water emulsions, a flash pasteurizer, and an eighty-five-thousand-dollar Japanese-made coextruder for, among other things, pastry-and-filling combinations. At any given time, the firm may have as many as fifty or sixty projects under way, so the kitchens are a hive of activity, with pressure cookers filled with baked beans bubbling in one corner, and someone rushing from one room to another carrying a tray of pizza slices with experimental toppings.
Dea and Howell, the XP team, took over part of one of the kitchens, setting up at a long stainless-steel lab bench. The countertop was crowded with tins of flour, a big white plastic container of wheat dextrin, a dozen bottles of liquid sweeteners, two plastic bottles of Kirkland olive oil, and, somewhat puzzlingly, three varieties of single-malt Scotch. The Project Delta brief was simple. All cookies had to have fewer than a hundred and thirty calories per serving. Carbohydrates had to be under 17.5 grams, saturated fat under two grams, fibre more than one gram, protein more than two grams, and so on; in other words, the cookie was to be at least fifteen per cent superior to the supermarket average in the major nutritional categories. To Dea and Howell, that suggested oatmeal, and crispy, as opposed to soft. "I've tried lots of cookies that are sold as soft and I never like them, because they're trying to be something that they're not," Dea explained. "A soft cookie is a fresh cookie, and what you are trying to do with soft is be a fresh cookie that's a month old. And that means you need to fake the freshness, to engineer the cookie."
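Stated as code, the brief is just a set of per-serving limits. The numbers below are the ones quoted in the paragraph above; the sample prototype is invented, and this is only a sketch of the kind of check a developer might run, not Mattson's actual tooling.

# Per-serving limits from the Project Delta brief described above.
LIMITS = {
    "calories":  ("under", 130),
    "carbs_g":   ("under", 17.5),
    "sat_fat_g": ("under", 2.0),
    "fibre_g":   ("over", 1.0),
    "protein_g": ("over", 2.0),
}

def meets_brief(cookie: dict) -> bool:
    """Return True only if every nutrient satisfies its limit."""
    for nutrient, (direction, limit) in LIMITS.items():
        value = cookie[nutrient]
        if direction == "under" and not value < limit:
            return False
        if direction == "over" and not value > limit:
            return False
    return True

# A hypothetical prototype, for illustration.
prototype = {"calories": 128, "carbs_g": 17.0, "sat_fat_g": 1.5,
             "fibre_g": 1.8, "protein_g": 2.4}
print(meets_brief(prototype))   # True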
The two decided to focus on a kind of oatmeal-chocolate-chip hybrid, with liberal applications of roasted soy nuts, toffee, and caramel. A straight oatmeal-raisin cookie or a straight low-cal chocolate-chip cookie was out of the question. This was a reflection of what might be called the Hidden Valley Ranch principle, in honor of a story that Samson Hsia often told about his years working on salad dressing when he was at Clorox. The couple who owned Hidden Valley Ranch, near Santa Barbara, had come up with a seasoning blend of salt, pepper, onion, garlic, and parsley flakes that was mixed with equal parts mayonnaise and buttermilk to make what was, by all accounts, an extraordinary dressing. Clorox tried to bottle it, but found that the buttermilk could not coexist, over any period of time, with the mayonnaise. The way to fix the problem, and preserve the texture, was to make the combination more acidic. But when you increased the acidity you ruined the flavor. Clorox's food engineers worked on Hidden Valley Ranch dressing for close to a decade. They tried different kinds of processing and stability control and endless cycles of consumer testing before they gave up and simply came out with a high-acid Hidden Valley Ranch dressing — which promptly became a runaway best-seller. Why? Because consumers had never tasted real Hidden Valley Ranch dressing, and as a result had no way of knowing that what they were eating was inferior to the original. For those in the food business, the lesson was unforgettable: if something was new, it didn't have to be perfect. And, since healthful, indulgent cookies couldn't be perfect, they had to be new: hence oatmeal, chocolate chips, toffee, and caramel.
Cookie development, at the Mattson level, is a matter of endless iteration, and Dea and Howell began by baking version after version in quick succession — establishing the cookie size, the optimal baking time, the desired variety of chocolate chips, the cut of oats (bulk oats? rolled oats? groats?), the varieties of flour, and the toffee dosage, while testing a variety of high-tech supplements, notably inulin, a fibre source derived from chicory root. As they worked, they made notes on tablet P.C.s, which gave them a running electronic record of each version. "With food, there's a large circle of pretty good, and we're solidly in pretty good," Dea announced, after several intensive days of baking. A tray of cookies was cooling in front of him on the counter. "Typically, that's when you take it to the customers."
In this case, the customer was Gundrum, and the next week Howell marched over to Gundrum's office with two Ziploc bags of cookies in his hand. There was a package of Chips Ahoy! on the table, and Howell took one out. "We've been eating these versus Chips Ahoy!," he said.
The two cookies looked remarkably alike. Gundrum tried one of each. "The Chips Ahoy!, it's tasty," he said. "When you eat it, the starch hydrates in your mouth. The XP doesn't have that same granulated-sugar kind of mouth feel."
"It's got more fat than us, though, and subsequently it's shorter in texture," Howell said. "And so, when you break it, it breaks more nicely. Ours is a little harder to break."
By "shorter in texture," he meant that the cookie "popped" when you bit into it. Saturated fats are solid fats, and give a cookie crispness. Parmesan cheese is short-textured. Brie is long. A shortbread like a Lorna Doone is a classic short-textured cookie. But the XP cookie had, for health reasons, substituted unsaturated fats for saturated fats, and unsaturated fats are liquid. They make the dough stickier, and inevitably compromise a little of that satisfying pop.
"The whole-wheat flour makes us a little grittier, too," Howell went on. "It has larger particulates." He broke open one of the Chips Ahoy!. "See how fine the grain is? Now look at one of our cookies. The particulates are larger. It is part of what we lose by going with a healthy profile. If it was just sugar and ¦our, for instance, the carbohydrate chains are going to be shorter, and so they will dissolve more quickly in your mouth. Whereas with more fibre you get longer carbohydrate chains and they don't dissolve as quickly, and you get that slightly tooth-packing feel."
"It looks very wholesome, like something you would want to feed your kids," Gundrum said, finally. They were still only in the realm of pretty good.
4.
Team Stuckey, meanwhile, was having problems of its own. Barb Stuckey's first thought had been a tea cookie, or, more specifically, a chai cookie — something with cardamom and cinnamon and vanilla and cloves and a soft dairy note. Doug Berg was dispatched to run the experiment. He and his team did three or four rounds of prototypes. The result was a cookie that tasted, astonishingly, like a cup of chai, which was, of course, its problem. Who wanted a cookie that tasted like a cup of chai? Stuckey called a meeting in the Mattson trophy room, where samples of every Mattson product that has made it to market are displayed. After everyone was done tasting the cookies, a bag of them sat in the middle of the table for forty-five minutes—and no one reached to take a second bite. It was a bad sign.
"You know, before the election Good Housekeeping had this cookie bakeoff," Stuckey said, as the meeting ended. "Laura Bush's entry was full of chocolate chips and had familiar ingredients. And Teresa Heinz went with pumpkin-spice cookies. I remember thinking, That's just like the Democrats! So not mainstream! I wanted her to win. But she's chosen this cookie that's funky and weird and out of the box. And I kind of feel the same way about the tea cookie. It's too far out, and will lose to something that's more comfortable for consumers."
Stuckey's next thought involved strawberries and a shortbread base. But shortbread was virtually impossible under the nutritional guidelines: there was no way to get that smooth butter-flour-sugar combination. So Team Stuckey switched to something closer to a strawberry-cobbler cookie, which had the Hidden Valley Ranch advantage that no one knew what a strawberry-cobbler cookie was supposed to taste like. Getting the carbohydrates down to the required 17.5 grams, though, was a struggle, because of how much flour and fruit cobbler requires. The obvious choice to replace the flour was almonds. But nuts have high levels of both saturated and unsaturated fat. "It became a balancing act," Anne Cristofano, who was doing the bench work for Team Stuckey, said. She baked batch after batch, playing the carbohydrates (first the flour, and then granulated sugar, and finally various kinds of what are called sugar alcohols, low-calorie sweeteners derived from hydrogenating starch) against the almonds. Cristofano took a version to Stuckey. It didn't go well.
"We're not getting enough strawberry impact from the fruit alone," Stuckey said. "We have to find some way to boost the strawberry." She nibbled some more. "And, because of the low fat and all that stuff, I don't feel like we're getting that pop."
The Dream Team, by any measure, was the overwhelming Project Delta favorite. This was, after all, the Dream Team, and if any idea is ingrained in our thinking it is that the best way to solve a difficult problem is to bring the maximum amount of expertise to bear on it. Sure enough, in the early going the Dream Team was on fire. The members of the Dream Team did not doggedly fix on a single idea, like Dea and Howell, or move in fits and starts from chai sugar cookies to strawberry shortbread to strawberry cobbler, like Team Stuckey. It came up with thirty-four ideas, representing an astonishing range of cookie philosophies: a chocolate cookie with gourmet cocoa, high-end chocolate chips, pecans, raisins, Irish steel-cut oats, and the new Ultragrain White Whole Wheat flour; a bite-size oatmeal cookie with a Ceylon cinnamon filling, or chili and tamarind, or pieces of dried peaches with a cinnamon-and-ginger dusting; the classic seven-layer bar with oatmeal instead of graham crackers, coated in chocolate with a choice of coffee flavors; a "wellness" cookie, with an oatmeal base, soy and whey proteins, inulin and oat beta glucan and a combination of erythritol and sugar and sterol esters—and so on.
In the course of spewing out all those new ideas, however, the Dream Team took a difficult turn. A man named J. Hugh McEvoy (a.k.a. Chef J.), out of Chicago, tried to take control of the discussion. He wanted something exotic — not a health-food version of something already out there. But in the e-mail discussions with others on the team his sense of what constituted exotic began to get really exotic — "Chinese star anise plus fennel plus Pastis plus dark chocolate." Others, emboldened by his example, began talking about a possible role for zucchini or wasabi peas. Meanwhile, a more conservative faction, mindful of the Project Delta mandate to appeal to the whole family, started talking up peanut butter. Within a few days, the tensions were obvious:
From: Chef J.
Subject: <no subject>
Please keep in mind that less than 10 years ago, espresso, latte and dulce de leche were EXOTIC flavors / products that were considered unsuitable for the mainstream. And let's not even mention CHIPOTLE.
From: Andy Smith
Subject: Bought any Ben and Jerry's recently?
While we may not want to invent another Oreo or Chips Ahoy!, last I looked, World's Best Vanilla was B&J's # 2 selling flavor and Haagen Dazs' Vanilla (their top seller) outsold Dulce 3 to 1.
From: Chef J.
Subject: <no subject>
Yes. Gourmet Vanilla does outsell any new flavor. But we must remember that DIET vanilla does not and never has. It is the high end, gourmet segment of ice cream that is growing. Diet Oreos were vastly outsold by new entries like Snackwells. Diet Snickers were vastly outsold by new entries like balance bars. New Coke failed miserably, while Red Bull is still growing.
What flavor IS Red Bull, anyway?
Eventually, Carol Borba, the Dream Team project leader, asked Gundrum whether she should try to calm things down. He told her no; the group had to find its "own kind of natural rhythm." He wanted to know what fifteen high-powered bakers thrown together on a project felt like, and the answer was that they felt like chaos. They took twice as long as the XP team. They created ten times the headache.
Worse, no one in the open-source group seemed to be having any fun. "Quite honestly, I was expecting a bit more involvement in this," Howard Plein, of Edlong Dairy Flavors, confessed afterward. "They said, expect to spend half an hour a day. But without doing actual bench work — all we were asked to do was to come up with ideas." He wanted to bake: he didn't enjoy being one of fifteen cogs in a machine. To Dan Fletcher, of Kellogg's, "the whole thing spun in place for a long time. I got frustrated with that. The number of people involved seemed unwieldy. You want some diversity of youth and experience, but you want to keep it close-knit as well. You get some depth in the process versus breadth. We were a mile wide and an inch deep." Chef J., meanwhile, felt thwarted by Carol Borba; he felt that she was pushing her favorite, a caramel turtle, to the detriment of better ideas. "We had the best people in the country involved," he says. "We were irrelevant. That's the weakness of it. Fifteen is too many. How much true input can any one person have when you are lost in the crowd?" In the end, the Dream Team whittled down its thirty-four possibilities to one: a chewy oatmeal cookie, with a pecan "thumbprint" in the middle, and ribbons of caramel-and-chocolate glaze. When Gundrum tasted it, he had nothing but praise for its "cookie hedonics." But a number of the team members were plainly unhappy with the choice. "It is not bad," Chef J. said. "But not bad doesn't win in the food business. There was nothing there that you couldn't walk into a supermarket and see on the shelf. Any Pepperidge Farm product is better than that. Any one."
It may have been a fine cookie. But, since no single person played a central role in its creation, it didn't seem to anyone to be a fine cookie.
The strength of the Dream Team — the fact that it had so many smart people on it — was also its weakness: it had too many smart people on it. Size provides expertise. But it also creates friction, and one of the truths Project Delta exposed is that we tend to overestimate the importance of expertise and underestimate the problem of friction. Gary Klein, a decision-making consultant, once examined this issue in depth at a nuclear power plant in North Carolina. In the nineteen-nineties, the power supply used to keep the reactor cool malfunctioned. The plant had to shut down in a hurry, and the shutdown went badly. So the managers brought in Klein's consulting group to observe as they ran through one of the crisis rehearsals mandated by federal regulators. "The drill lasted four hours," David Klinger, the lead consultant on the project, recalled. "It was in this big operations room, and there were between eighty and eighty-five people involved. We roamed around, and we set up a video camera, because we wanted to make sense of what was happening."
When the consultants asked people what was going on, though, they couldn't get any satisfactory answers. "Each person only knew a little piece of the puzzle, like the radiation person knew where the radiation was, or the maintenance person would say, 'I'm trying to get this valve closed,' " Klinger said. "No one had the big picture. We started to ask questions. We said, 'What is your mission?' And if the person didn't have one, we said, 'Get out.' There were just too many people. We ended up getting that team down from eighty-five to thirty-five people, and the first thing that happened was that the noise in the room was dramatically reduced." The room was quiet and calm enough so that people could easily find those they needed to talk to. "At the very end, they had a big drill that the N.R.C. was going to regulate. The regulators said it was one of their hardest drills. And you know what? They aced it." Was the plant's management team smarter with thirty-five people on it than it was with eighty-five? Of course not, but the expertise of those additional fifty people was more than cancelled out by the extra confusion and noise they created.
The open-source movement has had the same problem. The number of people involved can result in enormous friction. The software theorist Joel Spolsky points out that open-source software tends to have user interfaces that are difficult for ordinary people to use: "With Microsoft Windows, you right-click on a folder, and you're given the option to share that folder over the Web. To do the same thing with Apache, the open-source Web server, you've got to track down a file that has a different name and is stored in a different place on every system. Then you have to edit it, and it has its own syntax and its own little programming language, and there are lots of different comments, and you edit it the first time and it doesn't work and then you edit it the second time and it doesn't work."
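To make the friction Spolsky describes concrete, here is a minimal sketch, in Python, of the manual step he has in mind: locating the Apache configuration file and appending a block, in Apache's own syntax, that exposes a folder over the Web. The file path, folder, and URL prefix below are illustrative assumptions rather than details from Spolsky's account; the Alias and Directory directives are standard Apache configuration, and a real server would still need to be reloaded afterward.

    # A sketch of the edit Spolsky describes: sharing a folder with Apache means
    # finding a configuration file and appending directives in Apache's own syntax.
    # The paths and names below are illustrative assumptions; they vary by system.

    CONF_PATH = "/etc/apache2/apache2.conf"  # often named and located differently
    SHARED_DIR = "/home/alice/public"        # hypothetical folder to expose
    URL_PREFIX = "/shared"                   # hypothetical URL it should appear under

    # Standard Apache 2.4 directives for mapping a URL prefix to a directory.
    snippet = f"""
    Alias {URL_PREFIX} {SHARED_DIR}
    <Directory {SHARED_DIR}>
        Options Indexes
        Require all granted
    </Directory>
    """

    def share_folder():
        # Append the block to the configuration file; Apache must then be reloaded
        # (for example, with "apachectl graceful") before the folder is served.
        with open(CONF_PATH, "a") as conf:
            conf.write(snippet)

    if __name__ == "__main__":
        share_folder()

Even this sketch understates the point: on each system the file has a different name and location, and a syntax error anywhere in it can keep the server from starting, which is exactly the first-time-it-doesn't-work experience Spolsky is describing.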
Because there are so many individual voices involved in an open-source project, no one can agree on the right way to do things. And, because no one can agree, every possible option is built into the software, thereby frustrating the central goal of good design, which is, after all, to understand what to leave out. Spolsky notes that almost all the successful open-source products have been attempts to clone some preexisting software program, like Microsoft's Internet Explorer, or Unix. "One of the reasons open source works well for Linux is that there isn't any real design work to be undertaken," he says. "They were doing what we would call chasing tail-lights." Open source was great for a science project, in which the goals were clearly defined and the technical hurdles easily identifiable. Had Project Delta been a Cool Whip bakeoff, an exercise in chasing tail-lights, the Dream Team would easily have won. But if you want to design a truly innovative software program — or a truly innovative cookie — the costs of bigness can become overwhelming.
In the frantic final weeks before the bakeoff, while the Dream Team was trying to fix a problem with crumbling, and hit on the idea of glazing the pecan on the face of the cookie, Dea and Howell continued to make steady, incremental improvements.
"These cookies were baked five days ago," Howell told Gundrum, as he handed him a Ziploc bag. Dea was off somewhere in the Midwest, meeting with clients, and Howell looked apprehensive, stroking his goatee nervously as he stood by Gundrum's desk. "We used wheat dextrin, which I think gives us some crispiness advantages and some shelf-stability advantages. We have a little more vanilla in this round, which gives you that brown, rounding background note."
Gundrum nodded. "The vanilla is almost like a surrogate for sugar," he said. "It potentiates the sweetness."
"Last time, the leavening system was baking soda and baking powder," Howell went on. "I switched that to baking soda and monocalcium phosphate. That helps them rise a little bit better. And we baked them at a slightly higher temperature for slightly longer, so that we drove off a little bit more moisture."
"How close are you?" Gundrum asked.
"Very close," Howell replied.
Gundrum was lost in thought for a moment. "It looks very wholesome. It looks like something you'd want to feed your kids. It has very good aroma. I really like the texture. My guess is that it eats very well with milk." He turned back to Howell, suddenly solicitous. "Do you want some milk?"
Meanwhile, Barb Stuckey had a revelation. She was working on a tortilla-chip project, and had bags of tortilla chips all over her desk. "You have no idea how much engineering goes into those things," she said, holding up a tortilla chip. "It's greater than what it takes to build a bridge. It's crazy." And one of the clever things about cheese tortilla chips—particularly the low-fat versions—is how they go about distracting the palate. "You know how you put a chip in your mouth and the minute it hits your tongue it explodes with flavor?" Stuckey said. "It's because it's got this topical seasoning. It's got dried cheese powders and sugar and probably M.S.G. and all that other stuff on the outside of the chip."
Her idea was to apply that technique to strawberry cobbler—to take large crystals of sugar, plate them with citric acid, and dust the cookies with them. "The minute they reach your tongue, you get this sweet-and-sour hit, and then you crunch into the cookie and get the rest—the strawberry and the oats," she said. The crystals threw off your taste buds. You weren't focussed on the fact that there was half as much fat in the cookie as there should be. Plus, the citric acid brought a tangy flavor to the dried strawberries: suddenly they felt fresh.
Batches of the new strawberry-cobbler prototype were ordered up, with different formulations of the citric acid and the crystals. A meeting was called in the trophy room. Anne Cristofano brought two plastic bags filled with cookies. Stuckey was there, as was a senior Mattson food technologist named Karen Smithson, an outsider brought to the meeting in an advisory role. Smithson, a former pastry chef, was a little older than Stuckey and Cristofano, with an air of self-possession. She broke the seal on the first bag, and took a bite with her eyes half closed. The other two watched intently.
"Umm," Smithson said, after the briefest of pauses. "That is pretty darn good. And this is one of the healthy cookies? I would not say, 'This is healthy.' I can't taste the trade-off." She looked up at Stuckey. "How old are they?"
"Today," Stuckey replied.
"O.K. . . ." This was a complicating fact. Any cookie tastes good on the day it's baked. The question was how it tasted after baking and packaging and shipping and sitting in a warehouse and on a supermarket shelf and finally in someone's cupboard.
"What we're trying to do here is a shelf-stable cookie that will last six months," Stuckey said. "I think we're better off if we can make it crispy."
Smithson thought for a moment. "You can have either a crispy, low-moisture cookie or a soft and chewy cookie," she said. "But you can't get the outside crisp and the inside chewy. We know that. The moisture will migrate. It will equilibrate over time, so you end up with a cookie that's consistent all the way through. Remember we did all that work on Mrs. Fields? That's what we learned."
They talked for a bit, in technical terms, about various kinds of sugars and starches. Smithson didn't think that the stability issue was going to be a problem.
"Isn't it compelling, visually?" Stuckey blurted out, after a lull in the conversation. And it was: the dried-strawberry chunks broke though the surface of the cookie, and the tiny citric-sugar crystals glinted in the light. "I just think you get so much more bang for the buck when you put the seasoning on the outside."
"Yet it's not weird," Smithson said, nodding. She picked up another cookie. "The mouth feel is a combination of chewy and crunchy. With the flavors, you have the caramelized sugar, the brown-sugar notes. You have a little bit of a chew from the oats. You have a flavor from the strawberry, and it helps to have a combination of the sugar alcohol and the brown sugar. You know, sugars have different deliveries, and sometimes you get some of the sweetness right off and some of it continues on. You notice that a lot with the artificial sweeteners. You get the sweetness that doesn't go away, long after the other flavors are gone. With this one, the sweetness is nice. The flavors come together at the same time and fade at the same time, and then you have the little bright after-hits from the fruit and the citric crunchies, which are" — she paused, looking for the right word — "brilliant."
5.
The bakeoff took place in April. Mattson selected a representative sample of nearly three hundred households from around the country. Each was mailed bubble-wrapped packages containing all three entrants. The vote was close but unequivocal. Fourteen per cent of the households voted for the XP oatmeal-chocolate-chip cookie. Forty-one per cent voted for the Dream Team's oatmeal-caramel cookie. Forty-four per cent voted for Team Stuckey's strawberry cobbler.
The Project Delta postmortem was held at Chaya Brasserie, a French-Asian fusion restaurant on the Embarcadero, in San Francisco. It was just Gundrum and Steven Addis, from the first Project Delta dinner, and their wives. Dan Howell was immersed in a confidential project for a big food conglomerate back East. Peter Dea was working with Cargill on a wellness product. Carol Borba was in Chicago, at a meeting of the Food Marketing Institute. Barb Stuckey was helping Ringling Brothers rethink the food at its concessions. "We've learned a lot about the circus," Gundrum said. Meanwhile, Addis's firm had created a logo and a brand name for Project Delta. Mattson has offered to license the winning cookie at no cost, as long as a percentage of its sales goes to a charitable foundation that Mattson has set up to feed the hungry. Someday soon, you should be able to go into a supermarket and buy Team Stuckey's strawberry-cobbler cookie.
"Which one would you have voted for?" Addis asked Gundrum.
"I have to say, they were all good in their own way," Gundrum replied. It was like asking a mother which of her children she liked best. "I thought Barb's cookie was a little too sweet, and I wish the open-source cookie was a little tighter, less crumbly. With XP, I think we would have done better, but we had a wardrobe malfunction. They used too much batter, overbaked it, and the cookie came out too hard and thick.
"In the end, it was not so much which cookie won that interested him. It was who won—and why. Three people from his own shop had beaten a Dream Team, and the decisive edge had come not from the collective wisdom of a large group but from one person's ability to make a lateral connection between two previously unconnected objects — a tortilla chip and a cookie. Was that just Barb being Barb? In large part, yes. But it was hard to believe that one of the Dream Team members would not have made the same kind of leap had they been in an environment quiet enough to allow them to think.
"Do you know what else we learned?" Gundrum said. He was talking about a questionnaire given to the voters. "We were looking at the open-ended questions — where all the families who voted could tell us what they were thinking. They all said the same thing — all of them." His eyes grew wide. "They wanted better granola bars and breakfast bars. I would not have expected that." He fell silent for a moment, turning a granola bar over and around in his mind, assembling and disassembling it piece by piece, as if it were a model airplane. "I thought that they were pretty good," he said. "I mean, there are so many of them out there. But apparently people want them better."
The Cellular Church
September 12, 2005
Letter From Saddleback
How Rick Warren built his ministry.
1.
On the occasion of the twenty-fifth anniversary of Saddleback Church, Rick Warren hired the Anaheim Angels' baseball stadium. He wanted to address his entire congregation at once, and there was no way to fit everyone in at Saddleback, where the crowds are spread across services held over the course of an entire weekend. So Warren booked the stadium and printed large, silver-black-and-white tickets, and, on a sunny Sunday morning last April, the tens of thousands of congregants of one of America's largest churches began to file into the stands. They were wearing shorts and T-shirts and buying Cokes and hamburgers from the concession stands, if they had not already tailgated in the parking lot. On the field, a rock band played loudly and enthusiastically. Just after one o'clock, a voice came over the public-address system—"RIIIICK WARRRREN"—and Warren bounded onto the stage, wearing black slacks, a red linen guayabera shirt, and wraparound NASCAR sunglasses. The congregants leaped to their feet. "You know," Warren said, grabbing the microphone, "there are two things I've always wanted to do in a stadium." He turned his body sideways, playing an imaginary guitar, and belted out the first few lines of Jimi Hendrix's "Purple Haze." His image was up on the Jumbotrons in right and left fields, just below the Verizon and Pepsi and Budweiser logos. He stopped and grinned. "The other thing is, I want to do a wave!" He pointed to the bleachers, and then to the right-field seats, and around and around the stadium the congregation rose and fell, in four full circuits. "You are the most amazing church in America!" Warren shouted out, when they had finally finished. "AND I LOVE YOU!"
2.
Rick Warren is a large man, with a generous stomach. He has short, spiky hair and a goatee. He looks like an ex-athlete, or someone who might have many tattoos. He is a hugger, enfolding those he meets in his long arms and saying things like "Hey, man." According to Warren, from sixth grade through college there wasn't a day in his life that he wasn't president of something, and that makes sense, because he's always the one at the center of the room talking or laughing, with his head tilted way back, or crying, which he does freely. In the evangelical tradition, preachers are hard or soft. Billy Graham, with his piercing eyes and protruding chin and Bible clenched close to his chest, is hard. So was Martin Luther King, Jr., who overwhelmed his audience with his sonorous, forcefully enunciated cadences. Warren is soft. His sermons are conversational, delivered in a folksy, raspy voice. He talks about how he loves Krispy Kreme doughnuts, drives a four-year-old Ford, and favors loud Hawaiian shirts, even at the pulpit, because, he says, "they do not itch."
In December of 1979, when Warren was twenty-five years old, he and his wife, Kay, took their four-month-old baby and drove in a U-Haul from Texas to Saddleback Valley, in Orange County, because Warren had read that it was one of the fastest-growing counties in the country. He walked into the first real-estate office he found and introduced himself to the first agent he saw, a man named Don Dale. He was looking for somewhere to live, he said.
"Do you have any money to rent a house?" Dale asked.
"Not much, but we can borrow some," Warren replied.
"Do you have a job?"
"No. I don't have a job."
"What do you do for a living?"
"I'm a minister."
"So you have a church?"
"Not yet."
Dale found him an apartment that very day, of course: Warren is one of those people whose lives have an irresistible forward momentum. In the car on the way over, he recruited Dale as the first member of his still nonexistent church, of course. And when he held his first public service, three months later, he stood up in front of two hundred and five people he barely knew in a high-school gymnasium—this shiny-faced preacher fresh out of seminary—and told them that one day soon their new church would number twenty thousand people and occupy a campus of fifty acres. Today, Saddleback Church has twenty thousand members and occupies a campus of a hundred and twenty acres. Once, Warren wanted to increase the number of small groups at Saddleback—the groups of six or seven that meet for prayer and fellowship during the week—by three hundred. He went home and prayed and, as he tells it, God said to him that what he really needed to do was increase the number of small groups by three thousand, which is just what he did. Then, a few years ago, he wrote a book called "The Purpose-Driven Life," a genre of book that is known in the religious-publishing business as "Christian Living," and that typically sells thirty or forty thousand copies a year. Warren's publishers came to see him at Saddleback, and sat on the long leather couch in his office, and talked about their ideas for the book. "You guys don't understand," Warren told them. "This is a hundred-million-copy book." Warren remembers stunned silence: "Their jaws dropped." But now, nearly three years after its publication, "The Purpose-Driven Life" has sold twenty-three million copies. It is among the best-selling nonfiction hardcover books in American history. Neither the New York Times, the Los Angeles Times, nor the Washington Post has reviewed it. Warren's own publisher didn't see it coming. Only Warren had faith. "The best of the evangelical tradition is that you don't plan your way forward—you prophesy your way forward," the theologian Leonard Sweet says. "Rick's prophesying his way forward."
Not long after the Anaheim service, Warren went back to his office on the Saddleback campus. He put his feet up on the coffee table. On the wall in front of him were framed originals of the sermons of the nineteenth-century preacher Charles Spurgeon, and on the bookshelf next to him was his collection of hot sauces. "I had dinner with Jack Welch last Sunday night," he said. "He came to church, and we had dinner. I've been kind of mentoring him on his spiritual journey. And he said to me, 'Rick, you are the biggest thinker I have ever met in my life. The only other person I know who thinks globally like you is Rupert Murdoch.' And I said, 'That's interesting. I'm Rupert's pastor! Rupert published my book!'" Then he tilted back his head and gave one of those big Rick Warren laughs.
3.
Churches, like any large voluntary organization, have at their core a contradiction. In order to attract newcomers, they must have low barriers to entry. They must be unintimidating, friendly, and compatible with the culture they are a part of. In order to retain their membership, however, they need to have an identity distinct from that culture. They need to give their followers a sense of community—and community, exclusivity, a distinct identity are all, inevitably, casualties of growth. As an economist would say, the bigger an organization becomes, the greater a free-rider problem it has. If I go to a church with five hundred members, in a magnificent cathedral, with spectacular services and music, why should I volunteer or donate any substantial share of my money? What kind of peer pressure is there in a congregation that large? If the barriers to entry become too low—and the ties among members become increasingly tenuous—then a church as it grows bigger becomes weaker.
One solution to the problem is simply not to grow, and, historically, churches have sacrificed size for community. But there is another approach: to create a church out of a network of lots of little church cells—exclusive, tightly knit groups of six or seven who meet in one another's homes during the week to worship and pray. The small group as an instrument of community is initially how Communism spread, and in the postwar years Alcoholics Anonymous and its twelve-step progeny perfected the small-group technique. The small group did not have a designated leader who stood at the front of the room. Members sat in a circle. The focus was on discussion and interaction—not one person teaching and the others listening—and the remarkable thing about these groups was their power. An alcoholic could lose his job and his family, he could be hospitalized, he could be warned by half a dozen doctors—and go on drinking. But put him in a room of his peers once a week—make him share the burdens of others and have his burdens shared by others—and he could do something that once seemed impossible.
When churches—in particular, the megachurches that became the engine of the evangelical movement, in the nineteen-seventies and eighties—began to adopt the cellular model, they found out the same thing. The small group was an extraordinary vehicle of commitment. It was personal and flexible. It cost nothing. It was convenient, and every worshipper was able to find a small group that precisely matched his or her interests. Today, at least forty million Americans are in a religiously based small group, and the growing ranks of small-group membership have caused a profound shift in the nature of the American religious experience.
"As I see it, one of the most unfortunate misunderstandings of our time has been to think of small intentional communities as groups 'within' the church," the philosopher Dick Westley writes in one of the many books celebrating the rise of small-group power. "When are we going to have the courage to publicly proclaim what everyone with any experience with small groups has known all along: they are not organizations 'within' the church; they are church."
Ram Cnaan, a professor of social work at the University of Pennsylvania, recently estimated the replacement value of the charitable work done by the average American church—that is, the amount of money it would take to equal the time, money, and resources donated to the community by a typical congregation—and found that it came to about a hundred and forty thousand dollars a year. In the city of Philadelphia, for example, that works out to an annual total of two hundred and fifty million dollars' worth of community "good"; on a national scale, the contribution of religious groups to the public welfare is, as Cnaan puts it, "staggering." In the past twenty years, as the enthusiasm for publicly supported welfare has waned, churches have quietly and steadily stepped in to fill the gaps. And who are the churchgoers donating all that time and money? People in small groups. Membership in a small group is a better predictor of whether people volunteer or give money than how often they attend church, whether they pray, whether they've had a deep religious experience, or whether they were raised in a Christian home. Social action is not a consequence of belief, in other words. I don't give because I believe in religious charity. I give because I belong to a social structure that enforces an ethic of giving. "Small groups are networks," the Princeton sociologist Robert Wuthnow, who has studied the phenomenon closely, says. "They create bonds among people. Expose people to needs, provide opportunities for volunteering, and put people in harm's way of being asked to volunteer. That's not to say that being there for worship is not important. But, even in earlier research, I was finding that if people say all the right things about being a believer but aren't involved in some kind of physical social setting that generates interaction, they are just not as likely to volunteer."
Rick Warren came to the Saddleback Valley just as the small-group movement was taking off. He was the son of a preacher—a man who started seven churches in and around Northern California and was enough of a carpenter to have built a few dozen more with his own hands—and he wanted to do what his father had done: start a church from scratch.
For the first three months, he went from door to door in the neighborhood around his house, asking people why they didn't attend church. Churches were boring and irrelevant to everyday life, he was told. They were unfriendly to visitors. They were too interested in money. They had inadequate children's programs. So Warren decided that in his new church people would play and sing contemporary music, not hymns. (He could find no one, Warren likes to say, who listened to organ music in the car.) He would wear the casual clothes of his community. The sermons would be practical and funny and plainspoken, and he would use video and drama to illustrate his message. And when an actual church was finally built—Saddleback used seventy-nine different locations in its first thirteen years, from high-school auditoriums to movie theatres and then tents before building a permanent home—the church would not look churchy: no pews, or stained glass, or lofty spires. Saddleback looks like a college campus, and the main sanctuary looks like the school gymnasium. Parking is plentiful. The chairs are comfortable. There are loudspeakers and television screens everywhere broadcasting the worship service, and all the doors are open, so anyone can slip in or out, at any time, in the anonymity of the enormous crowds. Saddleback is a church with very low barriers to entry.
But beneath the surface is a network of thousands of committed small groups. "Orange County is virtually a desert in social-capital terms," the Harvard political scientist Robert Putnam, who has taken a close look at the Saddleback success story, says. "The rate of mobility is really high. It has long and anonymous commutes. It's a very friendless place, and this church offers serious heavy friendship. It's a very interesting experience to talk to some of those groups. There were these eight people and they were all mountain bikers—mountain bikers for God. They go biking together, and they are one another's best friends. If one person's wife gets breast cancer, he can go to the others for support. If someone loses a job, the others are there for him. They are deeply best friends, in a larger social context where it is hard to find a best friend."
Putnam goes on, "Warren didn't invent the cellular church. But he's brought it to an amazing level of effectiveness. The real job of running Saddleback is the recruitment and training and retention of the thousands of volunteer leaders for all the small groups it has. That's the surprising thing to me—that they are able to manage that. Those small groups are incredibly vulnerable, and complicated to manage. How to keep all those little dinghies moving in the same direction is, organizationally, a major accomplishment."
At Saddleback, members are expected to tithe, and to volunteer. Sunday-school teachers receive special training and a police background check. Recently, Warren decided that Saddleback would feed every homeless person in Orange County three meals a day for forty days. Ninety-two hundred people volunteered. Two million pounds of food were collected, sorted, and distributed.
It may be easy to start going to Saddleback. But it is not easy to stay at Saddleback. "Last Sunday, we took a special offering called Extend the Vision, for people to give over and above their normal offering," Warren said. "We decided we would not use any financial consultants, no high-powered gimmicks, no thermometer on the wall. It was just 'Folks, you know you need to give.' Sunday's offering was seven million dollars in cash and fifty-three million dollars in commitments. That's one Sunday. The average commitment was fifteen thousand dollars a family. That's in addition to their tithe. When people say megachurches are shallow, I say you have no idea. These people are committed."
Warren's great talent is organizational. He's not a theological innovator. When he went from door to door, twenty-five years ago, he wasn't testing variants on the Christian message. As far as he was concerned, the content of his message was non-negotiable. Theologically, Warren is a straight-down-the-middle evangelical. What he wanted to learn was how to construct an effective religious institution. His interest was sociological. Putnam compares Warren to entrepreneurs like Ray Kroc and Sam Walton, pioneers not in what they sold but in how they sold. The contemporary thinker Warren cites most often in conversation is the management guru Peter Drucker, who has been a close friend of his for years. Before Warren wrote "The Purpose-Driven Life," he wrote a book called "The Purpose-Driven Church," which was essentially a how-to guide for church builders. He's run hundreds of training seminars around the world for ministers of small-to-medium-sized churches. At the beginning of the Internet boom, he created a Web site called pastors.com, on which he posted his sermons for sale for four dollars each. There were many pastors in the world, he reasoned, who were part time. They had a second, nine-to-five job and families of their own, and what little free time they had was spent ministering to their congregation. Why not help them out with Sunday morning? The Web site now gets nearly four hundred thousand hits a day.
"I went to South Africa two years ago," Warren said. "We did the purpose-driven-church training, and we simulcast it to ninety thousand pastors across Africa. After it was over, I said, 'Take me out to a village and show me some churches.'"
In the first village they went to, the local pastor came out, saw Warren, and said, "I know who you are. You're Pastor Rick."
"And I said, 'How do you know who I am?' " Warren recalled. "He said, 'I get your sermons every week.' And I said, 'How? You don't even have electricity here.' And he said, 'We're putting the Internet in every post office in South Africa. Once a week, I walk an hour and a half down to the post office. I download it. Then I teach it. You are the only training I have ever received.'"
A typical evangelist, of course, would tell stories about reaching ordinary people, the unsaved laity. But a typical evangelist is someone who goes from town to town, giving sermons to large crowds, or preaching to a broad audience on television. Warren has never pastored any congregation but Saddleback, and he refuses to preach on television, because that would put him in direct competition with the local pastors he has spent the past twenty years cultivating. In the argot of the New Economy, most evangelists follow a business-to-consumer model: b-to-c. Warren follows a business-to-business model: b-to-b. He reaches the people who reach people. He's a builder of religious networks. "I once heard Drucker say this," Warren said. "'Warren is not building a tent revival ministry, like the old-style evangelists. He's building an army, like the Jesuits.'"
4.
To write "The Purpose-Driven Life," Warren holed up in an office in a corner of the Saddleback campus, twelve hours a day for seven months. "I would get up at four-thirty, arrive at my special office at five, and I would write from five to five," he said. "I'm a people person, and it about killed me to be alone by my-self. By eleven-thirty, my A.D.D. would kick in. I would do anything not to be there. It was like birthing a baby." The book didn't tell any stories. It wasn't based on any groundbreaking new research or theory or theological insight. "I'm just not that good a writer," Warren said. "I'm a pastor. There's nothing new in this book. But sometimes as I was writing it I would break down in tears. I would be weeping, and I would feel like God was using me."
The book begins with an inscription: "This book is dedicated to you. Before you were born, God planned this moment in your life. It is no accident that you are holding this book. God longs for you to discover the life he created you to live—here on earth, and forever in eternity." Five sections follow, each detailing one of God's purposes in our lives—"You Were Planned for God's Pleasure"; "You Were Formed for God's Family"; "You Were Created to Become Like Christ"; "You Were Shaped for Serving God"; "You Were Made for a Mission"—and each of the sections, in turn, is divided into short chapters ("Understanding Your Shape" or "Using What God Gave You" or "How Real Servants Act"). The writing is simple and unadorned. The scriptural interpretation is literal: "Noah had never seen rain, because prior to the Flood, God irrigated the earth from the ground up." The religious vision is uncomplicated and accepting: "God wants to be your best friend." Warren's Christianity, like his church, has low barriers to entry: "Wherever you are reading this, I invite you to bow your head and quietly whisper the prayer that will change your eternity. Jesus, I believe in you and I receive you. Go ahead. If you sincerely meant that prayer, congratulations! Welcome to the family of God! You are now ready to discover and start living God's purpose for your life."
It is tempting to interpret the book's message as a kind of New Age self-help theology. Warren's God is not awesome or angry and does not stand in judgment of human sin. He's genial and mellow. "Warren's God 'wants to be your best friend,' and this means, in turn, that God's most daunting property, the exercise of eternal judgment, is strategically downsized," the critic Chris Lehmann writes, echoing a common complaint:
"When Warren turns his utility-minded feel-speak upon the symbolic iconography of the faith, the results are offensively bathetic: "When Jesus stretched his arms wide on the cross, he was saying, 'I love you this much.' " But God needs to be at a greater remove than a group hug."
The self-help genre, however, is fundamentally inward-focussed. M. Scott Peck's "The Road Less Traveled"—the only spiritual work that, in terms of sales, can even come close to "The Purpose-Driven Life"—begins with the sentence "Life is difficult." That's a self-help book: it focusses the reader on his own experience. Warren's first sentence, by contrast, is "It's not about you," which puts it in the spirit of traditional Christian devotional literature, which focusses the reader outward, toward God. In look and feel, in fact, "The Purpose-Driven Life" is less twenty-first-century Orange County than it is the nineteenth century of Warren's hero, the English evangelist Charles Spurgeon. Spurgeon was the Warren of his day: the pastor of a large church in London, and the author of best-selling devotional books. On Sunday, good Christians could go and hear Spurgeon preach at the Metropolitan Tabernacle. But during the week they needed something to replace the preacher, and so Spurgeon, in one of his best-known books, "Morning and Evening," wrote seven hundred and thirty-two short homilies, to be read in the morning and the evening of each day of the year. The homilies are not complex investigations of theology. They are opportunities for spiritual reflection. (Sample Spurgeonism: "Every child of God is where God has placed him for some purpose, and the practical use of this first point is to lead you to inquire for what practical purpose has God placed each one of you where you now are." Sound familiar?) The Oxford Times described one of Spurgeon's books as "a rich store of topics treated daintily, with broad humour, with quaint good sense, yet always with a subdued tone and high moral aim," and that describes "The Purpose-Driven Life" as well. It's a spiritual companion. And, like "Morning and Evening," it is less a book than a program. It's divided into forty chapters, to be read during "Forty Days of Purpose." The first page of the book is called "My Covenant." It reads, "With God's help, I commit the next 40 days of my life to discovering God's purpose for my life."
Warren departs from Spurgeon, though, in his emphasis on the purpose-driven life as a collective experience. Below the boxed covenant is a space for not one signature but three: "Your name," "Partner's name," and then Rick Warren's signature, already printed, followed by a quotation from Ecclesiastes 4:9:
"Two are better off than one, because together they can work more effectively. If one of them falls down, the other can help him up. . . . Two people can resist an attack that would defeat one person alone. A rope made of three cords is hard to break."
"The Purpose-Driven Life" is meant to be read in groups. If the vision of faith sometimes seems skimpy, that's because the book is supposed to be supplemented by a layer of discussion and reflection and debate. It is a testament to Warren's intuitive understanding of how small groups work that this is precisely how "The Purpose-Driven Life" has been used. It spread along the network that he has spent his career putting together, not from person to person but from group to group. It presold five hundred thousand copies. It averaged more than half a million copies in sales a month in its first two years, which is possible only when a book is being bought in lots of fifty or a hundred or two hundred. Of those who bought the book as individuals, nearly half have bought more than one copy, sixteen per cent have bought four to six copies, and seven per cent have bought ten or more. Twenty-five thousand churches have now participated in the congregation-wide "40 Days of Purpose" campaign, as have hundreds of small groups within companies and organizations, from the N.B.A. to the United States Postal Service.
"I remember the first time I met Rick," says Scott Bolinder, the head of Zondervan, the Christian publishing division of HarperCollins and the publisher of "The Purpose-Driven Life." "He was telling me about pastors.com. This is during the height of the dot-com boom. I was thinking, What's your angle? He had no angle. He said, 'I love pastors. I know what they go through.' I said, 'What do you put on there?' He said, 'I put my sermons with a little disclaimer on there: "You are welcome to preach it any way you can. I only ask one thing—I ask that you do it better than I did."' So then fast-forward seven years: he's got hundreds of thousands of pastors who come to this Web site. And he goes, 'By the way, my church and I are getting ready to do forty days of purpose. If you want to join us, I'm going to preach through this and put my sermons up. And I've arranged with my publisher that if you do join us with this campaign they will sell the book to you for a low price.' That became the tipping point—being able to launch that book with eleven hundred churches, right from the get-go. They became the evangelists for the book."
The book's high-water mark came earlier this year, when a fugitive named Brian Nichols, who had shot and killed four people in an Atlanta courthouse, accosted a young single mother, Ashley Smith, outside her apartment, and held her captive in her home for seven hours.
"I asked him if I could read," Smith said at the press conference after her ordeal was over, and so she went and got her copy of "The Purpose-Driven Life" and turned to the chapter she was reading that day. It was Chapter 33, "How Real Servants Act." It begins:
"We serve God by serving others.
The world defines greatness in terms of power, possessions, prestige, and position. If you can demand service from others, you've arrived. In our self-serving culture with its me-first mentality, acting like a servant is not a popular concept.
Jesus, however, measured greatness in terms of service, not status. God determines your greatness by how many people you serve, not how many people serve you."
Nichols listened and said, "Stop. Will you read it again?"
Smith read it to him again. They talked throughout the night. She made him pancakes. "I said, 'Do you believe in miracles? Because if you don't believe in miracles — you are here for a reason. You're here in my apartment for some reason.' " She might as well have been quoting from "The Purpose-Driven Life." She went on, "You don't think you're supposed to be sitting here right in front of me listening to me tell you, you know, your reason for being here?" When morning came, Nichols let her go.
Hollywood could not have scripted a better testimonial for "The Purpose-Driven Life." Warren's sales soared further. But the real lesson of that improbable story is that it wasn't improbable at all. What are the odds that a young Christian—a woman who, it turns out, sends her daughter to Hebron Church, in Dacula, Georgia—isn't reading "The Purpose-Driven Life"? And is it surprising that Ashley Smith would feel compelled to read aloud from the book to her captor, and that, in the discussion that followed, Nichols would come to some larger perspective on his situation? She and Nichols were in a small group, and reading aloud from "The Purpose-Driven Life" is what small groups do.
5.
Not long ago, the sociologist Christian Smith decided to find out what American evangelicals mean when they say that they believe in a "Christian America." The phrase seems to suggest that evangelicals intend to erode the separation of church and state. But when Smith asked a representative sample of evangelicals to explain the meaning of the phrase, the most frequent explanation was that America was founded by people who sought religious liberty and worked to establish religious freedom. The second most frequent explanation offered was that a majority of Americans of earlier generations were sincere Christians, which, as Smith points out, is empirically true. Others said what they meant by a Christian nation was that the basic laws of American government reflected Christian principles—which sounds potentially theocratic, except that when Smith asked his respondents to specify what they meant by basic laws they came up with representative government and the balance of powers.
"In other words," Smith writes, "the belief that America was once a Christian nation does not necessarily mean a commitment to making it a 'Christian' nation today, whatever that might mean. Some evangelicals do make this connection explicitly. But many discuss America's Christian heritage as a simple fact of history that they are not particularly interested in or optimistic about reclaiming. Further, some evangelicals think America never was a Christian nation; some think it still is; and others think it should not be a Christian nation, whether or not it was so in the past or is now."
As Smith explored one issue after another with the evangelicals—gender equality, education, pluralism, and politics—he found the same scattershot pattern. The Republican Party may have been adept at winning the support of evangelical voters, but that affinity appears to be as much cultural as anything; the Party has learned to speak the evangelical language. Scratch the surface, and the appearance of homogeneity and ideological consistency disappears. Evangelicals want children to have the right to pray in school, for example, and they vote for conservative Republicans who support that right. But what do they mean by prayer? The New Testament's most left-liberal text, the Lord's Prayer—which, it should be pointed out, begins with a call for utopian social restructuring ("Thy will be done, On earth as it is in Heaven"), then welfare relief ("Give us this day our daily bread"), and then income redistribution ("Forgive us our debts as we also have forgiven our debtors"). The evangelical movement isn't a movement, if you take movements to be characterized by a coherent philosophy, and that's hardly surprising when you think of the role that small groups have come to play in the evangelical religious experience. The answers that Smith got to his questions are the kind of answers you would expect from people who think most deeply about their faith and its implications on Tuesday night, or Wednesday, with five or six of their closest friends, and not Sunday morning, in the controlling hands of a pastor.
"Small groups cultivate spirituality, but it is a particular kind of spirituality," Robert Wuthnow writes. "They cannot be expected to nurture faith in the same way that years of theological study, meditation and reflection might." He says, "They provide ways of putting faith in practice. For the most part, their focus is on practical applications, not on abstract knowledge, or even on ideas for the sake of ideas themselves."
We are so accustomed to judging a social movement by its ideological coherence that the vagueness at the heart of evangelicalism sounds like a shortcoming. Peter Drucker calls Warren's network an army, like the Jesuits. But the Jesuits marched in lockstep and held to an all-encompassing and centrally controlled creed. The members of Warren's network don't all dress the same, and they march to the tune only of their own small group, and they agree, fundamentally, only on who the enemy is. It's not an army. It's an insurgency.
In the wake of the extraordinary success of "The Purpose-Driven Life," Warren says, he underwent a period of soul-searching. He had suddenly been given enormous wealth and influence and he did not know what he was supposed to do with it. "God led me to Psalm 72, which is Solomon's prayer for more influence," Warren says. "It sounds pretty selfish. Solomon is already the wisest and wealthiest man in the world. He's the King of Israel at the apex of its glory. And in that psalm he says, 'God, I want you to make me more powerful and influential.' It looks selfish until he says, 'So that the King may support the widow and orphan, care for the poor, defend the defenseless, speak up for the immigrant, the foreigner, be a friend to those in prison.' Out of that psalm, God said to me that the purpose of influence is to speak up for those who have no influence. That changed my life. I had to repent. I said, I'm sorry, widows and orphans have not been on my radar. I live in Orange County. I live in the Saddleback Valley, which is all gated communities. There aren't any homeless people around. They are thirteen miles away, in Santa Ana, not here." He gestured toward the rolling green hills outside. "I started reading through Scripture. I said, How did I miss the two thousand verses on the poor in the Bible? So I said, I will use whatever affluence and influence that you give me to help those who are marginalized."
He and his wife, Kay, decided to reverse tithe, giving away ninety per cent of the tens of millions of dollars they earned from "The Purpose-Driven Life." They sat down with gay community leaders to talk about fighting AIDS. Warren has made repeated trips to Africa. He has sent out volunteers to forty-seven countries around the world, test-piloting experiments in microfinance and H.I.V. prevention and medical education. He decided to take the same networks he had built to train pastors and spread the purpose-driven life and put them to work on social problems.
"There is only one thing big enough to handle the world's problems, and that is the millions and millions of churches spread out around the world," he says. "I can take you to thousands of villages where they don't have a school. They don't have a grocery store, don't have a fire department. But they have a church. They have a pastor. They have volunteers. The problem today is distribution. In the tsunami, millions of dollars of foodstuffs piled up on the shores and people couldn't get it into the places that needed it, because they didn't have a network. Well, the biggest distribution network in the world is local churches. There are millions of them, far more than all the franchises in the world. Put together, they could be a force for good."
That is, in one sense, a typical Warren pronouncement—bold to the point of audacity, like telling his publisher that his book will sell a hundred million copies. In another sense, it is profoundly modest. When Warren's nineteenth-century evangelical predecessors took on the fight against slavery, they brought to bear every legal, political, and economic lever they could get their hands on. But that was a different time, and that was a different church. Today's evangelicalism is a network, and networks, for better or worse, are informal and personal.
At the Anaheim stadium service, Warren laid out his plan for attacking poverty and disease. He didn't talk about governments, though, or the United Nations, or structures, or laws. He talked about the pastors he had met in his travels around the world. He brought out the President of Rwanda, who stood up at the microphone—a short, slender man in an immaculate black suit—and spoke in halting English about how Warren was helping him rebuild his country. When he was finished, the crowd erupted in applause, and Rick Warren walked across the stage and enfolded him in his long arms.
Getting In
October 10, 2005
A Critic At Large
The social logic of Ivy League admissions.
1.
I applied to college one evening, after dinner, in the fall of my senior year in high school. College applicants in Ontario, in those days, were given a single sheet of paper which listed all the universities in the province. It was my job to rank them in order of preference. Then I had to mail the sheet of paper to a central college-admissions office. The whole process probably took ten minutes. My school sent in my grades separately. I vaguely remember filling out a supplementary two-page form listing my interests and activities. There were no S.A.T. scores to worry about, because in Canada we didn't have to take the S.A.T.s. I don't know whether anyone wrote me a recommendation. I certainly never asked anyone to. Why would I? It wasn't as if I were applying to a private club.
I put the University of Toronto first on my list, the University of Western Ontario second, and Queen's University third. I was working off a set of brochures that I'd sent away for. My parents' contribution consisted of my father's agreeing to drive me one afternoon to the University of Toronto campus, where we visited the residential college I was most interested in. I walked around. My father poked his head into the admissions office, chatted with the admissions director, and—I imagine—either said a few short words about the talents of his son or (knowing my father) remarked on the loveliness of the delphiniums in the college flower beds. Then we had ice cream. I got in.
Am I a better or more successful person for having been accepted at the University of Toronto, as opposed to my second or third choice? It strikes me as a curious question. In Ontario, there wasn't a strict hierarchy of colleges. There were several good ones and several better ones and a number of programs—like computer science at the University of Waterloo—that were world-class. But since all colleges were part of the same public system and tuition everywhere was the same (about a thousand dollars a year, in those days), and a B average in high school pretty much guaranteed you a spot in college, there wasn't a sense that anything great was at stake in the choice of which college we attended. The issue was whether we attended college, and—most important—how seriously we took the experience once we got there. I thought everyone felt this way. You can imagine my confusion, then, when I first met someone who had gone to Harvard.
There was, first of all, that strange initial reluctance to talk about the matter of college at all—a glance downward, a shuffling of the feet, a mumbled mention of Cambridge. "Did you go to Harvard?" I would ask. I had just moved to the United States. I didn't know the rules. An uncomfortable nod would follow. Don't define me by my school, they seemed to be saying, which implied that their school actually could define them. And, of course, it did. Wherever there was one Harvard graduate, another lurked not far behind, ready to swap tales of late nights at the Hasty Pudding, or recount the intricacies of the college-application essay, or wonder out loud about the whereabouts of Prince So-and-So, who lived down the hall and whose family had a place in the South of France that you would not believe. In the novels they were writing, the precocious and sensitive protagonist always went to Harvard; if he was troubled, he dropped out of Harvard; in the end, he returned to Harvard to complete his senior thesis. Once, I attended a wedding of a Harvard alum in his fifties, at which the best man spoke of his college days with the groom as if neither could have accomplished anything of greater importance in the intervening thirty years. By the end, I half expected him to take off his shirt and proudly display the large crimson "H" tattooed on his chest. What is this "Harvard" of which you Americans speak so reverently?
2.
In 1905, Harvard College adopted the College Entrance Examination Board tests as the principal basis for admission, which meant that virtually any academically gifted high-school senior who could afford a private college had a straightforward shot at attending. By 1908, the freshman class was seven per cent Jewish, nine per cent Catholic, and forty-five per cent from public schools, an astonishing transformation for a school that historically had been the preserve of the New England boarding-school complex known in the admissions world as St. Grottlesex.
As the sociologist Jerome Karabel writes in "The Chosen" (Houghton Mifflin; $28), his remarkable history of the admissions process at Harvard, Yale, and Princeton, that meritocratic spirit soon led to a crisis. The enrollment of Jews began to rise dramatically. By 1922, they made up more than a fifth of Harvard's freshman class. The administration and alumni were up in arms. Jews were thought to be sickly and grasping, grade-grubbing and insular. They displaced the sons of wealthy Wasp alumni, which did not bode well for fund-raising. A. Lawrence Lowell, Harvard's president in the nineteen-twenties, stated flatly that too many Jews would destroy the school: "The summer hotel that is ruined by admitting Jews meets its fate . . . because they drive away the Gentiles, and then after the Gentiles have left, they leave also."
The difficult part, however, was coming up with a way of keeping Jews out, because as a group they were academically superior to everyone else. Lowell's first idea—a quota limiting Jews to fifteen per cent of the student body—was roundly criticized. Lowell tried restricting the number of scholarships given to Jewish students, and made an effort to bring in students from public schools in the West, where there were fewer Jews. Neither strategy worked. Finally, Lowell—and his counterparts at Yale and Princeton—realized that if a definition of merit based on academic prowess was leading to the wrong kind of student, the solution was to change the definition of merit. Karabel argues that it was at this moment that the history and nature of the Ivy League took a significant turn.
The admissions office at Harvard became much more interested in the details of an applicant's personal life. Lowell told his admissions officers to elicit information about the "character" of candidates from "persons who know the applicants well," and so the letter of reference became mandatory. Harvard started asking applicants to provide a photograph. Candidates had to write personal essays, demonstrating their aptitude for leadership, and list their extracurricular activities. "Starting in the fall of 1922," Karabel writes, "applicants were required to answer questions on 'Race and Color,' 'Religious Preference,' 'Maiden Name of Mother,' 'Birthplace of Father,' and 'What change, if any, has been made since birth in your own name or that of your father? (Explain fully).'"
At Princeton, emissaries were sent to the major boarding schools, with instructions to rate potential candidates on a scale of 1 to 4, where 1 was "very desirable and apparently exceptional material from every point of view" and 4 was "undesirable from the point of view of character, and, therefore, to be excluded no matter what the results of the entrance examinations might be." The personal interview became a key component of admissions in order, Karabel writes, "to ensure that 'undesirables' were identified and to assess important but subtle indicators of background and breeding such as speech, dress, deportment and physical appearance." By 1933, the end of Lowell's term, the percentage of Jews at Harvard was back down to fifteen per cent.
If this new admissions system seems familiar, that's because it is essentially the same system that the Ivy League uses to this day. According to Karabel, Harvard, Yale, and Princeton didn't abandon the elevation of character once the Jewish crisis passed. They institutionalized it.
Starting in 1953, Arthur Howe, Jr., spent a decade as the chair of admissions at Yale, and Karabel describes what happened under his guidance:
The admissions committee viewed evidence of "manliness" with particular enthusiasm. One boy gained admission despite an academic prediction of 70 because "there was apparently something manly and distinctive about him that had won over both his alumni and staff interviewers." Another candidate, admitted despite his schoolwork being "mediocre in comparison with many others," was accepted over an applicant with a much better record and higher exam scores because, as Howe put it, "we just thought he was more of a guy." So preoccupied was Yale with the appearance of its students that the form used by alumni interviewers actually had a physical characteristics checklist through 1965. Each year, Yale carefully measured the height of entering freshmen, noting with pride the proportion of the class at six feet or more.
At Harvard, the key figure in that same period was Wilbur Bender, who, as the dean of admissions, had a preference for "the boy with some athletic interests and abilities, the boy with physical vigor and coordination and grace." Bender, Karabel tells us, believed that if Harvard continued to suffer on the football field it would contribute to the school's reputation as a place with "no college spirit, few good fellows, and no vigorous, healthy social life," not to mention a "surfeit of 'pansies,' 'decadent esthetes' and 'precious sophisticates.'" Bender concentrated on improving Harvard's techniques for evaluating "intangibles" and, in particular, its "ability to detect homosexual tendencies and serious psychiatric problems."
By the nineteen-sixties, Harvard's admissions system had evolved into a series of complex algorithms. The school began by lumping all applicants into one of twenty-two dockets, according to their geographical origin. (There was one docket for Exeter and Andover, another for the eight Rocky Mountain states.) Information from interviews, references, and student essays was then used to grade each applicant on a scale of 1 to 6, along four dimensions: personal, academic, extracurricular, and athletic. Competition, critically, was within each docket, not between dockets, so there was no way for, say, the graduates of Bronx Science and Stuyvesant to shut out the graduates of Andover and Exeter. More important, academic achievement was just one of four dimensions, further diluting the value of pure intellectual accomplishment. Athletic ability, rather than falling under "extracurriculars," got a category all to itself, which explains why, even now, recruited athletes have an acceptance rate to the Ivies at well over twice the rate of other students, despite S.A.T. scores that are on average more than a hundred points lower. And the most important category? That mysterious index of "personal" qualities. According to Harvard's own analysis, the personal rating was a better predictor of admission than the academic rating. Those with a rank of 4 or worse on the personal scale had, in the nineteen-sixties, a rejection rate of ninety-eight per cent. Those with a personal rating of 1 had a rejection rate of 2.5 per cent. When the Office of Civil Rights at the federal education department investigated Harvard in the nineteen-eighties, they found handwritten notes scribbled in the margins of various candidates' files. "This young woman could be one of the brightest applicants in the pool but there are several references to shyness," read one. Another comment reads, "Seems a tad frothy." One application—and at this point you can almost hear it going to the bottom of the pile—was notated, "Short with big ears."
3.
Social scientists distinguish between what are known as treatment effects and selection effects. The Marine Corps, for instance, is largely a treatment-effect institution. It doesn't have an enormous admissions office grading applicants along four separate dimensions of toughness and intelligence. It's confident that the experience of undergoing Marine Corps basic training will turn you into a formidable soldier. A modelling agency, by contrast, is a selection-effect institution. You don't become beautiful by signing up with an agency. You get signed up by an agency because you're beautiful.
At the heart of the American obsession with the Ivy League is the belief that schools like Harvard provide the social and intellectual equivalent of Marine Corps basic training—that being taught by all those brilliant professors and meeting all those other motivated students and getting a degree with that powerful name on it will confer advantages that no local state university can provide. Fuelling the treatment-effect idea are studies showing that if you take two students with the same S.A.T. scores and grades, one of whom goes to a school like Harvard and one of whom goes to a less selective college, the Ivy Leaguer will make far more money ten or twenty years down the road.
The extraordinary emphasis the Ivy League places on admissions policies, though, makes it seem more like a modeling agency than like the Marine Corps, and, sure enough, the studies based on those two apparently equivalent students turn out to be flawed. How do we know that two students who have the same S.A.T. scores and grades really are equivalent? It's quite possible that the student who goes to Harvard is more ambitious and energetic and personable than the student who wasn't let in, and that those same intangibles are what account for his better career success. To assess the effect of the Ivies, it makes more sense to compare the student who got into a top school with the student who got into that same school but chose to go to a less selective one. Three years ago, the economists Alan Krueger and Stacy Dale published just such a study. And they found that when you compare apples and apples the income bonus from selective schools disappears.
"As a hypothetical example, take the University of Pennsylvania and Penn State, which are two schools a lot of students choose between," Krueger said. "One is Ivy, one is a state school. Penn is much more highly selective. If you compare the students who go to those two schools, the ones who go to Penn have higher incomes. But let's look at those who got into both types of schools, some of whom chose Penn and some of whom chose Penn State. Within that set it doesn't seem to matter whether you go to the more selective school. Now, you would think that the more ambitious student is the one who would choose to go to Penn, and the ones choosing to go to Penn State might be a little less confident in their abilities or have a little lower family income, and both of those factors would point to people doing worse later on. But they don't."
Krueger says that there is one exception to this. Students from the very lowest economic strata do seem to benefit from going to an Ivy. For most students, though, the general rule seems to be that if you are a hardworking and intelligent person you'll end up doing well regardless of where you went to school. You'll make good contacts at Penn. But Penn State is big enough and diverse enough that you can make good contacts there, too. Having Penn on your résumé opens doors. But if you were good enough to get into Penn you're good enough that those doors will open for you anyway. "I can see why families are really concerned about this," Krueger went on. "The average graduate from a top school is making nearly a hundred and twenty thousand dollars a year, the average graduate from a moderately selective school is making ninety thousand dollars. That's an enormous difference, and I can see why parents would fight to get their kids into the better school. But I think they are just assigning to the school a lot of what the student is bringing with him to the school."
Bender was succeeded as the dean of admissions at Harvard by Fred Glimp, who, Karabel tells us, had a particular concern with academic underperformers. "Any class, no matter how able, will always have a bottom quarter," Glimp once wrote. "What are the effects of the psychology of feeling average, even in a very able group? Are there identifiable types with the psychological or what-not tolerance to be 'happy' or to make the most of education while in the bottom quarter?" Glimp thought it was critical that the students who populated the lower rungs of every Harvard class weren't so driven and ambitious that they would be disturbed by their status. "Thus the renowned (some would say notorious) Harvard admission practice known as the 'happy-bottom-quarter' policy was born," Karabel writes.
It's unclear whether or not Glimp found any students who fit that particular description. (He wondered, in a marvellously honest moment, whether the answer was "Harvard sons.") But Glimp had the realism of the modelling scout. Glimp believed implicitly what Krueger and Dale later confirmed: that the character and performance of an academic class is determined, to a significant extent, at the point of admission; that if you want to graduate winners you have to admit winners; that if you want the bottom quarter of your class to succeed you have to find people capable of succeeding in the bottom quarter. Karabel is quite right, then, to see the events of the nineteen-twenties as the defining moment of the modern Ivy League. You are whom you admit in the élite-education business, and when Harvard changed whom it admitted, it changed Harvard. Was that change for the better or for the worse?
4.
In the wake of the Jewish crisis, Harvard, Yale, and Princeton chose to adopt what might be called the "best graduates" approach to admissions. France's École Normale Supérieure, Japan's University of Tokyo, and most of the world's other élite schools define their task as looking for the best students—that is, the applicants who will have the greatest academic success during their time in college. The Ivy League schools justified their emphasis on character and personality, however, by arguing that they were searching for the students who would have the greatest success after college. They were looking for leaders, and leadership, the officials of the Ivy League believed, was not a simple matter of academic brilliance. "Should our goal be to select a student body with the highest possible proportions of high-ranking students, or should it be to select, within a reasonably high range of academic ability, a student body with a certain variety of talents, qualities, attitudes, and backgrounds?" Wilbur Bender asked. To him, the answer was obvious. If you let in only the brilliant, then you produced bookworms and bench scientists: you ended up as socially irrelevant as the University of Chicago (an institution Harvard officials looked upon and shuddered). "Above a reasonably good level of mental ability, above that indicated by a 550-600 level of S.A.T. score," Bender went on, "the only thing that matters in terms of future impact on, or contribution to, society is the degree of personal inner force an individual has."
It's easy to find fault with the best-graduates approach. We tend to think that intellectual achievement is the fairest and highest standard of merit. The Ivy League process, quite apart from its dubious origins, seems subjective and opaque. Why should personality and athletic ability matter so much? The notion that "the ability to throw, kick, or hit a ball is a legitimate criterion in determining who should be admitted to our greatest research universities," Karabel writes, is "a proposition that would be considered laughable in most of the world's countries." At the same time that Harvard was constructing its byzantine admissions system, Hunter College Elementary School, in New York, required simply that applicants take an exam, and if they scored in the top fifty they got in. It's hard to imagine a more objective and transparent procedure.
But what did Hunter achieve with that best-students model? In the nineteen-eighties, a handful of educational researchers surveyed the students who attended the elementary school between 1948 and 1960. This was a group with an average I.Q. of 157—three and a half standard deviations above the mean—who had been given what, by any measure, was one of the finest classroom experiences in the world. As graduates, though, they weren't nearly as distinguished as they were expected to be. "Although most of our study participants are successful and fairly content with their lives and accomplishments," the authors conclude, "there are no superstars . . . and only one or two familiar names." The researchers spend a great deal of time trying to figure out why Hunter graduates are so disappointing, and end up sounding very much like Wilbur Bender. Being a smart child isn't a terribly good predictor of success in later life, they conclude. "Non-intellective" factors—like motivation and social skills—probably matter more. Perhaps, the study suggests, "after noting the sacrifices involved in trying for national or world-class leadership in a field, H.C.E.S. graduates decided that the intelligent thing to do was to choose relatively happy and successful lives." It is a wonderful thing, of course, for a school to turn out lots of relatively happy and successful graduates. But Harvard didn't want lots of relatively happy and successful graduates. It wanted superstars, and Bender and his colleagues recognized that if this is your goal a best-students model isn't enough.
Most élite law schools, to cite another example, follow a best-students model. That's why they rely so heavily on the L.S.A.T. Yet there's no reason to believe that a person's L.S.A.T. scores have much relation to how good a lawyer he will be. In a recent research project funded by the Law School Admission Council, the Berkeley researchers Sheldon Zedeck and Marjorie Shultz identified twenty-six "competencies" that they think effective lawyering demands—among them practical judgment, passion and engagement, legal-research skills, questioning and interviewing skills, negotiation skills, stress management, and so on—and the L.S.A.T. picks up only a handful of them. A law school that wants to select the best possible lawyers has to use a very different admissions process from a law school that wants to select the best possible law students. And wouldn't we prefer that at least some law schools try to select good lawyers instead of good law students?
This search for good lawyers, furthermore, is necessarily going to be subjective, because things like passion and engagement can't be measured as precisely as academic proficiency. Subjectivity in the admissions process is not just an occasion for discrimination; it is also, in better times, the only means available for giving us the social outcome we want. The first black captain of the Yale football team was a man named Levi Jackson, who graduated in 1950. Jackson was a hugely popular figure on campus. He went on to be a top executive at Ford, and is credited with persuading the company to hire thousands of African-Americans after the 1967 riots. When Jackson was tapped for the exclusive secret society Skull and Bones, he joked, "If my name had been reversed, I never would have made it." He had a point. The strategy of discretion that Yale had once used to exclude Jews was soon being used to include people like Levi Jackson.
In the 2001 book "The Game of Life," James L. Shulman and William Bowen (a former president of Princeton) conducted an enormous statistical analysis on an issue that has become one of the most contentious in admissions: the special preferences given to recruited athletes at selective universities. Athletes, Shulman and Bowen demonstrate, have a large and growing advantage in admission over everyone else. At the same time, they have markedly lower G.P.A.s and S.A.T. scores than their peers. Over the past twenty years, their class rankings have steadily dropped, and they tend to segregate themselves in an "athletic culture" different from the culture of the rest of the college. Shulman and Bowen think the preference given to athletes by the Ivy League is shameful.
Halfway through the book, however, Shulman and Bowen present a surprising finding. Male athletes, despite their lower S.A.T. scores and grades, and despite the fact that many of them are members of minorities and come from lower socioeconomic backgrounds than other students, turn out to earn a lot more than their peers. Apparently, athletes are far more likely to go into the high-paying financial-services sector, where they succeed because of their personality and psychological makeup. In what can only be described as a textbook example of burying the lead, Bowen and Shulman write:
One of these characteristics can be thought of as drive—a strong desire to succeed and unswerving determination to reach a goal, whether it be winning the next game or closing a sale. Similarly, athletes tend to be more energetic than the average person, which translates into an ability to work hard over long periods of time—to meet, for example, the workload demands placed on young people by an investment bank in the throes of analyzing a transaction. In addition, athletes are more likely than others to be highly competitive, gregarious and confident of their ability to work well in groups (on teams).
Shulman and Bowen would like to argue that the attitudes of selective colleges toward athletes are a perversion of the ideals of American élite education, but that's because they misrepresent the actual ideals of American élite education. The Ivy League is perfectly happy to accept, among others, the kind of student who makes a lot of money after graduation. As the old saying goes, the definition of a well-rounded Yale graduate is someone who can roll all the way from New Haven to Wall Street.
5.
I once had a conversation with someone who worked for an advertising agency that represented one of the big luxury automobile brands. He said that he was worried that his client's new lower-priced line was being bought disproportionately by black women. He insisted that he did not mean this in a racist way. It was just a fact, he said. Black women would destroy the brand's cachet. It was his job to protect his client from the attentions of the socially undesirable.
This is, in no small part, what Ivy League admissions directors do. They are in the luxury-brand-management business, and "The Chosen," in the end, is a testament to just how well the brand managers in Cambridge, New Haven, and Princeton have done their job in the past seventy-five years. In the nineteen-twenties, when Harvard tried to figure out how many Jews they had on campus, the admissions office scoured student records and assigned each suspected Jew the designation j1 (for someone who was "conclusively Jewish"), j2 (where the "preponderance of evidence" pointed to Jewishness), or j3 (where Jewishness was a "possibility"). In the branding world, this is called customer segmentation. In the Second World War, as Yale faced plummeting enrollment and revenues, it continued to turn down qualified Jewish applicants. As Karabel writes, "In the language of sociology, Yale judged its symbolic capital to be even more precious than its economic capital." No good brand manager would sacrifice reputation for short-term gain. The admissions directors at Harvard have always, similarly, been diligent about rewarding the children of graduates, or, as they are quaintly called, "legacies." In the 1985-92 period, for instance, Harvard admitted children of alumni at a rate more than twice that of non-athlete, non-legacy applicants, despite the fact that, on virtually every one of the school's magical ratings scales, legacies significantly lagged behind their peers. Karabel calls the practice "unmeritocratic at best and profoundly corrupt at worst," but rewarding customer loyalty is what luxury brands do. Harvard wants good graduates, and part of their definition of a good graduate is someone who is a generous and loyal alumnus. And if you want generous and loyal alumni you have to reward them. Aren't the tremendous resources provided to Harvard by its alumni part of the reason so many people want to go to Harvard in the first place? The endless battle over admissions in the United States proceeds on the assumption that some great moral principle is at stake in the matter of whom schools like Harvard choose to let in—that those who are denied admission by the whims of the admissions office have somehow been harmed. If you are sick and a hospital shuts its doors to you, you are harmed. But a selective school is not a hospital, and those it turns away are not sick. Élite schools, like any luxury brand, are an aesthetic experience—an exquisitely constructed fantasy of what it means to belong to an élite—and they have always been mindful of what must be done to maintain that experience.
In the nineteen-eighties, when Harvard was accused of enforcing a secret quota on Asian admissions, its defense was that once you adjusted for the preferences given to the children of alumni and for the preferences given to athletes, Asians really weren't being discriminated against. But you could sense Harvard's exasperation that the issue was being raised at all. If Harvard had too many Asians, it wouldn't be Harvard, just as Harvard wouldn't be Harvard with too many Jews or pansies or parlor pinks or shy types or short people with big ears.
Million-Dollar Murray
February 13, 2006
Dept. of Social Services
Why problems like homelessness may be easier to solve than to manage.
1.
Murray Barr was a bear of a man, an ex-marine, six feet tall and heavyset, and when he fell down—which he did nearly every day—it could take two or three grown men to pick him up. He had straight black hair and olive skin. On the street, they called him Smokey. He was missing most of his teeth. He had a wonderful smile. People loved Murray.
His chosen drink was vodka. Beer he called "horse piss." On the streets of downtown Reno, where he lived, he could buy a two-hundred-and-fifty-millilitre bottle of cheap vodka for a dollar-fifty. If he was flush, he could go for the seven-hundred-and-fifty-millilitre bottle, and if he was broke he could always do what many of the other homeless people of Reno did, which is to walk through the casinos and finish off the half-empty glasses of liquor left at the gaming tables.
"If he was on a runner, we could pick him up several times a day," Patrick O'Bryan, who is a bicycle cop in downtown Reno, said. "And he's gone on some amazing runners. He would get picked up, get detoxed, then get back out a couple of hours later and start up again. A lot of the guys on the streets who've been drinking, they get so angry. They are so incredibly abrasive, so violent, so abusive. Murray was such a character and had such a great sense of humor that we somehow got past that. Even when he was abusive, we'd say, 'Murray, you know you love us,' and he'd say, 'I know—and go back to swearing at us."
"I've been a police officer for fifteen years," O'Bryan's partner, Steve Johns, said. "I picked up Murray my whole career. Literally."
Johns and O'Bryan pleaded with Murray to quit drinking. A few years ago, he was assigned to a treatment program in which he was under the equivalent of house arrest, and he thrived. He got a job and worked hard. But then the program ended. "Once he graduated out, he had no one to report to, and he needed that," O'Bryan said. "I don't know whether it was his military background. I suspect that it was. He was a good cook. One time, he accumulated savings of over six thousand dollars. Showed up for work religiously. Did everything he was supposed to do. They said, 'Congratulations,' and put him back on the street. He spent that six thousand in a week or so."
Often, he was too intoxicated for the drunk tank at the jail, and he'd get sent to the emergency room at either Saint Mary's or Washoe Medical Center. Marla Johns, who was a social worker in the emergency room at Saint Mary's, saw him several times a week. "The ambulance would bring him in. We would sober him up, so he would be sober enough to go to jail. And we would call the police to pick him up. In fact, that's how I met my husband." Marla Johns is married to Steve Johns.
"He was like the one constant in an environment that was ever changing," she went on. "In he would come. He would grin that half-toothless grin. He called me 'my angel.' I would walk in the room, and he would smile and say, 'Oh, my angel, I'm so happy to see you.' We would joke back and forth, and I would beg him to quit drinking and he would laugh it off. And when time went by and he didn't come in I would get worried and call the coroner's office. When he was sober, we would find out, oh, he's working someplace, and my husband and I would go and have dinner where he was working. When my husband and I were dating, and we were going to get married, he said, 'Can I come to the wedding?' And I almost felt like he should. My joke was 'If you are sober you can come, because I can't afford your bar bill.' When we started a family, he would lay a hand on my pregnant belly and bless the child. He really was this kind of light."
In the fall of 2003, the Reno Police Department started an initiative designed to limit panhandling in the downtown core. There were articles in the newspapers, and the police department came under harsh criticism on local talk radio. The crackdown on panhandling amounted to harassment, the critics said. The homeless weren't an imposition on the city; they were just trying to get by. "One morning, I'm listening to one of the talk shows, and they're just trashing the police department and going on about how unfair it is," O'Bryan said. "And I thought, Wow, I've never seen any of these critics in one of the alleyways in the middle of the winter looking for bodies." O'Bryan was angry. In downtown Reno, food for the homeless was plentiful: there was a Gospel kitchen and Catholic Services, and even the local McDonald's fed the hungry. The panhandling was for liquor, and the liquor was anything but harmless. He and Johns spent at least half their time dealing with people like Murray; they were as much caseworkers as police officers. And they knew they weren't the only ones involved. When someone passed out on the street, there was a "One down" call to the paramedics. There were four people in an ambulance, and the patient sometimes stayed at the hospital for days, because living on the streets in a state of almost constant intoxication was a reliable way of getting sick. None of that, surely, could be cheap.
O'Bryan and Johns called someone they knew at an ambulance service and then contacted the local hospitals. "We came up with three names that were some of our chronic inebriates in the downtown area, that got arrested the most often," O'Bryan said. "We tracked those three individuals through just one of our two hospitals. One of the guys had been in jail previously, so he'd only been on the streets for six months. In those six months, he had accumulated a bill of a hundred thousand dollars—and that's at the smaller of the two hospitals near downtown Reno. It's pretty reasonable to assume that the other hospital had an even larger bill. Another individual came from Portland and had been in Reno for three months. In those three months, he had accumulated a bill for sixty-five thousand dollars. The third individual actually had some periods of being sober, and had accumulated a bill of fifty thousand."
The first of those people was Murray Barr, and Johns and O'Bryan realized that if you totted up all his hospital bills for the ten years that he had been on the streets—as well as substance-abuse-treatment costs, doctors' fees, and other expenses—Murray Barr probably ran up a medical bill as large as anyone in the state of Nevada.
"It cost us one million dollars not to do something about Murray," O'Bryan said.
2.
Fifteen years ago, after the Rodney King beating, the Los Angeles Police Department was in crisis. It was accused of racial insensitivity and ill discipline and violence, and the assumption was that those problems had spread broadly throughout the rank and file. In the language of statisticians, it was thought that L.A.P.D.'s troubles had a "normal" distribution—that if you graphed them the result would look like a bell curve, with a small number of officers at one end of the curve, a small number at the other end, and the bulk of the problem situated in the middle. The bell-curve assumption has become so much a part of our mental architecture that we tend to use it to organize experience automatically.
But when the L.A.P.D. was investigated by a special commission headed by Warren Christopher, a very different picture emerged. Between 1986 and 1990, allegations of excessive force or improper tactics were made against eighteen hundred of the eighty-five hundred officers in the L.A.P.D. The broad middle had scarcely been accused of anything. Furthermore, more than fourteen hundred officers had only one or two allegations made against them—and bear in mind that these were not proven charges, that they happened in a four-year period, and that allegations of excessive force are an inevitable feature of urban police work. (The N.Y.P.D. receives about three thousand such complaints a year.) A hundred and eighty-three officers, however, had four or more complaints against them, forty-four officers had six or more complaints, sixteen had eight or more, and one had sixteen complaints. If you were to graph the troubles of the L.A.P.D., it wouldn't look like a bell curve. It would look more like a hockey stick. It would follow what statisticians call a "power law" distribution—where all the activity is not in the middle but at one extreme.
The Christopher Commission's report repeatedly comes back to what it describes as the extreme concentration of problematic officers. One officer had been the subject of thirteen allegations of excessive use of force, five other complaints, twenty-eight "use of force reports" (that is, documented, internal accounts of inappropriate behavior), and one shooting. Another had six excessive-force complaints, nineteen other complaints, ten use-of-force reports, and three shootings. A third had twenty-seven use-of-force reports, and a fourth had thirty-five. Another had a file full of complaints for doing things like "striking an arrestee on the back of the neck with the butt of a shotgun for no apparent reason while the arrestee was kneeling and handcuffed," beating up a thirteen-year-old juvenile, and throwing an arrestee from his chair and kicking him in the back and side of the head while he was handcuffed and lying on his stomach.
The report gives the strong impression that if you fired those forty-four cops the L.A.P.D. would suddenly become a pretty well-functioning police department. But the report also suggests that the problem is tougher than it seems, because those forty-four bad cops were so bad that the institutional mechanisms in place to get rid of bad apples clearly weren't working. If you made the mistake of assuming that the department's troubles fell into a normal distribution, you'd propose solutions that would raise the performance of the middle—like better training or better hiring—when the middle didn't need help. For those hard-core few who did need help, meanwhile, the medicine that helped the middle wouldn't be nearly strong enough.
In the nineteen-eighties, when homelessness first surfaced as a national issue, the assumption was that the problem fit a normal distribution: that the vast majority of the homeless were in the same state of semi-permanent distress. It was an assumption that bred despair: if there were so many homeless, with so many problems, what could be done to help them? Then, fifteen years ago, a young Boston College graduate student named Dennis Culhane lived in a shelter in Philadelphia for seven weeks as part of the research for his dissertation. A few months later he went back, and was surprised to discover that he couldn't find any of the people he had recently spent so much time with. "It made me realize that most of these people were getting on with their own lives," he said.
Culhane then put together a database—the first of its kind—to track who was coming in and out of the shelter system. What he discovered profoundly changed the way homelessness is understood. Homelessness doesn't have a normal distribution, it turned out. It has a power-law distribution. "We found that eighty per cent of the homeless were in and out really quickly," he said. "In Philadelphia, the most common length of time that someone is homeless is one day. And the second most common length is two days. And they never come back. Anyone who ever has to stay in a shelter involuntarily knows that all you think about is how to make sure you never come back."
The next ten per cent were what Culhane calls episodic users. They would come for three weeks at a time, and return periodically, particularly in the winter. They were quite young, and they were often heavy drug users. It was the last ten per cent—the group at the farthest edge of the curve—that interested Culhane the most. They were the chronically homeless, who lived in the shelters, sometimes for years at a time. They were older. Many were mentally ill or physically disabled, and when we think about homelessness as a social problem—the people sleeping on the sidewalk, aggressively panhandling, lying drunk in doorways, huddled on subway grates and under bridges—it's this group that we have in mind. In the early nineteen-nineties, Culhane's database suggested that New York City had a quarter of a million people who were homeless at some point in the previous half decade—which was a surprisingly high number. But only about twenty-five hundred were chronically homeless.
It turns out, furthermore, that this group costs the health-care and social-services systems far more than anyone had ever anticipated. Culhane estimates that in New York at least sixty-two million dollars was being spent annually to shelter just those twenty-five hundred hard-core homeless. "It costs twenty-four thousand dollars a year for one of these shelter beds," Culhane said. "We're talking about a cot eighteen inches away from the next cot." Boston Health Care for the Homeless Program, a leading service group for the homeless in Boston, recently tracked the medical expenses of a hundred and nineteen chronically homeless people. In the course of five years, thirty-three people died and seven more were sent to nursing homes, and the group still accounted for 18,834 emergency-room visits—at a minimum cost of a thousand dollars a visit. The University of California, San Diego Medical Center followed fifteen chronically homeless inebriates and found that over eighteen months those fifteen people were treated at the hospital's emergency room four hundred and seventeen times, and ran up bills that averaged a hundred thousand dollars each. One person—San Diego's counterpart to Murray Barr—came to the emergency room eighty-seven times.
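(For the numerically inclined, here is a minimal back-of-the-envelope sketch, in Python, of the cost figures quoted above. It only re-derives the per-person arithmetic from the numbers in the paragraph; the variable names and rounding are mine, not Culhane's.)

```python
# Back-of-the-envelope check of the figures quoted above; all inputs come from the article.
annual_shelter_spend = 62_000_000    # New York's estimated annual spending on the hard-core homeless
chronically_homeless = 2_500         # roughly how many chronically homeless people New York had

per_person = annual_shelter_spend / chronically_homeless
print(f"Shelter spending per chronically homeless person: ${per_person:,.0f}/year")
# -> about $24,800, consistent with the quoted "twenty-four thousand dollars a year" per shelter bed

er_visits = 18_834                   # Boston cohort of 119 chronically homeless people, over five years
min_cost_per_visit = 1_000           # "a minimum cost of a thousand dollars a visit"
print(f"Boston cohort's emergency-room bill (minimum): ${er_visits * min_cost_per_visit:,}")
# -> at least $18,834,000 for 119 people over five years
```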
"If it's a medical admission, it's likely to be the guys with the really complex pneumonia," James Dunford, the city of San Diego's emergency medical director and the author of the observational study, said. "They are drunk and they aspirate and get vomit in their lungs and develop a lung abscess, and they get hypothermia on top of that, because they're out in the rain. They end up in the intensive-care unit with these very complicated medical infections. These are the guys who typically get hit by cars and buses and trucks. They often have a neurosurgical catastrophe as well. So they are very prone to just falling down and cracking their head and getting a subdural hematoma, which, if not drained, could kill them, and it's the guy who falls down and hits his head who ends up costing you at least fifty thousand dollars. Meanwhile, they are going through alcoholic withdrawal and have devastating liver disease that only adds to their inability to fight infections. There is no end to the issues. We do this huge drill. We run up big lab fees, and the nurses want to quit, because they see the same guys come in over and over, and all we're doing is making them capable of walking down the block."
The homelessness problem is like the L.A.P.D.'s bad-cop problem. It's a matter of a few hard cases, and that's good news, because when a problem is that concentrated you can wrap your arms around it and think about solving it. The bad news is that those few hard cases are hard. They are falling-down drunks with liver disease and complex infections and mental illness. They need time and attention and lots of money. But enormous sums of money are already being spent on the chronically homeless, and Culhane saw that the kind of money it would take to solve the homeless problem could well be less than the kind of money it took to ignore it. Murray Barr used more health-care dollars, after all, than almost anyone in the state of Nevada. It would probably have been cheaper to give him a full-time nurse and his own apartment.
The leading exponent for the power-law theory of homelessness is Philip Mangano, who, since he was appointed by President Bush in 2002, has been the executive director of the U.S. Interagency Council on Homelessness, a group that oversees the programs of twenty federal agencies. Mangano is a slender man, with a mane of white hair and a magnetic presence, who got his start as an advocate for the homeless in Massachusetts. In the past two years, he has crisscrossed the United States, educating local mayors and city councils about the real shape of the homelessness curve. Simply running soup kitchens and shelters, he argues, allows the chronically homeless to remain chronically homeless. You build a shelter and a soup kitchen if you think that homelessness is a problem with a broad and unmanageable middle. But if it's a problem at the fringe it can be solved. So far, Mangano has convinced more than two hundred cities to radically reëvaluate their policy for dealing with the homeless.
"I was in St. Louis recently," Mangano said, back in June, when he dropped by New York on his way to Boise, Idaho. "I spoke with people doing services there. They had a very difficult group of people they couldn't reach no matter what they offered. So I said, Take some of your money and rent some apartments and go out to those people, and literally go out there with the key and say to them, 'This is the key to an apartment. If you come with me right now I am going to give it to you, and you are going to have that apartment.' And so they did. And one by one those people were coming in. Our intent is to take homeless policy from the old idea of funding programs that serve homeless people endlessly and invest in results that actually end homelessness."
Mangano is a history buff, a man who sometimes falls asleep listening to old Malcolm X speeches, and who peppers his remarks with references to the civil-rights movement and the Berlin Wall and, most of all, the fight against slavery. "I am an abolitionist," he says. "My office in Boston was opposite the monument to the 54th Regiment on the Boston Common, up the street from the Park Street Church, where William Lloyd Garrison called for immediate abolition, and around the corner from where Frederick Douglass gave that famous speech at the Tremont Temple. It is very much ingrained in me that you do not manage a social wrong. You should be ending it."
3.
The old Y.M.C.A. in downtown Denver is on Sixteenth Street, just east of the central business district. The main building is a handsome six-story stone structure that was erected in 1906, and next door is an annex that was added in the nineteen-fifties. On the ground floor there is a gym and exercise rooms. On the upper floors there are several hundred apartments—brightly painted one-bedrooms, efficiencies, and S.R.O.-style rooms with microwaves and refrigerators and central air-conditioning—and for the past several years those apartments have been owned and managed by the Colorado Coalition for the Homeless.
Even by big-city standards, Denver has a serious homelessness problem. The winters are relatively mild, and the summers aren't nearly as hot as those of neighboring New Mexico or Utah, which has made the city a magnet for the indigent. By the city's estimates, it has roughly a thousand chronically homeless people, of whom three hundred spend their time downtown, along the central Sixteenth Street shopping corridor or in nearby Civic Center Park. Many of the merchants downtown worry that the presence of the homeless is scaring away customers. A few blocks north, near the hospital, a modest, low-slung detox center handles twenty-eight thousand admissions a year, many of them homeless people who have passed out on the streets, either from liquor or—as is increasingly the case—from mouthwash. "Dr. Tichenor's—Dr. Tich, they call it—is the brand of mouthwash they use," says Roxane White, the manager of the city's social services. "You can imagine what that does to your gut."
Eighteen months ago, the city signed up with Mangano. With a mixture of federal and local funds, the C.C.H. inaugurated a new program that has so far enrolled a hundred and six people. It is aimed at the Murray Barrs of Denver, the people costing the system the most. C.C.H. went after the people who had been on the streets the longest, who had a criminal record, who had a problem with substance abuse or mental illness. "We have one individual in her early sixties, but looking at her you'd think she's eighty," Rachel Post, the director of substance treatment at the C.C.H., said. (Post changed some details about her clients in order to protect their identity.) "She's a chronic alcoholic. A typical day for her is she gets up and tries to find whatever she's going to drink that day. She falls down a lot. There's another person who came in during the first week. He was on methadone maintenance. He'd had psychiatric treatment. He was incarcerated for eleven years, and lived on the streets for three years after that, and, if that's not enough, he had a hole in his heart."
The recruitment strategy was as simple as the one that Mangano had laid out in St. Louis: Would you like a free apartment? The enrollees got either an efficiency at the Y.M.C.A. or an apartment rented for them in a building somewhere else in the city, provided they agreed to work within the rules of the program. In the basement of the Y, where the racquetball courts used to be, the coalition built a command center, staffed with ten caseworkers. Five days a week, between eight-thirty and ten in the morning, the caseworkers meet and painstakingly review the status of everyone in the program. On the wall around the conference table are several large white boards, with lists of doctor's appointments and court dates and medication schedules. "We need a staffing ratio of one to ten to make it work," Post said. "You go out there and you find people and assess how they're doing in their residence. Sometimes we're in contact with someone every day. Ideally, we want to be in contact every couple of days. We've got about fifteen people we're really worried about now."
The cost of services comes to about ten thousand dollars per homeless client per year. An efficiency apartment in Denver averages $376 a month, or just over forty-five hundred a year, which means that you can house and care for a chronically homeless person for at most fifteen thousand dollars, or about a third of what he or she would cost on the street. The idea is that once the people in the program get stabilized they will find jobs, and start to pick up more and more of their own rent, which would bring someone's annual cost to the program closer to six thousand dollars. As of today, seventy-five supportive housing slots have already been added, and the city's homeless plan calls for eight hundred more over the next ten years.
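(A short, hypothetical Python sketch of the same arithmetic, using only the figures in the paragraph above; the "street cost" variable is an inference from the phrase "about a third of what he or she would cost on the street," not a number the article states directly.)

```python
# Rough cost comparison for Denver's supportive-housing program, per client per year.
services_per_client = 10_000              # caseworker and service costs quoted above
rent_per_month = 376                      # average efficiency apartment in Denver
street_cost_estimate = 45_000             # assumption: implied by "about a third" of the street cost

housed_cost = services_per_client + rent_per_month * 12
print(f"Housed and supported: about ${housed_cost:,} a year")   # ~$14,512, i.e. "at most fifteen thousand"
print(f"Share of the estimated street cost: {housed_cost / street_cost_estimate:.0%}")  # ~32 per cent
```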
The reality, of course, is hardly that neat and tidy. The idea that the very sickest and most troubled of the homeless can be stabilized and eventually employed is only a hope. Some of them plainly won't be able to get there: these are, after all, hard cases. "We've got one man, he's in his twenties," Post said. "Already, he has cirrhosis of the liver. One time he blew a blood alcohol of .49, which is enough to kill most people. The first place we had he brought over all his friends, and they partied and trashed the place and broke a window. Then we gave him another apartment, and he did the same thing."
Post said that the man had been sober for several months. But he could relapse at some point and perhaps trash another apartment, and they'd have to figure out what to do with him next. Post had just been on a conference call with some people in New York City who run a similar program, and they talked about whether giving clients so many chances simply encourages them to behave irresponsibly. For some people, it probably does. But what was the alternative? If this young man was put back on the streets, he would cost the system even more money. The current philosophy of welfare holds that government assistance should be temporary and conditional, to avoid creating dependency. But someone who blows .49 on a Breathalyzer and has cirrhosis of the liver at the age of twenty-seven doesn't respond to incentives and sanctions in the usual way. "The most complicated people to work with are those who have been homeless for so long that going back to the streets just isn't scary to them," Post said. "The summer comes along and they say, 'I don't need to follow your rules.' " Power-law homelessness policy has to do the opposite of normal-distribution social policy. It should create dependency: you want people who have been outside the system to come inside and rebuild their lives under the supervision of those ten caseworkers in the basement of the Y.M.C.A.
That is what is so perplexing about power-law homeless policy. From an economic perspective the approach makes perfect sense. But from a moral perspective it doesn't seem fair. Thousands of people in the Denver area no doubt live day to day, work two or three jobs, and are eminently deserving of a helping hand—and no one offers them the key to a new apartment. Yet that's just what the guy screaming obscenities and swigging Dr. Tich gets. When the welfare mom's time on public assistance runs out, we cut her off. Yet when the homeless man trashes his apartment we give him another. Social benefits are supposed to have some kind of moral justification. We give them to widows and disabled veterans and poor mothers with small children. Giving the homeless guy passed out on the sidewalk an apartment has a different rationale. It's simply about efficiency.
We also believe that the distribution of social benefits should not be arbitrary. We don't give only to some poor mothers, or to a random handful of disabled veterans. We give to everyone who meets a formal criterion, and the moral credibility of government assistance derives, in part, from this universality. But the Denver homelessness program doesn't help every chronically homeless person in Denver. There is a waiting list of six hundred for the supportive-housing program; it will be years before all those people get apartments, and some may never get one. There isn't enough money to go around, and to try to help everyone a little bit—to observe the principle of universality—isn't as cost-effective as helping a few people a lot. Being fair, in this case, means providing shelters and soup kitchens, and shelters and soup kitchens don't solve the problem of homelessness. Our usual moral intuitions are little use, then, when it comes to a few hard cases. Power-law problems leave us with an unpleasant choice. We can be true to our principles or we can fix the problem. We cannot do both.
4.
A few miles northwest of the old Y.M.C.A. in downtown Denver, on the Speer Boulevard off-ramp from I-25, there is a big electronic sign by the side of the road, connected to a device that remotely measures the emissions of the vehicles driving past. When a car with properly functioning pollution-control equipment passes, the sign flashes "Good." When a car passes that is well over the acceptable limits, the sign flashes "Poor." If you stand at the Speer Boulevard exit and watch the sign for any length of time, you'll find that virtually every car scores "Good." An Audi A4—"Good." A Buick Century—"Good." A Toyota Corolla—"Good." A Ford Taurus—"Good." A Saab 9-5—"Good," and on and on, until after twenty minutes or so, some beat-up old Ford Escort or tricked-out Porsche drives by and the sign flashes "Poor." The picture of the smog problem you get from watching the Speer Boulevard sign and the picture of the homelessness problem you get from listening in on the morning staff meetings at the Y.M.C.A. are pretty much the same. Auto emissions follow a power-law distribution, and the air-pollution example offers another look at why we struggle so much with problems centered on a few hard cases.
Most cars, especially new ones, are extraordinarily clean. A 2004 Subaru in good working order has an exhaust stream that's just .06 per cent carbon monoxide, which is negligible. But on almost any highway, for whatever reason—age, ill repair, deliberate tampering by the owner—a small number of cars can have carbon-monoxide levels in excess of ten per cent, which is almost two hundred times higher. In Denver, five per cent of the vehicles on the road produce fifty-five per cent of the automobile pollution.
"Let's say a car is fifteen years old," Donald Stedman says. Stedman is a chemist and automobile-emissions specialist at the University of Denver. His laboratory put up the sign on Speer Avenue. "Obviously, the older a car is the more likely it is to become broken. It's the same as human beings. And by broken we mean any number of mechanical malfunctions—the computer's not working anymore, fuel injection is stuck open, the catalyst 's not unusual that these failure modes result in high emissions. We have at least one car in our database which was emitting seventy grams of hydrocarbon per mile, which means that you could almost drive a Honda Civic on the exhaust fumes from that car. It's not just old cars. It's new cars with high mileage, like taxis. One of the most successful and least publicized control measures was done by a district attorney in L.A. back in the nineties. He went to LAX and discovered that all of the Bell Cabs were gross emitters. One of those cabs emitted more than its own weight of pollution every year."
In Stedman's view, the current system of smog checks makes little sense. A million motorists in Denver have to go to an emissions center every year—take time from work, wait in line, pay fifteen or twenty-five dollars—for a test that more than ninety per cent of them don't need. "Not everybody gets tested for breast cancer," Stedman says. "Not everybody takes an AIDS test." On-site smog checks, furthermore, do a pretty bad job of finding and fixing the few outliers. Car enthusiasts—with high-powered, high-polluting sports cars—have been known to drop a clean engine into their car on the day they get it tested. Others register their car in a faraway town without emissions testing or arrive at the test site "hot"—having just come off hard driving on the freeway—which is a good way to make a dirty engine appear to be clean. Still others randomly pass the test when they shouldn't, because dirty engines are highly variable and sometimes burn cleanly for short durations. There is little evidence, Stedman says, that the city's regime of inspections makes any difference in air quality.
He proposes mobile testing instead. Twenty years ago, he invented a device the size of a suitcase that uses infrared light to instantly measure and then analyze the emissions of cars as they drive by on the highway. The Speer Avenue sign is attached to one of Stedman's devices. He says that cities should put half a dozen or so of his devices in vans, park them on freeway off-ramps around the city, and have a police car poised to pull over anyone who fails the test. A half-dozen vans could test thirty thousand cars a day. For the same twenty-five million dollars that Denver's motorists now spend on on-site testing, Stedman estimates, the city could identify and fix twenty-five thousand truly dirty vehicles every year, and within a few years cut automobile emissions in the Denver metropolitan area by somewhere between thirty-five and forty per cent. The city could stop managing its smog problem and start ending it.
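(Again for the curious, a hedged sketch of the throughput and cost arithmetic behind Stedman's proposal, using only the figures quoted above; the number of operating days per year is my assumption, not his.)

```python
# Sketch of the mobile-testing arithmetic described above.
vans = 6                              # "half a dozen or so" vans
cars_tested_per_day = 30_000          # total across the vans
testing_budget = 25_000_000           # what Denver motorists now spend on on-site testing each year
gross_emitters_fixed = 25_000         # Stedman's estimate of truly dirty vehicles identified and fixed per year

operating_days = 250                  # assumption: roughly a working year of deployment
print(f"Drive-by readings per year: {cars_tested_per_day * operating_days:,}")              # 7,500,000
print(f"Budget per repaired gross emitter: ${testing_budget / gross_emitters_fixed:,.0f}")  # $1,000
```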
Why don't we all adopt the Stedman method? There's no moral impediment here. We're used to the police pulling people over for having a blown headlight or a broken side mirror, and it wouldn't be difficult to have them add pollution-control devices to their list. Yet it does run counter to an instinctive social preference for thinking of pollution as a problem to which we all contribute equally. We have developed institutions that move reassuringly quickly and forcefully on collective problems. Congress passes a law. The Environmental Protection Agency promulgates a regulation. The auto industry makes its cars a little cleaner, and—presto—the air gets better. But Stedman doesn't much care about what happens in Washington and Detroit. The challenge of controlling air pollution isn't so much about the laws as it is about compliance with them. It's a policing problem, rather than a policy problem, and there is something ultimately unsatisfying about his proposed solution. He wants to end air pollution in Denver with a half-dozen vans outfitted with a contraption about the size of a suitcase. Can such a big problem have such a small-bore solution?
That's what made the findings of the Christopher Commission so unsatisfying. We put together blue-ribbon panels when we're faced with problems that seem too large for the normal mechanisms of bureaucratic repair. We want sweeping reforms. But what was the commission's most memorable observation? It was the story of an officer with a known history of doing things like beating up handcuffed suspects who nonetheless received a performance review from his superior stating that he "usually conducts himself in a manner that inspires respect for the law and instills public confidence." This is what you say about an officer when you haven't actually read his file, and the implication of the Christopher Commission's report was that the L.A.P.D. might help solve its problem simply by getting its police captains to read the files of their officers. The L.A.P.D.'s problem was a matter not of policy but of compliance. The department needed to adhere to the rules it already had in place, and that's not what a public hungry for institutional transformation wants to hear. Solving problems that have power-law distributions doesn't just violate our moral intuitions; it violates our political intuitions as well. It's hard not to conclude, in the end, that the reason we treated the homeless as one hopeless undifferentiated group for so long is not simply that we didn't know better. It's that we didn't want to know better. It was easier the old way.
Power-law solutions have little appeal to the right, because they involve special treatment for people who do not deserve special treatment; and they have little appeal to the left, because their emphasis on efficiency over fairness suggests the cold number-crunching of Chicago-school cost-benefit analysis. Even the promise of millions of dollars in savings or cleaner air or better police departments cannot entirely compensate for such discomfort. In Denver, John Hickenlooper, the city's enormously popular mayor, has worked on the homelessness issue tirelessly during the past couple of years. He spent more time on the subject in his annual State of the City address this past summer than on any other topic. He gave the speech, with deliberate symbolism, in the city's downtown Civic Center Park, where homeless people gather every day with their shopping carts and garbage bags. He has gone on local talk radio on many occasions to discuss what the city is doing about the issue. He has commissioned studies to show what a drain on the city's resources the homeless population has become. But, he says, "there are still people who stop me going into the supermarket and say, 'I can't believe you're going to help those homeless people, those bums.'"
5.
Early one morning a year ago, Marla Johns got a call from her husband, Steve. He was at work. "He called and woke me up," Johns remembers. "He was choked up and crying on the phone. And I thought that something had happened with another police officer. I said, 'Oh, my gosh, what happened?' He said, 'Murray died last night.' " He died of intestinal bleeding. At the police department that morning, some of the officers gave Murray a moment of silence.
"There are not many days that go by that I don't have a thought of him," she went on. "Christmas comes— and I used to buy him a Christmas present. Make sure he had warm gloves and a blanket and a coat. There was this mutual respect. There was a time when another intoxicated patient jumped off the gurney and was coming at me, and Murray jumped off his gurney and shook his fist and said, 'Don't you touch my angel.' You know, when he was monitored by the system he did fabulously. He would be on house arrest and he would get a job and he would save money and go to work every day, and he wouldn't drink. He would do all the things he was supposed to do. There are some people who can be very successful members of society if someone monitors them. Murray needed someone to be in charge of him."
But, of course, Reno didn't have a place where Murray could be given the structure he needed. Someone must have decided that it cost too much.
"I told my husband that I would claim his body if no one else did," she said. "I would not have him in an unmarked grave."
Here's Why
April 10, 2006
The Critics: Books
A sociologist offers an anatomy of explanations.
1.
Little Timothy is playing with his older brother Geoffrey, when he comes running to his mother.
"Mommy, Mommy," he starts in. "I was playing with my truck, and then Geoffrey came and he said it was his turn to play with the truck even though it's my truck and then he pushed me."
"Timothy!" his mother says, silencing him. "Don't be a tattletale."
Timothy has heard that phrase—"Don't be a tattletale"—countless times, and it always stops him short. He has offered his mother an eyewitness account of a crime. His mother, furthermore, in no way disputes the truth of his story. Yet what does she do? She rejects it in favor of a simplistic social formula: Don't be a tattletale. It makes no sense. Timothy's mother would never use such a formula to trump a story if she were talking to his father. On the contrary, his mother and father tattle to each other about Geoffrey all the time. And, if Timothy were to tattle on Geoffrey to his best friend, Bruce, Bruce wouldn't reject the story in favor of a formula, either. Narratives are the basis of Timothy's friendship with Bruce. They explain not just effects but causes. They matter—except in this instance, of a story told by Timothy to Mommy about Geoffrey, in which Mommy is suddenly indifferent to stories altogether. What is this don't-be-a-tattletale business about?
In "Why?" (Princeton; $24.95), the Columbia University scholar Charles Tilly sets out to make sense of our reasons for giving reasons. In the tradition of the legendary sociologist Erving Goffman, Tilly seeks to decode the structure of everyday social interaction, and the result is a book that forces readers to reëxamine everything from the way they talk to their children to the way they argue about politics.
In Tilly's view, we rely on four general categories of reasons. The first is what he calls conventions—conventionally accepted explanations. Tilly would call "Don't be a tattletale" a convention. The second is stories, and what distinguishes a story ("I was playing with my truck, and then Geoffrey came in . . .") is a very specific account of cause and effect. Tilly cites the sociologist Francesca Polletta's interviews with people who were active in the civil-rights sit-ins of the nineteen-sixties. Polletta repeatedly heard stories that stressed the spontaneity of the protests, leaving out the role of civil-rights organizations, teachers, and churches. That's what stories do. As Tilly writes, they circumscribe time and space, limit the number of actors and actions, situate all causes "in the consciousness of the actors," and elevate the personal over the institutional.
Then there are codes, which are high-level conventions, formulas that invoke sometimes recondite procedural rules and categories. If a loan officer turns you down for a mortgage, the reason he gives has to do with your inability to conform to a prescribed standard of creditworthiness. Finally, there are technical accounts: stories informed by specialized knowledge and authority. An academic history of civil-rights sit-ins wouldn't leave out the role of institutions, and it probably wouldn't focus on a few actors and actions; it would aim at giving patient and expert attention to every sort of nuance and detail.
Tilly argues that we make two common errors when it comes to understanding reasons. The first is to assume that some kinds of reasons are always better than others—that there is a hierarchy of reasons, with conventions (the least sophisticated) at the bottom and technical accounts at the top. That's wrong, Tilly says: each type of reason has its own role.
Tilly's second point flows from the first, and it's that the reasons people give aren't a function of their character—that is, there aren't people who always favor technical accounts and people who always favor stories. Rather, reasons arise out of situations and roles. Imagine, he says, the following possible responses to one person's knocking some books off the desk of another:
1. Sorry, buddy. I'm just plain awkward.
2. I'm sorry. I didn't see your book.
3. Nuts! I did it again.
4. Why did you put that book there?
5. I told you to stack up your books neatly.
The lesson is not that the kind of person who uses reason No. 1 or No. 2 is polite and the kind of person who uses reason No. 4 or No. 5 is a jerk. The point is that any of us might use any of those five reasons depending on our relation to the person whose books we knocked over. Reason-giving, Tilly says, reflects, establishes, repairs, and negotiates relationships. The husband who uses a story to explain his unhappiness to his wife—"Ever since I got my new job, I feel like I've just been so busy that I haven't had time for us"—is attempting to salvage the relationship. But when he wants out of the marriage, he'll say, "It's not you—it's me." He switches to a convention. As his wife realizes, it's not the content of what he has said that matters. It's his shift from the kind of reason-giving that signals commitment to the kind that signals disengagement. Marriages thrive on stories. They die on conventions.
Consider the orgy of reason-giving that followed Vice-President Dick Cheney's quail-hunting accident involving his friend Harry Whittington. Allies of the Vice-President insisted that the media were making way too much of it. "Accidents happen," they said, relying on a convention. Cheney, in a subsequent interview, looked penitently into the camera and said, "The image of him falling is something I'll never be able to get out of my mind. I fired, and there's Harry falling. And it was, I'd have to say, one of the worst days of my life." Cheney told a story. Some of Cheney's critics, meanwhile, focussed on whether he conformed to legal and ethical standards. Did he have a valid license? Was he too slow to notify the White House? They were interested in codes. Then came the response of hunting experts. They retold the narrative of Cheney's accident, using their specialized knowledge of hunting procedure. The Cheney party had three guns, and on a quail shoot, some of them said, you should never have more than two. Why did Whittington retrieve the downed bird? A dog should have done that. Had Cheney's shotgun been aimed more than thirty degrees from the ground, as it should have been? And what were they doing in the bush at five-thirty in the afternoon, when the light isn't nearly good enough for safe hunting? The experts gave a technical account.
Here are four kinds of reasons, all relational in nature. If you like Cheney and are eager to relieve him of responsibility, you want the disengagement offered by a convention. For a beleaguered P.R. agent, the first line of defense in any burgeoning scandal is, inevitably, There is no story here. When, in Cheney's case, this failed, the Vice-President had to convey his concern and regret while not admitting that he had done anything procedurally wrong. Only a story can accomplish that. Anything else—to shrug and say that accidents happen, for instance—would have been perceived as unpardonably callous. Cheney's critics, for their part, wanted the finality and precision of a code: he acted improperly. And hunting experts wanted to display their authority and educate the public about how to hunt safely, so they retold the story of Cheney's accident with the benefit of their specialized knowledge.
Effective reason-giving, then, involves matching the kind of reason we give to the particular role that we happen to be playing at the time a reason is necessary. The fact that Timothy's mother accepts tattling from his father but rejects it from Timothy is not evidence of capriciousness; it just means that a husband's relationship to his wife gives him access to a reason-giving category that a son's role does not. The lesson "Don't be a tattletale"—which may well be one of the hardest childhood lessons to learn—is that in the adult world it is sometimes more important to be appropriate than it is to be truthful.
2.
Two years ago, a young man named Anthony mugged a woman named Anne on a London street. Anthony was caught and convicted, and a few days before he was sentenced he sat down with Anne for a face-to-face meeting, as an exercise in what is known as "restorative justice." The meeting was videotaped by a criminal-justice research group, and to watch the video is to get an even deeper sense of the usefulness of Tilly's thinking.
"We're going to talk about what's happened," the policeman moderating the meeting begins. "Who's been affected, and how they've been affected, and see what we can do to make things better."
Anthony starts. He has a shaved head, a tattoo on his neck, and multiple piercings in his eyebrows and ears. Beside him is his partner, Christy, holding their baby boy. "What happened is I had a bad week. Been out of work for a couple of weeks. Had my kneecap broken. . . . I only had my dad in this country, who I don't get on with. We had no gas in our flat. Me and Christy were arguing all that morning. The baby had been screaming. We were hungry." His story comes out painfully and haltingly. "It was a bit too much. All my friends I was asking to loan me a couple of pounds. They just couldn't afford to give it to me. . . . I don't know what got into me. I just reached over and took your bag. And I'm really sorry for it. And if there is anything I can do to make up for it, I'm willing to do it. I know you probably don't want me anywhere near you."
Anne has been listening closely, her husband, Terry, next to her. Now she tells her side of the story. She heard a sound like male laughter. She turned, and felt her purse being pulled away. She saw a man pulling up his hood. She ran after him, feeling like a "complete idiot." In the struggle over her bag, her arm was injured. She is a journalist and has since had difficulty typing. "The mugging was very small," she says. "But the effect is not going away as fast as I expected. . . . It makes life one notch less bearable."
Then it is Christy's turn. She got the call at home. She didn't know exactly what had happened. She took the baby and walked to the police station, angry and frightened. "We got ourselves in a situation where we were relying on the state, and we just can't live off the money," Christy says. "And that's not your problem." She starts to cry. "He's not a drug addict," she continues, looking at her husband. Anthony takes the baby from her and holds him. "If we go to court on Monday, and he does get three years for what he's done, or six years, that's his problem. He done it. And he's got to pay for what he's done. I wake up and hear him cry"—she looks at the baby—"and it kills me. I'm in a situation where I can't do anything to make this better. . . . I just want you to know. The first thing he said to me when he walked in was 'I apologized.' And I said, 'That makes what difference?' "
Watching the conference is a strange experience, because it is utterly foreign to the criminal process of which it is ostensibly a part. There is none of the oppressive legalese of the courtroom. Nothing is "alleged"; there are no "perpetrators." The formal back-and-forth between questioner and answerer, the emotionally protective structure of courtroom procedure, is absent. Anne and Terry sit on comfortable chairs facing Christy and Anthony. They have a conversation, not a confrontation. They are telling stories, in Tilly's sense of that word: repairing their relationship by crafting a cause-and-effect account of what happened on the street.
3.
Why is such storytelling, in the wake of a crime, so important? Because, Tilly would argue, some social situations don't lend themselves to the easy reconciliation of reason and role. In Jonathan Franzen's novel "The Corrections," for example, one of the characters, Gary, is in the midst of a frosty conversation with his wife, Caroline. Gary had the sense, Franzen writes, "that Caroline was on the verge of accusing him of being 'depressed,' and he was afraid that if the idea that he was depressed gained currency, he would forfeit his right to his opinions. . . . Every word he spoke would become a symptom of disease; he would never again win an argument." Gary was afraid, in other words, that a technical account of his behavior—the explanation that he was clinically depressed—would trump his efforts to use the stories and conventions that permitted him to be human. But what was his wife to do? She wanted him to change.
When we say that two parties in a conflict are "talking past each other," this is what we mean: that both sides have a legitimate attachment to mutually exclusive reasons. Proponents of abortion often rely on a convention (choice) and a technical account (concerning the viability of a fetus in the first trimester). Opponents of abortion turn the fate of each individual fetus into a story: a life created and then abruptly terminated. Is it any surprise that the issue has proved to be so intractable? If you believe that stories are the most appropriate form of reason-giving, then those who use conventions and technical accounts will seem morally indifferent—regardless of whether you agree with them. And, if you believe that a problem is best adjudicated through conventions or technical accounts, it is hard not to look upon storytellers as sensationalistic and intellectually unserious. By Tilly's logic, abortion proponents who want to engage their critics will have to become better storytellers—and that, according to the relational principles of such reason-giving, may require them to acknowledge an emotional connection between a mother and a fetus. (Ironically, many of the same members of the religious right who have so emphatically demonstrated the emotional superiority of stories when it comes to abortion insist, when it comes to Genesis, on a reading of the Bible as a technical account. Thus do creationists, in the service of reason-giving exigency, force the Holy Scripture to do double duty as a high-school biology textbook.)
Tilly argues that these conflicts are endemic to the legal system. Laws are established in opposition to stories. In a criminal trial, we take a complicated narrative of cause and effect and match it to a simple, impersonal code: first-degree murder, or second-degree murder, or manslaughter. The impersonality of codes is what makes the law fair. But it is also what can make the legal system so painful for victims, who find no room for their voices and their anger and their experiences. Codes punish, but they cannot heal.
So what do you do? You put Anne and her husband in a room with Anthony and Christy and their baby boy and you let them talk. In a series of such experiments, conducted in Britain and Australia by the criminologists Lawrence Sherman and Heather Strang, restorative-justice programs have shown encouraging results in reducing recidivism rates among offenders and psychological trauma among victims. If you view the tape of the Anthony-Anne exchange, it's not hard to see why. Sherman said that when the Lord Chief Justice of England and Wales watched it at home one night he wept.
"If there is anything I can do, please say it," Anthony says.
"I think most of what you can do is between the two of you, actually," Anne says to Anthony and Christy. "I think if you can put your lives back together again, then that's what needs to be done."
The moderator tells them all to take a break and help themselves to "Metropolitan Police tea and coffee and chocolate biscuits."
Anne asks Christy how old the baby is, and where they are living. It turns out that their apartment has been condemned. Terry stands up and offers the baby a chocolate biscuit, and the adults experience the kind of moment that adults have in the company of babies, where nothing matters except the child in front of them.
"He's a good baby," Christy says. A convention. One kind of reason is never really enough.
Game Theory
May 29, 2006
Books
When it comes to athletic prowess, don't believe your eyes.
1.
The first player picked in the 1996 National Basketball Association draft was a slender, six-foot guard from Georgetown University named Allen Iverson. Iverson was thrilling. He was lightning quick, and could stop and start on a dime. He would charge toward the basket, twist and turn and writhe through the arms and legs of much taller and heavier men, and somehow find a way to score. In his first season with the Philadelphia 76ers, Iverson was voted the N.B.A.'s Rookie of the Year. In every year since 2000, he has been named to the N.B.A.'s All-Star team. In the 2000-01 season, he finished first in the league in scoring and steals, led his team to the second-best record in the league, and was named, by the country's sportswriters and broadcasters, basketball's Most Valuable Player. He is currently in the midst of a four-year, seventy-seven-million-dollar contract. Almost everyone who knows basketball and who watches Iverson play thinks that he's one of the best players in the game.
But how do we know that we're watching a great player? That's an easier question to answer when it comes to, say, golf or tennis, where players compete against one another, under similar circumstances, week after week. Nobody would dispute that Roger Federer is the world's best tennis player. Baseball is a little more complicated, since it's a team sport. Still, because the game consists of a sequence of discrete, ritualized encounters between pitcher and hitter, it lends itself to statistical rankings and analysis. Most tasks that professionals perform, though, are surprisingly hard to evaluate. Suppose that we wanted to measure something in the real world, like the relative skill of New York City's heart surgeons. One obvious way would be to compare the mortality rates of the patients on whom they operate—except that substandard care isn't necessarily fatal, so a more accurate measure might be how quickly patients get better or how few complications they have after surgery. But recovery time is a function as well of how a patient is treated in the intensive-care unit, which reflects the capabilities not just of the doctor but of the nurses in the I.C.U. So now we have to adjust for nurse quality in our assessment of surgeon quality. We'd also better adjust for how sick the patients were in the first place, and since well-regarded surgeons often treat the most difficult cases, the best surgeons might well have the poorest patient recovery rates. In order to measure something you thought was fairly straightforward, you really have to take into account a series of things that aren't so straightforward.
Basketball presents many of the same kinds of problems. The fact that Allen Iverson has been one of the league's most prolific scorers over the past decade, for instance, could mean that he is a brilliant player. It could mean that he's selfish and takes shots rather than passing the ball to his teammates. It could mean that he plays for a team that races up and down the court and plays so quickly that he has the opportunity to take many more shots than he would on a team that plays more deliberately. Or he might be the equivalent of an average surgeon with a first-rate I.C.U.: maybe his success reflects the fact that everyone else on his team excels at getting rebounds and forcing the other team to turn over the ball. Nor does the number of points that Iverson scores tell us anything about his tendency to do other things that contribute to winning and losing games; it doesn't tell us how often he makes a mistake and loses the ball to the other team, or commits a foul, or blocks a shot, or rebounds the ball. Figuring out whether one basketball player is better than another is a challenge similar to figuring out whether one heart surgeon is better than another: you have to find a way to interpret someone's individual statistics in the context of the team that they're on and the task that they are performing.
In "The Wages of Wins" (Stanford; $29.95), the economists David J. Berri, Martin B. Schmidt, and Stacey L. Brook set out to solve the Iverson problem. Weighing the relative value of fouls, rebounds, shots taken, turnovers, and the like, they've created an algorithm that, they argue, comes closer than any previous statistical measure to capturing the true value of a basketball player. The algorithm yields what they call a Win Score, because it expresses a player's worth as the number of wins that his contributions bring to his team. According to their analysis, Iverson's finest season was in 2004-05, when he was worth ten wins, which made him the thirty-sixth-best player in the league. In the season in which he won the Most Valuable Player award, he was the ninety-first-best player in the league. In his worst season (2003-04), he was the two-hundred-and-twenty-seventh-best player in the league. On average, for his career, he has ranked a hundred and sixteenth. In some years, Iverson has not even been the best player on his own team. Looking at the findings that Berri, Schmidt, and Brook present is enough to make one wonder what exactly basketball experts—coaches, managers, sportswriters—know about basketball.
2.
Basketball experts clearly appreciate basketball. They understand the gestalt of the game, in the way that someone who has spent a lifetime thinking about and watching, say, modern dance develops an understanding of that art form. They're able to teach and coach and motivate; to make judgments and predictions about a player's character and resolve and stage of development. But the argument of "The Wages of Wins" is that this kind of expertise has real limitations when it comes to making precise evaluations of individual performance, whether you're interested in the consistency of football quarterbacks or in testing claims that N.B.A. stars "turn it on" during playoffs. The baseball legend Ty Cobb, the authors point out, had a lifetime batting average of .366, almost thirty points higher than the former San Diego Padres outfielder Tony Gwynn, who had a lifetime batting average of .338:
So Cobb hit safely 37 percent of the time while Gwynn hit safely on 34 percent of his at bats. If all you did was watch these players, could you say who was a better hitter? Can one really tell the difference between 37 percent and 34 percent just staring at the players play? To see the problem with the non-numbers approach to player evaluation, consider that out of every 100 at bats, Cobb got three more hits than Gwynn. That's it, three hits.
Michael Lewis made a similar argument in his 2003 best-seller, "Moneyball," about how the so-called sabermetricians have changed the evaluation of talent in baseball. Baseball is sufficiently transparent, though, that the size of the discrepancies between intuitive and statistically aided judgment tends to be relatively modest. If you mistakenly thought that Gwynn was better than Cobb, you were still backing a terrific hitter. But "The Wages of Wins" suggests that when you move into more complex situations, like basketball, the limitations of "seeing" become enormous. Jermaine O'Neal, a center for the Indiana Pacers, finished third in the Most Valuable Player voting in 2004. His Win Score that year put him forty-fourth in the league. In 2004-05, the forward Antoine Walker made as much money as the point guard Jason Kidd, even though Walker produced 0.6 wins for Atlanta and Boston and Kidd produced nearly twenty wins for New Jersey. The Win Score algorithm suggests that Ray Allen has had nearly as good a career as Kobe Bryant, whom many consider the top player in the game, and that the journeyman forward Jerome Williams was actually among the strongest players of his generation.
Most egregious is the story of a young guard for the Chicago Bulls named Ben Gordon. Last season, Gordon finished second in the Rookie of the Year voting and was named the league's top "sixth man"—that is, the best non-starter—because he averaged an impressive 15.1 points per game in limited playing time. But Gordon rebounds less than he should, turns over the ball frequently, and makes such a low percentage of his shots that, of the league's top thirty-three scorers—that is, players who score at least one point for every two minutes on the floor—Gordon's Win Score ranked him dead last.
The problem for basketball experts is that, in a situation with many variables, it's difficult to know how much weight to assign to each variable. Buying a house is agonizing because we look at the size, the location, the back yard, the proximity to local schools, the price, and so on, and we're unsure which of those things matters most. Assessing heart-attack risk is a notoriously difficult task for similar reasons. A doctor can analyze a dozen different factors. But how much weight should be given to a patient's cholesterol level relative to his blood pressure? In the face of such complexity, people construct their own arbitrary algorithms—they assume that every factor is of equal importance, or randomly elevate one or two factors for the sake of simplifying matters—and we make mistakes because those arbitrary algorithms are, well, arbitrary.
Berri, Schmidt, and Brook argue that the arbitrary algorithms of basketball experts elevate the number of points a player scores above all other considerations. In one clever piece of research, they analyze the relationship between the statistics of rookies and the number of votes they receive in the All-Rookie Team balloting. If a rookie increases his scoring by ten per cent—regardless of how efficiently he scores those points—the number of votes he'll get will increase by twenty-three per cent. If he increases his rebounds by ten per cent, the number of votes he'll get will increase by six per cent. Every other factor, like turnovers, steals, assists, blocked shots, and personal fouls—factors that can have a significant influence on the outcome of a game—seemed to bear no statistical relationship to judgments of merit at all. It's not even the case that high scorers help their team by drawing more fans. As the authors point out, that's only true on the road. At home, attendance is primarily a function of games won. Basketball's decision-makers, it seems, are simply irrational.
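Read as elasticities, those voting figures imply a model in which scoring counts roughly three to four times as much as rebounding and everything else counts for about nothing. A small sketch of that reading follows; the log-log specification and the exponents are assumptions inferred only to reproduce the percentages quoted above, not numbers reported by the authors.

```python
# Elasticity-style reading of the All-Rookie voting result described above.
# The model form and the exponents are assumptions chosen to reproduce the
# quoted 10% -> 23% (scoring) and 10% -> 6% (rebounding) relationships.
def vote_multiplier(stat_increase, elasticity):
    """Proportional change in expected votes for a proportional change in a stat."""
    return (1 + stat_increase) ** elasticity

b_scoring, b_rebounding = 2.17, 0.61   # inferred for illustration
print(f"{vote_multiplier(0.10, b_scoring) - 1:.0%} more votes for 10% more scoring")
print(f"{vote_multiplier(0.10, b_rebounding) - 1:.0%} more votes for 10% more rebounding")
```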
It's hard not to wonder, after reading "The Wages of Wins," about the other instances in which we defer to the evaluations of experts. Boards of directors vote to pay C.E.O.s tens of millions of dollars, ostensibly because they believe—on the basis of what they have learned over the years by watching other C.E.O.s—that they are worth it. But so what? We see Allen Iverson, over and over again, charge toward the basket, twisting and turning and writhing through a thicket of arms and legs of much taller and heavier men—and all we learn is to appreciate twisting and turning and writhing. We become dance critics, blind to Iverson's dismal shooting percentage and his excessive turnovers, blind to the reality that the Philadelphia 76ers would be better off without him. "One can play basketball," the authors conclude. "One can watch basketball. One can both play and watch basketball for a thousand years. If you do not systematically track what the players do, and then uncover the statistical relationship between these actions and wins, you will never know why teams win and why they lose."
No Mercy
September 4, 2006
Comment
In 1925, a young American physicist was doing graduate work at Cambridge University, in England. He was depressed. He was fighting with his mother and had just broken up with his girlfriend. His strength was in theoretical physics, but he was being forced to sit in a laboratory making thin films of beryllium. In the fall of that year, he dosed an apple with noxious chemicals from the lab and put it on the desk of his tutor, Patrick Blackett. Blackett, luckily, didn't eat the apple. But school officials found out what happened, and arrived at a punishment: the student was to be put on probation and ordered to go to London for regular sessions with a psychiatrist.
Probation? These days, we routinely suspend or expel high-school students for doing infinitely less harmful things, like fighting or drinking or taking drugs—that is, for doing the kinds of things that teen-agers do. This past summer, Rhett Bomar, the starting quarterback for the University of Oklahoma Sooners, was cut from the team when he was found to have been "overpaid" (receiving wages for more hours than he worked, with the apparent complicity of his boss) at his job at a car dealership. Even in Oklahoma, people seemed to think that kicking someone off a football team for having cut a few corners on his job made perfect sense. This is the age of zero tolerance. Rules are rules. Students have to be held accountable for their actions. Institutions must signal their expectations firmly and unambiguously: every school principal and every college president, these days, reads from exactly the same script. What, then, of a student who gives his teacher a poisoned apple? Surely he ought to be expelled from school and sent before a judge.
Suppose you cared about the student, though, and had some idea of his situation and his potential. Would you feel the same way? You might. Trying to poison your tutor is no small infraction. Then again, you might decide, as the dons at Cambridge clearly did, that what had happened called for a measure of leniency. They knew that the student had never done anything like this before, and that he wasn't well. And they knew that to file charges would almost certainly ruin his career. Cambridge wasn't sure that the benefits of enforcing the law, in this case, were greater than the benefits of allowing the offender an unimpeded future.
Schools, historically, have been home to this kind of discretionary justice. You let the principal or the teacher decide what to do about cheating because you know that every case of cheating is different—and, more to the point, that every cheater is different. Jimmy is incorrigible, and needs the shock of expulsion. But Bobby just needs a talking to, because he's a decent kid, and Mary and Jane cheated because the teacher foolishly stepped out of the classroom in the middle of the test, and the temptation was simply too much. A Tennessee study found that after zero-tolerance programs were adopted by the state's public schools the frequency of targeted offenses soared: the firm and unambiguous punishments weren't deterring bad behavior at all. Is that really a surprise? If you're a teen-ager, the announcement that an act will be sternly punished doesn't always sink in, and it isn't always obvious when you're doing the thing you aren't supposed to be doing. Why? Because you're a teen-ager.
Somewhere along the way—perhaps in response to Columbine—we forgot the value of discretion in disciplining the young. "Ultimately, they have to make right decisions," the Oklahoma football coach, Bob Stoops, said of his players, after jettisoning his quarterback. "When they do not, the consequences are serious." Open and shut: he sounded as if he were talking about a senior executive of Enron, rather than a college sophomore whose primary obligation at Oklahoma was to throw a football in the direction of young men in helmets. You might think that if the University of Oklahoma was so touchy about its quarterback being "overpaid" it ought to have kept closer track of his work habits with an on-campus job. But making a fetish of personal accountability conveniently removes the need for institutional accountability. (We court-martial the grunts who abuse prisoners, not the commanding officers who let the abuse happen.) To acknowledge that the causes of our actions are complex and muddy seems permissive, and permissiveness is the hallmark of an ideology now firmly in disgrace. That conservative patron saint Whittaker Chambers once defined liberalism as Christ without the Crucifixion. But punishment without the possibility of redemption is worse: it is the Crucifixion without Christ.
As for the student whose career Cambridge saved? He left at the end of the academic year and went to study at the University of Göttingen, where he made important contributions to quantum theory. Later, after a brilliant academic career, he was entrusted with leading one of the most critical and morally charged projects in the history of science. His name was Robert Oppenheimer.
The Formula
October 10, 2006
Annals of Entertainment
What if you built a machine to predict hit movies?
1.
One sunny afternoon not long ago, Dick Copaken sat in a booth at Daniel, one of those hushed, exclusive restaurants on Manhattan's Upper East Side where the waiters glide spectrally from table to table. He was wearing a starched button-down shirt and a blue blazer. Every strand of his thinning hair was in place, and he spoke calmly and slowly, his large pink Charlie Brown head bobbing along evenly as he did. Copaken spent many years as a partner at the white-shoe Washington, D.C., firm Covington & Burling, and he has a lawyer's gravitas. One of his best friends calls him, admiringly, "relentless." He likes to tell stories. Yet he is not, strictly, a storyteller, because storytellers are people who know when to leave things out, and Copaken never leaves anything out: each detail is adduced, considered, and laid on the table—and then adjusted and readjusted so that the corners of the new fact are flush with the corners of the fact that preceded it. This is especially true when Copaken is talking about things that he really cares about, such as questions of international law or his grandchildren or, most of all, the movies.
Dick Copaken loves the movies. His friend Richard Light, a statistician at Harvard, remembers summer vacations on Cape Cod with the Copakens, when Copaken would take his children and the Light children to the movies every day. "Fourteen nights out of fourteen," Light said. "Dick would say at seven o'clock, 'Hey, who's up for the movies?' And, all by himself, he would take the six kids to the movies. The kids had the time of their lives. And Dick would come back and give, with a completely straight face, a rigorous analysis of how each movie was put together, and the direction and the special effects and the animation." This is a man who has seen two or three movies a week for the past fifty years, who has filed hundreds of plots and characters and scenes away in his mind, and at Daniel he was talking about a movie that touched him as much as any he'd ever seen.
"Nobody's heard of it," he said, and he clearly regarded this fact as a minor tragedy. "It's called 'Dear Frankie.' I watched it on a Virgin Atlantic flight because it was the only movie they had that I hadn't already seen. I had very low expectations. But I was blown away." He began, in his lawyer-like manner, to lay out the plot. It takes place in Scotland. A woman has fled an abusive relationship with her infant son and is living in a port town. The boy, now nine, is deaf, and misses the father he has never known. His mother has told him that his father is a sailor on a ship that rarely comes to shore, and has suggested that he write his father letters. These she intercepts, and replies to, writing as if she were the father. One day, the boy finds out that what he thinks is his father's ship is coming to shore. The mother has to find a man to stand in for the father. She does. The two fall in love. Unexpectedly, the real father reëmerges. He's dying, and demands to see his son. The mother panics. Then the little boy reveals his secret: he knew about his mother's ruse all along.
"I was in tears over this movie," Copaken said. "You know, sometimes when you see a movie in the air you're in such an out-of-body mood that things get exaggerated. So when I got home I sat down and saw it another time. I was bawling again, even though I knew what was coming." Copaken shook his head, and then looked away. His cheeks were flushed. His voice was suddenly thick. There he was, a buttoned-down corporate lawyer, in a hushed restaurant where there is practically a sign on the wall forbidding displays of human emotion—and he was crying, a third time. "That absolutely hits me," he said, his face still turned away. "He knew all along what the mother was doing." He stopped to collect himself. "I can't even retell the damn story without getting emotional."
He tried to explain why he was crying. There was the little boy, first of all. He was just about the same age as Copaken's grandson Jacob. So maybe that was part of it. Perhaps, as well, he was reacting to the idea of an absent parent. His own parents, Albert and Silvia, ran a modest community-law practice in Kansas City, and would shut down their office whenever Copaken or his brother had any kind of school activity or performance. In the Copaken world, it was an iron law that parents had to be present. He told a story about representing the Marshall Islands in negotiations with the U.S. government during the Cold War. A missile-testing range on the island was considered to be strategically critical. The case was enormously complex—involving something like fifty federal agencies and five countries—and, just as the negotiations were scheduled to begin, Copaken learned of a conflict: his eldest daughter was performing the lead role in a sixth-grade production of "The Wiz." "I made an instant decision," Copaken said. He told the President of the Marshall Islands that his daughter had to come first. Half an hour passed. "I get a frantic call from the State Department, very high levels: 'Dick, I got a call from the President of the Marshall Islands. What's going on?' I told him. He said, 'Dick, are you putting in jeopardy the national security of the United States for a sixth-grade production?' " In the end, the negotiations were suspended while Copaken flew home from Hawaii. "The point is," Copaken said, "that absence at crucial moments has been a worry to me, and maybe this movie just grabbed at that issue."
He stopped, seemingly dissatisfied. Was that really why he'd cried? Hollywood is awash in stories of bad fathers and abandoned children, and Copaken doesn't cry in fancy restaurants every time he thinks of one of them. When he tried to remember the last time he cried at the movies, he was stumped. So he must have been responding to something else, too—some detail, some unconscious emotional trigger in the combination of the mother and the boy and the Scottish seaside town and the ship and the hired surrogate and the dying father. To say that he cried at "Dear Frankie" because of that lonely fatherless boy was as inadequate as saying that people cried at the death of Princess Diana because she was a beautiful princess. Surely it mattered as well that she was killed in the company of her lover, a man distrusted by the Royal Family. Wasn't this "Romeo and Juliet"? And surely it mattered that she died in a tunnel, and that the tunnel was in Paris, and that she was chased by motorbikes, and that she was blond and her lover was dark—because each one of those additional narrative details has complicated emotional associations, and it is the subtle combination of all these associations that makes us laugh or choke up when we remember a certain movie, every single time, even when we're sitting in a fancy restaurant.
Of course, the optimal combination of all those elements is a mystery. That's why it's so hard to make a really memorable movie, and why we reward so richly the few people who can. But suppose you really, really loved the movies, and suppose you were a relentless type, and suppose you used all of the skills you'd learned during the course of your career at the highest rungs of the law to put together an international team of story experts. Do you think you could figure it out?
2.
The most famous dictum about Hollywood belongs to the screenwriter William Goldman. "Nobody knows anything," Goldman wrote in "Adventures in the Screen Trade" a couple of decades ago. "Not one person in the entire motion picture field knows for a certainty what's going to work. Every time out it's a guess." One of the highest-grossing movies in history, "Raiders of the Lost Ark," was offered to every studio in Hollywood, Goldman writes, and every one of them turned it down except Paramount: "Why did Paramount say yes? Because nobody knows anything. And why did all the other studios say no? Because nobody knows anything. And why did Universal, the mightiest studio of all, pass on Star Wars? . . . Because nobody, nobody—not now, not ever—knows the least goddamn thing about what is or isn't going to work at the box office."
What Goldman was saying was a version of something that has long been argued about art: that there is no way of getting beyond one's own impressions to arrive at some larger, objective truth. There are no rules to art, only the infinite variety of subjective experience. "Beauty is no quality in things themselves," the eighteenth-century Scottish philosopher David Hume wrote. "It exists merely in the mind which contemplates them; and each mind perceives a different beauty." Hume might as well have said that nobody knows anything.
But Hume had a Scottish counterpart, Lord Kames, and Lord Kames was equally convinced that traits like beauty, sublimity, and grandeur were indeed reducible to a rational system of rules and precepts. He devised principles of congruity, propriety, and perspicuity: an elevated subject, for instance, must be expressed in elevated language; sound and signification should be in concordance; a woman was most attractive when in distress; depicted misfortunes must never occur by chance. He genuinely thought that the superiority of Virgil's hexameters to Horace's could be demonstrated with Euclidean precision, and for every Hume, it seems, there has always been a Kames—someone arguing that if nobody knows anything it is only because nobody's looking hard enough.
In a small New York loft, just below Union Square, for example, there is a tech startup called Platinum Blue that consults for companies in the music business. Record executives have tended to be Humean: though they can tell you how they feel when they listen to a song, they don't believe anyone can know with confidence whether a song is going to be a hit, and, historically, fewer than twenty per cent of the songs picked as hits by music executives have fulfilled those expectations. Platinum Blue thinks it can do better. It has a proprietary computer program that uses "spectral deconvolution software" to measure the mathematical relationships among all of a song's structural components: melody, harmony, beat, tempo, rhythm, octave, pitch, chord progression, cadence, sonic brilliance, frequency, and so on. On the basis of that analysis, the firm believes it can predict whether a song is likely to become a hit with eighty-per-cent accuracy. Platinum Blue is staunchly Kamesian, and, if you have a field dominated by those who say there are no rules, it is almost inevitable that someone will come along and say that there are. The head of Platinum Blue is a man named Mike McCready, and the service he is providing for the music business is an exact model of what Dick Copaken would like to do for the movie business.
McCready is in his thirties, baldish and laconic, with rectangular hipster glasses. His offices are in a large, open room, with a row of windows looking east, across the rooftops of downtown Manhattan. In the middle of the room is a conference table, and one morning recently McCready sat down and opened his laptop to demonstrate the Platinum Blue technology. On his screen was a cluster of thousands of white dots, resembling a cloud. This was a "map" of the songs his group had run through its software: each dot represented a single song, and each song was positioned in the cloud according to its particular mathematical signature. "You could have one piano sonata by Beethoven at this end and another one here," McCready said, pointing at the opposite end, "as long as they have completely different chord progressions and completely different melodic structures."
McCready then hit a button on his computer, which had the effect of eliminating all the songs that had not made the Billboard Top 30 in the past five years. The screen went from an undifferentiated cloud to sixty discrete clusters. This is what the universe of hit songs from the past five years looks like structurally; hits come out of a small, predictable, and highly conserved set of mathematical patterns. "We take a new CD far in advance of its release date," McCready said. "We analyze all twelve tracks. Then we overlay them on top of the already existing hit clusters, and what we can tell a record company is which of those songs conform to the mathematical pattern of past hits. Now, that doesn't mean that they will be hits. But what we are saying is that, almost certainly, songs that fall outside these clusters will not be hits—regardless of how much they sound and feel like hit songs, and regardless of how positive your call-out research or focus-group research is." Four years ago, when McCready was working with a similar version of the program at a firm in Barcelona, he ran thirty just-released albums, chosen at random, through his system. One stood out. The computer said that nine of the fourteen songs on the album had clear hit potential—which was unheard of. Nobody in his group knew much about the artist or had even listened to the record before, but the numbers said the album was going to be big, and McCready and his crew were of the belief that numbers do not lie. "Right around that time, a local newspaper came by and asked us what we were doing," McCready said. "We explained the hit-prediction thing, and that we were really turned on to a record by this artist called Norah Jones." The record was "Come Away with Me." It went on to sell twenty million copies and win eight Grammy awards.
3.
The strength of McCready's analysis is its precision. This past spring, for instance, he analyzed "Crazy," by Gnarls Barkley. The computer calculated, first of all, the song's Hit Grade—that is, how close it was to the center of any of those sixty hit clusters. Its Hit Grade was 755, on a scale where anything above 700 is exceptional. The computer also found that "Crazy" belonged to the same hit cluster as Dido's "Thank You," James Blunt's "You're Beautiful," and Ashanti's "Baby," as well as older hits like "Let Me Be There," by Olivia Newton-John, and "One Sweet Day," by Mariah Carey, so that listeners who liked any of those songs would probably like "Crazy," too. Finally, the computer gave "Crazy" a Periodicity Grade—which refers to the fact that, at any given time, only twelve to fifteen hit clusters are "active," because from month to month the particular mathematical patterns that excite music listeners will shift around. "Crazy" 's periodicity score was 658—which suggested a very good fit with current tastes. The data said, in other words, that "Crazy" was almost certainly going to be huge—and, sure enough, it was.
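The mechanics behind scores like these are proprietary, but the general scheme McCready describes—map songs into a feature space, keep only past hits, and grade a new song by how close it sits to an existing hit cluster—can be sketched in a few lines. Everything below is a toy stand-in: the twelve random features, the k-means clustering, and the grading scale are assumptions, not Platinum Blue's actual method.

```python
# Toy sketch of cluster-and-grade hit prediction; the features, cluster count,
# and grading scale are illustrative assumptions only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
past_hits = rng.normal(size=(500, 12))        # placeholder feature vectors for past hits

clusters = KMeans(n_clusters=60, n_init=10, random_state=0).fit(past_hits)

def hit_grade(song_vector, model, scale=800.0):
    """Higher when the song sits close to some existing hit-cluster centroid."""
    distances = np.linalg.norm(model.cluster_centers_ - song_vector, axis=1)
    return scale / (1.0 + distances.min())

new_song = rng.normal(size=12)
print(f"toy hit grade: {hit_grade(new_song, clusters):.0f}")
```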
If "Crazy" hadn't scored so high, though, the Platinum Blue people would have given the song's producers broad suggestions for fixing it. McCready said, "We can tell a producer, 'These are the elements that seem to be pushing your song into the hit cluster. These are the variables that are pulling your song away from the hit cluster. The problem seems to be in your bass line.' And the producer will make a bunch of mixes, where they do something different with the bass lines—increase the decibel level, or muddy it up. Then they come back to us. And we say, 'Whatever you were doing with mix No. 3, do a little bit more of that and you'll be back inside the hit cluster.'"
McCready stressed that his system didn't take the art out of hit-making. Someone still had to figure out what to do with mix No. 3, and it was entirely possible that whatever needed to be done to put the song in the hit cluster wouldn't work, because it would make the song sound wrong—and in order to be a hit a song had to sound right. Still, for the first time you wouldn't be guessing about what needed to be done. You would know. And what you needed to know in order to fix the song was much simpler than anyone would have thought. McCready didn't care about who the artist was, or the cleverness of the lyrics. He didn't even have a way of feeding lyrics into his computer. He cared only about a song's underlying mathematical structure. "If you go back to the popular melodies written by Beethoven and Mozart three hundred years ago," he went on, "they conform to the same mathematical patterns that we are looking at today. What sounded like a beautiful melody to them sounds like a beautiful melody to us. What has changed is simply that we have come up with new styles and new instruments. Our brains are wired in a way—we assume—that keeps us coming back, again and again, to the same answers, the same pleasure centers." He had sales data and Top 30 lists and deconvolution software, and it seemed to him that if you put them together you had an objective way of measuring something like beauty. "We think we've figured out how the brain works regarding musical taste," McCready said.
It requires a very particular kind of person, of course, to see the world as a code waiting to be broken. Hume once called Kames "the most arrogant man in the world," and to take this side of the argument you have to be. Kames was also a brilliant lawyer, and no doubt that matters as well, because to be a good lawyer is to be invested with a reverence for rules. (Hume defied his family's efforts to make him a lawyer.) And to think like Kames you probably have to be an outsider. Kames was born Henry Home, to a farming family, and grew up in the sparsely populated cropping-and-fishing county of Berwickshire; he became Lord Kames late in life, after he was elevated to the bench. (Hume was born and reared in Edinburgh.) His early published work was about law and its history, but he soon wandered into morality, religion, anthropology, soil chemistry, plant nutrition, and the physical sciences, and once asked his friend Benjamin Franklin to explain the movement of smoke in chimneys. Those who believe in the power of broad patterns and rules, rather than the authority of individuals or institutions, are not intimidated by the boundaries and hierarchies of knowledge. They don't defer to the superior expertise of insiders; they set up shop in a small loft somewhere downtown and take on the whole music industry at once. The difference between Hume and Kames is, finally, a difference in kind, not degree. You're either a Kamesian or you're not. And if you were to create an archetypal Kamesian—to combine lawyerliness, outsiderness, and supreme self-confidence in one dapper, Charlie Brown-headed combination? You'd end up with Dick Copaken.
"I remember when I was a sophomore in high school and I went into the bathroom once to wash my hands," Copaken said. "I noticed the bubbles on the sink, and it fascinated me the way these bubbles would form and move around and float and reform, and I sat there totally transfixed. My father called me, and I didn't hear him. Finally, he comes in. 'Son. What the . . . are you all right?' I said, 'Bubbles, Dad, look what they do.' He said, 'Son, if you're going to waste your time, waste it on something that may have some future consequence.' Well, I kind of rose to the challenge. That summer, I bicycled a couple of miles to a library in Kansas City and I spent every day reading every book and article I could find on bubbles."
Bubbles looked completely random, but young Copaken wasn't convinced. He built a bubble-making device involving an aerator from a fish tank, and at school he pleaded with the math department to teach him the quadratic equations he needed to show why the bubbles formed the way they did. Then he devised an experiment, and ended up with a bronze medal at the International Science Fair. His interest in bubbles was genuine, but the truth is that almost anything could have caught Copaken's eye: pop songs, movies, the movement of chimney smoke. What drew him was not so much solving this particular problem as the general principle that problems were solvable—that he, little Dick Copaken from Kansas City, could climb on his bicycle and ride to the library and figure out something that his father thought wasn't worth figuring out.
Copaken has written a memoir of his experience defending the tiny Puerto Rican islands of Culebra and Vieques against the U.S. Navy, which had been using their beaches for target practice. It is a riveting story. Copaken takes on the vast Navy bureaucracy, armed only with arcane provisions of environmental law. He investigates the nesting grounds of the endangered hawksbill turtle, and the mating habits of a tiny yet extremely loud tree frog known as the coqui, and at one point he transports four frozen whale heads from the Bahamas to Harvard Medical School. Copaken wins. The Navy loses.
The memoir reads like a David-and-Goliath story. It isn't. David changed the rules on Goliath. He brought a slingshot to a sword fight. People like Copaken, though, don't change the rules; they believe in rules. Copaken would have agreed to sword-on-sword combat. But then he would have asked the referee for a stay, deposed Goliath and his team at great length, and papered him with brief after brief until he conceded that his weapon did not qualify as a sword under §48(B)(6)(e) of the Samaria Convention of 321 B.C. (The Philistines would have settled.) And whereas David knew that he couldn't win a conventional fight with Goliath, the conviction that sustained Copaken's long battle with the Navy was, to the contrary, that so long as the battle remained conventional—so long as it followed the familiar pathways of the law and of due process—he really could win. Dick Copaken didn't think he was an underdog at all. If you believe in rules, Goliath is just another Philistine, and the Navy is just another plaintiff. As for the ineffable mystery of the Hollywood blockbuster? Well, Mr. Goldman, you may not know anything. But I do.
4.
Dick Copaken has a friend named Nick Meaney. They met on a case years ago. Meaney has thick dark hair. He is younger and much taller than Copaken, and seems to regard his friend with affectionate amusement. Meaney's background is in risk management, and for years he'd been wanting to bring the principles of that world to the movie business. In 2003, Meaney and Copaken were driving through the English countryside to Durham when Meaney told Copaken about a friend of his from college. The friend and his business partner were students of popular narrative: the sort who write essays for obscure journals serving the small band of people who think deeply about, say, the evolution of the pilot episode in transnational TV crime dramas. And, for some time, they had been developing a system for evaluating the commercial potential of stories. The two men, Meaney told Copaken, had broken down the elements of screenplay narrative into multiple categories, and then drawn on their encyclopedic knowledge of television and film to assign scripts a score in each of those categories—creating a giant screenplay report card. The system was extraordinarily elaborate. It was under constant refinement. It was also top secret. Henceforth, Copaken and Meaney would refer to the two men publicly only as "Mr. Pink" and "Mr. Brown," an homage to "Reservoir Dogs."
"The guy had a big wall, and he started putting up little Post-its covering everything you can think of," Copaken said. It was unclear whether he was talking about Mr. Pink or Mr. Brown or possibly some Obi-Wan Kenobi figure from whom Mr. Pink and Mr. Brown first learned their trade. "You know, the star wears a blue shirt. The star doesn't zip up his pants. Whatever. So he put all these factors up and began moving them around as the scripts were either successful or unsuccessful, and he began grouping them and eventually this evolved to a kind of ad-hoc analytical system. He had no theory as to what would work, he just wanted to know what did work."
Copaken and Meaney also shared a fascination with a powerful kind of computerized learning system called an artificial neural network. Neural networks are used for data mining—to look for patterns in very large amounts of data. In recent years, they have become a critical tool in many industries, and what Copaken and Meaney realized, when they thought about Mr. Pink and Mr. Brown, was that it might now be possible to bring neural networks to Hollywood. They could treat screenplays as mathematical propositions, using Mr. Pink and Mr. Brown's categories and scores as the motion-picture equivalents of melody, harmony, beat, tempo, rhythm, octave, pitch, chord progression, cadence, sonic brilliance, and frequency.
Copaken and Meaney brought in a former colleague of Meaney's named Sean Verity, and the three of them signed up Mr. Pink and Mr. Brown. They called their company Epagogix—a reference to Aristotle's discussion of epagogic, or inductive, learning—and they started with a "training set" of screenplays that Mr. Pink and Mr. Brown had graded. Copaken and Meaney won't disclose how many scripts were in the training set. But let's say it was two hundred. Those scores—along with the U.S. box-office receipts for each of the films made from those screenplays—were fed into a neural network built by a computer scientist of Meaney's acquaintance. "I can't tell you his name," Meaney said, "but he's English to his bootstraps." Mr. Bootstraps then went to work, trying to use Mr. Pink and Mr. Brown's scoring data to predict the box-office receipts of every movie in the training set. He started with the first film and had the neural network make a guess: maybe it said that the hero's moral crisis in act one, which rated a 7 on the 10-point moral-crisis scale, was worth $7 million, and having a gorgeous red-headed eighteen-year-old female lead whose characterization came in at 6.5 was worth $3 million and a 9-point bonding moment between the male lead and a four-year-old boy in act three was worth $2 million, and so on, putting a dollar figure on every grade on Mr. Pink and Mr. Brown's report card until the system came up with a prediction. Then it compared its guess with how that movie actually did. Was it close? Of course not. The neural network then went back and tried again. If it had guessed $20 million and the movie actually made $110 million, it would reweight the movie's Pink/Brown scores and run the numbers a second time. And then it would take the formula that worked best on Movie One and apply it to Movie Two, and tweak that until it had a formula that worked on Movies One and Two, and take that formula to Movie Three, and then to four and five, and on through all two hundred movies, whereupon it would go back through all the movies again, through hundreds of thousands of iterations, until it had worked out a formula that did the best possible job of predicting the financial success of every one of the movies in its database.
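In outline, the guess-compare-reweight loop described above is ordinary supervised learning on a small neural network: scores in, a dollar figure out, weights adjusted after every pass until the errors shrink. Here is a minimal sketch of that kind of loop in Python. Everything in it—the category scores, the size of the training set, the shape of the network—is invented for illustration; Epagogix's actual variables, data, and software are proprietary and not described here.

```python
# Toy version of the "guess, compare, reweight" loop described above.
# Assumption: each screenplay is reduced to a vector of report-card scores
# (0-10) and the target is U.S. box office in millions of dollars.
# The data, categories, and network size are all made up; this is not
# Epagogix's system, only an illustration of the general technique.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training set": 200 screenplays, 30 report-card categories.
n_scripts, n_categories = 200, 30
scores = rng.uniform(0, 10, size=(n_scripts, n_categories))
# Pretend box office is some hidden nonlinear function of the scores.
true_weights = rng.normal(size=n_categories)
box_office = 60 + 50 * np.tanh(scores @ true_weights / 50) + rng.normal(0, 5, n_scripts)

# One-hidden-layer network: scores -> hidden units -> predicted gross.
n_hidden = 16
W1 = rng.normal(0, 0.1, size=(n_categories, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, size=n_hidden)
b2 = 0.0
lr = 1e-4

for step in range(20000):           # stand-in for the many iterations described
    h = np.tanh(scores @ W1 + b1)   # hidden layer
    pred = h @ W2 + b2              # predicted box office, in $ millions
    err = pred - box_office         # how far off was the guess?
    # Reweight and try again -- gradient descent on the squared error.
    grad_W2 = h.T @ err / n_scripts
    grad_b2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    grad_W1 = scores.T @ dh / n_scripts
    grad_b1 = dh.mean(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

# Apply the fitted "formula" to a new script's report card.
new_script = rng.uniform(0, 10, size=n_categories)
estimate = np.tanh(new_script @ W1 + b1) @ W2 + b2
print(f"Estimated U.S. box office: ${estimate:.0f} million")
```

In this toy version, as in the account above, the "formula" is nothing more than the set of weights the network settles on after enough passes through the training set; applying it to a new report card is a single forward pass.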
That formula, the theory goes, can then be applied to new scripts. If you were developing a $75-million buddy picture for Bruce Willis and Colin Farrell, Epagogix says, it can tell you, based on past experience, what that script's particular combination of narrative elements can be expected to make at the box office. If the formula says it's a $50-million script, you pull the plug. "We shoot turkeys," Meaney said. He had seen Mr. Bootstraps and the neural network in action: "It can sometimes go on for hours. If you look at the computer, you see lots of flashing numbers in a gigantic grid. It's like 'The Matrix.' There are a lot of computations. The guy is there, the whole time, looking at it. It eventually stops flashing, and it tells us what it thinks the American box-office will be. A number comes out."
The way the neural network thinks is not that different from the way a Hollywood executive thinks: if you pitch a movie to a studio, the executive uses an ad-hoc algorithm—perfected through years of trial and error—to put a value on all the components in the story. Neural networks, though, can handle problems that have a great many variables, and they never play favorites—which means (at least in theory) that as long as you can give the neural network the same range of information that a human decision-maker has, it ought to come out ahead. That's what the University of Arizona computer scientist Hsinchun Chen demonstrated ten years ago, when he built a neural network to predict winners at the dog track. Chen used the ten variables that greyhound experts told him they used in making their bets—like fastest time and winning percentage and results for the past seven races—and trained his system with the results of two hundred races. Then he went to the greyhound track in Tucson and challenged three dog-racing handicappers to a contest. Everyone picked winners in a hundred races, at a modest two dollars a bet. The experts lost $71.40, $61.20, and $70.20, respectively. Chen won $124.80. It wasn't close, and one of the main reasons was the special interest the neural network showed in something called "race grade": greyhounds are moved up and down through a number of divisions, according to their ability, and dogs have a big edge when they've just been bumped down a level and a big handicap when they've just been bumped up. "The experts know race grade exists, but they don't weight it sufficiently," Chen said. "They are all looking at win percentage, place percentage, or thinking about the dogs' times."
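The dog-track experiment lends itself to the same kind of sketch: give a small network the handicapping variables for each dog, train it on past races, and bet on whichever dog it rates highest. The version below is a toy, not a reconstruction of Chen's study—the feature names, the synthetic data, and the hidden rule that decides the winners (which deliberately leans on the equivalent of "race grade") are all assumptions made for illustration.

```python
# Toy illustration of the approach described above: score each dog from a
# handful of handicapping variables and bet on the top score. The features
# and data below are invented; the real study is only summarized in the text.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

FEATURES = ["fastest_time", "win_pct", "place_pct", "break_avg",
            "finish_avg", "time_3_races", "time_7_races",
            "grade_change", "days_since_race", "post_position"]

def make_race(n_dogs=8):
    """Synthetic race: one feature vector per dog; the winner is decided by a
    hidden rule that weights grade_change heavily (the factor the experts
    are said to underweight)."""
    X = rng.normal(size=(n_dogs, len(FEATURES)))
    ability = 0.4 * X[:, 1] + 0.3 * X[:, 0] + 1.0 * X[:, 7] + rng.normal(0, 0.5, n_dogs)
    y = np.zeros(n_dogs)
    y[np.argmax(ability)] = 1          # 1 marks the winning dog
    return X, y

# Train the network on the results of two hundred synthetic races.
train = [make_race() for _ in range(200)]
X_train = np.vstack([X for X, _ in train])
y_train = np.concatenate([y for _, y in train])
net = MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# Pick winners in a hundred new races by betting on the highest-rated dog.
hits = 0
for _ in range(100):
    X, y = make_race()
    pick = np.argmax(net.predict_proba(X)[:, 1])
    hits += int(y[pick] == 1)
print(f"Picked the winner in {hits} of 100 races")
```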
Copaken and Meaney figured that Hollywood's experts also had biases and skipped over things that really mattered. If a neural network won at the track, why not Hollywood? "One of the most powerful aspects of what we do is the ruthless objectivity of our system," Copaken said. "It doesn't care about maintaining relationships with stars or agents or getting invited to someone's party. It doesn't care about climbing the corporate ladder. It has one master and one master only: how do you get to bigger box-office? Nobody else in Hollywood is like that."
In the summer of 2003, Copaken approached Josh Berger, a senior executive at Warner Bros. in Europe. Meaney was opposed to the idea: in his mind, it was too early. "I just screamed at Dick," he said. But Copaken was adamant. He had Mr. Bootstraps, Mr. Pink, and Mr. Brown run sixteen television pilots through the neural network, and try to predict the size of each show's eventual audience. "I told Josh, 'Stick this in a drawer, and I'll come back at the end of the season and we can check to see how we did,' " Copaken said. In January of 2004, Copaken tabulated the results. In six cases, Epagogix guessed the number of American homes that would tune in to a show to within .06 per cent. In thirteen of the sixteen cases, its predictions were within two per cent. Berger was floored. "It was incredible," he recalls. "It was like someone saying to you, 'We're going to show you how to count cards in Vegas.' It had that sort of quality."
Copaken then approached another Hollywood studio. He was given nine unreleased movies to analyze. Mr. Pink, Mr. Brown, and Mr. Bootstraps worked only from the script—without reference to the stars or the director or the marketing budget or the producer. On three of the films—two of which were low-budget—the Epagogix estimates were way off. On the remaining six—including two of the studio's biggest-budget productions—they correctly identified whether the film would make or lose money. On one film, the studio thought it had a picture that would make a good deal more than $100 million. Epagogix said $49 million. The movie made less than $40 million. On another, a big-budget picture, the team's estimate came within $1.2 million of the final gross. On a number of films, they were surprisingly close. "They were basically within a few million," a senior executive at the studio said. "It was shocking. It was kind of weird." Had the studio used Epagogix on those nine scripts before filming started, it could have saved tens of millions of dollars. "I was impressed by a couple of things," another executive at the same studio said. "I was impressed by the things they thought mattered to a movie. They weren't the things that we typically give credit to. They cared about the venue, and whether it was a love story, and very specific things about the plot that they were convinced determined the outcome more than anything else. It felt very objective. And they could care less about whether the lead was Tom Cruise or Tom Jones."
The Epagogix team knocked on other doors that weren't quite so welcoming. This was the problem with being a Kamesian. Your belief in a rule-bound universe was what gave you, an outsider, a claim to real expertise. But you were still an outsider. You were still Dick Copaken, the blue-blazered corporate lawyer who majored in bubbles as a little boy in Kansas City, and a couple of guys from the risk-management business, and three men called Pink, Brown, and Bootstraps—and none of you had ever made a movie in your life. And what were you saying? That stars didn't matter, that the director didn't matter, and that all that mattered was story—and, by the way, that you understood story the way the people on the inside, people who had spent a lifetime in the motion-picture business, didn't. "They called, and they said they had a way of predicting box-office success or failure, which is everyone's fantasy," one former studio chief recalled. "I said to them, 'I hope you're right.' " The executive seemed to think of the Epagogix team as a small band of Martians who had somehow slipped their U.F.O. past security. "In reality, there are so many circumstances that can affect a movie's success," the executive went on. "Maybe the actor or actress has an external problem. Or this great actor, for whatever reason, just fails. You have to fire a director. Or September 11th or some other thing happens. There are many people who have come forward saying they have a way of predicting box-office success, but so far nobody has been able to do it. I think we know something. We just don't know enough. I still believe in something called that magical thing—talent, the unexpected. The movie god has to shine on you." You were either a Kamesian or you weren't, and this person wasn't: "My first reaction to those guys? Bullshit."
5.
A few months ago, Dick Copaken agreed to lift the cloud of unknowing surrounding Epagogix, at least in part. He laid down three conditions: the meeting was to be in London, Mr. Pink and Mr. Brown would continue to be known only as Mr. Pink and Mr. Brown, and no mention was to be made of the team's current projects. After much discussion, an agreement was reached. Epagogix would analyze the 2005 movie "The Interpreter," which was directed by Sydney Pollack and starred Sean Penn and Nicole Kidman. "The Interpreter" had a complicated history, having gone through countless revisions, and there was a feeling that it could have done much better at the box office. If ever there was an ideal case study for the alleged wizardry of Epagogix, this was it.
The first draft of the movie was written by Charles Randolph, a philosophy professor turned screenwriter. It opened in the fictional African country of Matobo. Two men in a Land Rover pull up to a soccer stadium. A group of children lead them to a room inside the building. On the ground is a row of corpses.
Cut to the United Nations, where we meet Silvia Broome, a young woman who works as an interpreter. She goes to the U.N. Security Service and relates a terrifying story. The previous night, while working late in the interpreter's booth, she overheard two people plotting the assassination of Matobo's murderous dictator, Edmund Zuwanie, who is coming to New York to address the General Assembly. She says that the plotters saw her, and that her life may be in danger. The officer assigned to her case, Tobin Keller, is skeptical, particularly when he learns that she, too, is from Matobo, and that her parents were killed in the country's civil war. But after Broome suffers a series of threatening incidents Keller starts to believe her. His job is to protect Zuwanie, but he now feels moved to act as Broome's bodyguard as well. A quiet, slightly ambiguous romantic attraction begins to develop between them. Zuwanie's visit draws closer. Broome's job is to be his interpreter. On the day of the speech, Broome ends up in the greenroom with Zuwanie. Keller suddenly realizes the truth: that she has made up the whole story as a way of bringing Zuwanie to justice. He rushes to the greenroom. Broome, it seems, has poisoned Zuwanie and is withholding the antidote unless he goes onstage and confesses to the murder of his countrymen. He does. Broome escapes. A doctor takes a look at the poison. It's harmless. The doctor turns to the dictator, who has just been tricked into writing his own prison sentence: "You were never in danger, Mr. Zuwanie."
Randolph says that the film he was thinking of while he was writing "The Interpreter" was Francis Ford Coppola's classic "The Conversation." He wanted to make a spare, stark movie about an isolated figure. "She's a terrorist," Randolph said of Silvia Broome. "She comes to this country to do a very specific task, and when that task is done she's gone again. I wanted to write about this idea of a noble terrorist, who tried to achieve her ends with a character assassination, not a real assassination." Randolph realized that most moviegoers—and most Hollywood executives—prefer characters who have psychological motivations. But he wasn't trying to make "Die Hard." "Look, I'm the son of a preacher," he said. "I believe that ideology motivates people."
In 2004, Sydney Pollack signed on to direct the project. He loved the idea of an interpreter at the United Nations and the conceit of an overheard conversation. But he wanted to make a commercial movie, and parts of the script didn't feel right to him. He didn't like the twist at the end, for instance. "I felt like I had been tricked, because in fact there was no threat," Pollack said. "As much as I liked the original script, I felt like an audience would somehow, at the end, feel cheated." Pollack also felt that audiences would want much more from Silvia Broome's relationship with Tobin Keller. "I've never been able to do a movie without a love story in it," he said. "For me, the heart of it is always the man and the woman and who they are and what they are going through." Pollack brought Randolph back for rewrites. He then hired Scott Frank and Steven Zaillian, two of the most highly sought-after screenwriters in Hollywood—and after several months the story was turned inside out. Now Broome didn't tell the story of overhearing that conversation. It actually happened. She wasn't a terrorist anymore. She was a victim. She wasn't an isolated figure. She was given a social life. She wasn't manipulating Keller. Their relationship was more prominent. A series of new characters—political allies and opponents of Zuwanie's—were added, as was a scene in Brooklyn where a bus explodes, almost killing Broome. "I remember when I came on 'Minority Report,' and started over," said Frank, who wrote many of the new scenes for "The Interpreter." "There weren't many characters. When I finished, there were two mysteries and a hundred characters. I have diarrhea of the plot. This movie cried out for that. There are never enough suspects and red herrings."
The lingering problem, though, was the ending. If Broome wasn't after Zuwanie, who was? "We struggled," Pollack said. "It was a long process, to the point where we almost gave up." In the end, Zuwanie was made the engineer of the plot: he fakes the attempt on his life in order to justify his attacks on his enemies back home. Zuwanie hires a man to shoot him, and then another of Zuwanie's men shoots the assassin before he can do the job—and in the chaos Broome ends up with a gun in her hand, training it on Zuwanie. "The end was the hardest part," Frank said. "All these balls were in the air. But I couldn't find a satisfying way to resolve it. We had to put a gun in the hand of a pacifist. I couldn't quite sew it up in the right way. Sydney kept saying, 'You're so close.' But I kept saying, 'Yeah, but I don't believe what I'm writing.' I wonder if I did a disservice to 'The Interpreter.' I don't know that I made it better. I may have just made it different."
This, then, was the question for Epagogix: If Pollack's goal was to make "The Interpreter" a more commercial movie, how well did he succeed? And could he have done better?
6.
The debriefing took place in central London, behind the glass walls of the private dining room of a Mayfair restaurant. The waiters came in waves, murmuring their announcements of the latest arrival from the kitchen. The table was round. Copaken, dapper as always in his navy blazer, sat next to Sean Verity, followed by Meaney, Mr. Brown, and Mr. Pink. Mr. Brown was very tall, and seemed to have a northern English accent. Mr. Pink was slender and graying, and had an air of authority about him. His academic training was in biochemistry. He said he thought that, in the highly emotional business of Hollywood, having a scientific background was quite useful. There was no sign of Mr. Bootstraps.
Mr. Pink began by explaining the origins of their system. "There were certain historical events that allowed us to go back and test how appealing one film was against another," he said. "The very simple one is that in the English market, in the sixties on Sunday night, religious programming aired on the major networks. Nobody watched it. And, as soon as that finished, movies came on. There were no lead-ins, and only two competing channels. Plus, across the country you had a situation where the commercial sector was playing a whole variety of movies against the standard, the BBC. It might be a John Wayne movie in Yorkshire, and a musical in Somerset, and the BBC would be the same movie everywhere. So you had a control. It was very pure and very simple. That was a unique opportunity to try and make some guesstimates as to why movies were doing what they were doing."
Brown nodded. "We built a body of evidence until we had something systematic," he said.
Pink estimated that they had analyzed thousands of movies. "The thing is that not everything comes to you as a script. For a long period, we worked for a broadcaster who used to send us a couple of paragraphs. We made our predictions based on that much. Having the script is actually too much information sometimes. You're trying to replicate what the audience is doing. They're trying to make a choice between three movies, and all they have at that point is whatever they've seen in TV Guide or on any trailer they've seen. We have to take a piece here and a piece here. Take a couple of reference points. When I look at a story, there are certain things I'm looking for—certain themes, and characters you immediately focus on." He thought for a moment. "That's not to deny that it matters whether the lead character wears a hat," he added, in a way that suggested he and Mr. Brown had actually thought long and hard about leads and hats.
"There's always a pattern," he went on. "There are certain stories that come back, time and time again, and that always work. You know, whenever we go into a market—and we work in fifty markets—the initial thing people say is 'What do you know about our market?' The assumption is that, say, Japan is different from us—that there has to be something else going on there. But, basically, they're just like us. It's the consistency of these reappearing things that I find amazing."
"Biblical stories are a classic case," Mr. Brown put in. "There is something about what they're telling and the message that's coming out that seems to be so universal. With Mel Gibson's 'The Passion,' people always say, 'Who could have predicted that?' And the answer is, we could have."
They had looked at "The Interpreter" scripts a few weeks earlier. The process typically takes them a day. They read, they graded, and then they compared notes, because Mr. Pink was the sort who went for "Yojimbo" and Mr. Brown's favorite movie was "Alien" (the first one), so they didn't always agree. Mr. Brown couldn't remember a single script he'd read where he thought there wasn't room for improvement, and Mr. Pink, when asked the same question, could come up with just one: "Lethal Weapon." "A friend of mine gave me the shooting script before it came out, and I remember reading it and thinking, It's all there. It was all on the page." Once Mr. Pink and Mr. Brown had scored "The Interpreter," they gave their analyses to Mr. Bootstraps, who did fifteen runs through the neural network: the original Randolph script, the shooting script, and certain variants of the plot that Epagogix devised. Mr. Bootstraps then passed his results to Copaken, who wrote them up. The Epagogix reports are always written by Copaken, and they are models of lawyerly thoroughness. This one ran to thirty-eight pages. He had finished the final draft the night before, very late. He looked fresh as a daisy.
Mr. Pink started with the original script. "My pure reaction? I found it very difficult to read. I got confused. I had to reread bits. We do this a lot. If a project takes more than an hour to read, then there's something going on that I'm not terribly keen on."
"It didn't feel to me like a mass-appeal movie," Mr. Brown added. "It seemed more niche."
When Mr. Bootstraps ran Randolph's original draft through the neural network, the computer called it a $33-million movie—an "intelligent" thriller, in the same commercial range as "The Constant Gardener" or "Out of Sight." According to the formula, the final shooting script was a $69-million picture (an estimate that came within $4 million of the actual box-office). Mr. Brown wasn't surprised. The shooting script, he said, "felt more like an American movie, where the first one seemed European in style."
Everyone agreed, though, that Pollack could have done much better. There was, first of all, the matter of the United Nations. "They had a unique opportunity to get inside the building," Mr. Pink said. "But I came away thinking that it could have been set in any boxy office tower in Manhattan. An opportunity was missed. That's when we get irritated—when there are opportunities that could very easily be turned into something that would actually have had an impact."
"Locale is an extra character," Mr. Brown said. "But in this case it's a very bland character that didn't really help."
In the Epagogix secret formula, it seemed, locale matters a great deal. "You know, there's a big difference between city and countryside," Mr. Pink said. "It can have a huge effect on a movie's ability to draw in viewers. And writers just do not take advantage of it. We have a certain set of values that we attach to certain places."
Mr. Pink and Mr. Brown ticked off the movies and television shows that they thought understood the importance of locale: "Crimson Tide," "Lawrence of Arabia," "Lost," "Survivor," "Castaway," "Deliverance." Mr. Pink said, "The desert island is something that we have always recognized as a pungent backdrop, but it's not used that often. In the same way, prisons can be a powerful environment, because they are so well defined." The U.N. could have been like that, but it wasn't. Then there was the problem of starting, as both scripts did, in Africa—and not just Africa but a fictional country in Africa. The whole team found that crazy. "Audiences are pretty parochial, by and large," Mr. Pink said. "If you start off by telling them, 'We're going to begin this movie in Africa,' you're going to lose them. They've bought their tickets. But when they come out they're going to say, 'It was all right. But it was Africa.' " The whole thing seemed to leave Mr. Pink quite distressed. He looked at Mr. Brown beseechingly.
Mr. Brown changed the subject. "It's amazing how often quite little things, quite small aspects, can spoil everything," he said. "I remember seeing the trailer for 'V for Vendetta' and deciding against it right there, for one very simple reason: there was a ridiculous mask on the main character. If you can't see the face of the character, you can't tell what that person is thinking. You can't tell who they are. With 'Spider-Man' and 'Superman,' though, you do see the face, so you respond to them."
The team once gave a studio a script analysis in which almost everything they suggested was, in Hollywood terms, small. They wanted the lead to jump off the page a little more. They wanted the lead to have a young sidekick—a relatively minor character—to connect with a younger demographic, and they wanted the city where the film was set to be much more of a presence. The neural network put the potential value of better characterization at an extra $2.46 million in U.S. box-office revenue; the value of locale adjustment at $4.92 million; the value of a sidekick at $12.3 million—and the value of all three together (given the resulting synergies) at $24.6 million. That's another $25 million for a few weeks of rewrites and maybe a day or two of extra filming. Mr. Bootstraps, incidentally, ran the numbers and concluded that the script would make $47 million if the suggested changes were not made. The changes were not made. The movie made $50 million.
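For what it's worth, the figures quoted above imply that the network valued the three changes together at more than the sum of their parts—a synergy premium of roughly $5 million:

```python
# Quick check of the figures quoted above (all taken from the article):
# the three suggested changes valued separately vs. together.
characterization = 2.46   # $ millions, better-drawn lead
locale           = 4.92   # city made more of a presence
sidekick         = 12.30  # young sidekick added

separately = characterization + locale + sidekick     # 19.68
together   = 24.6                                      # all three at once

print(f"Sum of individual values: ${separately:.2f} million")
print(f"Synergy premium:          ${together - separately:.2f} million")
```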
Mr. Pink and Mr. Brown went on to discuss the second "Interpreter" screenplay, the shooting script. They thought the ending was implausible. Charles Randolph had originally suggested that the Tobin Keller character be black, not white, in order to create the frisson of bringing together a white African and a black American. Mr. Pink and Mr. Brown independently came to the same conclusion. Apparently, the neural network ran the numbers on movies that paired black and white leads—"Lethal Weapon," "The Crying Game," "Independence Day," "Men in Black," "Die Another Day," "The Pelican Brief"—and found that the black-white combination could increase box-office revenue. The computer did the same kind of analysis on Scott Frank's "diarrhea of the plot," and found that there were too many villains. And if Silvia Broome was going to be in danger, Mr. Bootstraps made clear, she really had to be in danger.
"Our feeling—and Dick, you may have to jump in here—is that the notion of a woman in peril is a very powerful narrative element," Mr. Pink said. He glanced apprehensively at Copaken, evidently concerned that what he was about to say might fall in the sensitive category of the proprietary. "How powerful?" He chose his words carefully. "Well above average. And the problem is that we lack a sense of how much danger she is in, so an opportunity is missed. There were times when you were thinking, Is this something she has created herself? Is someone actually after her? You are confused. There is an element of doubt, and that ambiguity makes it possible to doubt the danger of the situation." Of course, all that ambiguity was there because in the Randolph script she was making it all up, and we were supposed to doubt the danger of the situation. But Mr. Pink and Mr. Brown believed that, once you decided you weren't going to make a European-style niche movie, you had to abandon ambiguity altogether.
"You've got to make the peril real," Mr. Pink said.
The Epagogix revise of "The Interpreter" starts with an upbeat Silvia Broome walking into the United Nations, flirting with the security guard. The two men plotting the assassination later see her and chase her through the labyrinthine corridors of what could only be the U.N. building. The ambiguous threats to Broome's life are now explicit. At one point in the Epagogix version, a villain pushes Broome's Vespa off one of Manhattan's iconic East River bridges. She hangs on to her motorbike for dear life, as it swings precariously over the edge of the parapet. Tobin Keller, in a police helicopter, swoops into view: "As she clings to Tobin's muscular body while the two of them are hoisted up into the hovering helicopter, we sense that she is feeling more than relief." In the Epagogix ending, Broome stabs one of Zuwanie's security men with a knife. Zuwanie storms off the stage, holds a press conference, and is shot dead by a friend of Broome's brother. Broome cradles the dying man in her arms. He "dies peacefully," with "a smile on his blood-spattered face." Then she gets appointed Matobo's U.N. ambassador. She turns to Keller. "'This time,' she notes with a wry smile . . . 'you will have to protect me.' " Bootstraps's verdict was that this version would result in a U.S. box-office of $111 million.
"It's funny," Mr. Pink said. "This past weekend, 'The Bodyguard' was on TV. Remember that piece of"—he winced—"entertainment? Which is about a bodyguard and a woman. The final scene is that they are right back together. It is very clearly and deliberately sown. That is the commercial way, if you want more bodies in the seats."
"You have to either consummate it or allow for the possibility of that," Copaken agreed.
They were thinking now of what would happen if they abandoned all fealty to the original, and simply pushed the movie's premise as far as they could possibly go.
Mr. Pink went on, "If Dick had said, 'You can take this project wherever you want,' we probably would have ended up with something a lot closer to 'The Bodyguard'—where you have a much more romantic film, a much more powerful focus to the two characters—without all the political stuff going on in the background. You go for the emotions on a very basic level. What would be the upper limit on that? You know, the upper limit of anything these days is probably still 'Titanic.' I'm not saying we could do six hundred million dollars. But it could be two hundred million."
7.
It was clear that the whole conversation was beginning to make Mr. Pink uncomfortable. He didn't like "The Bodyguard." Even the title made him wince. He was the sort who liked "Yojimbo," after all. The question went around the room: What would you do with "The Interpreter"? Sean Verity wanted to juice up the action-adventure elements and push it to the $150- to $160-million range. Meaney wanted to do without expensive stars: he didn't think they were worth the money. Copaken wanted more violence, and he also favored making Keller black. But he didn't want to go all the way to "The Bodyguard," either. This was a man who loved "Dear Frankie" as much as any film he'd seen in recent memory, and "Dear Frankie" had a domestic box-office gross of $1.3 million. If you followed the rules of Epagogix, there wouldn't be any movies like "Dear Frankie." The neural network had one master, the market, and answered one question: how do you get to bigger box-office? But once a movie had made you vulnerable—once you couldn't even retell the damn story without getting emotional—you couldn't be content with just one master anymore.
That was the thing about the formula: it didn't make the task of filmmaking easier. It made it harder. So long as nobody knows anything, you've got license to do whatever you want. You can start a movie in Africa. You can have male and female leads not go off together—all in the name of making something new. Once you came to think that you knew something, though, you had to decide just how much money you were willing to risk for your vision. Did the Epagogix team know what the answer to that question was? Of course not. That question required imagination, and they weren't in the imagination business. They were technicians with tools: computer programs and analytical systems and proprietary software that calculated mathematical relationships among a laundry list of structural variables. At Platinum Blue, Mike McCready could tell you that the bass line was pushing your song out of the center of hit cluster 31. But he couldn't tell you exactly how to fix the bass line, and he couldn't guarantee that the redone version would still sound like a hit, and you didn't see him releasing his own album of computer-validated pop music. A Kamesian had only to read Lord Kames to appreciate the distinction. The most arrogant man in the world was a terrible writer: clunky, dense, prolix. He knew the rules of art. But that didn't make him an artist.
Mr. Brown spoke last. "I don't think it needs to be a big-budget picture," he said. "I think we do what we can with the original script to make it a strong story, with an ending that is memorable, and then do a slow release. A low-budget picture. One that builds through word of mouth—something like that." He was confident that he had the means to turn a $69-million script into a $111-million movie, and then again into a $150- to $200-million blockbuster. But it had been a long afternoon, and part of him had a stubborn attachment to "The Interpreter" in something like its original form. Mr. Bootstraps might have disagreed. But Mr. Bootstraps was nowhere to be seen.
Dangerous Minds
November 12, 2007
Dept. of Criminology
Criminal profiling made easy.
1.
On November 16, 1940, workers at the Consolidated Edison building on West Sixty-fourth Street in Manhattan found a homemade pipe bomb on a windowsill. Attached was a note: "Con Edison crooks, this is for you." In September of 1941, a second bomb was found, on Nineteenth Street, just a few blocks from Con Edison's headquarters, near Union Square. It had been left in the street, wrapped in a sock. A few months later, the New York police received a letter promising to "bring the Con Edison to justice—they will pay for their dastardly deeds." Sixteen other letters followed, between 1941 and 1946, all written in block letters, many repeating the phrase "dastardly deeds" and all signed with the initials "F.P." In March of 1950, a third bomb—larger and more powerful than the others—was found on the lower level of Grand Central Terminal. The next was left in a phone booth at the New York Public Library. It exploded, as did one placed in a phone booth in Grand Central. In 1954, the Mad Bomber—as he came to be known—struck four times, once in Radio City Music Hall, sending shrapnel throughout the audience. In 1955, he struck six times. The city was in an uproar. The police were getting nowhere. Late in 1956, in desperation, Inspector Howard Finney, of the New York City Police Department's crime laboratory, and two plainclothesmen paid a visit to a psychiatrist by the name of James Brussel.
Brussel was a Freudian. He lived on Twelfth Street, in the West Village, and smoked a pipe. In Mexico, early in his career, he had done counter-espionage work for the F.B.I. He wrote many books, including "Instant Shrink: How to Become an Expert Psychiatrist in Ten Easy Lessons." Finney put a stack of documents on Brussel's desk: photographs of unexploded bombs, pictures of devastation, photostats of F.P.'s neatly lettered missives. "I didn't miss the look in the two plainclothesmen's eyes," Brussel writes in his memoir, "Casebook of a Crime Psychiatrist." "I'd seen that look before, most often in the Army, on the faces of hard, old-line, field-grade officers who were sure this newfangled psychiatry business was all nonsense."
He began to leaf through the case materials. For sixteen years, F.P. had been fixated on the notion that Con Ed had done him some terrible injustice. Clearly, he was clinically paranoid. But paranoia takes some time to develop. F.P. had been bombing since 1940, which suggested that he was now middle-aged. Brussel looked closely at the precise lettering of F.P.'s notes to the police. This was an orderly man. He would be cautious. His work record would be exemplary. Further, the language suggested some degree of education. But there was a stilted quality to the word choice and the phrasing. Con Edison was often referred to as "the Con Edison." And who still used the expression "dastardly deeds"? F.P. seemed to be foreign-born. Brussel looked closer at the letters, and noticed that all the letters were perfect block capitals, except the "W"s. They were misshapen, like two "U"s. To Brussel's eye, those "W"s looked like a pair of breasts. He flipped to the crime-scene descriptions. When F.P. planted his bombs in movie theatres, he would slit the underside of the seat with a knife and stuff his explosives into the upholstery. Didn't that seem like a symbolic act of penetrating a woman, or castrating a man—or perhaps both? F.P. had probably never progressed beyond the Oedipal stage. He was unmarried, a loner. Living with a mother figure. Brussel made another leap. F.P. was a Slav. Just as the use of a garrote would have suggested someone of Mediterranean extraction, the bomb-knife combination struck him as Eastern European. Some of the letters had been posted from Westchester County, but F.P. wouldn't have mailed the letters from his home town. Still, a number of cities in southeastern Connecticut had a large Slavic population. And didn't you have to pass through Westchester to get to the city from Connecticut?
Brussel waited a moment, and then, in a scene that has become legendary among criminal profilers, he made a prediction:
"One more thing." I closed my eyes because I didn't want to see their reaction. I saw the Bomber: impeccably neat, absolutely proper. A man who would avoid the newer styles of clothing until long custom had made them conservative. I saw him clearly—much more clearly than the facts really warranted. I knew I was letting my imagination get the better of me, but I couldn't help it.
"One more " I said, my eyes closed tight. "When you catch him—and I have no doubt you will—he'll be wearing a double-breasted suit."
"Jesus!" one of the detectives whispered.
"And it will be buttoned," I said. I opened my eyes. Finney and his men were looking at each other.
"A double-breasted suit," said the Inspector.
"Yes."
"Buttoned."
"Yes."
He nodded. Without another word, they left.
A month later, George Metesky was arrested by police in connection with the New York City bombings. His name had been changed from Milauskas. He lived in Waterbury, Connecticut, with his two older sisters. He was unmarried. He was unfailingly neat. He attended Mass regularly. He had been employed by Con Edison from 1929 to 1931, and claimed to have been injured on the job. When he opened the door to the police officers, he said, "I know why you fellows are here. You think I'm the Mad Bomber." It was midnight, and he was in his pajamas. The police asked that he get dressed. When he returned, his hair was combed into a pompadour and his shoes were newly shined. He was also wearing a double-breasted suit—buttoned.
2.
In a new book, "Inside the Mind of BTK," the eminent F.B.I. criminal profiler John Douglas tells the story of a serial killer who stalked the streets of Wichita, Kansas, in the nineteen-seventies and eighties. Douglas was the model for Agent Jack Crawford in "The Silence of the Lambs." He was the protĂ©gĂ© of the pioneering F.B.I. profiler Howard Teten, who helped establish the bureau's Behavioral Science Unit, at Quantico, in 1972, and who was a protĂ©gĂ© of Brussel—which, in the close-knit fraternity of profilers, is like being analyzed by the analyst who was analyzed by Freud. To Douglas, Brussel was the father of criminal profiling, and, in both style and logic, "Inside the Mind of BTK" pays homage to "Casebook of a Crime Psychiatrist" at every turn.
"BTK" stood for "Bind, Torture, Kill"—the three words that the killer used to identify himself in his taunting notes to the Wichita police. He had struck first in January, 1974, when he killed thirty-eight-year-old Joseph Otero in his home, along with his wife, Julie, their son, Joey, and their eleven-year-old daughter, who was found hanging from a water pipe in the basement with semen on her leg. The following April, he stabbed a twenty-four-year-old woman. In March, 1977, he bound and strangled another young woman, and over the next few years he committed at least four more murders. The city of Wichita was in an uproar. The police were getting nowhere. In 1984, in desperation, two police detectives from Wichita paid a visit to Quantico.
The meeting, Douglas writes, was held in a first-floor conference room of the F.B.I.'s forensic-science building. He was then nearly a decade into his career at the Behavioral Science Unit. His first two best-sellers, "Mindhunter: Inside the FBI's Elite Serial Crime Unit," and "Obsession: The FBI's Legendary Profiler Probes the Psyches of Killers, Rapists, and Stalkers and Their Victims and Tells How to Fight Back," were still in the future. Working a hundred and fifty cases a year, he was on the road constantly, but BTK was never far from his thoughts. "Some nights I'd lie awake asking myself, 'Who the hell is this BTK?' " he writes. "What makes a guy like this do what he does? What makes him tick?"
Roy Hazelwood sat next to Douglas. A lean chain-smoker, Hazelwood specialized in sex crimes, and went on to write the best-sellers "Dark Dreams" and "The Evil That Men Do." Beside Hazelwood was an ex-Air Force pilot named Ron Walker. Walker, Douglas writes, was "whip smart" and an "exceptionally quick study." The three bureau men and the two detectives sat around a massive oak table. "The objective of our session was to keep moving forward until we ran out of juice," Douglas writes. They would rely on the typology developed by their colleague Robert Ressler, himself the author of the true-crime best-sellers "Whoever Fights Monsters" and "I Have Lived in the Monster." The goal was to paint a picture of the killer—of what sort of man BTK was, and what he did, and where he worked, and what he was like—and with that scene "Inside the Mind of BTK" begins.
We are now so familiar with crime stories told through the eyes of the profiler that it is easy to lose sight of how audacious the genre is. The traditional detective story begins with the body and centers on the detective's search for the culprit. Leads are pursued. A net is cast, widening to encompass a bewilderingly diverse pool of suspects: the butler, the spurned lover, the embittered nephew, the shadowy European. That's a Whodunit. In the profiling genre, the net is narrowed. The crime scene doesn't initiate our search for the killer. It defines the killer for us. The profiler sifts through the case materials, looks off into the distance, and knows. "Generally, a psychiatrist can study a man and make a few reasonable predictions about what the man may do in the future—how he will react to such-and-such a stimulus, how he will behave in such-and-such a situation," Brussel writes. "What I have done is reverse the terms of the prophecy. By studying a man's deeds, I have deduced what kind of man he might be." Look for a middle-aged Slav in a double-breasted suit. Profiling stories aren't Whodunits; they're Hedunits.
In the Hedunit, the profiler does not catch the criminal. That's for local law enforcement. He takes the meeting. Often, he doesn't write down his predictions. It's up to the visiting police officers to take notes. He does not feel the need to involve himself in the subsequent investigation, or even, it turns out, to justify his predictions. Once, Douglas tells us, he drove down to the local police station and offered his services in the case of an elderly woman who had been savagely beaten and sexually assaulted. The detectives working the crime were regular cops, and Douglas was a bureau guy, so you can imagine him perched on the edge of a desk, the others pulling up chairs around him.
" 'Okay,' I said to the detectives. . . . 'Here's what I think,' " Douglas begins. "It's a sixteen- or seventeen-year-old high school kid. . . . He'll be disheveled-looking, he'll have scruffy hair, generally poorly groomed." He went on: a loner, kind of weird, no girlfriend, lots of bottled-up anger. He comes to the old lady's house. He knows she's alone. Maybe he's done odd jobs for her in the past. Douglas continues:
I pause in my narrative and tell them there's someone who meets this description out there. If they can find him, they've got their offender.
One detective looks at another. One of them starts to smile. "Are you a psychic, Douglas?"
"No," I say, "but my job would be a lot easier if I were."
"Because we had a psychic, Beverly Newton, in here a couple of weeks ago, and she said just about the same things."
You might think that Douglas would bridle at that comparison. He is, after all, an agent of the Federal Bureau of Investigation, who studied with Teten, who studied with Brussel. He is an ace profiler, part of a team that restored the F.B.I.'s reputation for crime-fighting, inspired countless movies, television shows, and best-selling thrillers, and brought the modern tools of psychology to bear on the savagery of the criminal mind—and some cop is calling him a psychic. But Douglas doesn't object. Instead, he begins to muse on the ineffable origins of his insights, at which point the question arises of what exactly this mysterious art called profiling is, and whether it can be trusted. Douglas writes,
What I try to do with a case is to take in all the evidence I have to work with . . . and then put myself mentally and emotionally in the head of the offender. I try to think as he does. Exactly how this happens, I'm not sure, any more than the novelists such as Tom Harris who've consulted me over the years can say exactly how their characters come to life. If there's a psychic component to this, I won't run from it.
3.
In the late nineteen-seventies, John Douglas and his F.B.I. colleague Robert Ressler set out to interview the most notorious serial killers in the country. They started in California, since, as Douglas says, "California has always had more than its share of weird and spectacular crimes." On weekends and days off, over the next months, they stopped by one federal prison after another, until they had interviewed thirty-six murderers.
Douglas and Ressler wanted to know whether there was a pattern that connected a killer's life and personality with the nature of his crimes. They were looking for what psychologists would call a homology, an agreement between character and action, and, after comparing what they learned from the killers with what they already knew about the characteristics of their murders, they became convinced that they'd found one.
Serial killers, they concluded, fall into one of two categories. Some crime scenes show evidence of logic and planning. The victim has been hunted and selected, in order to fulfill a specific fantasy. The recruitment of the victim might involve a ruse or a con. The perpetrator maintains control throughout the offense. He takes his time with the victim, carefully enacting his fantasies. He is adaptable and mobile. He almost never leaves a weapon behind. He meticulously conceals the body. Douglas and Ressler, in their respective books, call that kind of crime "organized."
In a "disorganized" crime, the victim isn't chosen logically. She's seemingly picked at random and "blitz-attacked," not stalked and coerced. The killer might grab a steak knife from the kitchen and leave the knife behind. The crime is so sloppily executed that the victim often has a chance to fight back. The crime might take place in a high-risk environment. "Moreover, the disorganized killer has no idea of, or interest in, the personalities of his victims," Ressler writes in "Whoever Fights Monsters." "He does not want to know who they are, and many times takes steps to obliterate their personalities by quickly knocking them unconscious or covering their faces or otherwise disfiguring them."
Each of these styles, the argument goes, corresponds to a personality type. The organized killer is intelligent and articulate. He feels superior to those around him. The disorganized killer is unattractive and has a poor self-image. He often has some kind of disability. He's too strange and withdrawn to be married or have a girlfriend. If he doesn't live alone, he lives with his parents. He has pornography stashed in his closet. If he drives at all, his car is a wreck.
"The crime scene is presumed to reflect the murderer's behavior and personality in much the same way as furnishings reveal the homeowner's character," we're told in a crime manual that Douglas and Ressler helped write. The more they learned, the more precise the associations became. If the victim was white, the killer would be white. If the victim was old, the killer would be sexually immature.
"In our research, we discovered that . . . frequently serial offenders had failed in their efforts to join police departments and had taken jobs in related fields, such as security guard or night watchman," Douglas writes. Given that organized rapists were preoccupied with control, it made sense that they would be fascinated by the social institution that symbolizes control. Out of that insight came another prediction: "One of the things we began saying in some of our profiles was that the UNSUB"—the unknown subject—"would drive a policelike vehicle, say a Ford Crown Victoria or Chevrolet Caprice."
4.
On the surface, the F.B.I.'s system seems extraordinarily useful. Consider a case study widely used in the profiling literature. The body of a twenty-six-year-old special-education teacher was found on the roof of her Bronx apartment building. She was apparently abducted just after she left her house for work, at six-thirty in the morning. She had been beaten beyond recognition, and tied up with her stockings and belt. The killer had mutilated her sexual organs, chopped off her nipples, covered her body with bites, written obscenities across her abdomen, masturbated, and then defecated next to the body.
Let's pretend that we're an F.B.I. profiler. First question: race. The victim is white, so let's call the offender white. Let's say he's in his mid-twenties to early thirties, which is when the thirty-six men in the F.B.I.'s sample started killing. Is the crime organized or disorganized? Disorganized, clearly. It's on a rooftop, in the Bronx, in broad daylight—high risk. So what is the killer doing in the building at six-thirty in the morning? He could be some kind of serviceman, or he could live in the neighborhood. Either way, he appears to be familiar with the building. He's disorganized, though, so he's not stable. If he is employed, it's blue-collar work, at best. He probably has a prior offense, having to do with violence or sex. His relationships with women will be either nonexistent or deeply troubled. And the mutilation and the defecation are so strange that he's probably mentally ill or has some kind of substance-abuse problem. How does that sound? As it turns out, it's spot-on. The killer was Carmine Calabro, age thirty, a single, unemployed, deeply troubled actor who, when he was not in a mental institution, lived with his widowed father on the fourth floor of the building where the murder took place.
But how useful is that profile, really? The police already had Calabro on their list of suspects: if you're looking for the person who killed and mutilated someone on the roof, you don't really need a profiler to tell you to check out the dishevelled, mentally ill guy living with his father on the fourth floor.
That's why the F.B.I.'s profilers have always tried to supplement the basic outlines of the organized/disorganized system with telling details—something that lets the police zero in on a suspect. In the early eighties, Douglas gave a presentation to a roomful of police officers and F.B.I. agents in Marin County about the Trailside Killer, who was murdering female hikers in the hills north of San Francisco. In Douglas's view, the killer was a classic "disorganized" offender—a blitz attacker, white, early to mid-thirties, blue collar, probably with "a history of bed-wetting, fire-starting, and cruelty to animals." Then he went back to how asocial the killer seemed. Why did all the killings take place in heavily wooded areas, miles from the road? Douglas reasoned that the killer required such seclusion because he had some condition that he was deeply self-conscious about. Was it something physical, like a missing limb? But then how could he hike miles into the woods and physically overpower his victims? Finally, it came to him: " 'Another thing,' I added after a pregnant pause, 'the killer will have a speech impediment.' "
And so he did. Now, that's a useful detail. Or is it? Douglas then tells us that he pegged the killer's age as early thirties, and he turned out to be fifty. Detectives use profiles to narrow down the range of suspects. It doesn't do any good to get a specific detail right if you get general details wrong.
In the case of Derrick Todd Lee, the Baton Rouge serial killer, the F.B.I. profile described the offender as a white male blue-collar worker, between twenty-five and thirty-five years old, who "wants to be seen as someone who is attractive and appealing to women." The profile went on, "However, his level of sophistication in interacting with women, especially women who are above him in the social strata, is low. Any contact he has had with women he has found attractive would be described by these women as 'awkward.' " The F.B.I. was right about the killer being a blue-collar male between twenty-five and thirty-five. But Lee turned out to be charming and outgoing, the sort to put on a cowboy hat and snakeskin boots and head for the bars. He was an extrovert with a number of girlfriends and a reputation as a ladies' man. And he wasn't white. He was black.
A profile isn't a test, where you pass if you get most of the answers right. It's a portrait, and all the details have to cohere in some way if the image is to be helpful. In the mid-nineties, the British Home Office analyzed a hundred and eighty-four crimes, to see how many times profiles led to the arrest of a criminal. The profile worked in five of those cases. That's just 2.7 per cent, which makes sense if you consider the position of the detective on the receiving end of a profiler's list of conjectures. Do you believe the stuttering part? Or do you believe the thirty-year-old part? Or do you throw up your hands in frustration?
5.
There is a deeper problem with F.B.I. profiling. Douglas and Ressler didn't interview a representative sample of serial killers to come up with their typology. They talked to whoever happened to be in the neighborhood. Nor did they interview their subjects according to a standardized protocol. They just sat down and chatted, which isn't a particularly firm foundation for a psychological system. So you might wonder whether serial killers can really be categorized by their level of organization.
Not long ago, a group of psychologists at the University of Liverpool decided to test the F.B.I.'s assumptions. First, they made a list of crime-scene characteristics generally considered to show organization: perhaps the victim was alive during the sex acts, or the body was posed in a certain way, or the murder weapon was missing, or the body was concealed, or torture and restraints were involved. Then they made a list of characteristics showing disorganization: perhaps the victim was beaten, the body was left in an isolated spot, the victim's belongings were scattered, or the murder weapon was improvised.
If the F.B.I. was right, they reasoned, the crime-scene details on each of those two lists should "co-occur"—that is, if you see one or more organized traits in a crime, there should be a reasonably high probability of seeing other organized traits. When they looked at a sample of a hundred serial crimes, however, they couldn't find any support for the F.B.I.'s distinction. Crimes don't fall into one camp or the other. It turns out that they're almost always a mixture of a few key organized traits and a random array of disorganized traits. Laurence Alison, one of the leaders of the Liverpool group and the author of "The Forensic Psychologist's Casebook," told me, "The whole business is a lot more complicated than the F.B.I. imagines."
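To make that test concrete, here is a rough sketch, in Python, of the kind of co-occurrence check the Liverpool result implies; the trait labels and the three toy crime records are invented for illustration and are not the study's actual method or data:

from itertools import combinations

# Hypothetical "organized" crime-scene traits (illustrative labels only).
ORGANIZED = {"body_posed", "weapon_missing", "body_concealed", "restraints_used"}

def pairwise_cooccurrence(crimes, traits):
    """For each pair of traits, the share of crimes showing either trait
    that show both. If the organized/disorganized typology held, pairs of
    organized traits should score high; the Liverpool finding is that
    crimes mix organized and disorganized features instead."""
    rates = {}
    for a, b in combinations(sorted(traits), 2):
        either = [c for c in crimes if a in c or b in c]
        both = [c for c in either if a in c and b in c]
        rates[(a, b)] = len(both) / len(either) if either else 0.0
    return rates

# Each crime is recorded as the set of traits observed at the scene.
crimes = [
    {"body_posed", "weapon_missing", "victim_beaten"},
    {"body_concealed", "belongings_scattered"},
    {"restraints_used", "improvised_weapon"},
]
print(pairwise_cooccurrence(crimes, ORGANIZED))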
Alison and another of his colleagues also looked at homology. If Douglas was right, then a certain kind of crime should correspond to a certain kind of criminal. So the Liverpool group selected a hundred stranger rapes in the United Kingdom, classifying them according to twenty-eight variables, such as whether a disguise was worn, whether compliments were given, whether there was binding, gagging, or blindfolding, whether there was apologizing or the theft of personal property, and so on. They then looked at whether the patterns in the crimes corresponded to attributes of the criminals—like age, type of employment, ethnicity, level of education, marital status, number of prior convictions, type of prior convictions, and drug use. Were rapists who bind, gag, and blindfold more like one another than they were like rapists who, say, compliment and apologize? The answer is no—not even slightly.
"The fact is that different offenders can exhibit the same behaviors for completely different reasons," Brent Turvey, a forensic scientist who has been highly critical of the F.B.I.'s approach, says. "You've got a rapist who attacks a woman in the park and pulls her shirt up over her face. Why? What does that mean? There are ten different things it could mean. It could mean he ''t want to see her. It could mean he doesn't want her to see him. It could mean he wants to see her breasts, he wants to imagine someone else, he wants to incapacitate her arms—all of those are possibilities. You can't just look at one behavior in isolation."
A few years ago, Alison went back to the case of the teacher who was murdered on the roof of her building in the Bronx. He wanted to know why, if the F.B.I.'s approach to criminal profiling was based on such simplistic psychology, it continues to have such a sterling reputation. The answer, he suspected, lay in the way the profiles were written, and, sure enough, when he broke down the rooftop-killer analysis, sentence by sentence, he found that it was so full of unverifiable and contradictory and ambiguous language that it could support virtually any interpretation.
Astrologers and psychics have known these tricks for years. The magician Ian Rowland, in his classic "The Full Facts Book of Cold Reading," itemizes them one by one, in what could easily serve as a manual for the beginner profiler. First is the Rainbow Ruse—the "statement which credits the client with both a personality trait and its opposite." ("I would say that on the whole you can be rather a quiet, self effacing type, but when the circumstances are right, you can be quite the life and soul of the party if the mood strikes you.") The Jacques Statement, named for the character in "As You Like It" who gives the Seven Ages of Man speech, tailors the prediction to the age of the subject. To someone in his late thirties or early forties, for example, the psychic says, "If you are honest about it, you often get to wondering what happened to all those dreams you had when you were younger." There is the Barnum Statement, the assertion so general that anyone would agree, and the Fuzzy Fact, the seemingly factual statement couched in a way that "leaves plenty of scope to be developed into something more specific." ("I can see a connection with Europe, possibly Britain, or it could be the warmer, Mediterranean part?") And that's only the start: there is the Greener Grass technique, the Diverted Question, the Russian Doll, Sugar Lumps, not to mention Forking and the Good Chance Guess—all of which, when put together in skillful combination, can convince even the most skeptical observer that he or she is in the presence of real insight.
"Moving on to career matters, you don't work with children, do you?" Rowland will ask his subjects, in an example of what he dubs the "Vanishing Negative."
No, I don't.
"No, I thought not. That's not really your role."
Of course, if the subject answers differently, there's another way to play the question: "Moving on to career matters, you don't work with children, do you?"
I do, actually, part time.
"Yes, I thought so."
After Alison had analyzed the rooftop-killer profile, he decided to play a version of the cold-reading game. He gave the details of the crime, the profile prepared by the F.B.I., and a description of the offender to a group of senior police officers and forensic professionals in England. How did they find the profile? Highly accurate. Then Alison gave the same packet of case materials to another group of police officers, but this time he invented an imaginary offender, one who was altogether different from Calabro. The new killer was thirty-seven years old. He was an alcoholic. He had recently been laid off from his job with the water board, and had met the victim before on one of his rounds. What's more, Alison claimed, he had a history of violent relationships with women, and prior convictions for assault and burglary. How accurate did a group of experienced police officers find the F.B.I.'s profile when it was matched with the phony offender? Every bit as accurate as when it was matched to the real offender.
James Brussel didn't really see the Mad Bomber in that pile of pictures and photostats, then. That was an illusion. As the literary scholar Donald Foster pointed out in his 2000 book "Author Unknown," Brussel cleaned up his predictions for his memoirs. He actually told the police to look for the bomber in White Plains, sending the N.Y.P.D.'s bomb unit on a wild goose chase in Westchester County, sifting through local records. Brussel also told the police to look for a man with a facial scar, which Metesky didn't have. He told them to look for a man with a night job, and Metesky had been largely unemployed since leaving Con Edison in 1931. He told them to look for someone between forty and fifty, and Metesky was over fifty. He told them to look for someone who was an "expert in civil or military ordnance" and the closest Metesky came to that was a brief stint in a machine shop. And Brussel, despite what he wrote in his memoir, never said that the Bomber would be a Slav. He actually told the police to look for a man "born and educated in Germany," a prediction so far off the mark that the Mad Bomber himself was moved to object. At the height of the police investigation, when the New York Journal American offered to print any communications from the Mad Bomber, Metesky wrote in huffily to say that "the nearest to my being 'Teutonic' is that my father boarded a liner in Hamburg for passage to this country—about sixty-five years ago."
The true hero of the case wasn't Brussel; it was a woman named Alice Kelly, who had been assigned to go through Con Edison's personnel files. In January, 1957, she ran across an employee complaint from the early nineteen-thirties: a generator wiper at the Hell Gate plant had been knocked down by a backdraft of hot gases. The worker said that he was injured. The company said that he wasn't. And in the flood of angry letters from the ex-employee Kelly spotted a threat—to "take justice in my own hands"—that had appeared in one of the Mad Bomber's letters. The name on the file was George Metesky.
Brussel did not really understand the mind of the Mad Bomber. He seems to have understood only that, if you make a great number of predictions, the ones that were wrong will soon be forgotten, and the ones that turn out to be true will make you famous. The Hedunit is not a triumph of forensic analysis. It's a party trick.
6.
"Here's where I'm at with this guy," Douglas said, kicking off the profiling session with which "Inside the Mind of BTK" begins. It was 1984. The killer was still at large. Douglas, Hazelwood, and Walker and the two detectives from Wichita were all seated around the oak table. Douglas took off his suit jacket and draped it over his chair. "Back when he started in 1974, he was in his mid-to-late twenties," Douglas began. "It's now ten years later, so that would put him in his mid-to-late thirties."
It was Walker's turn: BTK had never engaged in any sexual penetration. That suggested to him someone with an "inadequate, immature sexual history." He would have a "lone-wolf type of personality. But he's not alone because he's shunned by others—it's because he chooses to be alone. . . . He can function in social settings, but only on the surface. He may have women friends he can talk to, but he'd feel very inadequate with a peer-group female." Hazelwood was next. BTK would be "heavily into masturbation." He went on, "Women who have had sex with this guy would describe him as aloof, uninvolved, the type who is more interested in her servicing him than the other way around."
Douglas followed his lead. "The women he's been with are either many years younger, very naïve, or much older and depend on him as their meal ticket," he ventured. What's more, the profilers determined, BTK would drive a "decent" automobile, but it would be "nondescript."
At this point, the insights began piling on. Douglas said he'd been thinking that BTK was married. But now maybe he was thinking he was divorced. He speculated that BTK was lower middle class, probably living in a rental. Walker felt BTK was in a "lower-paying white collar job, as opposed to blue collar." Hazelwood saw him as "middle class" and "articulate." The consensus was that his I.Q. was somewhere between 105 and 145. Douglas wondered whether he was connected with the military. Hazelwood called him a "now" person, who needed "instant gratification."
Walker said that those who knew him "might say they remember him, but didn't really know much about him." Douglas then had a flash—"It was a sense, almost a knowing"—and said, "I wouldn't be surprised if, in the job he's in today, that he's wearing some sort of uniform. . . . This guy isn't mental. But he is crazy like a fox."
They had been at it for almost six hours. The best minds in the F.B.I. had given the Wichita detectives a blueprint for their investigation. Look for an American male with a possible connection to the military. His I.Q. will be above 105. He will like to masturbate, and will be aloof and selfish in bed. He will drive a decent car. He will be a "now" person. He won't be comfortable with women. But he may have women friends. He will be a lone wolf. But he will be able to function in social settings. He won't be unmemorable. But he will be unknowable. He will be either never married, divorced, or married, and if he was or is married his wife will be younger or older. He may or may not live in a rental, and might be lower class, upper lower class, lower middle class or middle class. And he will be crazy like a fox, as opposed to being mental. If you're keeping score, that's a Jacques Statement, two Barnum Statements, four Rainbow Ruses, a Good Chance Guess, two predictions that aren't really predictions because they could never be verified—and nothing even close to the salient fact that BTK was a pillar of his community, the president of his church and the married father of two.
"This thing is solvable," Douglas told the detectives, as he stood up and put on his jacket. "Feel free to pick up the phone and call us if we can be of any further assistance." You can imagine him taking the time for an encouraging smile and a slap on the back. "You're gonna nail this guy."
None of the Above
December 17, 2007
Books
What I.Q. doesn't tell you about race.
If what I.Q. tests measure is immutable and innate, what explains the Flynn effect—the steady rise in scores across generations?
1.
One Saturday in November of 1984, James Flynn, a social scientist at the University of Otago, in New Zealand, received a large package in the mail. It was from a colleague in Utrecht, and it contained the results of I.Q. tests given to two generations of Dutch eighteen-year-olds. When Flynn looked through the data, he found something puzzling. The Dutch eighteen-year-olds from the nineteen-eighties scored better than those who took the same tests in the nineteen-fifties—and not just slightly better, much better.
Curious, Flynn sent out some letters. He collected intelligence-test results from Europe, from North America, from Asia, and from the developing world, until he had data for almost thirty countries. In every case, the story was pretty much the same. I.Q.s around the world appeared to be rising by 0.3 points per year, or three points per decade, for as far back as the tests had been administered. For some reason, human beings seemed to be getting smarter.
Flynn has been writing about the implications of his findings—now known as the Flynn effect—for almost twenty-five years. His books consist of a series of plainly stated statistical observations, in support of deceptively modest conclusions, and the evidence in support of his original observation is now so overwhelming that the Flynn effect has moved from theory to fact. What remains uncertain is how to make sense of the Flynn effect. If an American born in the nineteen-thirties has an I.Q. of 100, the Flynn effect says that his children will have I.Q.s of 108, and his grandchildren I.Q.s of close to 120—more than a standard deviation higher. If we work in the opposite direction, the typical teen-ager of today, with an I.Q. of 100, would have had grandparents with average I.Q.s of 82—seemingly below the threshold necessary to graduate from high school. And, if we go back even farther, the Flynn effect puts the average I.Q.s of the schoolchildren of 1900 at around 70, which is to suggest, bizarrely, that a century ago the United States was populated largely by people who today would be considered mentally retarded.
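The generational arithmetic is easy to reproduce. Here is a minimal sketch in Python, assuming the roughly 0.3-points-per-year drift described above; the sixty- and hundred-year gaps are illustrative round numbers, not Flynn's exact cohorts:

RATE = 0.3  # assumed Flynn-effect drift, in I.Q. points per year

def score_on_later_norms(score, years_later):
    """What a given raw performance earns when measured against norms set
    `years_later` in the future: later norms are harder, so the same
    performance receives a lower score."""
    return score - RATE * years_later

# A 1930s American scoring 100 by the standards of his own day, re-scored
# against norms from roughly sixty years later:
print(score_on_later_norms(100, 60))    # -> 82.0
# Schoolchildren of 1900, re-scored against norms from a century later:
print(score_on_later_norms(100, 100))   # -> 70.0
# Read the other way, the next generation (about twenty-seven years on)
# gains roughly eight points: 100 + 0.3 * 27 is about 108.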
2.
For almost as long as there have been I.Q. tests, there have been I.Q. fundamentalists. H. H. Goddard, in the early years of the past century, established the idea that intelligence could be measured along a single, linear scale. One of his particular contributions was to coin the word "moron." "The people who are doing the drudgery are, as a rule, in their proper places," he wrote. Goddard was followed by Lewis Terman, in the nineteen-twenties, who rounded up the California children with the highest I.Q.s, and confidently predicted that they would sit at the top of every profession. In 1969, the psychometrician Arthur Jensen argued that programs like Head Start, which tried to boost the academic performance of minority children, were doomed to failure, because I.Q. was so heavily genetic; and in 1994, Richard Herrnstein and Charles Murray published their bestselling hereditarian primer "The Bell Curve," which argued that blacks were innately inferior in intelligence to whites. To the I.Q. fundamentalist, two things are beyond dispute: first, that I.Q. tests measure some hard and identifiable trait that predicts the quality of our thinking; and, second, that this trait is stable—that is, it is determined by our genes and largely impervious to environmental influences.
This is what James Watson, the co-discoverer of DNA, meant when he told an English newspaper recently that he was "inherently gloomy" about the prospects for Africa. From the perspective of an I.Q. fundamentalist, the fact that Africans score lower than Europeans on I.Q. tests suggests an ineradicable cognitive disability. In the controversy that followed, Watson was defended by the journalist William Saletan, in a three-part series for the online magazine Slate. Drawing heavily on the work of J. Philippe Rushton—a psychologist who specializes in comparing the circumference of what he calls the Negroid brain with the length of the Negroid penis—Saletan took the fundamentalist position to its logical conclusion. To erase the difference between blacks and whites, Saletan wrote, would probably require vigorous interbreeding between the races, or some kind of corrective genetic engineering aimed at upgrading African stock. "Economic and cultural theories have failed to explain most of the pattern," Saletan declared, claiming to have been "soaking [his] head in each side's computations and arguments." One argument that Saletan never soaked his head in, however, was Flynn's, because what Flynn discovered in his mailbox upsets the certainties upon which I.Q. fundamentalism rests. If whatever the thing is that I.Q. tests measure can jump so much in a generation, it can't be all that immutable and it doesn't look all that innate.
The very fact that average I.Q.s shift over time ought to create a "crisis of confidence," Flynn writes in "What Is Intelligence?" (Cambridge; $22), his latest attempt to puzzle through the implications of his discovery. "How could such huge gains be intelligence gains? Either the children of today were far brighter than their parents or, at least in some circumstances, I.Q. tests were not good measures of intelligence."
3.
The best way to understand why I.Q.s rise, Flynn argues, is to look at one of the most widely used I.Q. tests, the so-called WISC (for Wechsler Intelligence Scale for Children). The WISC is composed of ten subtests, each of which measures a different aspect of I.Q. Flynn points out that scores in some of the categories—those measuring general knowledge, say, or vocabulary or the ability to do basic arithmetic—have risen only modestly over time. The big gains on the WISC are largely in the category known as "similarities," where you get questions such as "In what way are 'dogs' and 'rabbits' alike?" Today, we tend to give what, for the purposes of I.Q. tests, is the right answer: dogs and rabbits are both mammals. A nineteenth-century American would have said that "you use dogs to hunt rabbits."
"If the everyday world is your cognitive home, it is not natural to detach abstractions and logic and the hypothetical from their concrete referents," Flynn writes. Our great-grandparents may have been perfectly intelligent. But they would have done poorly on I.Q. tests because they did not participate in the twentieth century's great cognitive revolution, in which we learned to sort experience according to a new set of abstract categories. In Flynn's phrase, we have now had to put on "scientific spectacles," which enable us to make sense of the WISC questions about similarities. To say that Dutch I.Q. scores rose substantially between 1952 and 1982 was another way of saying that the Netherlands in 1982 was, in at least certain respects, much more cognitively demanding than the Netherlands in 1952. An I.Q., in other words, measures not so much how smart we are as how modern we are.
This is a critical distinction. When the children of Southern Italian immigrants were given I.Q. tests in the early part of the past century, for example, they recorded median scores in the high seventies and low eighties, a full standard deviation below their American and Western European counterparts. Southern Italians did as poorly on I.Q. tests as Hispanics and blacks did. As you can imagine, there was much concerned talk at the time about the genetic inferiority of Italian stock, of the inadvisability of letting so many second-class immigrants into the United States, and of the squalor that seemed endemic to Italian urban neighborhoods. Sound familiar? These days, when talk turns to the supposed genetic differences in the intelligence of certain races, Southern Italians have disappeared from the discussion. "Did their genes begin to mutate somewhere in the 1930s?" the psychologists Seymour Sarason and John Doris ask, in their account of the Italian experience. "Or is it possible that somewhere in the 1920s, if not earlier, the sociocultural history of Italo-Americans took a turn from the blacks and the Spanish Americans which permitted their assimilation into the general undifferentiated mass of Americans?"
The psychologist Michael Cole and some colleagues once gave members of the Kpelle tribe, in Liberia, a version of the WISC similarities test: they took a basket of food, tools, containers, and clothing and asked the tribesmen to sort them into appropriate categories. To the frustration of the researchers, the Kpelle chose functional pairings. They put a potato and a knife together because a knife is used to cut a potato. "A wise man could only do such-and-such," they explained. Finally, the researchers asked, "How would a fool do it?" The tribesmen immediately re-sorted the items into the "right" categories. It can be argued that taxonomical categories are a developmental improvement—that is, that the Kpelle would be more likely to advance, technologically and scientifically, if they started to see the world that way. But to label them less intelligent than Westerners, on the basis of their performance on that test, is merely to state that they have different cognitive preferences and habits. And if I.Q. varies with habits of mind, which can be adopted or discarded in a generation, what, exactly, is all the fuss about?
When I was growing up, my family would sometimes play Twenty Questions on long car trips. My father was one of those people who insist that the standard categories of animal, vegetable, and mineral be supplemented with a fourth category: "abstract." Abstract could mean something like "whatever it was that was going through my mind when we drove past the water tower fifty miles back." That abstract category sounds absurdly difficult, but it wasn't: it merely required that we ask a slightly different set of questions and grasp a slightly different set of conventions, and, after two or three rounds of practice, guessing the contents of someone's mind fifty miles ago becomes as easy as guessing Winston Churchill. (There is one exception. That was the trip on which my old roommate Tom Connell chose, as an abstraction, "the Unknown Soldier"—which allowed him legitimately and gleefully to answer "I have no idea" to almost every question. There were four of us playing. We gave up after an hour.) Flynn would say that my father was teaching his three sons how to put on scientific spectacles, and that extra practice probably bumped up all of our I.Q.s a few notches. But let's be clear about what this means. There's a world of difference between an I.Q. advantage that's genetic and one that depends on extended car time with Graham Gladwell.
4.
Flynn is a cautious and careful writer. Unlike many others in the I.Q. debates, he resists grand philosophizing. He comes back again and again to the fact that I.Q. scores are generated by paper-and-pencil tests—and making sense of those scores, he tells us, is a messy and complicated business that requires something closer to the skills of an accountant than to those of a philosopher.
For instance, Flynn shows what happens when we recognize that I.Q. is not a freestanding number but a value attached to a specific time and a specific test. When an I.Q. test is created, he reminds us, it is calibrated or "normed" so that the test-takers in the fiftieth percentile—those exactly at the median—are assigned a score of 100. But since I.Q.s are always rising, the only way to keep that hundred-point benchmark is periodically to make the tests more difficult—to "renorm" them. The original WISC was normed in the late nineteen-forties. It was then renormed in the early nineteen-seventies, as the WISC-R; renormed a third time in the late eighties, as the WISC III; and renormed again a few years ago, as the WISC IV—with each version just a little harder than its predecessor. The notion that anyone "has" an I.Q. of a certain number, then, is meaningless unless you know which WISC he took, and when he took it, since there's a substantial difference between getting a 130 on the WISC IV and getting a 130 on the much easier WISC.
This is not a trivial issue. I.Q. tests are used to diagnose people as mentally retarded, with a score of 70 generally taken to be the cutoff. You can imagine how the Flynn effect plays havoc with that system. In the nineteen-seventies and eighties, most states used the WISC-R to make their mental-retardation diagnoses. But since kids—even kids with disabilities—score a little higher every year, the number of children whose scores fell below 70 declined steadily through the end of the eighties. Then, in 1991, the WISC III was introduced, and suddenly the percentage of kids labelled retarded went up. The psychologists Tomoe Kanaya, Matthew Scullin, and Stephen Ceci estimated that, if every state had switched to the WISC III right away, the number of Americans labelled mentally retarded should have doubled.
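A minimal sketch, in Python, of the mechanics behind that jump (the figures are illustrative assumptions, not the numbers from the Kanaya, Scullin, and Ceci study): on an aging test the whole population drifts upward relative to the old norms, so fewer children fall below a fixed cutoff of 70, and a freshly renormed test snaps the distribution back down, so the count rebounds.

from statistics import NormalDist

def share_below_cutoff(years_since_norming, cutoff=70, drift_per_year=0.3):
    """Share of test-takers scoring below `cutoff` on a test normed
    `years_since_norming` years ago, assuming scores are normal with
    standard deviation 15 and that the population mean, set to 100 at
    norming time, drifts upward as the test ages."""
    current_mean = 100 + drift_per_year * years_since_norming
    return NormalDist(mu=current_mean, sigma=15).cdf(cutoff)

print(share_below_cutoff(0))    # freshly normed test: about 2.3% score below 70
print(share_below_cutoff(20))   # twenty-year-old norms: only about 0.8% do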
That is an extraordinary number. The diagnosis of mental disability is one of the most stigmatizing of all educational and occupational classifications—and yet, apparently, the chances of being burdened with that label are in no small degree a function of the point, in the life cycle of the WISC, at which a child happens to sit for his evaluation. "As far as I can determine, no clinical or school psychologists using the WISC over the relevant 25 years noticed that its criterion of mental retardation became more lenient over time," Flynn wrote, in a 2000 paper. "Yet no one drew the obvious moral about psychologists in the field: They simply were not making any systematic assessment of the I.Q. criterion for mental retardation."
Flynn brings a similar precision to the question of whether Asians have a genetic advantage in I.Q., a possibility that has led to great excitement among I.Q. fundamentalists in recent years. Data showing that the Japanese had higher I.Q.s than people of European descent, for example, prompted the British psychometrician and eugenicist Richard Lynn to concoct an elaborate evolutionary explanation involving the Himalayas, really cold weather, premodern hunting practices, brain size, and specialized vowel sounds. The fact that the I.Q.s of Chinese-Americans also seemed to be elevated has led I.Q. fundamentalists to posit the existence of an international I.Q. pyramid, with Asians at the top, European whites next, and Hispanics and blacks at the bottom.
Here was a question tailor-made for James Flynn's accounting skills. He looked first at Lynn's data, and realized that the comparison was skewed. Lynn was comparing American I.Q. estimates based on a representative sample of schoolchildren with Japanese estimates based on an upper-income, heavily urban sample. Recalculated, the Japanese average came in not at 106.6 but at 99.2. Then Flynn turned his attention to the Chinese-American estimates. They turned out to be based on a 1975 study in San Francisco's Chinatown using something called the Lorge-Thorndike Intelligence Test. But the Lorge-Thorndike test was normed in the nineteen-fifties. For children in the nineteen-seventies, it would have been a piece of cake. When the Chinese-American scores were reassessed using up-to-date intelligence metrics, Flynn found, they came in at 97 verbal and 100 nonverbal. Chinese-Americans had slightly lower I.Q.s than white Americans.
The Asian-American success story had suddenly been turned on its head. The numbers now suggested, Flynn said, that they had succeeded not because of their higher I.Q.s. but despite their lower I.Q.s. Asians were overachievers. In a nifty piece of statistical analysis, Flynn then worked out just how great that overachievement was. Among whites, virtually everyone who joins the ranks of the managerial, professional, and technical occupations has an I.Q. of 97 or above. Among Chinese-Americans, that threshold is 90. A Chinese-American with an I.Q. of 90, it would appear, does as much with it as a white American with an I.Q. of 97.
There should be no great mystery about Asian achievement. It has to do with hard work and dedication to higher education, and belonging to a culture that stresses professional success. But Flynn makes one more observation. The children of that first successful wave of Asian-Americans really did have I.Q.s that were higher than everyone else's—coming in somewhere around 103. Having worked their way into the upper reaches of the occupational scale, and taken note of how much the professions value abstract thinking, Asian-American parents have evidently made sure that their own children wore scientific spectacles. "Chinese Americans are an ethnic group for whom high achievement preceded high I.Q. rather than the reverse," Flynn concludes, reminding us that in our discussions of the relationship between I.Q. and success we often confuse causes and effects. "It is not easy to view the history of their achievements without emotion," he writes. That is exactly right. To ascribe Asian success to some abstract number is to trivialize it.
5.
Two weeks ago, Flynn came to Manhattan to debate Charles Murray at a forum sponsored by the Manhattan Institute. Their subject was the black-white I.Q. gap in America. During the twenty-five years after the Second World War, that gap closed considerably. The I.Q.s of white Americans rose, as part of the general worldwide Flynn effect, but the I.Q.s of black Americans rose faster. Then, for a period of about twenty-five years, that trend stalled—and the question was why.
Murray showed a series of PowerPoint slides, each representing different statistical formulations of the I.Q. gap. He appeared to be pessimistic that the racial difference would narrow in the future. "By the nineteen-seventies, you had gotten most of the juice out of the environment that you were going to get," he said. That gap, he seemed to think, reflected some inherent difference between the races. "Starting in the nineteen-seventies, to put it very crudely, you had a higher proportion of black kids being born to really dumb mothers," he said. When the debate's moderator, Jane Waldfogel, informed him that the most recent data showed that the race gap had begun to close again, Murray seemed unimpressed, as if the possibility that blacks could ever make further progress was inconceivable.
Flynn took a different approach. The black-white gap, he pointed out, differs dramatically by age. He noted that the tests we have for measuring the cognitive functioning of infants, though admittedly crude, show the races to be almost the same. By age four, the average black I.Q. is 95.4—only four and a half points behind the average white I.Q. Then the real gap emerges: from age four through twenty-four, blacks lose six-tenths of a point a year, until their scores settle at 83.4.
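(The arithmetic behind those figures: starting at 95.4 at age four and losing six-tenths of a point a year over the next twenty years gives 95.4 - 0.6 × 20 = 83.4.)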
That steady decline, Flynn said, did not resemble the usual pattern of genetic influence. Instead, it was exactly what you would expect, given the disparate cognitive environments that whites and blacks encounter as they grow older. Black children are more likely to be raised in single-parent homes than are white children—and single-parent homes are less cognitively complex than two-parent homes. The average I.Q. of first-grade students in schools that blacks attend is 95, which means that "kids who want to be above average don't have to aim as high." There were possibly adverse differences between black teen-age culture and white teen-age culture, and an enormous number of young black men are in jail—which is hardly the kind of environment in which someone would learn to put on scientific spectacles.
Flynn then talked about what we've learned from studies of adoption and mixed-race children—and that evidence didn't fit a genetic model, either. If I.Q. is innate, it shouldn't make a difference whether it's a mixed-race child's mother or father who is black. But it does: children with a white mother and a black father have an eight-point I.Q. advantage over those with a black mother and a white father. And it shouldn't make much of a difference where a mixed-race child is born. But, again, it does: the children fathered by black American G.I.s in postwar Germany and brought up by their German mothers have the same I.Q.s as the children of white American G.I.s and German mothers. The difference, in that case, was not the fact of the children's blackness, as a fundamentalist would say. It was the fact of their Germanness—of their being brought up in a different culture, under different circumstances. "The mind is much more like a muscle than we've ever realized," Flynn said. "It needs to get cognitive exercise. It's not some piece of clay on which you put an indelible mark." The lesson to be drawn from black and white differences was the same as the lesson from the Netherlands years ago: I.Q. measures not just the quality of a person's mind but the quality of the world that person lives in.
In the Air
May 12, 2008
Annals of Innovation
Who says big ideas are rare?
1.
Nathan Myhrvold met Jack Horner on the set of the "Jurassic Park" sequel in 1996. Horner is an eminent paleontologist, and was a consultant on the movie. Myhrvold was there because he really likes dinosaurs. Between takes, the two men got to talking, and Horner asked Myhrvold if he was interested in funding dinosaur expeditions.
Myhrvold is of Nordic extraction, and he looks every bit the bearded, fair-haired Viking—not so much the tall, ferocious kind who raped and pillaged as the impish, roly-poly kind who stayed home by the fjords trying to turn lead into gold. He is gregarious, enthusiastic, and nerdy on an epic scale. He graduated from high school at fourteen. He started Microsoft's research division, leaving, in 1999, with hundreds of millions. He is obsessed with aperiodic tile patterns. (Imagine a floor tiled in a pattern that never repeats.) When Myhrvold built his own house, on the shores of Lake Washington, outside Seattle—a vast, silvery hypermodernist structure described by his wife as the place in the sci-fi movie where the aliens live—he embedded some sixty aperiodic patterns in the walls, floors, and ceilings. His front garden is planted entirely with vegetation from the Mesozoic era. ("If the 'Jurassic Park' thing happens," he says, "this is where the dinosaurs will come to eat.") One of the scholarly achievements he is proudest of is a paper he co-wrote proving that it was theoretically possible for sauropods—his favorite kind of dinosaur—to have snapped their tails back and forth faster than the speed of sound. How could he say no to the great Jack Horner?
"What you do on a dinosaur expedition is you hike and look at the ground," Myhrvold explains. "You find bones sticking out of the dirt and, once you see something, you dig." In Montana, which is prime dinosaur country, people had been hiking around and looking for bones for at least a hundred years. But Horner wanted to keep trying. So he and Myhrvold put together a number of teams, totalling as many as fifty people. They crossed the Fort Peck reservoir in boats, and began to explore the Montana badlands in earnest. They went out for weeks at a time, several times a year. They flew equipment in on helicopters. They mapped the full dinosaur ecology—bringing in specialists from other disciplines. And they found dinosaur bones by the truckload.
Once, a team member came across a bone sticking out from the bottom of a recently eroded cliff. It took Horner's field crew three summers to dig it out, and when they broke the bone open a black, gooey substance trickled out—a discovery that led Myhrvold and his friend Lowell Wood on a twenty-minute digression at dinner one night about how, given enough goo and a sufficient number of chicken embryos, they could "make another one."
There was also Myhrvold's own find: a line of vertebrae, as big as apples, just lying on the ground in front of him. "It was seven years ago. It was a bunch of bones from a fairly rare dinosaur called a thescelosaurus. I said, 'Oh, my God!' I was walking with Jack and my son. Then Jack said, 'Look, there's a bone in the side of the hill.' And we look at it, and it's a piece of a jawbone with a tooth the size of a banana. It was a T. rex skull. There was nothing else it could possibly be."
People weren't finding dinosaur bones, and they assumed that it was because they were rare. But—and almost everything that Myhrvold has been up to during the past half decade follows from this fact—it was our fault. We didn't look hard enough.
Myhrvold gave the skeleton to the Smithsonian. It's called the N. rex. "Our expeditions have found more T. rex than anyone else in the world," Myhrvold said. "From 1909 to 1999, the world found eighteen T. rex specimens. From 1999 until now, we've found nine more." Myhrvold has the kind of laugh that scatters pigeons. "We have dominant T. rex market share."
2.
In 1874, Alexander Graham Bell spent the summer with his parents in Brantford, Ontario. He was twenty-seven years old, and employed as a speech therapist in Boston. But his real interest was solving the puzzle of what he then called the "harmonic telegraph." In Boston, he had tinkered obsessively with tuning forks and electromagnetic coils, often staying up all night when he was in the grip of an idea. When he went to Brantford, he brought with him an actual human ear, taken from a cadaver and preserved, to which he attached a pen, so that he could record the vibration of the ear's bones when he spoke into it.
One day, Bell went for a walk on a bluff overlooking the Grand River, near his parents' house. In a recent biography of Bell, "Reluctant Genius," Charlotte Gray writes:
A large tree had blown down here, creating a natural and completely private belvedere, which [he] had dubbed his "dreaming place." Slouched on a wicker chair, his hands in his pockets, he stared unseeing at the swiftly flowing river below him. Far from the bustle of Boston and the pressure of competition from other eager inventors, he mulled over everything he had discovered about sound.
In that moment, Bell knew the answer to the puzzle of the harmonic telegraph. Electric currents could convey sound along a wire if they undulated in accordance with the sound waves. Back in Boston, he hired a research assistant, Thomas Watson. He turned his attic into a laboratory, and redoubled his efforts. Then, on March 10, 1876, he set up one end of his crude prototype in his bedroom, and had Watson take the other end to the room next door. Bell, always prone to clumsiness, spilled acid on his clothes. "Mr. Watson, come here," he cried out. Watson came —but only because he had heard Bell on the receiver, plain as day. The telephone was born.
In 1999, when Nathan Myhrvold left Microsoft and struck out on his own, he set himself an unusual goal. He wanted to see whether the kind of insight that leads to invention could be engineered. He formed a company called Intellectual Ventures. He raised hundreds of millions of dollars. He hired the smartest people he knew. It was not a venture-capital firm. Venture capitalists fund insights—that is, they let the magical process that generates new ideas take its course, and then they jump in. Myhrvold wanted to make insights—to come up with ideas, patent them, and then license them to interested companies. He thought that if he brought lots of very clever people together he could reconstruct that moment by the Grand River.
One rainy day last November, Myhrvold held an "invention session," as he calls such meetings, on the technology of self-assembly. What if it was possible to break a complex piece of machinery into a thousand pieces and then, at some predetermined moment, have the machine put itself back together again? That had to be useful. But for what?
The meeting, like many of Myhrvold's sessions, was held in a conference room in the Intellectual Ventures laboratory, a big warehouse in an industrial park across Lake Washington from Seattle: plasma TV screens on the walls, a long table furnished with bottles of Diet Pepsi and big bowls of cashews.
Chairing the meeting was Casey Tegreene, an electrical engineer with a law degree, who is the chief patent counsel for I.V. He stood at one end of the table. Myhrvold was at the opposite end. Next to him was Edward Jung, whom Myhrvold met at Microsoft. Jung is lean and sleek, with closely cropped fine black hair. Once, he spent twenty-two days walking across Texas with nothing but a bedroll, a flashlight, and a rifle, from Big Bend, in the west, to Houston, where he was going to deliver a paper at a biology conference. On the other side of the table from Jung was Lowell Wood, an imposing man with graying red hair and an enormous head. Three or four pens were crammed into his shirt pocket. The screen saver on his laptop was a picture of Stonehenge.
"You know how musicians will say, 'My teacher was So-and-So, and his teacher was So-and-So,' right back to Beethoven?" Myhrvold says. "So Lowell was the great protégé of Edward Teller. He was at Lawrence Livermore. He was the technical director of Star Wars." Myhrvold and Wood have known each other since Myhrvold was a teen-ager and Wood interviewed him for a graduate fellowship called the Hertz. "If you want to know what Nathan was like at that age," Wood said, "look at that ball of fire now and scale that up by eight or ten decibels." Wood bent the rules for Myhrvold; the Hertz was supposed to be for research in real-world problems. Myhrvold's field at that point, quantum cosmology, involved the application of quantum mechanics to the period just after the big bang, which means, as Myhrvold likes to say, that he had no interest in the universe a microsecond after its creation.
The chairman of the chemistry department at Stanford, Richard Zare, had flown in for the day, as had Eric Leuthardt, a young neurosurgeon from Washington University, in St. Louis, who is a regular at I.V. sessions. At the back was a sombre, bearded man named Rod Hyde, who had been Wood's protégé at Lawrence Livermore.
Tegreene began. "There really aren't any rules," he told everyone. "We may start out talking about refined plastics and end up talking about shoes, and that's O.K."
He started in on the "prep." In the previous weeks, he and his staff had reviewed the relevant scientific literature and recent patent filings in order to come up with a short briefing on what was and wasn't known about self-assembly. A short BBC documentary was shown, on the early work of the scientist Lionel Penrose. Richard Zare passed around a set of what looked like ceramic dice. Leuthardt drew elaborate diagrams of the spine on the blackboard. Self-assembly was very useful in eye-of-the-needle problems—in cases where you had to get something very large through a very small hole—and Leuthardt wondered if it might be helpful in minimally invasive surgery.
The conversation went in fits and starts. "I'm asking a simple question and getting a long-winded answer," Jung said at one point, quietly. Wood played the role of devil's advocate. During a break, Myhrvold announced that he had just bought a CAT scanner, on an Internet auction site.
"I put in a minimum bid of twenty-nine hundred dollars," he said. There was much murmuring and nodding around the room. Myhrvold's friends, like Myhrvold, seemed to be of the opinion that there is no downside to having a CAT scanner, especially if you can get it for twenty-nine hundred dollars.
Before long, self-assembly was put aside and the talk swung to how to improve X-rays, and then to the puzzling phenomenon of soldiers in Iraq who survive a bomb blast only to die a few days later of a stroke. Wood thought it was a shock wave, penetrating the soldiers' helmets and surging through their brains, tearing blood vessels away from tissue. "Lowell is the living example of something better than the Internet," Jung said after the meeting was over. "On the Internet, you can search for whatever you want, but you have to know the right terms. With Lowell, you just give him a concept, and this stuff pops out."
Leuthardt, the neurosurgeon, thought that Wood's argument was unconvincing. The two went back and forth, arguing about how you could make a helmet that would better protect soldiers.
"We should be careful how much mental energy we spend on this," Leuthardt said, after a few minutes.
Wood started talking about the particular properties of bullets with tungsten cores.
"Shouldn't someone tell the Pentagon?" a voice said, only half jokingly, from the back of the room.
3.
How useful is it to have a group of really smart people brainstorm for a day? When Myhrvold started out, his expectations were modest. Although he wanted insights like Alexander Graham Bell's, Bell was clearly one in a million, a genius who went on to have ideas in an extraordinary number of areas—sound recording, flight, lasers, tetrahedral construction, and hydrofoil boats, to name a few. The telephone was his obsession. He approached it from a unique perspective, that of a speech therapist. He had put in years of preparation before that moment by the Grand River, and it was impossible to know what unconscious associations triggered his great insight. Invention has its own algorithm: genius, obsession, serendipity, and epiphany in some unknowable combination. How can you put that in a bottle?
But then, in August of 2003, I.V. held its first invention session, and it was a revelation. "Afterward, Nathan kept saying, 'There are so many inventions,' " Wood recalled. "He thought if we came up with a half-dozen good ideas it would be great, and we came up with somewhere between fifty and a hundred. I said to him, 'But you had eight people in that room who are seasoned inventors. Weren't you expecting a multiplier effect?' And he said, 'Yeah, but it was more than multiplicity.' Not even Nathan had any idea of what it was going to be like."
The original expectation was that I.V. would file a hundred patents a year. Currently, it's filing five hundred a year. It has a backlog of three thousand ideas. Wood said that he once attended a two-day invention session presided over by Jung, and after the first day the group went out to dinner. "So Edward took his people out, plus me," Wood said. "And the eight of us sat down at a table and the attorney said, 'Do you mind if I record the evening?' And we all said no, of course not. We sat there. It was a long dinner. I thought we were lightly chewing the rag. But the next day the attorney comes up with eight single-spaced pages flagging thirty-six different inventions from dinner. Dinner."
And the kinds of ideas the group came up with weren't trivial. Intellectual Ventures just had a patent issued on automatic, battery-powered glasses, with a tiny video camera that reads the image off the retina and adjusts the fluid-filled lenses accordingly, up to ten times a second. It just licensed off a cluster of its patents, for eighty million dollars. It has invented new kinds of techniques for making microchips and improving jet engines; it has proposed a way to custom-tailor the mesh "sleeve" that neurosurgeons can use to repair aneurysms.
Bill Gates, whose company, Microsoft, is one of the major investors in Intellectual Ventures, says, "I can give you fifty examples of ideas they've had where, if you take just one of them, you'd have a startup company right there." Gates has participated in a number of invention sessions, and, with other members of the Gates Foundation, meets every few months with Myhrvold to brainstorm about things like malaria or H.I.V. "Nathan sent over a hundred scientific papers beforehand," Gates said of the last such meeting. "The amount of reading was huge. But it was fantastic. There's this idea they have where you can track moving things by counting wing beats. So you could build a mosquito fence and clear an entire area. They had some ideas about super-thermoses, so you wouldn't need refrigerators for certain things. They also came up with this idea to stop hurricanes. Basically, the waves in the ocean have energy, and you use that to lower the temperature differential. I'm not saying it necessarily is going to work. But it's just an example of something where you go, Wow."
One of the sessions that Gates participated in was on the possibility of resuscitating nuclear energy. "Teller had this idea way back when that you could make a very safe, passive nuclear reactor," Myhrvold explained. "No moving parts. Proliferation-resistant. Dead simple. Every serious nuclear accident involves operator error, so you want to eliminate the operator altogether. Lowell and Rod and others wrote a paper on it once. So we did several sessions on it."
The plant, as they conceived it, would produce something like one to three gigawatts of power, which is enough to serve a medium-sized city. The reactor core would be no more than several metres wide and about ten metres long. It would be enclosed in a sealed, armored box. The box would work for thirty years, without need for refuelling. Wood's idea was that the box would run on thorium, which is a very common, mildly radioactive metal. (The world has roughly a hundred-thousand-year supply, he figures.) Myhrvold's idea was that it should run on spent fuel from existing power plants. "Waste has negative cost," Myhrvold said. "This is how we make this idea politically and regulatorily attractive. Lowell and I had a monthlong no-holds-barred nuclear-physics battle. He didn't believe waste would work. It turns out it does." Myhrvold grinned. "He concedes it now."
It was a long-shot idea, easily fifteen years from reality, if it became a reality at all. It was just a tantalizing idea at this point, but who wasn't interested in seeing where it would lead? "We have thirty guys working on it," he went on. "I have more people doing cutting-edge nuclear work than General Electric. We're looking for someone to partner with us, because this is a huge undertaking. We took out an ad in Nuclear News, which is the big trade journal. It looks like something from The Onion: 'Intellectual Ventures interested in nuclear-core designer and fission specialist.' And, no, the F.B.I. hasn't come knocking." He lowered his voice to a stage whisper. "Lowell is known to them."
It was the dinosaur-bone story all over again. You sent a proper search team into territory where people had been looking for a hundred years, and, lo and behold, there's a T. rex tooth the size of a banana. Ideas weren't precious. They were everywhere, which suggested that maybe the extraordinary process that we thought was necessary for invention—genius, obsession, serendipity, epiphany—wasn't necessary at all.
4.
In June of 1876, a few months after he shouted out, "Mr. Watson, come here," Alexander Graham Bell took his device to the World's Fair in Philadelphia. There, before an audience that included the emperor of Brazil, he gave his most famous public performance. The emperor accompanied Bell's assistant, Willie Hubbard, to an upper gallery, where the receiver had been placed, leaving Bell with his transmitter. Below them, and out of sight, Bell began to talk. "A storm of emotions crossed the Brazilian emperor's face—uncertainty, amazement, elation," Charlotte Gray writes. "Lifting his head from the receiver . . . he gave Willie a huge grin and said, 'This thing speaks!' " Gray continues:
Soon a steady stream of portly, middle-aged men were clambering into the gallery, stripping off their jackets, and bending their ears to the receiver. "For an hour or more," Willie remembered, "all took turns in talking and listening, testing the line in every possible way, evidently looking for some trickery, or thinking that the sound was carried through the air. . . . It seemed to be nearly all too wonderful for belief."
Bell was not the only one to give a presentation on the telephone at the Philadelphia Exhibition, however. Someone else spoke first. His name was Elisha Gray. Gray never had an epiphany overlooking the Grand River. Few have claimed that Gray was a genius. He does not seem to have been obsessive, or to have routinely stayed up all night while in the grip of an idea—although we don't really know, because, unlike Bell, he has never been the subject of a full-length biography. Gray was simply a very adept inventor. He was the author of a number of discoveries relating to the telegraph industry, including a self-adjusting relay that solved the problem of circuits sticking open or shut, and a telegraph printer—a precursor of what was later called the Teletype machine. He worked closely with Western Union. He had a very capable partner named Enos Barton, with whom he formed a company that later became the Western Electric Company and its offshoot Graybar (of Graybar Building fame). And Gray was working on the telephone at the same time that Bell was. In fact, the two filed notice with the Patent Office in Washington, D.C., on the same day—February 14, 1876. Bell went on to make telephones with the company that later became A. T. & T. Gray went on to make telephones in partnership with Western Union and Thomas Edison, and—until Gray's team was forced to settle a lawsuit with Bell's company—the general consensus was that Gray and Edison's telephone was better than Bell's telephone.
In order to get one of the greatest inventions of the modern age, in other words, we thought we needed the solitary genius. But if Alexander Graham Bell had fallen into the Grand River and drowned that day back in Brantford, the world would still have had the telephone, the only difference being that the telephone company would have been nicknamed Ma Gray, not Ma Bell.
5.
This phenomenon of simultaneous discovery—what science historians call "multiples"—turns out to be extremely common. One of the first comprehensive lists of multiples was put together by William Ogburn and Dorothy Thomas, in 1922, and they found a hundred and forty-eight major scientific discoveries that fit the multiple pattern. Newton and Leibniz both discovered calculus. Charles Darwin and Alfred Russel Wallace both discovered evolution. Three mathematicians "invented" decimal fractions. Oxygen was discovered by Joseph Priestley, in Wiltshire, in 1774, and by Carl Wilhelm Scheele, in Uppsala, a year earlier. Color photography was invented at the same time by Charles Cros and by Louis Ducos du Hauron, in France. Logarithms were invented by John Napier and Henry Briggs in Britain, and by Joost Bürgi in Switzerland.
"There were four independent discoveries of sunspots, all in 1611; namely, by Galileo in Italy, Scheiner in Germany, Fabricius in Holland and Harriott in England," Ogburn and Thomas note, and they continue:
The law of the conservation of energy, so significant in science and philosophy, was formulated four times independently in 1847, by Joule, Thomson, Colding and Helmholz. They had been anticipated by Robert Mayer in 1842. There seem to have been at least six different inventors of the thermometer and no less than nine claimants of the invention of the telescope. Typewriting machines were invented simultaneously in England and in America by several individuals in these countries. The steamboat is claimed as the "exclusive" discovery of Fulton, Jouffroy, Rumsey, Stevens and Symmington.
For Ogburn and Thomas, the sheer number of multiples could mean only one thing: scientific discoveries must, in some sense, be inevitable. They must be in the air, products of the intellectual climate of a specific time and place. It should not surprise us, then, that calculus was invented by two people at the same moment in history. Pascal and Descartes had already laid the foundations. The Englishman John Wallis had pushed the state of knowledge still further. Newton's teacher was Isaac Barrow, who had studied in Italy, and knew the critical work of Torricelli and Cavalieri. Leibniz knew Pascal's and Descartes's work from his time in Paris. He was close to a German named Henry Oldenburg, who, now living in London, had taken it upon himself to catalogue the latest findings of the English mathematicians. Leibniz and Newton may never have actually sat down together and shared their work in detail. But they occupied a common intellectual milieu. "All the basic work was done—someone just needed to take the next step and put it together," Jason Bardi writes in "The Calculus Wars," a history of the idea's development. "If Newton and Leibniz had not discovered it, someone else would have." Calculus was in the air.
Of course, that is not the way Newton saw it. He had done his calculus work in the mid-sixteen-sixties, but never published it. And after Leibniz came out with his calculus, in the sixteen-eighties, people in Newton's circle accused Leibniz of stealing his work, setting off one of the great scientific scandals of the seventeenth century. That is the inevitable human response. We're reluctant to believe that great discoveries are in the air. We want to believe that great discoveries are in our heads—and to each party in the multiple the presence of the other party is invariably cause for suspicion.
Thus the biographer Robert Bruce, in "Bell: Alexander Graham Bell and the Conquest of Solitude," casts a skeptical eye on Elisha Gray. Was it entirely coincidence, he asks, that the two filed on exactly the same day? "If Gray had prevailed in the end," he goes on,
Bell and his partners, along with fanciers of the underdog, would have suspected chicanery. After all, Gray did not put his concept on paper nor even mention it to anyone until he had spent nearly a month in Washington making frequent visits to the Patent Office, and until Bell's notarized specifications had for several days been the admiration of at least some of "the people in the Patent Office." . . . It is easier to believe that a conception already forming in Gray's mind was precipitated by rumors of what Bell was about to patent, than to believe that chance alone brought Gray to inspiration and action at that precise moment.
In "The Telephone Gambit," Seth Shulman makes the opposite case. Just before Bell had his famous conversation with Watson, Shulman points out, he visited the Patent Office in Washington. And the transmitter design that Bell immediately sketched in his notebook upon his return to Boston was identical to the sketch of the transmitter that Gray had submitted to the Patent Office. This could not be coincidence, Shulman concludes, and thereupon constructs an ingenious (and, it should be said, highly entertaining) revisionist account of Bell's invention, complete with allegations of corruption and romantic turmoil. Bell's telephone, he writes, is "one of the most consequential thefts in history."
But surely Gray and Bell occupied their scientific moment in the same way that Leibniz and Newton did. They arrived at electric speech by more or less the same pathway. They were trying to find a way to send more than one message at a time along a telegraph wire—which was then one of the central technological problems of the day. They had read the same essential sources—particularly the work of Philipp Reis, the German physicist who had come startlingly close to building a working telephone back in the early eighteen-sixties. The arguments of Bruce and Shulman suppose that great ideas are precious. It is too much for them to imagine that a discovery as remarkable as the telephone could arise in two places at once. But five people came up with the steamboat, and nine people came up with the telescope, and, if Gray had fallen into the Grand River along with Bell, some Joe Smith somewhere would likely have come up with the telephone instead and Ma Smith would have run the show. Good ideas are out there for anyone with the wit and the will to find them, which is how a group of people can sit down to dinner, put their minds to it, and end up with eight single-spaced pages of ideas.
6.
Last March, Myhrvold decided to do an invention session with Eric Leuthardt and several other physicians in St. Louis. Rod Hyde came, along with a scientist from M.I.T. named Ed Boyden. Wood was there as well.
"Lowell came in looking like the Cheshire Cat," Myhrvold recalled. "He said, 'I have a question for everyone. You have a tumor, and the tumor becomes metastatic, and it sheds metastatic cancer cells. How long do those circulate in the bloodstream before they land?' And we all said, 'We don't know. Ten times?' 'No,' he said. 'As many as a million times.' Isn't that amazing? If you had no time, you'd be screwed. But it turns out that these cells are in your blood for as long as a year before they land somewhere. What that says is that you've got a chance to intercept them."
How did Wood come to this conclusion? He had run across a stray fact in a recent issue of The New England Journal of Medicine. "It was an article that talked about, at one point, the number of cancer cells per millilitre of blood," he said. "And I looked at that figure and said, 'Something's wrong here. That can't possibly be true.' The number was incredibly high. Too high. It has to be one cell in a hundred litres, not what they were saying—one cell in a millilitre. Yet they spoke of it so confidently. I clicked through to the references. It was a commonplace. There really were that many cancer cells."
Wood did some arithmetic. He knew that human beings have only about five litres of blood. He knew that the heart pumps close to a hundred millilitres of blood per beat, which means that all of our blood circulates through our bloodstream in a matter of minutes. The New England Journal article was about metastatic breast cancer, and it seemed to Wood that when women die of metastatic breast cancer they don't die with thousands of tumors. The vast majority of circulating cancer cells don't do anything.
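Wood's back-of-the-envelope figures are easy to check. Here is a minimal sketch of the arithmetic in Python, using the approximate numbers given above (about five litres of blood, roughly a hundred millilitres per heartbeat); the resting heart rate of about seventy beats per minute is an assumed typical value, not a figure from the article:

```python
# Rough arithmetic behind Wood's observation, not a medical calculation.
blood_volume_ml = 5_000     # ~5 litres of blood (figure from the text)
stroke_volume_ml = 100      # ~100 mL pumped per heartbeat (figure from the text)
heart_rate_bpm = 70         # assumed typical resting heart rate

cardiac_output_ml_per_min = stroke_volume_ml * heart_rate_bpm
minutes_per_circulation = blood_volume_ml / cardiac_output_ml_per_min

minutes_per_year = 365 * 24 * 60
circulations_per_year = minutes_per_year / minutes_per_circulation

print(f"One full circuit of the blood takes about {minutes_per_circulation:.1f} minutes")
print(f"A cell adrift for a year passes any given point roughly {circulations_per_year:,.0f} times")
```

On these assumptions a full circuit takes well under a minute, so a cell that stays in the blood for something like a year passes any given point hundreds of thousands of times, the order of magnitude Wood was citing.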
"It turns out that some small per cent of tumor cells are actually the deadly " "; he went on. " Tumor stem cells are what really initiate metastases. And isn't it astonishing that they have to turn over at least ten thousand times before they can find a happy home? You naĂŻvely think it's once or twice or three times. Maybe five times at most. It isn't. In other words, metastatic cancer—the brand of cancer that kills us—is an amazingly hard thing to initiate. Which strongly suggests that if you tip things just a little bit you essentially turn off the process."
That was the idea that Wood presented to the room in St. Louis. From there, the discussion raced ahead. Myhrvold and his inventors had already done a lot of thinking about using tiny optical filters capable of identifying and zapping microscopic particles. They also knew that finding cancer cells in blood is not hard. They're often the wrong size or the wrong shape. So what if you slid a tiny filter into a blood vessel of a cancer patient? "You don't have to intercept very much of the blood for it to work," Wood went on. "Maybe one ten-thousandth of it. The filter could be put in a little tiny vein in the back of the hand, because that's all you need. Or maybe I intercept all of the blood, but then it doesn't have to be a particularly efficient filter."
Wood was a physicist, not a doctor, but that wasn't necessarily a liability, at this stage. "People in biology and medicine don't do arithmetic," he said. He wasn't being critical of biologists and physicians: this was, after all, a man who read medical journals for fun. He meant that the traditions of medicine encouraged qualitative observation and interpretation. But what physicists do—out of sheer force of habit and training—is measure things and compare measurements, and do the math to put measurements in context. At that moment, while reading The New England Journal, Wood had the advantage of someone looking at a familiar fact with a fresh perspective.
That was also why Myhrvold had wanted to take his crew to St. Louis to meet with the surgeons. He likes to say that the only time a physicist and a brain surgeon meet is when the physicist is about to be cut open—and to his mind that made no sense. Surgeons had all kinds of problems that they didn't realize had solutions, and physicists had all kinds of solutions to things that they didn't realize were problems. At one point, Myhrvold asked the surgeons what, in a perfect world, would make their lives easier, and they said that they wanted an X-ray that went only skin deep. They wanted to know, before they made their first incision, what was just below the surface. When the Intellectual Ventures crew heard that, their response was amazement. "That's your dream? A subcutaneous X-ray? We can do that."
Insight could be orchestrated: that was the lesson. If someone who knew how to make a filter had a conversation with someone who knew a lot about cancer and with someone who read the medical literature like a physicist, then maybe you could come up with a cancer treatment. It helped as well that Casey Tegreene had a law degree, Lowell Wood had spent his career dreaming up weapons for the government, Nathan Myhrvold was a ball of fire, Edward Jung had walked across Texas. They had different backgrounds and temperaments and perspectives, and if you gave them something to think about that they did not ordinarily think about—like hurricanes, or jet engines, or metastatic cancer—you were guaranteed a fresh set of eyes.
There were drawbacks to this approach, of course. The outsider, not knowing what the insider knew, would make a lot of mistakes and chase down a lot of rabbit holes. Myhrvold admits that many of the ideas that come out of the invention sessions come to naught. After a session, the Ph.D.s on the I.V. staff examine each proposal closely and decide which ones are worth pursuing. They talk to outside experts; they reread the literature. Myhrvold isn't even willing to guess what his company's most promising inventions are. "That's a fool's game," he says. If ideas are cheap, there is no point in making predictions, or worrying about failures, or obsessing, like Newton and Leibniz, or Bell and Gray, over who was first. After I.V. came up with its cancer-filter idea, it discovered that there was a company, based in Rochester, that was already developing a cancer filter. Filters were a multiple. But so what? If I.V.'s design wasn't the best, Myhrvold had two thousand nine hundred and ninety-nine other ideas to pursue.
In his living room, Myhrvold has a life-size T. rex skeleton, surrounded by all manner of other dinosaur artifacts. One of those is a cast of a nest of oviraptor eggs, each the size of an eggplant. You'd think a bird that big would have one egg, or maybe two. That's the general rule: the larger the animal, the lower the fecundity. But it didn't. For Myhrvold, it was one of the many ways in which dinosaurs could teach us about ourselves. "You know how many eggs were in that nest?" Myhrvold asked. "Thirty-two."
7.
In the nineteen-sixties, the sociologist Robert K. Merton wrote a famous essay on scientific discovery in which he raised the question of what the existence of multiples tells us about genius. No one is a partner to more multiples, he pointed out, than a genius, and he came to the conclusion that our romantic notion of the genius must be wrong. A scientific genius is not a person who does what no one else can do; he or she is someone who does what it takes many others to do. The genius is not a unique source of insight; he is merely an efficient source of insight. "Consider the case of Kelvin, by way of illustration," Merton writes, summarizing work he had done with his Columbia colleague Elinor Barber:
After examining some 400 of his 661 scientific communications and addresses . . . Dr. Elinor Barber and I find him testifying to at least 32 multiple discoveries in which he eventually found that his independent discoveries had also been made by others. These 32 multiples involved an aggregate of 30 other scientists, some, like Stokes, Green, Helmholtz, Cavendish, Clausius, Poincaré, Rayleigh, themselves men of undeniable genius, others, like Hankel, Pfaff, Homer Lane, Varley and Lamé, being men of talent, no doubt, but still not of the highest order. . . . For the hypothesis that each of these discoveries was destined to find expression, even if the genius of Kelvin had not obtained, there is the best of traditional proof: each was in fact made by others. Yet Kelvin's stature as a genius remains undiminished. For it required a considerable number of others to duplicate these 32 discoveries which Kelvin himself made.
This is, surely, what an invention session is: it is Hankel, Pfaff, Homer Lane, Varley, and Lamé in a room together, and if you have them on your staff you can get a big chunk of Kelvin's discoveries, without ever needing to have Kelvin—which is fortunate, because, although there are plenty of Homer Lanes, Varleys, and Pfaffs in the world, there are very few Kelvins.
Merton's observation about scientific geniuses is clearly not true of artistic geniuses, however. You can't pool the talents of a dozen Salieris and get Mozart's Requiem. You can't put together a committee of really talented art students and get Matisse's "La Danse." A work of artistic genius is singular, and all the arguments over calculus, the accusations back and forth between the Bell and the Gray camps, and our persistent inability to come to terms with the existence of multiples are the result of our misplaced desire to impose the paradigm of artistic invention on a world where it doesn't belong. Shakespeare owned Hamlet because he created him, as none other before or since could. Alexander Graham Bell owned the telephone only because his patent application landed on the examiner's desk a few hours before Gray's. The first kind of creation was sui generis; the second could be re-created in a warehouse outside Seattle.
This is a confusing distinction, because we use the same words to describe both kinds of inventors, and the brilliant scientist is every bit as dazzling in person as the brilliant playwright. The unavoidable first response to Myhrvold and his crew is to think of them as a kind of dream team, but, of course, the fact that they invent as prodigiously and effortlessly as they do is evidence that they are not a dream team at all. You could put together an Intellectual Ventures in Los Angeles, if you wanted to, and Chicago, and New York and Baltimore, and anywhere you could find enough imagination, a fresh set of eyes, and a room full of Varleys and Pfaffs.
The statistician Stephen Stigler once wrote an elegant essay about the futility of the practice of eponymy in science—that is, the practice of naming a scientific discovery after its inventor. That's another idea inappropriately borrowed from the cultural realm. As Stigler pointed out, "It can be found that Laplace employed Fourier Transforms in print before Fourier published on the topic, that Lagrange presented Laplace Transforms before Laplace began his scientific career, that Poisson published the Cauchy distribution in 1824, twenty-nine years before Cauchy touched on it in an incidental manner, and that Bienaymé stated and proved the Chebychev Inequality a decade before and in greater generality than Chebychev's first work on the topic." For that matter, the Pythagorean theorem was known before Pythagoras; Gaussian distributions were not discovered by Gauss. The examples were so legion that Stigler declared the existence of Stigler's Law: "No scientific discovery is named after its original discoverer." There are just too many people with an equal shot at those ideas floating out there in the ether. We think we're pinning medals on heroes. In fact, we're pinning tails on donkeys.
Stigler's Law was true, Stigler gleefully pointed out, even of Stigler's Law itself. The idea that credit does not align with discovery, he reveals at the very end of his essay, was in fact first put forth by Merton. "We may expect," Stigler concluded, "that in years to come, Robert K. Merton, and his colleagues and students, will provide us with answers to these and other questions regarding eponymy, completing what, but for the Law, would be called the Merton Theory of the reward system of science."
8.
In April, Lowell Wood was on the East Coast for a meeting of the Hertz Foundation fellows in Woods Hole. Afterward, he came to New York to make a pilgrimage to the American Museum of Natural History. He had just half a day, so he began right away in the Dinosaur Halls. He spent what he later described as a "ridiculously prolonged" period of time at the first station in the Ornithischian Hall—the ankylosaurus shrine. He knew it by heart. His next stop was the dimetrodon, the progenitor of Mammalia. This was a family tradition. When Wood first took his daughter to the museum, she dubbed the fossil "Great Grand-Uncle Dimetrodon," and they always paid their respects to it. Next, he visited a glyptodont; this creature was the only truly armored mammal, a fact of great significance to a former weaponeer.
He then wandered into the Vertebrate Origins gallery and, for the hundredth time, wondered about the strange openings that Archosauria had in front of their eyes and behind their nostrils. They had to be for breathing, didn't they? He tried to come up with an alternate hypothesis, and couldn't—but then he couldn't come up with a way to confirm his own hunch, either. It was a puzzle. Perhaps someday he would figure it out. Perhaps someone else would. Or perhaps someone would find another skeleton that shed light on the mystery. Nathan Myhrvold and Jack Horner had branched out from Montana, and at the end of the summer were going to Mongolia, to hunt in the Gobi desert. There were a lot more bones where these came from.
Late Bloomers
October 20, 2008
Annals of Culture
Why do we equate genius with precocity?
1.
Ben Fountain was an associate in the real-estate practice at the Dallas offices of Akin, Gump, Strauss, Hauer & Feld, just a few years out of law school, when he decided he wanted to write fiction. The only thing Fountain had ever published was a law-review article. His literary training consisted of a handful of creative-writing classes in college. He had tried to write when he came home at night from work, but usually he was too tired to do much. He decided to quit his job.
"I was tremendously apprehensive," Fountain recalls. "I felt like I'd stepped off a cliff and I didn't know if the parachute was going to open. Nobody wants to waste their life, and I was doing well at the practice of law. I could have had a good career. And my parents were very proud of me—my dad was so proud of me. . . . It was crazy."
He began his new life on a February morning—a Monday. He sat down at his kitchen table at 7:30 A.M. He made a plan. Every day, he would write until lunchtime. Then he would lie down on the floor for twenty minutes to rest his mind. Then he would return to work for a few more hours. He was a lawyer. He had discipline. "I figured out very early on that if I didn't get my writing done I felt terrible. So I always got my writing done. I treated it like a job. I did not procrastinate." His first story was about a stockbroker who uses inside information and crosses a moral line. It was sixty pages long and took him three months to write. When he finished that story, he went back to work and wrote another—and then another.
In his first year, Fountain sold two stories. He gained confidence. He wrote a novel. He decided it wasn't very good, and he ended up putting it in a drawer. Then came what he describes as his dark period, when he adjusted his expectations and started again. He got a short story published in Harper's. A New York literary agent saw it and signed him up. He put together a collection of short stories titled "Brief Encounters with Che Guevara," and Ecco, a HarperCollins imprint, published it. The reviews were sensational. The Times Book Review called it "heartbreaking." It won the Hemingway Foundation/PEN award. It was named a No. 1 Book Sense Pick. It made major regional best-seller lists, was named one of the best books of the year by the San Francisco Chronicle, the Chicago Tribune, and Kirkus Reviews, and drew comparisons to Graham Greene, Evelyn Waugh, Robert Stone, and John le Carré.
Ben Fountain's rise sounds like a familiar story: the young man from the provinces suddenly takes the literary world by storm. But Ben Fountain's success was far from sudden. He quit his job at Akin, Gump in 1988. For every story he published in those early years, he had at least thirty rejections. The novel that he put away in a drawer took him four years. The dark period lasted for the entire second half of the nineteen-nineties. His breakthrough with "Brief Encounters" came in 2006, eighteen years after he first sat down to write at his kitchen table. The "young" writer from the provinces took the literary world by storm at the age of forty-eight.
2.
Genius, in the popular conception, is inextricably tied up with precocity—doing something truly creative, we're inclined to think, requires the freshness and exuberance and energy of youth. Orson Welles made his masterpiece, "Citizen Kane," at twenty-five. Herman Melville wrote a book a year through his late twenties, culminating, at age thirty-two, with "Moby-Dick." Mozart wrote his breakthrough Piano Concerto No. 9 in E-Flat-Major at the age of twenty-one. In some creative forms, like lyric poetry, the importance of precocity has hardened into an iron law. How old was T. S. Eliot when he wrote "The Love Song of J. Alfred Prufrock" ("I grow old . . . I grow old")? Twenty-three. "Poets peak young," the creativity researcher James Kaufman maintains. Mihály Csíkszentmihályi, the author of "Flow," agrees: "The most creative lyric verse is believed to be that written by the young." According to the Harvard psychologist Howard Gardner, a leading authority on creativity, "Lyric poetry is a domain where talent is discovered early, burns brightly, and then peters out at an early age."
A few years ago, an economist at the University of Chicago named David Galenson decided to find out whether this assumption about creativity was true. He looked through forty-seven major poetry anthologies published since 1980 and counted the poems that appear most frequently. Some people, of course, would quarrel with the notion that literary merit can be quantified. But Galenson simply wanted to poll a broad cross-section of literary scholars about which poems they felt were the most important in the American canon. The top eleven are, in order, T. S. Eliot's "Prufrock," Robert Lowell's "Skunk Hour," Robert Frost's "Stopping by Woods on a Snowy Evening," William Carlos Williams's "Red Wheelbarrow," Elizabeth Bishop's "The Fish," Ezra Pound's "The River Merchant's Wife," Sylvia Plath's "Daddy," Pound's "In a Station of the Metro," Frost's "Mending Wall," Wallace Stevens's "The Snow Man," and Williams's "The Dance." Those eleven were composed at the ages of twenty-three, forty-one, forty-eight, forty, twenty-nine, thirty, thirty, twenty-eight, thirty-eight, forty-two, and fifty-nine, respectively. There is no evidence, Galenson concluded, for the notion that lyric poetry is a young person's game. Some poets do their best work at the beginning of their careers. Others do their best work decades later. Forty-two per cent of Frost's anthologized poems were written after the age of fifty. For Williams, it's forty-four per cent. For Stevens, it's forty-nine per cent.
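Galenson's tabulation can be redone directly from the ages quoted above. A minimal sketch, using the eleven composition ages listed in this paragraph; the cutoff of forty is my own choice, purely to illustrate the split:

```python
# Ages at which the eleven most-anthologized poems were composed,
# in the order they are listed above.
ages = [23, 41, 48, 40, 29, 30, 30, 28, 38, 42, 59]

before_forty = [a for a in ages if a < 40]
forty_or_later = [a for a in ages if a >= 40]

print(f"Composed before forty: {len(before_forty)} of {len(ages)}")
print(f"Composed at forty or later: {len(forty_or_later)} of {len(ages)}")
print(f"Median age at composition: {sorted(ages)[len(ages) // 2]}")
```

Nearly half of the poems on the list were written at forty or later, which is the pattern Galenson is pointing to.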
The same was true of film, Galenson points out in his study "Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity." Yes, there was Orson Welles, peaking as a director at twenty-five. But then there was Alfred Hitchcock, who made "Dial M for Murder," "Rear Window," "To Catch a Thief," "The Trouble with Harry," "Vertigo," "North by Northwest," and "Psycho"—one of the greatest runs by a director in history—between his fifty-fourth and sixty-first birthdays. Mark Twain published "Adventures of Huckleberry Finn" at forty-nine. Daniel Defoe wrote "Robinson Crusoe" at fifty-eight.
The examples that Galenson could not get out of his head, however, were Picasso and Cézanne. He was an art lover, and he knew their stories well. Picasso was the incandescent prodigy. His career as a serious artist began with a masterpiece, "Evocation: The Burial of Casagemas," produced at age twenty. In short order, he painted many of the greatest works of his career—including "Les Demoiselles d'Avignon," at the age of twenty-six. Picasso fit our usual ideas about genius perfectly.
Cézanne didn't. If you go to the Cézanne room at the Musée d'Orsay, in Paris—the finest collection of Cézannes in the world—the array of masterpieces you'll find along the back wall were all painted at the end of his career. Galenson did a simple economic analysis, tabulating the prices paid at auction for paintings by Picasso and Cézanne with the ages at which they created those works. A painting done by Picasso in his mid-twenties was worth, he found, an average of four times as much as a painting done in his sixties. For Cézanne, the opposite was true. The paintings he created in his mid-sixties were valued fifteen times as highly as the paintings he created as a young man. The freshness, exuberance, and energy of youth did little for Cézanne. He was a late bloomer—and for some reason in our accounting of genius and creativity we have forgotten to make sense of the Cézannes of the world.
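Mechanically, Galenson's price analysis is a grouping exercise: pair each auction result with the artist's age when the work was made, then compare average prices across age bands. The sketch below shows only the mechanics; the records are invented for illustration, and the ratios reported in the article (four to one for the young Picasso, fifteen to one for the late Cézanne) come from Galenson's actual data, not from anything computed here:

```python
from collections import defaultdict

# Hypothetical (artist, age when painted, auction price) records,
# invented purely to illustrate the tabulation.
sales = [
    ("Picasso", 25, 4_000_000), ("Picasso", 27, 3_500_000),
    ("Picasso", 62, 1_000_000), ("Picasso", 66, 900_000),
    ("Cezanne", 28, 200_000),   ("Cezanne", 33, 250_000),
    ("Cezanne", 64, 3_000_000), ("Cezanne", 66, 3_400_000),
]

def average_price_by_band(records, artist, cutoff=40):
    """Average price of works made before vs. at or after the cutoff age."""
    bands = defaultdict(list)
    for name, age, price in records:
        if name == artist:
            bands["early" if age < cutoff else "late"].append(price)
    return {band: sum(prices) / len(prices) for band, prices in bands.items()}

for artist in ("Picasso", "Cezanne"):
    avg = average_price_by_band(sales, artist)
    print(f"{artist}: early average {avg['early']:,.0f}, late average {avg['late']:,.0f}")
```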
3.
The first day that Ben Fountain sat down to write at his kitchen table went well. He knew how the story about the stockbroker was supposed to start. But the second day, he says, he "completely freaked out." He didn't know how to describe things. He felt as if he were back in first grade. He didn't have a fully formed vision, waiting to be emptied onto the page. "I had to create a mental image of a building, a room, a façade, haircut, clothes—just really basic things," he says. "I realized I didn't have the facility to put those into words. I started going out and buying visual dictionaries, architectural dictionaries, and going to school on those."
He began to collect articles about things he was interested in, and before long he realized that he had developed a fascination with Haiti. "The Haiti file just kept getting bigger and bigger," Fountain says. "And I thought, O.K., here's my novel. For a month or two I said I really don't need to go there, I can imagine everything. But after a couple of months I thought, Yeah, you've got to go there, and so I went, in April or May of '91."
He spoke little French, let alone Haitian Creole. He had never been abroad. Nor did he know anyone in Haiti. "I got to the hotel, walked up the stairs, and there was this guy standing at the top of the stairs," Fountain recalls. "He said, 'My name is Pierre. You need a guide.' I said, 'You're sure as hell right, I do.' He was a very genuine person, and he realized pretty quickly I didn't want to go see the girls, I didn't want drugs, I didn't want any of that other stuff," Fountain went on. "And then it was, boom, 'I can take you there. I can take you to this person.' "
Fountain was riveted by Haiti. "It's like a laboratory, almost," he says. "Everything that's gone on in the last five hundred years—colonialism, race, power, politics, ecological disasters—it's all there in very concentrated form. And also I just felt, viscerally, pretty comfortable there." He made more trips to Haiti, sometimes for a week, sometimes for two weeks. He made friends. He invited them to visit him in Dallas. ("You haven't lived until you've had Haitians stay in your house," Fountain says.) "I mean, I was involved. I couldn't just walk away. There's this very nonrational, nonlinear part of the whole process. I had a pretty specific time era that I was writing about, and certain things that I needed to know. But there were other things I didn't really need to know. I met a fellow who was with Save the Children, and he was on the Central Plateau, which takes about twelve hours to get to on a bus, and I had no reason to go there. But I went up there. Suffered on that bus, and ate dust. It was a hard trip, but it was a glorious trip. It had nothing to do with the book, but it wasn't wasted knowledge."
In "Brief Encounters with Che Guevara," four of the stories are about Haiti, and they are the strongest in the collection. They feel like Haiti; they feel as if they've been written from the inside looking out, not the outside looking in. "After the novel was done, I don't know, I just felt like there was more for me, and I could keep going, keep going deeper there," Fountain recalls. "Always there's something—always something—here for me. How many times have I been? At least thirty times."
Prodigies like Picasso, Galenson argues, rarely engage in that kind of open-ended exploration. They tend to be "conceptual," Galenson says, in the sense that they start with a clear idea of where they want to go, and then they execute it. "I can hardly understand the importance given to the word 'research,' " Picasso once said in an interview with the artist Marius de Zayas. "In my opinion, to search means nothing in painting. To find is the thing." He continued, "The several manners I have used in my art must not be considered as an evolution or as steps toward an unknown ideal of painting. . . . I have never made trials or experiments."
But late bloomers, Galenson says, tend to work the other way around. Their approach is experimental. "Their goals are imprecise, so their procedure is tentative and incremental," Galenson writes in "Old Masters and Young Geniuses," and he goes on:
The imprecision of their goals means that these artists rarely feel they have succeeded, and their careers are consequently often dominated by the pursuit of a single objective. These artists repeat themselves, painting the same subject many times, and gradually changing its treatment in an experimental process of trial and error. Each work leads to the next, and none is generally privileged over others, so experimental painters rarely make specific preparatory sketches or plans for a painting. They consider the production of a painting as a process of searching, in which they aim to discover the image in the course of making it; they typically believe that learning is a more important goal than making finished paintings. Experimental artists build their skills gradually over the course of their careers, improving their work slowly over long periods. These artists are perfectionists and are typically plagued by frustration at their inability to achieve their goal.
Where Picasso wanted to find, not search, Cézanne said the opposite: "I seek in painting."
An experimental innovator would go back to Haiti thirty times. That's how that kind of mind figures out what it wants to do. When Cézanne was painting a portrait of the critic Gustave Geffroy, he made him endure eighty sittings, over three months, before announcing the project a failure. (The result is one of that string of masterpieces in the Musée d'Orsay.) When Cézanne painted his dealer, Ambroise Vollard, he made Vollard arrive at eight in the morning and sit on a rickety platform until eleven-thirty, without a break, on a hundred and fifty occasions—before abandoning the portrait. He would paint a scene, then repaint it, then paint it again. He was notorious for slashing his canvases to pieces in fits of frustration.
Mark Twain was the same way. Galenson quotes the literary critic Franklin Rogers on Twain's trial-and-error method: "His routine procedure seems to have been to start a novel with some structural plan which ordinarily soon proved defective, whereupon he would cast about for a new plot which would overcome the difficulty, rewrite what he had already written, and then push on until some new defect forced him to repeat the process once again." Twain fiddled and despaired and revised and gave up on "Huckleberry Finn" so many times that the book took him nearly a decade to complete. The Cézannes of the world bloom late not as a result of some defect in character, or distraction, or lack of ambition, but because the kind of creativity that proceeds through trial and error necessarily takes a long time to come to fruition.
One of the best stories in "Brief Encounters" is called "Near-Extinct Birds of the Central Cordillera." It's about an ornithologist taken hostage by the FARC guerrillas of Colombia. Like so much of Fountain's work, it reads with an easy grace. But there was nothing easy or graceful about its creation. "I struggled with that story," Fountain says. "I always try to do too much. I mean, I probably wrote five hundred pages of it in various incarnations." Fountain is at work right now on a novel. It was supposed to come out this year. It's late.
4.
Galenson's idea that creativity can be divided into these types—conceptual and experimental—has a number of important implications. For example, we sometimes think of late bloomers as late starters. They don't realize they're good at something until they're fifty, so of course they achieve late in life. But that's not quite right. Cézanne was painting almost as early as Picasso was. We also sometimes think of them as artists who are discovered late; the world is just slow to appreciate their gifts. In both cases, the assumption is that the prodigy and the late bloomer are fundamentally the same, and that late blooming is simply genius under conditions of market failure. What Galenson's argument suggests is something else—that late bloomers bloom late because they simply aren't much good until late in their careers.
"All these qualities of his inner vision were continually hampered and obstructed by Cézanne's incapacity to give sufficient verisimilitude to the personae of his drama," the great English art critic Roger Fry wrote of the early Cézanne. "With all his rare endowments, he happened to lack the comparatively common gift of illustration, the gift that any draughtsman for the illustrated papers learns in a school of commercial art; whereas, to realize such visions as Cézanne's required this gift in high degree." In other words, the young Cézanne couldn't draw. Of "The Banquet," which Cézanne painted at thirty-one, Fry writes, "It is no use to deny that Cézanne has made a very poor job of it." Fry goes on, "More happily endowed and more integral personalities have been able to express themselves harmoniously from the very first. But such rich, complex, and conflicting natures as Cézanne's require a long period of fermentation." Cézanne was trying something so elusive that he couldn't master it until he'd spent decades practicing.
This is the vexing lesson of Fountain's long attempt to get noticed by the literary world. On the road to great achievement, the late bloomer will resemble a failure: while the late bloomer is revising and despairing and changing course and slashing canvases to ribbons after months or years, what he or she produces will look like the kind of thing produced by the artist who will never bloom at all. Prodigies are easy. They advertise their genius from the get-go. Late bloomers are hard. They require forbearance and blind faith. (Let's just be thankful that Cézanne didn't have a guidance counsellor in high school who looked at his primitive sketches and told him to try accounting.) Whenever we find a late bloomer, we can't but wonder how many others like him or her we have thwarted because we prematurely judged their talents. But we also have to accept that there's nothing we can do about it. How can we ever know which of the failures will end up blooming?
Not long after meeting Ben Fountain, I went to see the novelist Jonathan Safran Foer, the author of the 2002 best-seller "Everything Is Illuminated." Fountain is a graying man, slight and modest, who looks, in the words of a friend of his, like a "golf pro from Augusta, Georgia." Foer is in his early thirties and looks barely old enough to drink. Fountain has a softness to him, as if years of struggle have worn away whatever sharp edges he once had. Foer gives the impression that if you touched him while he was in full conversational flight you would get an electric shock.
"I came to writing really by the back door," Foer said. "My wife is a writer, and she grew up keeping journals—you know, parents said, 'Lights out, time for bed,' and she had a little flashlight under the covers, reading books. I don't think I read a book until much later than other people. I just wasn't interested in it."
Foer went to Princeton and took a creative-writing class in his freshman year with Joyce Carol Oates. It was, he explains, "sort of on a whim, maybe out of a sense that I should have a diverse course load." He'd never written a story before. "I didn't really think anything of it, to be honest, but halfway through the semester I arrived to class early one day, and she said, 'Oh, I'm glad I have this chance to talk to you. I'm a fan of your writing.' And it was a real revelation for me."
Oates told him that he had the most important of writerly qualities, which was energy. He had been writing fifteen pages a week for that class, an entire story for each seminar. "Why does a dam with a crack in it leak so much?" he said, with a laugh. "There was just something in me, there was like a pressure."
As a sophomore, he took another creative-writing class. During the following summer, he went to Europe. He wanted to find the village in Ukraine where his grandfather had come from. After the trip, he went to Prague. There he read Kafka, as any literary undergraduate would, and sat down at his computer.
"I was just writing," he said. "I didn't know that I was writing until it was happening. I didn't go with the intention of writing a book. I wrote three hundred pages in ten weeks. I really wrote. I'd never done it like that."
It was a novel about a boy named Jonathan Safran Foer who visits the village in Ukraine where his grandfather had come from. Those three hundred pages were the first draft of "Everything Is Illuminated"—the exquisite and extraordinary novel that established Foer as one of the most distinctive literary voices of his generation. He was nineteen years old.
Foer began to talk about the other way of writing books, where you painstakingly honed your craft, over years and years. "I couldn't do that," he said. He seemed puzzled by it. It was clear that he had no understanding of how being an experimental innovator would work. "I mean, imagine if the craft you're trying to learn is to be an original. How could you learn the craft of being an original?"
He began to describe his visit to Ukraine. "I went to the shtetl where my family came from. It's called Trachimbrod, the name I use in the book. It's a real place. But you know what's funny? It's the single piece of research that made its way into the book." He wrote the first sentence, and he was proud of it, and then he went back and forth in his mind about where to go next. "I spent the first week just having this debate with myself about what to do with this first sentence. And once I made the decision, I felt liberated to just create—and it was very explosive after that."
If you read "Everything Is Illuminated," you end up with the same feeling you get when you read "Brief Encounters with Che Guevara"—the sense of transport you experience when a work of literature draws you into its own world. Both are works of art. It's just that, as artists, Fountain and Foer could not be less alike. Fountain went to Haiti thirty times. Foer went to Trachimbrod just once. "I mean, it was nothing," Foer said. "I had absolutely no experience there at all. It was just a springboard for my book. It was like an empty swimming pool that had to be filled up." Total time spent getting inspiration for his novel: three days.
5.
Ben Fountain did not make the decision to quit the law and become a writer all by himself. He is married and has a family. He met his wife, Sharon, when they were both in law school at Duke. When he was doing real-estate work at Akin, Gump, she was on the partner track in the tax practice at Thompson & Knight. The two actually worked in the same building in downtown Dallas. They got married in 1985, and had a son in April of 1987. Sharie, as Fountain calls her, took four months of maternity leave before returning to work. She made partner by the end of that year.
"We had our son in a day care downtown," she recalls. "We would drive in together, one of us would take him to day care, the other one would go to work. One of us would pick him up, and then, somewhere around eight o'clock at night, we would have him bathed, in bed, and then we hadn't even eaten yet, and we'd be looking at each other, going, 'This is just the beginning.' " She made a face. "That went on for maybe a month or two, and Ben's like, 'I don't know how people do this.' We both agreed that continuing at that pace was probably going to make us all miserable. Ben said to me, 'Do you want to stay home?' Well, I was pretty happy in my job, and he wasn't, so as far as I was concerned it didn't make any sense for me to stay home. And I didn't have anything besides practicing law that I really wanted to do, and he did. So I said, 'Look, can we do this in a way that we can still have some day care and so you can write?' And so we did that."
Ben could start writing at seven-thirty in the morning because Sharie took their son to day care. He stopped working in the afternoon because that was when he had to pick him up, and then he did the shopping and the household chores. In 1989, they had a second child, a daughter. Fountain was a full-fledged North Dallas stay-at-home dad.
"When Ben first did this, we talked about the fact that it might not work, and we talked about, generally, 'When will we know that it really isn't working?' and I'd say, 'Well, give it ten years,' " Sharie recalled. To her, ten years didn't seem unreasonable. "It takes a while to decide whether you like something or not," she says. And when ten years became twelve and then fourteen and then sixteen, and the kids were off in high school, she stood by him, because, even during that long stretch when Ben had nothing published at all, she was confident that he was getting better. She was fine with the trips to Haiti, too. "I can't imagine writing a novel about a place you haven't at least tried to visit," she says. She even went with him once, and on the way into town from the airport there were people burning tires in the middle of the road.
"I was making pretty decent money, and we didn't need two incomes," Sharie went on. She has a calm, unflappable quality about her. "I mean, it would have been nice, but we could live on one."
Sharie was Ben's wife. But she was also—to borrow a term from long ago—his patron. That word has a condescending edge to it today, because we think it far more appropriate for artists (and everyone else for that matter) to be supported by the marketplace. But the marketplace works only for people like Jonathan Safran Foer, whose art emerges, fully realized, at the beginning of their career, or Picasso, whose talent was so blindingly obvious that an art dealer offered him a hundred-and-fifty-franc-a-month stipend the minute he got to Paris, at age twenty. If you are the type of creative mind that starts without a plan, and has to experiment and learn by doing, you need someone to see you through the long and difficult time it takes for your art to reach its true level.
This is what is so instructive about any biography of Cézanne. Accounts of his life start out being about Cézanne, and then quickly turn into the story of Cézanne's circle. First and foremost is always his best friend from childhood, the writer Émile Zola, who convinces the awkward misfit from the provinces to come to Paris, and who serves as his guardian and protector and coach through the long, lean years.
Here is Zola, already in Paris, in a letter to the young Cézanne back in Provence. Note the tone, more paternal than fraternal:
You ask me an odd question. Of course one can work here, as anywhere else, if one has the will. Paris offers, further, an advantage you can't find elsewhere: the museums in which you can study the old masters from 11 to 4. This is how you must divide your time. From 6 to 11 you go to a studio to paint from a live model; you have lunch, then from 12 to 4 you copy, in the Louvre or the Luxembourg, whatever masterpiece you like. That will make up nine hours of work. I think that ought to be enough.
Zola goes on, detailing exactly how Cézanne could manage financially on a monthly stipend of a hundred and twenty-five francs:
I'll reckon out for you what you should spend. A room at 20 francs a month; lunch at 18 sous and dinner at 22, which makes two francs a day, or 60 francs a month. . . . Then you have the studio to pay for: the Atelier Suisse, one of the least expensive, charges, I think, 10 francs. Add 10 francs for canvas, brushes, colors; that makes 100. So you'll have 25 francs left for laundry, light, the thousand little needs that turn up.
Camille Pissarro was the next critical figure in Cézanne's life. It was Pissarro who took Cézanne under his wing and taught him how to be a painter. For years, there would be periods in which they went off into the country and worked side by side.
Then there was Ambroise Vollard, the sponsor of Cézanne's first one-man show, at the age of fifty-six. At the urging of Pissarro, Renoir, Degas, and Monet, Vollard hunted down Cézanne in Aix. He spotted a still-life in a tree, where it had been flung by Cézanne in disgust. He poked around the town, putting the word out that he was in the market for Cézanne's canvases. In "Lost Earth: A Life of Cézanne," the biographer Philip Callow writes about what happened next:
Before long someone appeared at his hotel with an object wrapped in a cloth. He sold the picture for 150 francs, which inspired him to trot back to his house with the dealer to inspect several more magnificent Cézannes. Vollard paid a thousand francs for the job lot, then on the way out was nearly hit on the head by a canvas that had been overlooked, dropped out the window by the man's wife. All the pictures had been gathering dust, half buried in a pile of junk in the attic.
All this came before Vollard agreed to sit a hundred and fifty times, from eight in the morning to eleven-thirty, without a break, for a picture that Cézanne disgustedly abandoned. Once, Vollard recounted in his memoir, he fell asleep, and toppled off the makeshift platform. Cézanne berated him, incensed: "Does an apple move?" This is called friendship.
Finally, there was Cézanne's father, the banker Louis-Auguste. From the time Cézanne first left Aix, at the age of twenty-two, Louis-Auguste paid his bills, even when Cézanne gave every indication of being nothing more than a failed dilettante. But for Zola, Cézanne would have remained an unhappy banker's son in Provence; but for Pissarro, he would never have learned how to paint; but for Vollard (at the urging of Pissarro, Renoir, Degas, and Monet), his canvases would have rotted away in some attic; and, but for his father, Cézanne's long apprenticeship would have been a financial impossibility. That is an extraordinary list of patrons. The first three—Zola, Pissarro, and Vollard—would have been famous even if Cézanne never existed, and the fourth was an unusually gifted entrepreneur who left Cézanne four hundred thousand francs when he died. Cézanne didn't just have help. He had a dream team in his corner.
This is the final lesson of the late bloomer: his or her success is highly contingent on the efforts of others. In biographies of Cézanne, Louis-Auguste invariably comes across as a kind of grumpy philistine, who didn't appreciate his son's genius. But Louis-Auguste didn't have to support Cézanne all those years. He would have been within his rights to make his son get a real job, just as Sharie might well have said no to her husband's repeated trips to the chaos of Haiti. She could have argued that she had some right to the life style of her profession and status—that she deserved to drive a BMW, which is what power couples in North Dallas drive, instead of a Honda Accord, which is what she settled for.
But she believed in her husband's art, or perhaps, more simply, she believed in her husband, the same way Zola and Pissarro and Vollard and—in his own, querulous way—Louis-Auguste must have believed in Cézanne. Late bloomers' stories are invariably love stories, and this may be why we have such difficulty with them. We'd like to think that mundane matters like loyalty, steadfastness, and the willingness to keep writing checks to support what looks like failure have nothing to do with something as rarefied as genius. But sometimes genius is anything but rarefied; sometimes it's just the thing that emerges after twenty years of working at your kitchen table.
"Sharie never once brought up money, not once—never," Fountain said. She was sitting next to him, and he looked at her in a way that made it plain that he understood how much of the credit for "Brief Encounters" belonged to his wife. His eyes welled up with tears. "I never felt any pressure from her," he said. "Not even covert, not even implied."
The Uses of Adversity
November 10, 2008
Annals of Business
Can underprivileged outsiders have an advantage?
1.
Sidney Weinberg was born in 1891, one of eleven children of Pincus Weinberg, a struggling Polish-born liquor wholesaler and bootlegger in Brooklyn. Sidney was short, a "Kewpie doll," as the New Yorker writer E. J. Kahn, Jr., described him, "in constant danger of being swallowed whole by executive-size chairs." He pronounced his name "Wine-boig." He left school at fifteen. He had scars on his back from knife fights in his preteen days, when he sold evening newspapers at the Hamilton Avenue terminus of the Manhattan-Brooklyn ferry.
At sixteen, he made a visit to Wall Street, keeping an eye out for a "nice-looking, tall building," as he later recalled. He picked 43 Exchange Place, where he started at the top floor and worked his way down, asking at every office, "Want a boy?" By the end of the day, he had reached the third-floor offices of a small brokerage house. There were no openings. He returned to the brokerage house the next morning. He lied that he was told to come back, and bluffed himself into a job assisting the janitor, for three dollars a week. The small brokerage house was Goldman Sachs.
From that point, Charles Ellis tells us in a new book, "The Partnership: The Making of Goldman Sachs," Weinberg's rise was inexorable. Early on, he was asked to carry a flagpole on the trolley uptown to the Sachs family's town house. The door was opened by Paul Sachs, the grandson of the firm's founder, and Sachs took a shine to him. Weinberg was soon promoted to the mailroom, which he promptly reorganized. Sachs sent him to Browne's Business College, in Brooklyn, to learn penmanship. By 1925, the firm had bought him a seat on the New York Stock Exchange. By 1927, he had made partner. By 1930, he was a senior partner, and for the next thirty-nine years—until his death, in 1969—Weinberg was Goldman Sachs, turning it from a floundering, mid-tier partnership into the premier investment bank in the world.
2.
The rags-to-riches story—that staple of American biography—has over the years been given two very different interpretations. The nineteenth-century version stressed the value of compensating for disadvantage. If you wanted to end up on top, the thinking went, it was better to start at the bottom, because it was there that you learned the discipline and motivation essential for success. "New York merchants preferred to hire country boys, on the theory that they worked harder, and were more resolute, obedient, and cheerful than native New Yorkers," Irvin G. Wyllie wrote in his 1954 study "The Self-Made Man in America." Andrew Carnegie, whose personal history was the defining self-made-man narrative of the nineteenth century, insisted that there was an advantage to being "cradled, nursed and reared in the stimulating school of poverty." According to Carnegie, "It is not from the sons of the millionaire or the noble that the world receives its teachers, its martyrs, its inventors, its statesmen, its poets, or even its men of affairs. It is from the cottage of the poor that all these spring."
Today, that interpretation has been reversed. Success is seen as a matter of capitalizing on socioeconomic advantage, not compensating for disadvantage. The mechanisms of social mobility—scholarships, affirmative action, housing vouchers, Head Start—all involve attempts to convert the poor from chronic outsiders to insiders, to rescue them from what is assumed to be a hopeless state. Nowadays, we don't learn from poverty, we escape from poverty, and a book like Ellis's history of Goldman Sachs is an almost perfect case study of how we have come to believe social mobility operates. Six hundred pages of Ellis's book are devoted to the modern-day Goldman, the firm that symbolized the golden era of Wall Street. From the boom years of the nineteen-eighties through the great banking bubble of the past decade, Goldman brought impeccably credentialled members of the cognitive and socioeconomic élite to Wall Street, where they conjured up fantastically complex deals and made enormous fortunes. The opening seventy-two pages of the book, however, the chapters covering the Sidney Weinberg years, seem as though they belong to a different era. The man who created what we know as Goldman Sachs was a poor, uneducated member of a despised minority—and his story is so remarkable that perhaps only Andrew Carnegie could make sense of it.
3.
Weinberg was not a financial wizard. His gifts were social. In his heyday, Weinberg served as a director on thirty-one corporate boards. He averaged two hundred and fifty committee or board meetings a year, and when he was not in meetings he would often take a steam at the Hotel Biltmore's Turkish baths with the likes of Robert Woodruff, of Coca-Cola, and Bernard Gimbel, of Gimbels. During the Depression, Weinberg served on Franklin Roosevelt's Business Advisory and Planning Council, and F.D.R. dubbed him the Politician, for his skill at mediating among contentious parties. He spent the war years as the vice-chairman of the War Production Board, where he was known as the Body Snatcher, because of the way he persuaded promising young business executives to join the war effort. (Weinberg seems to have been the first to realize that signing up promising young executives for public service during the war was the surest way to sign them up as clients after the war.)
When Ford Motor Company decided to go public, in the mid-nineteen-fifties, in what remains one of the world's biggest initial public offerings, both major parties in the hugely complicated transaction—the Ford family and the Ford Foundation—wanted Weinberg to represent them. He was Mr. Wall Street. "In his role as the power behind the throne," E. J. Kahn wrote in a New Yorker Profile of Weinberg, fifty years ago, "he probably comes as close as Bernard Baruch to embodying the popular conception of Bernard Baruch." Kahn went on:
There is hardly a prominent corporation executive of whom he cannot—and, indeed, does not—say, "He's an intimate close personal friend of mine." . . . Industrialists who want information about other industrialists automatically turn to Weinberg, much as merchants consult credit-rating agencies. His end of many telephone conversations consists of fragments like "Who? . . . Of course I know him. Intimately. . . . Used to be Under-Secretary of the Treasury. . . . O.K., I'll have him call you."
This gregariousness is what we expect of the head of an investment bank. Wall Street—particularly the clubby Wall Street of the early and middle part of the twentieth century—was a relationship business: you got to do the stock offering for Continental Can because you knew the head of Continental Can. We further assume that businesses based on social ties reward cultural insiders. That's one of the reasons we no longer think of poverty as being useful in the nineteenth-century sense; no matter how hard you work, or how disciplined you are, it is difficult to overcome the socially marginalizing effects of an impoverished background. In order to do the stock offering for Continental Can, you need to know the head of Continental Can, and in order to know the head of Continental Can it really helps to have been his classmate at Yale.
But Weinberg wasn't Yale. He was P.S. 13. Nor did he try to pretend that he was an insider. He did the opposite. "You'll have to make that plainer," he would say. "I'm just a dumb, uneducated kid from Brooklyn." He bought a modest house in Scarsdale in the nineteen-twenties, and lived there the rest of his life. He took the subway. He may have worked closely with the White House, but this was the Roosevelt White House, in the nineteen-thirties, at a time when none of the Old Guard on Wall Street were New Dealers. Weinberg would talk about his public school as if it were Princeton, and as a joke he would buy up Phi Beta Kappa keys from pawnshops and hand them out to visitors like party favors. His savvy was such that Roosevelt wanted to make him Ambassador to the Soviet Union, and his grasp of the intricacies of Wall Street was so shrewd that his phone never stopped ringing. But as often as he could he reminded his peers that he was from the other side of the tracks.
At one board meeting, Ellis writes, "a long presentation was being made that was overloaded with dull, detailed statistics. Number after number was read off. When the droning presenter finally paused for breath, Weinberg jumped up, waving his papers in mock triumph, to call out 'Bingo!' " The immigrant's best strategy, in the famous adage, is to think Yiddish and dress British. Weinberg thought British and dressed Yiddish.
Why did that strategy work? This is the great mystery of Weinberg's career, and it's hard to escape the conclusion that Carnegie was on to something: there are times when being an outsider is precisely what makes you a good insider. It's not difficult to imagine, for example, that the head of Continental Can liked the fact that Weinberg was from nothing, in the same way that New York City employers preferred country boys to city boys. That C.E.O. dwelled in a world with lots of people who went to Yale and then to Wall Street; he knew that some of them were good at what they did and some of them were just well connected, and separating the able from the incompetent wasn't always easy. Weinberg made it out of Brooklyn; how could he not be good?
Weinberg's outsiderness also allowed him to play the classic "middleman minority" role. One of the reasons that the Parsi in India, the East Asians in Africa, the Chinese in Southeast Asia, and the Lebanese in the Caribbean, among others, have been so successful, sociologists argue, is that they are decoupled from the communities in which they operate. If you are a Malaysian in Malaysia, or a Kenyan in Kenya, or an African-American in Watts, and you want to run a grocery store, you start with a handicap: you have friends and relatives who want jobs, or discounts. You can't deny credit or collect a debt from your neighbor, because he's your neighbor, and your social and business lives are tied up together. As the anthropologist Brian Foster writes of commerce in Thailand:
A trader who was subject to the traditional social obligations and constraints would find it very difficult to run a viable business. If, for example, he were fully part of the village society and subject to the constraints of the society, he would be expected to be generous in the traditional way to those in need. It would be difficult for him to refuse credit, and it would not be possible to collect debts. . . . The inherent conflict of interest in a face-to-face market transaction would make proper etiquette impossible or would at least strain it severely, which is an important factor in Thai social relations.
The minority has none of those constraints. He's free to keep social and financial considerations separate. He can call a bad debt a bad debt, or a bad customer a bad customer, without worrying about the social implications of his honesty.
Weinberg was decoupled from the business establishment in the same way, and that seems to have been a big part of what drew executives to him. The chairman of General Foods avowed, "Sidney is the only man I know who could ever say to me in the middle of a board meeting, as he did once, 'I don't think you're very bright,' and somehow give me the feeling that I'd been paid a compliment." That Weinberg could make a rebuke seem like a compliment is testament to his charm. That he felt free to deliver the rebuke in the first place is testament to his sociological position. You can't tell the chairman of General Foods that he's an idiot if you were his classmate at Yale. But you can if you're Pincus Weinberg's son from Brooklyn. Truthtelling is easier from a position of cultural distance.
Here is Ellis on Weinberg, again:
Shortly after he was elected a director of General Electric, he was called upon by Philip D. Reed, GE's chairman of the board, to address a group of company officials at a banquet at the Waldorf Astoria. In presenting Weinberg, Reed said . . . that he hoped Mr. Weinberg felt, as he felt, that GE was the greatest outfit in the greatest industry in the greatest country in the world. Weinberg got to his feet. "I'll string along with your chairman about this being the greatest country," he began. "And I guess I'll even buy that part about the electrical industry. But as to GE's being the greatest business in the field, why, I'm damned if I'll commit myself until I've had a look-see." Then he sat down to vigorous applause.
At G.E., Weinberg's irreverence was cherished. During the Second World War, a top Vichy official, Admiral Jean-François Darlan, visited the White House. Darlan was classic French military, imperious and entitled, and was thought to have Nazi sympathies. Protocol dictated that the Allies treat Darlan with civility, and everyone did—save for Weinberg. The outsider felt perfectly free to say what everyone else wanted to but could not, and in so doing surely endeared himself to the whole room. "When it was time to leave," Ellis writes, "Weinberg reached into his pocket as he came to the front door, pulled out a quarter, and handed it to the resplendently uniformed admiral, saying, 'Here, boy, get me a cab.'"
The idea that outsiders can profit by virtue of their outsiderness runs contrary to our understanding of minorities. "Think Yiddish, dress British" presumes that the outsider is better off cloaking his differences. But there are clearly also times and places where minorities benefit by asserting and even exaggerating their otherness. The Berkeley historian Yuri Slezkine argues, in "The Jewish Century" (2004), that Yiddish did not evolve typically: if you study its form and structure, you discover its deliberate and fundamental artificiality—it is the language of people who are interested, in Slezkine's words, in "the maintenance of difference, the conscious preservation of the self and thus of strangeness."
Similarly, in field work in a Malaysian village, the anthropologist L. A. Peter Gosling observed a Chinese shopkeeper who
appeared to be considerably acculturated to Malay culture, and was scrupulously sensitive to Malays in every way, including the normal wearing of a sarong, quiet and polite Malay speech, and a humble and affable manner. However, at harvest time when he would go to the field to collect crops on which he had advanced credit, he would put on his Chinese costume of shorts and undershirt, and speak in a much more abrupt fashion, acting, as one Malay farmer put it, "just like a Chinese." This behavior was to insure that he would not be treated like a fellow Malay who might be expected to be more generous on price or credit terms.
Is this what Weinberg was up to with his constant references to P.S. 13? Ellis's book repeats stories about Weinberg from Lisa Endlich's 1999 history, "Goldman Sachs: The Culture of Success," which in turn repeats stories about Weinberg from Kahn's Profile, which in turn—one imagines—repeats stories honed by Weinberg and his friends over the years. And what is clear when you read those stories is how obviously they are stories: anecdotes clearly constructed for strategic effect. According to Ellis:
A friend told of Weinberg's being the guest of honor at J. P. Morgan's luncheon table, where the following exchange occurred: "Mr. Weinberg, I presume you served in the last war?"
"Yes, sir, I was in the war—in the navy."
"What were you in the navy?"
"Cook, Second Class."
Morgan was delighted.
Of course, J. P. Morgan wasn't actually delighted. He died in 1913, before the First World War started. So he wasn't the mogul at the table. But you can understand why Weinberg would want to pretend that he was. And although Weinberg did a stint as a cook (on account of poor eyesight), he quickly got himself transferred to the Office of Naval Intelligence, and then spent most of the war heading up the inspection of all vessels using the port of Norfolk. But you can understand why that little bit of additional history doesn't fit, either.
Here's another one:
The heir to a large retailing fortune once spent a night in Scarsdale with the Weinbergs and retired early. After Weinberg and his wife, whose only servant was a cook, had emptied the ashtrays and picked up the glasses, they noticed that their guest had put his suit and shoes outside his bedroom door. Amused, Weinberg took the suit and shoes down to the kitchen, cleaned the shoes, brushed the suit, and put them back. The following day, as the guest was leaving, he handed Weinberg a five dollar bill and asked him to pass it along to the butler who had taken such excellent care of things. Weinberg thanked him gravely and pocketed the money.
Let's see: we're supposed to believe that the retailing heir has dinner at the modest Weinberg residence in Scarsdale and never once sees a butler, and doesn't see a butler in the morning, either, and yet somehow remains convinced that there's a butler around. Did he imagine the butler was hiding in a closet? No matter. This is another of those stories which Weinberg needed to tell, and his audience needed to hear.
4.
It's one thing to argue that being an outsider can be strategically useful. But Andrew Carnegie went farther. He believed that poverty provided a better preparation for success than wealth did; that, at root, compensating for disadvantage was more useful, developmentally, than capitalizing on advantage.
This idea is both familiar and perplexing. Consider the curious fact that many successful entrepreneurs suffer from serious learning disabilities. Paul Orfalea, the founder of the Kinko's chain, was a D student who failed two grades, was expelled from four schools, and graduated at the bottom of his high-school class. "In third grade, the only word I could read was 'the,' " he says. "I used to keep track of where the group was reading by following from one 'the' to the next." Richard Branson, the British billionaire who started the Virgin empire, dropped out of school at fifteen after struggling with reading and writing. "I was always bottom of the class," he has said. John Chambers, who built the Silicon Valley firm Cisco into a hundred-billion-dollar corporation, has trouble reading e-mail. One of the pioneers of the cellular-phone industry, Craig McCaw, is dyslexic, as is Charles Schwab, the founder of the discount brokerage house that bears his name. When the business-school professor Julie Logan surveyed a group of American small-business owners recently, she found that thirty-five per cent of them self-identified as dyslexic.
That is a remarkable statistic. Dyslexia affects the very skills that lie at the center of an individual's ability to manage the modern world. Yet Schwab and Orfalea and Chambers and Branson seem to have made up for their disabilities, in the same way that the poor, in Carnegie's view, can make up for their poverty. Because of their difficulties with reading and writing, they were forced to develop superior oral-communication and problem-solving skills. Because they had to rely on others to help them navigate the written word, they became adept at delegating authority. In one study, conducted in Britain, eighty per cent of dyslexic entrepreneurs were found to have held the position of captain in a high-school sport, versus twenty-seven per cent of non-dyslexic entrepreneurs. They compensated for their academic shortcomings by developing superior social skills, and, when they reached the workplace, those compensatory skills gave them an enormous head start. "I didn't have a lot of self-confidence as a kid," Orfalea said once, in an interview. "And that is for the good. If you have a healthy dose of rejection in your life, you are going to have to figure out how to do it your way."
There's no question that we are less than comfortable with the claims that people like Schwab and Orfalea make on behalf of their disabilities. As impressive as their success has been, none of us would go so far as to wish dyslexia on our own children. If a disproportionately high number of entrepreneurs are dyslexic, so are a disproportionately high number of prisoners. Systems in which people compensate for disadvantage seem to us unacceptably Darwinian. The stronger get stronger, and the weaker get even weaker. The man who boasts of walking seven miles to school, barefoot, every morning, happily drives his own grandchildren ten blocks in an S.U.V. We have become convinced that the surest path to success for our children involves providing them with a carefully optimized educational experience: the "best" schools, the most highly educated teachers, the smallest classrooms, the shiniest facilities, the greatest variety of colors in the art-room paint box. But one need only look at countries where schoolchildren outperform their American counterparts—despite larger classes, shabbier schools, and smaller budgets—to wonder if our wholesale embrace of the advantages of advantages isn't as simplistic as Carnegie's wholesale embrace of the advantages of disadvantages.
In his Profile, E. J. Kahn tells the story of a C.E.O. retreat that Weinberg attended, organized by Averell Harriman at Sun Valley, Harriman's ski resort, where, Kahn writes, it emerged that Weinberg had never skied before:
Several corporation presidents pooled their cash resources to bet him twenty-five dollars that he could not ski down the steepest and longest slope in the area. Weinberg was approaching fifty but game. "I got hold of an instructor named Franz Something or Fritz Something and had a thirty minute lesson," he says. "Then I rode up to the top of the mountain. It took me half a day to come down, and I finished with only one ski, and for two weeks I was black and blue all over, but I won the bet."
Here you have the Waspy élite of corporate America, off in their mountain idyll, subjecting the little Jew from Brooklyn to a bit of boarding-school hazing. (In a reminder of the anti-Semitism permeating Weinberg's world, Ellis tells us that, in the Depression, Manufacturers Trust, a predominantly Jewish company, had to agree to install a Gentile as C.E.O. as a condition of being rescued by a coalition of banks.) It is also possible, though, to read that story as highlighting the determination of the Brooklyn kid who'll be damned if he's going to let himself lose a bet to those smirking C.E.O.s. One imagines that Weinberg told that tale the first way to his wife, and the second way to his buddies in the Biltmore steam room. And when he tried to get out of bed the next morning it probably occurred to him that sometimes being humiliated provides a pretty good opportunity to show a lodge full of potential clients that you would ski down a mountain for them.
Twenty years later, Weinberg had his greatest score, handling the initial public offering for Ford Motor Company, which was founded, of course, by that odious anti-Semite Henry Ford. Did taking the business prick Weinberg's conscience? Maybe so. But he probably realized that the unstated premise behind the idea that the Jews control all the banks is that Jews are really good bankers. The first was a stereotype that oppressed; the second was a stereotype that, if you were smart about it, you could use to win a few clients. If you're trying to build an empire, you work with what you have.
5.
In 1918, Henry Goldman, one of the senior partners of Goldman Sachs, quit the firm in a dispute over Liberty Bonds. Goldman was a Germanophile, who objected to aiding the Allied war effort. (This is the same Henry Goldman who later bought the twelve-year-old Yehudi Menuhin a Stradivarius and Albert Einstein a yacht.) The Sachs brothers—Walter and Arthur—were desperate for a replacement, and they settled, finally, on a young man named Waddill Catchings, a close friend of Arthur Sachs from Harvard. He had worked at Sullivan & Cromwell, Wall Street's great patrician law firm. He had industrial experience, having reorganized several companies, and "on top of all that," Ellis tells us, "Catchings was one of the most talented, charming, handsome, well-educated, and upwardly mobile people in Wall Street."
Catchings's bold idea was to create a huge investment trust, called the Goldman Sachs Trading Corporation. It was a precursor to today's hedge funds; it borrowed heavily to buy controlling stakes in groups of corporations. The fund was originally intended to be twenty-five million dollars, but then Catchings, swept up in the boom market of the nineteen-twenties, doubled it to fifty million, doubled it again to a hundred million, then merged the Goldman fund with another fund and added two subsidiary trusts, until G.S.T.C. controlled half a billion dollars in assets.
"Walter and Arthur Sachs were travelling in Europe during the summer of 1929," Ellis writes. "In Italy they learned of the deals Catchings was doing on his own, and Walter Sachs got worried. On his return to New York, he went straight to Catchings' apartment in the Plaza Hotel to urge greater caution. But Catchings, still caught up in the bull-market euphoria, was unmoved. "The trouble with you, Walter, is that you've no imagination," he said.
Then came the stock-market crash. G.S.T.C. stock, which had traded as high as three hundred and twenty-six dollars a share, fell to a dollar and seventy-five cents. Goldman's capital was wiped out. The firm was besieged with lawsuits, the last of which was not settled until 1968. Eddie Cantor—one of the most popular comedians of the day and a disgruntled G.S.T.C. investor—turned the respected Goldman name into a punch line: "They told me to buy the stock for my old age . . . and it worked perfectly. Within six months I felt like a very old man!" Catchings was ousted. "Very few men can stand success," Walter Sachs concluded. "He was not one of them." Privilege did not prepare Catchings for crisis. The Sachs brothers then replaced Catchings with a man who was not from privilege at all, and perhaps now we can appreciate the wisdom of that decision. Wall Street needs fewer Waddill Catchingses and a few more Sidney Weinbergs.
Most Likely to Succeed
December 15, 2008
Annals of Education
How do we hire when we can't tell who's right for the job?
1.
On the day of the big football game between the University of Missouri Tigers and the Cowboys of Oklahoma State, a football scout named Dan Shonka sat in his hotel, in Columbia, Missouri, with a portable DVD player. Shonka has worked for three National Football League teams. Before that, he was a football coach, and before that he played linebacker—although, he says, "that was three knee operations and a hundred pounds ago." Every year, he evaluates somewhere between eight hundred and twelve hundred players around the country, helping professional teams decide whom to choose in the college draft, which means that over the last thirty years he has probably seen as many football games as anyone else in America. In his DVD player was his homework for the evening's big game—an edited video of the Tigers' previous contest, against the University of Nebraska Cornhuskers.
Shonka methodically made his way through the video, stopping and re-winding whenever he saw something that caught his eye. He liked Jeremy Maclin and Chase Coffman, two of the Mizzou receivers. He loved William Moore, the team's bruising strong safety. But, most of all, he was interested in the Tigers' quarterback and star, a stocky, strong-armed senior named Chase Daniel.
"I like to see that the quarterback can hit a receiver in stride, so he doesn't have to slow for the ball," Shonka began. He had a stack of evaluation forms next to him and, as he watched the game, he was charting and grading every throw that Daniel made. "Then judgment. Hey, if it's not there, throw it away and play another day. Will he stand in there and take a hit, with a guy breathing down his face? Will he be able to step right in there, throw, and still take that hit? Does the guy throw better when he's in the pocket, or does he throw equally well when he's on the move? You want a great competitor. Durability. Can they hold up, their strength, toughness? Can they make big plays? Can they lead a team down the field and score late in the game? Can they see the field? When your team's way ahead, that's fine. But when you're getting your ass kicked I want to see what you're going to do."
He pointed to his screen. Daniel had thrown a dart, and, just as he did, a defensive player had hit him squarely. "See how he popped up?" Shonka said. "He stood right there and threw the ball in the face of that rush. This kid has got a lot of courage." Daniel was six feet tall and weighed two hundred and twenty-five pounds: thick through the chest and trunk. He carried himself with a self-assurance that bordered on cockiness. He threw quickly and in rhythm. He nimbly evaded defenders. He made short throws with touch and longer throws with accuracy. By the game's end, he had completed an astonishing seventy-eight per cent of his passes, and handed Nebraska its worst home defeat in fifty-three years. "He can zip it," Shonka said. "He can really gun, when he has to." Shonka had seen all the promising college quarterbacks, charted and graded their throws, and to his mind Daniel was special: "He might be one of the best college quarterbacks in the country."
But then Shonka began to talk about when he was on the staff of the Philadelphia Eagles, in 1999. Five quarterbacks were taken in the first round of the college draft that year, and each looked as promising as Chase Daniel did now. But only one of them, Donovan McNabb, ended up fulfilling that promise. Of the rest, one descended into mediocrity after a decent start. Two were complete busts, and the last was so awful that after failing out of the N.F.L. he ended up failing out of the Canadian Football League as well.
The year before, the same thing happened with Ryan Leaf, who was the Chase Daniel of 1998. The San Diego Chargers made him the second player taken over all in the draft, and gave him an eleven-million-dollar signing bonus. Leaf turned out to be terrible. In 2002, it was Joey Harrington's turn. Harrington was a golden boy out of the University of Oregon, and the third player taken in the draft. Shonka still can't get over what happened to him.
"I tell you, I saw Joey live," he said. "This guy threw lasers, he could throw under tight spots, he had the arm strength, he had the size, he had the intelligence." Shonka got as misty as a two-hundred-and-eighty-pound ex-linebacker in a black tracksuit can get. "He's a concert pianist, you know? I really—I mean, I really—liked Joey." And yet Harrington's career consisted of a failed stint with the Detroit Lions and a slide into obscurity. Shonka looked back at the screen, where the young man he felt might be the best quarterback in the country was marching his team up and down the field. "How will that ability translate to the National Football League?" He shook his head slowly. "Shoot."
This is the quarterback problem. There are certain jobs where almost nothing you can learn about candidates before they start predicts how they'll do once they're hired. So how do we know whom to choose in cases like that? In recent years, a number of fields have begun to wrestle with this problem, but none with such profound social consequences as the profession of teaching.
2.
One of the most important tools in contemporary educational research is "value-added" analysis. It uses standardized test scores to look at how much the academic performance of students in a given teacher's classroom changes between the beginning and the end of the school year. Suppose that Mrs. Brown and Mr. Smith both teach a classroom of third graders who score at the fiftieth percentile on math and reading tests on the first day of school, in September. When the students are retested, in June, Mrs. Brown's class scores at the seventieth percentile, while Mr. Smith's students have fallen to the fortieth percentile. That change in the students' rankings, value-added theory says, is a meaningful indicator of how much more effective Mrs. Brown is as a teacher than Mr. Smith.
It's only a crude measure, of course. A teacher is not solely responsible for how much is learned in a classroom, and not everything of value that a teacher imparts to his or her students can be captured on a standardized test. Nonetheless, if you follow Brown and Smith for three or four years, their effect on their students' test scores starts to become predictable: with enough data, it is possible to identify who the very good teachers are and who the very poor teachers are. What's more—and this is the finding that has galvanized the educational world—the difference between good teachers and poor teachers turns out to be vast.
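For readers who want the arithmetic spelled out, here is a minimal sketch, in Python, of the crude value-added measure described above. The teacher names and percentile figures are simply the Mrs. Brown and Mr. Smith illustration from the preceding paragraphs; nothing in it reflects Hanushek's, or anyone else's, actual statistical model.

    # A minimal sketch of the crude value-added measure described above.
    # The names and numbers are the article's Mrs. Brown / Mr. Smith
    # illustration, not real data or a real research model.

    def value_added(september_percentile, june_percentile):
        """Change in a class's percentile ranking over one school year."""
        return june_percentile - september_percentile

    classes = {
        "Mrs. Brown": (50, 70),  # starts at the 50th percentile, ends at the 70th
        "Mr. Smith": (50, 40),   # starts at the 50th percentile, falls to the 40th
    }

    for teacher, (september, june) in classes.items():
        gain = value_added(september, june)
        print(f"{teacher}: {gain:+d} percentile points over the year")

    # Mrs. Brown comes out at +20, Mr. Smith at -10. Tracked over three or
    # four years, the sign and size of these gains become predictable enough
    # to label teachers very good or very poor, which is the point above.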
Eric Hanushek, an economist at Stanford, estimates that the students of a very bad teacher will learn, on average, half a year's worth of material in one school year. The students in the class of a very good teacher will learn a year and a half's worth of material. That difference amounts to a year's worth of learning in a single year. Teacher effects dwarf school effects: your child is actually better off in a "bad" school with an excellent teacher than in an excellent school with a bad teacher. Teacher effects are also much stronger than class-size effects. You'd have to cut the average class almost in half to get the same boost that you'd get if you switched from an average teacher to a teacher in the eighty-fifth percentile. And remember that a good teacher costs as much as an average one, whereas halving class size would require that you build twice as many classrooms and hire twice as many teachers.
Hanushek recently did a back-of-the-envelope calculation about what even a rudimentary focus on teacher quality could mean for the United States. If you rank the countries of the world in terms of the academic performance of their schoolchildren, the U.S. is just below average, half a standard deviation below a clump of relatively high-performing countries like Canada and Belgium. According to Hanushek, the U.S. could close that gap simply by replacing the bottom six per cent to ten per cent of public-school teachers with teachers of average quality. After years of worrying about issues like school funding levels, class size, and curriculum design, many reformers have come to the conclusion that nothing matters more than finding people with the potential to be great teachers. But there's a hitch: no one knows what a person with the potential to be a great teacher looks like. The school system has a quarterback problem.
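To get a feel for why swapping out a small slice of the weakest teachers moves the overall average, here is a rough simulation in Python. The normal distribution, the 0.25 spread, and the eight-per-cent cutoff are my own illustrative assumptions, chosen only to echo the back-of-the-envelope spirit of Hanushek's claim; they are not his figures or his method.

    # Rough, hedged simulation of the replacement argument above. All
    # distributional assumptions are illustrative, not Hanushek's.
    import random
    import statistics

    random.seed(0)

    # Treat each teacher as contributing "years of learning per school year,"
    # centered on 1.0, with the worst near 0.5 and the best near 1.5, roughly
    # matching the range described in the article.
    teachers = [random.gauss(1.0, 0.25) for _ in range(100_000)]
    before = statistics.mean(teachers)

    # Replace the bottom 8 per cent (the middle of the six-to-ten-per-cent
    # range) with teachers of average quality.
    cutoff = sorted(teachers)[int(0.08 * len(teachers))]
    replaced = [1.0 if t < cutoff else t for t in teachers]
    after = statistics.mean(replaced)

    print(f"average learning per year, before: {before:.3f}")
    print(f"average learning per year, after:  {after:.3f}")

    # The average rises by a few hundredths of a year of learning annually,
    # and that gain compounds over a student's schooling; this captures the
    # flavor, if not the detail, of the gap-closing claim above.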
3.
Kickoff time for Missouri's game against Oklahoma State was seven o'clock. It was a perfect evening for football: cloudless skies and a light fall breeze. For hours, fans had been tailgating in the parking lots around the stadium. Cars lined the roads leading to the university, many with fuzzy yellow-and-black Tiger tails hanging from their trunks. It was one of Mizzou's biggest games in years. The Tigers were undefeated, and had a chance to become the No. 1 college football team in the country. Shonka made his way through the milling crowds and took a seat in the press box. Below him, the players on the field looked like pieces on a chessboard.
The Tigers held the ball first. Chase Daniel stood a good seven yards behind his offensive line. He had five receivers, two to his left and three to his right, spaced from one side of the field to the other. His linemen were widely spaced as well. In play after play, Daniel caught the snap from his center, planted his feet, and threw the ball in quick seven- and eight-yard diagonal passes to one of his five receivers.
The style of offense that the Tigers run is called the "spread," and most of the top quarterbacks in college football—the players who will be drafted into the pros—are spread quarterbacks. By spacing out the offensive linemen and wide receivers, the system makes it easy for the quarterback to figure out the intentions of the opposing defense before the ball is snapped: he can look up and down the line, "read" the defense, and decide where to throw the ball before anyone has moved a muscle. Daniel had been playing in the spread since high school; he was its master. "Look how quickly he gets the ball out," Shonka said. "You can hardly go a thousand and one, a thousand and two, and it's out of his hand. He knows right where he's going. When everyone is spread out like that, the defense can't disguise its coverage. Chase knows right away what they are going to do. The system simplifies the quarterback's decisions."
But for Shonka this didn't help matters. It had always been hard to predict how a college quarterback would fare in the pros. The professional game was, simply, faster and more complicated. With the advent of the spread, though, the correspondence between the two levels of play had broken down almost entirely. N.F.L. teams don't run the spread. They can't. The defenders in the pros are so much faster than their college counterparts that they would shoot through those big gaps in the offensive line and flatten the quarterback. In the N.F.L., the offensive line is bunched closely together. Daniel wouldn't have five receivers. Most of the time, he'd have just three or four. He wouldn't have the luxury of standing seven yards behind the center, planting his feet, and knowing instantly where to throw. He'd have to crouch right behind the center, take the snap directly, and run backward before planting his feet to throw. The onrushing defenders wouldn't be seven yards away. They would be all around him, from the start. The defense would no longer have to show its hand, because the field would not be so spread out. It could now disguise its intentions. Daniel wouldn't be able to read the defense before the snap was taken. He'd have to read it in the seconds after the play began.
"In the spread, you see a lot of guys wide open," Shonka said. "But when a guy like Chase goes to the N.F.L. he's never going to see his receivers that open—only in some rare case, like someone slips or there's a bust in the coverage. When that ball's leaving your hands in the pros, if you don't use your eyes to move the defender a little bit, they'll break on the ball and intercept it. The athletic ability that they're playing against in the league is unbelievable."
As Shonka talked, Daniel was moving his team down the field. But he was almost always throwing those quick, diagonal passes. In the N.F.L., he would have to do much more than that—he would have to throw long, vertical passes over the top of the defense. Could he make that kind of throw? Shonka didn't know. There was also the matter of his height. Six feet was fine in a spread system, where the big gaps in the offensive line gave Daniel plenty of opportunity to throw the ball and see downfield. But in the N.F.L. there wouldn't be gaps, and the linemen rushing at him would be six-five, not six-one.
"I wonder," Shonka went on. "Can he see? Can he be productive in a new kind of offense? How will he handle that? I'd like to see him set up quickly from center. I'd like to see his ability to read coverages that are not in the spread. I'd like to see him in the pocket. I'd like to see him move his feet. I'd like to see him do a deep dig, or deep comeback. You know, like a throw twenty to twenty-five yards down the field."
It was clear that Shonka didn't feel the same hesitancy in evaluating the other Mizzou stars—the safety Moore, the receivers Maclin and Coffman. The game that they would play in the pros would also be different from the game they were playing in college, but the difference was merely one of degree. They had succeeded at Missouri because they were strong and fast and skilled, and these traits translate in kind to professional football.
A college quarterback joining the N.F.L., by contrast, has to learn to play an entirely new game. Shonka began to talk about Tim Couch, the quarterback taken first in that legendary draft of 1999. Couch set every record imaginable in his years at the University of Kentucky. "They used to put five garbage cans on the field," Shonka recalled, shaking his head, "and Couch would stand there and throw and just drop the ball into every one." But Couch was a flop in the pros. It wasn't that professional quarterbacks didn't need to be accurate. It was that the kind of accuracy required to do the job well could be measured only in a real N.F.L. game.
Similarly, all quarterbacks drafted into the pros are required to take an I.Q. test—the Wonderlic Personnel Test. The theory behind the test is that the pro game is so much more cognitively demanding than the college game that high intelligence should be a good predictor of success. But when the economists David Berri and Rob Simmons analyzed the scores—which are routinely leaked to the press—they found that Wonderlic scores are all but useless as predictors. Of the five quarterbacks taken in round one of the 1999 draft, Donovan McNabb, the only one of the five with a shot at the Hall of Fame, had the lowest Wonderlic score. And who else had I.Q. scores in the same range as McNabb? Dan Marino and Terry Bradshaw, two of the greatest quarterbacks ever to play the game.
We're used to dealing with prediction problems by going back and looking for better predictors. We now realize that being a good doctor requires the ability to communicate, listen, and empathize—and so there is increasing pressure on medical schools to pay attention to interpersonal skills as well as to test scores. We can have better physicians if we're just smarter about how we choose medical-school students. But no one is saying that Dan Shonka is somehow missing some key ingredient in his analysis; that if he were only more perceptive he could predict Chase Daniel's career trajectory. The problem with picking quarterbacks is that Chase Daniel's performance can't be predicted. The job he's being groomed for is so particular and specialized that there is no way to know who will succeed at it and who won't. In fact, Berri and Simmons found no connection between where a quarterback was taken in the draft—that is, how highly he was rated on the basis of his college performance—and how well he played in the pros.
The entire time that Chase Daniel was on the field against Oklahoma State, his backup, Chase Patton, stood on the sidelines, watching. Patton didn't play a single down. In his four years at Missouri, up to that point, he had thrown a total of twenty-six passes. And yet there were people in Shonka's world who thought that Patton would end up as a better professional quarterback than Daniel. The week of the Oklahoma State game, the national sports magazine ESPN even put the two players on its cover, with the title "CHASE DANIEL MIGHT WIN THE HEISMAN"—referring to the trophy given to college football's best player. "HIS BACKUP COULD WIN THE SUPER BOWL." Why did everyone like Patton so much? It wasn't clear. Maybe he looked good in practice. Maybe it was because this season in the N.F.L. a quarterback who had also never started a single college game is playing superbly for the New England Patriots. It sounds absurd to put an athlete on the cover of a magazine for no particular reason. But perhaps that's just the quarterback problem taken to an extreme. If college performance doesn't tell us anything, why shouldn't we value someone who hasn't had the chance to play as highly as someone who plays as well as anyone in the land?
4.
Picture a young preschool teacher, sitting on a classroom floor surrounded by seven children. She is holding an alphabet book, and working through the letters with the children, one by one: " 'A' is for apple. . . . 'C' is for cow." The session was taped, and the videotape is being watched by a group of experts, who are charting and grading each of the teacher's moves.
After thirty seconds, the leader of the group—Bob Pianta, the dean of the University of Virginia's Curry School of Education—stops the tape. He points to two little girls on the right side of the circle. They are unusually active, leaning into the circle and reaching out to touch the book.
"What I'm struck by is how lively the affect is in this room," Pianta said. "One of the things the teacher is doing is creating a holding space for that. And what distinguishes her from other teachers is that she flexibly allows the kids to move and point to the book. She's not rigidly forcing the kids to sit back."
Pianta's team has developed a system for evaluating various competencies relating to student-teacher interaction. Among them is "regard for student perspective"; that is, a teacher's knack for allowing students some flexibility in how they become engaged in the classroom. Pianta stopped and rewound the tape twice, until what the teacher had managed to achieve became plain: the children were active, but somehow the class hadn't become a free-for-all.
"A lesser teacher would have responded to the kids' leaning over as misbehavior," Pianta went on. " 'We can't do this right now. You need to be sitting still.' She would have turned this off."
Bridget Hamre, one of Pianta's colleagues, chimed in: "These are three- and four-year-olds. At this age, when kids show their engagement it's not like the way we show our engagement, where we look alert. They're leaning forward and wriggling. That's their way of doing it. And a good teacher doesn't interpret that as bad behavior. You can see how hard it is to teach new teachers this idea, because the minute you teach them to have regard for the student's perspective, they think you have to give up control of the classroom."
The lesson continued. Pianta pointed out how the teacher managed to personalize the material. " 'C' is for cow" turned into a short discussion of which of the kids had ever visited a farm. "Almost every time a child says something, she responds to it, which is what we describe as teacher sensitivity," Hamre said.
The teacher then asked the children if anyone's name began with that letter. "Calvin," a boy named Calvin says. The teacher nods, and says, "Calvin starts with 'C.' " A little girl in the middle says, "Me!" The teacher turns to her. "Your name's Venisha. Letter 'V.' Venisha."
It was a key moment. Of all the teacher elements analyzed by the Virginia group, feedback—a direct, personal response by a teacher to a specific statement by a student—seems to be most closely linked to academic success. Not only did the teacher catch the "Me!" amid the wiggling and tumult; she addressed it directly.
"Mind you, that's not great feedback," Hamre said. "High-quality feedback is where there is a back-and-forth exchange to get a deeper understanding." The perfect way to handle that moment would have been for the teacher to pause and pull out Venisha's name card, point to the letter "V," show her how different it is from "C," and make the class sound out both letters. But the teacher didn't do that—either because it didn't occur to her or because she was distracted by the wiggling of the girls to her right.
"On the other hand, she could have completely ignored the girl, which happens a lot," Hamre went on. "The other thing that happens a lot is the teacher will just say, 'You're wrong.' Yes-no feedback is probably the predominant kind of feedback, which provides almost no information for the kid in terms of learning."
Pianta showed another tape, of a nearly identical situation: a circle of pre-schoolers around a teacher. The lesson was about how we can tell when someone is happy or sad. The teacher began by acting out a short conversation between two hand puppets, Henrietta and Twiggle: Twiggle is sad until Henrietta shares some watermelon with him.
"The idea that the teacher is trying to get across is that you can tell by looking at somebody's face how they're feeling, whether they're feeling sad or happy," Hamre said. "What kids of this age tend to say is you can tell how they're feeling because of something that happened to them. They lost their puppy and that's why they're sad. They don't really get this idea. So she's been challenged, and she's struggling."
The teacher begins, "Remember when we did something and we drew our face?" She touches her face, pointing out her eyes and mouth. "When somebody is happy, their face tells us that they're happy. And their eyes tell us." The children look on blankly. The teacher plunges on: "Watch, watch." She smiles broadly. "This is happy! How can you tell that I'm happy? Look at my face. Tell me what changes about my face when I'm happy. No, no, look at my face. . . . No. . . ."
A little girl next to her says, "Eyes," providing the teacher with an opportunity to use one of her students to draw the lesson out. But the teacher doesn't hear her. Again, she asks, "What's changed about my face?" She smiles and she frowns, as if she can reach the children by sheer force of repetition. Pianta stopped the tape. One problem, he pointed out, was that Henrietta made Twiggle happy by sharing watermelon with him, which doesn't illustrate what the lesson is about.
"You know, a better way to handle this would be to anchor something around the kids," Pianta said. "She should ask, 'What makes you feel happy?' The kids could answer. Then she could say, 'Show me your face when you have that feeling? O.K., what does So-and-So's face look like? Now tell me what makes you sad. Show me your face when you're sad. Oh, look, her face changed!' You've basically made the point. And then you could have the kids practice, or something. But this is going to go nowhere."
"What's changed about my face?" the teacher repeated, for what seemed like the hundredth time. One boy leaned forward into the circle, trying to engage himself in the lesson, in the way that little children do. His eyes were on the teacher. "Sit up!" she snapped at him.
As Pianta played one tape after another, the patterns started to become clear. Here was a teacher who read out sentences, in a spelling test, and every sentence came from her own life—"I went to a wedding last week"—which meant she was missing an opportunity to say something that engaged her students. Another teacher walked over to a computer to do a PowerPoint presentation, only to realize that she hadn't turned it on. As she waited for it to boot up, the classroom slid into chaos.
Then there was the superstar—a young high-school math teacher, in jeans and a green polo shirt. "So let's see," he began, standing up at the blackboard. "Special right triangles. We're going to do practice with this, just throwing out ideas." He drew two triangles. "Label the length of the side, if you can. If you can't, we'll all do it." He was talking and moving quickly, which Pianta said might be interpreted as a bad thing, because this was trigonometry. It wasn't easy material. But his energy seemed to infect the class. And all the time he offered the promise of help. If you can't, we'll all do it. In a corner of the room was a student named Ben, who'd evidently missed a few classes. "See what you can remember, Ben," the teacher said. Ben was lost. The teacher quickly went to his side: "I'm going to give you a way to get to it." He made a quick suggestion: "How about that?" Ben went back to work. The teacher slipped over to the student next to Ben, and glanced at her work. "That's all right!" He went to a third student, then a fourth. Two and a half minutes into the lesson—the length of time it took that subpar teacher to turn on the computer—he had already laid out the problem, checked in with nearly every student in the class, and was back at the blackboard, to take the lesson a step further.
"In a group like this, the standard m.o. would be: he's at the board, broadcasting to the kids, and has no idea who knows what he's doing and who doesn't know," Pianta said. "But he's giving individualized feedback. He's off the charts on feedback." Pianta and his team watched in awe.
5.
Educational-reform efforts typically start with a push for higher standards for teachers—that is, for the academic and cognitive requirements for entering the profession to be as stiff as possible. But after you've watched Pianta's tapes, and seen how complex the elements of effective teaching are, this emphasis on book smarts suddenly seems peculiar. The preschool teacher with the alphabet book was sensitive to her students' needs and knew how to let the two girls on the right wiggle and squirm without disrupting the rest of the students; the trigonometry teacher knew how to complete a circuit of his classroom in two and a half minutes and make every student feel as if he or she were getting personal attention. But these aren't cognitive skills.
A group of researchers—Thomas J. Kane, an economist at Harvard's school of education; Douglas Staiger, an economist at Dartmouth; and Robert Gordon, a policy analyst at the Center for American Progress—have investigated whether it helps to have a teacher who has earned a teaching certification or a master's degree. Both are expensive, time-consuming credentials that almost every district expects teachers to acquire; neither makes a difference in the classroom. Test scores, graduate degrees, and certifications—as much as they appear related to teaching prowess—turn out to be about as useful in predicting success as having a quarterback throw footballs into a bunch of garbage cans.
Another educational researcher, Jacob Kounin, once did an analysis of "desist" events, in which a teacher has to stop some kind of misbehavior. In one instance, "Mary leans toward the table to her right and whispers to Jane. Both she and Jane giggle. The teacher says, 'Mary and Jane, stop that!' " That's a desist event. But how a teacher desists—her tone of voice, her attitudes, her choice of words—appears to make no difference at all in maintaining an orderly classroom. How can that be? Kounin went back over the videotape and noticed that forty-five seconds before Mary whispered to Jane, Lucy and John had started whispering. Then Robert had noticed and joined in, making Jane giggle, whereupon Jane said something to John. Then Mary whispered to Jane. It was a contagious chain of misbehavior, and what really was significant was not how a teacher stopped the deviancy at the end of the chain but whether she was able to stop the chain before it started. Kounin called that ability "withitness," which he defined as "a teacher's communicating to the children by her actual behavior (rather than by verbally announcing: 'I know what's going on') that she knows what the children are doing, or has the proverbial 'eyes in the back of her head.' " It stands to reason that to be a great teacher you have to have withitness. But how do you know whether someone has withitness until she stands up in front of a classroom of twenty-five wiggly Janes, Lucys, Johns, and Roberts and tries to impose order?
6.
Perhaps no profession has taken the implications of the quarterback problem more seriously than the financial-advice field, and the experience of financial advisers is a useful guide to what could happen in teaching as well. There are no formal qualifications for entering the field except a college degree. Financial-services firms don't look only for the best students, require graduate degrees, or specify a list of prerequisites. No one knows beforehand what makes a high-performing financial adviser different from a low-performing one, so the field throws the door wide open.
"A question I ask is, 'Give me a typical day,' " Ed Deutschlander, the co-president of North Star Resource Group, in Minneapolis, says. "If that person says, 'I get up at five-thirty, hit the gym, go to the library, go to class, go to my job, do homework until eleven,' that person has a chance." Deutschlander, in other words, begins by looking for the same general traits that every corporate recruiter looks for.
Deutschlander says that last year his firm interviewed about a thousand people, and found forty-nine it liked, a ratio of twenty interviewees to one candidate. Those candidates were put through a four-month "training camp," in which they tried to act like real financial advisers. "They should be able to obtain in that four-month period a minimum of ten official clients," Deutschlander said. "If someone can obtain ten clients, and is able to maintain a minimum of ten meetings a week, that means that person has gathered over a hundred introductions in that four-month period. Then we know that person is at least fast enough to play this game."
Of the forty-nine people invited to the training camp, twenty-three made the cut and were hired as apprentice advisers. Then the real sorting began. "Even with the top performers, it really takes three to four years to see whether someone can make it," Deutschlander says. "You're just scratching the surface at the beginning. Four years from now, I expect to hang on to at least thirty to forty per cent of that twenty-three."
People like Deutschlander are referred to as gatekeepers, a title that suggests that those at the door of a profession are expected to discriminate—to select who gets through the gate and who doesn't. But Deutschlander sees his role as keeping the gate as wide open as possible: to find ten new financial advisers, he's willing to interview a thousand people. The equivalent of that approach, in the N.F.L., would be for a team to give up trying to figure out who the "best" college quarterback is, and, instead, try out three or four "good" candidates.
In teaching, the implications are even more profound. They suggest that we shouldn't be raising standards. We should be lowering them, because there is no point in raising standards if standards don't track with what we care about. Teaching should be open to anyone with a pulse and a college degree—and teachers should be judged after they have started their jobs, not before. That means that the profession needs to start the equivalent of Ed Deutschlander's training camp. It needs an apprenticeship system that allows candidates to be rigorously evaluated. Kane and Staiger have calculated that, given the enormous differences between the top and the bottom of the profession, you'd probably have to try out four candidates to find one good teacher. That means tenure can't be routinely awarded, the way it is now. Currently, the salary structure of the teaching profession is highly rigid, and that would also have to change in a world where we want to rate teachers on their actual performance. An apprentice should get apprentice wages. But if we find eighty-fifth-percentile teachers who can teach a year and a half's material in one year, we're going to have to pay them a lot—both because we want them to stay and because the only way to get people to try out for what will suddenly be a high-risk profession is to offer those who survive the winnowing a healthy reward.
Is this solution to teaching's quarterback problem politically possible? Taxpayers might well balk at the costs of trying out four teachers to find one good one. Teachers' unions have been resistant to even the slightest move away from the current tenure arrangement. But all the reformers want is for the teaching profession to copy what firms like North Star have been doing for years. Deutschlander interviews a thousand people to find ten advisers. He spends large amounts of money to figure out who has the particular mixture of abilities to do the job. "Between hard and soft costs," he says, "most firms sink between a hundred thousand dollars and two hundred and fifty thousand dollars on someone in their first three or four years," and in most cases, of course, that investment comes to naught. But, if you were willing to make that kind of investment and show that kind of patience, you wound up with a truly high-performing financial adviser. "We have a hundred and twenty-five full-time advisers," Deutschlander says. "Last year, we had seventy-one of them qualify for the Million Dollar Round Table"—the industry's association of its most successful practitioners. "We're seventy-one out of a hundred and twenty-five in that élite group." What does it say about a society that it devotes more care and patience to the selection of those who handle its money than of those who handle its children?
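Deutschlander's figures lend themselves to a quick funnel calculation, sketched below in Python. The interview, hiring, retention, and cost numbers are the ones quoted above; the midpoints I pick, and the resulting cost-per-retained-adviser figure, are my own rough arithmetic, not anything North Star has published.

    # Back-of-the-envelope funnel arithmetic using the figures quoted above.
    # Midpoints and the final cost-per-keeper number are my own rough choices.

    interviewed = 1000
    hired_as_apprentices = 23
    retention_rate = 0.35        # midpoint of "thirty to forty per cent"
    cost_per_trainee = 175_000   # midpoint of $100,000 to $250,000 over 3-4 years

    kept = hired_as_apprentices * retention_rate
    print(f"interviews per adviser ultimately kept: {interviewed / kept:.0f}")
    print(f"sunk cost per adviser ultimately kept: ${hired_as_apprentices * cost_per_trainee / kept:,.0f}")

    # Roughly 125 interviews and about half a million dollars for every
    # adviser who survives the winnowing; the article's point is that nothing
    # comparable is spent finding out who can actually teach.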
7.
Midway through the fourth quarter of the Oklahoma State–Missouri game, the Tigers were in trouble. For the first time all year, they were behind late in the game. They needed to score, or they'd lose any chance of a national championship. Daniel took the snap from his center, and planted his feet to pass. His receivers were covered. He began to run. The Oklahoma State defenders closed in on him. He was under pressure, something that rarely happened to him in the spread. Desperate, he heaved the ball downfield, right into the arms of a Cowboy defender.
Shonka jumped up. "That's not like him!" he cried out. "He doesn't throw stuff up like that."
Next to Shonka, a scout for the Kansas City Chiefs looked crestfallen. "Chase never throws something up for grabs!"
It was tempting to see Daniel's mistake as definitive. The spread had broken down. He was finally under pressure. This was what it would be like to be an N.F.L. quarterback, wasn't it? But there is nothing like being an N.F.L. quarterback except being an N.F.L. quarterback. A prediction, in a field where prediction is not possible, is no more than a prejudice. Maybe that interception means that Daniel won't be a good professional quarterback, or maybe he made a mistake that he'll learn from. "In a great big piece of pie," Shonka said, "that was just a little slice."
How David Beats Goliath
May 11, 2009
Annals of Innovation
When underdogs break the rules.
1.
When Vivek Ranadivé decided to coach his daughter Anjali's basketball team, he settled on two principles. The first was that he would never raise his voice. This was National Junior Basketball—the Little League of basketball. The team was made up mostly of twelve-year-olds, and twelve-year-olds, he knew from experience, did not respond well to shouting. He would conduct business on the basketball court, he decided, the same way he conducted business at his software firm. He would speak calmly and softly, and convince the girls of the wisdom of his approach with appeals to reason and common sense.
The second principle was more important. Ranadivé was puzzled by the way Americans played basketball. He is from Mumbai. He grew up with cricket and soccer. He would never forget the first time he saw a basketball game. He thought it was mindless. Team A would score and then immediately retreat to its own end of the court. Team B would inbound the ball and dribble it into Team A's end, where Team A was patiently waiting. Then the process would reverse itself. A basketball court was ninety-four feet long. But most of the time a team defended only about twenty-four feet of that, conceding the other seventy feet. Occasionally, teams would play a full-court press—that is, they would contest their opponent's attempt to advance the ball up the court. But they would do it for only a few minutes at a time. It was as if there were a kind of conspiracy in the basketball world about the way the game ought to be played, and Ranadivé thought that that conspiracy had the effect of widening the gap between good teams and weak teams. Good teams, after all, had players who were tall and could dribble and shoot well; they could crisply execute their carefully prepared plays in their opponent's end. Why, then, did weak teams play in a way that made it easy for good teams to do the very things that made them so good?
Ranadivé looked at his girls. Morgan and Julia were serious basketball players. But Nicky, Angela, Dani, Holly, Annika, and his own daughter, Anjali, had never played the game before. They weren't all that tall. They couldn't shoot. They weren't particularly adept at dribbling. They were not the sort who played pickup games at the playground every evening. Most of them were, as Ranadivé says, "little blond girls" from Menlo Park and Redwood City, the heart of Silicon Valley. These were the daughters of computer programmers and people with graduate degrees. They worked on science projects, and read books, and went on ski vacations with their parents, and dreamed about growing up to be marine biologists. Ranadivé knew that if they played the conventional way—if they let their opponents dribble the ball up the court without opposition—they would almost certainly lose to the girls for whom basketball was a passion. Ranadivé came to America as a seventeen-year-old, with fifty dollars in his pocket. He was not one to accept losing easily. His second principle, then, was that his team would play a real full-court press, every game, all the time. The team ended up at the national championships. "It was really random," Anjali Ranadivé said. "I mean, my father had never played basketball before."
2.
David's victory over Goliath, in the Biblical account, is held to be an anomaly. It was not. Davids win all the time. The political scientist Ivan Arreguín-Toft recently looked at every war fought in the past two hundred years between strong and weak combatants. The Goliaths, he found, won in 71.5 per cent of the cases. That is a remarkable fact. Arreguín-Toft was analyzing conflicts in which one side was at least ten times as powerful—in terms of armed might and population—as its opponent, and even in those lopsided contests the underdog won almost a third of the time.
In the Biblical story of David and Goliath, David initially put on a coat of mail and a brass helmet and girded himself with a sword: he prepared to wage a conventional battle of swords against Goliath. But then he stopped. "I cannot walk in these, for I am unused to it," he said (in Robert Alter's translation), and picked up those five smooth stones. What happened, Arreguín-Toft wondered, when the underdogs likewise acknowledged their weakness and chose an unconventional strategy? He went back and re-analyzed his data. In those cases, David's winning percentage went from 28.5 to 63.6. When underdogs choose not to play by Goliath's rules, they win, Arreguín-Toft concluded, "even when everything we think we know about power says they shouldn't."
Consider the way T. E. Lawrence (or, as he is better known, Lawrence of Arabia) led the revolt against the Ottoman Army occupying Arabia near the end of the First World War. The British were helping the Arabs in their uprising, and the initial focus was Medina, the city at the end of a long railroad that the Turks had built, running south from Damascus and down through the Hejaz desert. The Turks had amassed a large force in Medina, and the British leadership wanted Lawrence to gather the Arabs and destroy the Turkish garrison there, before the Turks could threaten the entire region.
But when Lawrence looked at his ragtag band of Bedouin fighters he realized that a direct attack on Medina would never succeed. And why did taking the city matter, anyway? The Turks sat in Medina "on the defensive, immobile." There were so many of them, consuming so much food and fuel and water, that they could hardly make a major move across the desert. Instead of attacking the Turks at their point of strength, Lawrence reasoned, he ought to attack them where they were weak—along the vast, largely unguarded length of railway line that was their connection to Damascus. Instead of focussing his attention on Medina, he should wage war over the broadest territory possible.
The Bedouins under Lawrence's command were not, in conventional terms, skilled troops. They were nomads. Sir Reginald Wingate, one of the British commanders in the region, called them "an untrained rabble, most of whom have never fired a rifle." But they were tough and they were mobile. The typical Bedouin soldier carried no more than a rifle, a hundred rounds of ammunition, forty-five pounds of flour, and a pint of drinking water, which meant that he could travel as much as a hundred and ten miles a day across the desert, even in summer. "Our cards were speed and time, not hitting power," Lawrence wrote. "Our largest available resources were the tribesmen, men quite unused to formal warfare, whose assets were movement, endurance, individual intelligence, knowledge of the country, courage." The eighteenth-century general Maurice de Saxe famously said that the art of war was about legs, not arms, and Lawrence's troops were all legs. In one typical stretch, in the spring of 1917, his men dynamited sixty rails and cut a telegraph line at Buair on March 24th, sabotaged a train and twenty-five rails at Abu al-Naam on March 25th, dynamited fifteen rails and cut a telegraph line at Istabl Antar on March 27th, raided a Turkish garrison and derailed a train on March 29th, returned to Buair and sabotaged the railway line again on March 31st, dynamited eleven rails at Hediah on April 3rd, raided the train line in the area of Wadi Dhaiji on April 4th and 5th, and attacked twice on April 6th.
Lawrence's masterstroke was an assault on the port town of Aqaba. The Turks expected an attack from British ships patrolling the waters of the Gulf of Aqaba to the west. Lawrence decided to attack from the east instead, coming at the city from the unprotected desert, and to do that he led his men on an audacious, six-hundred-mile loop—up from the Hejaz, north into the Syrian desert, and then back down toward Aqaba. This was in summer, through some of the most inhospitable land in the Middle East, and Lawrence tacked on a side trip to the outskirts of Damascus, in order to mislead the Turks about his intentions. "This year the valley seemed creeping with horned vipers and puff-adders, cobras and black snakes," Lawrence writes in "The Seven Pillars of Wisdom" of one stage in the journey:
We could not lightly draw water after dark, for there were snakes swimming in the pools or clustering in knots around their brinks. Twice puff-adders came twisting into the alert ring of our debating coffee-circle. Three of our men died of bites; four recovered after great fear and pain, and a swelling of the poisoned limb. Howeitat treatment was to bind up the part with snake-skin plaster and read chapters of the Koran to the sufferer until he died.
When they finally arrived at Aqaba, Lawrence's band of several hundred warriors killed or captured twelve hundred Turks, and lost only two men. The Turks simply did not think that their opponent would be mad enough to come at them from the desert. This was Lawrence's great insight. David can beat Goliath by substituting effort for ability—and substituting effort for ability turns out to be a winning formula for underdogs in all walks of life, including little blond-haired girls on the basketball court.
3.
Vivek Ranadivé is an elegant man, slender and fine-boned, with impeccable manners and a languorous walk. His father was a pilot who was jailed by Indira Gandhi, he says, because he wouldn't stop challenging the safety of India's planes. Ranadivé went to M.I.T., because he saw a documentary on the school and decided that it was perfect for him. This was in the nineteen-seventies, when going abroad for undergraduate study required the Indian government to authorize the release of foreign currency, and Ranadivé camped outside the office of the governor of the Reserve Bank of India until he got his way. The Ranadivés are relentless.
In 1985, Ranadivé founded a software company in Silicon Valley devoted to what in the computer world is known as "real time" processing. If a businessman waits until the end of the month to collect and count his receipts, he's "batch processing." There is a gap between the events in the company—sales—and his understanding of those events. Wall Street used to be the same way. The information on which a trader based his decisions was scattered across a number of databases. The trader would collect information from here and there, collate and analyze it, and then make a trade. What Ranadivé's company, TIBCO, did was to consolidate those databases into one stream, so that the trader could collect all the data he wanted instantaneously. Batch processing was replaced by real-time processing. Today, TIBCO's software powers most of the trading floors on Wall Street.
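The distinction Ranadivé is drawing maps neatly onto code. Here is a minimal sketch of the two styles, in Python; the receipts, amounts, and function names are invented for illustration and have nothing to do with TIBCO's actual software.

# Batch processing: collect the events, then reconcile them at the end of the period.
def month_end_total(receipts):
    return sum(amount for _, amount in receipts)

# Real-time processing: act on each event the moment it arrives.
def stream_total(receipts):
    running = 0.0
    for day, amount in receipts:
        running += amount
        print(f"{day}: running total is {running:.2f}")  # visible immediately, not at month's end
    return running

receipts = [("Monday", 120.00), ("Tuesday", 85.50), ("Wednesday", 210.25)]
print("month-end total:", month_end_total(receipts))
stream_total(receipts)

The difference is not the arithmetic, since both functions produce the same total, but when the answer is available: at the end of the month, or after every sale.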
Ranadivé views this move from batch to real time as a sort of holy mission. The shift, to his mind, is one of kind, not just of degree. "We've been working with some airlines," he said. "You know, when you get on a plane and your bag doesn't, they actually know right away that it's not there. But no one tells you, and a big part of that is that they don't have all their information in one place. There are passenger systems that know where the passenger is. There are aircraft and maintenance systems that track where the plane is and what kind of shape it's in. Then, there are baggage systems and ticketing systems—and they're all separate. So you land, you wait at the baggage terminal, and it doesn't show up." Everything bad that happens in that scenario, Ranadivé maintains, happens because of the lag between the event (the luggage doesn't make it onto the plane) and the response (the airline tells you that your luggage didn't make the plane). The lag is why you're angry. The lag is why you had to wait, fruitlessly, at baggage claim. The lag is why you vow never to fly that airline again. Put all the databases together, and there's no lag. "What we can do is send you a text message the moment we know your bag didn't make it," Ranadivé said, "telling you we'll ship it to your house."
A few years ago, Ranadivé wrote a paper arguing that even the Federal Reserve ought to make its decisions in real time—not once every month or two. "Everything in the world is now real time," he said. "So when a certain type of shoe isn't selling at your corner shop, it's not six months before the guy in China finds out. It's almost instantaneous, thanks to my software. The world runs in real time, but government runs in batch. Every few months, it adjusts. Its mission is to keep the temperature comfortable in the economy, and, if you were to do things the government's way in your house, then every few months you'd turn the heater either on or off, overheating or underheating your house." Ranadivé argued that we ought to put the economic data that the Fed uses into a big stream, and write a computer program that sifts through those data, the moment they are collected, and makes immediate, incremental adjustments to interest rates and the money supply. "It can all be automated," he said. "Look, we've had only one soft landing since the Second World War. Basically, we've got it wrong every single time."
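His thermostat analogy can be put in the same terms. The sketch below is purely illustrative: the readings, the coefficients, and the adjustment rules are invented, and real monetary policy is nothing this simple. It only contrasts adjusting "every few months" with nudging a little as each new reading arrives.

import statistics

target = 2.0                                  # hypothetical inflation target, per cent
readings = [2.1, 2.4, 2.9, 3.3, 3.0, 2.6]     # hypothetical monthly data

def periodic_adjust(rate, window):
    # "Batch" policy: wait for several months of data, then make one large move.
    return rate + 0.5 * (statistics.mean(window) - target)

def real_time_adjust(rate, reading):
    # "Real time" policy: nudge the rate slightly the moment each reading arrives.
    return rate + 0.1 * (reading - target)

batch_rate = real_time_rate = 3.0             # hypothetical starting interest rate
for month, reading in enumerate(readings, start=1):
    real_time_rate = real_time_adjust(real_time_rate, reading)
    if month % 6 == 0:                        # the once-every-few-months meeting
        batch_rate = periodic_adjust(batch_rate, readings[:month])
    print(f"month {month}: real-time rate {real_time_rate:.2f}, batch rate {batch_rate:.2f}")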
You can imagine what someone like Alan Greenspan or Ben Bernanke might say about that idea. Such people are powerfully invested in the notion of the Fed as a Solomonic body: that pause of five or eight weeks between economic adjustments seems central to the process of deliberation. To Ranadivé, though, "deliberation" just prettifies the difficulties created by lag. The Fed has to deliberate because it's several weeks behind, the same way the airline has to bow and scrape and apologize because it waited forty-five minutes to tell you something that it could have told you the instant you stepped off the plane.
Is it any wonder that Ranadivé looked at the way basketball was played and found it mindless? A professional basketball game was forty-eight minutes long, divided up into alternating possessions of roughly twenty seconds: back and forth, back and forth. But a good half of each twenty-second increment was typically taken up with preliminaries and formalities. The point guard dribbled the ball up the court. He stood above the top of the key, about twenty-four feet from the opposing team's basket. He called out a play that the team had choreographed a hundred times in practice. It was only then that the defending team sprang into action, actively contesting each pass and shot. Actual basketball took up only half of that twenty-second interval, so that a game's real length was not forty-eight minutes but something closer to twenty-four minutes—and that twenty-four minutes of activity took place within a narrowly circumscribed area. It was as formal and as convention-bound as an eighteenth-century quadrille. The supporters of that dance said that the defensive players had to run back to their own end, in order to compose themselves for the arrival of the other team. But the reason they had to compose themselves, surely, was that by retreating they allowed the offense to execute a play that it had practiced to perfection. Basketball was batch!
Insurgents, though, operate in real time. Lawrence hit the Turks, in that stretch in the spring of 1917, nearly every day, because he knew that the more he accelerated the pace of combat the more the war became a battle of endurance—and endurance battles favor the insurgent. "And it happened as the Philistine arose and was drawing near David that David hastened and ran out from the lines toward the Philistine," the Bible says. "And he reached his hand into the pouch and took from there a stone and slung it and struck the Philistine in his forehead." The second sentence—the slingshot part—is what made David famous. But the first sentence matters just as much. David broke the rhythm of the encounter. He speeded it up. "The sudden astonishment when David sprints forward must have frozen Goliath, making him a better target," the poet and critic Robert Pinsky writes in "The Life of David." Pinsky calls David a "point guard ready to flick the basketball here or there." David pressed. That's what Davids do when they want to beat Goliaths.
4.
Ranadivé's basketball team played in the National Junior Basketball seventh-and-eighth-grade division, representing Redwood City. The girls practiced at Paye's Place, a gym in nearby San Carlos. Because Ranadivé had never played basketball, he recruited a series of experts to help him. The first was Roger Craig, the former all-pro running back for the San Francisco 49ers, who is also TIBCO's director of business development. As a football player, Craig was legendary for the off-season hill workouts he put himself through. Most of his N.F.L. teammates are now hobbling around golf courses. He has run seven marathons. After Craig signed on, he recruited his daughter Rometra, who played Division I basketball at Duke and U.S.C. Rometra was the kind of person you assigned to guard your opponent's best player in order to shut her down. The girls loved Rometra. "She has always been like my big sister," Anjali Ranadivé said. "It was so awesome to have her along."
Redwood City's strategy was built around the two deadlines that all basketball teams must meet in order to advance the ball. The first is the inbounds pass. When one team scores, a player from the other team takes the ball out of bounds and has five seconds to pass it to a teammate on the court. If that deadline is missed, the ball goes to the other team. Usually, that's not an issue, because teams don't contest the inbounds pass. They run back to their own end. Redwood City did not. Each girl on the team closely shadowed her counterpart. When some teams play the press, the defender plays behind the offensive player she's guarding, to impede her once she catches the ball. The Redwood City girls, by contrast, played in front of their opponents, to prevent them from catching the inbounds pass in the first place. And they didn't guard the player throwing the ball in. Why bother? Ranadivé used that extra player as a floater, who could serve as a second defender against the other team's best player. "Think about football," Ranadivé said. "The quarterback can run with the ball. He has the whole field to throw to, and it's still damned difficult to complete a pass." Basketball was harder. A smaller court. A five-second deadline. A heavier, bigger ball. As often as not, the teams Redwood City was playing against simply couldn't make the inbounds pass within the five-second limit. Or the inbounding player, panicked by the thought that her five seconds were about to be up, would throw the ball away. Or her pass would be intercepted by one of the Redwood City players. Ranadivé's girls were maniacal.
The second deadline requires a team to advance the ball across mid-court, into its opponent's end, within ten seconds, and if Redwood City's opponents met the first deadline the girls would turn their attention to the second. They would descend on the girl who caught the inbounds pass and "trap" her. Anjali was the designated trapper. She'd sprint over and double-team the dribbler, stretching her long arms high and wide. Maybe she'd steal the ball. Maybe the other player would throw it away in a panic—or get bottled up and stalled, so that the ref would end up blowing the whistle. "When we first started out, no one knew how to play defense or anything," Anjali said. "So my dad said the whole game long, 'Your job is to guard someone and make sure they never get the ball on inbounds plays.' It's the best feeling in the world to steal the ball from someone. We would press and steal, and do that over and over again. It made people so nervous. There were teams that were a lot better than us, that had been playing a long time, and we would beat them."
The Redwood City players would jump ahead 4–0, 6–0, 8–0, 12–0. One time, they led 25–0. Because they typically got the ball underneath their opponent's basket, they rarely had to take low-percentage, long-range shots that required skill and practice. They shot layups. In one of the few games that Redwood City lost that year, only four of the team's players showed up. They pressed anyway. Why not? They lost by three points.
"What that defense did for us is that we could hide our weaknesses," Rometra Craig said. She helped out once Redwood City advanced to the regional championships. "We could hide the fact that we didn't have good outside shooters. We could hide the fact that we didn't have the tallest lineup, because as long as we played hard on defense we were getting steals and getting easy layups. I was honest with the girls. I told them, 'We're not the best basketball team out there.' But they understood their roles." A twelve-year-old girl would go to war for Rometra. "They were awesome," she said.
Lawrence attacked the Turks where they were weak—the railroad—and not where they were strong, Medina. Redwood City attacked the inbounds pass, the point in a game where a great team is as vulnerable as a weak one. Lawrence extended the battlefield over as large an area as possible. So did the girls of Redwood City. They defended all ninety-four feet. The full-court press is legs, not arms. It supplants ability with effort. It is basketball for those "quite unused to formal warfare, whose assets were movement, endurance, individual intelligence . . . courage."
"It's an exhausting strategy," Roger Craig said. He and Ranadivé were in a TIBCO conference room, reminiscing about their dream season. Ranadivé was at the whiteboard, diagramming the intricacies of the Redwood City press. Craig was sitting at the table.
"My girls had to be more fit than the others," Ranadivé said.
"He used to make them run," Craig said, nodding approvingly.
"We followed soccer strategy in practice," Ranadivé said. "I would make them run and run and run. I couldn't teach them skills in that short period of time, and so all we did was make sure they were fit and had some basic understanding of the game. That's why attitude plays such a big role in this, because you're going to get tired." He turned to Craig. "What was our cheer again?"
The two men thought for a moment, then shouted out happily, in unison, "One, two, three, ATTITUDE!"
That was it! The whole Redwood City philosophy was based on a willingness to try harder than anyone else.
"One time, some new girls joined the team," RanadivĂ© said, "and so in the first practice I had I was telling them, 'Look, this is what we're going to do,' and I showed them. I said, 'It's all about attitude.' And there was this one new girl on the team, and I was worried that she wouldn't get the whole attitude thing. Then we did the cheer and she said, 'No, no, it's not One, two three, ATTITUDE. It's One, two, three, attitude HAH ' "—at which point RanadivĂ© and Craig burst out laughing.
5.
In January of 1971, the Fordham University Rams played a basketball game against the University of Massachusetts Redmen. The game was in Amherst, at the legendary arena known as the Cage, where the Redmen hadn't lost since December of 1969. Their record was 11–1. The Redmen's star was none other than Julius Erving—Dr. J. The UMass team was very, very good. Fordham, by contrast, was a team of scrappy kids from the Bronx and Brooklyn. Their center had torn up his knee the first week of the season, which meant that their tallest player was six feet five. Their starting forward—and forwards are typically almost as tall as centers—was Charlie Yelverton, who was six feet two. But from the opening buzzer the Rams launched a full-court press, and never let up. "We jumped out to a thirteen-to-six lead, and it was a war the rest of the way," Digger Phelps, the Fordham coach at the time, recalls. "These were tough city kids. We played you ninety-four feet. We knew that sooner or later we were going to make you crack." Phelps sent in one indefatigable Irish or Italian kid from the Bronx after another to guard Erving, and, one by one, the indefatigable Irish and Italian kids fouled out. None of them were as good as Erving. It didn't matter. Fordham won, 87–79.
In the world of basketball, there is one story after another like this about legendary games where David used the full-court press to beat Goliath. Yet the puzzle of the press is that it has never become popular. People look at upsets like Fordham over UMass and call them flukes. Basketball sages point out that the press can be beaten by a well-coached team with adept ball handlers and astute passers—and that is true. Ranadivé readily admitted that all an opposing team had to do to beat Redwood City was press back: the girls were not good enough to handle their own medicine. Playing insurgent basketball did not guarantee victory. It was simply the best chance an underdog had of beating Goliath. If Fordham had played UMass the conventional way, it would have lost by thirty points. And yet somehow that lesson has escaped the basketball establishment.
What did Digger Phelps do, the season after his stunning upset of UMass? He never used the full-court press the same way again. The UMass coach, Jack Leaman, was humbled in his own gym by a bunch of street kids. Did he learn from his defeat and use the press himself the next time he had a team of underdogs? He did not.
The only person who seemed to have absorbed the lessons of that game was a skinny little guard on the UMass freshman team named Rick Pitino. He didn't play that day. He watched, and his eyes grew wide. Even now, thirty-eight years later, he can name, from memory, nearly every player on the Fordham team: Yelverton, Sullivan, Mainor, Charles, Zambetti. "They came in with the most unbelievable pressing team I'd ever seen," Pitino said. "Five guys between six feet five and six feet. It was unbelievable how they covered ground. I studied it. There is no way they should have beaten us. Nobody beat us at the Cage."
Pitino became the head coach at Boston University in 1978, when he was twenty-five years old, and used the press to take the school to its first N.C.A.A. tournament appearance in twenty-four years. At his next head-coaching stop, Providence College, Pitino took over a team that had gone 11–20 the year before. The players were short and almost entirely devoid of talent—a carbon copy of the Fordham Rams. They pressed, and ended up one game away from playing for the national championship. At the University of Kentucky, in the mid-nineteen-nineties, Pitino took his team to the Final Four three times—and won a national championship—with full-court pressure, and then rode the full-court press back to the Final Four in 2005, as the coach at the University of Louisville. This year, his Louisville team entered the N.C.A.A. tournament ranked No. 1 in the land. College coaches of Pitino's calibre typically have had numerous players who have gone on to be bona-fide all-stars at the professional level. In his many years of coaching, Pitino has had one, Antoine Walker. It doesn't matter. Every year, he racks up more and more victories.
"The greatest example of the press I've ever coached was my Kentucky team in '96, when we played L.S.U.," Pitino said. He was at the athletic building at the University of Louisville, in a small room filled with television screens, where he watches tapes of opponents' games. "Do we have that tape?" Pitino called out to an assistant. He pulled a chair up close to one of the monitors. The game began with Kentucky stealing the ball from L.S.U., deep in L.S.U.'s end. Immediately, the ball was passed to Antoine Walker, who cut to the basket for a layup. L.S.U. got the ball back. Kentucky stole it again. Another easy basket by Walker. "Walker had almost thirty points at halftime," Pitino said. "He dunked it almost every time. When we steal, he just runs to the basket." The Kentucky players were lightning quick and long-armed, and swarmed around the L.S.U. players, arms flailing. It was mayhem. Five minutes in, it was clear that L.S.U. was panicking.
Pitino trains his players to look for what he calls the "rush state" in their opponents—that moment when the player with the ball is shaken out of his tempo—and L.S.U. could not find a way to get out of the rush state. "See if you find one play that L.S.U. managed to run," Pitino said. You couldn't. The L.S.U. players struggled to get the ball inbounds, and, if they did that, they struggled to get the ball over mid-court, and on those occasions when they managed both those things they were too overwhelmed and exhausted to execute their offense the way they had been trained to. "We had eighty-six points at halftime," Pitino went on—eighty-six points being, of course, what college basketball teams typically score in an entire game. "And I think we'd forced twenty-three turnovers at halftime," twenty-three turnovers being what college basketball teams might force in two games. "I love watching this," Pitino said. He had a faraway look in his eyes. "Every day, you dream about getting a team like this again." So why are there no more than a handful of college teams who use the full-court press the way Pitino does?
Arreguín-Toft found the same puzzling pattern. When an underdog fought like David, he usually won. But most of the time underdogs didn't fight like David. Of the two hundred and two lopsided conflicts in Arreguín-Toft's database, the underdog chose to go toe to toe with Goliath the conventional way a hundred and fifty-two times—and lost a hundred and nineteen times. In 1809, the Peruvians fought the Spanish straight up and lost; in 1816, the Georgians fought the Russians straight up and lost; in 1817, the Pindaris fought the British straight up and lost; in the Kandyan rebellion of 1817, the Sri Lankans fought the British straight up and lost; in 1823, the Burmese chose to fight the British straight up and lost. The list of failures was endless. In the nineteen-forties, the Communist insurgency in Vietnam bedevilled the French until, in 1951, the Viet Minh strategist Vo Nguyen Giap switched to conventional warfare—and promptly suffered a series of defeats. George Washington did the same in the American Revolution, abandoning the guerrilla tactics that had served the colonists so well in the conflict's early stages. "As quickly as he could," William Polk writes in "Violent Politics," a history of unconventional warfare, Washington "devoted his energies to creating a British-type army, the Continental Line. As a result, he was defeated time after time and almost lost the war."
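The arithmetic behind those figures is worth spelling out. Using only the numbers quoted above:

# Figures quoted from Arreguin-Toft's database of lopsided conflicts.
lopsided_conflicts = 202
fought_conventionally = 152
conventional_losses = 119

conventional_wins = fought_conventionally - conventional_losses        # 33
conventional_win_rate = 100 * conventional_wins / fought_conventionally
print(f"{conventional_win_rate:.1f} per cent")                         # about 21.7 per cent

# Compare that with the 63.6 per cent win rate Arreguin-Toft found for
# underdogs who abandoned the conventional approach.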
It makes no sense, unless you think back to that Kentucky-L.S.U. game and to Lawrence's long march across the desert to Aqaba. It is easier to dress soldiers in bright uniforms and have them march to the sound of a fife-and-drum corps than it is to have them ride six hundred miles through the desert on the back of a camel. It is easier to retreat and compose yourself after every score than swarm about, arms flailing. We tell ourselves that skill is the precious resource and effort is the commodity. It's the other way around. Effort can trump ability—legs, in Saxe's formulation, can overpower arms—because relentless effort is in fact something rarer than the ability to engage in some finely tuned act of motor coördination.
"I have so many coaches come in every year to learn the press," Pitino said. Louisville was the Mecca for all those Davids trying to learn how to beat Goliaths. "Then they e-mail me. They tell me they can't do it. They don't know if they have the bench. They don't know if the players can last." Pitino shook his head. "We practice every day for two hours " " he went on. "The players are moving almost ninety-eight per cent of the practice. We spend very little time talking. When we make our corrections"—that is, when Pitino and his coaches stop play to give instruction—"they are seven-second corrections, so that our heart rate never rests. We are always working." Seven seconds! The coaches who came to Louisville sat in the stands and watched that ceaseless activity and despaired. The prospect of playing by David's rules was too daunting. They would rather lose.
6.
In 1981, a computer scientist from Stanford University named Doug Lenat entered the Traveller Trillion Credit Squadron tournament, in San Mateo, California. It was a war game. The contestants had been given several volumes of rules, well beforehand, and had been asked to design their own fleet of warships with a mythical budget of a trillion dollars. The fleets then squared off against one another in the course of a weekend. "Imagine this enormous auditorium area with tables, and at each table people are paired off," Lenat said. "The winners go on and advance. The losers get eliminated, and the field gets smaller and smaller, and the audience gets larger and larger."
Lenat had developed an artificial-intelligence program that he called Eurisko, and he decided to feed his program the rules of the tournament. Lenat did not give Eurisko any advice or steer the program in any particular strategic direction. He was not a war-gamer. He simply let Eurisko figure things out for itself. For about a month, for ten hours every night on a hundred computers at Xerox PARC, in Palo Alto, Eurisko ground away at the problem, until it came out with an answer. Most teams fielded some version of a traditional naval fleet—an array of ships of various sizes, each well defended against enemy attack. Eurisko thought differently. "The program came up with a strategy of spending the trillion on an astronomical number of small ships like P.T. boats, with powerful weapons but absolutely no defense and no mobility," Lenat said. "They just sat there. Basically, if they were hit once they would sink. And what happened is that the enemy would take its shots, and every one of those shots would sink our ships. But it didn't matter, because we had so many." Lenat won the tournament in a runaway.
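The logic behind Eurisko's swarm is easy to reproduce in a toy exchange-of-fire simulation. The sketch below is not Eurisko and not the Traveller rules; the fleet sizes, hit chance, and armor values are invented for illustration. It simply shows the arithmetic the program exploited: a thousand fragile boats put far more shots in the air each round than they lose.

import random

def battle(swarm_ships, navy_ships, navy_armor, hit_chance=0.3, seed=1):
    rng = random.Random(seed)
    swarm_alive, navy_alive, navy_damage, rounds = swarm_ships, navy_ships, 0, 0
    while swarm_alive > 0 and navy_alive > 0:
        rounds += 1
        # Both sides fire simultaneously; every surviving ship gets one shot.
        hits_on_navy = sum(rng.random() < hit_chance for _ in range(swarm_alive))
        hits_on_swarm = sum(rng.random() < hit_chance for _ in range(navy_alive))
        navy_damage += hits_on_navy
        navy_alive = max(0, navy_ships - navy_damage // navy_armor)  # big ships absorb many hits
        swarm_alive = max(0, swarm_alive - hits_on_swarm)            # one hit sinks a cheap boat
    winner = "swarm" if swarm_alive > 0 else "navy"
    return winner, rounds, swarm_alive, navy_alive

print(battle(swarm_ships=1000, navy_ships=20, navy_armor=40))
# With these invented numbers the swarm wins within a few rounds and keeps most
# of its boats afloat: individual losses don't matter when you field a thousand.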
The next year, Lenat entered once more, only this time the rules had changed. Fleets could no longer just sit there. Now one of the criteria of success in battle was fleet "agility." Eurisko went back to work. "What Eurisko did was say that if any of our ships got damaged it would sink itself—and that would raise fleet agility back up again," Lenat said. Eurisko won again.
Eurisko was an underdog. The other gamers were people steeped in military strategy and history. They were the sort who could tell you how Wellington had outfoxed Napoleon at Waterloo, or what exactly happened at Antietam. They had been raised on Dungeons and Dragons. They were insiders. Eurisko, on the other hand, knew nothing but the rule book. It had no common sense. As Lenat points out, a human being understands the meaning of the sentences "Johnny robbed a bank. He is now serving twenty years in prison," but Eurisko could not, because as a computer it was perfectly literal; it could not fill in the missing step—"Johnny was caught, tried, and convicted." Eurisko was an outsider. But it was precisely that outsiderness that led to Eurisko's victory: not knowing the conventions of the game turned out to be an advantage.
"Eurisko was exposing the fact that any finite set of rules is going to be a very incomplete approximation of reality," Lenat explained. "What the other entrants were doing was filling in the holes in the rules with real-world, realistic answers. But Eurisko didn't have that kind of preconception, partly because it didn't know enough about the world." So it found solutions that were, as Lenat freely admits, "socially horrifying": send a thousand defenseless and immobile ships into battle; sink your own ships the moment they get damaged.
This is the second half of the insurgent's creed. Insurgents work harder than Goliath. But their other advantage is that they will do what is "socially horrifying"—they will challenge the conventions about how battles are supposed to be fought. All the things that distinguish the ideal basketball player are acts of skill and coördination. When the game becomes about effort over ability, it becomes unrecognizable—a shocking mixture of broken plays and flailing limbs and usually competent players panicking and throwing the ball out of bounds. You have to be outside the establishment—a foreigner new to the game or a skinny kid from New York at the end of the bench—to have the audacity to play it that way. George Washington couldn't do it. His dream, before the war, was to be a British Army officer, finely turned out in a red coat and brass buttons. He found the guerrillas who had served the American Revolution so well to be "an exceeding dirty and nasty people." He couldn't fight the establishment, because he was the establishment.
T. E. Lawrence, by contrast, was the farthest thing from a proper British Army officer. He did not graduate with honors from Sandhurst. He was an archeologist by trade, a dreamy poet. He wore sandals and full Bedouin dress when he went to see his military superiors. He spoke Arabic like a native, and handled a camel as if he had been riding one all his life. And David, let's not forget, was a shepherd. He came at Goliath with a slingshot and staff because those were the tools of his trade. He didn't know that duels with Philistines were supposed to proceed formally, with the crossing of swords. "When the lion or the bear would come and carry off a sheep from the herd, I would go out after him and strike him down and rescue it from his clutches," David explained to Saul. He brought a shepherd's rules to the battlefield.
The price that the outsider pays for being so heedless of custom is, of course, the disapproval of the insider. Why did the Ivy League schools of the nineteen-twenties limit the admission of Jewish immigrants? Because they were the establishment and the Jews were the insurgents, scrambling and pressing and playing by immigrant rules that must have seemed to the Wasp élite of the time to be socially horrifying. "Their accomplishment is well over a hundred per cent of their ability on account of their tremendous energy and ambition," the dean of Columbia College said of the insurgents from Brooklyn, the Bronx, and the Lower East Side. He wasn't being complimentary. Goliath does not simply dwarf David. He brings the full force of social convention against him; he has contempt for David.
"In the beginning, everyone laughed at our fleet," Lenat said. "It was really embarrassing. People felt sorry for us. But somewhere around the third round they stopped laughing, and some time around the fourth round they started complaining to the judges. When we won again, some people got very angry, and the tournament directors basically said that it was not really in the spirit of the tournament to have these weird computer-designed fleets winning. They said that if we entered again they would stop having the tournament. I decided the best thing to do was to graciously bow out."
It isn't surprising that the tournament directors found Eurisko's strategies beyond the pale. It's wrong to sink your own ships, they believed. And they were right. But let's remember who made that rule: Goliath. And let's remember why Goliath made that rule: when the world has to play on Goliath's terms, Goliath wins.
7.
The trouble for Redwood City started early in the regular season. The opposing coaches began to get angry. There was a sense that Redwood City wasn't playing fair—that it wasn't right to use the full-court press against twelve-year-old girls, who were just beginning to grasp the rudiments of the game. The point of basketball, the dissenting chorus said, was to learn basketball skills. Of course, you could as easily argue that in playing the press a twelve-year-old girl learned something much more valuable—that effort can trump ability and that conventions are made to be challenged. But the coaches on the other side of Redwood City's lopsided scores were disinclined to be so philosophical.
"There was one guy who wanted to have a fight with me in the parking lot," Ranadivé said. "He was this big guy. He obviously played football and basketball himself, and he saw that skinny, foreign guy beating him at his own game. He wanted to beat me up."
Roger Craig says that he was sometimes startled by what he saw. "The other coaches would be screaming at their girls, humiliating them, shouting at them. They would say to the refs—'That's a foul! That's a foul!' But we weren't fouling. We were just playing aggressive defense."
"My girls were all blond-haired white girls," Ranadivé said. "My daughter is the closest we have to a black girl, because she's half-Indian. One time, we were playing this all-black team from East San Jose. They had been playing for years. These were born-with-a-basketball girls. We were just crushing them. We were up something like twenty to zero. We wouldn't even let them inbound the ball, and the coach got so mad that he took a chair and threw it. He started screaming at his girls, and of course the more you scream at girls that age the more nervous they get." Ranadivé shook his head: never, ever raise your voice. "Finally, the ref physically threw him out of the building. I was afraid. I think he couldn't stand it because here were all these blond-haired girls who were clearly inferior players, and we were killing them."
At the nationals, the Redwood City girls won their first two games. In the third round, their opponents were from somewhere deep in Orange County. Redwood City had to play them on their own court, and the opponents supplied their own referee as well. The game was at eight o'clock in the morning. The Redwood City players left their hotel at six, to beat the traffic. It was downhill from there. The referee did not believe in "One, two, three, attitude HAH." He didn't think that playing to deny the inbounds pass was basketball. He began calling one foul after another.
"They were touch fouls," Craig said. Ticky-tacky stuff. The memory was painful.
"My girls didn't understand," Ranadivé said. "The ref called something like four times as many fouls on us as on the other team."
"People were booing," Craig said. "It was bad."
"A two-to-one ratio is understandable, but a ratio of four to one?" Ranadivé shook his head.
"One girl fouled out."
"We didn't get blown out. There was still a chance to win. But . . ."
Ranadivé called the press off. He had to. The Redwood City players retreated to their own end, and passively watched as their opponents advanced down the court. They did not run. They paused and deliberated between each possession. They played basketball the way basketball is supposed to be played, and they lost—but not before making Goliath wonder whether he was a giant, after all.
Priced to Sell
July 6, 2009
Books
Is free the future?
1.
At a hearing on Capitol Hill in May, James Moroney, the publisher of the Dallas Morning News, told Congress about negotiations he'd just had with the online retailer Amazon. The idea was to license his newspaper's content to the Kindle, Amazon's new electronic reader. "They want seventy per cent of the subscription revenue," Moroney testified. "I get thirty per cent, they get seventy per cent. On top of that, they have said we get the right to republish your intellectual property to any portable device." The idea was that if a Kindle subscription to the Dallas Morning News cost ten dollars a month, seven dollars of that belonged to Amazon, the provider of the gadget on which the news was read, and just three dollars belonged to the newspaper, the provider of an expensive and ever-changing variety of editorial content. The people at Amazon valued the newspaper's contribution so little, in fact, that they felt they ought then to be able to license it to anyone else they wanted. Another witness at the hearing, Arianna Huffington, of the Huffington Post, said that she thought the Kindle could provide a business model to save the beleaguered newspaper industry. Moroney disagreed. "I get thirty per cent and they get the right to license my content to any portable device—not just ones made by Amazon?" He was incredulous. "That, to me, is not a model."
Had James Moroney read Chris Anderson's new book, "Free: The Future of a Radical Price" (Hyperion; $26.99), Amazon's offer might not have seemed quite so surprising. Anderson is the editor of Wired and the author of the 2006 best-seller "The Long Tail," and "Free" is essentially an extended elaboration of Stewart Brand's famous declaration that "information wants to be free." The digital age, Anderson argues, is exerting an inexorable downward pressure on the prices of all things "made of ideas." Anderson does not consider this a passing trend. Rather, he seems to think of it as an iron law: "In the digital realm you can try to keep Free at bay with laws and locks, but eventually the force of economic gravity will win." To musicians who believe that their music is being pirated, Anderson is blunt. They should stop complaining, and capitalize on the added exposure that piracy provides by making money through touring, merchandise sales, and "yes, the sale of some of [their] music to people who still want CDs or prefer to buy their music online." To the Dallas Morning News, he would say the same thing. Newspapers need to accept that content is never again going to be worth what they want it to be worth, and reinvent their business. "Out of the bloodbath will come a new role for professional journalists," he predicts, and he goes on:
There may be more of them, not fewer, as the ability to participate in journalism extends beyond the credentialed halls of traditional media. But they may be paid far less, and for many it won't be a full time job at all. Journalism as a profession will share the stage with journalism as an avocation. Meanwhile, others may use their skills to teach and organize amateurs to do a better job covering their own communities, becoming more editor/coach than writer. If so, leveraging the Free—paying people to get other people to write for non-monetary rewards—may not be the enemy of professional journalists. Instead, it may be their salvation.
Anderson is very good at paragraphs like this—with its reassuring arc from "bloodbath" to "salvation." His advice is pithy, his tone uncompromising, and his subject matter perfectly timed for a moment when old-line content providers are desperate for answers. That said, it is not entirely clear what distinction is being marked between "paying people to get other people to write" and paying people to write. If you can afford to pay someone to get other people to write, why can't you pay people to write? It would be nice to know, as well, just how a business goes about reorganizing itself around getting people to work for "non-monetary rewards." Does he mean that the New York Times should be staffed by volunteers, like Meals on Wheels? Anderson's reference to people who "prefer to buy their music online" carries the faint suggestion that refraining from theft should be considered a mere preference. And then there is his insistence that the relentless downward pressure on prices represents an iron law of the digital economy. Why is it a law? Free is just another price, and prices are set by individual actors, in accordance with the aggregated particulars of marketplace power. "Information wants to be free," Anderson tells us, "in the same way that life wants to spread and water wants to run downhill." But information can't actually want anything, can it? Amazon wants the information in the Dallas paper to be free, because that way Amazon makes more money. Why are the self-interested motives of powerful companies being elevated to a philosophical principle? But we are getting ahead of ourselves.
2.
Anderson's argument begins with a technological trend. The cost of the building blocks of all electronic activity—storage, processing, and bandwidth—has fallen so far that it is now approaching zero. In 1961, Anderson says, a single transistor was ten dollars. In 1963, it was five dollars. By 1968, it was one dollar. Today, Intel will sell you two billion transistors for eleven hundred dollars—meaning that the cost of a single transistor is now about .000055 cents.
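Anderson's per-transistor figure is simple to verify:

# Two billion transistors for eleven hundred dollars, expressed in cents each.
price_dollars = 1100
transistors = 2_000_000_000
cents_per_transistor = price_dollars * 100 / transistors
print(cents_per_transistor)   # 5.5e-05, i.e. about 0.000055 cents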
Anderson's second point is that when prices hit zero extraordinary things happen. Anderson describes an experiment conducted by the M.I.T. behavioral economist Dan Ariely, the author of "Predictably Irrational." Ariely offered a group of subjects a choice between two kinds of chocolate—Hershey's Kisses, for one cent, and Lindt truffles, for fifteen cents. Three-quarters of the subjects chose the truffles. Then he redid the experiment, reducing the price of both chocolates by one cent. The Kisses were now free. What happened? The order of preference was reversed. Sixty-nine per cent of the subjects chose the Kisses. The price difference between the two chocolates was exactly the same, but that magic word "free" has the power to create a consumer stampede. Amazon has had the same experience with its offer of free shipping for orders over twenty-five dollars. The idea is to induce you to buy a second book, if your first book comes in at less than the twenty-five-dollar threshold. And that's exactly what it does. In France, however, the offer was mistakenly set at the equivalent of twenty cents—and consumers didn't buy the second book. "From the consumer's perspective, there is a huge difference between cheap and free," Anderson writes. "Give a product away, and it can go viral. Charge a single cent for it and you're in an entirely different business. . . . The truth is that zero is one market and any other price is another."
Since the falling costs of digital technology let you make as much stuff as you want, Anderson argues, and the magic of the word "free" creates instant demand among consumers, then Free (Anderson honors it with a capital) represents an enormous business opportunity. Companies ought to be able to make huge amounts of money "around" the thing being given away—as Google gives away its search and e-mail and makes its money on advertising.
Anderson cautions that this philosophy of embracing the Free involves moving from a "scarcity" mind-set to an "abundance" mind-set. Giving something away means that a lot of it will be wasted. But because it costs almost nothing to make things, digitally, we can afford to be wasteful. The elaborate mechanisms we set up to monitor and judge the quality of content are, Anderson thinks, artifacts of an era of scarcity: we had to worry about how to allocate scarce resources like newsprint and shelf space and broadcast time. Not anymore. Look at YouTube, he says, the free video archive owned by Google. YouTube lets anyone post a video to its site free, and lets anyone watch a video on its site free, and it doesn't have to pass judgment on the quality of the videos it archives. "Nobody is deciding whether a video is good enough to justify the scarce channel space it takes, because there is no scarce channel space," he writes, and goes on:
Distribution is now close enough to free to round down. Today, it costs about $0.25 to stream one hour of video to one person. Next year, it will be $0.15. A year later it will be less than a dime. Which is why YouTube's founders decided to give it away. . . . The result is both messy and runs counter to every instinct of a television professional, but this is what abundance both requires and demands.
There are four strands of argument here: a technological claim (digital infrastructure is effectively Free), a psychological claim (consumers love Free), a procedural claim (Free means never having to make a judgment), and a commercial claim (the market created by the technological Free and the psychological Free can make you a lot of money). The only problem is that in the middle of laying out what he sees as the new business model of the digital age Anderson is forced to admit that one of his main case studies, YouTube, "has so far failed to make any money for Google."
Why is that? Because of the very principles of Free that Anderson so energetically celebrates. When you let people upload and download as many videos as they want, lots of them will take you up on the offer. That's the magic of Free psychology: an estimated seventy-five billion videos will be served up by YouTube this year. Although the magic of Free technology means that the cost of serving up each video is "close enough to free to round down," "close enough to free" multiplied by seventy-five billion is still a very large number. A recent report by Credit Suisse estimates that YouTube's bandwidth costs in 2009 will be three hundred and sixty million dollars. In the case of YouTube, the effects of technological Free and psychological Free work against each other.
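It takes very little arithmetic to see how "close enough to free" becomes three hundred and sixty million dollars. Working backward from the figures quoted above:

# Back-of-the-envelope check using only the figures cited above.
bandwidth_bill = 360_000_000        # dollars, Credit Suisse's 2009 estimate
cost_per_hour = 0.25                # dollars to stream one hour of video to one person
videos_served = 75_000_000_000      # estimated videos served in the year

hours_streamed = bandwidth_bill / cost_per_hour
minutes_per_video = 60 * hours_streamed / videos_served
print(f"{minutes_per_video:.1f} minutes of streaming per video, on average")
# A little more than a minute per video, multiplied seventy-five billion times,
# is enough to produce the whole bill.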
So how does YouTube bring in revenue? Well, it tries to sell advertisements alongside its videos. The problem is that the videos attracted by psychological Free—pirated material, cat videos, and other forms of user-generated content—are not the sort of thing that advertisers want to be associated with. In order to sell advertising, YouTube has had to buy the rights to professionally produced content, such as television shows and movies. Credit Suisse put the cost of those licenses in 2009 at roughly two hundred and sixty million dollars. For Anderson, YouTube illustrates the principle that Free removes the necessity of aesthetic judgment. (As he puts it, YouTube proves that "crap is in the eye of the beholder.") But, in order to make money, YouTube has been obliged to pay for programs that aren't crap. To recap: YouTube is a great example of Free, except that Free technology ends up not being Free because of the way consumers respond to Free, fatally compromising YouTube's ability to make money around Free, and forcing it to retreat from the "abundance thinking" that lies at the heart of Free. Credit Suisse estimates that YouTube will lose close to half a billion dollars this year. If it were a bank, it would be eligible for TARP funds.
3.
Anderson begins the second part of his book by quoting Lewis Strauss, the former head of the Atomic Energy Commission, who famously predicted in the mid-nineteen-fifties that "our children will enjoy in their homes electrical energy too cheap to meter."
"What if Strauss had been right?" Anderson wonders, and then diligently sorts through the implications: as much fresh water as you could want, no reliance on fossil fuels, no global warming, abundant agricultural production. Anderson wants to take "too cheap to meter" seriously, because he believes that we are on the cusp of our own "too cheap to meter" revolution with computer processing, storage, and bandwidth. But here is the second and broader problem with Anderson's argument: he is asking the wrong question. It is pointless to wonder what would have happened if Strauss's prediction had come true while rushing past the reasons that it could not have come true.
Strauss's optimism was driven by the fuel cost of nuclear energy—which was so low compared with its fossil-fuel counterparts that he considered it (to borrow Anderson's phrase) close enough to free to round down. Generating and distributing electricity, however, requires a vast and expensive infrastructure of transmission lines and power plants—and it is this infrastructure that accounts for most of the cost of electricity. Fuel prices are only a small part of that. As Gordon Dean, Strauss's predecessor at the A.E.C., wrote, "Even if coal were mined and distributed free to electric generating plants today, the reduction in your monthly electricity bill would amount to but twenty per cent, so great is the cost of the plant itself and the distribution system."
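Dean's point reduces to a single line of arithmetic. In the sketch below, only the twenty-per-cent figure comes from his quotation; the hundred-dollar bill is an invented round number.

monthly_bill = 100.00       # hypothetical electricity bill, in dollars
fuel_share = 0.20           # Dean: free coal would cut the bill by about twenty per cent
bill_with_free_fuel = monthly_bill * (1 - fuel_share)
print(bill_with_free_fuel)  # 80.0 -- the plants and the power lines still cost you the rest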
This is the kind of error that technological utopians make. They assume that their particular scientific revolution will wipe away all traces of its predecessors—that if you change the fuel you change the whole system. Strauss went on to forecast "an age of peace," jumping from atoms to human hearts. "As the world of chips and glass fibers and wireless waves goes, so goes the rest of the world," Kevin Kelly, another Wired visionary, proclaimed at the start of his 1998 digital manifesto, "New Rules for the New Economy," offering up the same non sequitur. And now comes Anderson. "The more products are made of ideas, rather than stuff, the faster they can get cheap," he writes, and we know what's coming next: "However, this is not limited to digital products." Just look at the pharmaceutical industry, he says. Genetic engineering means that drug development is poised to follow the same learning curve of the digital world, to "accelerate in performance while it drops in price."
But, like Strauss, he's forgotten about the plants and the power lines. The expensive part of making drugs has never been what happens in the laboratory. It's what happens after the laboratory, like the clinical testing, which can take years and cost hundreds of millions of dollars. In the pharmaceutical world, what's more, companies have chosen to use the potential of new technology to do something very different from their counterparts in Silicon Valley. They've been trying to find a way to serve smaller and smaller markets—to create medicines tailored to very specific subpopulations and strains of diseases—and smaller markets often mean higher prices. The biotechnology company Genzyme spent five hundred million dollars developing the drug Myozyme, which is intended for a condition, Pompe disease, that afflicts fewer than ten thousand people worldwide. That's the quintessential modern drug: a high-tech, targeted remedy that took a very long and costly path to market. Myozyme is priced at three hundred thousand dollars a year. Genzyme isn't a mining company: its real assets are intellectual property—information, not stuff. But, in this case, information does not want to be free. It wants to be really, really expensive.
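A rough calculation shows why a market that small forces a price that high. In the sketch below, the five-hundred-million-dollar development cost and the three-hundred-thousand-dollar annual price come from the text; the number of treated patients and the mass-market comparison price are assumptions invented for illustration:

    # Rough illustration: years of revenue needed to recover the development
    # cost alone, at the orphan-drug price versus a hypothetical mass-market price.
    development_cost = 500_000_000   # reported Myozyme development cost, in dollars
    treated_patients = 5_000         # assumed: roughly half of the <10,000 patients worldwide
    orphan_price = 300_000           # reported annual price
    mass_market_price = 1_000        # hypothetical "ordinary pill" annual price

    def years_to_recoup(annual_price):
        return development_cost / (treated_patients * annual_price)

    print(f"At ${orphan_price:,} a year:      {years_to_recoup(orphan_price):.1f} years")
    print(f"At ${mass_market_price:,} a year: {years_to_recoup(mass_market_price):.0f} years")

At a mass-market price, recouping the laboratory bill alone would take a century of sales; it is the size of the market, not the sophistication of the technology, that pushes the price into six figures.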
And there's plenty of other information out there that has chosen to run in the opposite direction from Free. The Times gives away its content on its Web site. But the Wall Street Journal has found that more than a million subscribers are quite happy to pay for the privilege of reading online. Broadcast television—the original practitioner of Free—is struggling. But premium cable, with its stiff monthly charges for specialty content, is doing just fine. Apple may soon make more money selling iPhone downloads (ideas) than it does from the iPhone itself (stuff). The company could one day give away the iPhone to boost downloads; it could give away the downloads to boost iPhone sales; or it could continue to do what it does now, and charge for both. Who knows? The only iron law here is the one too obvious to write a book about, which is that the digital age has so transformed the ways in which things are made and sold that there are no iron laws.
Cocksure
July 27, 2009
Dept. of Finance
Banks, battles, and the psychology of overconfidence.
1.
In 1996, an investor named Henry de Kwiatkowski sued Bear Stearns for negligence and breach of fiduciary duty. De Kwiatkowski had made—and then lost—hundreds of millions of dollars by betting on the direction of the dollar, and he blamed his bankers for his reversals. The district court ruled in de Kwiatkowski's favor, ultimately awarding him $164.5 million in damages. But Bear Stearns appealed—successfully—and in William D. Cohan's engrossing account of the fall of Bear Stearns, "House of Cards," the firm's former chairman and C.E.O. Jimmy Cayne tells the story of what happened on the day of the hearing:
Their lead lawyer turned out to be about a 300-pound fag from Long Island . . . a really irritating guy who had cross-examined me and tried to kick the shit out of me in the lower court trial. Now when we walk into the courtroom for the appeal, they're arguing another case and we have to wait until they're finished. And I stopped this guy. I had to take a piss. I went into the bathroom to take a piss and came back and sat down. Then I see my blood enemy stand up and he's going to the bathroom. So I wait till he passes and then I follow him in and it's just he and I in the bathroom. And I said to him, "Today you're going to get your ass kicked, big." He ran out of the room. He thought I might have wanted to start it right there and then.
At the time Cayne said this, Bear Stearns had spectacularly collapsed. The eighty-five-year-old investment bank, with its shiny new billion-dollar headquarters and its storied history, was swallowed whole by J. P. Morgan Chase. Cayne himself had lost close to a billion dollars. His reputation—forty years in the making—was in ruins, especially when it came out that, during Bear's final, critical months, he'd spent an inordinate amount of time on the golf course.
Did Cayne think long and hard about how he wanted to make his case to Cohan? He must have. Cayne understood selling; he started out as a photocopier salesman, working the nine-hundred-mile stretch between Boise and Salt Lake City, and ended up among the highest-paid executives in banking. He was known as one of the savviest men on the Street, a master tactician, a brilliant gamesman. "Jimmy had it all," Bill Bamber, a former Bear senior managing director, writes in "Bear Trap: The Fall of Bear Stearns and the Panic of 2008" (a book co-written by Andrew Spencer). "The ability to read an opponent. The ability to objectively analyze his own strengths and weaknesses. . . . He knew how to exploit others' weaknesses—and their strengths, for that matter—as a way to further his own gain. He knew when to take his losses and live to fight another day."
Cohan asked Cayne about the last days of Bear Stearns, in the spring of 2008. Wall Street had become so spooked by rumors about the firm's financial status that investors withdrew their capital, and no one would lend Bear the money required for its day-to-day operations. The bank received some government money, via J. P. Morgan. But Timothy Geithner, then the head of the New York Federal Reserve Bank, didn't open the Fed's so-called "discount window" to investment banks until J. P. Morgan's acquisition of Bear was under way. What did Cayne think of Geithner? Picture the scene. The journalist in one chair, Cayne in another. Between them, a tape recorder. And the savviest man on Wall Street sets out to salvage his good name:
The audacity of that prick in front of the American people announcing he was deciding whether or not a firm of this stature and this whatever was good enough to get a loan. Like he was the determining factor, and it's like a flea on his back, floating down underneath the Golden Gate Bridge, getting a hard-on, saying, "Raise the bridge." This guy thinks he's got a big dick. He's got nothing, except maybe a boyfriend.
2.
Since the beginning of the financial crisis, there have been two principal explanations for why so many banks made such disastrous decisions. The first is structural. Regulators did not regulate. Institutions failed to function as they should. Rules and guidelines were either inadequate or ignored. The second explanation is that Wall Street was incompetent, that the traders and investors didn't know enough, that they made extravagant bets without understanding the consequences. But the first wave of postmortems on the crash suggests a third possibility: that the roots of Wall Street's crisis were not structural or cognitive so much as they were psychological.
In "Military Misfortunes," the historians Eliot Cohen and John Gooch offer, as a textbook example of this kind of failure, the British-led invasion of Gallipoli, in 1915. Gallipoli is a peninsula in southern Turkey, jutting out into the Aegean. The British hoped that by landing an army there they could make an end run around the stalemate on the Western Front, and give themselves a clear shot at the soft underbelly of Germany. It was a brilliant and daring strategy. "In my judgment, it would have produced a far greater effect upon the whole conduct of the war than anything [else]," the British Prime Minister H. H. Asquith later concluded. But the invasion ended in disaster, and Cohen and Gooch find the roots of that disaster in the curious complacency displayed by the British.
The invasion required a large-scale amphibious landing, something the British had little experience with. It then required combat against a foe dug into ravines and rocky outcroppings and hills and thickly vegetated landscapes that Cohen and Gooch call "one of the finest natural fortresses in the world." Yet the British never bothered to draw up a formal plan of operations. The British military leadership had originally estimated that the Allies would need a hundred and fifty thousand troops to take Gallipoli. Only seventy thousand were sent. The British troops should have had artillery—more than three hundred guns. They took a hundred and eighteen, and, for the most part, neglected to bring howitzers, trench mortars, or grenades. Command of the landing at Suvla Bay—the most critical element of the attack—was given to Frederick Stopford, a retired officer whose experience was largely administrative. Stopford had two days during which he had a ten-to-one advantage over the Turks and could easily have seized the highlands overlooking the bay. Instead, his troops lingered on the beach, while Stopford lounged offshore, aboard a command ship. Winston Churchill later described the scene as "the placid, prudent, elderly English gentleman with his 20,000 men spread around the beaches, the front lines sitting on the tops of shallow trenches, smoking and cooking, with here and there an occasional rifle shot, others bathing by hundreds in the bright blue bay where, disturbed hardly by a single shell, floated the great ships of war." When word of Stopford's ineptitude reached the British commander, Sir Ian Hamilton, he rushed to Suvla Bay to intercede—although "rushed" may not be quite the right word here, since Hamilton had chosen to set up his command post on an island an hour away and it took him a good while to find a boat to take him to the scene.
Cohen and Gooch ascribe the disaster at Gallipoli to a failure to adapt—a failure to take into account how reality did not conform to their expectations. And behind that failure to adapt was a deeply psychological problem: the British simply couldn't wrap their heads around the fact that they might have to adapt. "Let me bring my lads face to face with Turks in the open field," Hamilton wrote in his diary before the attack. "We must beat them every time because British volunteer soldiers are superior individuals to Anatolians, Syrians or Arabs and are animated with a superior ideal and an equal joy in battle."
Hamilton was not a fool. Cohen and Gooch call him an experienced and "brilliant commander who was also a first-rate trainer of men and a good organizer." Nor was he entirely wrong in his assessments. The British probably were a superior fighting force. Certainly they were more numerous, especially when they held that ten-to-one advantage at Suvla Bay. Hamilton, it seems clear, was simply overconfident—and one of the things that happen to us when we become overconfident is that we start to blur the line between the kinds of things that we can control and the kinds of things that we can't. The psychologist Ellen Langer once had subjects engage in a betting game against either a self-assured, well-dressed opponent or a shy and badly dressed opponent (in Langer's delightful phrasing, the "dapper" or the "schnook" condition), and she found that her subjects bet far more aggressively when they played against the schnook. They looked at their awkward opponent and thought, I'm better than he is. Yet the game was pure chance: all the players did was draw cards at random from a deck, and see who had the high hand. This is called the "illusion of control": confidence spills over from areas where it may be warranted ("I'm savvier than that schnook") to areas where it isn't warranted at all ("and that means I'm going to draw higher cards").
At Gallipoli, the British acted as if their avowed superiority over the Turks gave them superiority over all aspects of the contest. They neglected to take into account the fact that the morning sun would be directly in the eyes of the troops as they stormed ashore. They didn't bring enough water. They didn't factor in the harsh terrain. "The attack was based on two assumptions," Cohen and Gooch write, "both of which turned out to be unwise: that the only really difficult part of the operation would be getting ashore, after which the Turks could easily be pushed off the peninsula; and that the main obstacles to a happy landing would be provided by the enemy."
Most people are inclined to use moral terms to describe overconfidence—terms like "arrogance" or "hubris." But psychologists tend to regard overconfidence as a state as much as a trait. The British at Gallipoli were victims of a situation that promoted overconfidence. Langer didn't say that it was only arrogant gamblers who upped their bets in the presence of the schnook. She argues that this is what competition does to all of us; because ability makes a difference in competitions of skill, we make the mistake of thinking that it must also make a difference in competitions of pure chance. Other studies have reached similar conclusions. As novices, we don't trust our judgment. Then we have some success, and begin to feel a little surer of ourselves. Finally, we get to the top of our game and succumb to the trap of thinking that there's nothing we can't master. As we get older and more experienced, we overestimate the accuracy of our judgments, especially when the task before us is difficult and when we're involved with something of great personal importance. The British were overconfident at Gallipoli not because Gallipoli didn't matter but, paradoxically, because it did; it was a high-stakes contest, of daunting complexity, and it is often in those circumstances that overconfidence takes root.
Several years ago, a team headed by the psychologist Mark Fenton-O'Creevy created a computer program that mimicked the ups and downs of an index like the Dow, and recruited, as subjects, members of a highly paid profession. As the line moved across the screen, Fenton-O'Creevy asked his subjects to press a series of buttons, which, they were told, might or might not affect the course of the line. At the end of the session, they were asked to rate their effectiveness in moving the line upward. The buttons had no effect at all on the line. But many of the players were convinced that their manipulation of the buttons made the index go up and up. The world these people inhabited was competitive and stressful and complex. They had been given every reason to be confident in their own judgments. If they sat down next to you, with a tape recorder, it wouldn't take much for them to believe that they had you in the palm of their hand. They were traders at an investment bank.
3.
The high-water mark for Bear Stearns was 2003. The dollar was falling. A wave of scandals had just swept through the financial industry. The stock market was in a swoon. But Bear Stearns was an exception. In the first quarter of that year, its earnings jumped fifty-five per cent. Its return on equity was the highest on Wall Street. The firm's mortgage business was booming. Since Bear Stearns's founding, in 1923, it had always been a kind of also-ran to its more blue-chip counterparts, like Goldman Sachs and Morgan Stanley. But that year Fortune named it the best financial company to work for. "We are hitting on all 99 cylinders," Jimmy Cayne told a reporter for the Times, in the spring of that year, "so you have to ask yourself, What can we do better? And I just can't decide what that might be." He went on, "Everyone says that when the markets turn around, we will suffer. But let me tell you, we are going to surprise some people this time around. Bear Stearns is a great place to be."
With the benefit of hindsight, Cayne's words read like the purest hubris. But in 2003 they would have seemed banal. These are the kinds of things that bankers say. More precisely—and here is where psychological failure becomes more problematic still—these are the kinds of things that bankers are expected to say. Investment banks are able to borrow billions of dollars and make huge trades because, at the end of the day, their counterparties believe they are capable of making good on their promises. Wall Street is a confidence game, in the strictest sense of that phrase.
This is what social scientists mean when they say that human overconfidence can be an adaptive trait. "In conflicts involving mutual assessment, an exaggerated assessment of the probability of winning increases the probability of winning," Richard Wrangham, a biological anthropologist at Harvard, writes. "Selection therefore favors this form of overconfidence." Winners know how to bluff. And who bluffs the best? The person who, instead of pretending to be stronger than he is, actually believes himself to be stronger than he is. According to Wrangham, self-deception reduces the chances of "behavioral leakage"; that is, of "inadvertently revealing the truth through an inappropriate behavior." This much is in keeping with what some psychologists have been telling us for years—that it can be useful to be especially optimistic about how attractive our spouse is, or how marketable our new idea is. In the words of the social psychologist Roy Baumeister, humans have an "optimal margin of illusion."
If you were a Wall Street C.E.O., there were two potential lessons to be drawn from the collapse of Bear Stearns. The first was that Jimmy Cayne was overconfident. The second was that Jimmy Cayne wasn't overconfident enough. Bear Stearns did not collapse, after all, simply because it had made bad bets. Until very close to the end, the firm had a capital cushion of more than seventeen billion dollars. The problem was that when, in early 2008, Cayne and his colleagues stood up and said that Bear was a great place to be, the rest of Wall Street no longer believed them. Clients withdrew their money, and lenders withheld funding. As the run on Bear Stearns worsened, J. P. Morgan and the Fed threw the bank a lifeline—a multibillion-dollar line of credit. But confidence matters so much on Wall Street that the lifeline had the opposite of its intended effect. As Bamber writes:
This line-of-credit, the stop-gap measure that was supposed to solve the problem that hadn't really existed in the first place had done nothing but worsen it. When we started the week, we had no liquidity issues. But because people had said that we did have problems with our capital, it became true, even though it wasn't true when people started saying it. . . . So we were forced to find capital to offset the losses we'd sustained because somebody decided we didn't have capital when we really did. So when we finally got more capital to replace the capital we'd lost, people took that as a bad sign and pointed to the fact that we'd had no capital and had to get a loan to cover it, even when we did have the capital they said we didn't have.
Of course, one reason that overconfidence is so difficult to eradicate from expert fields like finance is that, at least some of the time, it's useful to be overconfident—or, more precisely, sometimes the only way to get out of the problems caused by overconfidence is to be even more overconfident.
From an individual perspective, it is hard to distinguish between the times when excessive optimism is good and the times when it isn't. All that we can say unequivocally is that overconfidence is, as Wrangham puts it, "globally maladaptive." When one opponent bluffs, he can score an easy victory. But when everyone bluffs, Wrangham writes, rivals end up "escalating conflicts that only one can win and suffering higher costs than they should if assessment were accurate." The British didn't just think the Turks would lose in Gallipoli; they thought that Belgium would prove to be an obstacle to Germany's advance, and that the Russians would crush the Germans in the east. The French, for their part, planned to be at the Rhine within six weeks of the start of the war, while the Germans predicted that by that point they would be on the outskirts of Paris. Every side in the First World War was bluffing, with the resolve and skill that only the deluded are capable of, and the results, of course, were catastrophic.
4.
Jimmy Cayne grew up in Chicago, the son of a patent lawyer. He wanted to be a bookie, but he realized that it wasn't quite respectable enough. He went to Purdue University to study mechanical engineering—and became hooked on bridge. His grades suffered, and he never graduated. He got married in 1956 and was divorced within four years. "At this time, he was one of the best bridge players in Chicago," his ex-brother-in-law told Cohan. "In fact, that's the reason for the divorce. There was no other woman or anything like that. The co-respondent in their divorce was bridge. He spent all of his time playing bridge—every night. He wasn't home." He was selling scrap metal in those days, and, Cohan says, he would fall asleep on the job, exhausted from playing cards. In 1964, he moved to New York to become a professional bridge player. It was bridge that led him to his second wife, and to a job interview with Alan (Ace) Greenberg, then a senior executive at Bear Stearns. When Cayne told Greenberg that he was a bridge player, Cayne tells Cohan, "you could see the electric light bulb." Cayne goes on:
[Greenberg] says, "How well do you play?" I said, "I play well." He said, "Like how well?" I said, "I play quite well." He says, "You don't understand." I said, "Yeah, I do. I understand. Mr. Greenberg, if you study bridge the rest of your life, if you play with the best partners and you achieve your potential, you will never play bridge like I play bridge."
Right then and there, Cayne says, Greenberg offered him a job.
Twenty years later, the scene was repeated with Warren Spector, who went on to become a co-president of the firm. Spector had been a bridge champion as a student, and Cayne somehow heard about it. "Suddenly, out of nowhere there's a bridge player at Bear Stearns on the bond desk," Cayne recalls. Spector tells Cohan, "He called me up and said, 'Are you a bridge player?' I said, 'I used to be.' So bridge was something that he, Ace, and I all shared and talked about." As reports circulated that two of Bear Stearns's hedge funds were going under—a failure that started the bank on its long, downward spiral into collapse—Spector and Cayne were attending the Spingold K.O. bridge tournament, in Nashville. The Wall Street Journal reported that, of the twenty-one workdays that month, Cayne was out of the office for nearly half of them.
It makes sense that there should be an affinity between bridge and the business of Wall Street. Bridge is a contest between teams, each of which competes over a "contract"—how many tricks they think they can win in a given hand. Winning requires knowledge of the cards, an accurate sense of probabilities, steely nerves, and the ability to assess an opponent's psychology. Bridge is Wall Street in miniature, and the reason the light bulb went on when Greenberg looked at Cayne, and Cayne looked at Spector, is surely that they assumed that bridge skills could be transferred to the trading floor—that being good at the game version of Wall Street was a reasonable proxy for being good at the real-life version of Wall Street.
It isn't, however. In bridge, there is such a thing as expertise unencumbered by bias. That's because, as the psychologist Gideon Keren points out, bridge involves "related items with continuous feedback." It has rules and boundaries and situations that repeat themselves and clear patterns that develop—and when a player makes a mistake of overconfidence he or she learns of the consequences of that mistake almost immediately. In other words, it's a game. But running an investment bank is not, in this sense, a game: it is not a closed world with a limited set of possibilities. It is an open world where one day a calamity can happen that no one had dreamed could happen, and where you can make a mistake of overconfidence and not personally feel the consequences for years and years—if at all. Perhaps this is part of why we play games: there is something intoxicating about pure expertise, and the real mastery we can attain around a card table or behind the wheel of a racecar emboldens us when we move into the more complex realms. "I'm good at that. I must be good at this, too," we tell ourselves, forgetting that in wars and on Wall Street there is no such thing as absolute expertise, that every step taken toward mastery brings with it an increased risk of mastery's curse. Cayne must have come back from the Spingold bridge tournament fortified in his belief in his own infallibility. And the striking thing about his conversations with Cohan is that nothing that had happened since seemed to have shaken that belief.
"When I left," Cayne told Cohan, speaking of his final day at Bear Stearns, "I had three different meetings. The first was with the president's advisory group, which was about eighty people. There wasn't a dry eye. Standing ovation. I was crying." Until the very end, he evidently saw the world that he wanted to see. "The second meeting was with the retail sales force on the Web," he goes on. "Standing ovation. And the third was a partners' meeting that night for me to tell them that I was stepping down. Standing ovation, of the whole auditorium."