gladwell articles

Connecting the Dots


March 10, 2003
THE CRITICS

The paradoxes of intelligence reform.

1.

In the fall of 1973, the Syrian Army began to gather a large number of tanks, artillery batteries, and infantry along its border with Israel. Simultaneously, to the south, the Egyptian Army cancelled all leaves, called up thousands of reservists, and launched a massive military exercise, building roads and preparing anti-aircraft and artillery positions along the Suez Canal. On October 4th, an Israeli aerial reconnaissance mission showed that the Egyptians had moved artillery into offensive positions. That evening, AMAN, the Israeli military intelligence agency, learned that portions of the Soviet fleet near Port Said and Alexandria had set sail, and that the Soviet government had begun airlifting the families of Soviet advisers out of Cairo and Damascus. Then, at four o'clock in the morning on October 6th, Israel's director of military intelligence received an urgent telephone call from one of the country's most trusted intelligence sources. Egypt and Syria, the source said, would attack later that day. Top Israeli officials immediately called a meeting. Was war imminent? The head of AMAN, Major General Eli Zeira, looked over the evidence and said he didn't think so. He was wrong. That afternoon, Syria attacked from the east, overwhelming the thin Israeli defenses in the Golan Heights, and Egypt attacked from the south, bombing Israeli positions and sending eight thousand infantry streaming across the Suez. Despite all the warnings of the previous weeks, Israeli officials were caught by surprise. Why couldn't they connect the dots?

If you start on the afternoon of October 6th and work backward, the trail of clues pointing to an attack seems obvious; you'd have to conclude that something was badly wrong with the Israeli intelligence service. On the other hand, if you start several years before the Yom Kippur War and work forward, re-creating what people in Israeli intelligence knew in the same order that they knew it, a very different picture emerges. In the fall of 1973, Egypt and Syria certainly looked as if they were preparing to go to war. But, in the Middle East of the time, countries always looked as if they were going to war. In the fall of 1971, for instance, both Egypt's President and its minister of war stated publicly that the hour of battle was approaching. The Egyptian Army was mobilized. Tanks and bridging equipment were sent to the canal. Offensive positions were readied. And nothing happened. In December of 1972, the Egyptians mobilized again. The Army furiously built fortifications along the canal. A reliable source told Israeli intelligence that an attack was imminent. Nothing happened. In the spring of 1973, the President of Egypt told Newsweek that everything in his country "is now being mobilized in earnest for the resumption of battle." Egyptian forces were moved closer to the canal. Extensive fortifications were built along the Suez. Blood donors were rounded up. Civil-defense personnel were mobilized. Blackouts were imposed throughout Egypt. A trusted source told Israeli intelligence that an attack was imminent. It didn't come. Between January and October of 1973, the Egyptian Army mobilized nineteen times without going to war. The Israeli government couldn't mobilize its Army every time its neighbors threatened war. Israel is a small country with a citizen Army. Mobilization was disruptive and expensive, and the Israeli government was acutely aware that if its Army was mobilized and Egypt and Syria weren't serious about war, the very act of mobilization might cause them to become serious about war.

Nor did the other signs seem remarkable. The fact that the Soviet families had been sent home could have signified nothing more than a falling-out between the Arab states and Moscow. Yes, a trusted source called at four in the morning, with definite word of a late afternoon attack, but his last two attack warnings had been wrong. What's more, the source said that the attack would come at sunset, and an attack so late in the day wouldn't leave enough time for opening air strikes. Israeli intelligence didn't see the pattern of Arab intentions, in other words, because, until Egypt and Syria actually attacked, on the afternoon of October 6, 1973, their intentions didn't form a pattern. They formed a Rorschach blot. What is clear in hindsight is rarely clear before the fact. It's an obvious point, but one that nonetheless bears repeating, particularly when we're in the midst of assigning blame for the surprise attack of September 11th.

2.

Of the many postmortems conducted after September 11th, the one that has received the most attention is "The Cell: Inside the 9/11 Plot, and Why the F.B.I. and C.I.A. Failed to Stop It" (Hyperion; $24.95), by John Miller, Michael Stone, and Chris Mitchell. The authors begin their tale with El Sayyid Nosair, the Egyptian who was arrested in November of 1990 for shooting Rabbi Meir Kahane, the founder of the Jewish Defense League, in the ballroom of the Marriott Hotel in midtown Manhattan. Nosair's apartment in New Jersey was searched, and investigators found sixteen boxes of files, including training manuals from the Army Special Warfare School; copies of teletypes that had been routed to the Joint Chiefs of Staff; bombmaking manuals; and maps, annotated in Arabic, of landmarks like the Statue of Liberty, Rockefeller Center, and the World Trade Center. According to "The Cell," Nosair was connected to gunrunners and to Islamic radicals in Brooklyn, who were in turn behind the World Trade Center bombing two and a half years later, which was masterminded by Ramzi Yousef, who then showed up in Manila in 1994, apparently plotting to kill the Pope, crash a plane into the Pentagon or the C.I.A., and bomb as many as twelve transcontinental airliners simultaneously. And who was Yousef associating with in the Philippines? Mohammed Khalifa, Wali Khan Amin Shah, and Ibrahim Munir, all of whom had fought alongside, pledged a loyalty oath to, or worked for a shadowy Saudi Arabian millionaire named Osama bin Laden.

Miller was a network-television correspondent throughout much of the past decade, and the best parts of "The Cell" recount his own experiences in covering the terrorist story. He is an extraordinary reporter. At the time of the first World Trade Center attack, in February of 1993, he clapped a flashing light on the dashboard of his car and followed the wave of emergency vehicles downtown. (At the bombing site, he was continuously trailed by a knot of reporters--I was one of them--who had concluded that the best way to learn what was going on was to try to overhear his conversations.) Miller became friends with the F.B.I. agents who headed the New York counterterrorist office--Neil Herman and John O'Neill, in particular--and he became as obsessed with Al Qaeda as they were. He was in Yemen, with the F.B.I., after Al Qaeda bombed the U.S.S. Cole. In 1998, at the Marriott in Islamabad, he and his cameraman met someone known to them only as Akhtar, who spirited them across the border into the hills of Afghanistan to interview Osama bin Laden. In "The Cell," the period from 1990 through September 11th becomes a seamless, devastating narrative: the evolution of Al Qaeda. "How did this happen to us?" the book asks in its opening pages. The answer, the authors argue, can be found by following the "thread" connecting Kahane's murder to September 11th. In the events of the past decade, they declare, there is a clear "recurring pattern."

The same argument is made by Senator Richard Shelby, vice-chairman of the Senate Select Committee on Intelligence, in his investigative report on September 11th, released this past December. The report is a lucid and powerful document, in which Shelby painstakingly points out all the missed or misinterpreted signals pointing to a major terrorist attack. The C.I.A. knew that two suspected Al Qaeda operatives, Khalid al-Mihdhar and Nawaf al-Hazmi, had entered the country, but the C.I.A. didn't tell the F.B.I. or the N.S.C. An F.B.I. agent in Phoenix sent a memo to headquarters that began with the sentence "The purpose of this communication is to advise the bureau and New York of the possibility of a coordinated effort by Osama Bin Laden to send students to the United States to attend civilian aviation universities and colleges." But the F.B.I. never acted on the information, and failed to connect it with reports that terrorists were interested in using airplanes as weapons. The F.B.I. took into custody the suspected terrorist Zacarias Moussaoui, on account of his suspicious behavior at flight school, but was unable to integrate his case into a larger picture of terrorist behavior. "The most fundamental problem . . . is our Intelligence Community's inability to 'connect the dots' available to it before September 11, 2001, about terrorists' interest in attacking symbolic American targets," the Shelby report states. The phrase "connect the dots" appears so often in the report that it becomes a kind of mantra. There was a pattern, as plain as day in retrospect, yet the vaunted American intelligence community simply could not see it.

None of these postmortems, however, answer the question raised by the Yom Kippur War: Was this pattern obvious before the attack? This question--whether we revise our judgment of events after the fact--is something that psychologists have paid a great deal of attention to. For example, on the eve of Richard Nixon's historic visit to China, the psychologist Baruch Fischhoff asked a group of people to estimate the probability of a series of possible outcomes of the trip. What were the chances that the trip would lead to permanent diplomatic relations between China and the United States? That Nixon would meet with the leader of China, Mao Tse-tung, at least once? That Nixon would call the trip a success? As it turned out, the trip was a diplomatic triumph, and Fischhoff then went back to the same people and asked them to recall what their estimates of the different outcomes of the visit had been. He found that the subjects now, overwhelmingly, "remembered" being more optimistic than they had actually been. If you originally thought that it was unlikely that Nixon would meet with Mao, afterward, when the press was full of accounts of Nixon's meeting with Mao, you'd "remember" that you had thought the chances of a meeting were pretty good. Fischhoff calls this phenomenon "creeping determinism"--the sense that grows on us, in retrospect, that what has happened was actually inevitable--and the chief effect of creeping determinism, he points out, is that it turns unexpected events into expected events. As he writes, "The occurrence of an event increases its reconstructed probability and makes it less surprising than it would have been had the original probability been remembered."

To read the Shelby report, or the seamless narrative from Nosair to bin Laden in "The Cell," is to be convinced that if the C.I.A. and the F.B.I. had simply been able to connect the dots, what happened on September 11th should not have been a surprise at all. Is this a fair criticism, or is it just a case of creeping determinism?

3.

On August 7, 1998, two Al Qaeda terrorists detonated a cargo truck filled with explosives outside the United States Embassy in Nairobi, killing two hundred and thirteen people and injuring more than four thousand. Miller, Stone, and Mitchell see the Kenyan Embassy bombing as a textbook example of intelligence failure. The C.I.A., they tell us, had identified an Al Qaeda cell in Kenya well before the attack, and its members were under surveillance. They had an eight-page letter, written by an Al Qaeda operative, speaking of the imminent arrival of "engineers"--the code word for bombmakers--in Nairobi. The United States Ambassador to Kenya, Prudence Bushnell, had begged Washington for more security. A prominent Kenyan lawyer and legislator says that the Kenyan intelligence service warned U.S. intelligence about the plot several months before August 7th, and in November of 1997 a man named Mustafa Mahmoud Said Ahmed, who worked for one of Osama bin Laden's companies, walked into the United States Embassy in Nairobi and told American intelligence of a plot to blow up the building. What did our officials do? They forced the leader of the Kenyan cell--a U.S. citizen--to return home, and then abruptly halted their surveillance of the group. They ignored the eight-page letter. They allegedly showed the Kenyan intelligence service's warning to the Mossad, which dismissed it, and after questioning Ahmed they decided that he wasn't credible. After the bombing, "The Cell" tells us, a senior State Department official phoned Bushnell and asked, "How could this have happened?"

"For the first time since the blast," Miller, Stone, and Mitchell write, "Bushnell's horror turned to anger. There was too much history. 'I wrote you a letter,' she said."

This is all very damning, but doesn't it fall into the creeping-determinism trap? It's an edited version of the past. What we don't hear about is all the other people whom American intelligence had under surveillance, how many other warnings they received, and how many other tips came in that seemed promising at the time but led nowhere. The central challenge of intelligence gathering has always been the problem of "noise": the fact that useless information is vastly more plentiful than useful information. Shelby's report mentions that the F.B.I.'s counterterrorism division has sixty-eight thousand outstanding and unassigned leads dating back to 1995. And, of those, probably no more than a few hundred are useful. Analysts, in short, must be selective, and the decisions made in Kenya, by that standard, do not seem unreasonable. Surveillance on the cell was shut down, but, then, its leader had left the country. Bushnell warned Washington--but, as "The Cell" admits, there were bomb warnings in Africa all the time. Officials at the Mossad thought the Kenyan intelligence was dubious, and the Mossad ought to know. Ahmed may have worked for bin Laden, but he failed a polygraph test, and it was also learned that he had previously given similar--groundless--warnings to other embassies in Africa. When a man comes into your office, fails a lie-detector test, and is found to have shopped the same unsubstantiated story all over town, can you be blamed for turning him out?

Miller, Stone, and Mitchell make the same mistake when they quote from a transcript of a conversation that was recorded by Italian intelligence in August of 2001 between two Al Qaeda operatives, Abdel Kader Es Sayed and a man known as al Hilal. This, they say, is yet another piece of intelligence that "seemed to forecast the September 11 attacks."

"I've been studying airplanes," al Hilal tells Es Sayed. "If God wills, I hope to be able to bring you a window or a piece of a plane the next time I see you."

"What, is there a jihad planned?" Es Sayed asks. "In the future, listen to the news and remember these words: 'Up above,'" al Hilal replies.

Es Sayed thinks that al Hilal is referring to an operation in his native Yemen, but al Hilal corrects him: "But the surprise attack will come from the other country, one of those attacks you will never forget."

A moment later al Hilal says about the plan, "It is something terrifying that goes from south to north, east to west. The person who devised this plan is a madman, but a genius. He will leave them frozen [in shock]."

This is a tantalizing exchange. It would now seem that it refers to September 11th. But in what sense was it a "forecast"? It gave neither time nor place nor method nor target. It suggested only that there were terrorists out there who liked to talk about doing something dramatic with an airplane--which did not, it must be remembered, reliably distinguish them from any other terrorists of the past thirty years.

In the real world, intelligence is invariably ambiguous. Information about enemy intentions tends to be short on detail. And information that's rich in detail tends to be short on intentions. In April of 1941, for instance, the Allies learned that Germany had moved a huge army up to the Russian front. The intelligence was beyond dispute: the troops could be seen and counted. But what did it mean? Churchill concluded that Hitler wanted to attack Russia. Stalin concluded that Hitler was serious about attacking, but only if the Soviet Union didn't meet the terms of the German ultimatum. The British foreign secretary, Anthony Eden, thought that Hitler was bluffing, in the hope of winning further Russian concessions. British intelligence thought--at least, in the beginning--that Hitler simply wanted to reinforce his eastern frontier against a possible Soviet attack. The only way for this piece of intelligence to have been definitive would have been if the Allies had had a second piece of intelligence--like the phone call between al Hilal and Es Sayed--that demonstrated Germany's true purpose. Similarly, the only way the al Hilal phone call would have been definitive is if we'd also had intelligence as detailed as the Allied knowledge of German troop movements. But rarely do intelligence services have the luxury of both kinds of information. Nor are their analysts mind readers. It is only with hindsight that human beings acquire that skill.

"The Cell" tells us that, in the final months before September 11th, Washington was frantic with worry:

A spike in phone traffic among suspected al Qaeda members in the early part of the summer [of 2001], as well as debriefings of [an al Qaeda operative in custody] who had begun cooperating with the government, convinced investigators that bin Laden was planning a significant operation--one intercepted al Qaeda message spoke of a "Hiroshima-type" event--and that he was planning it soon. Through the summer, the CIA repeatedly warned the White House that attacks were imminent.

The fact that these worries did not protect us is not evidence of the limitations of the intelligence community. It is evidence of the limitations of intelligence.

4.

In the early nineteen-seventies, a professor of psychology at Stanford University named David L. Rosenhan gathered together a painter, a graduate student, a pediatrician, a psychiatrist, a housewife, and three psychologists. He told them to check into different psychiatric hospitals under aliases, with the complaint that they had been hearing voices. They were instructed to say that the voices were unfamiliar, and that they heard words like "empty," "thud," and "hollow." Apart from that initial story, the pseudopatients were instructed to answer every question truthfully, to behave as they normally would, and to tell the hospital staff--at every opportunity--that the voices were gone and that they had experienced no further symptoms. The eight subjects were hospitalized, on average, for nineteen days. One was kept for almost two months. Rosenhan wanted to find out if the hospital staffs would ever see through the ruse. They never did.

Rosenhan's test is, in a way, a classic intelligence problem. Here was a signal (a sane person) buried in a mountain of conflicting and confusing noise (a mental hospital), and the intelligence analysts (the doctors) were asked to connect the dots--and they failed spectacularly. In the course of their hospital stay, the eight pseudopatients were given a total of twenty-one hundred pills. They underwent psychiatric interviews, and sober case summaries documenting their pathologies were written up. They were asked by Rosenhan to take notes documenting how they were treated, and this quickly became part of their supposed pathology. "Patient engaging in writing behavior," one nurse ominously wrote in her notes. Having been labelled as ill upon admission, they could not shake the diagnosis. "Nervous?" a friendly nurse asked one of the subjects as he paced the halls one day. "No," he corrected her, to no avail, "bored."

The solution to this problem seems obvious enough. Doctors and nurses need to be made alert to the possibility that sane people sometimes get admitted to mental hospitals. So Rosenhan went to a research-and-teaching hospital and informed the staff that at some point in the next three months he would once again send over one or more of his pseudopatients. This time, of the hundred and ninety-three patients admitted in the three-month period, forty-one were identified by at least one staff member as being almost certainly sane. Once again, however, they were wrong. Rosenhan hadn't sent anyone over. In attempting to solve one kind of intelligence problem (overdiagnosis), the hospital simply created another problem (underdiagnosis). This is the second, and perhaps more serious, consequence of creeping determinism: in our zeal to correct what we believe to be the problems of the past, we end up creating new problems for the future.

Pearl Harbor, for example, was widely considered to be an organizational failure. The United States had all the evidence it needed to predict the Japanese attack, but the signals were scattered throughout the various intelligence services. The Army and the Navy didn't talk to each other. They spent all their time arguing and competing. This was, in part, why the Central Intelligence Agency was created, in 1947--to insure that all intelligence would be collected and processed in one place. Twenty years after Pearl Harbor, the United States suffered another catastrophic intelligence failure, at the Bay of Pigs: the Kennedy Administration grossly underestimated the Cubans' capacity to fight and their support for Fidel Castro. This time, however, the diagnosis was completely different. As Irving L. Janis concluded in his famous study of "groupthink," the root cause of the Bay of Pigs fiasco was that the operation was conceived by a small, highly cohesive group whose close ties inhibited the beneficial effects of argument and competition. Centralization was now the problem. One of the most influential organizational sociologists of the postwar era, Harold Wilensky, went out of his way to praise the "constructive rivalry" fostered by Franklin D. Roosevelt, which, he says, is why the President had such formidable intelligence on how to attack the economic ills of the Great Depression. In his classic 1967 work "Organizational Intelligence," Wilensky pointed out that Roosevelt would

use one anonymous informant's information to challenge and check another's, putting both on their toes; he recruited strong personalities and structured their work so that clashes would be certain. . . . In foreign affairs, he gave Moley and Welles tasks that overlapped those of Secretary of State Hull; in conservation and power, he gave Ickes and Wallace identical missions; in welfare, confusing both functions and initials, he assigned PWA to Ickes, WPA to Hopkins; in politics, Farley found himself competing with other political advisors for control over patronage. The effect: the timely advertisement of arguments, with both the experts and the President pressured to consider the main choices as they came boiling up from below.

The intelligence community that we had prior to September 11th was the direct result of this philosophy. The F.B.I. and the C.I.A. were supposed to be rivals, just as Ickes and Wallace were rivals. But now we've changed our minds. The F.B.I. and the C.I.A., Senator Shelby tells us disapprovingly, argue and compete with one another. The September 11th story, his report concludes, "should be an object lesson in the perils of failing to share information promptly and efficiently between (and within) organizations." Shelby wants recentralization and more focus on coöperation. He wants a "central national level knowledge-compiling entity standing above and independent from the disputatious bureaucracies." He thinks the intelligence service should be run by a small, highly cohesive group, and so he suggests that the F.B.I. be removed from the counterterrorism business entirely. The F.B.I., according to Shelby, is governed by

deeply-entrenched individual mindsets that prize the production of evidence-supported narratives of defendant wrongdoing over the drawing of probabilistic inferences based on incomplete and fragmentary information in order to support decision-making. . . . Law enforcement organizations handle information, reach conclusions, and ultimately just think differently than intelligence organizations. Intelligence analysts would doubtless make poor policemen, and it has become very clear that policemen make poor intelligence analysts.

In his State of the Union Message, President George W. Bush did what Shelby wanted, and announced the formation of the Terrorist Threat Integration Center--a special unit combining the antiterrorist activities of the F.B.I. and the C.I.A. The cultural and organizational diversity of the intelligence business, once prized, is now despised.

The truth is, though, that it is just as easy, in the wake of September 11th, to make the case for the old system. Isn't it an advantage that the F.B.I. doesn't think like the C.I.A.? It was the F.B.I., after all, that produced two of the most prescient pieces of analysis--the request by the Minneapolis office for a warrant to secretly search Zacarias Moussaoui's belongings, and the now famous Phoenix memo. In both cases, what was valuable about the F.B.I.'s analysis was precisely the way in which it differed from the traditional "big picture," probabilistic inference-making of the analyst. The F.B.I. agents in the field focussed on a single case, dug deep, and came up with an "evidence-supported narrative of defendant wrongdoing" that spoke volumes about a possible Al Qaeda threat.

The same can be said for the alleged problem of rivalry. "The Cell" describes what happened after police in the Philippines searched the apartment that Ramzi Yousef shared with his co-conspirator, Abdul Hakim Murad. Agents from the F.B.I.'s counterterrorism unit immediately flew to Manila and "bumped up against the C.I.A." As the old adage about the Bureau and the Agency has it, the F.B.I. wanted to string Murad up and the C.I.A. wanted to string him along. The two groups eventually worked together, but only because they had to. It was a relationship "marred by rivalry and mistrust." But what's wrong with this kind of rivalry? As Miller, Stone, and Mitchell tell us, the real objection of Neil Herman--the F.B.I.'s former domestic counterterrorism chief--to "working with the C.I.A. had nothing to do with procedure. He just didn't think the Agency was going to be of any help in finding Ramzi Yousef. 'Back then, I don't think the C.I.A. could have found a person in a bathroom,'" Herman says. "'Hell, I don't think they could have found the bathroom.'" The assumption of the reformers is always that the rivalry between the F.B.I. and the C.I.A. is essentially marital, that it is the dysfunction of people who ought to work together but can't. But it could equally be seen as a version of the marketplace rivalry that leads to companies working harder and making better products.

There is no such thing as a perfect intelligence system, and every seeming improvement involves a tradeoff. A couple of months ago, for example, a suspect in custody in Canada, who was wanted in New York on forgery charges, gave police the names and photographs of five Arab immigrants, who he said had crossed the border into the United States. The F.B.I. put out an alert on December 29th, posting the names and photographs on its Web site, in the "war on terrorism" section. Even President Bush joined in, saying, "We need to know why they have been smuggled into the country, what they're doing in the country." As it turned out, the suspect in Canada had made the story up. Afterward, an F.B.I. official said that the agency circulated the photographs in order to "err on the side of caution." Our intelligence services today are highly sensitive. But this kind of sensitivity is not without its costs. As the political scientist Richard K. Betts wrote in his essay "Analysis, War, and Decision: Why Intelligence Failures Are Inevitable," "Making warning systems more sensitive reduces the risk of surprise, but increases the number of false alarms, which in turn reduces sensitivity." When we run out and buy duct tape to seal our windows against chemical attack, and nothing happens, and when the government's warning light is orange for weeks on end, and nothing happens, we soon begin to doubt every warning that comes our way. Why was the Pacific fleet at Pearl Harbor so unresponsive to signs of an impending Japanese attack? Because, in the week before December 7, 1941, they had checked out seven reports of Japanese submarines in the area--and all seven were false. Rosenhan's psychiatrists used to miss the sane; then they started to see sane people everywhere. That is a change, but it is not exactly progress.

5.

In the wake of the Yom Kippur War, the Israeli government appointed a special investigative commission, and one of the witnesses called was Major General Zeira, the head of AMAN. Why, they asked, had he insisted that war was not imminent? His answer was simple:

The Chief of Staff has to make decisions, and his decisions must be clear. The best support that the head of AMAN can give the Chief of Staff is to give a clear and unambiguous estimate, provided that it is done in an objective fashion. To be sure, the clearer and sharper the estimate, the clearer and sharper the mistake--but this is a professional hazard for the head of AMAN.

The historians Eliot A. Cohen and John Gooch, in their book "Military Misfortunes," argue that it was Zeira's certainty that had proved fatal: "The culpable failure of AMAN's leaders in September and October 1973 lay not in their belief that Egypt would not attack but in their supreme confidence, which dazzled decision-makers. . . . Rather than impress upon the prime minister, the chief of staff and the minister of defense the ambiguity of the situation, they insisted--until the last day--that there would be no war, period."

But, of course, Zeira gave an unambiguous answer to the question of war because that is what politicians and the public demanded of him. No one wants ambiguity. Today, the F.B.I. gives us color-coded warnings and speaks of "increased chatter" among terrorist operatives, and the information is infuriating to us because it is so vague. What does "increased chatter" mean? We want a prediction. We want to believe that the intentions of our enemies are a puzzle that intelligence services can piece together, so that a clear story emerges. But there rarely is a clear story--at least, not until afterward, when some enterprising journalist or investigative committee decides to write one.

The Talent Myth


July 22, 2002
DEPT. OF HUMAN RESOURCES

Are smart people overrated?

1.

Five years ago, several executives at McKinsey & Company, America's largest and most prestigious management-consulting firm, launched what they called the War for Talent. Thousands of questionnaires were sent to managers across the country. Eighteen companies were singled out for special attention, and the consultants spent up to three days at each firm, interviewing everyone from the C.E.O. down to the human-resources staff. McKinsey wanted to document how the top-performing companies in America differed from other firms in the way they handled matters like hiring and promotion. But, as the consultants sifted through the piles of reports and questionnaires and interview transcripts, they grew convinced that the difference between winners and losers was more profound than they had realized. "We looked at one another and suddenly the light bulb blinked on," the three consultants who headed the project--Ed Michaels, Helen Handfield-Jones, and Beth Axelrod--write in their new book, also called "The War for Talent." The very best companies, they concluded, had leaders who were obsessed with the talent issue. They recruited ceaselessly, finding and hiring as many top performers as possible. They singled out and segregated their stars, rewarding them disproportionately, and pushing them into ever more senior positions. "Bet on the natural athletes, the ones with the strongest intrinsic skills," the authors approvingly quote one senior General Electric executive as saying. "Don't be afraid to promote stars without specifically relevant experience, seemingly over their heads." Success in the modern economy, according to Michaels, Handfield-Jones, and Axelrod, requires "the talent mind-set": the "deep-seated belief that having better talent at all levels is how you outperform your competitors."

This "talent mind-set" is the new orthodoxy of American management. It is the intellectual justification for why such a high premium is placed on degrees from first-tier business schools, and why the compensation packages for top executives have become so lavish. In the modern corporation, the system is considered only as strong as its stars, and, in the past few years, this message has been preached by consultants and management gurus all over the world. None, however, have spread the word quite so ardently as McKinsey, and, of all its clients, one firm took the talent mind-set closest to heart. It was a company where McKinsey conducted twenty separate projects, where McKinsey's billings topped ten million dollars a year, where a McKinsey director regularly attended board meetings, and where the C.E.O. himself was a former McKinsey partner. The company, of course, was Enron.

The Enron scandal is now almost a year old. The reputations of Jeffrey Skilling and Kenneth Lay, the company's two top executives, have been destroyed. Arthur Andersen, Enron's auditor, has been driven out of business, and now investigators have turned their attention to Enron's investment bankers. The one Enron partner that has escaped largely unscathed is McKinsey, which is odd, given that it essentially created the blueprint for the Enron culture. Enron was the ultimate "talent" company. When Skilling started the corporate division known as Enron Capital and Trade, in 1990, he "decided to bring in a steady stream of the very best college and M.B.A. graduates he could find to stock the company with talent," Michaels, Handfield-Jones, and Axelrod tell us. During the nineties, Enron was bringing in two hundred and fifty newly minted M.B.A.s a year. "We had these things called Super Saturdays," one former Enron manager recalls. "I'd interview some of these guys who were fresh out of Harvard, and these kids could blow me out of the water. They knew things I'd never heard of." Once at Enron, the top performers were rewarded inordinately, and promoted without regard for seniority or experience. Enron was a star system. "The only thing that differentiates Enron from our competitors is our people, our talent," Lay, Enron's former chairman and C.E.O., told the McKinsey consultants when they came to the company's headquarters, in Houston. Or, as another senior Enron executive put it to Richard Foster, a McKinsey partner who celebrated Enron in his 2001 book, "Creative Destruction," "We hire very smart people and we pay them more than they think they are worth."

The management of Enron, in other words, did exactly what the consultants at McKinsey said that companies ought to do in order to succeed in the modern economy. It hired and rewarded the very best and the very brightest--and it is now in bankruptcy. The reasons for its collapse are complex, needless to say. But what if Enron failed not in spite of its talent mind-set but because of it? What if smart people are overrated?

2.

At the heart of the McKinsey vision is a process that the War for Talent advocates refer to as "differentiation and affirmation." Employers, they argue, need to sit down once or twice a year and hold a "candid, probing, no-holds-barred debate about each individual," sorting employees into A, B, and C groups. The A's must be challenged and disproportionately rewarded. The B's need to be encouraged and affirmed. The C's need to shape up or be shipped out. Enron followed this advice almost to the letter, setting up internal Performance Review Committees. The members got together twice a year, and graded each person in their section on ten separate criteria, using a scale of one to five. The process was called "rank and yank." Those graded at the top of their unit received bonuses two-thirds higher than those in the next thirty per cent; those who ranked at the bottom received no bonuses and no extra stock options--and in some cases were pushed out.

How should that ranking be done? Unfortunately, the McKinsey consultants spend very little time discussing the matter. One possibility is simply to hire and reward the smartest people. But the link between, say, I.Q. and job performance is distinctly underwhelming. On a scale where 0.1 or below means virtually no correlation and 0.7 or above implies a strong correlation (your height, for example, has a 0.7 correlation with your parents' height), the correlation between I.Q. and occupational success is between 0.2 and 0.3. "What I.Q. doesn't pick up is effectiveness at common-sense sorts of things, especially working with people," Richard Wagner, a psychologist at Florida State University, says. "In terms of how we evaluate schooling, everything is about working by yourself. If you work with someone else, it's called cheating. Once you get out in the real world, everything you do involves working with other people."
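To get a feel for how weak a correlation of 0.2 to 0.3 really is, here is a minimal simulation sketch. It is purely illustrative, not drawn from Wagner's data: it assumes a bivariate-normal relationship and uses r = 0.25, the midpoint of the range cited above. On those assumptions, I.Q. accounts for only about six per cent of the variation in job performance, and most people in the top tenth on I.Q. are not in the top tenth on performance.

```python
import numpy as np

# Illustrative assumption: bivariate-normal I.Q. and job performance with r = 0.25
# (midpoint of the 0.2-0.3 range cited above). Not real data.
rng = np.random.default_rng(0)
n = 100_000
r = 0.25

iq = rng.standard_normal(n)
performance = r * iq + np.sqrt(1 - r**2) * rng.standard_normal(n)

print(f"observed correlation: {np.corrcoef(iq, performance)[0, 1]:.2f}")
print(f"variance in performance accounted for by I.Q.: {r**2:.1%}")  # ~6%

# Of the top 10% on I.Q., how many are also top 10% on performance?
top_iq = iq >= np.quantile(iq, 0.9)
top_perf = performance >= np.quantile(performance, 0.9)
print(f"top-decile I.Q. who are also top-decile performers: {top_perf[top_iq].mean():.0%}")
```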

Wagner and Robert Sternberg, a psychologist at Yale University, have developed tests of this practical component, which they call "tacit knowledge." Tacit knowledge involves things like knowing how to manage yourself and others, and how to navigate complicated social situations. Here is a question from one of their tests:

You have just been promoted to head of an important department in your organization. The previous head has been transferred to an equivalent position in a less important department. Your understanding of the reason for the move is that the performance of the department as a whole has been mediocre. There have not been any glaring deficiencies, just a perception of the department as so-so rather than very good. Your charge is to shape up the department. Results are expected quickly. Rate the quality of the following strategies for succeeding at your new position.

a) Always delegate to the most junior person who can be trusted with the task.
b) Give your superiors frequent progress reports.
c) Announce a major reorganization of the department that includes getting rid of whomever you believe to be "dead wood."
d) Concentrate more on your people than on the tasks to be done.
e) Make people feel completely responsible for their work.

Wagner finds that how well people do on a test like this predicts how well they will do in the workplace: good managers pick (b) and (e); bad managers tend to pick (c). Yet there's no clear connection between such tacit knowledge and other forms of knowledge and experience. The process of assessing ability in the workplace is a lot messier than it appears.

An employer really wants to assess not potential but performance. Yet that's just as tricky. In "The War for Talent," the authors talk about how the Royal Air Force used the A, B, and C ranking system for its pilots during the Battle of Britain. But ranking fighter pilots--for whom there is a limited and relatively objective set of performance criteria (enemy kills, for example, and the ability to get their formations safely home)--is a lot easier than assessing how the manager of a new unit is doing at, say, marketing or business development. And whom do you ask to rate the manager's performance? Studies show that there is very little correlation between how someone's peers rate him and how his boss rates him. The only rigorous way to assess performance, according to human-resources specialists, is to use criteria that are as specific as possible. Managers are supposed to take detailed notes on their employees throughout the year, in order to remove subjective personal reactions from the process of assessment. You can grade someone's performance only if you know their performance. And, in the freewheeling culture of Enron, this was all but impossible. People deemed "talented" were constantly being pushed into new jobs and given new challenges. Annual turnover from promotions was close to twenty per cent. Lynda Clemmons, the so-called "weather babe" who started Enron's weather derivatives business, jumped, in seven quick years, from trader to associate to manager to director and, finally, to head of her own business unit. How do you evaluate someone's performance in a system where no one is in a job long enough to allow such evaluation?

The answer is that you end up doing performance evaluations that aren't based on performance. Among the many glowing books about Enron written before its fall was the best-seller "Leading the Revolution," by the management consultant Gary Hamel, which tells the story of Lou Pai, who launched Enron's power-trading business. Pai's group began with a disaster: it lost tens of millions of dollars trying to sell electricity to residential consumers in newly deregulated markets. The problem, Hamel explains, is that the markets weren't truly deregulated: "The states that were opening their markets to competition were still setting rules designed to give their traditional utilities big advantages." It doesn't seem to have occurred to anyone that Pai ought to have looked into those rules more carefully before risking millions of dollars. He was promptly given the chance to build the commercial electricity-outsourcing business, where he ran up several more years of heavy losses before cashing out of Enron last year with two hundred and seventy million dollars. Because Pai had "talent," he was given new opportunities, and when he failed at those new opportunities he was given still more opportunities . . . because he had "talent." "At Enron, failure--even of the type that ends up on the front page of the Wall Street Journal--doesn't necessarily sink a career," Hamel writes, as if that were a good thing. Presumably, companies that want to encourage risk-taking must be willing to tolerate mistakes. Yet if talent is defined as something separate from an employee's actual performance, what use is it, exactly?

3.

What the War for Talent amounts to is an argument for indulging A employees, for fawning over them. "You need to do everything you can to keep them engaged and satisfied--even delighted," Michaels, Handfield-Jones, and Axelrod write. "Find out what they would most like to be doing, and shape their career and responsibilities in that direction. Solve any issues that might be pushing them out the door, such as a boss that frustrates them or travel demands that burden them." No company was better at this than Enron. In one oft-told story, Louise Kitchin, a twenty-nine-year-old gas trader in Europe, became convinced that the company ought to develop an online-trading business. She told her boss, and she began working in her spare time on the project, until she had two hundred and fifty people throughout Enron helping her. After six months, Skilling was finally informed. "I was never asked for any capital," Skilling said later. "I was never asked for any people. They had already purchased the servers. They had already started ripping apart the building. They had started legal reviews in twenty-two countries by the time I heard about it." It was, Skilling went on approvingly, "exactly the kind of behavior that will continue to drive this company forward."

Kitchin's qualification for running EnronOnline, it should be pointed out, was not that she was good at it. It was that she wanted to do it, and Enron was a place where stars did whatever they wanted. "Fluid movement is absolutely necessary in our company. And the type of people we hire enforces that," Skilling told the team from McKinsey. "Not only does this system help the excitement level for each manager, it shapes Enron's business in the direction that its managers find most exciting." Here is Skilling again: "If lots of [employees] are flocking to a new business unit, that's a good sign that the opportunity is a good one. . . . If a business unit can't attract people very easily, that's a good sign that it's a business Enron shouldn't be in." You might expect a C.E.O. to say that if a business unit can't attract customers very easily that's a good sign it's a business the company shouldn't be in. A company's business is supposed to be shaped in the direction that its managers find most profitable. But at Enron the needs of the customers and the shareholders were secondary to the needs of its stars.

A dozen years ago, the psychologists Robert Hogan, Robert Raskin, and Dan Fazzini wrote a brilliant essay called "The Dark Side of Charisma." It argued that flawed managers fall into three types. One is the High Likability Floater, who rises effortlessly in an organization because he never takes any difficult decisions or makes any enemies. Another is the Homme de Ressentiment, who seethes below the surface and plots against his enemies. The most interesting of the three is the Narcissist, whose energy and self-confidence and charm lead him inexorably up the corporate ladder. Narcissists are terrible managers. They resist accepting suggestions, thinking it will make them appear weak, and they don't believe that others have anything useful to tell them. "Narcissists are biased to take more credit for success than is legitimate," Hogan and his co-authors write, and "biased to avoid acknowledging responsibility for their failures and shortcomings for the same reasons that they claim more success than is their due." Moreover:

Narcissists typically make judgments with greater confidence than other people . . . and, because their judgments are rendered with such conviction, other people tend to believe them and the narcissists become disproportionately more influential in group situations. Finally, because of their self-confidence and strong need for recognition, narcissists tend to "self-nominate"; consequently, when a leadership gap appears in a group or organization, the narcissists rush to fill it.

Tyco Corporation and WorldCom were the Greedy Corporations: they were purely interested in short-term financial gain. Enron was the Narcissistic Corporation--a company that took more credit for success than was legitimate, that did not acknowledge responsibility for its failures, that shrewdly sold the rest of us on its genius, and that substituted self-nomination for disciplined management. At one point in "Leading the Revolution," Hamel tracks down a senior Enron executive, and what he breathlessly recounts--the braggadocio, the self-satisfaction--could be an epitaph for the talent mind-set:

"You cannot control the atoms within a nuclear fusion reaction," said Ken Rice when he was head of Enron Capital and Trade Resources (ECT), America's largest marketer of natural gas and largest buyer and seller of electricity. Adorned in a black T-shirt, blue jeans, and cowboy boots, Rice drew a box on an office whiteboard that pictured his business unit as a nuclear reactor. Little circles in the box represented its "contract originators," the gunslingers charged with doing deals and creating new businesses. Attached to each circle was an arrow. In Rice's diagram the arrows were pointing in all different directions. "We allow people to go in whichever direction that they want to go."

The distinction between the Greedy Corporation and the Narcissistic Corporation matters, because the way we conceive our attainments helps determine how we behave. Carol Dweck, a psychologist at Columbia University, has found that people generally hold one of two fairly firm beliefs about their intelligence: they consider it either a fixed trait or something that is malleable and can be developed over time. Five years ago, Dweck did a study at the University of Hong Kong, where all classes are conducted in English. She and her colleagues approached a large group of social-sciences students, told them their English-proficiency scores, and asked them if they wanted to take a course to improve their language skills. One would expect all those who scored poorly to sign up for the remedial course. The University of Hong Kong is a demanding institution, and it is hard to do well in the social sciences without strong English skills. Curiously, however, only the ones who believed in malleable intelligence expressed interest in the class. The students who believed that their intelligence was a fixed trait were so concerned about appearing to be deficient that they preferred to stay home. "Students who hold a fixed view of their intelligence care so much about looking smart that they act dumb," Dweck writes, "for what could be dumber than giving up a chance to learn something that is essential for your own success?"

In a similar experiment, Dweck gave a class of preadolescent students a test filled with challenging problems. After they were finished, one group was praised for its effort and another group was praised for its intelligence. Those praised for their intelligence were reluctant to tackle difficult tasks, and their performance on subsequent tests soon began to suffer. Then Dweck asked the children to write a letter to students at another school, describing their experience in the study. She discovered something remarkable: forty per cent of those students who were praised for their intelligence lied about how they had scored on the test, adjusting their grade upward. They weren't naturally deceptive people, and they weren't any less intelligent or self-confident than anyone else. They simply did what people do when they are immersed in an environment that celebrates them solely for their innate "talent." They begin to define themselves by that description, and when times get tough and that self-image is threatened they have difficulty with the consequences. They will not take the remedial course. They will not stand up to investors and the public and admit that they were wrong. They'd sooner lie.

4.

The broader failing of McKinsey and its acolytes at Enron is their assumption that an organization's intelligence is simply a function of the intelligence of its employees. They believe in stars, because they don't believe in systems. In a way, that's understandable, because our lives are so obviously enriched by individual brilliance. Groups don't write great novels, and a committee didn't come up with the theory of relativity. But companies work by different rules. They don't just create; they execute and compete and coördinate the efforts of many different people, and the organizations that are most successful at that task are the ones where the system is the star.

There is a wonderful example of this in the story of the so-called Eastern Pearl Harbor, of the Second World War. During the first nine months of 1942, the United States Navy suffered a catastrophe. German U-boats, operating just off the Atlantic coast and in the Caribbean, were sinking our merchant ships almost at will. U-boat captains marvelled at their good fortune. "Before this sea of light, against this footlight glare of a carefree new world were passing the silhouettes of ships recognizable in every detail and sharp as the outlines in a sales catalogue," one U-boat commander wrote. "All we had to do was press the button."

What made this such a puzzle is that, on the other side of the Atlantic, the British had much less trouble defending their ships against U-boat attacks. The British, furthermore, eagerly passed on to the Americans everything they knew about sonar and depth-charge throwers and the construction of destroyers. And still the Germans managed to paralyze America's coastal zones.

You can imagine what the consultants at McKinsey would have concluded: they would have said that the Navy did not have a talent mind-set, that President Roosevelt needed to recruit and promote top performers into key positions in the Atlantic command. In fact, he had already done that. At the beginning of the war, he had pushed out the solid and unspectacular Admiral Harold R. Stark as Chief of Naval Operations and replaced him with the legendary Ernest Joseph King. "He was a supreme realist with the arrogance of genius," Ladislas Farago writes in "The Tenth Fleet," a history of the Navy's U-boat battles in the Second World War. "He had unbounded faith in himself, in his vast knowledge of naval matters and in the soundness of his ideas. Unlike Stark, who tolerated incompetence all around him, King had no patience with fools."

The Navy had plenty of talent at the top, in other words. What it didn't have was the right kind of organization. As Eliot A. Cohen, a scholar of military strategy at Johns Hopkins, writes in his brilliant book "Military Misfortunes":

To wage the antisubmarine war well, analysts had to bring together fragments of information, direction-finding fixes, visual sightings, decrypts, and the "flaming datum" of a U-boat attack--for use by a commander to coordinate the efforts of warships, aircraft, and convoy commanders. Such synthesis had to occur in near "real time"--within hours, even minutes in some cases.

The British excelled at the task because they had a centralized operational system. The controllers moved the British ships around the Atlantic like chess pieces, in order to outsmart U-boat "wolf packs." By contrast, Admiral King believed strongly in a decentralized management structure: he held that managers should never tell their subordinates "'how' as well as 'what' to do." In today's jargon, we would say he was a believer in "loose-tight" management, of the kind celebrated by the McKinsey consultants Thomas J. Peters and Robert H. Waterman in their 1982 best-seller, "In Search of Excellence." But "loose-tight" doesn't help you find U-boats. Throughout most of 1942, the Navy kept trying to act smart by relying on technical know-how, and stubbornly refused to take operational lessons from the British. The Navy also lacked the organizational structure necessary to apply the technical knowledge it did have to the field. Only when the Navy set up the Tenth Fleet--a single unit to coördinate all anti-submarine warfare in the Atlantic--did the situation change. In the year and a half before the Tenth Fleet was formed, in May of 1943, the Navy sank thirty-six U-boats. In the six months afterward, it sank seventy-five. "The creation of the Tenth Fleet did not bring more talented individuals into the field of ASW"--anti-submarine warfare--"than had previous organizations," Cohen writes. "What Tenth Fleet did allow, by virtue of its organization and mandate, was for these individuals to become far more effective than previously." The talent myth assumes that people make organizations smart. More often than not, it's the other way around.

5.

There is ample evidence of this principle among America's most successful companies. Southwest Airlines hires very few M.B.A.s, pays its managers modestly, and gives raises according to seniority, not "rank and yank." Yet it is by far the most successful of all United States airlines, because it has created a vastly more efficient organization than its competitors have. At Southwest, the time it takes to get a plane that has just landed ready for takeoff--a key index of productivity--is, on average, twenty minutes, and requires a ground crew of four, and two people at the gate. (At United Airlines, by contrast, turnaround time is closer to thirty-five minutes, and requires a ground crew of twelve, and three agents at the gate.)

In the case of the giant retailer Wal-Mart, one of the most critical periods in its history came in 1976, when Sam Walton "unretired," pushing out his handpicked successor, Ron Mayer. Mayer was just over forty. He was ambitious. He was charismatic. He was, in the words of one Walton biographer, "the boy-genius financial officer." But Walton was convinced that Mayer was, as people at McKinsey would say, "differentiating and affirming" in the corporate suite, in defiance of Wal-Mart's inclusive culture. Mayer left, and Wal-Mart survived. After all, Wal-Mart is an organization, not an all-star team. Walton brought in David Glass, late of the Army and Southern Missouri State University, as C.E.O.; the company is now ranked No. 1 on the Fortune 500 list.

Procter & Gamble doesn't have a star system, either. How could it? Would the top M.B.A. graduates of Harvard and Stanford move to Cincinnati to work on detergent when they could make three times as much reinventing the world in Houston? Procter & Gamble isn't glamorous. Its C.E.O. is a lifer--a former Navy officer who began his corporate career as an assistant brand manager for Joy dishwashing liquid--and, if Procter & Gamble's best played Enron's best at Trivial Pursuit, no doubt the team from Houston would win handily. But Procter & Gamble has dominated the consumer-products field for close to a century, because it has a carefully conceived managerial system, and a rigorous marketing methodology that has allowed it to win battles for brands like Crest and Tide decade after decade. In Procter & Gamble's Navy, Admiral Stark would have stayed. But a cross-divisional management committee would have set the Tenth Fleet in place before the war ever started.

6.

Among the most damning facts about Enron, in the end, was something its managers were proudest of. They had what, in McKinsey terminology, is called an "open market" for hiring. In the open-market system--McKinsey's assault on the very idea of a fixed organization--anyone could apply for any job that he or she wanted, and no manager was allowed to hold anyone back. Poaching was encouraged. When an Enron executive named Kevin Hannon started the company's global broadband unit, he launched what he called Project Quick Hire. A hundred top performers from around the company were invited to the Houston Hyatt to hear Hannon give his pitch. Recruiting booths were set up outside the meeting room. "Hannon had his fifty top performers for the broadband unit by the end of the week," Michaels, Handfield-Jones, and Axelrod write, "and his peers had fifty holes to fill." Nobody, not even the consultants who were paid to think about the Enron culture, seemed worried that those fifty holes might disrupt the functioning of the affected departments, that stability in a firm's existing businesses might be a good thing, that the self-fulfillment of Enron's star employees might possibly be in conflict with the best interests of the firm as a whole.

These are the sort of concerns that management consultants ought to raise. But Enron's management consultant was McKinsey, and McKinsey was as much a prisoner of the talent myth as its clients were. In 1998, Enron hired ten Wharton M.B.A.s; that same year, McKinsey hired forty. In 1999, Enron hired twelve from Wharton; McKinsey hired sixty-one. The consultants at McKinsey were preaching at Enron what they believed about themselves. "When we would hire them, it wouldn't just be for a week," one former Enron manager recalls, of the brilliant young men and women from McKinsey who wandered the hallways at the company's headquarters. "It would be for two to four months. They were always around." They were there looking for people who had the talent to think outside the box. It never occurred to them that, if everyone had to think outside the box, maybe it was the box that needed fixing.

The Televisionary

May 27, 2002
A CRITIC AT LARGE

Big business and the myth of the lone inventor

1.

Philo T. Farnsworth was born in 1906, and he looked the way an inventor of that era was supposed to look: slight and gaunt, with bright-blue exhausted eyes, and a mane of brown hair swept back from his forehead. He was nervous and tightly wound. He rarely slept. He veered between fits of exuberance and depression. At the age of three, he was making precise drawings of the internal mechanisms of locomotives. At six, he declared his intention to follow in the footsteps of Thomas Edison and Alexander Graham Bell. At fourteen, while tilling a potato field on his family's farm in Idaho, he saw the neat, parallel lines of furrows in front of him, and it occurred to him--in a single, blinding moment--that a picture could be sent electronically through the airwaves in the same way, broken down into easily transmitted lines and then reassembled into a complete picture at the other end. He went to see his high-school science teacher, and covered the blackboard with drawings and equations. At nineteen, after dropping out of college, he impressed two local investors with his brilliance and his conviction. He moved to California and set up shop in a tiny laboratory. He got married on an impulse. On his wedding night, he seized his bride by the shoulders and looked at her with those bright-blue eyes. "Pemmie," he said. "I have to tell you. There is another woman in my life--and her name is Television."

Philo T. Farnsworth was the inventor of television. Through the nineteen-thirties and forties, he engaged in a heroic battle to perfect and commercialize his discovery, fending off creditors and predators, and working himself to the point of emotional and physical exhaustion. His nemesis was David Sarnoff, the head of RCA, then one of the most powerful American electronics companies. Sarnoff lived in an enormous Upper East Side mansion and smoked fat cigars and travelled by chauffeured limousine. His top television researcher was Vladimir Zworykin, the scion of a wealthy Russian family, who wore elegant three-piece suits and round spectacles, had a Ph.D. in physics, and apprenticed with the legendary Boris Rosing at the St. Petersburg Institute of Technology. Zworykin was never more than half a step behind Farnsworth: he filed for a patent on his own version of electronic television two years after Farnsworth had his potato-field vision. At one point, Sarnoff sent Zworykin to Farnsworth's tiny laboratory, on Green Street in San Francisco, and he stayed for three days, asking suspiciously detailed questions. He had one of Farnsworth's engineers build the heart of Farnsworth's television system--the so-called image dissector--before his eyes, and then picked the tube up and turned it over in his hands and said, ominously, "This is a beautiful instrument. I wish I had invented it myself." Soon Sarnoff himself came out to Green Street, swept imperially through the laboratory, and declared, "There's nothing here we'll need." It was, of course, a lie. In the nineteen-thirties, television was not possible without Philo Farnsworth's work. But in the end it didn't much matter. Farnsworth's company was forced out of the TV business. Farnsworth had a nervous breakdown, and Sarnoff used his wealth and power to declare himself the father of television.

The life of Philo Farnsworth is the subject of two new books, "The Last Lone Inventor," by Evan I. Schwartz (HarperCollins; $24.95), and "The Boy Genius and the Mogul," by Daniel Stashower (Broadway; $24.95). It is a wonderful tale, riveting and bittersweet. But its lessons, on closer examination, are less straightforward than the clichés of the doomed inventor and the villainous mogul might suggest. Philo Farnsworth's travails make a rather strong case for big corporations, not against them.

2.

The idea of television arose from two fundamental discoveries. The first was photoconductivity. In 1872, Joseph May and Willoughby Smith discovered that the electrical resistance of certain metals varied according to their exposure to light. And, since everyone knew how to transmit electricity from one place to another, it made sense that images could be transmitted as well. The second discovery was what is called visual persistence. In 1880, the French engineer Maurice LeBlanc pointed out that, because the human eye retains an image for about a tenth of a second, if you wanted to transmit a picture you didn't have to send it all at once. You could scan it, one line at a time, and, as long as you put all those lines back together at the other end within that fraction of a second, the human eye would be fooled into thinking that it was seeing a complete picture.
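
LeBlanc's idea is simple enough to model directly. The minimal Python sketch below sends an imaginary picture one scan line at a time, reassembles it at the other end, and checks that the whole pass fits inside roughly a tenth of a second; the frame size and per-line timing are assumptions chosen for illustration, not figures from the historical record.

```python
# Toy model of LeBlanc's scan-and-reassemble idea: transmit an image one
# line at a time, rebuild it at the receiver, and check that the whole
# pass fits inside the ~0.1 s window of visual persistence.
# The frame size and per-line time are illustrative assumptions.

PERSISTENCE_WINDOW = 0.1      # seconds the eye retains an image
LINES_PER_FRAME = 200         # hypothetical number of scan lines
SECONDS_PER_LINE = 0.0004     # hypothetical time to send one line

def transmit(frame):
    """Yield one scan line at a time, as a crude stand-in for the channel."""
    for line in frame:
        yield line

def receive(channel):
    """Reassemble the scan lines, in order, into a complete frame."""
    return [line for line in channel]

frame = [[0] * 100 for _ in range(LINES_PER_FRAME)]   # a blank "picture"
rebuilt = receive(transmit(frame))

total_time = LINES_PER_FRAME * SECONDS_PER_LINE
assert rebuilt == frame
print(f"Scanned {LINES_PER_FRAME} lines in {total_time:.3f} s "
      f"(persistence window: {PERSISTENCE_WINDOW} s) -> "
      f"{'looks like one picture' if total_time <= PERSISTENCE_WINDOW else 'flickers'}")
```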

The hard part was figuring out how to do the scanning. In 1883, the German engineer Paul Nipkow devised an elaborate and ultimately unworkable system using a spinning metal disk. The disk was punctured with a spiral of small holes, and, as it spun, one line of light after another was projected through the holes onto a photocell. In 1908, a British electrical engineer named A. A. Campbell Swinton suggested that it would make more sense to scan images electronically, using a cathode ray. Philo Farnsworth was the first to work out how to do that. His image dissector was a vacuum tube with a lens at one end, a photoelectric plate right in front of the lens to convert the image from light to electricity, and then an "anode finger" to scan the electrical image line by line. After setting up his laboratory, Farnsworth tinkered with his makeshift television camera day and night for months. Finally, on September 7, 1927, he was ready. His wife, Pem, was by his side. His tiny television screen was in front of him. His brother-in-law, Cliff Gardner, was manning the television camera in a room at the other end of the lab. Stashower writes:

Squaring his shoulders, Farnsworth took his place at the controls and flicked a series of switches. A small, bluish patch of light appeared at the end of the receiving tube. Farnsworth lifted his head and began calling out instructions to Gardner in the next room.

"Put in the slide, Cliff," Farnsworth said.

"Okay, it's in," Gardner answered. "Can you see it?"

A faint but unmistakable line appeared across the receiving end of the tube. As Farnsworth made some adjustments, the line became more distinct.

"Turn the slide a quarter turn, Cliff," Farnsworth called. Seconds later, the line on the receiving tube rotated ninety degrees. Farnsworth looked up from the tube. "That's it, folks," he announced with a tremor in his voice. "We've done it--there you have electronic television."

Both Stashower and Schwartz talk about how much meaning Farnsworth attached to this moment. He was a romantic, and in the romance of invention the creative process consists of two discrete, euphoric episodes, linked by long years of grit and hard work. First is the magic moment of conception: Farnsworth in the potato field. Second is the moment of execution: the day in the lab. If you had the first of those moments and not the second, you were a visionary. But if you had both you were in a wholly different category. Farnsworth must have known the story of King Gillette, the bottle-cap salesman, who woke up one morning in the summer of 1895 to find his razor dull. Gillette had a sudden vision: if all he wanted was a sharp edge, then why should he have to refashion the whole razor? Gillette later recalled:

As I stood there with the razor in my hand, my eyes resting on it as lightly as a bird settling down on its nest, the Gillette razor was born--more with the rapidity of a dream than by a process of reasoning. In a moment I saw it all: the way the blade could be held in a holder; the idea of sharpening the two opposite edges on the thin piece of steel; the clamping plates for the blade, with a handle half-way between the two edges of the blade...I stood there before the mirror in a trance of joy. My wife was visiting Ohio and I hurriedly wrote to her: "I've got it! Our fortune is made!"

If you had the vision and you made the vision work, then the invention was yours--that was what Farnsworth believed. It belonged to you, just as the safety razor belonged to King Gillette.

But this was Farnsworth's mistake, because television wasn't at all like the safety razor. It didn't belong to one person. May and Smith stumbled across photoconductivity, and inspired LeBlanc, who, in turn, inspired Swinton, and Swinton's idea inspired inventors around the world. Then there was Zworykin, of course, and his mentor Boris Rosing, and the team of Max Dieckmann and Rudolf Hell, in Germany, who tried to patent something in the mid-twenties that was virtually identical to the image dissector. In 1931, when Zworykin perfected his own version of the television camera, called the Iconoscope, RCA did a worldwide patent search and found very similar patent applications from a Hungarian named Kolomon Tihany, a Canadian named François Henrouteau, a Japanese inventor named Kenjiro Takayanagi, two Englishmen, and a Russian. Everyone was working on television and everyone was reading everyone else's patent applications, and, because television was such a complex technology, nearly everyone had something new to add. Farnsworth came up with the first camera. Zworykin had the best early picture tube. And when Zworykin finally came up with his own camera it was not as good as Farnsworth's camera in some respects, but it was better in others. In September of 1939, when RCA finally licensed the rights to Farnsworth's essential patents, it didn't replace the Iconoscope with Farnsworth's image dissector. It took the best parts of both.

It is instructive to compare the early history of television with the development, some seventy-five years earlier, of the sewing machine. As the historian Grace Rogers Cooper points out, a sewing machine is really six different mechanisms in one--a means of supporting the cloth, a needle and a combining device to form the stitch, a feeding mechanism to allow one stitch to follow another, a means of insuring the even delivery of thread, and a governing mechanism to insure that each of the previous five steps is performed in sequence. Cooper writes in her book "The Sewing Machine":

Weisenthal had added a point to the eye-end of the needle. Saint supported the fabric by placing it in a horizontal position with a needle entering vertically, Duncan successfully completed a chainstitch for embroidery purposes, Chapman used a needle with an eye at its point and did not pass it completely through the fabric, Krems stitched circular caps with an eye-pointed needle used with a hook to form a chainstitch, Thimmonier used the hooked needle to form a chainstitch on a fabric laid horizontally, and Hunt created a new stitch that was more readily adapted to sewing by machine than the hand stitches had been.

The man generally credited with combining and perfecting these elements is Elias Howe, a machinist from Boston. But even Howe's patents were quickly superseded by a new round of patents, each taking one of the principles of his design and either augmenting it or replacing it. The result was legal and commercial gridlock, broken only when, in 1856, Howe and three of the leading sewing-machine manufacturers (among them Isaac Merritt Singer, who gave the world the sewing-machine foot pedal) agreed to pool their patents and form a trust. It was then that the sewing-machine business took off. For the sewing machine to succeed, in other words, those who saw themselves as sewing-machine inventors had to swallow their pride and concede that the machine was larger than they were--that groups, not individuals, invent complex technologies. That was what Farnsworth could not do, and it explains the terrible turn that his life took.

3.

David Sarnoff's RCA had a very strict policy on patents. If you worked for RCA and you invented something patentable, it belonged to RCA. Your name was on the patent, and you got credit for your work. But you had to sign over your rights for one dollar. In "The Last Lone Inventor," Schwartz tells the story of an RCA engineer who thought the system was so absurd that he would paste his one-dollar checks to the wall of his office--until the accounting department, upset with the unresolved balance on its books, steamed them off and forced him to cash them. At the same time, Sarnoff was a patient and generous benefactor. When Zworykin and Sarnoff discussed television for the first time, in 1929, Zworykin promised the RCA chief that he would create a working system in two years, at a cost of a hundred thousand dollars. In fact, it took more than ten years and fifty million dollars, and through all those years--which just happened to coincide with the Depression--Sarnoff's support never wavered. Sarnoff "hired the best engineers out of the best universities," Schwartz writes. "He paid them competitive salaries, provided them with ample research budgets, and offered them a chance to join his crusade to change the world, working in the most dynamic industry the world had ever seen." What Sarnoff presented was a compromise. In exchange for control over the fruits of invention, he gave his engineers the freedom to invent.

Farnsworth didn't want to relinquish that control. Both RCA and General Electric offered him a chance to work on television in their laboratories. He turned them both down. He wanted to go it alone. This was the practical consequence of his conviction that television was his, and it was, in retrospect, a grievous error. It meant that Farnsworth was forced to work in a state of chronic insecurity. He never had enough money. He feuded constantly with his major investor, a man named Jesse McCargar, who didn't have the resources to play the television game. At the time of what should have been one of Farnsworth's greatest triumphs--the granting of his principal patents--McCargar showed up at the lab complaining about costs, and made Farnsworth fire his three star engineers. When, in 1928, the Green Street building burned down, a panicked Farnsworth didn't know whether or not his laboratory was insured. It was, as it happened, but a second laboratory, in Maine, wasn't, and when it burned down, years later, he lost everything. Twice, he testified before Congress. The first time, he rambled off on a tangent about transmission bandwidth which left people scratching their heads. The second time, he passed up a perfect opportunity to register his complaints about RCA, and launched, instead, into a sentimental account of his humble origins. He simply did not understand how to play politics, just as he did not understand how to raise money or run a business or organize his life. All he really knew how to do was invent, which was something that, as a solo operator, he too seldom had time for.

This is the reason that so many of us work for big companies, of course: in a big company, there is always someone to do what we do not want to do or do not do well--someone to answer the phone, and set up our computer, and arrange our health insurance, and clean our office at night, and make sure the building is insured. In a famous 1937 essay, "The Nature of the Firm," the economist Ronald Coase said that the reason we have corporations is to reduce the everyday transaction costs of doing business: a company puts an accountant on the staff so that if a staffer needs to check the books all he has to do is walk down the hall. It's an obvious point, but one that is consistently overlooked, particularly by those who periodically rail, in the name of efficiency, against corporate bloat and superfluous middle managers. Yes, the middle manager does not always contribute directly to the bottom line. But he does contribute to those who contribute to the bottom line, and only an absurdly truncated account of human productivity--one that assumes real work to be somehow possible when phones are ringing, computers are crashing, and health insurance is expiring--does not see that secondary contribution as valuable.

In April, 1931, Sarnoff showed up at the Green Street laboratory to review Farnsworth's work. This was, by any measure, an extraordinary event. Farnsworth was twenty-four, and working out of a ramshackle building. Sarnoff was one of the leading industrialists of his day. It was as if Bill Gates were to get in his private jet and visit a software startup in a garage across the country. But Farnsworth wasn't there. He was in New York, trapped there by a court order resulting from a frivolous lawsuit filed by a shady would-be investor. Stashower calls this one of the great missed opportunities of Farnsworth's career, because he almost certainly would have awed Sarnoff with his passion and brilliance, winning a lucrative licensing deal. Instead, an unimpressed Sarnoff made a token offer of a hundred thousand dollars for Farnsworth's patents, and Farnsworth dismissed the offer out of hand. This, too, is a reason that inventors ought to work for big corporations: big corporations have legal departments to protect their employees against being kept away from their laboratories by frivolous lawsuits. A genius is a terrible thing to waste.

4.

In 1939, at the World's Fair in New York City, David Sarnoff set up a nine-thousand-square-foot pavilion to showcase the new technology of television. The pavilion, shaped like a giant radio tube, was covered with RCA logos, and stood next to the Perisphere Theatre, the centerpiece of the fairgrounds. On opening day, thirty thousand people gathered to hear from President Roosevelt and Albert Einstein. The gala was televised by RCA, beamed across the New York City area from the top of the Empire State Building. As it happened, Farnsworth was in New York City that day, and he caught the opening ceremonies on a television in a department-store window. He saw Sarnoff introducing both Roosevelt and Einstein, and effectively claiming this wondrous new technology as his own. "Farnsworth's entire existence seemed to be annulled in this moment," Schwartz writes:

The dreams of a farm boy, the eureka moment in a potato field, the confession to a teacher, the confidence in him shown by businessmen and bankers and investors, the breakthroughs in the laboratory, all the years of work, the decisions of the official patent examiners, those hard-fought victories, all of those demonstrations that had come and gone, the entire vision of the future. All of it was being negated by Sarnoff's performance at the World's Fair. Would the public ever know the truth?... The agony of it set off sharp pains in his stomach.

Finally, later that summer, RCA settled with Farnsworth. It agreed to pay him a million dollars for the rights to his main patents, plus royalties on every television set sold. But it was too late. Something had died in him. "It's come to the point of choosing whether I want to be a drunk or go crazy," he told his wife. One doctor prescribed chloral hydrate, which destroyed his appetite and left him dangerously thin. Another doctor prescribed cigarettes, to soothe his nerves. A third prescribed uppers. He became addicted to the painkiller Pantopon. He committed himself to a sanitarium in Massachusetts, where he was given a course of shock therapy. After the war, his brother died in a plane crash. His patents expired, drying up his chief source of income. His company, unable to compete with RCA, was forced out of the television business. He convinced himself that he could unlock the secrets of nuclear fusion, and launched another private research project, mortgaging his home, selling his stock, and cashing in his life insurance to fund the project. But nothing came of it. He died in 1971--addicted to alcohol, deeply depressed, and all but forgotten. He was sixty-four.

In "Tube," a history of television, David E. Fisher and Marshall Jon Fisher point out that Farnsworth was not the only television pioneer to die in misery. So did two others--John Logie Baird and Charles Francis Jenkins--who had tried and failed to produce mechanical television. This should not come as a surprise. The creative enterprise is a hazardous journey, and those who venture on it alone do so at their peril. Baird and Jenkins and Farnsworth risked their psychological and financial well-being on the romantic notion of the solitary inventor, and when that idea failed them what resources did they have left? Zworykin had his share of setbacks as well. He took on Farnsworth in court, and lost. He promised television in two years for a hundred thousand dollars and he came in eight years and fifty million dollars over budget. But he ended his life a prosperous and contented man, lauded and laurelled with awards and honorary degrees. He had the cocoon of RCA to protect him: a desk and a paycheck and a pension and a secretary and a boss with the means to rewrite history in his favor. This is perhaps a more important reason that we have companies--or, for that matter, that we have universities and tenure. Institutions are not just the best environment for success; they are also the safest environment for failure--and, much of the time, failure is what lies in store for innovators and visionaries. Philo Farnsworth should have gone to work for RCA. He would still have been the father of television, and he might have died a happy man.

The Naked Face

August 5, 2002
ANNALS OF PSYCHOLOGY

Can you read people's thoughts just by looking at them?

1.

Some years ago, John Yarbrough was working patrol for the Los Angeles County Sheriff's Department. It was about two in the morning. He and his partner were in the Willowbrook section of South Central Los Angeles, and they pulled over a sports car. "Dark, nighttime, average stop," Yarbrough recalls. "Patrol for me was like going hunting. At that time of night in the area I was working, there was a lot of criminal activity, and hardly anyone had a driver's license. Almost everyone had something intoxicating in the car. We stopped drunk drivers all the time. You're hunting for guns or lots of dope, or suspects wanted for major things. You look at someone and you get an instinctive reaction. And the longer you've been working the stronger that instinctive reaction is."

Yarbrough was driving, and in a two-man patrol car the procedure is for the driver to make the approach and the officer on the passenger side to provide backup. He opened the door and stepped out onto the street, walking toward the vehicle with his weapon drawn. Suddenly, a man jumped out of the passenger side and pointed a gun directly at him. The two of them froze, separated by no more than a few yards. "There was a tree behind him, to his right," Yarbrough recalls. "He was about seventeen. He had the gun in his right hand. He was on the curb side. I was on the other side, facing him. It was just a matter of who was going to shoot first. I remember it clear as day. But for some reason I didn't shoot him." Yarbrough is an ex-marine with close-cropped graying hair and a small mustache, and he speaks in measured tones. "Is he a danger? Sure. He's standing there with a gun, and what person in his right mind does that facing a uniformed armed policeman? If you looked at it logically, I should have shot him. But logic had nothing to do with it. Something just didn't feel right. It was a gut reaction not to shoot-- a hunch that at that exact moment he was not an imminent threat to me." So Yarbrough stopped, and, sure enough, so did the kid. He pointed a gun at an armed policeman on a dark street in South Central L.A., and then backed down.

Yarbrough retired last year from the sheriff's department after almost thirty years, sixteen of which were in homicide. He now lives in western Arizona, in a small, immaculate house overlooking the Colorado River, with pictures of John Wayne, Charles Bronson, Clint Eastwood, and Dale Earnhardt on the wall. He has a policeman's watchfulness: while he listens to you, his eyes alight on your face, and then they follow your hands, if you move them, and the areas to your immediate left and right-- and then back again, in a steady cycle. He grew up in an affluent household in the San Fernando Valley, the son of two doctors, and he is intensely analytical: he is the sort to take a problem and break it down, working it over slowly and patiently in his mind, and the incident in Willowbrook is one of those problems. Policemen shoot people who point guns directly at them at two in the morning. But something he saw held him back, something that ninety-nine people out of a hundred wouldn't have seen.

Many years later, Yarbrough met with a team of psychologists who were conducting training sessions for law enforcement. They sat beside him in a darkened room and showed him a series of videotapes of people who were either lying or telling the truth. He had to say who was doing what. One tape showed people talking about their views on the death penalty and on smoking in public. Another featured a series of nurses who were all talking about a nature film they were supposedly watching, even though some of them were actually watching grisly documentary footage about burn victims and amputees. It may sound as if the tests should have been easy, because we all think we can tell whether someone is lying. But these were not the obvious fibs of a child, or the prevarications of people whose habits and tendencies we know well. These were strangers who were motivated to deceive, and the task of spotting the liars turns out to be fantastically difficult. There is just too much information--words, intonation, gestures, eyes, mouth--and it is impossible to know how the various cues should be weighted, or how to put them all together, and in any case it's all happening so quickly that you can't even follow what you think you ought to follow. The tests have been given to policemen, customs officers, judges, trial lawyers, and psychotherapists, as well as to officers from the F.B.I., the C.I.A., the D.E.A., and the Bureau of Alcohol, Tobacco, and Firearms-- people one would have thought would be good at spotting lies. On average, they score fifty per cent, which is to say that they would have done just as well if they hadn't watched the tapes at all and just guessed. But every now and again-- roughly one time in a thousand--someone scores off the charts. A Texas Ranger named David Maxwell did extremely well, for example, as did an ex-A.T.F. agent named J.J. Newberry, a few therapists, an arbitrator, a vice cop-- and John Yarbrough, which suggests that what happened in Willowbrook may have been more than a fluke or a lucky guess. Something in our faces signals whether we're going to shoot, say, or whether we're lying about the film we just saw. Most of us aren't very good at spotting it. But a handful of people are virtuosos. What do they see that we miss?

2.

All of us, a thousand times a day, read faces. When someone says "I love you," we look into that person's eyes to judge his or her sincerity. When we meet someone new, we often pick up on subtle signals, so that, even though he or she may have talked in a normal and friendly manner, afterward we say, "I don't think he liked me," or "I don't think she's very happy." We easily parse complex distinctions in facial expression. If you saw me grinning, for example, with my eyes twinkling, you'd say I was amused. But that's not the only way we interpret a smile. If you saw me nod and smile exaggeratedly, with the corners of my lips tightened, you would take it that I had been teased and was responding sarcastically. If I made eye contact with someone, gave a small smile and then looked down and averted my gaze, you would think I was flirting. If I followed a remark with an abrupt smile and then nodded, or tilted my head sideways, you might conclude that I had just said something a little harsh, and wanted to take the edge off it. You wouldn't need to hear anything I was saying in order to reach these conclusions. The face is such an extraordinarily efficient instrument of communication that there must be rules that govern the way we interpret facial expressions. But what are those rules? And are they the same for everyone?

In the nineteen-sixties, a young San Francisco psychologist named Paul Ekman began to study facial expression, and he discovered that no one knew the answers to those questions. Ekman went to see Margaret Mead, climbing the stairs to her tower office at the American Museum of Natural History. He had an idea. What if he travelled around the world to find out whether people from different cultures agreed on the meaning of different facial expressions? Mead, he recalls, "looked at me as if I were crazy." Like most social scientists of her day, she believed that expression was culturally determined-- that we simply used our faces according to a set of learned social conventions. Charles Darwin had discussed the face in his later writings; in his 1872 book, "The Expression of the Emotions in Man and Animals," he argued that all mammals show emotion reliably in their faces. But in the nineteen-sixties academic psychologists were more interested in motivation and cognition than in emotion or its expression. Ekman was undaunted; he began travelling to places like Japan, Brazil, and Argentina, carrying photographs of men and women making a variety of distinctive faces. Everywhere he went, people agreed on what those expressions meant. But what if people in the developed world had all picked up the same cultural rules from watching the same movies and television shows? So Ekman set out again, this time making his way through the jungles of Papua New Guinea, to the most remote villages, and he found that the tribesmen there had no problem interpreting the expressions, either. This may not sound like much of a breakthrough. But in the scientific climate of the time it was a revelation. Ekman had established that expressions were the universal products of evolution. There were fundamental lessons to be learned from the face, if you knew where to look.

Paul Ekman is now in his sixties. He is clean-shaven, with closely set eyes and thick, prominent eyebrows, and although he is of medium build, he seems much larger than he is: there is something stubborn and substantial in his demeanor. He grew up in Newark, the son of a pediatrician, and entered the University of Chicago at fifteen. He speaks deliberately: before he laughs, he pauses slightly, as if waiting for permission. He is the sort to make lists, and number his arguments. His academic writing has an orderly logic to it; by the end of an Ekman essay, each stray objection and problem has been gathered up and catalogued. In the mid-sixties, Ekman set up a lab in a ramshackle Victorian house at the University of California at San Francisco, where he holds a professorship. If the face was part of a physiological system, he reasoned, the system could be learned. He set out to teach himself. He treated the face as an adventurer would a foreign land, exploring its every crevice and contour. He assembled a videotape library of people's facial expressions, which soon filled three rooms in his lab, and studied them to the point where he could look at a face and pick up a flicker of emotion that might last no more than a fraction of a second. Ekman created the lying tests. He filmed the nurses talking about the movie they were watching and the movie they weren't watching. Working with Maureen O'Sullivan, a psychologist from the University of San Francisco, and other colleagues, he located people who had a reputation for being uncannily perceptive, and put them to the test, and that's how Yarbrough and the other high-scorers were identified. O'Sullivan and Ekman call this study of gifted face readers the Diogenes Project, after the Greek philosopher of antiquity who used to wander around Athens with a lantern, peering into people's faces as he searched for an honest man. Ekman has taken the most vaporous of sensations-- the hunch you have about someone else-- and sought to give them definition. Most of us don't trust our hunches, because we don't know where they came from. We think they can't be explained. But what if they can?

3.

Paul Ekman got his start in the face-reading business because of a man named Silvan Tomkins, and Silvan Tomkins may have been the best face reader there ever was. Tomkins was from Philadelphia, the son of a dentist from Russia. He was short, and slightly thick around the middle, with a wild mane of white hair and huge black plastic-rimmed glasses. He taught psychology at Princeton and Rutgers, and was the author of "Affect, Imagery, Consciousness," a four-volume work so dense that its readers were evenly divided between those who understood it and thought it was brilliant and those who did not understand it and thought it was brilliant. He was a legendary talker. At the end of a cocktail party, fifteen people would sit, rapt, at Tomkins's feet, and someone would say, "One more question!" and they would all sit there for another hour and a half, as Tomkins held forth on, say, comic books, a television sitcom, the biology of emotion, his problem with Kant, and his enthusiasm for the latest fad diets, all enfolded into one extended riff. During the Depression, in the midst of his doctoral studies at Harvard, he worked as a handicapper for a horse-racing syndicate, and was so successful that he lived lavishly on Manhattan's Upper East Side. At the track, where he sat in the stands for hours, staring at the horses through binoculars, he was known as the Professor. "He had a system for predicting how a horse would do based on what horse was on either side of him, based on their emotional relationship," Ekman said. If a male horse, for instance, had lost to a mare in his first or second year, he would be ruined if he went to the gate with a mare next to him in the lineup. (Or something like that-- no one really knew for certain.) Tomkins felt that emotion was the code to life, and that with enough attention to particulars the code could be cracked. He thought this about the horses, and, more important, he thought this about the human face.

Tomkins, it was said, could walk into a post office, go over to the "Wanted" posters, and, just by looking at mug shots, tell you what crimes the various fugitives had committed. "He would watch the show 'To Tell the Truth,' and without fault he could always pick the person who was lying and who his confederates were," his son, Mark, recalls. "He actually wrote the producer at one point to say it was too easy, and the man invited him to come to New York, go backstage, and show his stuff." Virginia Demos, who teaches psychology at Harvard, recalls having long conversations with Tomkins. "We would sit and talk on the phone, and he would turn the sound down as Jesse Jackson was talking to Michael Dukakis, at the Democratic National Convention. And he would read the faces and give his predictions on what would happen. It was profound."

Ekman's most memorable encounter with Tomkins took place in the late sixties. Ekman had just tracked down a hundred thousand feet of film that had been shot by the virologist Carleton Gajdusek in the remote jungles of Papua New Guinea. Some of the footage was of a tribe called the South Fore, who were a peaceful and friendly people. The rest was of the Kukukuku, who were hostile and murderous and who had a homosexual ritual where pre-adolescent boys were required to serve as courtesans for the male elders of the tribe. Ekman was still working on the problem of whether human facial expressions were universal, and the Gajdusek film was invaluable. For six months, Ekman and his collaborator, Wallace Friesen, sorted through the footage. They cut extraneous scenes, focussing just on closeups of the faces of the tribesmen, and when the editing was finished Ekman called in Tomkins.

The two men, protégé and mentor, sat at the back of the room, as faces flickered across the screen. Ekman had told Tomkins nothing about the tribes involved; all identifying context had been edited out. Tomkins looked on intently, peering through his glasses. At the end, he went up to the screen and pointed to the faces of the South Fore. "These are a sweet, gentle people, very indulgent, very peaceful," he said. Then he pointed to the faces of the Kukukuku. "This other group is violent, and there is lots of evidence to suggest homosexuality." Even today, a third of a century later, Ekman cannot get over what Tomkins did. "My God! I vividly remember saying, 'Silvan, how on earth are you doing that?'" Ekman recalls. "And he went up to the screen and, while we played the film backward, in slow motion, he pointed out the particular bulges and wrinkles in the face that he was using to make his judgment. That's when I realized, 'I've got to unpack the face.' It was a gold mine of information that everyone had ignored. This guy could see it, and if he could see it, maybe everyone else could, too."

Ekman and Friesen decided that they needed to create a taxonomy of facial expressions, so day after day they sat across from each other and began to make every conceivable face they could. Soon, though, they realized that their efforts weren't enough. "I met an anthropologist, Wade Seaford, told him what I was doing, and he said, 'Do you have this movement?'" --and here Ekman contracted what's called the triangularis, which is the muscle that depresses the corners of the lips, forming an arc of distaste-- "and it wasn't in my system, because I had never seen it before. I had built a system not on what the face can do but on what I had seen. I was devastated. So I came back and said, 'I've got to learn the anatomy.' " Friesen and Ekman then combed through medical textbooks that outlined each of the facial muscles, and identified every distinct muscular movement that the face could make. There were forty-three such movements. Ekman and Friesen called them "action units." Then they sat across from each other again, and began manipulating each action unit in turn, first locating the muscle in their mind and then concentrating on isolating it, watching each other closely as they did, checking their movements in a mirror, making notes of how the wrinkle patterns on their faces would change with each muscle movement, and videotaping the movement for their records. On the few occasions when they couldn't make a particular movement, they went next door to the U.C.S.F. anatomy department, where a surgeon they knew would stick them with a needle and electrically stimulate the recalcitrant muscle. "That wasn't pleasant at all," Ekman recalls. When each of those action units had been mastered, Ekman and Friesen began working action units in combination, layering one movement on top of another. The entire process took seven years. "There are three hundred combinations of two muscles," Ekman says. "If you add in a third, you get over four thousand. We took it up to five muscles, which is over ten thousand visible facial configurations." Most of those ten thousand facial expressions don't mean anything, of course. They are the kind of nonsense faces that children make. But, by working through each action-unit combination, Ekman and Friesen identified about three thousand that did seem to mean something, until they had catalogued the essential repertoire of human emotion.

4.

On a recent afternoon, Ekman sat in his office at U.C.S.F., in what is known as the Human Interaction Laboratory, a standard academic's lair of books and files, with photographs of his two heroes, Tomkins and Darwin, on the wall. He leaned forward slightly, placing his hands on his knees, and began running through the action-unit configurations he had learned so long ago. "Everybody can do action unit four," he began. He lowered his brow, using his depressor glabellae, depressor supercilii, and corrugator. "Almost everyone can do A.U. nine." He wrinkled his nose, using his levator labii superioris, alaeque nasi. "Everybody can do five." He contracted his levator palpebrae superioris, raising his upper eyelid.

I was trying to follow along with him, and he looked up at me. "You've got a very good five," he said generously. "The more deeply set your eyes are, the harder it is to see the five. Then there's seven." He squinted. "Twelve." He flashed a smile, activating the zygomatic major. The inner parts of his eyebrows shot up. "That's A.U. one--distress, anguish." Then he used his frontalis, pars lateralis, to raise the outer half of his eyebrows. "That's A.U. two. It's also very hard, but it's worthless. It's not part of anything except Kabuki theatre. Twenty-three is one of my favorites. It's the narrowing of the red margin of the lips. Very reliable anger sign. It's very hard to do voluntarily." He narrowed his lips. "Moving one ear at a time is still the hardest thing to do. I have to really concentrate. It takes everything I've got." He laughed. "This is something my daughter always wanted me to do for her friends. Here we go." He wiggled his left ear, then his right ear. Ekman does not appear to have a particularly expressive face. He has the demeanor of a psychoanalyst, watchful and impassive, and his ability to transform his face so easily and quickly was astonishing. "There is one I can't do," he went on. "It's A.U. thirty-nine. Fortunately, one of my postdocs can do it. A.U. thirty-eight is dilating the nostrils. Thirty-nine is the opposite. It's the muscle that pulls them down." He shook his head and looked at me again. "Oooh! You've got a fantastic thirty-nine. That's one of the best I've ever seen. It's genetic. There should be other members of your family who have this heretofore unknown talent. You've got it, you've got it." He laughed again. "You're in a position to flash it at people. See, you should try that in a singles bar!"

Ekman then began to layer one action unit on top of another, in order to compose the more complicated facial expressions that we generally recognize as emotions. Happiness, for instance, is essentially A.U. six and twelve--contracting the muscles that raise the cheek (orbicularis oculi, pars orbitalis) in combination with the zygomatic major, which pulls up the corners of the lips. Fear is A.U. one, two and four, or, more fully, one, two, four, five, and twenty, with or without action units twenty-five, twenty-six, or twenty-seven. That is: the inner brow raiser (frontalis, pars medialis) plus the outer brow raiser (frontalis, pars lateralis) plus the brow-lowering depressor supercilii plus the levator palpebrae superioris (which raises the upper lid), plus the risorius (which stretches the lips), the parting of the lips (depressor labii), and the masseter (which drops the jaw). Disgust? That's mostly A.U. nine, the wrinkling of the nose (levator labii superioris, alaeque nasi), but it can sometimes be ten, and in either case may be combined with A.U. fifteen or sixteen or seventeen.
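
Those prototypes amount to a small lookup table, and a minimal Python sketch makes the structure plain. The action-unit numbers below are the ones Ekman gives above; the dictionary layout, the function, and the subset-matching rule are illustrative simplifications, not the actual FACS scoring procedure.

```python
# A toy encoding of the emotion prototypes Ekman describes above.
# The action-unit sets follow the article; the matching rule (a simple
# subset test against a "core" set) is an illustrative simplification,
# not the real FACS scoring procedure.

EMOTION_PROTOTYPES = {
    # emotion: (core AUs that must be present, optional AUs)
    "happiness": ({6, 12}, set()),
    "fear":      ({1, 2, 4, 5, 20}, {25, 26, 27}),
    "disgust":   ({9}, {10, 15, 16, 17}),   # article also allows 10 in place of 9
}

def match_emotion(observed_aus):
    """Return emotions whose core action units all appear in the observed set."""
    observed = set(observed_aus)
    return [name for name, (core, _optional) in EMOTION_PROTOTYPES.items()
            if core <= observed]

if __name__ == "__main__":
    # AUs 6 and 12 together are the combination the article calls happiness.
    print(match_emotion([6, 12]))                # ['happiness']
    print(match_emotion([1, 2, 4, 5, 20, 26]))   # ['fear']
```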

Ekman and Friesen ultimately assembled all these combinations--and the rules for reading and interpreting them--into the Facial Action Coding System, or FACS, and wrote them up in a five-hundred-page binder. It is a strangely riveting document, full of details like the possible movements of the lips (elongate, de-elongate, narrow, widen, flatten, protrude, tighten and stretch); the four different changes of the skin between the eyes and the cheeks (bulges, bags, pouches, and lines); or the critical distinctions between infraorbital furrows and the nasolabial furrow. Researchers have employed the system to study everything from schizophrenia to heart disease; it has even been put to use by computer animators at Pixar ("Toy Story") and at DreamWorks ("Shrek"). FACS takes weeks to master in its entirety, and only five hundred people around the world have been certified to use it in research. But for those who have, the experience of looking at others is forever changed. They learn to read the face the way that people like John Yarbrough did intuitively. Ekman compares it to the way you start to hear a symphony once you've been trained to read music: an experience that used to wash over you becomes particularized and nuanced.

Ekman recalls the first time he saw Bill Clinton, during the 1992 Democratic primaries. "I was watching his facial expressions, and I said to my wife, 'This is Peck's Bad Boy,' " Ekman says. "This is a guy who wants to be caught with his hand in the cookie jar, and have us love him for it anyway. There was this expression that's one of his favorites. It's that hand-in-the-cookie-jar, love-me-Mommy-because-I'm-a-rascal look. It's A.U. twelve, fifteen, seventeen, and twenty-four, with an eye roll." Ekman paused, then reconstructed that particular sequence of expressions on his face. He contracted his zygomatic major, A.U. twelve, in a classic smile, then tugged the corners of his lips down with his triangularis, A.U. fifteen. He flexed the mentalis, A.U. seventeen, which raises the chin, slightly pressed his lips together in A.U. twenty-four, and finally rolled his eyes--and it was as if Slick Willie himself were suddenly in the room. "I knew someone who was on his communications staff. So I contacted him. I said, 'Look, Clinton's got this way of rolling his eyes along with a certain expression, and what it conveys is "I'm a bad boy." I don't think it's a good thing. I could teach him how not to do that in two to three hours.' And he said, 'Well, we can't take the risk that he's known to be seeing an expert on lying.' I think it's a great tragedy, because . . ." Ekman's voice trailed off. It was clear that he rather liked Clinton, and that he wanted Clinton's trademark expression to have been no more than a meaningless facial tic. Ekman shrugged. "Unfortunately, I guess, he needed to get caught--and he got caught."

5.

Early in his career, Paul Ekman filmed forty psychiatric patients, including a woman named Mary, a forty-two-year-old housewife. She had attempted suicide three times, and survived the last attempt--an overdose of pills--only because someone found her in time and rushed her to the hospital. Her children had left home and her husband was inattentive, and she was depressed. When she first went to the hospital, she simply sat and cried, but she seemed to respond well to therapy. After three weeks, she told her doctor that she was feeling much better and wanted a weekend pass to see her family. The doctor agreed, but just before Mary was to leave the hospital she confessed that the real reason she wanted to go on weekend leave was so that she could make another suicide attempt. Several years later, a group of young psychiatrists asked Ekman how they could tell when suicidal patients were lying. He didn't know, but, remembering Mary, he decided to try to find out. If the face really was a reliable guide to emotion, shouldn't he be able to look back on the film and tell that she was lying? Ekman and Friesen began to analyze the film for clues. They played it over and over for dozens of hours, examining in slow motion every gesture and expression. Finally, they saw it. As Mary's doctor asked her about her plans for the future, a look of utter despair flashed across her face so quickly that it was almost imperceptible.

Ekman calls that kind of fleeting look a "microexpression," and one cannot understand why John Yarbrough did what he did on that night in South Central without also understanding the particular role and significance of microexpressions. Many facial expressions can be made voluntarily. If I'm trying to look stern as I give you a tongue-lashing, I'll have no difficulty doing so, and you'll have no difficulty interpreting my glare. But our faces are also governed by a separate, involuntary system. We know this because stroke victims who suffer damage to what is known as the pyramidal neural system will laugh at a joke, but they cannot smile if you ask them to. At the same time, patients with damage to another part of the brain have the opposite problem. They can smile on demand, but if you tell them a joke they can't laugh. Similarly, few of us can voluntarily do A.U. one, the sadness sign. (A notable exception, Ekman points out, is Woody Allen, who uses his frontalis, pars medialis, to create his trademark look of comic distress.) Yet we raise our inner eyebrows all the time, without thinking, when we are unhappy. Watch a baby just as he or she starts to cry, and you'll often see the frontalis, pars medialis, shoot up, as if it were on a string.

Perhaps the most famous involuntary expression is what Ekman has dubbed the Duchenne smile, in honor of the nineteenth-century French neurologist Guillaume Duchenne, who first attempted to document the workings of the muscles of the face with the camera. If I ask you to smile, you'll flex your zygomatic major. By contrast, if you smile spontaneously, in the presence of genuine emotion, you'll not only flex your zygomatic but also tighten the orbicularis oculi, pars orbitalis, which is the muscle that encircles the eye. It is almost impossible to tighten the orbicularis oculi, pars lateralis, on demand, and it is equally difficult to stop it from tightening when we smile at something genuinely pleasurable. This kind of smile "does not obey the will," Duchenne wrote. "Its absence unmasks the false friend." When we experience a basic emotion, a corresponding message is automatically sent to the muscles of the face. That message may linger on the face for just a fraction of a second, or be detectable only if you attached electrical sensors to the face, but it's always there. Silvan Tomkins once began a lecture by bellowing, "The face is like the penis!" and this is what he meant--that the face has, to a large extent, a mind of its own. This doesn't mean we have no control over our faces. We can use our voluntary muscular system to try to suppress those involuntary responses. But, often, some little part of that suppressed emotion--the sense that I'm really unhappy, even though I deny it--leaks out. Our voluntary expressive system is the way we intentionally signal our emotions. But our involuntary expressive system is in many ways even more important: it is the way we have been equipped by evolution to signal our authentic feelings.

"You must have had the experience where somebody comments on your expression and you didn't know you were making it,"Ekman says. "Somebody tells you, "What are you getting upset about?' "Why are you smirking?' You can hear your voice, but you can't see your face. If we knew what was on our face, we would be better at concealing it. But that wouldn't necessarily be a good thing. Imagine if there were a switch that all of us had, to turn off the expressions on our face at will. If babies had that switch, we wouldn't know what they were feeling. They' d be in trouble. You could make an argument, if you wanted to, that the system evolved so that parents would be able to take care of kids. Or imagine if you were married to someone with a switch? It would be impossible. I don't think mating and infatuation and friendships and closeness would occur if our faces didn't work that way."

Ekman slipped a tape taken from the O.J. Simpson trial into the VCR. It was of Kato Kaelin, Simpson's shaggy-haired house guest, being examined by Marcia Clark, one of the prosecutors in the case. Kaelin sits in the witness box, with his trademark vacant look. Clark asks a hostile question. Kaelin leans forward and answers softly. "Did you see that?" Ekman asked me. I saw nothing, just Kato being Kato-- harmless and passive. Ekman stopped the tape, rewound it, and played it back in slow motion. On the screen, Kaelin moved forward to answer the question, and in that fraction of a second his face was utterly transformed. His nose wrinkled, as he flexed his levator labii superioris, alaeque nasi. His teeth were bared, his brows lowered. "It was almost totally A.U. nine," Ekman said. "It's disgust, with anger there as well, and the clue to that is that when your eyebrows go down, typically your eyes are not as open as they are here. The raised upper eyelid is a component of anger, not disgust. It's very quick." Ekman stopped the tape and played it again, peering at the screen. "You know, he looks like a snarling dog."

Ekman said that there was nothing magical about his ability to pick up an emotion that fleeting. It was simply a matter of practice. "I could show you forty examples, and you could pick it up. I have a training tape, and people love it. They start it, and they can't see any of these expressions. Thirty-five minutes later, they can see them all. What that says is that this is an accessible skill."

Ekman showed another clip, this one from a press conference given by Kim Philby in 1955. Philby had not yet been revealed as a Soviet spy, but two of his colleagues, Donald Maclean and Guy Burgess, had just defected to the Soviet Union. Philby is wearing a dark suit and a white shirt. His hair is straight and parted to the left. His face has the hauteur of privilege.

"Mr. Philby," he is asked. "Mr. Macmillan, the foreign secretary, said there was no evidence that you were the so-called third man who allegedly tipped off Burgess and Maclean. Are you satisfied with that clearance that he gave you?"

Philby answers confidently, in the plummy tones of the English upper class. "Yes, I am."

"Well, if there was a third man, were you in fact the third man?"

"No," Philby says, just as forcefully. "I was not."

Ekman rewound the tape, and replayed it in slow motion. "Look at this," he said, pointing to the screen. "Twice, after being asked serious questions about whether he's committed treason, he's going to smirk. He looks like the cat who ate the canary." The expression was too brief to see normally. But at quarter speed it was painted on his face--the lips pressed together in a look of pure smugness. "He's enjoying himself, isn't he?" Ekman went on. "I call this--duping delight--the thrill you get from fooling other people." Ekman started the VCR up again. "There's another thing he does." On the screen, Philby was answering another question. "In the second place, the Burgess-Maclean affair has raised issues of great"--he pauses--"delicacy." Ekman went back to the pause, and froze the tape. "Here it is," he said. "A very subtle microexpression of distress or unhappiness. It's only in the eyebrows--in fact, just in one eyebrow." Sure enough, Philby's right inner eyebrow was raised in an unmistakable A.U. one. "It's very brief," Ekman said. "He's not doing it voluntarily. And it totally contradicts all his confidence and assertiveness. It comes when he's talking about Burgess and Maclean, whom he had tipped off. It's a hot spot that suggests, 'You shouldn't trust what you hear.'"

A decade ago, Ekman joined forces with J. J. Newberry--the ex-A.T.F. agent who is one of the high-scorers in the Diogenes Project-- to put together a program for educating law-enforcement officials around the world in the techniques of interviewing and lie detection. In recent months, they have flown to Washington, D.C., to assist the C.I.A. and the F.B.I. in counter-terrorism training. At the same time, the Defense Advanced Research Projects Agency (DARPA) has asked Ekman and his former student Mark Frank, now at Rutgers, to develop experimental scenarios for studying deception that would be relevant to counter-terrorism. The objective is to teach people to look for discrepancies between what is said and what is signalled--to pick up on the difference between Philby's crisp denials and his fleeting anguish. It's a completely different approach from the shouting cop we see on TV and in the movies, who threatens the suspect and sweeps all of the papers and coffee cups off the battered desk. The Hollywood interrogation is an exercise in intimidation, and its point is to force the suspect to tell you what you need to know. It does not take much to see the limitations of this strategy. It depends for its success on the coöperation of the suspect--when, of course, the suspect's involuntary communication may be just as critical. And it privileges the voice over the face, when the voice and the face are equally significant channels in the same system.

Ekman received his most memorable lesson in this truth when he and Friesen first began working on expressions of anger and distress. "It was weeks before one of us finally admitted feeling terrible after a session where we'd been making one of those faces all day," Friesen says. "Then the other realized that he'd been feeling poorly, too, so we began to keep track." They then went back and began monitoring their bodies during particular facial movements. "Say you do A.U. one, raising the inner eyebrows, and six, raising the cheeks, and fifteen, the lowering of the corner of the lips," Ekman said, and then did all three. "What we discovered is that that expression alone is sufficient to create marked changes in the autonomic nervous system. When this first occurred, we were stunned. We weren't expecting this at all. And it happened to both of us. We felt terrible. What we were generating was sadness, anguish. And when I lower my brows, which is four, and raise the upper eyelid, which is five, and narrow the eyelids, which is seven, and press the lips together, which is twenty-four, I'm generating anger. My heartbeat will go up ten to twelve beats. My hands will get hot. As I do it, I can't disconnect from the system. It's very unpleasant, very unpleasant."
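
The action units Ekman rattles off here are the vocabulary of FACS, and the two combinations he demonstrates can be written down quite literally as data. What follows is a minimal sketch, in Python, of just the units and pairings named in this passage; the unit numbers and muscle actions are taken from Ekman's account above, while the dictionary names and the describe function are illustrative inventions, not anything drawn from Ekman's actual coding tools.

```python
# A minimal sketch of the FACS action units Ekman names above.
# Unit numbers and muscle actions follow his own account; the emotion
# labels follow the two combinations he demonstrates. The names below
# are illustrative -- this is not Ekman's actual scoring software.

ACTION_UNITS = {
    1: "inner brow raiser",
    4: "brow lowerer",
    5: "upper lid raiser",
    6: "cheek raiser",
    7: "lid tightener",
    15: "lip corner depressor",
    24: "lip presser",
}

# The two expressions Ekman demonstrates, coded as sets of action units.
DEMONSTRATIONS = {
    "sadness/anguish": {1, 6, 15},
    "anger": {4, 5, 7, 24},
}

def describe(expression):
    """Spell out an expression as the muscle actions it combines."""
    return " + ".join(f"A.U. {au} ({ACTION_UNITS[au]})" for au in sorted(expression))

if __name__ == "__main__":
    for emotion, units in DEMONSTRATIONS.items():
        print(f"{emotion}: {describe(units)}")
```

Run as a script, it simply prints each demonstrated expression as the sum of the muscle actions that compose it.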

Ekman, Friesen, and another colleague, Robert Levenson, who teaches at Berkeley, published a study of this effect in Science. They monitored the bodily indices of anger, sadness, and fear--heart rate and body temperature--in two groups. The first group was instructed to remember and relive a particularly stressful experience. The other was told to simply produce a series of facial movements, as instructed by Ekman-- to "assume the position," as they say in acting class. The second group, the people who were pretending, showed the same physiological responses as the first. A few years later, a German team of psychologists published a similar study. They had a group of subjects look at cartoons, either while holding a pen between their lips--an action that made it impossible to contract either of the two major smiling muscles, the risorius and the zygomatic major-- or while holding a pen clenched between their teeth, which had the opposite effect and forced them to smile. The people with the pen between their teeth found the cartoons much funnier. Emotion doesn't just go from the inside out. It goes from the outside in. What's more, neither the subjects "assuming the position" nor the people with pens in their teeth knew they were making expressions of emotion. In the facial-feedback system, an expression you do not even know that you have can create an emotion you did not choose to feel.

It is hard to talk to anyone who knows FACS without this point coming up again and again. Face-reading depends not just on seeing facial expressions but also on taking them seriously. One reason most of us--like the TV cop-- do not closely attend to the face is that we view its evidence as secondary, as an adjunct to what we believe to be real emotion. But there's nothing secondary about the face, and surely this realization is what set John Yarbrough apart on the night that the boy in the sports car came at him with a gun. It's not just that he saw a microexpression that the rest of us would have missed. It's that he took what he saw so seriously that he was able to overcome every self-protective instinct in his body, and hold his fire.

6.

Yarbrough has a friend in the L.A. County Sheriff's Department, Sergeant Bob Harms, who works in narcotics in Palmdale. Harms is a member of the Diogenes Project as well, but the two men come across very differently. Harms is bigger than Yarbrough, taller and broader in the chest, with soft brown eyes and dark, thick hair. Yarbrough is restoring a Corvette and wears Rush Limbaugh ties, and he says that if he hadn't been a cop he would have liked to stay in the Marines. Harms came out of college wanting to be a commercial artist; now he plans to open a bed-and-breakfast in Vermont with his wife when he retires. On the day we met, Harms was wearing a pair of jean shorts and a short-sleeved patterned shirt. His badge was hidden inside his shirt. He takes notes not on a yellow legal pad, which he considers unnecessarily intimidating to witnesses, but on a powder-blue one. "I always get teased because I'm the touchy-feely one," Harms said. "John Yarbrough is very analytical. He thinks before he speaks. There is a lot going on inside his head. He's constantly thinking four or five steps ahead, then formulating whatever his answers are going to be. That's not how I do my interviews. I have a conversation. It's not 'Where were you on Friday night?' Because that's the way we normally communicate. I never say, 'I'm Sergeant Harms.' I always start by saying, 'I'm Bob Harms, and I'm here to talk to you about your case,' and the first thing I do is smile."

The sensation of talking to the two men, however, is surprisingly similar. Normal conversation is like a game of tennis: you talk and I listen, you listen and I talk, and we feel scrutinized by our conversational partner only when the ball is in our court. But Yarbrough and Harms never stop watching, even when they're doing the talking. Yarbrough would comment on my conversational style, noting where I held my hands as I talked, or how long I would wait out a lull in the conversation. At one point, he stood up and soundlessly moved to the door-- which he could have seen only in his peripheral vision--opening it just before a visitor rang the doorbell. Harms gave the impression that he was deeply interested in me. It wasn't empathy. It was a kind of powerful curiosity. "I remember once, when I was in prison custody, I used to shake prisoners' hands," Harms said. "The deputies thought I was crazy. But I wanted to see what happened, because that's what these men are starving for, some dignity and respect."

Some of what sets Yarbrough and Harms and the other face readers apart is no doubt innate. But the fact that people can be taught so easily to recognize microexpressions, and can learn FACS, suggests that we all have at least the potential capacity for this kind of perception. Among those who do very well at face-reading, tellingly, are some aphasics, such as stroke victims who have lost the ability to understand language. Collaborating with Ekman on a paper that was recently published in Nature, the psychologist Nancy Etcoff, of Massachusetts General Hospital, described how a group of aphasics trounced a group of undergraduates at M.I.T. on the nurses tape. Robbed of the power to understand speech, the stroke victims had apparently been forced to become far more sensitive to the information written on people's faces. "They are compensating for the loss in one channel through these other channels," Etcoff says. "We could hypothesize that there is some kind of rewiring in the brain, but I don't think we need that explanation. They simply exercise these skills much more than we do." Ekman has also shown that some abused children are particularly good at reading faces: like the aphasics in the study, they developed "interpretive strategies"--in their case, so they could predict the behavior of their volatile parents.

What appears to be a kind of magical, effortless intuition about faces, then, may not really be effortless and magical at all. This kind of intuition is a product of desire and effort. Silvan Tomkins took a sabbatical from Princeton when his son Mark was born, and stayed in his house on the Jersey Shore, staring into his son's face, long and hard, picking up the patterns of emotion--the cycles of interest, joy, sadness, and anger--that flash across an infant's face in the first few months of life. He taught himself the logic of the furrows and the wrinkles and the creases, the subtle differences between the pre-smile and the pre-cry face. Later, he put together a library of thousands of photographs of human faces, in every conceivable expression. He developed something called the Picture Arrangement Test, which was his version of the Rorschach blot: a patient would look at a series of pictures and be asked to arrange them in a sequence and then tell a story based on what he saw. The psychologist was supposed to interpret the meaning of the story, but Tomkins would watch a videotape of the patient with the sound off, and, by studying the expressions on the patient's face, teach himself to predict what the story was. Face-reading, for those who have mastered it, becomes a kind of compulsion; it becomes hard to be satisfied with the level and quality of information that most of us glean from normal social encounters. "Whenever we get together," Harms says of spending time with other face readers, "we debrief each other. We're constantly talking about cases, or some of these videotapes of Ekman's, and we say, 'I missed that, did you get that?' Maybe there's an emotion attached there. We're always trying to place things, and replaying interviews in our head."

This is surely why the majority of us don't do well at reading faces: we feel no need to make that extra effort. People fail at the nurses tape, Ekman says, because they end up just listening to the words. That's why, when Tomkins was starting out in his quest to understand the face, he always watched television with the sound turned off. "We are such creatures of language that what we hear takes precedence over what is supposed to be our primary channel of communication, the visual channel," he once said. "Even though the visual channel provides such enormous information, the fact is that the voice preëmpts the individual's attention, so that he cannot really see the face while he listens." We prefer that way of dealing with the world because it does not challenge the ordinary boundaries of human relationships. Ekman, in one of his essays, writes of what he learned from the legendary sociologist Erving Goffman. Goffman said that part of what it means to be civilized is not to "steal" information that is not freely given to us. When someone picks his nose or cleans his ears, out of unthinking habit, we look away. Ekman writes that for Goffman the spoken word is "the acknowledged information, the information for which the person who states it is willing to take responsibility," and he goes on:

When the secretary who is miserable about a fight with her husband the previous night answers, "Just fine," when her boss asks, "How are you this morning?"--that false message may be the one relevant to the boss's interactions with her. It tells him that she is going to do her job. The true message--that she is miserable--he may not care to know about at all as long as she does not intend to let it impair her job performance.

What would the boss gain by reading the subtle and contradictory microexpressions on his secretary's face? It would be an invasion of her privacy and an act of disrespect. More than that, it would entail an obligation. He would be obliged to do something, or say something, or feel something that might otherwise be avoided entirely. To see what is intended to be hidden, or, at least, what is usually missed, opens up a world of uncomfortable possibilities. This is the hard part of being a face reader. People like that have more faith in their hunches than the rest of us do. But faith is not certainty. Sometimes, on a routine traffic stop late at night, you end up finding out that your hunch was right. But at other times you'll never know. And you can't even explain it properly, because what can you say? You did something the rest of us would never have done, based on something the rest of us would never have seen.

"I was working in West Hollywood once, in the nineteen-eighties," Harms said. "I was with a partner, Scott. I was driving. I had just recently come off the prostitution team, and we spotted a man in drag. He was on Sunset, and I didn't recognize him. At that time, Sunset was normally for females. So it was kind of odd. It was a cold night in January. There was an all-night restaurant on Sunset called Ben Franks, so I asked my partner to roll down the window and ask the guy if he was going to Ben Franks-- just to get a reaction. And the guy immediately keys on Scott, and he's got an overcoat on, and he's all bundled up, and he starts walking over to the car. It had been raining so much that the sewers in West Hollywood had backed up, and one of the manhole covers had been cordoned off because it was pumping out water. The guy comes over to the squad car, and he's walking right through that. He's fixated on Scott. So we asked him what he was doing. He says, "I was out for a walk.' And then he says, "I have something to show you.'"

Later, after the incident was over, Harms and his partner learned that the man had been going around Hollywood making serious threats, that he was unstable and had just attempted suicide, that he was in all likelihood about to erupt. A departmental inquiry into the incident would affirm that Harms and his partner had been in danger: the man was armed with a makeshift flamethrower, and what he had in mind, evidently, was to turn the inside of the squad car into an inferno. But at the time all Harms had was a hunch, a sense from the situation and the man's behavior and what he glimpsed inside the man's coat and on the man's face--something that was the opposite of whatever John Yarbrough saw in the face of the boy in Willowbrook. Harms pulled out his gun and shot the man through the open window. "Scott looked at me and was, like, 'What did you do?' because he didn't perceive any danger," Harms said. "But I did."

Political Heat

August 12, 2002
BOOKS

The great Chicago heat wave,
and other unnatural disasters.

1.

In the first week of July, 1995, a strong high-pressure air mass developed over the plains of the Southwest and began moving slowly eastward toward Chicago. Illinois usually gets its warm summer air from the Gulf of Mexico, and the air coming off the ocean is relatively temperate. But this was a blast of western air that had been baked in the desert ovens of West Texas and New Mexico. It was hot, bringing temperatures in excess of a hundred degrees, and, because the preceding two months had been very wet in the Midwest and the ground was damp, the air steadily picked up moisture as it moved across the farmlands east of the Rockies. Ordinarily, this would not have been a problem, since humid air tends to become diluted as it mixes with the drier air higher up in the atmosphere. But it was Chicago's misfortune, in mid-July, to be in the grip of an unusually strong temperature inversion: the air in the first thousand feet above the city surface was cooler than the air at two and three thousand feet. The humid air could not rise and be diluted. It was trapped by the warmer air above. The United States has cities that are often humid--like Houston and New Orleans--without being tremendously hot. And it has very hot cities--like Las Vegas and Phoenix--that are almost never humid. But for one long week, beginning on Thursday, July 13, 1995, Chicago was both. Meteorologists measure humidity with what is called the dew point--the point at which the air is so saturated with moisture that it cannot cool without forming dew. On a typical Chicago summer day, the dew point is in the low sixties, and on a very warm, humid day it is in the low seventies. At Chicago's Midway Airport, during the heat wave of 1995, the dew point hit the low eighties--a figure reached regularly only in places like the coastal regions of the Middle East. In July of 1995, Chicago effectively turned into Dubai.

As the air mass settled on the city, cars began to overheat and stall in the streets. Roads buckled. Hundreds of children developed heat exhaustion when school buses were stuck in traffic. More than three thousand fire hydrants were opened in poorer neighborhoods around the city, by people looking for relief from the heat, and this caused pressure to drop so precipitately that entire buildings were left without water. So many air-conditioners were turned on that the city's electrical infrastructure was overwhelmed. A series of rolling blackouts left thousands without power. As the heat took its toll, the city ran out of ambulances. More than twenty hospitals, mostly on Chicago's poorer South Side, shut their doors to new admissions. Callers to 911 were put on hold, and as the police and paramedics raced from one home to another it became clear that the heat was killing people in unprecedented numbers. The police took the bodies to the Cook County Medical Examiner's office, and a line of cruisers stretched outside the building. Students from a nearby mortuary school, and then ex-convicts looking to earn probation points, were brought in to help. The morgue ran out of bays in which to put the bodies. Office space was cleared. It wasn't enough. The owner of a local meatpacking firm offered the city his refrigerated trucks to help store the bodies. The first set wasn't enough. He sent another. It wasn't enough. In the end, there were nine forty-eight-foot meatpacking trailers in the morgue's parking lot. When the final statistics were tallied, the city calculated that in the seven days between July 14th and July 20th, the heat wave had resulted in the deaths of seven hundred and thirty-nine Chicagoans; on Saturday, July 15th, alone, three hundred and sixty-five people died from the heat. The chance intersection of a strong high-pressure ridge, a wet spring, and an intense temperature inversion claimed more lives than Hurricane Andrew, the crash of T.W.A. Flight 800, the Oklahoma City bombing, and the Northridge, California, earthquake combined.

2.

In "Heat Wave: A Social Autopsy of Disaster in Chicago" (Chicago; $27.50), the New York University sociologist Eric Klinenberg sets out to understand what happened during those seven days in July. He looks at who died, and where they died, and why they died. He goes to the county morgue and sifts through the dozens of boxes of unclaimed personal effects of heat-wave victims--"watches, wallets, letters, tax returns, photographs, and record books"--and reads the police reports on the victims, with their dry recitations of the circumstances of death. Here is one for a seventy-three-year-old white woman who was found on Monday, July 17th:

A recluse for 10 yrs, never left apartment, found today by son, apparently DOA. Conditions in apartment when R/O's [responding officers] arrived thermostat was registering over 90 degrees f. with no air circulation except for windows opened by son (after death).

Here is another, for a seventy-nine-year-old black man found on Wednesday the 19th:

Victim did not respond to phone calls or knocks on victim's door since Sunday, 16 July 1995. Victim was known as quiet, [kept] to himself and at times, not to answer the door. Landlord . . . does not have any information to any relatives to victim. . . . Chain was on door. R/O was able to see victim on sofa with flies on victim and a very strong odor decay.

The city's response to the crisis, Klinenberg argues, was to look at people like those two victims--the recluse who did not open her windows and the man who would not answer his door--and conclude that their deaths were inevitable, the result of an unavoidable collision between their own infirmity and an extreme environmental event. As one Health Department official put it at the time, "Government can't guarantee there won't be a heat wave." That Friday, the human-services commissioner, Daniel Alvarez, told the press, "We're talking about people who die because they neglect themselves. We did everything possible. But some people didn't want to open their doors to us." In its official postmortem four months later, the city sounded the same fatalistic note: the disaster had been a "unique meteorological event" that proved that the "government alone cannot do it all."

Klinenberg finds that conclusion unacceptably superficial. The disaster may look inevitable, but beneath the surface he sees numerous explanations for why it took the shape it did. One chapter of the book is devoted to a comparison of two adjoining low-income neighborhoods in Chicago, Little Village and North Lawndale. Statistically, the two are almost identical, each with heavy concentrations of poor, elderly people living alone, so it would seem that the heat wave should have taken a similar toll in both neighborhoods. But North Lawndale had ten times the fatality rate of Little Village. Why? Because Little Village is a bustling, relatively safe, close-knit Hispanic community; the elderly had family and friends nearby who could look in on them, and streets and stores where they could go to escape their stifling apartments. North Lawndale, by contrast, is a sprawling, underpopulated, drug-infested neighborhood. The elderly there were afraid to go outside, and had no one close by to visit them. The heat was deadly only in combination with particular social and physical circumstances.

Klinenberg takes an equally close look at the city's ambulance shortage. The city could have nearly tripled the number of available ambulances by calling in reserves from the suburbs, but it was slow to realize that it had a disaster on its hands. "It's hot. It's very hot. But let's not blow it out of proportion": this was Mayor Richard Daley's assessment of the situation on Friday, July 14th. The streamlining of city governments like Chicago's, Klinenberg explains, isolated city officials. Social-services departments had been professionalized as if they were corporations. Responsibilities had been outsourced. "Police officers replace aldermen and precinct captains as the community sentries," he writes, and as a result political organizations began to lose contact with the needs of their constituents.

Problem solving, in our day and age, brings with it the requirement of compression: we are urged to distill the most pertinent lessons from any experience. Klinenberg suggests that such distillation only obscures the truth, and by the end of "Heat Wave" he has traced the lines of culpability in dozens of directions, drawing a dense and subtle portrait of exactly what happened during that week in July. It is an approach that resembles, most of all, the way the heat wave was analyzed by meteorologists. They took hourly surface-airways observations of temperature, wind speed, and humidity, estimated radiation from cloud cover, and performed complex calculations using the Penman-Monteith formula to factor in soil-heat flux, latent heat of vaporization, stomatal resistance, and the von Kármán constant. Why, Klinenberg asks, can't we bring the same rigor to our study of the social causes of disaster?

3.

Take the question of air-conditioning. The Centers for Disease Control, in their Chicago investigation, concluded that the use of air-conditioners could have prevented more than half of the deaths. But many low-income people in Chicago couldn't afford to turn on an air-conditioner even if they had been given one for free. Many of those who did have air-conditioners, meanwhile, were hit by the power failures that week. Chicago had a problem with a vulnerable population: a lot of very old and very sick people. But it also, quite apart from this, had an air-conditioning problem. What was the cause of that problem?

As it turns out, this is a particularly timely question, since there is a debate going on now in Washington over air-conditioners which bears directly on what happens during heat waves. All air-conditioners consist of a motor and a long coil that acts as a heat exchanger, taking hot air out of the room and replacing it with cold air. If you use a relatively unsophisticated motor and a small coil, an air-conditioner will be cheap to make but will use a lot of electricity. If you use a better motor and a larger heat exchanger, the air-conditioner will cost more to buy but far less to run. Rationally, consumers should buy the more expensive, energy-efficient units, because their slightly higher purchase price is dwarfed by the amount of money the owner pays over time in electric bills. But fifteen years ago Congress realized that this wasn't happening. The people who generally bought air-conditioners--builders and landlords--weren't the people who paid the utility bills to run them. Their incentive was to buy the cheapest unit. So Congress passed a minimum standard for air-conditioning efficiency. Residential central air-conditioning units now had to score at least 10 on a scale known as SEER--the seasonal energy-efficiency ratio. One of Bill Clinton's last acts as President was to raise that standard to 13. This spring, however, the Bush Administration cut the efficiency increase by a third, making SEER 12 the law.

It should be said that SEER 13 is no more technologically difficult than SEER 12. SEER 12 is simply a bit cheaper to make, and SEER 13 is simply cheaper to operate. Nor is this a classic regulatory battle that pits corporate against consumer interests. The nation's largest air-conditioner manufacturer, Carrier, is in favor of 12. But the second-largest manufacturer, Goodman (which makes Amana air-conditioners), is in favor of 13. The Bush decision is really about politics, and the White House felt free to roll back the Clinton standard because most of the time the difference between the two standards is negligible. There is one exception, however: heat waves.

Air-conditioning is, of course, the reason that electrical consumption soars on very hot days. On the worst day in August, electricity consumption in, say, Manhattan might be three or four times what it is on a cool spring day. For most of the year, a local utility can use the electricity from its own power plants, or sign stable, long-term contracts with other power companies. But the extra electricity a city needs on that handful of very hot days presents a problem. You can't build a power plant just to supply this surge--what would you do with it during the rest of the year? So, at peak periods, utilities buy the power they need on the "spot" market, and power bought on the spot market can cost fifty times as much as the power used on normal days. The amount of power that a utility has to buy for that handful of hot days every summer, in other words, is a huge factor in the size of our electric bills.

For anyone wanting to make electricity cheaper, then, the crucial issue is not how to reduce average electrical consumption but how to reduce peak consumption. A recent study estimates that moving the SEER standard from 10 to 13 would have the effect of cutting peak demand by the equivalent of more than a hundred and fifty power plants. The Bush Administration's decision to cut the SEER upgrade by a third means that by 2020 demand will be fourteen thousand megawatts higher than it would have been, and that we'll have to build about fifty more power plants. The cost of those extra power plants--and of running a less efficient air-conditioner on hot days--is part of what will make air-conditioning less affordable for people who will someday desperately need it.

The sheer volume of electricity required on a very hot day also puts enormous strain on a city's power-distribution system. On the Friday of the Chicago heat wave, when power demand peaked, one of the main problem areas was the transmission substation (TSS) at California Avenue and Addison Street, in the city's northwest corner. TSS 114 consists of a series of giant transformers--twenty feet high and fifteen feet across--that help convert the high-voltage electricity that comes into Chicago along power lines into the low-voltage power that is used in offices and homes. Throughout that Friday afternoon, the four transformers in the second terminal at TSS 114 were running at a hundred and eighteen per cent of capacity--that is, they were handling roughly a fifth more electricity than they were designed to carry. The chief side effect of overloading is heat. The more current you run through a transformer the hotter it gets, and, combined with the ambient temperature that afternoon, which averaged a hundred and eleven degrees, the heat turned the inside of terminal two into an oven.

At 4:56 P.M., the heat overwhelmed a monitoring device known as a CT--a gauge almost small enough to fit in the palm of one's hand--on the first of the transformers. It tripped and shut down. The current that had been shared by four transformers had to be carried by just three, making them still hotter. The second transformer was now carrying a hundred and twenty-four per cent of its rated capacity. Fifty-one minutes later, a circuit breaker on the second transformer burst into flames. Transformers are engineered to handle extra loads for short periods of time, but there was just a little too much current and a little too much heat. At 6:19, two more CTs tripped on the third transformer and, as workmen struggled to get the terminal up and running, a CT failed on the fourth transformer. In all, forty-nine thousand customers and all of the people in those customers' houses and apartments and offices were without air-conditioning for close to twenty hours--and this is merely what happened at TSS 114.

All around the city that week, between Wednesday and Sunday, there were 1,327 separate equipment failures that left an additional hundred and forty-nine thousand customers without power. Those are staggering numbers. But what is really staggering is how easy it would have been to avoid these power outages. Commonwealth Edison, the city's utility, had forecast a year earlier that electricity use in the summer of 1995 would peak at 18,600 megawatts. The actual high, on the Friday of the heat wave, was 19,201. The difference, in other words, between the demand that the utility was prepared to handle and the demand that brought the city to its knees was six hundred and one megawatts, or 3.2 per cent of the total--which is just about what a place like Chicago might save by having a city full of SEER 13 air-conditioners instead of SEER 12 air-conditioners.
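
The arithmetic in that last sentence is worth spelling out. Below is a back-of-envelope sketch, in Python, of the comparison being drawn; the forecast and actual peak figures are the ones quoted above, but the share of peak demand attributable to air-conditioning on a heat-wave day is a hypothetical assumption, introduced only to show how a saving of roughly three per cent citywide could plausibly follow from the one-step difference between SEER 12 and SEER 13. The SEER ratio is used here as a rough proxy for peak power draw.

```python
# Back-of-envelope check of the peak-demand arithmetic above.
# The forecast and actual peaks are quoted in the article; the
# air-conditioning share of peak demand is a hypothetical figure,
# and the SEER ratio is treated as a rough proxy for peak draw.

forecast_peak_mw = 18_600   # Commonwealth Edison's forecast for summer 1995
actual_peak_mw = 19_201     # actual high on the Friday of the heat wave

shortfall_mw = actual_peak_mw - forecast_peak_mw
shortfall_share = shortfall_mw / forecast_peak_mw
print(f"Shortfall: {shortfall_mw} MW ({shortfall_share:.1%} of forecast)")
# -> Shortfall: 601 MW (3.2% of forecast)

# For the same cooling, energy use scales inversely with SEER, so a
# SEER 13 unit draws roughly 12/13 of what a SEER 12 unit draws.
ac_saving = 1 - 12 / 13          # about 7.7 per cent of the A.C. load

# Hypothetical assumption: air-conditioning accounts for 40 per cent
# of peak demand on a day like this.
ac_share_of_peak = 0.40
citywide_saving = ac_saving * ac_share_of_peak
print(f"Implied citywide peak saving, SEER 13 vs. 12: {citywide_saving:.1%}")
# -> about 3.1%
```

With that assumed forty-per-cent air-conditioning share, the implied saving comes out to about 3.1 per cent, in the same neighborhood as the 601-megawatt shortfall.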

4.

In 1928, a storm near Palm Beach, Florida, killed almost two thousand people, most of them black migrant workers on the shores of Lake Okeechobee. This was, the state comptroller declared, "an act of God." In 1935, the most severe hurricane in American history hit the Florida Keys, sending a storm surge fifteen to twenty feet high through a low-lying encampment of war veterans working on the highway. About four hundred people died. "The catastrophe must be characterized as an act of God and was by its very nature beyond the power of man," the Veterans Authority and Federal Emergency Relief Administration declared in an official report. In 1972, an earthen dam put up by a mining company in Logan County, West Virginia, collapsed in heavy rains, killing a hundred and thirty-nine people. It was an "act of God," a mining-company official said, disavowing any culpability. In 1974, a series of twisters swept across ten states, killing three hundred and fifteen people. Senator Thomas Eagleton, of Missouri, said at the time that his colleagues in Washington viewed the tornado "as an act of God where even the Congress can't intervene," explaining why the government would not fund an early-warning system. This is the way we have thought of catastrophes in the United States. The idea of an "act of God" suggests that any search for causes is unnecessary. It encourages us to see disasters, as the environmental historian Ted Steinberg writes in "Acts of God: The Unnatural History of Natural Disaster in America" (2000), simply as things that happen "from time to time." It suggests, too, that systems or institutions ought to be judged on the basis of how they perform most of the time, under "normal" conditions, rather than by how they perform under those rare moments of extreme stress. But this idea, as "Heat Wave" makes clear, is a grave mistake. Political systems and social institutions ought to be judged the way utilities are judged. The true test is how they perform on a blistering day in July.

Klinenberg tells the story of Pauline Jankowitz, an elderly woman living alone in a third-floor apartment in a transitional neighborhood. Her air-conditioner was old and didn't work well. She had a bladder problem that left her incontinent, and she had to walk with a crutch because she had a weak leg. That made it difficult for her to get down the stairs, and once she was outside she was terrified of being mugged. "Chicago is just a shooting gallery," she said to Klinenberg. She left her apartment only about six times a year. Jankowitz was the prototypical heat-wave victim, and, as she told Klinenberg, that week in July was "the closest I've ever come to death." But she survived. A friend had told her to leave her apartment if it got too hot; so, early on what would turn out to be the worst of the seven days, she rose and crept down the stairs. She caught a city bus to a nearby store, which was air-conditioned, and there she bought fresh cherries and leaned on the shopping cart until she recovered her strength. On the trip home, she recalled, "climbing the stairs was almost impossible." Back in her apartment, she felt her body begin to swell and go numb. She telephoned a friend. She turned a fan on high, lay down on the floor, covered herself with wet towels, and dreamed that she was on a Caribbean cruise. She was poor and old and infirm, but she lived, and one of the many lessons of her story is that in order to survive that week in July she suddenly depended on services and supports that previously she had barely needed at all. Her old air-conditioner was useless most of the time. But that week it helped to keep her apartment at least habitable. She rarely travelled. But on that day the fact that there was a city bus, and that it came promptly and that it was air-conditioned, was of the greatest importance. She rarely went to the store; she had her groceries delivered. But now the proximity of a supermarket, where she could lean on the shopping cart and breathe in the cool air, was critical. Pauline Jankowitz's life depended not on the ordinary workings of the social institutions in her world but on their ability to perform at one critical moment of peak demand. On the hottest of all days, her neighborhood substation did not fail. Her bus came. Her grocery store was open. She was one of the lucky ones.

Group Think

December 2, 2002
THE CRITICS

What does 'Saturday Night Live'
have in common with German philosophy?

1.

Lorne Michaels, the creator of "Saturday Night Live," was married to one of the show's writers, Rosie Shuster. One day when the show was still young, an assistant named Paula Davis went to Shuster's apartment in New York and found Dan Aykroyd getting out of her bed--which was puzzling, not just because Shuster was married to Michaels but because Aykroyd was supposedly seeing another member of the original "S.N.L." cast, Laraine Newman. Aykroyd and Gilda Radner had also been an item, back when the two of them worked for the Second City comedy troupe in Toronto, although by the time they got to New York they were just friends, in the way that everyone was friends with Radner. Second City was also where Aykroyd met John Belushi, because Belushi, who was a product of the Second City troupe in Chicago, came to Toronto to recruit for the "National Lampoon Radio Hour," which he starred in along with Radner and Bill Murray (who were also an item for a while). The writer Michael O'Donoghue (who famously voiced his aversion to the appearance of the Muppets on "S.N.L." by saying, "I don't write for felt") also came from The National Lampoon, as did another of the original writers, Anne Beatts (who was, in the impeccably ingrown logic of "S.N.L.," living with O'Donoghue). Chevy Chase came from a National Lampoon spinoff called "Lemmings," which also starred Belushi, doing his legendary Joe Cocker impersonation. Lorne Michaels hired Belushi after Radner, among others, insisted on it, and he hired Newman because he had worked with her on a Lily Tomlin special, and he hired Aykroyd because Michaels was also from Canada and knew him from the comedy scene there. When Aykroyd got the word, he came down from Toronto on his Harley.

In the early days of "S.N.L.," as Tom Shales and James Andrew Miller tell us in "Live from New York" (Little, Brown; $25.95), everyone knew everyone and everyone was always in everyone else's business, and that fact goes a long way toward explaining the extraordinary chemistry among the show's cast. Belushi would stay overnight at people's apartments, and he was notorious for getting hungry in the middle of the night and leaving spaghetti-sauce imprints all over the kitchen, or setting fires by falling asleep with a lit joint. Radner would go to Jane Curtin's house and sit and watch Curtin and her husband, as if they were some strange species of mammal, and say things like "Oh, now you are going to turn the TV on together. How will you decide what to watch?" Newman would hang out at Radner's house, and Radner would be eating a gallon of ice cream and Newman would be snorting heroin. Then Radner would go to the bathroom to make herself vomit, and say, "I'm so full, I can't hear." And they would laugh. "There we were," Newman recalls, "practicing our illnesses together."

The place where they all really lived, though, was the "S.N.L." office, on the seventeenth floor of NBC headquarters, at Rockefeller Center. The staff turned it into a giant dormitory, installing bunk beds and fooling around in the dressing rooms and staying up all night. Monday night was the first meeting, where ideas were pitched. On Tuesday, the writing started after dinner and continued straight through the night. The first read-through took place on Wednesday at three in the afternoon. And then came blocking and rehearsals and revisions. "It was emotional," the writer Alan Zweibel tells Shales and Miller. "We were a colony. I don't mean this in a bad way, but we were Guyana on the seventeenth floor. We didn't go out. We stayed there. It was a stalag of some sort." Rosie Shuster remembers waking up at the office and then going outside with Aykroyd, to "walk each other like dogs around 30 Rock just to get a little fresh air." On Saturdays, after the taping was finished, the cast would head downtown to a storefront that Belushi and Aykroyd had rented and dubbed the Blues Bar. It was a cheerless dive, with rats and crumbling walls and peeling paint and the filthiest toilets in all of New York. But did anyone care? "It was the end of the week and, well, you were psyched," Shuster recalls. "It was like you were buzzing, you'd get turbocharged from the intense effort of it, and then there's like adrenal burnout later. I remember sleeping at the Blues Bar, you know, as the light broke." Sometimes it went even later. "I remember rolling down the armor at the Blues Bar and closing the building at eleven o'clock Sunday morning--you know, when it was at its height--and saying good morning to the cops and firemen," Aykroyd said. "S.N.L." was a television show, but it was also an adult fraternity house, united by bonds of drugs and sex and long hours and emotion and affection that went back years. "The only entrée to that boys club was basically by fucking somebody in the club," Anne Beatts tells Shales and Miller. "Which wasn't the reason you were fucking them necessarily. I mean, you didn't go 'Oh, I want to get into this, I think I'll have to have sex with this person.' It was just that if you were drawn to funny people who were doing interesting things, then the only real way to get to do those things yourself was to make that connection."

2.

We are inclined to think that genuine innovators are loners, that they do not need the social reinforcement the rest of us crave. But that's not how it works, whether it's television comedy or, for that matter, the more exalted realms of art and politics and ideas. In his book "The Sociology of Philosophies," Randall Collins finds in all of known history only three major thinkers who appeared on the scene by themselves: the first-century Taoist metaphysician Wang Ch'ung, the fourteenth-century Zen mystic Bassui Tokusho, and the fourteenth-century Arabic philosopher Ibn Khaldun. Everyone else who mattered was part of a movement, a school, a band of followers and disciples and mentors and rivals and friends who saw each other all the time and had long arguments over coffee and slept with one another's spouses. Freud may have been the founder of psychoanalysis, but it really began to take shape in 1902, when Alfred Adler, Wilhelm Stekel, Max Kahane, and Rudolf Reitler would gather in Freud's waiting room on Wednesdays, to eat strudel and talk about the unconscious. The neo-Confucian movement of the Sung dynasty in China revolved around the brothers Ch'eng Hao and Ch'eng I, their teacher Chou Tun-i, their father's cousin Chang Tsai, and, of course, their neighbor Shao Yung. Pissarro and Degas enrolled in the École des Beaux-Arts at the same time, then Pissarro met Monet and, later, Cézanne at the Académie Suisse, Manet met Degas at the Louvre, Monet befriended Renoir at Charles Gleyre's studio, and Renoir, in turn, met Pissarro and Cézanne and soon enough everyone was hanging out at the Café Guerbois on the Rue des Batignolles. Collins's point is not that innovation attracts groups but that innovation is found in groups: that it tends to arise out of social interaction--conversation, validation, the intimacy of proximity, and the look in your listener's eye that tells you you're onto something. German Idealism, he notes, centered on Fichte, Schelling, and Hegel. Why? Because they all lived together in the same house. "Fichte takes the early lead," Collins writes,

inspiring the others on a visit while they are young students at Tübingen in the 1790s, then turning Jena into a center for the philosophical movement to which a stream of the soon-to-be-eminent congregate; then on to Dresden in the heady years 1799-1800 to live with the Romantic circle of the Schlegel brothers (where August Schlegel's wife, Caroline, has an affair with Schelling, followed later by a scandalous divorce and remarriage). Fichte moves on to Berlin, allying with Schleiermacher (also of the Romantic circle) and with Humboldt to establish the new-style university; here Hegel eventually comes and founds his school, and Schopenhauer lectures fruitlessly in competition.

There is a wonderful illustration of this social dimension of innovation in Jenny Uglow's new book, "The Lunar Men" (Farrar, Straus & Giroux; $30), which is the story of a remarkable group of friends in Birmingham in the mid-eighteenth century. Their leader was Erasmus Darwin, a physician, inventor, and scientist, who began thinking about evolution a full fifty years before his grandson Charles. Darwin met, through his medical practice, an industrialist named Matthew Boulton and, later, his partner James Watt, the steam-engine pioneer. They, in turn, got to know Josiah Wedgwood, he of the famous pottery, and Joseph Priestley, the preacher who isolated oxygen and became known as one of history's great chemists, and the industrialist Samuel Galton (whose son married Darwin's daughter and produced the legendary nineteenth-century polymath Francis Galton), and the innovative glass-and-chemicals entrepreneur James Keir, and on and on. They called themselves the Lunar Society because they arranged to meet at each full moon, when they would get together in the early afternoon to eat, piling the table high, Uglow tells us, with wine and "fish and capons, Cheddar and Stilton, pies and syllabubs." Their children played underfoot. Their wives chatted in the other room, and the Lunar men talked well into the night, clearing the table to make room for their models and plans and instruments. "They developed their own cryptic, playful language and Darwin, in particular, liked to phrase things as puzzles--like the charades and poetic word games people used to play," Uglow writes. "Even though they were down-to-earth champions of reason, a part of the delight was to feel they were unlocking esoteric secrets, exploring transmutations like alchemists of old."

When they were not meeting, they were writing to each other with words of encouragement or advice or excitement. This was truly--in a phrase that is invariably and unthinkingly used in the pejorative--a mutual-admiration society. "Their inquiries ranged over the whole spectrum, from astronomy and optics to fossils and ferns," Uglow tells us, and she goes on:

One person's passion--be it carriages, steam, minerals, chemistry, clocks--fired all the others. There was no neat separation of subjects. Letters between [William] Small and Watt were a kaleidoscope of invention and ideas, touching on steam-engines and cylinders; cobalt as a semi-metal; how to boil down copal, the resin of tropical trees, for varnish; lenses and clocks and colours for enamels; alkali and canals; acids and vapours--as well as the boil on Watt's nose.

What were they doing? Darwin, in a lovely phrase, called it "philosophical laughing," which was his way of saying that those who depart from cultural or intellectual consensus need people to walk beside them and laugh with them to give them confidence. But there's more to it than that. One of the peculiar features of group dynamics is that clusters of people will come to decisions that are far more extreme than any individual member would have come to on his own. People compete with each other and egg each other on, showboat and grandstand; and along the way they often lose sight of what they truly believed when the meeting began. Typically, this is considered a bad thing, because it means that groups formed explicitly to find middle ground often end up someplace far away. But at times this quality turns out to be tremendously productive, because, after all, losing sight of what you truly believed when the meeting began is one way of defining innovation.

Uglow tells us, for instance, that the Lunar men were active in the campaign against slavery. Wedgwood, Watt, and Darwin pushed for the building of canals, to improve transportation. Priestley came up with soda water and the rubber eraser, and James Keir was the man who figured out how to mass-produce soap, eventually building a twenty-acre soapworks in Tipton that produced a million pounds of soap a year. Here, surely, are all the hallmarks of group distortion. Somebody comes up with an ambitious plan for canals, and someone else tries to top that by building a really big soap factory, and in that feverish atmosphere someone else decides to top them all with the idea that what they should really be doing is fighting slavery.

Uglow's book reveals how simplistic our view of groups really is. We divide them into cults and clubs, and dismiss the former for their insularity and the latter for their banality. The cult is the place where, cut off from your peers, you become crazy. The club is the place where, surrounded by your peers, you become boring. Yet if you can combine the best of those two--the right kind of insularity with the right kind of homogeneity--you create an environment both safe enough and stimulating enough to make great thoughts possible. You get Fichte, Schelling, and Hegel, and a revolution in Western philosophy. You get Darwin, Watt, Wedgwood, and Priestley, and the beginnings of the Industrial Revolution. And sometimes, on a more modest level, you get a bunch of people goofing around and bringing a new kind of comedy to network television.

3.

One of "S.N.L."'s forerunners was a comedy troupe based in San Francisco called the Committee. The Committee's heyday was in the nineteen-sixties, and its humor had the distinctive political bite of that period. In one of the group's memorable sketches, the actor Larry Hankin played a condemned prisoner being led to the electric chair by a warden, a priest, and a prison guard. Hankin was strapped in and the switch was thrown--and nothing happened. Hankin started to become abusive, and the three men huddled briefly together. Then, as Tony Hendra recounts, in "Going Too Far," his history of "boomer humor":

They confer and throw the switch again. Still nothing. Hankin starts cackling with glee, doubly abusive. They throw it yet again. Nothing yet again. Hankin then demands to be set free--he can't be executed more than once, they're a bunch of assholes, double jeopardy, nyah-nyah, etc., etc. Totally desperate, the three confer once more, check that they're alone in the cell, and kick Hankin to death.

Is that sketch funny? Some people thought so. When the Committee performed it at a benefit at the Vacaville prison, in California, the inmates laughed so hard they rioted. But others didn't, and even today it's clear that this humor is funny only to those who can appreciate the particular social and political sensibility of the Committee. We call new cultural or intellectual movements "circles" for a reason: the circle is a closed loop. You are either inside or outside. In "Live from New York," Lorne Michaels describes going to the White House to tape President Ford saying, "Live from New York, it's Saturday Night," the "S.N.L." intro: "We'd done two or three takes, and to relax him, I said to him--my sense of humor at the time--'Mr. President, if this works out, who knows where it will lead?' Which was completely lost on him." In another comic era, the fact that Ford did not laugh would be evidence of the joke's failure. But when Michaels says the joke "was completely lost on him" it isn't a disclaimer--it's the punch line. He said what he said because he knew Ford would not get it. As the writers of "Saturday Night Live" worked on sketches deep into the night, they were sustained by something like what sustained the Lunar men and the idealists in Tübingen--the feeling that they all spoke a private language.

To those on the inside, of course, nothing is funnier than an inside joke. But the real significance of inside jokes is what they mean for those who aren't on the inside. Laughing at a joke creates an incentive to join the joke-teller. But not laughing--not getting the joke--creates an even greater incentive. We all want to know what we're missing, and this is one of the ways that revolutions spread from the small groups that spawn them.

"One of Michaels's rules was, no groveling to the audience either in the studio or at home," Shales and Miller write. "The collective approach of the show's creators could be seen as a kind of arrogance, a stance of defiance that said in effect, "We think this is funny, and if you don't, you're wrong.' . . . To viewers raised on TV that was forever cajoling, importuning, and talking down to them, the blunt and gutsy approach was refreshing, a virtual reinvention of the medium."

The successful inside joke, however, can never last. In "A Great Silly Grin" (Public Affairs; $27.50), a history of nineteen-sixties British satire, Humphrey Carpenter relates a routine done at the comedy club the Establishment early in the decade. The sketch was about the rebuilt Coventry Cathedral, which had been destroyed in the war, and the speaker was supposed to be the Cathedral's architect, Sir Basil Spence:

First of all, of course, we owe an enormous debt of gratitude to the German people for making this whole project possible in the first place. Second, we owe a debt of gratitude to the people of Coventry itself, who when asked to choose between having a cathedral and having hospitals, schools and houses, plumped immediately (I'm glad to say) for the cathedral, recognizing, I think, the need of any community to have a place where the whole community can gather together and pray for such things as hospitals, schools and houses.

When that bit was first performed, many Englishmen would have found it offensive. Now, of course, hardly anyone would. Mocking British establishment pieties is no longer an act of rebellion. It is the norm. Successful revolutions contain the seeds of their demise: they attract so many followers, eager to be in on the joke as well, that the circle breaks down. The inside becomes indistinguishable from the outside. The allure of exclusivity is gone.

At the same time, the special bonds that created the circle cannot last forever. Sooner or later, the people who slept together in every combination start to pair off. Those doing drugs together sober up (or die). Everyone starts going to bed at eleven o'clock, and bit by bit the intimacy that fuels innovation slips away. "I was involved with Gilda, yeah. I was in love with her," Aykroyd tells Shales and Miller. "We were friends, lovers, then friends again," and in a way that's the simplest and best explanation for the genius of the original "S.N.L." Today's cast is not less talented. It is simply more professional. "I think some people in the cast have fun crushes on other people, but nothing serious," Cheri Oteri, a cast member from the late nineteen-nineties, tells Shales and Miller, in what might well serve as the show's creative epitaph. "I guess we're kind of boring--no romances, no drugs. I had an audition once with somebody who used to work here. He's very, very big in the business now. And as soon as I went in for the audition, he went, 'Hey, you guys still doing coke over at SNL?' Because back when he was here, they were doing it. What are we doing, for crying out loud? Oh yeah. Thinking up characters."

Big and Bad

January 12, 2004
COMMERCE AND CULTURE

How the S.U.V. ran over automotive safety.

1.

In the summer of 1996, the Ford Motor Company began building the Expedition, its new, full-sized S.U.V., at the Michigan Truck Plant, in the Detroit suburb of Wayne. The Expedition was essentially the F-150 pickup truck with an extra set of doors and two more rows of seats—and the fact that it was a truck was critical. Cars have to meet stringent fuel-efficiency regulations. Trucks don't. The handling and suspension and braking of cars have to be built to the demanding standards of drivers and passengers. Trucks only have to handle like, well, trucks. Cars are built with what is called unit-body construction. To be light enough to meet fuel standards and safe enough to meet safety standards, they have expensive and elaborately engineered steel skeletons, with built-in crumple zones to absorb the impact of a crash. Making a truck is a lot more rudimentary. You build a rectangular steel frame. The engine gets bolted to the front. The seats get bolted to the middle. The body gets lowered over the top. The result is heavy and rigid and not particularly safe. But it's an awfully inexpensive way to build an automobile. Ford had planned to sell the Expedition for thirty-six thousand dollars, and its best estimate was that it could build one for twenty-four thousand—which, in the automotive industry, is a terrifically high profit margin. Sales, the company predicted, weren't going to be huge. After all, how many Americans could reasonably be expected to pay a twelve-thousand-dollar premium for what was essentially a dressed-up truck? But Ford executives decided that the Expedition would be a highly profitable niche product. They were half right. The "highly profitable" part turned out to be true. Yet, almost from the moment Ford's big new S.U.V.s rolled off the assembly line in Wayne, there was nothing "niche" about the Expedition.

Ford had intended to split the assembly line at the Michigan Truck Plant between the Expedition and the Ford F-150 pickup. But, when the first flood of orders started coming in for the Expedition, the factory was entirely given over to S.U.V.s. The orders kept mounting. Assembly-line workers were put on sixty- and seventy-hour weeks. Another night shift was added. The plant was now running twenty-four hours a day, six days a week. Ford executives decided to build a luxury version of the Expedition, the Lincoln Navigator. They bolted a new grille on the Expedition, changed a few body panels, added some sound insulation, took a deep breath, and charged forty-five thousand dollars—and soon Navigators were flying out the door nearly as fast as Expeditions. Before long, the Michigan Truck Plant was the most profitable of Ford's fifty-three assembly plants. By the late nineteen-nineties, it had become the most profitable factory of any industry in the world. In 1998, the Michigan Truck Plant grossed eleven billion dollars, almost as much as McDonald's made that year. Profits were $3.7 billion. Some factory workers, with overtime, were making two hundred thousand dollars a year. The demand for Expeditions and Navigators was so insatiable that even when a blizzard hit the Detroit region in January of 1999—burying the city in snow, paralyzing the airport, and stranding hundreds of cars on the freeway—Ford officials got on their radios and commandeered parts bound for other factories so that the Michigan Truck Plant assembly line wouldn't slow for a moment. The factory that had begun as just another assembly plant had become the company's crown jewel.

In the history of the automotive industry, few things have been quite as unexpected as the rise of the S.U.V. Detroit is a town of engineers, and engineers like to believe that there is some connection between the success of a vehicle and its technical merits. But the S.U.V. boom was like Apple's bringing back the Macintosh, dressing it up in colorful plastic, and suddenly creating a new market. It made no sense to them. Consumers said they liked four-wheel drive. But the overwhelming majority of consumers don't need four-wheel drive. S.U.V. buyers said they liked the elevated driving position. But when, in focus groups, industry marketers probed further, they heard things that left them rolling their eyes. As Keith Bradsher writes in "High and Mighty"—perhaps the most important book about Detroit since Ralph Nader's "Unsafe at Any Speed"—what consumers said was "If the vehicle is up high, it's easier to see if something is hiding underneath or lurking behind it." Bradsher brilliantly captures the mixture of bafflement and contempt that many auto executives feel toward the customers who buy their S.U.V.s. Fred J. Schaafsma, a top engineer for General Motors, says, "Sport-utility owners tend to be more like 'I wonder how people view me,' and are more willing to trade off flexibility or functionality to get that." According to Bradsher, internal industry market research concluded that S.U.V.s tend to be bought by people who are insecure, vain, self-centered, and self-absorbed, who are frequently nervous about their marriages, and who lack confidence in their driving skills. Ford's S.U.V. designers took their cues from seeing "fashionably dressed women wearing hiking boots or even work boots while walking through expensive malls." Toyota's top marketing executive in the United States, Bradsher writes, loves to tell the story of how at a focus group in Los Angeles "an elegant woman in the group said that she needed her full-sized Lexus LX 470 to drive up over the curb and onto lawns to park at large parties in Beverly Hills." One of Ford's senior marketing executives was even blunter: "The only time those S.U.V.s are going to be off-road is when they miss the driveway at 3 a.m."

The truth, underneath all the rationalizations, seemed to be that S.U.V. buyers thought of big, heavy vehicles as safe: they found comfort in being surrounded by so much rubber and steel. To the engineers, of course, that didn't make any sense, either: if consumers really wanted something that was big and heavy and comforting, they ought to buy minivans, since minivans, with their unit-body construction, do much better in accidents than S.U.V.s. (In a thirty-five m.p.h. crash test, for instance, the driver of a Cadillac Escalade—the G.M. counterpart to the Lincoln Navigator—has a sixteen-per-cent chance of a life-threatening head injury, a twenty-per-cent chance of a life-threatening chest injury, and a thirty-five-per-cent chance of a leg injury. The same numbers in a Ford Windstar minivan—a vehicle engineered from the ground up, as opposed to simply being bolted onto a pickup-truck frame—are, respectively, two per cent, four per cent, and one per cent.) But this desire for safety wasn't a rational calculation. It was a feeling. Over the past decade, a number of major automakers in America have relied on the services of a French-born cultural anthropologist, G. Clotaire Rapaille, whose speciality is getting beyond the rational—what he calls "cortex"—impressions of consumers and tapping into their deeper, "reptilian" responses. And what Rapaille concluded from countless, intensive sessions with car buyers was that when S.U.V. buyers thought about safety they were thinking about something that reached into their deepest unconscious. "The No. 1 feeling is that everything surrounding you should be round and soft, and should give," Rapaille told me. "There should be air bags everywhere. Then there's this notion that you need to be up high. That's a contradiction, because the people who buy these S.U.V.s know at the cortex level that if you are high there is more chance of a rollover. But at the reptilian level they think that if I am bigger and taller I'm safer. You feel secure because you are higher and dominate and look down. That you can look down is psychologically a very powerful notion. And what was the key element of safety when you were a child? It was that your mother fed you, and there was warm liquid. That's why cupholders are absolutely crucial for safety. If there is a car that has no cupholder, it is not safe. If I can put my coffee there, if I can have my food, if everything is round, if it's soft, and if I'm high, then I feel safe. It's amazing that intelligent, educated women will look at a car and the first thing they will look at is how many cupholders it has." During the design of Chrysler's PT Cruiser, one of the things Rapaille learned was that car buyers felt unsafe when they thought that an outsider could easily see inside their vehicles. So Chrysler made the back window of the PT Cruiser smaller. Of course, making windows smaller—and thereby reducing visibility—makes driving more dangerous, not less so. But that's the puzzle of what has happened to the automobile world: feeling safe has become more important than actually being safe.

2.

One day this fall, I visited the automobile-testing center of Consumers Union, the organization that publishes Consumer Reports. It is tucked away in the woods, in south-central Connecticut, on the site of the old Connecticut Speedway. The facility has two skid pads to measure cornering, a long straightaway for braking tests, a meandering "handling" course that winds around the back side of the track, and an accident-avoidance obstacle course made out of a row of orange cones. It is headed by a trim, white-haired Englishman named David Champion, who previously worked as an engineer with Land Rover and with Nissan. On the day of my visit, Champion set aside two vehicles: a silver 2003 Chevrolet TrailBlazer—an enormous five-thousand-pound S.U.V.—and a shiny blue two-seater Porsche Boxster convertible.

We started with the TrailBlazer. Champion warmed up the Chevrolet with a few quick circuits of the track, and then drove it hard through the twists and turns of the handling course. He sat in the bucket seat with his back straight and his arms almost fully extended, and drove with practiced grace: every movement smooth and relaxed and unhurried. Champion, as an engineer, did not much like the TrailBlazer. "Cheap interior, cheap plastic," he said, batting the dashboard with his hand. "It's a little bit heavy, cumbersome. Quiet. Bit wallowy, side to side. Doesn't feel that secure. Accelerates heavily. Once it gets going, it's got decent power. Brakes feel a bit spongy." He turned onto the straightaway and stopped a few hundred yards from the obstacle course.

Measuring accident avoidance is a key part of the Consumers Union evaluation. It's a simple setup. The driver has to navigate his vehicle through two rows of cones eight feet wide and sixty feet long. Then he has to steer hard to the left, guiding the vehicle through a gate set off to the side, and immediately swerve hard back to the right, and enter a second sixty-foot corridor of cones that are parallel to the first set. The idea is to see how fast you can drive through the course without knocking over any cones. "It's like you're driving down a road in suburbia," Champion said. "Suddenly, a kid on a bicycle veers out in front of you. You have to do whatever it takes to avoid the kid. But there's a tractor-trailer coming toward you in the other lane, so you've got to swing back into your own lane as quickly as possible. That's the scenario."

Champion and I put on helmets. He accelerated toward the entrance to the obstacle course. "We do the test without brakes or throttle, so we can just look at handling," Champion said. "I actually take my foot right off the pedals." The car was now moving at forty m.p.h. At that speed, on the smooth tarmac of the raceway, the TrailBlazer was very quiet, and we were seated so high that the road seemed somehow remote. Champion entered the first row of cones. His arms tensed. He jerked the car to the left. The TrailBlazer's tires squealed. I was thrown toward the passenger-side door as the truck's body rolled, then thrown toward Champion as he jerked the TrailBlazer back to the right. My tape recorder went skittering across the cabin. The whole maneuver had taken no more than a few seconds, but it felt as if we had been sailing into a squall. Champion brought the car to a stop. We both looked back: the TrailBlazer had hit the cone at the gate. The kid on the bicycle was probably dead. Champion shook his head. "It's very rubbery. It slides a lot. I'm not getting much communication back from the steering wheel. It feels really ponderous, clumsy. I felt a little bit of tail swing."

I drove the obstacle course next. I started at the conservative speed of thirty-five m.p.h. I got through cleanly. I tried again, this time at thirty-eight m.p.h., and that small increment of speed made a dramatic difference. I made the first left, avoiding the kid on the bicycle. But, when it came time to swerve back to avoid the hypothetical oncoming eighteen-wheeler, I found that I was wrestling with the car. The protests of the tires were jarring. I stopped, shaken. "It wasn't going where you wanted it to go, was it?" Champion said. "Did you feel the weight pulling you sideways? That's what the extra weight that S.U.V.s have tends to do. It pulls you in the wrong direction." Behind us was a string of toppled cones. Getting the TrailBlazer to travel in a straight line, after that sudden diversion, hadn't been easy. "I think you took out a few pedestrians," Champion said with a faint smile.

Next up was the Boxster. The top was down. The sun was warm on my forehead. The car was low to the ground; I had the sense that if I dangled my arm out the window my knuckles would scrape on the tarmac. Standing still, the Boxster didn't feel safe: I could have been sitting in a go-cart. But when I ran it through the handling course I felt that I was in perfect control. On the straightaway, I steadied the Boxster at forty-five m.p.h., and ran it through the obstacle course. I could have balanced a teacup on my knee. At fifty m.p.h., I navigated the left and right turns with what seemed like a twitch of the steering wheel. The tires didn't squeal. The car stayed level. I pushed the Porsche up into the mid-fifties. Every cone was untouched. "Walk in the park!" Champion exclaimed as we pulled to a stop.

Most of us think that S.U.V.s are much safer than sports cars. If you asked the young parents of America whether they would rather strap their infant child in the back seat of the TrailBlazer or the passenger seat of the Boxster, they would choose the TrailBlazer. We feel that way because in the TrailBlazer our chances of surviving a collision with a hypothetical tractor-trailer in the other lane are greater than they are in the Porsche. What we forget, though, is that in the TrailBlazer you're also much more likely to hit the tractor-trailer because you can't get out of the way in time. In the parlance of the automobile world, the TrailBlazer is better at "passive safety." The Boxster is better when it comes to "active safety," which is every bit as important.

Consider the set of safety statistics compiled by Tom Wenzel, a scientist at Lawrence Berkeley National Laboratory, in California, and Marc Ross, a physicist at the University of Michigan. The numbers are expressed in fatalities per million cars, both for drivers of particular models and for the drivers of the cars they hit. (For example, in the first case, for every million Toyota Avalons on the road, forty Avalon drivers die in car accidents every year, and twenty other people die in accidents involving Toyota Avalons.) The numbers below have been rounded:

Make/Model                Type          Driver Deaths   Other Deaths   Total
Toyota Avalon             large                40             20          60
Chrysler Town & Country   minivan              31             36          67
Toyota Camry              mid-size             41             29          70
Volkswagen Jetta          subcompact           47             23          70
Ford Windstar             minivan              37             35          72
Nissan Maxima             mid-size             53             26          79
Honda Accord              mid-size             54             27          82
Chevrolet Venture         minivan              51             34          85
Buick Century             mid-size             70             23          93
Subaru Legacy/Outback     compact              74             24          98
Mazda 626                 compact              70             29          99
Chevrolet Malibu          mid-size             71             34         105
Chevrolet Suburban        S.U.V.               46             59         105
Jeep Grand Cherokee       S.U.V.               61             44         106
Honda Civic               subcompact           84             25         109
Toyota Corolla            subcompact           81             29         110
Ford Expedition           S.U.V.               55             57         112
GMC Jimmy                 S.U.V.               76             39         114
Ford Taurus               mid-size             78             39         117
Nissan Altima             compact              72             49         121
Mercury Marquis           large                80             43         123
Nissan Sentra             subcompact           95             34         129
Toyota 4Runner            S.U.V.               94             43         137
Chevrolet Tahoe           S.U.V.               68             74         141
Dodge Stratus             mid-size            103             40         143
Lincoln Town Car          large               100             47         147
Ford Explorer             S.U.V.               88             60         148
Pontiac Grand Am          compact             118             39         157
Toyota Tacoma             pickup              111             59         171
Chevrolet Cavalier        subcompact          146             41         186
Dodge Neon                subcompact          161             39         199
Pontiac Sunfire           subcompact          158             44         202
Ford F-Series             pickup              110            128         238

Are the best performers the biggest and heaviest vehicles on the road? Not at all. Among the safest cars are the midsize imports, like the Toyota Camry and the Honda Accord. Or consider the extraordinary performance of some subcompacts, like the Volkswagen Jetta. Drivers of the tiny Jetta die at a rate of just forty-seven per million, which is in the same range as drivers of the five-thousand-pound Chevrolet Suburban and almost half that of popular S.U.V. models like the Ford Explorer or the GMC Jimmy. In a head-on crash, an Explorer or a Suburban would crush a Jetta or a Camry. But, clearly, the drivers of Camrys and Jettas are finding a way to avoid head-on crashes with Explorers and Suburbans. The benefits of being nimble—of being in an automobile that's capable of staying out of trouble—are in many cases greater than the benefits of being big.
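For readers who want to see how the figures in the table combine, here is a minimal sketch in Python. The handful of rows are transcribed from the table above; the helper name total_risk is purely illustrative and is not anything from Wenzel and Ross's study.

# Annual deaths per million registered vehicles, split into deaths of the
# model's own drivers and deaths of the people in the vehicles they hit.
rates = {
    "Volkswagen Jetta":   (47, 23),
    "Toyota Camry":       (41, 29),
    "Chevrolet Suburban": (46, 59),
    "Ford Explorer":      (88, 60),
    "Ford F-Series":      (110, 128),
}

def total_risk(driver_deaths, other_deaths):
    # Combined rate: harm to the model's own drivers plus harm imposed on others.
    return driver_deaths + other_deaths

for model, (drv, oth) in sorted(rates.items(), key=lambda kv: total_risk(*kv[1])):
    print(f"{model:20} driver {drv:3}  others {oth:3}  total {total_risk(drv, oth):3}")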

I had another lesson in active safety at the test track when I got in the TrailBlazer with another Consumers Union engineer, and we did three emergency-stopping tests, taking the Chevrolet up to sixty m.p.h. and then slamming on the brakes. It was not a pleasant exercise. Bringing five thousand pounds of rubber and steel to a sudden stop involves lots of lurching, screeching, and protesting. The first time, the TrailBlazer took 146. 2 feet to come to a halt, the second time 151. 6 feet, and the third time 153. 4 feet. The Boxster can come to a complete stop from sixty m.p.h. in about 124 feet. That's a difference of about two car lengths, and it isn't hard to imagine any number of scenarios where two car lengths could mean the difference between life and death.
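The "two car lengths" figure is easy to check. Here is a minimal sketch using the stopping distances reported above; the fourteen-foot car length is an assumption for illustration, not a figure from the testing.

# Stopping distances from sixty m.p.h., in feet, as reported above.
trailblazer_stops = [146.2, 151.6, 153.4]
boxster_stop = 124.0

avg_trailblazer = sum(trailblazer_stops) / len(trailblazer_stops)   # about 150.4 ft
gap = avg_trailblazer - boxster_stop                                # about 26.4 ft

car_length = 14.0   # assumed typical car length, in feet
print(f"Average TrailBlazer stop: {avg_trailblazer:.1f} ft")
print(f"Gap versus Boxster: {gap:.1f} ft, or about {gap / car_length:.1f} car lengths")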

3.

The S.U.V. boom represents, then, a shift in how we conceive of safety—from active to passive. It's what happens when a larger number of drivers conclude, consciously or otherwise, that the extra thirty feet that the TrailBlazer takes to come to a stop don't really matter, that the tractor-trailer will hit them anyway, and that they are better off treating accidents as inevitable rather than avoidable. "The metric that people use is size," says Stephen Popiel, a vice-president of Millward Brown Goldfarb, in Toronto, one of the leading automotive market-research firms. "The bigger something is, the safer it is. In the consumer's mind, the basic equation is, If I were to take this vehicle and drive it into this brick wall, the more metal there is in front of me the better off I'll be."

This is a new idea, and one largely confined to North America. In Europe and Japan, people think of a safe car as a nimble car. That's why they build cars like the Jetta and the Camry, which are designed to carry out the driver's wishes as directly and efficiently as possible. In the Jetta, the engine is clearly audible. The steering is light and precise. The brakes are crisp. The wheelbase is short enough that the car picks up the undulations of the road. The car is so small and close to the ground, and so dwarfed by other cars on the road, that an intelligent driver is constantly reminded of the necessity of driving safely and defensively. An S.U.V. embodies the opposite logic. The driver is seated as high and far from the road as possible. The vehicle is designed to overcome its environment, not to respond to it. Even four-wheel drive, seemingly the most beneficial feature of the S.U.V., serves to reinforce this isolation. Having the engine provide power to all four wheels, safety experts point out, does nothing to improve braking, although many S.U.V. owners erroneously believe this to be the case. Nor does the feature necessarily make it safer to turn across a slippery surface: that is largely a function of how much friction is generated by the vehicle's tires. All it really does is improve what engineers call tracking—that is, the ability to accelerate without slipping in perilous conditions or in deep snow or mud. Champion says that one of the occasions when he came closest to death was a snowy day, many years ago, just after he had bought a new Range Rover. "Everyone around me was slipping, and I was thinking, Yeahhh. And I came to a stop sign on a major road, and I was driving probably twice as fast as I should have been, because I could. I had traction. But I also weighed probably twice as much as most cars. And I still had only four brakes and four tires on the road. I slid right across a four-lane road." Four-wheel drive robs the driver of feedback. "The car driver whose wheels spin once or twice while backing out of the driveway knows that the road is slippery," Bradsher writes. "The SUV driver who navigates the driveway and street without difficulty until she tries to brake may not find out that the road is slippery until it is too late." Jettas are safe because they make their drivers feel unsafe. S.U.V.s are unsafe because they make their drivers feel safe. That feeling of safety isn't the solution; it's the problem.

4.

Perhaps the most troublesome aspect of S.U.V. culture is its attitude toward risk. "Safety, for most automotive consumers, has to do with the notion that they aren't in complete control," Popiel says. "There are unexpected events that at any moment in time can come out and impact them—an oil patch up ahead, an eighteen-wheeler turning over, something falling down. People feel that the elements of the world out of their control are the ones that are going to cause them distress."

Of course, those things really aren't outside a driver's control: an alert driver, in the right kind of vehicle, can navigate the oil patch, avoid the truck, and swerve around the thing that's falling down. Traffic-fatality rates vary strongly with driver behavior. Drunks are 7.6 times more likely to die in accidents than non-drinkers. People who wear their seat belts are almost half as likely to die as those who don't buckle up. Forty-year-olds are ten times less likely to get into accidents than sixteen-year-olds. Drivers of minivans, Wenzel and Ross's statistics tell us, die at a fraction of the rate of drivers of pickup trucks. That's clearly because minivans are family cars, and parents with children in the back seat are less likely to get into accidents. Frank McKenna, a safety expert at the University of Reading, in England, has done experiments where he shows drivers a series of videotaped scenarios—a child running out the front door of his house and onto the street, for example, or a car approaching an intersection at too great a speed to stop at the red light—and asks people to press a button the minute they become aware of the potential for an accident. Experienced drivers press the button between half a second and a second faster than new drivers, which, given that car accidents are events measured in milliseconds, is a significant difference. McKenna's work shows that, with experience, we all learn how to exert some degree of control over what might otherwise appear to be uncontrollable events. Any conception of safety that revolves entirely around the vehicle, then, is incomplete. Is the Boxster safer than the TrailBlazer? It depends on who's behind the wheel. In the hands of, say, my very respectable and prudent middle-aged mother, the Boxster is by far the safer car. In my hands, it probably isn't. On the open road, my reaction to the Porsche's extraordinary road manners and the sweet, irresistible wail of its engine would be to drive much faster than I should. (At the end of my day at Consumers Union, I parked the Boxster, and immediately got into my own car to drive home. In my mind, I was still at the wheel of the Boxster. Within twenty minutes, I had a two-hundred-and-seventy-one-dollar speeding ticket.) The trouble with the S.U.V. ascendancy is that it excludes the really critical component of safety: the driver.

In psychology, there is a concept called learned helplessness, which arose from a series of animal experiments in the nineteen-sixties at the University of Pennsylvania. Dogs were restrained by a harness, so that they couldn't move, and then repeatedly subjected to a series of electrical shocks. Then the same dogs were shocked again, only this time they could easily escape by jumping over a low hurdle. But most of them didn't; they just huddled in the corner, no longer believing that there was anything they could do to influence their own fate. Learned helplessness is now thought to play a role in such phenomena as depression and the failure of battered women to leave their husbands, but one could easily apply it more widely. We live in an age, after all, that is strangely fixated on the idea of helplessness: we're fascinated by hurricanes and terrorist acts and epidemics like SARS—situations in which we feel powerless to affect our own destiny. In fact, the risks posed to life and limb by forces outside our control are dwarfed by the factors we can control. Our fixation with helplessness distorts our perceptions of risk. "When you feel safe, you can be passive," Rapaille says of the fundamental appeal of the S.U.V. "Safe means I can sleep. I can give up control. I can relax. I can take off my shoes. I can listen to music." For years, we've all made fun of the middle-aged man who suddenly trades in his sedate family sedan for a shiny red sports car. That's called a midlife crisis. But at least it involves some degree of engagement with the act of driving. The man who gives up his sedate family sedan for an S.U.V. is saying something far more troubling—that he finds the demands of the road to be overwhelming. Is acting out really worse than giving up?

5.

On August 9, 2000, the Bridgestone Firestone tire company announced one of the largest product recalls in American history. Because of mounting concerns about safety, the company said, it was replacing some fourteen million tires that had been used primarily on the Ford Explorer S.U.V. The cost of the recall—and of a follow-up replacement program initiated by Ford a year later—ran into billions of dollars. Millions more were spent by both companies on fighting and settling lawsuits from Explorer owners, who alleged that their tires had come apart and caused their S.U.V.s to roll over. In the fall of that year, senior executives from both companies were called to Capitol Hill, where they were publicly berated. It was the biggest scandal to hit the automobile industry in years. It was also one of the strangest. According to federal records, the number of fatalities resulting from the failure of a Firestone tire on a Ford Explorer S.U.V., as of September, 2001, was two hundred and seventy-one. That sounds like a lot, until you remember that the total number of tires supplied by Firestone to the Explorer from the moment the S.U.V. was introduced by Ford, in 1990, was fourteen million, and that the average life span of a tire is forty-five thousand miles. The allegation against Firestone amounts to the claim that its tires failed, with fatal results, two hundred and seventy-one times in the course of six hundred and thirty billion vehicle miles. Manufacturers usually win prizes for failure rates that low. It's also worth remembering that during that same ten-year span almost half a million Americans died in traffic accidents. In other words, during the nineteen-nineties hundreds of thousands of people were killed on the roads because they drove too fast or ran red lights or drank too much. And, of those, a fair proportion involved people in S.U.V.s who were lulled by their four-wheel drive into driving recklessly on slick roads, who drove aggressively because they felt invulnerable, who disproportionately killed those they hit because they chose to drive trucks with inflexible steel-frame architecture, and who crashed because they couldn't bring their five-thousand-pound vehicles to a halt in time. Yet, out of all those fatalities, regulators, the legal profession, Congress, and the media chose to highlight the roughly .05 per cent that could be linked to an alleged defect in the vehicle.

But should that come as a surprise? In the age of the S.U.V., this is what people worry about when they worry about safety—not risks, however commonplace, involving their own behavior but risks, however rare, involving some unexpected event. The Explorer was big and imposing. It was high above the ground. You could look down on other drivers. You could see if someone was lurking behind or beneath it. You could drive it up on someone's lawn with impunity. Didn't it seem like the safest vehicle in the world?

The Terrazzo Jungle

March 15, 2004
ANNALS OF COMMERCE

Fifty years ago, the mall was born.
America would never be the same.

1.

Victor Gruen was short, stout, and unstoppable, with a wild head of hair and eyebrows like unpruned hedgerows. According to a profile in Fortune (and people loved to profile Victor Gruen), he was a "torrential talker with eyes as bright as mica and a mind as fast as mercury." In the office, he was famous for keeping two or three secretaries working full time, as he moved from one to the next, dictating non-stop in his thick Viennese accent. He grew up in the well-to-do world of prewar Jewish Vienna, studying architecture at the Vienna Academy of Fine Arts—the same school that, a few years previously, had turned down a fledgling artist named Adolf Hitler. At night, he performed satirical cabaret theatre in smoke-filled cafés. He emigrated in 1938, the same week as Freud, when one of his theatre friends dressed up as a Nazi Storm Trooper and drove him and his wife to the airport. They took the first plane they could catch to Zurich, made their way to England, and then boarded the S.S. Statendam for New York, landing, as Gruen later remembered, "with an architect's degree, eight dollars, and no English." On the voyage over, he was told by an American to set his sights high—"don't try to wash dishes or be a waiter, we have millions of them"—but Gruen scarcely needed the advice. He got together with some other German émigrés and formed the Refugee Artists Group. George S. Kaufman's wife was their biggest fan. Richard Rodgers and Al Jolson gave them money. Irving Berlin helped them with their music. Gruen got on the train to Princeton and came back with a letter of recommendation from Albert Einstein. By the summer of 1939, the group was on Broadway, playing eleven weeks at the Music Box. Then, as M. Jeffrey Hardwick recounts in "Mall Maker," his new biography of Gruen, one day he went for a walk in midtown and ran into an old friend from Vienna, Ludwig Lederer, who wanted to open a leather-goods boutique on Fifth Avenue. Victor agreed to design it, and the result was a revolutionary storefront, with a kind of mini-arcade in the entranceway, roughly seventeen by fifteen feet: six exquisite glass cases, spotlights, and faux marble, with green corrugated glass on the ceiling. It was a "customer trap." This was a brand-new idea in American retail design, particularly on Fifth Avenue, where all the carriage-trade storefronts were flush with the street. The critics raved. Gruen designed Ciro's on Fifth Avenue, Steckler's on Broadway, Paris Decorators on the Bronx Concourse, and eleven branches of the California clothing chain Grayson's. In the early fifties, he designed an outdoor shopping center called Northland outside Detroit for J. L. Hudson's. It covered a hundred and sixty-three acres and had nearly ten thousand parking spaces. This was little more than a decade and a half since he stepped off the boat, and when Gruen watched the bulldozers break ground he turned to his partner and said, "My God but we've got a lot of nerve."

But Gruen's most famous creation was his next project, in the town of Edina, just outside Minneapolis. He began work on it almost exactly fifty years ago. It was called Southdale. It cost twenty million dollars, and had seventy-two stores and two anchor department-store tenants, Donaldson's and Dayton's. Until then, most shopping centers had been what architects like to call "extroverted," meaning that store windows and entrances faced both the parking area and the interior pedestrian walkways. Southdale was introverted: the exterior walls were blank, and all the activity was focussed on the inside. Suburban shopping centers had always been in the open, with stores connected by outdoor passageways. Gruen had the idea of putting the whole complex under one roof, with air-conditioning for the summer and heat for the winter. Almost every other major shopping center had been built on a single level, which made for punishingly long walks. Gruen put stores on two levels, connected by escalators and fed by two-tiered parking. In the middle he put a kind of town square, a "garden court" under a skylight, with a fishpond, enormous sculpted trees, a twenty-one-foot cage filled with bright-colored birds, balconies with hanging plants, and a café. The result, Hardwick writes, was a sensation:

Journalists from all of the country's top magazines came for the Minneapolis shopping center's opening. Life, Fortune, Time, Women's Wear Daily, the New York Times, Business Week and Newsweek all covered the event. The national and local press wore out superlatives attempting to capture the feeling of Southdale. "The Splashiest Center in the U.S.," Life sang. The glossy weekly praised the incongruous combination of a "goldfish pond, birds, art and 10 acres of stores all . . . under one Minnesota roof." A "pleasure-dome-with-parking," Time cheered. One journalist announced that overnight Southdale had become an integral "part of the American Way."

Southdale Mall still exists. It is situated off I-494, south of downtown Minneapolis and west of the airport—a big concrete box in a sea of parking. The anchor tenants are now J. C. Penney and Marshall Field's, and there is an Ann Taylor and a Sunglass Hut and a Foot Locker and just about every other chain store that you've ever seen in a mall. It does not seem like a historic building, which is precisely why it is one. Fifty years ago, Victor Gruen designed a fully enclosed, introverted, multitiered, double-anchor-tenant shopping complex with a garden court under a skylight—and today virtually every regional shopping center in America is a fully enclosed, introverted, multitiered, double-anchor-tenant complex with a garden court under a skylight. Victor Gruen didn't design a building; he designed an archetype. For a decade, he gave speeches about it and wrote books and met with one developer after another and waved his hands in the air excitedly, and over the past half century that archetype has been reproduced so faithfully on so many thousands of occasions that today virtually every suburban American goes shopping or wanders around or hangs out in a Southdale facsimile at least once or twice a month. Victor Gruen may well have been the most influential architect of the twentieth century. He invented the mall.

2.

One of Gruen's contemporaries in the early days of the mall was a man named A. Alfred Taubman, who also started out as a store designer. In 1950, when Taubman was still in his twenties, he borrowed five thousand dollars, founded his own development firm, and, three years later, put up a twenty-six-store open-air shopping center in Flint, Michigan. A few years after that, inspired by Gruen, he matched Southdale with an enclosed mall of his own in Hayward, California, and over the next half century Taubman put together what is widely considered one of the finest collections of shopping malls in the world. The average American mall has annual sales of around three hundred and forty dollars per square foot. Taubman's malls average sales close to five hundred dollars per square foot. If Victor Gruen invented the mall, Alfred Taubman perfected it. One day not long ago, I asked Taubman to take me to one of his shopping centers and explain whatever it was that first drew people like him and Victor Gruen to the enclosed mall fifty years ago.

Taubman, who just turned eighty, is an imposing man with a wry sense of humor who wears bespoke three-piece suits and peers down at the world through half-closed eyes. He is the sort of old-fashioned man who refers to merchandise as "goods" and apparel as "soft goods" and who can glance at a couture gown from halfway across the room and come within a few dollars of its price. Recently, Taubman's fortunes took a turn for the worse when Sotheby's, which he bought in 1983, ran afoul of antitrust laws and he ended up serving a year-long prison sentence on price-fixing charges. Then his company had to fend off a hostile takeover bid led by Taubman's archrival, the Indianapolis-based Simon Property Group. But, on a recent trip from his Manhattan offices to the Mall at Short Hills, a half hour's drive away in New Jersey, Taubman was in high spirits. Short Hills holds a special place in his heart. "When I bought that property in 1980, there were only seven stores that were still in business," Taubman said, sitting in the back of his limousine. "It was a disaster. It was done by a large commercial architect who didn't understand what he was doing." Turning it around took four renovations. Bonwit Teller and B. Altman—two of the original anchor tenants—were replaced by Neiman Marcus, Saks, Nordstrom, and Macy's. Today, Short Hills has average sales of nearly eight hundred dollars per square foot; according to the Greenberg Group, it is the third-most-successful covered mall in the country. When Taubman and I approached the mall, the first thing he did was peer out at the parking garage. It was just before noon on a rainy Thursday. The garage was almost full. "Look at all the cars!" he said, happily.

Taubman directed the driver to stop in front of Bloomingdale's, on the mall's north side. He walked through the short access corridor, paused, and pointed at the floor. It was made up of small stone tiles. "People used to use monolithic terrazzo in centers," he said. "But it cracked easily and was difficult to repair. Women, especially, tend to have thin soles. We found that they are very sensitive to the surface, and when they get on one of those terrazzo floors it's like a skating rink. They like to walk on the joints. The only direct contact you have with the building is through the floor. How you feel about it is very important." Then he looked up and pointed to the second floor of the mall. The handrails were transparent. "We don't want anything to disrupt the view," Taubman said. If you're walking on the first level, he explained, you have to be able, at all times, to have an unimpeded line of sight not just to the stores in front of you but also to the stores on the second level. The idea is to overcome what Taubman likes to call "threshold resistance," which is the physical and psychological barrier that stands between a shopper and the inside of a store. "You buy something because it is available and attractive," Taubman said. "You can't have any obstacles. The goods have to be all there." When Taubman was designing stores in Detroit, in the nineteen-forties, he realized that even the best arcades, like those Gruen designed on Fifth Avenue, weren't nearly as good at overcoming threshold resistance as an enclosed mall, because with an arcade you still had to get the customer through the door. "People assume we enclose the space because of air-conditioning and the weather, and that's important," Taubman said. "But the main reason is that it allows us to open up the store to the customer."

Taubman began making his way down the mall. He likes the main corridors of his shopping malls to be no more than a thousand feet long—the equivalent of about three city blocks—because he believes that three blocks is about as far as peak shopping interest can be sustained, and as he walked he explained the logic behind what retailers like to call "adjacencies." There was Brooks Brothers, where a man might buy a six-hundred-dollar suit, right across from Johnston & Murphy, where the same man might buy a two-hundred-dollar pair of shoes. The Bose electronics store was next to Brookstone and across from the Sharper Image, so if you got excited about some electronic gizmo in one store you were steps away from getting even more excited by similar gizmos in two other stores. Gucci, Versace, and Chanel were placed near the highest-end department stores, Neiman Marcus and Saks. "Lots of developers just rent out their space like you'd cut a salami," Taubman explained. "They rent the space based on whether it fits, not necessarily on whether it makes any sense." Taubman shook his head. He gestured to a Legal Sea Foods restaurant, where he wanted to stop for lunch. It was off the main mall, at the far end of a short entry hallway, and it was down there for a reason. A woman about to spend five thousand dollars at Versace doesn't want to catch a whiff of sautéed grouper as she tries on an evening gown. More to the point, people eat at Legal Sea Foods only during the lunch and dinner hours—which means that if you put the restaurant in the thick of things, you'd have a dead spot in the middle of your mall for most of the day.

At the far end of the mall is Neiman Marcus, and Taubman wandered in, exclaimed over a tray of men's ties, and delicately examined the stitching in the women's evening gowns in the designer department. "Hi, my name is Alfred Taubman—I'm your landlord," he said, bending over to greet a somewhat startled sales assistant. Taubman plainly loves Neiman Marcus, and with good reason: well-run department stores are the engines of malls. They have powerful brand names, advertise heavily, and carry extensive cosmetics lines (shopping malls are, at bottom, delivery systems for lipstick)—all of which generate enormous shopping traffic. The point of a mall—the reason so many stores are clustered together in one building—is to allow smaller, less powerful retailers to share in that traffic. A shopping center is an exercise in coöperative capitalism. It is considered successful (and the mall owner makes the most money) when the maximum number of department-store customers are lured into the mall.

Why, for instance, are so many malls, like Short Hills, two stories? Back at his office, on Fifth Avenue, Taubman took a piece of paper and drew a simple cross-section of a two-story building. "You have two levels, all right? You have an escalator here and an escalator here." He drew escalators at both ends of the floors. "The customer comes into the mall, walks down the hall, gets on the escalator up to the second level. Goes back along the second floor, down the escalator, and now she's back where she started from. She's seen every store in the center, right? Now you put on a third level. Is there any reason to go up there? No." A full circuit of a two-level mall takes you back to the beginning. It encourages you to circulate through the whole building. A full circuit of a three-level mall leaves you at the opposite end of the mall from your car. Taubman was the first to put a ring road around the mall—which he did at his mall in Hayward—for the same reason: if you want to get shoppers into every part of the building, they should be distributed to as many different entry points as possible. At Short Hills—and at most Taubman malls—the ring road rises gently as you drive around the building, so at least half of the mall entrances are on the second floor. "We put fifteen per cent more parking on the upper level than on the first level, because people flow like water," Taubman said. "They go down much easier than they go up. And we put our vertical transportation—the escalators—on the ends, so shoppers have to make the full loop."

This is the insight that drove the enthusiasm for the mall fifty years ago—that by putting everything under one roof, the retailer and the developer gained, for the first time, complete control over their environment. Taubman fusses about lighting, for instance: he believes that next to the skylights you have to put tiny lights that will go on when the natural light fades, so the dusk doesn't send an unwelcome signal to shoppers that it is time to go home; and you have to recess the skylights so that sunlight never reflects off the storefront glass, obscuring merchandise. Can you optimize lighting in a traditional downtown? The same goes for parking. Suppose that there was a downtown where the biggest draw was a major department store. Ideally, you ought to put the garage across the street and two blocks away, so shoppers, on their way from their cars and to their destination, would pass by the stores in between—dramatically increasing the traffic for all the intervening merchants. But in a downtown, obviously, you can't put a parking garage just anywhere, and even if you could, you couldn't insure that the stores in that high-traffic corridor had the optimal adjacencies, or that the sidewalk would feel right under the thin soles of women's shoes. And because the stores are arrayed along a road with cars on it, you don't really have a mall where customers can wander from side to side. And what happens when they get to the department store? It's four or five floors high, and shoppers are like water, remember: they flow downhill. So it's going to be hard to generate traffic on the upper levels. There is a tendency in America to wax nostalgic for the traditional downtown, but those who first believed in the mall—and understood its potential—found it hard to look at the old downtown with anything but frustration. "In Detroit, prior to the nineteen-fifties, the large department stores, like Hudson's, controlled everything, like zoning," Taubman said. "They were generous to local politicians. They had enormous clout, and that's why when Sears wanted to locate in downtown Detroit they were told they couldn't. So Sears put a store in Highland Park and on Oakland Boulevard, and built a store on the East Side, and it was able to get some other stores to come with them, and before long there were three mini-downtowns in the suburbs. They used to call them hot spots." This happened more than half a century ago. But it was clear that Taubman had never quite got over how irrational the world outside the mall could be: downtown Detroit chased away traffic.

3.

Planning and control were of even greater importance to Gruen. He was, after all, a socialist—and he was Viennese. In the middle of the nineteenth century, Vienna had demolished the walls and other fortifications that had ringed the city since medieval times, and in the resulting open space built a meticulously articulated addition to the old city. Architects and urban planners solemnly outlined their ideas. There were apartment blocks, and public squares and government buildings, and shopping arcades, each executed in what was thought to be the historically appropriate style. The Rathaus was done in high Gothic; the Burgtheater in early Baroque; the University was pure Renaissance; and the Parliament was classical Greek. It was all part of the official Viennese response to the populist uprisings of 1848: if Austria was to remake itself as a liberal democracy, Vienna had to be physically remade along democratic lines. The Parliament now faced directly onto the street. The walls that separated the élite of Vienna from the unwashed in the suburbs were torn down. And, most important, a ring road, or Ringstrasse—a grand mall—was built around the city, with wide sidewalks and expansive urban views, where Viennese of all backgrounds could mingle freely on their Sunday-afternoon stroll. To the Viennese reformers of the time, the quality of civic life was a function of the quality of the built environment, and Gruen thought that principle applied just as clearly to the American suburbs.

Not long after Southdale was built, Gruen gave the keynote address at a Progressive Architecture awards ceremony in New Orleans, and he took the occasion to lash out at American suburbia, whose roads, he said, were "avenues of horror," "flanked by the greatest collection of vulgarity—billboards, motels, gas stations, shanties, car lots, miscellaneous industrial equipment, hot dog stands, wayside stores—ever collected by mankind." American suburbia was chaos, and the only solution to chaos was planning. When Gruen first drew up the plans for Southdale, he placed the shopping center at the heart of a tidy four-hundred-and-sixty-three-acre development, complete with apartment buildings, houses, schools, a medical center, a park, and a lake. Southdale was not a suburban alternative to downtown Minneapolis. It was the Minneapolis downtown you would get if you started over and corrected all the mistakes that were made the first time around. "There is nothing suburban about Southdale except its location," Architectural Record stated when it reviewed Gruen's new creation:

It is an imaginative distillation of what makes downtown magnetic: the variety, the individuality, the lights, the color, even the crowds—for Southdale's pedestrian-scale spaces insure a busyness and a bustle.

Added to this essence of existing downtowns are all kinds of things that ought to be there if downtown weren't so noisy and dirty and chaotic—sidewalk cafés, art, islands of planting, pretty paving. Other shopping centers, however pleasant, seem provincial in contrast with the real thing—the city downtown. But in Minneapolis, it is the downtown that appears pokey and provincial in contrast with Southdale's metropolitan character.

One person who wasn't dazzled by Southdale was Frank Lloyd Wright. "What is this, a railroad station or a bus station?" he asked, when he came for a tour. "You've got a garden court that has all the evils of the village street and none of its charm." But no one much listened to Frank Lloyd Wright. When it came to malls, it was only Victor Gruen's vision that mattered.

4.

Victor Gruen's grand plan for Southdale was never realized. There were no parks or schools or apartment buildings—just that big box in a sea of parking. Nor, with a few exceptions, did anyone else plan the shopping mall as the centerpiece of a tidy, dense, multi-use development. Gruen was right about the transformative effect of the mall on retailing. But in thinking that he could reënact the lesson of the Ringstrasse in American suburbia he was wrong, and the reason was that in the mid-nineteen-fifties the economics of mall-building suddenly changed.

At the time of Southdale, big shopping centers were a delicate commercial proposition. One of the first big postwar shopping centers was Shopper's World, in Framingham, Massachusetts, designed by an old business partner of Gruen's from his Fifth Avenue storefront days. Shopper's World was an open center covering seventy acres, with forty-four stores, six thousand parking spaces, and a two-hundred-and-fifty-thousand-square-foot Jordan Marsh department store—and within two years of its opening, in 1951, the developer was bankrupt. A big shopping center simply cost too much money, and it took too long for a developer to make that money back. Gruen thought of the mall as the centerpiece of a carefully planned new downtown because he felt that that was the only way malls would ever get built: you planned because you had to plan. Then, in the mid-fifties, something happened that turned the dismal economics of the mall upside down: Congress made a radical change in the tax rules governing depreciation.

Under tax law, if you build an office building, or buy a piece of machinery for your factory, or make any capital purchase for your business, that investment is assumed to deteriorate and lose some part of its value from wear and tear every year. As a result, a business is allowed to set aside some of its income, tax-free, to pay for the eventual cost of replacing capital investments. For tax purposes, in the early fifties the useful life of a building was held to be forty years, so a developer could deduct one-fortieth of the value of his building from his income every year. A new forty-million-dollar mall, then, had an annual depreciation deduction of a million dollars. What Congress did in 1954, in an attempt to stimulate investment in manufacturing, was to "accelerate" the depreciation process for new construction. Now, using this and other tax loopholes, a mall developer could recoup the cost of his investment in a fraction of the time. As the historian Thomas Hanchett argues, in a groundbreaking paper in The American Historical Review, the result was a "bonanza" for developers. In the first few years after a shopping center was built, the depreciation deductions were so large that the mall was almost certainly losing money, at least on paper—which brought with it enormous tax benefits. For instance, in a front-page article in 1961 on the effect of the depreciation changes, the Wall Street Journal described the finances of a real-estate investment company called Kratter Corp. Kratter's revenue from its real-estate operations in 1960 was $9,997,043. Deductions from operating expenses and mortgage interest came to $4,836,671, which left a healthy income of $5.16 million. Then came the depreciation deduction, which amounted to $6.9 million, so now Kratter's healthy profit had been magically turned into a "loss" of $1.76 million. Imagine that you were one of five investors in Kratter. The company's policy was to distribute nearly all of its pre-depreciation income to its investors, so your share of the earnings would be roughly a million dollars. Ordinarily, you'd pay a good chunk of that in taxes. But that million dollars wasn't income. After depreciation, Kratter didn't make any money. That million dollars was "return on capital," and it was tax-free.
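A simplified sketch of that arithmetic, using the figures reported above. This is only the broad shape of the accounting, not the actual 1954 schedules, and the small gap between the computed figure and the reported $1.76-million loss comes from the rounded $6.9-million depreciation number.

def straight_line_deduction(building_cost, useful_life_years=40):
    # Pre-1954 treatment: deduct an equal slice of the building's cost each year.
    return building_cost / useful_life_years

print(straight_line_deduction(40_000_000))   # a forty-million-dollar mall -> 1,000,000.0 a year

# Kratter Corp., 1960, as reported in the Wall Street Journal:
revenue = 9_997_043
operating_and_interest = 4_836_671
depreciation = 6_900_000   # rounded in the article

pre_depreciation_income = revenue - operating_and_interest   # about $5.16 million
paper_result = pre_depreciation_income - depreciation        # a paper "loss"

print(f"Pre-depreciation income: ${pre_depreciation_income:,}")
print(f"After depreciation: ${paper_result:,}")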

Suddenly it was possible to make much more money investing in things like shopping centers than buying stocks, so money poured into real-estate investment companies. Prices rose dramatically. Investors were putting up buildings, taking out as much money from them as possible using accelerated depreciation, then selling them four or five years later at a huge profit—whereupon they built an even bigger building, because the more expensive the building was, the more the depreciation allowance was worth.

Under the circumstances, who cared whether the shopping center made economic sense for the venders? Shopping centers and strip malls became what urban planners call "catalytic," meaning that developers weren't building them to serve existing suburban communities; they were building them on the fringes of cities, beyond residential developments, where the land was cheapest. Hanchett points out, in fact, that in many cases the growth of malls appears to follow no demographic logic at all. Cortland, New York, for instance, barely grew at all between 1950 and 1970. Yet in those two decades Cortland gained six new shopping plazas, including the four-hundred-thousand-square-foot enclosed Cortlandville Mall. In the same twenty-year span, the Scranton area actually shrank by seventy-three thousand people while gaining thirty-one shopping centers, including three enclosed malls. In 1953, before accelerated depreciation was put in place, one major regional shopping center was built in the United States. Three years later, after the law was passed, that number was twenty-five. In 1953, new shopping-center construction of all kinds totalled six million square feet. By 1956, that figure had increased five hundred per cent. This was also the era that fast-food restaurants and Howard Johnsons and Holiday Inns and muffler shops and convenience stores began to multiply up and down the highways and boulevards of the American suburbs—and as these developments grew, others followed to share in the increased customer traffic. Malls led to malls, and in turn those malls led to the big stand-alone retailers like Wal-Mart and Target, and then the "power centers" of three or four big-box retailers, like Circuit City, Staples, Barnes & Noble. Victor Gruen intended Southdale to be a dense, self-contained downtown. Today, fifteen minutes down an "avenue of horror" from Southdale is the Mall of America, the largest mall in the country, with five hundred and twenty stores, fifty restaurants, and twelve thousand parking spaces—and one can easily imagine that one day it, too, may give way to something newer and bigger.

5.

Once, in the mid-fifties, Victor Gruen sat down with a writer from The New Yorker's Talk of the Town to give his thoughts on how to save New York City. The interview took place in Gruen's stylish offices on West Twelfth Street, in an old Stanford White building, and one can only imagine the reporter, rapt, as Gruen held forth, eyebrows bristling. First, Gruen said, Manhattan had to get rid of its warehouses and its light manufacturing. Then, all the surface traffic in midtown—the taxis, buses, and trucks—had to be directed into underground tunnels. He wanted to put superhighways around the perimeter of the island, buttressed by huge double-decker parking garages. The jumble of tenements and town houses and apartment blocks that make up Manhattan would be replaced by neat rows of hundred-and-fifty-story residential towers, arrayed along a ribbon of gardens, parks, walkways, theatres, and cafés.

Mr. G. lowered his brows and glared at us. "You are troubled by all those tunnels, are you not?" he inquired. "You wonder whether there is room for them in the present underground jungle of pipes and wires. Did you never think how absurd it is to bury beneath tons of solid pavement equipment that is bound to go on the blink from time to time?" He leaped from his chair and thrust an imaginary pneumatic drill against his polished study floor. "Rat-a-tat-tat!" he exclaimed. "Night and day! Tear up the streets! Then pave them! Then tear 'em up again!" Flinging aside the imaginary drill, he threw himself back in his chair. "In my New York of the future, all pipes and wires will be strung along the upper sides of those tunnels, above a catwalk, accessible to engineers and painted brilliant colors to delight rather than appall the eye."

Postwar America was an intellectually insecure place, and there was something intoxicating about Gruen's sophistication and confidence. That was what took him, so dramatically, from standing at New York Harbor with eight dollars in his pocket to Broadway, to Fifth Avenue, and to the heights of Northland and Southdale. He was a European intellectual, an émigré, and, in the popular mind, the European émigré represented vision, the gift of seeing something grand in the banality of postwar American life. When the European visionary confronted a drab and congested urban landscape, he didn't tinker and equivocate; he levelled warehouses and buried roadways and came up with a thrilling plan for making things right. "The chief means of travel will be walking," Gruen said, of his reimagined metropolis. "Nothing like walking for peace of mind." At Northland, he said, thousands of people would show up, even when the stores were closed, just to walk around. It was exactly like Sunday on the Ringstrasse. With the building of the mall, Old World Europe had come to suburban Detroit.

What Gruen had, as well, was an unshakable faith in the American marketplace. Malls teach us, he once said, that "it's the merchants who will save our urban civilization. 'Planning' isn't a dirty word to them; good planning means good business." He went on, "Sometimes self-interest has remarkable spiritual consequences." Gruen needed to believe this, as did so many European intellectuals from that period, dubbed by the historian Daniel Horowitz "celebratory émigrés." They had fled a place of chaos and anxiety, and in American consumer culture they sought a bulwark against the madness across the ocean. They wanted to find in the jumble of the American marketplace something as grand as the Vienna they had lost—the place where the unconscious was meticulously dissected by Dr. Freud on Berggasse, and where shrines to European civilization—to the Gothic, the Baroque, the Renaissance, and the ancient Greek traditions—were erected on the Ringstrasse. To Americans, nothing was more flattering than this. Who didn't want to believe that the act of levelling warehouses and burying roadways had spiritual consequences? But it was, in the end, too good to be true. This wasn't the way America worked at all.

A few months ago, Alfred Taubman gave a speech to a real-estate trade association in Detroit, about the prospects for the city's downtown, and one of the things he talked about was Victor Gruen's Northland. It was simply too big, Taubman said. Hudson's, the Northland anchor tenant, already had a flagship store in downtown Detroit. So why did Gruen build a six-hundred-thousand-square-foot satellite at Northland, just a twenty-minute drive away? Satellites were best at a hundred and fifty thousand to two hundred thousand square feet. But at six hundred thousand square feet they were large enough to carry every merchandise line that the flagship store carried, which meant no one had any reason to make the trek to the flagship anymore. Victor Gruen said the lesson of Northland was that the merchants would save urban civilization. He didn't appreciate that it made a lot more sense, for his client, to save civilization at a hundred and fifty thousand square feet than at six hundred thousand square feet. The lesson of America was that the grandest of visions could be derailed by the most banal of details, like the size of the retail footprint, or whether Congress set the depreciation allowance at forty years or twenty years.

When, late in life, Gruen came to realize this, it was a powerfully disillusioning experience. He revisited one of his old shopping centers, and saw all the sprawling development around it, and pronounced himself in "severe emotional shock." Malls, he said, had been disfigured by "the ugliness and discomfort of the land-wasting seas of parking" around them. Developers were interested only in profit. "I refuse to pay alimony for those bastard developments," he said in a speech in London, in 1978. He turned away from his adopted country. He had fixed up a country house outside of Vienna, and soon he moved back home for good. But what did he find when he got there? Just south of old Vienna, a mall had been built—in his anguished words, a "gigantic shopping machine." It was putting the beloved independent shopkeepers of Vienna out of business. It was crushing the life of his city. He was devastated. Victor Gruen invented the shopping mall in order to make America more like Vienna. He ended up making Vienna more like America.

The Ketchup Conundrum

September 6, 2004
TASTE TECHNOLOGIES

Mustard now comes in dozens of varieties.
Why has ketchup stayed the same?

1.

Many years ago, one mustard dominated the supermarket shelves: French's. It came in a plastic bottle. People used it on hot dogs and bologna. It was a yellow mustard, made from ground white mustard seed with turmeric and vinegar, which gave it a mild, slightly metallic taste. If you looked hard in the grocery store, you might find something in the specialty-foods section called Grey Poupon, which was Dijon mustard, made from the more pungent brown mustard seed. In the early seventies, Grey Poupon was no more than a hundred-thousand-dollar-a-year business. Few people knew what it was or how it tasted, or had any particular desire for an alternative to French's or the runner-up, Gulden's. Then one day the Heublein Company, which owned Grey Poupon, discovered something remarkable: if you gave people a mustard taste test, a significant number had only to try Grey Poupon once to switch from yellow mustard. In the food world that almost never happens; even among the most successful food brands, only about one in a hundred have that kind of conversion rate. Grey Poupon was magic.

So Heublein put Grey Poupon in a bigger glass jar, with an enamelled label and enough of a whiff of Frenchness to make it seem as if it were still being made in Europe (it was made in Hartford, Connecticut, from Canadian mustard seed and white wine). The company ran tasteful print ads in upscale food magazines. They put the mustard in little foil packets and distributed them with airplane meals—which was a brand-new idea at the time. Then they hired the Manhattan ad agency Lowe Marschalk to do something, on a modest budget, for television. The agency came back with an idea: A Rolls-Royce is driving down a country road. There's a man in the back seat in a suit with a plate of beef on a silver tray. He nods to the chauffeur, who opens the glove compartment. Then comes what is known in the business as the "reveal." The chauffeur hands back a jar of Grey Poupon. Another Rolls-Royce pulls up alongside. A man leans his head out the window. "Pardon me. Would you have any Grey Poupon?"

In the cities where the ads ran, sales of Grey Poupon leaped forty to fifty per cent, and whenever Heublein bought airtime in new cities sales jumped by forty to fifty per cent again. Grocery stores put Grey Poupon next to French's and Gulden's. By the end of the nineteen-eighties Grey Poupon was the most powerful brand in mustard. "The tagline in the commercial was that this was one of life's finer pleasures," Larry Elegant, who wrote the original Grey Poupon spot, says, "and that, along with the Rolls-Royce, seemed to impart to people's minds that this was something truly different and superior."

The rise of Grey Poupon proved that the American supermarket shopper was willing to pay more—in this case, $3.99 instead of $1.49 for eight ounces—as long as what they were buying carried with it an air of sophistication and complex aromatics. Its success showed, furthermore, that the boundaries of taste and custom were not fixed: that just because mustard had always been yellow didn't mean that consumers would use only yellow mustard. It is because of Grey Poupon that the standard American supermarket today has an entire mustard section. And it is because of Grey Poupon that a man named Jim Wigon decided, four years ago, to enter the ketchup business. Isn't the ketchup business today exactly where mustard was thirty years ago? There is Heinz and, far behind, Hunt's and Del Monte and a handful of private-label brands. Jim Wigon wanted to create the Grey Poupon of ketchup.

Wigon is from Boston. He's a thickset man in his early fifties, with a full salt-and-pepper beard. He runs his ketchup business—under the brand World's Best Ketchup—out of the catering business of his partner, Nick Schiarizzi, in Norwood, Massachusetts, just off Route 1, in a low-slung building behind an industrial-equipment-rental shop. He starts with red peppers, Spanish onions, garlic, and a high-end tomato paste. Basil is chopped by hand, because the buffalo chopper bruises the leaves. He uses maple syrup, not corn syrup, which gives him a quarter of the sugar of Heinz. He pours his ketchup into a clear glass ten-ounce jar, and sells it for three times the price of Heinz, and for the past few years he has crisscrossed the country, peddling World's Best in six flavors—regular, sweet, dill, garlic, caramelized onion, and basil—to specialty grocery stores and supermarkets. If you were in Zabar's on Manhattan's Upper West Side a few months ago, you would have seen him at the front of the store, in a spot between the sushi and the gefilte fish. He was wearing a World's Best baseball cap, a white shirt, and a red-stained apron. In front of him, on a small table, was a silver tureen filled with miniature chicken and beef meatballs, a box of toothpicks, and a dozen or so open jars of his ketchup. "Try my ketchup!" Wigon said, over and over, to anyone who passed. "If you don't try it, you're doomed to eat Heinz the rest of your life."

In the same aisle at Zabar's that day two other demonstrations were going on, so that people were starting at one end with free chicken sausage, sampling a slice of prosciutto, and then pausing at the World's Best stand before heading for the cash register. They would look down at the array of open jars, and Wigon would impale a meatball on a toothpick, dip it in one of his ketchups, and hand it to them with a flourish. The ratio of tomato solids to liquid in World's Best is much higher than in Heinz, and the maple syrup gives it an unmistakable sweet kick. Invariably, people would close their eyes, just for a moment, and do a subtle double take. Some of them would look slightly perplexed and walk away, and others would nod and pick up a jar. "You know why you like it so much?" he would say, in his broad Boston accent, to the customers who seemed most impressed. "Because you've been eating bad ketchup all your life!"

Jim Wigon had a simple vision: build a better ketchup—the way Grey Poupon built a better mustard—and the world will beat a path to your door. If only it were that easy.

2.

The story of World's Best Ketchup cannot properly be told without a man from White Plains, New York, named Howard Moskowitz. Moskowitz is sixty, short and round, with graying hair and huge gold-rimmed glasses. When he talks, he favors the Socratic monologue—a series of questions that he poses to himself, then answers, punctuated by "ahhh" and much vigorous nodding. He is a lineal descendant of the legendary eighteenth-century Hasidic rabbi known as the Seer of Lublin. He keeps a parrot. At Harvard, he wrote his doctoral dissertation on psychophysics, and all the rooms on the ground floor of his food-testing and market-research business are named after famous psychophysicists. ("Have you ever heard of the name Rose Marie Pangborn? Ahhh. She was a professor at Davis. Very famous. This is the Pangborn kitchen.") Moskowitz is a man of uncommon exuberance and persuasiveness: if he had been your freshman statistics professor, you would today be a statistician. "My favorite writer? Gibbon," he burst out, when we met not long ago. He had just been holding forth on the subject of sodium solutions. "Right now I'm working my way through the Hales history of the Byzantine Empire. Holy shit! Everything is easy until you get to the Byzantine Empire. It's impossible. One emperor is always killing the others, and everyone has five wives or three husbands. It's very Byzantine."

Moskowitz set up shop in the seventies, and one of his first clients was Pepsi. The artificial sweetener aspartame had just become available, and Pepsi wanted Moskowitz to figure out the perfect amount of sweetener for a can of Diet Pepsi. Pepsi knew that anything below eight per cent sweetness was not sweet enough and anything over twelve per cent was too sweet. So Moskowitz did the logical thing. He made up experimental batches of Diet Pepsi with every conceivable degree of sweetness—8 per cent, 8.25 per cent, 8.5, and on and on up to 12—gave them to hundreds of people, and looked for the concentration that people liked the most. But the data were a mess—there wasn't a pattern—and one day, sitting in a diner, Moskowitz realized why. They had been asking the wrong question. There was no such thing as the perfect Diet Pepsi. They should have been looking for the perfect Diet Pepsis.
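A small sketch of the statistical point, with invented numbers rather than Pepsi's data: if tasters' ideal sweetness levels cluster into distinct groups, the single "best average" concentration falls between the clusters and is nobody's actual favorite, which is why a search for one perfect Diet Pepsi produced data that looked like noise.

```python
# Invented preference data to illustrate the idea, not Moskowitz's actual results:
# two clusters of ideal sweetness, one near 8 per cent and one near 12 per cent.
from statistics import mean

ideal_sweetness = [8.0, 8.1, 8.2, 8.3, 11.6, 11.7, 11.8, 12.0]

best_single_level = mean(ideal_sweetness)
print(best_single_level)   # ~10.0 per cent: between the clusters, and nobody's preference
```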

It took a long time for the food world to catch up with Howard Moskowitz. He knocked on doors and tried to explain his idea about the plural nature of perfection, and no one answered. He spoke at food-industry conferences, and audiences shrugged. But he could think of nothing else. "It's like that Yiddish expression," he says. "Do you know it? To a worm in horseradish, the world is horseradish!" Then, in 1986, he got a call from the Campbell's Soup Company. They were in the spaghetti-sauce business, going up against Ragú with their Prego brand. Prego was a little thicker than Ragú, with diced tomatoes as opposed to Ragú's purée, and, Campbell's thought, had better pasta adherence. But, for all that, Prego was in a slump, and Campbell's was desperate for new ideas.

Standard practice in the food industry would have been to convene a focus group and ask spaghetti eaters what they wanted. But Moskowitz does not believe that consumers—even spaghetti lovers—know what they desire if what they desire does not yet exist. "The mind," as Moskowitz is fond of saying, "knows not what the tongue wants." Instead, working with the Campbell's kitchens, he came up with forty-five varieties of spaghetti sauce. These were designed to differ in every conceivable way: spiciness, sweetness, tartness, saltiness, thickness, aroma, mouth feel, cost of ingredients, and so forth. He had a trained panel of food tasters analyze each of those varieties in depth. Then he took the prototypes on the road—to New York, Chicago, Los Angeles, and Jacksonville—and asked people in groups of twenty-five to eat between eight and ten small bowls of different spaghetti sauces over two hours and rate them on a scale of one to a hundred. When Moskowitz charted the results, he saw that everyone had a slightly different definition of what a perfect spaghetti sauce tasted like. If you sifted carefully through the data, though, you could find patterns, and Moskowitz learned that most people's preferences fell into one of three broad groups: plain, spicy, and extra-chunky, and of those three the last was the most important. Why? Because at the time there was no extra-chunky spaghetti sauce in the supermarket. Over the next decade, that new category proved to be worth hundreds of millions of dollars to Prego. "We all said, 'Wow!' " Monica Wood, who was then the head of market research for Campbell's, recalls. "Here there was this third segment—people who liked their spaghetti sauce with lots of stuff in it—and it was completely untapped. So in about 1989-90 we launched Prego extra-chunky. It was extraordinarily successful."

It may be hard today, fifteen years later—when every brand seems to come in multiple varieties—to appreciate how much of a breakthrough this was. In those years, people in the food industry carried around in their heads the notion of a platonic dish—the version of a dish that looked and tasted absolutely right. At Ragú and Prego, they had been striving for the platonic spaghetti sauce, and the platonic spaghetti sauce was thin and blended because that's the way they thought it was done in Italy. Cooking, on the industrial level, was consumed with the search for human universals. Once you start looking for the sources of human variability, though, the old orthodoxy goes out the window. Howard Moskowitz stood up to the Platonists and said there are no universals.

Moskowitz still has a version of the computer model he used for Prego fifteen years ago. It has all the coded results from the consumer taste tests and the expert tastings, split into the three categories (plain, spicy, and extra-chunky) and linked up with the actual ingredients list on a spreadsheet. "You know how they have a computer model for building an aircraft," Moskowitz said as he pulled up the program on his computer. "This is a model for building spaghetti sauce. Look, every variable is here." He pointed at column after column of ratings. "So here are the ingredients. I'm a brand manager for Prego. I want to optimize one of the segments. Let's start with Segment 1." In Moskowitz's program, the three spaghetti-sauce groups were labelled Segment 1, Segment 2, and Segment 3. He typed in a few commands, instructing the computer to give him the formulation that would score the highest with those people in Segment 1. The answer appeared almost immediately: a specific recipe that, according to Moskowitz's data, produced a score of 78 from the people in Segment 1. But that same formulation didn't do nearly as well with those in Segment 2 and Segment 3. They scored it 67 and 57, respectively. Moskowitz started again, this time asking the computer to optimize for Segment 2. This time the ratings came in at 82, but now Segment 1 had fallen ten points, to 68. "See what happens?" he said. "If I make one group happier, I piss off another group. We did this for coffee with General Foods, and we found that if you create only one product the best you can get across all the segments is a 60—if you're lucky. That's if you were to treat everybody as one big happy family. But if I do the sensory segmentation, I can get 70, 71, 72. Is that big? Ahhh. It's a very big difference. In coffee, a 71 is something you'll die for."
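The paragraph above can be read as a simple optimization over a table of predicted scores. The sketch below uses invented numbers, not Moskowitz's model, but it mirrors the trade-off he describes: the formulation that maximizes one segment's score drags the other segments down, while tailoring a recipe to each segment beats any single compromise.

```python
# A toy illustration of the segmentation logic: ratings (0-100) per preference
# segment are hypothetical, chosen to echo the trade-off described in the text.

candidate_recipes = {
    "recipe_A": {"segment_1": 78, "segment_2": 67, "segment_3": 57},
    "recipe_B": {"segment_1": 68, "segment_2": 82, "segment_3": 61},
    "recipe_C": {"segment_1": 60, "segment_2": 63, "segment_3": 80},
    "recipe_D": {"segment_1": 64, "segment_2": 66, "segment_3": 63},  # middle-of-the-road
}

def best_for(segment):
    """Formulation with the highest predicted score for a single segment."""
    return max(candidate_recipes, key=lambda r: candidate_recipes[r][segment])

def one_size_fits_all():
    """The single formulation with the best average score across all segments."""
    return max(candidate_recipes,
               key=lambda r: sum(candidate_recipes[r].values()) / len(candidate_recipes[r]))

for seg in ("segment_1", "segment_2", "segment_3"):
    winner = best_for(seg)
    print(seg, "->", winner, candidate_recipes[winner])

single = one_size_fits_all()
print("single product for everyone ->", single, candidate_recipes[single])
# Tailoring a recipe to each segment gives every group its highest score (78, 82, 80);
# the best single compromise (recipe_B here) leaves Segments 1 and 3 well below that.
```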

When Jim Wigon set up shop that day in Zabar's, then, his operating assumption was that there ought to be some segment of the population that preferred a ketchup made with Stanislaus tomato paste and hand-chopped basil and maple syrup. That's the Moskowitz theory. But there is theory and there is practice. By the end of that long day, Wigon had sold ninety jars. But he'd also got two parking tickets and had to pay for a hotel room, so he wasn't going home with money in his pocket. For the year, Wigon estimates, he'll sell fifty thousand jars—which, in the universe of condiments, is no more than a blip. "I haven't drawn a paycheck in five years," Wigon said as he impaled another meatball on a toothpick. "My wife is killing me." And it isn't just World's Best that is struggling. In the gourmet-ketchup world, there is River Run and Uncle Dave's, from Vermont, and Muir Glen Organic and Mrs. Tomato Head Roasted Garlic Peppercorn Catsup, in California, and dozens of others—and every year Heinz's overwhelming share of the ketchup market just grows.

It is possible, of course, that ketchup is waiting for its own version of that Rolls-Royce commercial, or the discovery of the ketchup equivalent of extra-chunky—the magic formula that will satisfy an unmet need. It is also possible, however, that the rules of Howard Moskowitz, which apply to Grey Poupon and Prego spaghetti sauce and to olive oil and salad dressing and virtually everything else in the supermarket, don't apply to ketchup.

3.

Tomato ketchup is a nineteenth-century creation—the union of the English tradition of fruit and vegetable sauces and the growing American infatuation with the tomato. But what we know today as ketchup emerged out of a debate that raged in the first years of the last century over benzoate, a preservative widely used in late-nineteenth-century condiments. Harvey Washington Wiley, the chief of the Bureau of Chemistry in the Department of Agriculture from 1883 to 1912, came to believe that benzoates were not safe, and the result was an argument that split the ketchup world in half. On one side was the ketchup establishment, which believed that it was impossible to make ketchup without benzoate and that benzoate was not harmful in the amounts used. On the other side was a renegade band of ketchup manufacturers, who believed that the preservative puzzle could be solved with the application of culinary science. The dominant nineteenth-century ketchups were thin and watery, in part because they were made from unripe tomatoes, which are low in the complex carbohydrates known as pectin, which add body to a sauce. But what if you made ketchup from ripe tomatoes, giving it the density it needed to resist degradation? Nineteenth-century ketchups had a strong tomato taste, with just a light vinegar touch. The renegades argued that by greatly increasing the amount of vinegar, in effect protecting the tomatoes by pickling them, they were making a superior ketchup: safer, purer, and better tasting. They offered a money-back guarantee in the event of spoilage. They charged more for their product, convinced that the public would pay more for a better ketchup, and they were right. The benzoate ketchups disappeared. The leader of the renegade band was an entrepreneur out of Pittsburgh named Henry J. Heinz.

The world's leading expert on ketchup's early years is Andrew F. Smith, a substantial man, well over six feet, with a graying mustache and short wavy black hair. Smith is a scholar, trained as a political scientist, intent on bringing rigor to the world of food. When we met for lunch not long ago at the restaurant Savoy in SoHo (chosen because of the excellence of its hamburger and French fries, and because Savoy makes its own ketchup—a dark, peppery, and viscous variety served in a white porcelain saucer), Smith was in the throes of examining the origins of the croissant for the upcoming "Oxford Encyclopedia of Food and Drink in America," of which he is the editor-in-chief. Was the croissant invented in 1683, by the Viennese, in celebration of their defeat of the invading Turks? Or in 1686, by the residents of Budapest, to celebrate their defeat of the Turks? Either story would account for its distinctive crescent shape—since it would make a certain cultural sense (particularly for the Viennese) to consecrate their battlefield triumphs in the form of pastry. But the only reference Smith could find to either story was in the Larousse Gastronomique of 1938. "It just doesn't check out," he said, shaking his head wearily.

Smith's specialty is the tomato, however, and over the course of many scholarly articles and books—"The History of Home-Made Anglo-American Tomato Ketchup," for Petits Propos Culinaires, for example, and "The Great Tomato Pill War of the 1830's," for The Connecticut Historical Society Bulletin—Smith has argued that some critical portion of the history of culinary civilization could be told through this fruit. Cortez brought tomatoes to Europe from the New World, and they inexorably insinuated themselves into the world's cuisines. The Italians substituted the tomato for eggplant. In northern India, it went into curries and chutneys. "The biggest tomato producer in the world today?" Smith paused, for dramatic effect. "China. You don't think of tomato being a part of Chinese cuisine, and it wasn't ten years ago. But it is now." Smith dipped one of my French fries into the homemade sauce. "It has that raw taste," he said, with a look of intense concentration. "It's fresh ketchup. You can taste the tomato." Ketchup was, to his mind, the most nearly perfect of all the tomato's manifestations. It was inexpensive, which meant that it had a firm lock on the mass market, and it was a condiment, not an ingredient, which meant that it could be applied at the discretion of the food eater, not the food preparer. "There's a quote from Elizabeth Rozin I've always loved," he said. Rozin is the food theorist who wrote the essay "Ketchup and the Collective Unconscious," and Smith used her conclusion as the epigraph of his ketchup book: ketchup may well be "the only true culinary expression of the melting pot, and . . . its special and unprecedented ability to provide something for everyone makes it the Esperanto of cuisine." Here is where Henry Heinz and the benzoate battle were so important: in defeating the condiment Old Guard, he was the one who changed the flavor of ketchup in a way that made it universal.

4.

There are five known fundamental tastes in the human palate: salty, sweet, sour, bitter, and umami. Umami is the proteiny, full-bodied taste of chicken soup, or cured meat, or fish stock, or aged cheese, or mother's milk, or soy sauce, or mushrooms, or seaweed, or cooked tomato. "Umami adds body," Gary Beauchamp, who heads the Monell Chemical Senses Center, in Philadelphia, says. "If you add it to a soup, it makes the soup seem like it's thicker—it gives it sensory heft. It turns a soup from salt water into a food." When Heinz moved to ripe tomatoes and increased the percentage of tomato solids, he made ketchup, first and foremost, a potent source of umami. Then he dramatically increased the concentration of vinegar, so that his ketchup had twice the acidity of most other ketchups; now ketchup was sour, another of the fundamental tastes. The post-benzoate ketchups also doubled the concentration of sugar—so now ketchup was also sweet—and all along ketchup had been salty and bitter. These are not trivial issues. Give a baby soup, and then soup with MSG (an amino-acid salt that is pure umami), and the baby will go back for the MSG soup every time, the same way a baby will always prefer water with sugar to water alone. Salt and sugar and umami are primal signals about the food we are eating—about how dense it is in calories, for example, or, in the case of umami, about the presence of proteins and amino acids. What Heinz had done was come up with a condiment that pushed all five of these primal buttons. The taste of Heinz's ketchup began at the tip of the tongue, where our receptors for sweet and salty first appear, moved along the sides, where sour notes seem the strongest, then hit the back of the tongue, for umami and bitter, in one long crescendo. How many things in the supermarket run the sensory spectrum like this?

A number of years ago, the H. J. Heinz Company did an extensive market-research project in which researchers went into people's homes and watched the way they used ketchup. "I remember sitting in one of those households," Casey Keller, who was until recently the chief growth officer for Heinz, says. "There was a three-year-old and a six-year-old, and what happened was that the kids asked for ketchup and Mom brought it out. It was a forty-ounce bottle. And the three-year-old went to grab it himself, and Mom intercepted the bottle and said, 'No, you're not going to do that.' She physically took the bottle away and doled out a little dollop. You could see that the whole thing was a bummer." For Heinz, Keller says, that moment was an epiphany. A typical five-year-old consumes about sixty per cent more ketchup than a typical forty-year-old, and the company realized that it needed to put ketchup in a bottle that a toddler could control. "If you are four—and I have a four-year-old—he doesn't get to choose what he eats for dinner, in most cases," Keller says. "But the one thing he can control is ketchup. It's the one part of the food experience that he can customize and personalize." As a result, Heinz came out with the so-called EZ Squirt bottle, made out of soft plastic with a conical nozzle. In homes where the EZ Squirt is used, ketchup consumption has grown by as much as twelve per cent.

There is another lesson in that household scene, though. Small children tend to be neophobic: once they hit two or three, they shrink from new tastes. That makes sense, evolutionarily, because through much of human history that is the age at which children would have first begun to gather and forage for themselves, and those who strayed from what was known and trusted would never have survived. There the three-year-old was, confronted with something strange on his plate—tuna fish, perhaps, or Brussels sprouts—and he wanted to alter his food in some way that made the unfamiliar familiar. He wanted to subdue the contents of his plate. And so he turned to ketchup, because, alone among the condiments on the table, ketchup could deliver sweet and sour and salty and bitter and umami, all at once.

5.

Last February, Edgar Chambers IV, who runs the sensory-analysis center at Kansas State University, conducted a joint assessment of World's Best and Heinz. He has seventeen trained tasters on his staff, and they work for academia and industry, answering the often difficult question of what a given substance tastes like. It is demanding work. Immediately after conducting the ketchup study, Chambers dispatched a team to Bangkok to do an analysis of fruit—bananas, mangoes, rose apples, and sweet tamarind. Others were detailed to soy and kimchi in South Korea, and Chambers's wife led a delegation to Italy to analyze ice cream.

The ketchup tasting took place over four hours, on two consecutive mornings. Six tasters sat around a large, round table with a lazy Susan in the middle. In front of each panelist were two one-ounce cups, one filled with Heinz ketchup and one filled with World's Best. They would work along fourteen dimensions of flavor and texture, in accordance with the standard fifteen-point scale used by the food world. The flavor components would be divided two ways: elements picked up by the tongue and elements picked up by the nose. A very ripe peach, for example, tastes sweet but it also smells sweet—which is a very different aspect of sweetness. Vinegar has a sour taste but also a pungency, a vapor that rises up the back of the nose and fills the mouth when you breathe out. To aid in the rating process, the tasters surrounded themselves with little bowls of sweet and sour and salty solutions, and portions of Contadina tomato paste, Hunt's tomato sauce, and Campbell's tomato juice, all of which represent different concentrations of tomato-ness.

After breaking the ketchup down into its component parts, the testers assessed the critical dimension of "amplitude," the word sensory experts use to describe flavors that are well blended and balanced, that "bloom" in the mouth. "The difference between high and low amplitude is the difference between my son and a great pianist playing 'Ode to Joy' on the piano," Chambers says. "They are playing the same notes, but they blend better with the great pianist." Pepperidge Farm shortbread cookies are considered to have high amplitude. So are Hellman's mayonnaise and Sara Lee poundcake. When something is high in amplitude, all its constituent elements converge into a single gestalt. You can't isolate the elements of an iconic, high-amplitude flavor like Coca-Cola or Pepsi. But you can with one of those private-label colas that you get in the supermarket. "The thing about Coke and Pepsi is that they are absolutely gorgeous," Judy Heylmun, a vice-president of Sensory Spectrum, Inc., in Chatham, New Jersey, says. "They have beautiful notes—all flavors are in balance. It's very hard to do that well. Usually, when you taste a store cola it's"— and here she made a series of pik! pik! pik! sounds—"all the notes are kind of spiky, and usually the citrus is the first thing to spike out. And then the cinnamon. Citrus and brown spice notes are top notes and very volatile, as opposed to vanilla, which is very dark and deep. A really cheap store brand will have a big, fat cinnamon note sitting on top of everything."

Some of the cheaper ketchups are the same way. Ketchup aficionados say that there's a disquieting unevenness to the tomato notes in Del Monte ketchup: Tomatoes vary, in acidity and sweetness and the ratio of solids to liquid, according to the seed variety used, the time of year they are harvested, the soil in which they are grown, and the weather during the growing season. Unless all those variables are tightly controlled, one batch of ketchup can end up too watery and another can be too strong. Or try one of the numerous private-label brands that make up the bottom of the ketchup market and pay attention to the spice mix; you may well find yourself conscious of the clove note or overwhelmed by a hit of garlic. Generic colas and ketchups have what Moskowitz calls a hook—a sensory attribute that you can single out, and ultimately tire of.

The tasting began with a plastic spoon. Upon consideration, it was decided that the analysis would be helped if the ketchups were tasted on French fries, so a batch of fries were cooked up, and distributed around the table. Each tester, according to protocol, took the fries one by one, dipped them into the cup—all the way, right to the bottom—bit off the portion covered in ketchup, and then contemplated the evidence of their senses. For Heinz, the critical flavor components—vinegar, salt, tomato I.D. (over-all tomato-ness), sweet, and bitter—were judged to be present in roughly equal concentrations, and those elements, in turn, were judged to be well blended. The World's Best, though, "had a completely different view, a different profile, from the Heinz," Chambers said. It had a much stronger hit of sweet aromatics—4.0 to 2.5—and outstripped Heinz on tomato I.D. by a resounding 9 to 5.5. But there was less salt, and no discernible vinegar. "The other comment from the panel was that these elements were really not blended at all," Chambers went on. "The World's Best product had really low amplitude." According to Joyce Buchholz, one of the panelists, when the group judged aftertaste, "it seemed like a certain flavor would hang over longer in the case of World's Best—that cooked-tomatoey flavor."

But what was Jim Wigon to do? To compete against Heinz, he had to try something dramatic, like substituting maple syrup for corn syrup, ramping up the tomato solids. That made for an unusual and daring flavor. World's Best Dill ketchup on fried catfish, for instance, is a marvellous thing. But it also meant that his ketchup wasn't as sensorily complete as Heinz, and he was paying a heavy price in amplitude. "Our conclusion was mainly this," Buchholz said. "We felt that World's Best seemed to be more like a sauce." She was trying to be helpful.

There is an exception, then, to the Moskowitz rule. Today there are thirty-six varieties of Ragú spaghetti sauce, under six rubrics—Old World Style, Chunky Garden Style, Robusto, Light, Cheese Creations, and Rich & Meaty—which means that there is very nearly an optimal spaghetti sauce for every man, woman, and child in America. Measured against the monotony that confronted Howard Moskowitz twenty years ago, this is progress. Happiness, in one sense, is a function of how closely our world conforms to the infinite variety of human preference. But that makes it easy to forget that sometimes happiness can be found in having what we've always had and everyone else is having. "Back in the seventies, someone else—I think it was Ragú—tried to do an 'Italian'-style ketchup," Moskowitz said. "They failed miserably." It was a conundrum: what was true about a yellow condiment that went on hot dogs was not true about a tomato condiment that went on hamburgers, and what was true about tomato sauce when you added visible solids and put it in a jar was somehow not true about tomato sauce when you added vinegar and sugar and put it in a bottle. Moskowitz shrugged. "I guess ketchup is ketchup."

harsh winds and saltwater spray of the North Atlantic Ocean, and as the Norse sailed upriver they saw grassy slopes flowering with buttercups, dandelions, and bluebells, and thick forests of willow and birch and alder. Two colonies were formed, three hundred miles apart, known as the Eastern and Western Settlements. The Norse raised sheep, goats, and cattle. They turned the grassy slopes into pastureland. They hunted seal and caribou. They built a string of parish churches and a magnificent cathedral, the remains of which are still standing. They traded actively with mainland Europe, and tithed regularly to the Roman Catholic Church. The Norse colonies in Greenland were law-abiding, economically viable, fully integrated communities, numbering at their peak five thousand people. They lasted for four hundred and fifty years—and then they vanished.

The story of the Eastern and Western Settlements of Greenland is told in Jared Diamond's "Collapse: How Societies Choose to Fail or Succeed" (Viking; $29.95). Diamond teaches geography at U.C.L.A. and is well known for his best-seller "Guns, Germs, and Steel," which won a Pulitzer Prize. In "Guns, Germs, and Steel," Diamond looked at environmental and structural factors to explain why Western societies came to dominate the world. In "Collapse," he continues that approach, only this time he looks at history's losers—like the Easter Islanders, the Anasazi of the American Southwest, the Mayans, and the modern-day Rwandans. We live in an era preoccupied with the way that ideology and culture and politics and economics help shape the course of history. But Diamond isn't particularly interested in any of those things—or, at least, he's interested in them only insofar as they bear on what to him is the far more important question, which is a society's relationship to its climate and geography and resources and neighbors. "Collapse" is a book about the most prosaic elements of the earth's ecosystem—soil, trees, and water—because societies fail, in Diamond's view, when they mismanage those environmental factors.

There was nothing wrong with the social organization of the Greenland settlements. The Norse built a functioning reproduction of the predominant northern-European civic model of the time—devout, structured, and reasonably orderly. In 1408, right before the end, records from the Eastern Settlement dutifully report that Thorstein Olafsson married Sigrid Bjornsdotter in Hvalsey Church on September 14th of that year, with Brand Halldorstson, Thord Jorundarson, Thorbjorn Bardarson, and Jon Jonsson as witnesses, following the proclamation of the wedding banns on three consecutive Sundays.

The problem with the settlements, Diamond argues, was that the Norse thought that Greenland really was green; they treated it as if it were the verdant farmland of southern Norway. They cleared the land to create meadows for their cows, and to grow hay to feed their livestock through the long winter. They chopped down the forests for fuel, and for the construction of wooden objects. To make houses warm enough for the winter, they built their homes out of six-foot-thick slabs of turf, which meant that a typical home consumed about ten acres of grassland.

But Greenland's ecosystem was too fragile to withstand that kind of pressure. The short, cool growing season meant that plants developed slowly, which in turn meant that topsoil layers were shallow and lacking in soil constituents, like organic humus and clay, that hold moisture and keep soil resilient in the face of strong winds. "The sequence of soil erosion in Greenland begins with cutting or burning the cover of trees and shrubs, which are more effective at holding soil than is grass," he writes. "With the trees and shrubs gone, livestock, especially sheep and goats, graze down the grass, which regenerates only slowly in Greenland's climate. Once the grass cover is broken and the soil is exposed, soil is carried away especially by the strong winds, and also by pounding from occasionally heavy rains, to the point where the topsoil can be removed for a distance of miles from an entire valley." Without adequate pastureland, the summer hay yields shrank; without adequate supplies of hay, keeping livestock through the long winter got harder. And, without adequate supplies of wood, getting fuel for the winter became increasingly difficult.

The Norse needed to reduce their reliance on livestock—particularly cows, which consumed an enormous amount of agricultural resources. But cows were a sign of high status; to northern Europeans, beef was a prized food. They needed to copy the Inuit practice of burning seal blubber for heat and light in the winter, and to learn from the Inuit the difficult art of hunting ringed seals, which were the most reliably plentiful source of food available in the winter. But the Norse had contempt for the Inuit—they called them skraelings, "wretches"—and preferred to practice their own brand of European agriculture. In the summer, when the Norse should have been sending ships on lumber-gathering missions to Labrador, in order to relieve the pressure on their own forestlands, they instead sent boats and men to the coast to hunt for walrus. Walrus tusks, after all, had great trade value. In return for those tusks, the Norse were able to acquire, among other things, church bells, stained-glass windows, bronze candlesticks, Communion wine, linen, silk, silver, churchmen's robes, and jewelry to adorn their massive cathedral at Gardar, with its three-ton sandstone building blocks and eighty-foot bell tower. In the end, the Norse starved to death.

2.

Diamond's argument stands in sharp contrast to the conventional explanations for a society's collapse. Usually, we look for some kind of cataclysmic event. The aboriginal civilization of the Americas was decimated by the sudden arrival of smallpox. European Jewry was destroyed by Nazism. Similarly, the disappearance of the Norse settlements is usually blamed on the Little Ice Age, which descended on Greenland in the early fourteen-hundreds, ending several centuries of relative warmth. (One archeologist refers to this as the "It got too cold, and they died" argument.) What all these explanations have in common is the idea that civilizations are destroyed by forces outside their control, by acts of God.

But look, Diamond says, at Easter Island. Once, it was home to a thriving culture that produced the enormous stone statues that continue to inspire awe. It was home to dozens of species of trees, which created and protected an ecosystem fertile enough to support as many as thirty thousand people. Today, it's a barren and largely empty outcropping of volcanic rock. What happened? Did a rare plant virus wipe out the island's forest cover? Not at all. The Easter Islanders chopped their trees down, one by one, until they were all gone. "I have often asked myself, 'What did the Easter Islander who cut down the last palm tree say while he was doing it?'" Diamond writes, and that, of course, is what is so troubling about the conclusions of "Collapse." Those trees were felled by rational actors—who must have suspected that the destruction of this resource would result in the destruction of their civilization. The lesson of "Collapse" is that societies, as often as not, aren't murdered. They commit suicide: they slit their wrists and then, in the course of many decades, stand by passively and watch themselves bleed to death.

This doesn't mean that acts of God don't play a role. It did get colder in Greenland in the early fourteen-hundreds. But it didn't get so cold that the island became uninhabitable. The Inuit survived long after the Norse died out, and the Norse had all kinds of advantages, including a more diverse food supply, iron tools, and ready access to Europe. The problem was that the Norse simply couldn't adapt to the country's changing environmental conditions. Diamond writes, for instance, of the fact that nobody can find fish remains in Norse archeological sites. One scientist sifted through tons of debris from the Vatnahverfi farm and found only three fish bones; another researcher analyzed thirty-five thousand bones from the garbage of another Norse farm and found two fish bones. How can this be? Greenland is a fisherman's dream: Diamond describes running into a Danish tourist in Greenland who had just caught two Arctic char in a shallow pool with her bare hands. "Every archaeologist who comes to excavate in Greenland . . . starts out with his or her own idea about where all those missing fish bones might be hiding," he writes. "Could the Norse have strictly confined their munching on fish to within a few feet of the shoreline, at sites now underwater because of land subsidence? Could they have faithfully saved all their fish bones for fertilizer, fuel, or feeding to cows?" It seems unlikely. There are no fish bones in Norse archeological remains, Diamond concludes, for the simple reason that the Norse didn't eat fish. For one reason or another, they had a cultural taboo against it.

Given the difficulty that the Norse had in putting food on the table, this was insane. Eating fish would have substantially reduced the ecological demands of the Norse settlements. The Norse would have needed fewer livestock and less pastureland. Fishing is not nearly as labor-intensive as raising cattle or hunting caribou, so eating fish would have freed time and energy for other activities. It would have diversified their diet.

Why did the Norse choose not to eat fish? Because they weren't thinking about their biological survival. They were thinking about their cultural survival. Food taboos are one of the idiosyncrasies that define a community. Not eating fish served the same function as building lavish churches, and doggedly replicating the untenable agricultural practices of their land of origin. It was part of what it meant to be Norse, and if you are going to establish a community in a harsh and forbidding environment all those little idiosyncrasies which define and cement a culture are of paramount importance. "The Norse were undone by the same social glue that had enabled them to master Greenland's difficulties," Diamond writes. "The values to which people cling most stubbornly under inappropriate conditions are those values that were previously the source of their greatest triumphs over adversity." He goes on:

To us in our secular modern society, the predicament in which the Greenlanders found themselves is difficult to fathom. To them, however, concerned with their social survival as much as their biological survival, it was out of the question to invest less in churches, to imitate or intermarry with the Inuit, and thereby to face an eternity in Hell just in order to survive another winter on Earth.

Diamond's distinction between social and biological survival is a critical one, because too often we blur the two, or assume that biological survival is contingent on the strength of our civilizational values. That was the lesson taken from the two world wars and the nuclear age that followed: we would survive as a species only if we learned to get along and resolve our disputes peacefully. The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.

Diamond points out that the Easter Islanders did not practice, so far as we know, a uniquely pathological version of South Pacific culture. Other societies, on other islands in the Hawaiian archipelago, chopped down trees and farmed and raised livestock just as the Easter Islanders did. What doomed the Easter Islanders was the interaction between what they did and where they were. Diamond and a colleague, Barry Rollet, identified nine physical factors that contributed to the likelihood of deforestation—including latitude, average rainfall, aerial-ash fallout, proximity to Central Asia's dust plume, size, and so on—and Easter Island ranked at the high-risk end of nearly every variable. "The reason for Easter's unusually severe degree of deforestation isn't that those seemingly nice people really were unusually bad or improvident," he concludes. "Instead, they had the misfortune to be living in one of the most fragile environments, at the highest risk for deforestation, of any Pacific people." The problem wasn't the Easter Islanders. It was Easter Island.

In the second half of "Collapse," Diamond turns his attention to modern examples, and one of his case studies is the recent genocide in Rwanda. What happened in Rwanda is commonly described as an ethnic struggle between the majority Hutu and the historically dominant, wealthier Tutsi, and it is understood in those terms because that is how we have come to explain much of modern conflict: Serb and Croat, Jew and Arab, Muslim and Christian. The world is a cauldron of cultural antagonism. It's an explanation that clearly exasperates Diamond. The Hutu didn't just kill the Tutsi, he points out. The Hutu also killed other Hutu. Why? Look at the land: steep hills farmed right up to the crests, without any protective terracing; rivers thick with mud from erosion; extreme deforestation leading to irregular rainfall and famine; staggeringly high population densities; the exhaustion of the topsoil; falling per-capita food production. This was a society on the brink of ecological disaster, and if there is anything that is clear from the study of such societies it is that they inevitably descend into genocidal chaos. In "Collapse," Diamond quite convincingly defends himself against the charge of environmental determinism. His discussions are always nuanced, and he gives political and ideological factors their due. The real issue is how, in coming to terms with the uncertainties and hostilities of the world, the rest of us have turned ourselves into cultural determinists.

3.

For the past thirty years, Oregon has had one of the strictest sets of land-use regulations in the nation, requiring new development to be clustered in and around existing urban development. The laws meant that Oregon has done perhaps the best job in the nation in limiting suburban sprawl, and protecting coastal lands and estuaries. But this November Oregon's voters passed a ballot referendum, known as Measure 37, that rolled back many of those protections. Specifically, Measure 37 said that anyone who could show that the value of his land was affected by regulations implemented since its purchase was entitled to compensation from the state. If the state declined to pay, the property owner would be exempted from the regulations.

To call Measure 37—and similar referendums that have been passed recently in other states—intellectually incoherent is to put it mildly. It might be that the reason your hundred-acre farm on a pristine hillside is worth millions to a developer is that it's on a pristine hillside: if everyone on that hillside could subdivide, and sell out to Target and Wal-Mart, then nobody's plot would be worth millions anymore. Will the voters of Oregon then pass Measure 38, allowing them to sue the state for compensation over damage to property values caused by Measure 37?

It is hard to read "Collapse," though, and not have an additional reaction to Measure 37. Supporters of the law spoke entirely in the language of political ideology. To them, the measure was a defense of property rights, preventing the state from unconstitutional "takings." If you replaced the term "property rights" with "First Amendment rights," this would have been indistinguishable from an argument over, say, whether charitable groups ought to be able to canvass in malls, or whether cities can control the advertising they sell on the sides of public buses. As a society, we do a very good job with these kinds of debates: we give everyone a hearing, and pass laws, and make compromises, and square our conclusions with our constitutional heritage—and in the Oregon debate the quality of the theoretical argument was impressively high.

The thing that got lost in the debate, however, was the land. In a rapidly growing state like Oregon, what, precisely, are the state's ecological strengths and vulnerabilities? What impact will changed land-use priorities have on water and soil and cropland and forest? One can imagine Diamond writing about the Measure 37 debate, and he wouldn't be very impressed by how seriously Oregonians wrestled with the problem of squaring their land-use rules with their values, because to him a society's environmental birthright is not best discussed in those terms. Rivers and streams and forests and soil are a biological resource. They are a tangible, finite thing, and societies collapse when they get so consumed with addressing the fine points of their history and culture and deeply held beliefs—with making sure that Thorstein Olafsson and Sigrid Bjornsdotter are married before the right number of witnesses following the announcement of wedding banns on the right number of Sundays—that they forget that the pastureland is shrinking and the forest cover is gone.

When archeologists looked through the ruins of the Western Settlement, they found plenty of the big wooden objects that were so valuable in Greenland—crucifixes, bowls, furniture, doors, roof timbers—which meant that the end came too quickly for anyone to do any scavenging. And, when the archeologists looked at the animal bones left in the debris, they found the bones of newborn calves, meaning that the Norse, in that final winter, had given up on the future. They found toe bones from cows, equal to the number of cow spaces in the barn, meaning that the Norse ate their cattle down to the hoofs, and they found the bones of dogs covered with knife marks, meaning that, in the end, they had to eat their pets. But not fish bones, of course. Right up until they starved to death, the Norse never lost sight of what they stood for.
