
Part 10

This hunch was supported by the data trail from the September 11 attacks. The banking histories of those nineteen terrorists revealed some behaviors that, in the aggregate, distinguished them from the typical bank customer:

- They opened their U.S. accounts with cash or cash equivalents, in the average amount of roughly $4,000, usually at a branch of a large, well-known bank.
- They typically used a P.O. box as an address, and the addresses changed frequently.
- Some of them regularly sent and received wire transfers to and from other countries, but these transactions were always below the limit that triggered the bank's reporting requirements.
- They tended to make one large deposit and then withdraw cash in small amounts over time.
- Their banking didn't reflect normal living expenses like rent, utilities, auto payments, insurance, and so on.
- There was no typical monthly consistency in the timing of their deposits or withdrawals.
- They didn't use savings accounts or safe-deposit boxes.
- The ratio of cash withdrawals to checks written was unusually high.
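Taken together, those traits read like a feature checklist for a detection system. Below is a purely illustrative sketch, in Python, of how such traits might be reduced to simple flags on an account record; the field names and thresholds are invented, and nothing here comes from Horsley's actual data.

```python
# Purely illustrative: reducing the hijackers' banking traits to boolean flags.
# Every field name (opening_method, uses_po_box, ...) and threshold is invented.

def profile_flags(account: dict) -> dict:
    """Map a hypothetical account record onto the traits listed above."""
    return {
        "opened_with_cash": account["opening_method"] in ("cash", "cash_equivalent"),
        "po_box_address": account["uses_po_box"],
        "wires_kept_below_reporting_limit": account["max_wire_amount"] < account["reporting_threshold"],
        "one_big_deposit_many_small_withdrawals": account["largest_deposit"] > 10 * account["median_withdrawal"],
        "no_normal_living_expenses": not account["has_recurring_bills"],
        "no_savings_or_safe_deposit_box": not (account["has_savings"] or account["has_safe_deposit_box"]),
        "high_cash_to_check_ratio": account["cash_withdrawals"] > 5 * account["checks_written"],
    }
```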

It is obviously easier to retroactively create a banking profile of a proven terrorist than to build one that would identify a terrorist before he acts. Nor would a profile of these nineteen men (foreign nationals living in the United States who were training to hijack jetliners) necessarily fit the profile of, say, a homegrown suicide bomber in London.

Furthermore, when data have been used in the past to identify wrongdoing, like the cheating schoolteachers and collusive sumo wrestlers we wrote about in Freakonomics, there was a relatively high prevalence of fraud among a targeted population. But in this case, the population was gigantic (Horsley's bank alone had many millions of customers) while the number of potential terrorists was very small.

Let's say, however, you could develop a banking algorithm that was 99 percent accurate. We'll assume the United Kingdom has 500 terrorists. The algorithm would correctly identify 495 of them, or 99 percent. But there are roughly 50 million adults in the United Kingdom who have nothing to do with terrorism, and the algorithm would also wrongly identify 1 percent of them, or 500,000 people. At the end of the day, this wonderful, 99-percent-accurate algorithm spits out too many false positives: half a million people who would be rightly indignant when they were hauled in by the authorities on suspicion of terrorism.
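To make that arithmetic concrete, here is a minimal sketch in Python that re-runs the same base-rate calculation with the chapter's illustrative figures:

```python
# Base-rate arithmetic for a hypothetical 99-percent-accurate screen.
# All figures are the chapter's illustrative numbers, not real data.

population = 50_000_000      # UK adults with no connection to terrorism
terrorists = 500             # assumed number of actual terrorists
sensitivity = 0.99           # share of real terrorists the screen catches
false_positive_rate = 0.01   # share of innocents the screen wrongly flags

true_positives = terrorists * sensitivity            # 495 correctly flagged
false_positives = population * false_positive_rate   # 500,000 wrongly flagged
flagged = true_positives + false_positives

print(f"Total flagged: {flagged:,.0f}")
print(f"Share of flagged people who are actual terrorists: {true_positives / flagged:.3%}")
# Roughly 0.1 percent: 495 real suspects buried under half a million false alarms.
```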

Nor, of course, could the authorities handle the workload.

This is a common problem in health care. A review of a recent cancer-screening trial showed that 50 percent of the 68,000 participants got at least one false-positive result after undergoing 14 tests. So although health-care advocates may urge universal screening for all sorts of maladies, the reality is that the system would be overwhelmed by false positives and the sick would be crowded out. The baseball player Mike Lowell, a recent World Series MVP, underscored a related problem while discussing a plan to test every ballplayer in the league for human growth hormone. "If it's 99 percent accurate, that's going to be 7 false positives," Lowell said. "What if one of the false positives is Cal Ripken? Doesn't it put a black mark on his career?"
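The same logic scales with repeated testing. A hedged sketch, assuming a constant, independent false-positive rate per test (about 5 percent, a value chosen here to match the trial's 50 percent figure rather than taken from the study), plus Lowell's back-of-the-envelope calculation:

```python
# How false positives compound across repeated screenings.
# Assumes a constant, independent per-test false-positive rate: a simplification.

per_test_fp_rate = 0.05   # hypothetical 5 percent false-positive rate per test
tests = 14

p_at_least_one = 1 - (1 - per_test_fp_rate) ** tests
print(f"Chance of at least one false positive over {tests} tests: {p_at_least_one:.0%}")
# About 51 percent, consistent with half of the 68,000 participants getting
# at least one false-positive result.

# Lowell's point about a 99-percent-accurate league-wide HGH test,
# assuming roughly 750 active major-league roster spots:
players = 750
print(f"Expected false positives league-wide: {players * 0.01:.1f}")  # about 7
```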

Similarly, if you want to hunt terrorists, 99 percent accurate is not even close to good enough.

On July 7, 2005, four Islamic suicide bombers struck in London, one on a crowded bus and three in the Underground. The murder toll was fifty-two. "Personally, I was devastated by it," Horsley recalls. "We were just starting to work on identifying terrorists and I thought maybe, just maybe, if we had started a couple years earlier, would we have stopped it?"

The 7/7 bombers left behind some banking data, but not much. In the coming months, however, a flock of suspicious characters accommodated our terrorist-detection project by getting themselves arrested by the British police. Granted, none of these men were proven terrorists; most of them would never be convicted of anything. But if they resembled a terrorist closely enough to get arrested, perhaps their banking habits could be mined to create a useful algorithm. As luck would have it, more than a hundred of these suspects were customers at Horsley's bank.

The procedure would require two steps. First, assemble all the available data on these hundred-plus suspects and create an algorithm based on the patterns that set these men apart from the general population. Once the algorithm was successfully fine-tuned, it could be used to dredge through the bank's database to identify other potential bad guys.
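The book does not reveal how the algorithm itself was constructed. As a rough illustration only, the two-step procedure maps onto a standard supervised-classification workflow; the sketch below uses scikit-learn with placeholder data and invented features, not anything from Horsley's bank:

```python
# A generic sketch of the two-step procedure, using scikit-learn.
# The features and labels are random placeholders; the real algorithm,
# its inputs, and its form are not disclosed in the book.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Step 1: build a labeled training set: the ~100 arrested suspects marked 1,
# a sample of ordinary customers marked 0, one row of banking features each.
X_train = rng.random((10_000, 5))                   # placeholder feature matrix
y_train = (rng.random(10_000) < 0.01).astype(int)   # placeholder labels (~1% positives)

model = LogisticRegression().fit(X_train, y_train)

# Step 2: score every customer in the bank's database and keep the top scores.
X_all = rng.random((1_000_000, 5))
scores = model.predict_proba(X_all)[:, 1]
most_suspicious = np.argsort(scores)[-30:]          # indices of the 30 highest scores
```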

Given that the United Kingdom was battling Islamic fundamentalists and no longer, for instance, Irish militants, the arrested suspects invariably had Muslim names. This would turn out to be one of the strongest demographic markers for the algorithm. A person with neither a first nor a last Muslim name stood only a 1 in 500,000 chance of being a suspected terrorist. The likelihood for a person with a first or a last Muslim name was 1 in 30,000. For a person with first and last Muslim names, however, the likelihood jumped to 1 in 2,000.
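Put in relative terms, the name marker alone moves the odds by a couple of orders of magnitude; a quick check of the arithmetic:

```python
# Relative risk implied by the name-based probabilities in the text.
p_no_muslim_name = 1 / 500_000    # neither first nor last name is Muslim
p_one_muslim_name = 1 / 30_000    # first or last name is Muslim
p_both_muslim_names = 1 / 2_000   # both names are Muslim

print(p_one_muslim_name / p_no_muslim_name)     # about 17 times the baseline
print(p_both_muslim_names / p_no_muslim_name)   # 250 times the baseline
```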

The likely terrorists were predominantly men, most commonly between the ages of twenty-six and thirty-five. Furthermore, they were disproportionately likely to:

- Own a mobile phone
- Be a student
- Rent, rather than own, a home

These traits, on their own, would hardly be grounds for arrest. (They describe just about every research assistant the two of us have ever had, and we are pretty sure none of them are terrorists.) But, when stacked atop the Muslim-name markers, even these common traits began to add power to the algorithm.
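One way to picture that stacking is as a chain of likelihood ratios, each marker nudging the odds up or down. In the sketch below, only the name-based figure comes from the text; every other weight is invented purely to show the mechanics:

```python
# Stacking weak markers, naive-Bayes style. At rates this small, probability
# and odds are essentially identical, so the baseline rate is multiplied directly.
# Only the 250x name factor is implied by the text; the other weights are made up.

baseline = 1 / 500_000   # chance of being a suspect with no name marker (from the text)

likelihood_ratios = {
    "both names Muslim": 250,     # implied by the 1-in-2,000 figure above
    "male, aged 26 to 35": 4,     # invented weight
    "owns a mobile phone": 1.2,   # invented weight
    "is a student": 2,            # invented weight
    "rents a home": 1.5,          # invented weight
}

risk = baseline
for marker, ratio in likelihood_ratios.items():
    risk *= ratio

print(f"Combined: about 1 in {round(1 / risk):,}")   # about 1 in 139 in this toy example
```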

Once the preceding factors were taken into account, several other characteristics proved fundamentally neutral, not identifying terrorists one way or another. They included:

- Employment status
- Marital status
- Living in close proximity to a mosque

So contrary to common perception, a single, unemployed, twenty-six-year-old man who lived next door to a mosque was no more likely to be a terrorist than another twenty-six-year-old who had a wife, a job, and lived five miles from the mosque.

There were also some prominent negative indicators. The data showed that a would-be terrorist was disproportionately unlikely to:

- Have a savings account
- Withdraw money from an ATM on a Friday afternoon
- Buy life insurance

The no-ATM-on-Friday metric would seem to be a proxy for a Muslim who attends that day's mandatory prayer service. The life-insurance marker is a bit more interesting. Let's say you're a twenty-six-year-old man, married with two young children. It probably makes sense to buy some life insurance so your family can survive if you happen to die young. But insurance companies don't pay out if the policyholder commits suicide. So a twenty-six-year-old family man who suspects he may one day blow himself up probably isn't going to waste money on life insurance.

This all suggests that if a budding terrorist wants to cover his tracks, he should go down to the bank and change the name on his account to something very un-Muslim (Ian, perhaps). It also wouldn't hurt to buy some life insurance. Horsley's own bank offers starter policies for just a few quid per month.

All these metrics, once combined, did a pretty good job of creating an algorithm that could distill the bank's entire customer base into a relatively small group of potential terrorists.

It was a tight net but not yet tight enough. What finally made it work was one last metric that dramatically sharpened the algorithm. In the interest of national security, we have been asked to not disclose the particulars; we'll call it Variable X.

What makes Variable X so special? For one, it is a behavioral metric, not a demographic one. The dream of anti-terrorist authorities everywhere is to somehow become a fly on the wall in a roomful of terrorists. In one small, important way, Variable X accomplishes that. Unlike most other metrics in the algorithm, which produce a yes or no answer, Variable X measures the intensity of a particular banking activity. While not unusual in low intensities among the general population, this behavior occurs in high intensities much more frequently among those who have other terrorist markers.

This ultimately gave the algorithm great predictive power. Starting with a database of millions of bank customers, Horsley was able to generate a list of about 30 highly suspicious individuals. According to his rather conservative estimate, at least 5 of those 30 are almost certainly involved in terrorist activities. Five out of 30 isn't perfect (the algorithm misses many terrorists and still falsely identifies some innocents), but it sure beats 495 out of 500,495.
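Framed as precision, the share of flagged people who actually are terrorists, the comparison with the earlier thought experiment looks like this:

```python
# Precision of the two approaches, using the chapter's own figures.
naive_precision = 495 / 500_495   # the hypothetical 99-percent-accurate screen
sharpened_precision = 5 / 30      # Horsley's conservative estimate for his list of 30

print(f"Naive screen:     {naive_precision:.3%} of flagged people are terrorists")
print(f"Sharpened screen: {sharpened_precision:.1%} of flagged people are terrorists")
# Roughly 0.1 percent versus 17 percent, a gain of more than two orders of magnitude.
```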

As of this writing, Horsley has handed off the list of 30 to his superiors, who in turn have handed it off to the proper authorities. Horsley has done his work; now it is time for them to do theirs. Given the nature of the problem, Horsley may never know for certain if he was successful. And you, the reader, are even less likely to see direct evidence of his success because it would be invisible, manifesting itself in terrorist attacks that never happen.

But perhaps you'll find yourself in a British pub some distant day, one stool away from an unassuming, slightly standoffish stranger. You have a pint with him, and then another and a third. With his tongue loosened a bit, he mentions, almost sheepishly, that he has recently gained an honorific: he is now known as Sir Ian Horsley. He's not at liberty to discuss the deeds that led to his knighthood, but it has something to do with protecting civil society from those who would do it great harm. You thank him profusely for the great service he has performed, and buy him another pint, and then a few more. When the pub at last closes, the two of you stumble outside. And then, just as he is about to set off on foot down a darkened lane, you think of a very small way to repay his service. You push him back onto the curb, hail a taxi, and stuff him inside. Because, remember, friends don't let friends walk drunk.

CHAPTER 3

UNBELIEVABLE STORIES ABOUT APATHY AND ALTRUISM

In March 1964, late on a cold and damp Thursday night, something terrible happened in New York City, something suggesting that human beings are the most brutally selfish animals to ever roam the planet.

A twenty-eight-year-old woman named Kitty Genovese drove home from work and parked, as usual, in the lot at the Long Island Rail Road station. She lived in Kew Gardens, Queens, roughly twenty minutes by train from Manhattan. It was a nice neighborhood, with tidy homes on shaded lots, a handful of apartment buildings, and a small commercial district.

Genovese lived above a row of shops that fronted Austin Street. The entrance to her apartment was around the rear. She got out of her car and locked it; almost immediately, a man started chasing her and stabbed her in the back. Genovese screamed. The assault took place on the sidewalk in front of the Austin Street shops and across the street from a ten-story apartment building called the Mowbray.

The assailant, whose name was Winston Moseley, retreated to his car, a white Corvair parked at the curb some sixty yards away. He put the car in reverse and backed it down the block, passing out of view.

Genovese, meanwhile, staggered to her feet and made her way around to the back of her building. But in a short time Moseley returned. He sexually assaulted her and stabbed her again, leaving Genovese to die. Then he got back in his car and drove home. Like Genovese, he was young, twenty-nine years old, and he too lived in Queens. His wife was a registered nurse; they had two children. On the drive home, Moseley noticed another car stopped at a red light, its driver asleep at the wheel. Moseley got out and woke the man. He didn't hurt or rob him. The next morning, Moseley went to work as usual.

The crime soon became infamous. But not because Moseley was a psychopath: a seemingly normal family man who, although he had no criminal record, turned out to have a history of grotesque sexual violence. And it wasn't because Genovese was a colorful character herself, a tavern manager who happened to be a lesbian and had a prior gambling arrest. Nor was it because Genovese was white and Moseley was black.

The Kitty Genovese murder became infamous because of an article published on the front page of The New York Times. It began like this:

For more than half an hour 38 respectable, law-abiding citizens in Queens watched a killer stalk and stab a woman in three separate attacks in Kew Gardens.... Not one person telephoned the police during the assault; one witness called after the woman was dead.

The murder took about thirty-five minutes from start to finish. "If we had been called when he first attacked," said a police inspector, "the woman might not be dead now."

The police had interviewed Genovese's neighbors the morning after the murder, and the Times's reporter reinterviewed some of them. When asked why they hadn't intervened or at least called the police, they offered a variety of excuses:

"We thought it was a lovers' quarrel."

"We went to the window to see what was happening but the light from our bedroom made it difficult to see the street."

"I was tired. I went back to bed."

The article wasn't very long, barely fourteen hundred words, but its impact was immediate and explosive. There seemed to be general agreement that the thirty-eight witnesses in Kew Gardens represented a new low in human civilization. Politicians, theologians, and editorial writers lambasted the neighbors for their apathy. Some even called for the neighbors' addresses to be published so justice could be done.

The incident so deeply shook the nation that over the next twenty years, it inspired more academic research on bystander apathy than the Holocaust.

To mark the thirtieth anniversary, President Bill Clinton visited New York City and spoke about the crime: "It sent a chilling message about what had happened at that time in a society, suggesting that we were each of us not simply in danger but fundamentally alone."