The Canon

by Natalie Angier

Yet while a mastery of math is not essential to appreciating and even practicing science, you can't avoid, while milling through the fairground of Science Mind, bumping into a few cousins from math's extended family. One is quantitative thinking, to which the next chapter is devoted: becoming comfortable with concepts of probability and randomness, and learning a few tricks about how to break a problem into tractable pieces and to whip up a back-of-a-wet-cocktail-napkin estimate of some seemingly incalculable figure, like, how many school buses are in your county, or how many people would have to hold hands to form a human chain around the globe, and how many of them would be bobbing in open ocean and had better bring a life jacket, shark repellent, and a copy of their dental records just in case? True, you can likely find the answers to these and other fun FAQs on the Internet, yet the habit of thinking in stepwise, quantitative fashion, and facing a problem head-on rather than running off screaming to Google, is worth cultivating. Second only to their desire that science be seen as a dynamic and creative enterprise rather than a calcified set of facts and laws, scientists wish that people would learn enough about statistics—odds, averages, sample sizes, and data sets—to scoff with authority at crooked ones. Through sound quantitative reasoning, they reason, people might resist the lure of the anecdote and the personal testimonial, the deceptive N, or sample size, of "me, my friends, the doorman, and the barista at Caribou." With a better appreciation for the qualities of quantities, people might be able to set aside, if only temporarily, the stubbornness of a human brain that evolved to focus on the quirks and peccadilloes of a small, homogeneous tribe, rather than on the daunting population densities and polycultural vortices that characterize life in contemporary Gotham City.
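That cocktail-napkin habit is easy to practice in a few lines. A minimal sketch of the human-chain estimate, using assumed round numbers (a roughly 40,075-kilometer equatorial circumference, a 1.5-meter hand-to-hand span per person, and oceans covering about 71 percent of the planet):

```python
# Back-of-a-wet-cocktail-napkin estimate: how many people would it take
# to ring the globe hand in hand, and how many would be treading water?
# All three inputs are rough assumptions, not measured values.
EARTH_CIRCUMFERENCE_M = 40_075_000   # equatorial circumference, ~40,075 km
ARM_SPAN_M = 1.5                     # generous hand-to-hand span per person
OCEAN_FRACTION = 0.71                # ~71% of Earth's surface is ocean

people = EARTH_CIRCUMFERENCE_M / ARM_SPAN_M
in_ocean = people * OCEAN_FRACTION

print(f"People in the chain: about {people / 1e6:.0f} million")
print(f"Of whom bobbing in open ocean: about {in_ocean / 1e6:.0f} million")
```

Roughly 27 million people, some 19 million of them at sea; the point is less the exact figure than the stepwise decomposition into guessable pieces.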
There is a little principle sometimes called the law of truly large numbers, which among other things means that if the group you're considering is very big, nearly anything is possible. Events that would be rare on a limited scale become not merely common, but expected. One favorite example among the numerati is that of repeat lottery winners, people who have won big prizes two or more times and who invariably provoke clucks of awe, envy, and what-are-the-odds. "The really amazing thing would be if nobody won twice," said Jonathan Koehler, a professor of economics at the University of Texas.
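Koehler's quip can be backed with arithmetic. A sketch with illustrative, invented numbers (millions of habitual players, each buying heaps of tickets over two decades), treating each player's jackpot count as approximately Poisson-distributed:

```python
import math

# All inputs below are invented for illustration, not real lottery data.
players = 10_000_000                  # habitual players nationwide
tickets_per_player = 100 * 52 * 20    # 100 tickets a week for 20 years
p_win = 1e-7                          # 1-in-10-million odds per ticket

# Expected jackpots per player; with many tiny independent chances,
# the number of wins is approximately Poisson(lam).
lam = tickets_per_player * p_win
p_double = 1 - math.exp(-lam) * (1 + lam)   # P(a given player wins >= 2)
expected_double_winners = players * p_double

print(f"Chance any one named player wins twice: {p_double:.1e}")
print(f"Expected double winners in the whole pool: {expected_double_winners:.0f}")
```

For any one named player a double jackpot is a roughly five-in-a-hundred-thousand long shot, yet across ten million players the model expects several hundred double winners; the amazing thing really would be if nobody won twice.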

By thinking small in a large land, we get a skewed sense of what's
meaningful and what's happenstance. "People are overly impressed by coincidences, and they get fooled by them," said John Allen Paulos, a mathematician at Temple University and the author of Innumeracy and many other books. Paulos has toyed with the idea of playing the Barnum card to make a point while making a profit. He could start a newsletter of random predictions about the stock market and mail it to two large sets of readers. One group would receive a newsletter predicting that the market would rise in the next three months; another would be told that the market would go bearish. Three months later, he'd see how the market had fared, and direct his next newsletter solely to the recipients of his correct first guess, again separating them into two camps. Half would be flagged to expect a bull market, and half would be warned of an imminent downturn. By the third newsletter, he could boast to a winnowed but still substantial pool of readers, Hey, I've successfully predicted the stock market for two cycles running, and then ask, Care to invest $10 to receive my next divination? (Keep Paulos's scheme in mind should you receive any suspicious solicitations from Temple University.)
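Paulos's scheme is nothing but repeated halving, and a few lines make plain how fast a spotless track record can be manufactured. The starting list size here is an arbitrary assumption:

```python
# Paulos's hypothetical newsletter scam, sketched as repeated halving.
# The mailing-list size is an arbitrary assumption for illustration.
readers = 64_000   # initial recipients of the random predictions
for quarter in range(1, 4):
    # Half were told "bull," half "bear"; whichever way the market went,
    # one half has now seen a correct call. Keep only them.
    readers //= 2
    print(f"After quarter {quarter}: {readers} readers who have seen "
          f"nothing but correct predictions")
```

After three quarters, 8,000 people have watched the newsletter go three for three, even though every prediction was a coin flip.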

Another aspect of quantitative reasoning that characterizes the scientific mindset is this: there must be some quantity to it, some substance, some evidence. Science demands evidence: Does this sound, well, self-evident? Maybe so, but it's a lesson that can be awfully hard to swallow, and must be taken again and again, our daily ABCs and periodic Mendeleevs, folic acid for the backbone, iron in homage to the core of the earth. It's hard to swallow because we love opinions. The most thoroughly read pages in a newspaper are the opinion pages—the editorials, the columns and commentaries, the bellicose lettres from readers living somewhere in the state of Greater Umbrage. Opinions are to have and to hold, in sickness and in health, over breakfast or by blog. Opinions feel good. You're entitled to yours; I'll indulge mine. "In politics, you can say, I like George Bush, or I don't like George Bush, or I do or don't like Howard Dean or John Kerry or Mr. Magoo," said Andrew Knoll of Harvard. "You don't need a principled reason for that political opinion. You don't need evidence that someone else can replicate to justify your opinion. You don't need to think of alternative explanations that would render your opinion invalid. You can go into the voting booth, and say, I prefer this or that politician, and cast your vote accordingly. You don't need excuses for the foods you like, either. If you're ordering dinner at a restaurant, you can ask that your steak be cooked rare or medium or well-done, and the waiter isn't likely to stop and demand that you present evidence to back up your taste, at least not if he wants his tip.

"Unfortunately, people often regard science the same way, as a matter of opinion," Knoll continued. "I do or don't like George Bush, I do or don't believe in evolution. It doesn't matter why I don't believe in evolution, it doesn't matter what the evidence is, I just don't believe in it." You, the evolutionist, "believe" in evolution; I, the creationist, do not. You have your opinion, I have mine, and it takes all kinds of nuts and dips to make a party, right?

At which point most evolutionists are likely to get very impatient and form opinions of their interlocutor that they may or may not choose to express. Scientists can be quite hard on one another, too. They sneer, they dismiss, they scrawl comments on one another's submitted reports like "I feel sorry for whoever funded this so-called research" or "I wouldn't publish this at the bottom of a birdcage." Yet for all the crude inanity of its more extreme sputterings, the attack-dog stance is part of science's strength. The big difference between science and many other aspects of life is, to quote George W. Bush's response to a disgruntled citizen at a July Fourth picnic, "Who cares what you think?" Your opinion doesn't count. Your fond hopes and fantasies of Paradigms Found don't count. What counts is the quality and the quantity of the evidence.

"How you want it to be doesn't make any difference," said the biologist Elliot Meyerowitz of Caltech. "In fact, if things are turning out the way you want them to, you should think harder about how you're doing your experiments, to make sure you're not introducing some bias." As members of the human race, scientists are born to be biased, particularly in favor of their personal biases. After all, we're stuck in our skulls for the whole four-score sentence of sentience. We can't brainhop or mindswap; we merely window-shop. I think, therefore I am right. Yet while self-delusion has been shown to be an extremely useful tool in many situations—particularly when trying to persuade a potential employer or love interest of your extraordinary worth—it is, in the words of the MIT molecular biologist Gerald Fink, "the enemy of science."

"Those of us who are not overly philosophical believe that there is a reality to nature but that it can be very hard to see it and understand it, given all our biases," Meyerowitz said. "The reason a scientist spends all those years in training, as an undergraduate, graduate student, and postdoc, is to learn to deal with personal biases." Good scientists spend a lot of time assuming they're up to no good. They are essentially anti-Sixth Amendment, guilty until proven innocent, or penitents in search
of redemption. "If you're doing your job," said the chemist Daniel Nocera of MIT, "you should be the one who disproves yourself most of the time." It doesn't matter what sort of story you tell yourself as you are doing your experiments, what hypothesis you formulated before you started clicking your pipette or infusing your fetal mice with fluorescent green marker from a jellyfish. Just make sure that the endpoints are pure of heart. "The results section of a scientific paper is where you show you're a good scientist. Here is where you say, I did the experiment properly, and collected the data properly and the data are right," said Nocera. "In the discussion section, where you talk about the implications of the work, you can sound smart or stupid, and tell an interesting story or not. I warn my students, you may sometimes be stupid and you may sometimes be smart, but you must always be good. When I read the results section of your paper, everything in there has got to be right." Darcy Kelley, a neuroscientist at Columbia, sounds a similar warning knell to her students: "Your data should be true even if your story is wrong."

How do scientists seek to purge their work of bias and bad data? Through frequent ablutions at the baptistry of the Control. As vital to the integrity of a scientific report as the finding being showcased are all the no-shows offered in comparison: We did operation A to variable B and got result Z; but when we subjected B to operations E, I, O, U, and even Y, B didn't budge. When researchers at Boston University wanted to show that the eggs of a red-eyed tree frog would hatch early expressly to avoid predation by an oncoming snake, allowing the preemie tadpoles to leap to safety in the water below, it wasn't enough to film the unripe eggs bursting open on the approach of an oviphagous serpent: after all, who's to say that the eggs were responding to a snake-specific threat rather than to an ambient disturbance? The scientists demonstrated the precision of the frog eggs' monitoring system by exposing them to a variety of recorded vibrations of equal amplitude from distinct sources—slithering snake, passing human footsteps, hammering rain. Only with a snake shake would the tadpoles make haste.

A lovable control is often blind: those who perform the experiment should be unaware of what's control and what's the real thing until all the results are in, at which stage the code can be broken. Sometimes devising the right controls is the hardest part of a study. When researchers sought to demonstrate the effectiveness of acupuncture to treat a variety of ailments—drug addiction, headache, nausea—they yearned to be taken seriously. They were tired of their colleagues' twitchy-kneed rejection of all alternative healing practices, and they were really tired of
the catty references to "quackupuncture." They wanted the fourteen-karat validation of a blinded study, in which one group of patients received acupuncture and one did not, and neither set would know who was the treated, who the placebo. But how to fool some of the people some of the time about a procedure as palpable as playing pincushion? The researchers' solution was dapper and to the point: one group of patients would be given needles inserted into officially designated acupuncture nodes, while the second group would have needles inserted into "sham" spots on the body that acupuncturists agreed should have no effect. When patients with nausea and vomiting reported relief from bona fide needling but not from sham acupuncture, even the most skeptical Western doctors had to concede that the 5,000-year-old practice might have its limited uses.

"In my life as a scientist, the thing I worry about the most is, What are the right controls?" said Gerald Fink. "You send a paper off for publication, and you're stricken with doubt: Did I do it? Did I use the right controls?"

Another route to data security is ... another route. Approach a problem from many angles and see if you always end up in Rome. One of my favorite examples of meticulous cartography is a report by Gene Robinson, a neuroethologist at the University of Illinois in Urbana-Champaign. Neuroethologists study the neurobiology of behavior, in Robinson's case of bee behavior. He's exploring how gene activity in the brain is linked to an individual's conduct, and he has decided that the best way to address these big, socially flammable questions is on the modest terrain of the bee brain, which would fit snugly into the belly of this b.
His question: How does a bee know what to be and not to be? How does a worker bee know that she's meant to spend the first half of her six-week life performing hive-bound duties like tending to the eggs, cleaning out the combs, feeding the voracious queen? And what prompts her at three weeks of age to shrug off her nurse's togs and venture out into the world as a forager, a tireless gatherer of nectar and pollen, and the happenstance key to floral fecundity? What changes occur in the bee brain that might explain the dramatic career shift, with its concomitant capacity to fly a dozen miles a day and not get lost, and to dance the sororal dance that soundlessly booms to workmates the location of blossoms worth probing?

Robinson's team presented various threads of experimental evidence that a gene designated (why not) the foraging gene might be at the heart of the professional overhaul. Firstly, the scientists demonstrated that if they removed all the foraging bees from a hive and thereby
forced some of the young nurse bees to assume breadwinning duties prematurely, the foraging gene flicked on abruptly inside the cells of the bees' beleaguered brains. Secondly, they showed that if they fed young bees sugar water laced with a chemical known to stimulate the activity of the foraging gene artificially, the sedentary cell dwellers suddenly started venturing outside, precociously prepared to gather ye rosebuds. Finally, if the researchers gave young bees another sort of stimulatory chemical that failed to activate the foraging gene, the bees remained hive-bound, a demonstration that not just any chemical kick would do the trick.

Through each evidentiary strand, and every corresponding control, still the discovery held. Unless the foraging gene blazed on, the bee didn't budge. A modest finding perhaps, but one chiseled and polished until it was the bees' knees.

Scientists demand evidence, and they are merciless toward a researcher who gives a PowerPoint presentation with feeble data. "It's a very aggressive, confrontational process," said Lucy Jones. "Conflict is part of the day-to-day reality of how science is done." I have heard scientists guffaw loudly during talks, when it was quite clear that the presenter wasn't telling a Werner Heisenberg joke. I have seen scientists under fire turn as pale as marzipan and start to quiver and almost spit, though I have never seen one cry onstage; and murders in the scientific community are surprisingly rare, although suicides, unfortunately, are not. The scientific hazing can give the enterprise a doctrinaire air, one intolerant of creativity, new ideas, anything that might upset the complacent status quo. It feeds the familiar E = mc²
of the Hollywood scientist-hero, the lone genius battling an entrenched and blinkered theocracy with only his girlfriend to believe in him and remind him to bathe at least once a week. Now, it is true that when a pharmaceutical company has a best-selling drug at stake, company scientists can be suspiciously quick to dismiss studies showing a cheaper, competing product to be as good or better than the company's billion-dollar gravy boat. Even without the lure of big profits, research scientists often have egos that might best be measured in the astronomical unit known as the parsec; as a result, scientists may defend their research and their perspective long after the data have naysayed them. David Baltimore recalled an MIT scientist who died only within the last couple of years and who was one of the last remaining critics of the theory of the origin of the universe that is now almost universally accepted by astronomers and indeed the entire scientific community. "He didn't believe in the Big Bang," said Baltimore, "and he was in everybody's face about it."
