Quantum Legacies: Dispatches From an Uncertain World



  In the spring of 1931, under constant hectoring from Heisenberg and Pauli to account for a strange mathematical feature of his new equation, Dirac boldly predicted the existence of antimatter: as-yet-unseen cousins to the ordinary particles we see all around us, which have the same mass but opposite electric charge. Within two years, physicists in California and Cambridge had accumulated striking experimental evidence in support of Dirac’s conjecture. Dirac thus set in motion what has become the single most precise physical theory ever. At the latest count, theoretical predictions calculated with QED match experiments to eleven decimal places. Errors on both sides of the ledger—theorists’ calculations and experimenters’ data—are now measured in parts per trillion.7

  To Dirac, mathematics could be beautiful, and beauty was the surest guide to truth. “It is more important to have beauty in one’s equations than to have them fit experiment,” he was fond of repeating. (Today’s proponents of string theory sometimes borrow his rhetoric.) A stickler for precision, Dirac developed a style that was elegant, even austere: colleagues sometimes complained that he left all the words out of his articles. Asked after a lecture to clarify an earlier point, he often responded by repeating, verbatim, what he had said the first time. He honed his mathematical notation—now universal among physicists—to allow maximum economy of expression. His style was enshrined in his famous textbook, The Principles of Quantum Mechanics, first published in 1930 and an instant classic. Ninety years later the book is still in print, and still much revered.8

  Unlike some less fortunate figures in the history of science, who suffered long delays before their contributions were recognized, Dirac skyrocketed to the top of the profession. He was elected to the Royal Society at the unheard-of age of twenty-seven. (In a further rarity, he was elected upon his first nomination.) In July 1932, just shy of his thirtieth birthday, Dirac was elevated to the Lucasian Chair of Mathematics at Cambridge—once held by the equally precocious Isaac Newton and later occupied by Stephen Hawking. He shared the 1933 Nobel Prize with Schrödinger and remains one of the youngest recipients. Though he was by no means finished as a physicist by that point—he continued to produce important, intriguing work in quantum theory and cosmology, most of which would bear fruit only later in the hands of others—the five-year period following his dissertation was surely one of the most brilliant and far-reaching bursts of scientific creativity ever recorded.

  In other respects, Dirac was a late bloomer. He seems to have become interested in politics only in his early thirties, when he developed a fascination with the “Soviet experiment.” He made several trips to the Soviet Union in the 1930s to collaborate with leading physicists. He stunned guests at the banquet in honor of his Nobel Prize by delivering an impromptu harangue on the importance of protecting workers’ wages in the midst of the global economic crisis.

  Figure 1.2. Physicists gather at the Institute for Theoretical Physics in Copenhagen for a 1933 conference. Front row, from left to right: Niels Bohr, Paul Dirac, Werner Heisenberg, Paul Ehrenfest, Max Delbrück, and Lise Meitner. (Source: Nordisk Pressefoto, courtesy of AIP Emilio Segrè Visual Archives, Margrethe Bohr Collection.)

  He largely resisted getting involved in the war effort after 1939, even as many of his colleagues in physics and mathematics rallied to the cause. He did perform calculations for the “Tube Alloys” project—Britain’s early nuclear weapons program—and consulted on at least one occasion with Klaus Fuchs, the German-born British physicist who later spied for the Soviets from deep within the Manhattan Project. But when J. Robert Oppenheimer, the scientific director of the Los Alamos laboratory, asked Dirac to work full-time on the Manhattan Project, he declined.

  Dirac’s leftist sympathies posed some problems after the war. He was denied a visa to enter the United States in April 1954, at the height of American anti-Communist hysteria. (This public rebuke came just as Oppenheimer was undergoing his own withering interrogation before the US Atomic Energy Commission’s personnel security board, although Oppenheimer’s case was still secret at the time.)9 Nearly two decades later, as Dirac made plans to settle in the United States for his retirement, he was barred from accepting professorships at some universities because of his long-standing membership in the Soviet Academy of Sciences.

  Yet these public setbacks and embarrassments were minor compared with Dirac’s private struggles, as captured in Graham Farmelo’s moving biography, The Strangest Man (2009). Indeed, as Farmelo’s book makes clear, Dirac’s success is all the more remarkable given the travails of his personal life. Six months before he was sent Heisenberg’s page proofs in the autumn of 1925, his older brother, Felix, committed suicide. They had both studied engineering at the eminently practical Merchant Venturers’ College in Bristol, the engineering faculty of Bristol University; their father taught French in the adjoining secondary school.10 Year after year, Felix had watched his younger brother gallop past him scholastically. Their father paid scant attention to Felix and refused to support his interest in medicine. To the end of his life, Dirac blamed his father for Felix’s death.

  The Dirac household was crushingly joyless. Paul’s cold and authoritarian father spent decades in an open feud with his mother. Drawing on a stash of family correspondence, Farmelo documents the efforts made by Dirac’s grasping mother to receive some solace from her younger son. She leaned on him for emotional support of a kind that few people in their teens or early twenties could be expected to provide, and this seems only to have pushed Dirac deeper into his shell. In a country that has long celebrated individuals of quiet reserve, Dirac became a man of almost inhuman reticence. (In Cambridge, a “Dirac” was a unit of measurement corresponding to one word per hour, the “smallest imaginable number of words that someone with the power of speech could utter in company,” as Farmelo puts it.)11 Dirac explained late in life that his father had forced him to speak only French at the dinner table and that, frustrated by his lack of facility with the language, he had found it easiest to remain silent. Meals became so stressful for the young Dirac that he developed severe indigestion; he was afflicted by food sensitivities throughout his life.

  Family dynamics like these, combined with Dirac’s famously peculiar demeanor, understandably invite some form of psychological speculation. Retrospective diagnosis has become something of a pastime in our psychologizing age. Abraham Lincoln wasn’t just solemn or moody, several scholars have argued; he suffered from clinical depression. Isaac Newton, whose father died before he was born and whose mother abandoned him at age three upon her remarriage, acted out a lifetime of hostile priority claims for want of secure childhood attachment—or so Frank Manuel concluded in his 1968 biography, A Portrait of Isaac Newton.12 (Nor are these diagnoses limited to historical figures. My wife, a psychologist, has no patience for the great Russian novels. According to her, Raskolnikov could have been set right with a modest regimen of psychotropic drugs; Crime and Punishment should have been a jaunty five-page pamphlet, with a happy ending to boot.)

  Farmelo treads into similar territory in the closing pages of his book. He marches through several traits commonly associated with autism—sensitivity to food and to loud, sudden noises; extreme reticence and awkward social behavior; obsessive focus on a few arcane topics—adding each time that the appearance of similar traits in Dirac was probably not coincidental. In the end, he can’t resist it: “I believe it to be all but certain that Dirac’s behavioural traits as a person with autism were crucial to his success as a theoretical physicist.”13

  Such claims demand a leap of faith. There is, of course, the basic challenge of evidence: even with the rich, textured family correspondence that Farmelo mines so well, traces left behind in letters seem rather different from hours of professional observation or the directed interrogation of a trained clinician. More important is a tacit ontological assumption that today’s repertoire of psychiatric diagnoses transcends time and place. Does the “melancholia” so often described by Lincoln’s contemporaries really map smoothly onto today’s “clinical depression”? The American Psychiatric Association changes entries in its Diagnostic and Statistical Manual of Mental Disorders—the industry standard for cataloging psychiatric ailments—often dramatically, on a timescale of decades. Yesterday’s diseases can become today’s idiosyncrasies; one era’s nervous disorders are another’s quirky tics.14 Why, then, this need to pathologize genius? Is it, perhaps, to let ourselves off the hook? No wonder we failed to achieve the greatness of a Newton, Lincoln, or Dirac, we may well comfort ourselves. They weren’t just smarter than us; their brains were different from ours.

  Whether or not we follow Farmelo on this brief excursion into diagnosis-at-a-distance depends on how closely we hew to Heisenberg’s dictum to focus only on observable quantities. After all, neither “melancholia” nor “autism” seems to correspond, in any straightforward way, to states that historians can measure. (Surely it is telling that just four years after Farmelo’s book appeared, the editors of the Diagnostic and Statistical Manual replaced “autism” with “autism spectrum disorder,” with a consequent shift in definitions and diagnoses.)15

  Diagnoses aside, Farmelo’s book suggests a different, rather remarkable way in which Dirac’s particular mind-set left its mark on the theory he worked so hard to advance. Nearly a century after its formulation, quantum theory remains the most successful and precise description of nature that scientists have ever devised. Yet it is a curiously minimalist description, forcing physicists to choose, for example, between saying something definite about where a particular particle is at a given moment and where it is going—they cannot say both. Dirac’s strict economy of expression and his famous reticence have shaped how generations of physicists discuss the quantum world. For all its astounding success, the account that we glean from quantum mechanics—matter behaving at times in familiar ways, only to surprise us, suddenly, with its inescapable oddity—remains fitfully strange. Not unlike Paul Dirac himself.

  2

  Life-and-Death

  When Nature Refuses to Select

  Of all the bizarre facets of quantum theory, few seem stranger than those captured by Erwin Schrödinger’s famous fable about the cat that is neither alive nor dead. It describes a cat locked inside a windowless box, along with some radioactive material. If the radioactive material happens to decay, then a device will detect the decay and release a hammer, which will smash a vial of poison and kill the cat. If no radioactivity is detected, the cat will live. Schrödinger dreamed up this gruesome scenario to criticize what he considered a ludicrous feature of quantum theory. According to proponents of the theory, before anyone opened the box to check on the cat, the cat was neither alive nor dead; it existed in a strange, quintessentially quantum state of alive-and-dead.
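  In the compact notation that physicists now use for such states—notation that postdates Schrödinger’s 1935 fable, with equal weights assumed here purely for illustration—the alive-and-dead condition is a superposition, a minimal sketch of which reads

  \[
  |\psi_{\text{cat}}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|\text{alive}\rangle \;+\; |\text{dead}\rangle\bigr),
  \]

  where squaring each coefficient yields a fifty-fifty chance of finding the cat alive or dead once the box is opened—and nothing in the formalism singles out one outcome before then.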

  Today, in our LOLcatz-saturated world, Schrödinger’s strange little tale is often played for laughs, with a tone more zany than somber.1 It has also become the standard-bearer for a host of quandaries in physics and philosophy. In Schrödinger’s own time, Niels Bohr and Werner Heisenberg proclaimed that hybrid states like the one the cat was supposed to be in were a fundamental feature of nature. Others, like Einstein, insisted that nature must choose: alive or dead, but not both.

  Although Schrödinger’s cat flourishes as a meme to this day, discussions tend to overlook one key dimension of the fable: the environment in which Schrödinger conceived of it in the first place. It’s no coincidence that, in the face of a looming world war, genocide, and the dismantling of German intellectual life, Schrödinger’s thoughts turned to poison, death, and destruction. Schrödinger’s cat, then, should remind us of more than the beguiling strangeness of quantum mechanics. It also reminds us that scientists are, like the rest of us, humans who feel—and fear.

  : : :

  Schrödinger crafted his cat scenario during the summer of 1935, in close dialogue with Albert Einstein. The two had solidified their friendship in the late 1920s, when they were both living in Berlin. By that time, Einstein’s theory of relativity had catapulted him to worldwide fame. His schedule became punctuated with earthly concerns—League of Nations committee meetings, stumping for Zionist causes—alongside his scientific pursuits. Schrödinger, originally from Vienna, had been elevated to a professorship at the University of Berlin in 1927, just one year after introducing his wave equation for quantum mechanics (now known simply as the “Schrödinger equation”). Together they enjoyed raucous Viennese sausage parties—the Wiener Würstelabende bashes that Schrödinger hosted at his house—and sailing on the lake near Einstein’s summer home.2

  Figure 2.1. Erwin Schrödinger relaxes with a pipe and a drink. (Source: Photograph by Wolfgang Pfaundler, courtesy of AIP Emilio Segrè Visual Archives.)

  Too soon, their good-natured gatherings came to a halt. Hitler assumed the chancellorship of Germany in January 1933. At the time, Einstein was visiting colleagues in Pasadena, California. While he was away, Nazis raided his Berlin apartment and summer house and froze his bank account. Einstein resigned from the Prussian Academy of Sciences and quickly made arrangements to settle in Princeton, New Jersey, as one of the first members of the brand-new Institute for Advanced Study.3

  Meanwhile, Schrödinger—who was not Jewish and had kept a lower profile, politically, than Einstein—watched in horror that spring as the Nazis staged massive book-burning rallies and extended race-based restrictions to university instructors. Schrödinger accepted a fellowship at the University of Oxford and left Berlin that summer. (He later settled in Dublin.) In August, he wrote to Einstein from the road, “Unfortunately (like most of us) I have not had enough nervous peace in recent months to work seriously at anything.”4

  Before too long their exchanges picked up again, their once-leisurely strolls now replaced by transatlantic post. Prior to the dramatic disruptions of 1933, both physicists had made enormous contributions to quantum theory; indeed, both earned their Nobel Prizes for work on the subject. Yet both had grown disillusioned with their colleagues’ efforts to make sense of the equations. Danish physicist Niels Bohr, for example, insisted that according to quantum theory, particles do not have definite values for various properties until they are measured—as if a person had no particular weight until stepping on her bathroom scale. Moreover, quantum theory seemed to provide only probabilities for various events, rather than the rock-solid predictions that flowed from Newton’s laws or Einstein’s relativity. Bohr’s arguments failed to sway Einstein or Schrödinger. Now separated by an ocean but armed with paper and postage stamps, they dove back into their intense discussions.5

  In May 1935, Einstein published a paper with two younger colleagues at the Institute for Advanced Study, Boris Podolsky and Nathan Rosen, charging that quantum mechanics was incomplete. They contended that there exist “elements of reality” associated with objects in the world—properties of physical objects that had definite values—for which quantum theory provided only probabilities.6 In early June Schrödinger wrote to congratulate his friend on the latest paper, lauding Einstein for having “publicly called the dogmatic quantum mechanics to account over those things that we used to discuss so much in Berlin.” Ten days later Einstein responded, venting to Schrödinger that “the epistemology-soaked orgy ought to come to an end”—an “orgy” they each associated with Niels Bohr and his younger acolytes like Werner Heisenberg, who argued that quantum mechanics completely described a nature that was, itself, probabilistic.7

  This exchange produced the first stirrings of the soon-to-be-born cat. In a follow-up letter to Schrödinger, Einstein asked his friend to imagine a ball that had been placed in one of two identical, closed boxes. Prior to opening either box, the probability of finding the ball in the first box would be 50 percent. “Is this a complete description?” Einstein asked. “NO: A complete statement is: the ball is (or is not) in the first box.”8 Einstein believed just as fervently that a proper theory of the atomic domain should be able to calculate a definite value. Calculating only probabilities, to Einstein, meant stopping short.
  Encouraged by Schrödinger’s enthusiastic reply, Einstein pushed his ball-in-box analogy even further. What if the small-scale processes that physicists were used to talking about were amplified to human sizes? Writing to Schrödinger in early August, Einstein laid out a new scenario: imagine a charge of gunpowder that was intrinsically unstable, as likely as not to explode over the course of a year. “In principle this can quite easily be represented quantum-mechanically,” he wrote. Whereas solutions to Schrödinger’s own equation might look sensible at early times, “after the course of a year this is no longer the case at all. Rather, the ψ-function”—the wave function that Schrödinger had introduced into quantum theory back in 1926—“then describes a sort of blend of not-yet and of already-exploded systems.” Not even Bohr, Einstein crowed in his letter, should accept such nonsense, for “in reality there is just no intermediary between exploded and not-exploded.”9 Nature must choose between such alternatives, Einstein insisted, and so, therefore, should the physicist.
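  In the same spirit—and with amplitudes that are an illustrative assumption, not anything Einstein wrote out—the “blend” he described can be sketched as a wave function whose two pieces shift in weight over time:

  \[
  \psi(t) \;=\; e^{-t/2\tau}\,|\text{not exploded}\rangle \;+\; \sqrt{1 - e^{-t/\tau}}\;|\text{exploded}\rangle,
  \]

  with the constant \(\tau\) chosen so that after a year each squared amplitude comes out roughly one half—“as likely as not,” just as Einstein stipulated. The formalism supplies only those probabilities; it never says which term describes the actual heap of gunpowder, and that silence was precisely Einstein’s complaint.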

  Einstein could have reached for many different examples of large-scale effects with which to criticize a quantum-probabilistic description. His particular choice—the unmistakable damage caused by exploding caches of gunpowder—likely reflected the worsening situation in Europe. As early as April 1933, he had written to another colleague to describe his view of how “pathological demagogues” like Hitler had come to power, pausing to note that “I am sure you know how firmly convinced I am of the causality of all events”—quantum and political alike. Later that year he lectured to a packed auditorium in London about “the stark lightning flashes of these tempestuous times.” To a different colleague he observed with horror that “the Germans are secretly re-arming on a large scale. Factories are running day and night (airplanes, light bombs, tanks, and heavy ordnance)”—so many explosive charges ready to explode. In 1935, around the time of his spirited exchange with Schrödinger about quantum theory, Einstein publicly renounced his own prior commitment to pacifism.10