Author: Margaret Harris
Date: November 2001
The coldest stuff in the universe floats in a glass tube no bigger than a man's index finger, half-hidden amid a dizzying tabletop array of lenses, mirrors, magnetic coils, and lasers. Inside the tube, the temperature checks out at a mind-numbing three nanokelvin above absolute zero - more than a billion times colder than the icy emptiness of interstellar space. Suspended by magnets in their icy prison, a few thousand atoms of silvery rubidium gas are acting very strangely. Instead of whizzing around and colliding with each other at random, like atoms in a normal gas, the ultracold rubidium atoms are behaving like well-drilled soldiers, all lined up and moving as if they were a single atom. Which, in a sense, they are - thanks to a phenomenon called Bose-Einstein condensation, the subject of this year's Nobel Prize in physics.
In 1924, the physicists Satyendra Nath Bose and Albert Einstein predicted that if one were to cool a sample of bosons - particles with integer quantum spin, such as rubidium and cesium atoms - below a certain critical temperature, all of the particles in the sample would "fall" into the lowest possible energy level. Once there, quantum mechanics dictates that the particles would all occupy the same quantum state, so each would behave exactly like all the others. The result, Bose and Einstein argued, would be an entirely new state of matter, a "superatom" as different from ordinary gases as a solid is from a liquid, or a liquid from a gas.
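The critical temperature Bose and Einstein predicted can be estimated from a standard textbook formula for a uniform ideal Bose gas. The sketch below uses an assumed, purely illustrative atom density - not a value from any particular experiment - to show that for a dilute rubidium gas the threshold lands well below a microkelvin:

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34              # reduced Planck constant, J*s
k_B = 1.380649e-23                  # Boltzmann constant, J/K
m_Rb = 86.909 * 1.66053906660e-27   # mass of a rubidium-87 atom, kg

def bec_critical_temperature(n):
    """Critical temperature (kelvin) of a uniform ideal Bose gas.

    T_c = (2*pi*hbar^2 / (m*k_B)) * (n / zeta(3/2))^(2/3),
    where n is the number density and zeta(3/2) ~ 2.612.
    """
    zeta_3_2 = 2.612
    return (2 * math.pi * hbar**2 / (m_Rb * k_B)) * (n / zeta_3_2) ** (2.0 / 3.0)

# Illustrative density for a dilute trapped gas (assumed, not measured)
n = 1e19  # atoms per cubic meter
Tc = bec_critical_temperature(n)
print(f"T_c ~ {Tc * 1e9:.0f} nanokelvin")  # tens of billionths of a degree
```

Because the density enters with a 2/3 power, even a thousandfold change in density shifts the threshold by only a factor of a hundred - the gas must be made staggeringly cold no matter how it is prepared.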
From the start, physicists regarded this predicted Bose-Einstein condensate (BEC) as theoretically fascinating, but experimentally impossible. In order to achieve the predicted critical temperature, an experimenter would have to cool a sample of matter down to a fraction of a degree above absolute zero, far beyond all known benchmarks of cold - past liquid nitrogen, past liquid helium, past even outer space, where the average temperature is about 3 kelvin. (Room temperature, by contrast, is 293 kelvin.) The usual way to cool an object is to place it inside something colder - a refrigerator, for example, or a vat of liquid nitrogen. But if scientists wanted to make the coldest stuff in the universe, by definition there wouldn't be anything colder to put it inside. Worse, prevailing theories of low-temperature physics indicated that Bose-Einstein condensation could only occur in a "forbidden" region of the phase diagram; at extremely low temperatures, all matter should be either solid or liquid - and BEC could only occur in a gas.
These difficulties were not lost on physicists, and so the idea of creating a real, live BEC languished for over sixty years. In the late 1970's and early 1980's, however, physicists began to develop cheaper and more sophisticated methods of cooling atoms and keeping them isolated from their environments. Laser cooling was one of these new methods. Like all light, a laser beam is made up of individual particles of light called photons. Each photon has momentum, so when a photon hits an atom, it gives some of its momentum to the atom. In laser cooling, atoms suspended in a magnetic chamber are bombarded with lasers from several different angles, resulting in millions of photon-atom collisions per second. Researchers discovered that if they tuned their lasers to the right frequency, they could use the tiny momentum "kick" of laser photons to slow down the much heavier atoms - much as an intense stream of ping-pong balls fired at a bowling ball would eventually bring the heavier ball to a halt. With multiple laser beams, physicists could slam the target atoms from nearly every direction - creating a condition known as "optical molasses" in which atoms move as sluggishly as a person trying to wade through a swimming pool full of molasses.
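The ping-pong-ball analogy can be made quantitative with a little arithmetic. The sketch below estimates the velocity "kick" a single photon on rubidium's 780-nanometer cooling transition gives the atom, and how many kicks it takes to stop an atom moving at an assumed room-temperature speed of about 300 meters per second:

```python
# Each absorbed photon transfers momentum p = h / wavelength to the atom.
h = 6.62607015e-34   # Planck constant, J*s
lam = 780e-9         # rubidium cooling-transition wavelength, m
m_Rb = 1.443e-25     # mass of a rubidium-87 atom, kg

recoil_v = h / (lam * m_Rb)  # velocity change per photon "kick", m/s
print(f"one photon kick: {recoil_v * 1000:.1f} mm/s")  # ~6 mm/s

# Assumed thermal speed of a room-temperature rubidium atom (illustrative)
v_thermal = 300.0  # m/s
kicks = v_thermal / recoil_v
print(f"kicks needed to stop it: ~{kicks:,.0f}")  # tens of thousands
```

Each kick is tiny - millimeters per second against hundreds of meters per second - but at millions of absorptions per second, the tens of thousands of kicks needed take only a few milliseconds.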
Laser cooling enabled physicists to chill atoms to a few millionths of a degree above absolute zero - plenty cold enough for most low-temperature physics research, but still far too warm for Bose-Einstein condensation. In order to push atoms still farther into the depths of extreme cold, researchers turned to the familiar phenomenon of evaporation. When molecules in a hot liquid evaporate, they carry some of the liquid's heat away with them; the rate at which they evaporate depends on the forces binding one molecule to another within the liquid. In evaporative cooling, researchers cool a magnetically trapped sample of atoms by letting the most energetic - warmest - atoms evaporate out of the trap. Then, they gradually reduce the trap's magnetic field, thereby "lowering the walls" of the trap so that still more of the hottest atoms escape while the remaining atoms collide and re-settle at a lower temperature. The result is a sample with fewer atoms, but a lower average temperature.
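The "lowering the walls" idea can be illustrated with a toy simulation. All numbers below are invented for illustration, and the model simply discards the hottest atoms each round - it ignores the collisions that re-thermalize a real sample between rounds:

```python
import random

# Toy model of evaporative cooling: atoms start with kinetic energies
# drawn from an exponential (Boltzmann-like) distribution, and each
# round the hottest 10% escape over the lowered trap wall.
random.seed(1)
energies = [random.expovariate(1.0) for _ in range(10_000)]
initial_mean = sum(energies) / len(energies)

def evaporate(sample, keep_fraction=0.9):
    """Discard the most energetic (1 - keep_fraction) of the atoms."""
    sample = sorted(sample)
    return sample[: int(len(sample) * keep_fraction)]

for _ in range(10):  # ten rounds of lowering the trap walls
    energies = evaporate(energies)

final_mean = sum(energies) / len(energies)
print(f"{len(energies)} atoms remain; mean energy fell "
      f"from {initial_mean:.2f} to {final_mean:.2f}")
# Fewer atoms survive, but their average energy - the temperature - is
# far lower than where the sample started.
```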
By the early 1990's, a small but growing group of BEC scientists suspected that laser and evaporative cooling could, together, produce temperatures low enough for condensates to form. Moreover, the "forbidden" region - long a theoretical barrier on the path to BEC - no longer seemed quite so forbidding; scientists had demonstrated that if a sample's density stayed very low, matter could remain gaseous below the point where it should have turned into a liquid. A delicate balance remained, however: the sample's density had to be very low to keep the gas from liquefying - but if it were too low, condensation wouldn't occur at all.
Then, on June 5, 1995, a University of Colorado team led by Carl Wieman and Eric Cornell announced that they had done what was once thought impossible: they had created the world's first Bose-Einstein condensate. "The thing about forbidden things is that they tend to be really, really cool," reflects Cornell. "So you kind of have to do them anyway." Wieman and Cornell used cheap diode lasers (similar to the type found in laser pointers and CD players) and evaporative cooling to chill a glass cell full of rubidium gas down to about 20 nanokelvin, well below the critical temperature. Once there, a special mobile magnetic trap enabled them to hit the critical density by "playing keep-away" with the coldest atoms. False-color images of the rubidium showed a sharp peak in the gas' density near the center of the sample, indicating a high-density cluster of atoms in a single energy state. Einstein and Bose had been right.
Four months after the announcement in Colorado, a group led by Wolfgang Ketterle of MIT published the results of its independent BEC effort, which used sodium atoms instead of rubidium. The MIT group had succeeded in creating condensates with many more atoms than the earlier experiments, opening up fascinating new opportunities for making measurements. For this reason, the Royal Swedish Academy of Sciences awarded the 2001 Nobel Prize in physics to all three researchers, honoring Cornell, Ketterle and Wieman for "the achievement of Bose-Einstein condensation in dilute gases of alkali atoms, and for early fundamental studies of the properties of the condensates."
Since the breakthrough year of 1995, experimental and theoretical BEC research has focused on answering basic questions about the properties of a condensate. For example, how many atoms can be in a condensate? How long can condensates exist before something causes them to collapse? What happens when experimenters "jiggle" the condensate with a magnetic field? How can physicists use condensates as tools in other areas of research? And, perhaps most important, what insights can BEC give us on the nature of the physical world?
Although the field of BEC research is still young, partial answers to some of these questions have emerged since 1995. It is now possible to cook up BECs big enough to see with a magnifying glass or microscope, and to preserve them for minutes or even hours. Carl Wieman's group recently showed that under some conditions, changing the magnetic field around a condensate can cause it to explode and eject jets of warmer atoms. This phenomenon is similar to the behavior of exploding stars, or supernovae; because of this, the exploding condensates have been dubbed "Bosenovae." Experiments in Randall Hulet's Rice University laboratory have also indicated similarities between the forces governing the collapse of condensates and the behavior of white dwarf stars - cold, dense remnants of once-active suns. Another set of experiments, directed by Lene Hau of Harvard University, used the extreme cold and density of a BEC to slow a beam of light down to an astonishing 38 miles per hour - nearly twenty million times slower than the speed of light in a vacuum, which clocks in at a constant 6.7 × 10^8 miles per hour (3 × 10^8 meters per second). On the theoretical side, the inherent simplicity of a BEC - since all of the atoms in a BEC are identical, many millions of atoms can be described by a single equation - has made it an important testing ground for ideas, new and old, about how particles interact with one another.
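Hau's slow-light figure invites a quick sanity check, since comparing the beam with light in a vacuum is simple unit arithmetic:

```python
# Speed of light in a vacuum, converted to miles per hour
c_m_per_s = 3.0e8                      # meters per second
c_mph = c_m_per_s * 3600 / 1609.344    # seconds per hour / meters per mile

slow_mph = 38.0  # reported speed of the light beam inside the BEC
print(f"c in vacuum: {c_mph:.2e} mph")             # ~6.7e8 mph
print(f"slowdown factor: {c_mph / slow_mph:.2e}")  # ~1.8e7
```

The factor works out to nearly twenty million - a slowdown so extreme that light in the condensate could be outpaced by a car on the highway.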
Such discoveries hold great promise not only for BEC's "native" field of atomic, molecular, and optical physics, but also for astrophysics and possibly other fields as well. In the future, scientists may use Bose-Einstein condensates to build large-scale, stable atom lasers, or "bosers": devices that produce a coherent beam of matter in much the same way that a laser produces a coherent beam of light. Lasers have revolutionized both basic research and applied technology since their invention in 1960, and based on preliminary studies, some researchers argue that the same could prove true for bosers. For the coldest stuff in the universe, the future looks remarkably hot.