"Chilling Out"

Humans have known how to make things hot for a long time. No one is sure how long, but at the very least it is tens of thousands of years, and perhaps more than a hundred thousand, since the discovery of the use of fire.

Making things cold, on the other hand, is something we have learned to do only recently. The Greeks and Romans knew that you could lower the temperature (although they did not use that word) of something by putting it into a mixture of salt and ice. But they had no way of systematically reaching lower and lower temperatures. That had to wait until the late 18th and early 19th centuries, when new methods were invented.

One of them may sound familiar. Compress a gas, which makes it hotter; let it shed that heat to its surroundings; then allow it to expand again, when it will absorb heat from whatever is next to it. I have just described the principle of the refrigerator, which was patented by Jacob Perkins in 1834.

William Thomson and James Joule discovered a different method in 1852. It relies on the fact that a gas escaping from a valve into a chamber of lower pressure will, under the right conditions, become colder. Using this, in combination with the technique called "boiling under reduced pressure," lower and lower temperatures can be achieved.

The natural question is, how low can you go? The first answer, the one that everyone might have offered 500 years ago, is that there is no limit. After all, we can (and do) heat things to thousands and even millions of degrees. Why can't we cool them just as much?

The answer to that question was provided only when scientists realized that heat is nothing more than motion at the atomic and molecular scale. "Absolute zero" could then be identified as no motion, the temperature of an object when you "took out all the heat." That happens at about -273.15 degrees Celsius. This seems quite a modest value, since the filament of an ordinary light bulb heats to several thousand degrees Celsius. However, getting down close to absolute zero proved to be hugely difficult.

People tried and tried, all through the 19th century. The breakthroughs often were measured by the ability to liquefy, and then to freeze, different gases. So we had liquid ammonia (-33 C) in 1799, solid carbon dioxide (dry ice, -78.5 C) in 1834, liquid oxygen (boiling point -183 C) in 1883, and liquid hydrogen (-253 C) in 1898. There was still one holdout, helium, which obstinately refused to liquefy until Heike Kamerlingh Onnes finally did it, in 1908, at an astonishing -268.9 C -- only 4.2 degrees above absolute zero.
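As a quick check of the arithmetic, the milestones above can be restated on the Kelvin scale. Here is a minimal Python sketch, assuming the standard conversion K = C + 273.15 and using the column's quoted Celsius values (not independent measurements):

```python
def celsius_to_kelvin(c):
    """Convert a temperature in degrees Celsius to kelvin."""
    return c + 273.15

# Liquefaction/freezing milestones as quoted in the column, in degrees C.
milestones = {
    "liquid ammonia (1799)": -33.0,
    "dry ice (1834)": -78.5,
    "liquid oxygen (1883)": -183.0,
    "liquid hydrogen (1898)": -253.0,
    "liquid helium (1908)": -268.9,
}

for name, c in milestones.items():
    print(f"{name}: {c} C = {celsius_to_kelvin(c):.2f} K")
```

The last line of output confirms the point of the paragraph: liquid helium sits only about 4 kelvin above absolute zero, far below every gas liquefied before it.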

And then the fun started.

Liquid helium proved to be a bizarre substance indeed. Below about 2.2 degrees above absolute zero it goes into a second liquid form, called helium II. Helium II seemed (and still seems) to have absolutely no viscosity: start it moving along a circular tube, and it will keep on running forever. It is a "superfluid." Just as odd, and in many ways far more important, Kamerlingh Onnes found in 1911 that mercury cooled in liquid helium offers no resistance to the passage of an electrical current. It is a "superconductor."

The quest for lower and lower temperatures went on, reaching 0.25 degrees above absolute zero in 1926 (termed 0.25 K, the K standing for Kelvin - William Thomson, mentioned above, became Lord Kelvin in 1892). That record was lowered to 0.00001 K in 1956, and today stands within a few billionths of a degree of absolute zero.

Many researchers, however, had already stopped along the way, beguiled by the promise of liquid helium and the mystery of superconductors. No one in the first decade of the 20th century could explain how superconductors worked; that explanation had to wait until 1957. However, it did not take much scientific knowledge to realize that if superconductors could be produced at higher temperatures, a huge economic payoff was possible. Electrical current runs everywhere, in this country and around the world. Because even the best ordinary conductors resist the passage of those currents, enormous sums are wasted every year in electricity converted to useless heat. Anyone who could make cheap superconductors - preferably superconductors operating at the temperatures of everyday life, rather than at the extreme cold of liquid helium - would become a multi-billionaire.

Certainly people tried, but their efforts could not be called a great success. The best that anyone could do led to materials that acted as superconductors only within less than 20 degrees of absolute zero. That meant they had to be cooled by liquid hydrogen or liquid helium, both quite difficult to produce and quite expensive to store. Gradually, most researchers gave up on the idea of high-temperature or room-temperature superconductors.

Most, but not all. In January of 1986, two researchers in Switzerland, K. Alex Müller and J. Georg Bednorz, made a ceramic material that became a superconductor at 11 K. That did not seem earth-shattering, but they suspected they might be on to something big. They continued to work and later that year produced new materials, which became superconducting at 30 K, well above the old record. By early the following year, others had made superconductors operating at over 90 K.

To see why that produced such excitement, you have to know that nitrogen becomes a liquid at 77 K. So the new superconductors could be cooled using liquid nitrogen, easy to produce and also to store.

Of course, what we would really like is a superconductor that operates not at 77 K, but at, say, 25 C - the sort of temperature you find around the house. If we had that, electrical transmission costs would go down dramatically. Computers would not need to worry about heat generation, permanent magnets would become stronger than ever before, and such goodies as magnetically levitated trains would be attractive commercial prospects.

How long before we have these things? No one knows. Some scientists argue on theoretical grounds that a room-temperature superconductor is impossible. However, from 1957, when Bardeen, Cooper and Schrieffer produced an explanation of what we might call "classical" superconductivity, to 1986, when Müller and Bednorz had their first success with ceramics, a superconductor at 30 K would have seemed just as impossible.

My favorite scientific motto may be at work: "It's not the things we don't know that cause the trouble, it's the things we know that ain't so." Look for room-temperature superconductors before the midpoint of this century.

And cheaper electricity, along with them? Hey, we're talking here about utility companies.


Copyright Dr. Charles Sheffield, 2001

"Borderlands of Science" is syndicated by:


"Borderlands of Science"
by Dr. Charles Sheffield


Dr. Charles Sheffield was born and educated in England, but has lived in the U.S. for most of his working life. He is the prolific author of forty books and numerous articles on subjects ranging from astronomy to large-scale computing, space travel, image processing, disease distribution analysis, earth-resources gravitational field analysis, nuclear physics, and relativity.
His most recent book, “The Borderlands of Science,” defines and explores the latest advances in a wide variety of scientific fields - just as his column of the same name does.
His writing has won him the Japanese Seiun Award, the John W. Campbell Memorial Award, and the Nebula and Hugo Awards. Dr. Sheffield is a Past-President of the Science Fiction Writers of America and a Distinguished Lecturer for the American Institute of Aeronautics and Astronautics, and has briefed Presidents on the future of the U.S. space program. He is currently a top consultant for the Earthsat Corporation.




Photo: Dr. Sheffield at the White House



Write to Dr. Charles Sheffield at: Chasshef@aol.com



"Borderlands of Science" Archives