Readers: My apologies for the interruption. We had a slight glitch while building the new website and spent the last four weeks correcting it instead of updating our columns. We, and they, are back, and we hope you like the new look. All the missing/interim columns can still be found in the archives. - Ed.

"Machine Thoughts"

I have never seen a cotton gin, and I don't know if you have. Even so, there is a pretty good chance that it brings to your mind the name of Eli Whitney. Right or wrong, we attach people to inventions. Orville and Wilbur Wright get the airplane, Ben Franklin the lightning conductor, Alexander Graham Bell the telephone, Thomas Edison the light bulb and the phonograph. For the sewing machine it's a toss-up between Elias Howe and Isaac Singer, who sued each other over the invention in 1854, despite the fact that a Frenchman, Barthélemy Thimonnier, had built one in 1831 and been granted US patents on the device in 1848.

And the computer, on which I am writing this column and with which you are perhaps reading it? Does a name spring to mind?

If it does, more than likely it is the eccentric English polymath, Charles Babbage, who in 1834 set out the ideas for his Analytical Engine, able to perform all the operations embodied in today's computers. His device would be mechanical - in the 1830s no one dreamed of vacuum tubes, still less transistors - and driven by a steam engine. Programs would be entered using punched cards, in the manner already employed in the Jacquard automatic loom. We have to say "would be," because although all the ideas were there, Babbage was never able to build his own brainchild despite forty years of trying.

After that, the road leading to today's electronic computers took several separate and occasionally murky branches. In 1889, Herman Hollerith adopted the use of punched cards to process the results of the upcoming 1890 US decennial census, an effort which would otherwise have taken more than ten years. He did not, however, seek to make a computer in the sense intended by Babbage. In the 1930s, John Atanasoff built an electronic (as opposed to a slower electrical) calculating machine, for the special purpose of solving large systems of equations. In 1937, the Harvard Mark I relay machine, otherwise known as the Automatic Sequence Controlled Calculator, was proposed for solving problems in electrical engineering (this machine in its completed form was a monster, fifty-two feet long and eight feet high). In 1939, the German engineer Konrad Zuse proposed to make an electrical-relay version of Babbage's machine, containing most of the ideas but lacking an ability to take different program paths depending on calculated results.

Everything seemed set for the first electronic computer, the machine which would employ super-fast electronic circuits to implement the general Babbage design.

And what happened next? No one can give a definite answer, because in 1939 the world went to war and progress in building computers, natural tools for the vast computations needed to break enemy codes, went underground.

The world's first true electronic computer may have been Colossus, a machine built by Tommy Flowers from a design by Max Newman. It employed general ideas developed by Alan Turing and fellow workers at England's super-secret decoding facility at Bletchley Park, but we may never learn all the details of how it worked. Like everything at Bletchley, the machine was covered by the Official Secrets Act. After the war the Colossus line of machines was destroyed, and Flowers was ordered to get rid of the design documents, which he did by burning them. The plans for what could well have been the first computer were lost forever, and as a result the "official" first electronic computer is ENIAC, built in 1946 by John W. Mauchly and J. Presper Eckert at the University of Pennsylvania's Moore School of Electrical Engineering. ENIAC could perform 5,000 calculations a second. The speed of Colossus is not recorded, though we do know its delivery date to Bletchley Park: December 8, 1943.

How does the computer I am using today compare with ENIAC? Well, I tend to be old-fashioned, and refuse to change to a new machine until I am convinced that all the bugs are out of both hardware and software. My laptop is six years old. If we allow for both computing speed and storage capacity, it is probably a million times as powerful as ENIAC. Today's top-of-the-line machines are at least a thousand times better than that.

An improvement by a factor of a billion in less than sixty years is startling enough, but inevitably it makes us ask what another six decades will produce. We don't know, of course, any more than anyone in 1946 could imagine the hundreds of millions of machines in use everywhere today. To Thomas J. Watson, chairman of IBM in 1943, is attributed the possibly apocryphal statement, "I think there is a world market for maybe five computers."
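As a quick back-of-envelope check on those figures, here is a small sketch in Python. The numbers come from the paragraphs above; the 1946-2002 span and the doubling-time framing are my own illustration, not the author's.

    import math

    # Figures quoted above (illustrative arithmetic only).
    eniac_ops_per_sec = 5_000               # ENIAC: 5,000 calculations a second
    laptop_factor = 1_000_000               # "a million times as powerful as ENIAC"
    top_end_factor = laptop_factor * 1_000  # "at least a thousand times better than that"

    print(f"Overall improvement: {top_end_factor:,}-fold")  # 1,000,000,000-fold

    # Spread over the 56 years from 1946 to 2002, a billion-fold gain
    # implies a doubling of capability roughly every two years.
    years = 2002 - 1946
    print(f"Implied doubling time: {years / math.log2(top_end_factor):.1f} years")

Run as written, this prints a billion-fold improvement and a doubling time of about 1.9 years - the arithmetic behind the "factor of a billion in less than sixty years."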

In terms of applications programs, I would certainly like to be able to dictate text and have it appear accurately on my screen, rather than providing the gobbledygook that is often returned to me by even the best voice recognition software. That doesn't seem too much to ask. I think we are talking no more than ten years ahead for this and similar well-defined (but hard to implement) functions. Nor does it seem unreasonable to ask for an operating system which does not at intervals freeze up completely, to the point where it is necessary to switch off the computer and start everything again from scratch (yes, of course I am talking about Windows).

Hardware advances are more difficult to predict. One prospect, at the moment seen only dimly, is for quantum computers. These will permit a vast number of calculations (billions at a minimum) to proceed simultaneously, rather than being performed in the sequential, step-by-step fashion common today.

Quantum computers exist at the moment only at the level of simple logic gates. If they can be built up to the level of large general-purpose machines, they will exceed today's computers by as much as today's computers exceed an abacus.

Is quantum computing no more than a fad or a fancy, something which intrigues us today but will never achieve practical importance? I don't know. But here is what the editor in charge of business books for Prentice Hall wrote in 1957: "I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year."

Replace "data processing" in that sentence by "quantum computing," and perhaps you have today's clouded view of tomorrow.


Copyright - Dr. Charles Sheffield 2002


"Borderlands of Science"
by Dr. Charles Sheffield




Dr. Charles Sheffield was born and educated in England, but has lived in the U.S. most of his working life. He is the prolific author of forty books and numerous articles, ranging in subject from astronomy to large-scale computing, space travel, image processing, disease distribution analysis, earth resources, gravitational field analysis, nuclear physics and relativity.
His most recent book, "The Borderlands of Science," defines and explores the latest advances in a wide variety of scientific fields - just as his column of the same name does.
His writing has won him the Japanese Seiun Award, the John W. Campbell Memorial Award, and the Nebula and Hugo Awards. Dr. Sheffield is a Past President of the Science Fiction Writers of America and a Distinguished Lecturer for the American Institute of Aeronautics and Astronautics, and has briefed Presidents on the future of the U.S. Space Program. He is currently a top consultant for the Earthsat Corporation.




Dr. Sheffield @ The White House



Write to Dr. Charles Sheffield at: Chasshef@aol.com



"Borderlands of Science" Archives