Claude Elwood Shannon (American engineer and mathematician)
Biography of Claude Elwood Shannon
Claude Elwood Shannon (1916-2001) was an American engineer and mathematician, the man who is called the father of modern information and communication theory.
One autumn day in 1989, a correspondent for the magazine Scientific American entered a stone house overlooking a lake north of Boston. But his host, a slender 73-year-old man with a luxuriant mane of gray hair and a mischievous smile, had no desire to reminisce about "things of bygone days" or to discuss his discoveries of thirty to fifty years earlier. Perhaps the visitor would rather have a look at his toys?
Without waiting for an answer, and ignoring the admonitions of his wife Betty, the host lured the bewildered journalist into the next room, where, with the pride of a ten-year-old boy, he showed off his treasures: seven chess-playing machines, a gasoline-powered pogo stick, a folding knife with a hundred blades, a two-seat unicycle, a juggling dummy, and a computer that calculated in Roman numerals. It did not matter that many of the master's creations had long been broken and covered in dust - he was happy.
Who was this old man? Was it really he who, while still a young engineer at Bell Laboratories, wrote in 1948 the "Magna Carta" of the information age, "A Mathematical Theory of Communication"? Was it his work that was called "the greatest in the annals of technological thought"? Was it his discoverer's intuition that was compared to the genius of Einstein? Yes, it was all about him. And it was he who, back in those same 1940s, designed a flying disc with a rocket engine and rode a unicycle through the corridors of Bell Labs while juggling. This was Claude Elwood Shannon, the father of information theory, who proudly declared: "I have always pursued my own interests, without thinking about what they would cost me or about their value to the world. I have spent a lot of time on totally useless things."
Claude Shannon was born in 1916 and grew up in Gaylord, Michigan. Even in childhood, Claude encountered both the concreteness of technical design and the generality of mathematical principles. He constantly tinkered with the crystal radio sets and radio construction kits that his father, a probate judge, brought him, and solved the mathematical problems and puzzles supplied by his older sister Catherine, who later became a professor of mathematics. Claude fell in love with both of these disparate worlds - technology and mathematics.
While a student at the University of Michigan, from which he graduated in 1936, Claude majored in both mathematics and electrical engineering. This dual interest and education shaped the first major success that Claude Shannon achieved in his graduate years at MIT. In his 1937 master's thesis he showed that the operation of switches and relays in electrical circuits could be represented by the algebra invented in the middle of the nineteenth century by the English mathematician George Boole. "It just happened that no one else was familiar with both of those fields at the same time!" was Shannon's modest explanation of his discovery.
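The correspondence Shannon exploited can be sketched in a few lines. This is an illustrative reconstruction of the idea, not his original notation: switches wired in series behave like Boolean AND, switches in parallel like OR, and a normally-closed contact like NOT.

```python
def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (Boolean AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (Boolean OR)."""
    return a or b

def normally_closed(a: bool) -> bool:
    """A normally-closed contact conducts when the switch is released (Boolean NOT)."""
    return not a

def circuit(a: bool, b: bool) -> bool:
    """A hypothetical example circuit: (A in series with B) in parallel with NOT A."""
    return parallel(series(a, b), normally_closed(a))

# The circuit's behavior is completely described by its truth table:
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", circuit(a, b))
```

Reading a relay network as a Boolean formula means circuits can be simplified on paper, with algebra, before a single wire is soldered; that was the thesis's central insight.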
Nowadays there is no need to explain to readers of computer publications what Boolean algebra means for modern circuit design. In 1941 the 25-year-old Claude Shannon went to work at Bell Laboratories. During the war he worked on cryptographic systems, which later helped him discover coding methods with error correction. And in his spare time he began developing the ideas that would later grow into information theory. Shannon's initial goal was to improve the transmission of information over telegraph and telephone lines affected by electrical noise.
But what is information? How should its quantity be measured? Shannon had to answer these questions even before he began researching the capacity of communication channels. In his works of the 1948-49 period, he defined the quantity of information through entropy - a quantity known in thermodynamics and statistical physics as a measure of disorder - and took as his unit of information what was later dubbed the "bit": the choice of one of two equally probable options.
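Shannon's measure can be written as H = -Σ p·log₂(p), summed over the probabilities of the possible outcomes. A minimal sketch of the formula (an illustration of the definition, not anything from Shannon's own writings) shows that a choice between two equally probable options indeed carries exactly one bit:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin (two equally probable options) carries exactly one bit:
print(entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin carries far less information per toss:
print(entropy([0.99, 0.01]))  # about 0.08
# A certain outcome carries no information at all:
print(entropy([1.0]))         # 0.0
```

The biased-coin case captures the intuition: the more predictable a message is, the less information each symbol conveys.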
Shannon later loved to tell how the famous mathematician John von Neumann had advised him to use the term entropy, reasoning that few mathematicians and engineers knew anything about it, which would give Shannon a great advantage in the inevitable disputes. Joke or not, how difficult it is for us now to imagine that only half a century ago the notion of "information" still needed a strict definition - and that this definition could provoke controversy.
On the solid foundation of his definition of information, Claude Shannon proved a remarkable theorem about the capacity of noisy channels. In its complete form this theorem was published in his works of 1957-61, and it now bears his name. What is the essence of Shannon's theorem? Every noisy communication channel is characterized by a maximum rate of information transmission, called the Shannon limit. At rates above this limit, errors in the transmitted information are inevitable. Below it, however, one can come arbitrarily close to the limit: with suitable coding of the information, the probability of error can be made arbitrarily small on any noisy channel.
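For the special case of a channel with Gaussian noise, the Shannon limit takes the concrete form of the Shannon-Hartley formula, C = B·log₂(1 + S/N), where B is the bandwidth and S/N the signal-to-noise ratio. The sketch below, with hypothetical telephone-line numbers chosen only for illustration, shows how the limit is computed:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of a Gaussian channel, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical example: a 3 kHz telephone line with a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)              # 30 dB converts to a linear ratio of 1000
capacity = shannon_capacity(3000.0, snr)
print(round(capacity), "bits per second")  # roughly 30,000 bit/s
```

No amount of engineering cleverness can push reliable transmission above this number; the theorem's surprise was that, with good enough codes, one can get arbitrarily close to it from below.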
These ideas of Shannon's were too prophetic to find application in the years of slow vacuum-tube electronics. But in our era of high-speed chips they are at work wherever information is stored, processed, or transmitted: in the computer and the laser disc, the fax machine and the interplanetary probe. We do not notice Shannon's theorem, just as we do not notice the air.
Beyond information theory, the irrepressible Shannon tried his hand in many areas. He was among the first to suggest that machines could play games and learn. In 1950 he built Theseus, a mechanical mouse remotely controlled by a complex electronic circuit, which learned to find its way out of a maze. In honor of this invention, the IEEE established the international "micromouse" competition, which still draws thousands of engineering students. In the same 1950s Shannon built a machine that "read minds" during the game of matching pennies: a person tried to guess "heads" or "tails", and the machine guessed correctly with probability above 50%, because people cannot avoid falling into patterns that the machine can exploit.
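How a machine can beat 50% against a non-random opponent can be illustrated with a toy predictor. This is a hypothetical sketch, not a description of Shannon's actual circuit: the simulated "human" here has a built-in habit (switching sides 80% of the time after two identical choices), and the machine simply tallies what followed each pair of past moves.

```python
import random

def play(rounds=2000, seed=0):
    """Pit a pattern-tracking predictor against a simulated, slightly
    predictable 'human' choosing heads (H) or tails (T); return the
    machine's fraction of correct guesses."""
    rng = random.Random(seed)
    history = []   # the human's past moves
    stats = {}     # (last two moves) -> {next move: count}
    wins = 0
    for _ in range(rounds):
        # Machine's prediction: the most frequent follow-up to the last two moves.
        key = tuple(history[-2:])
        counts = stats.get(key, {})
        prediction = max(counts, key=counts.get) if counts else rng.choice("HT")
        # Human's move: after two identical choices, switch 80% of the time.
        if len(history) >= 2 and history[-1] == history[-2]:
            if rng.random() < 0.8:
                human = "T" if history[-1] == "H" else "H"
            else:
                human = history[-1]
        else:
            human = rng.choice("HT")
        wins += (prediction == human)
        stats.setdefault(key, {}).setdefault(human, 0)
        stats[key][human] += 1
        history.append(human)
    return wins / rounds

print(play())  # noticeably above 0.5
```

Against a truly random opponent no predictor can exceed 50%; the machine's edge comes entirely from the human's habits, which is exactly the point Shannon's gadget made.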
In 1956 Shannon left Bell Labs, and the following year he became a professor at the Massachusetts Institute of Technology, from which he retired in 1978. Among his students was, in particular, Marvin Minsky, along with other prominent scientists working in the field of artificial intelligence.
Shannon's works, which pure science regards with reverence, are equally interesting for applied tasks. Shannon laid the foundations of modern error-correcting coding, without which no hard disk drive or video-streaming system could operate today - and, quite possibly, many products built on it have yet to see the light.
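The core idea of error-correcting coding - adding redundancy so the receiver can vote errors away - can be shown with the simplest possible scheme, a rate-1/3 repetition code. Modern codes are far more efficient; this sketch is only an illustration of the principle:

```python
def encode(bits):
    """Triple each bit (rate-1/3 repetition code)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each group of three; this corrects any single
    flipped bit within a group."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
received = sent[:]
received[4] ^= 1                      # the noisy channel flips one bit
assert decode(received) == message    # yet the message survives intact
print("corrected:", decode(received))
```

Shannon's theorem promises something far stronger than this toy: codes whose rate approaches the channel capacity while the error probability is driven as low as desired.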
At MIT, and then in retirement, he was fully seized by his long-standing fascination with juggling. Shannon built several juggling machines and even developed a general theory of juggling, which, however, did not help him beat his personal record: juggling four balls. He also tried his hand at poetry, and developed various models of stock trading, testing them (by his own account, successfully) on his own shares.
But from the early 1960s onward, Shannon did virtually nothing new in information theory. It looked as if, in a mere twenty years, he had grown tired of the theory he himself created. Such a phenomenon is not rare in the world of science, and in such cases scientists use a single word: burned out. Like a light bulb, or what? A more accurate comparison, I think, is between scientists and stars. The most powerful stars do not shine for long - about a hundred million years - and end their creative life in a supernova explosion, during which nucleosynthesis occurs: from hydrogen and helium the entire periodic table is born. You and I are made of the ashes of those stars, and in the same way our civilization is a product of the rapid burning of the most powerful minds. There are stars of a second type: they burn evenly for billions of years and give light and warmth to inhabited planets (at least one). Researchers of this type are also badly needed by science and humanity: they supply civilization with the energy of steady development. And there are stars of the third class - red and brown dwarfs - which shine and warm only enough for themselves.
In 1985, Claude Shannon and his wife Betty unexpectedly attended the International Symposium on Information Theory in Brighton, England. Shannon had not appeared at conferences for almost an entire generation, and at first no one recognized him. Then the participants began to whisper: that modest gray-haired gentleman is Claude Elwood Shannon - the Claude Shannon! At the banquet Shannon said a few words, juggled three (alas, only three) balls, and then signed hundreds of autographs for the stunned engineers and scientists who lined up. Those standing in line said they felt what physicists would feel if Sir Isaac Newton appeared at one of their own conferences.
Claude Shannon died in 2001, at the age of 84, in a Massachusetts nursing home, after suffering from Alzheimer's disease.