(1916–2001) American mathematician
Born in Gaylord, Michigan, Shannon graduated from the University of Michigan in 1936. He later worked at both the Massachusetts Institute of Technology and Bell Telephone Laboratories. In 1958 he returned to MIT as Donner Professor of Science, a post he held until his retirement in 1978.
Shannon's greatest contribution to science was in laying the mathematical foundations of communication theory. The central problem of communication theory is to determine the most efficient ways of transmitting messages. What Shannon did was to show a precise way of quantifying the information content of a message, thus making the study of information flow amenable to exact mathematical treatment. He first published his ideas in 1948 in the paper A Mathematical Theory of Communication; an expanded book version, The Mathematical Theory of Communication (1949), included an interpretive essay by Warren Weaver. The resulting theory found application in fields as wide-ranging as circuit design, computer design, and communication technology in general, and even in biology, psychology, semantics, and linguistics. Shannon's work made extensive use of the theory of probability; he also borrowed the concept of entropy from thermodynamics, applying it as a measure of uncertainty, or lack of information, in a message.
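To make the idea of quantifying information concrete: Shannon's entropy, H = −Σ p·log₂(p), gives the average information per symbol of a source. The following Python sketch (an illustration added here, not part of the original entry) estimates the entropy of a message from its empirical symbol frequencies:

```python
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of the message's
    empirical symbol distribution: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four equally likely symbols require 2 bits per symbol on average:
print(entropy("abcd"))  # 2.0
# A message with only one symbol carries no information:
print(entropy("aaaa"))  # 0.0
```

Higher entropy means the message is less predictable and therefore needs more bits per symbol to transmit, which is exactly the quantity Shannon's coding theorems bound.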
Shannon also made important contributions to computer science. In his paper A Symbolic Analysis of Relay and Switching Circuits (1938) he drew the analogy between truth values in Boolean logic and the binary states of switching circuits. He also introduced the term ‘bit’, short for ‘binary digit’, as the unit of information (a coinage he credited to his colleague John W. Tukey).
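The analogy in the 1938 paper can be sketched briefly: two relay switches wired in series conduct only when both are closed (logical AND), while switches wired in parallel conduct when either is closed (logical OR). This small Python illustration, added here as a hypothetical example rather than anything from Shannon's paper, mirrors that correspondence:

```python
# Two switches in series: current flows only if both are closed -> AND.
def series(a: bool, b: bool) -> bool:
    return a and b

# Two switches in parallel: current flows if either is closed -> OR.
def parallel(a: bool, b: bool) -> bool:
    return a or b

print(series(True, False))    # False: one open switch breaks a series circuit
print(parallel(True, False))  # True: one closed switch completes a parallel circuit
```

This correspondence is what lets Boolean algebra be used to analyze and simplify relay networks, the insight underlying digital circuit design.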