Claude Shannon founded information theory in the 1940s. The theory has long been known to be closely related to thermodynamics and physics through the similarity of Shannon's uncertainty measure to the entropy function. Recent work using information theory to understand molecular biology has unearthed a curious fact: Shannon's channel capacity theorem applies only to living organisms and their products, such as communications channels and molecular machines that make choices from several possibilities. Since he used a property of biology to formulate his mathematics, the author concludes that Claude Shannon was doing biology and was therefore, effectively, a biologist, although he was probably unaware of it. What are the implications of the idea that Shannon was doing biology? First, the author argues that communications systems and molecular biology are on a collision course: as electrical circuits approach molecular sizes, the results of molecular biologists can be used to guide designs. We might envision a day when communications and biology are treated as a single field. Second, codes discovered for communications can potentially teach us new biology if we find the same codes in a biological system. Finally, the reverse is to be anticipated: discoveries in molecular biology about systems refined by evolution over billions of years should tell us how to build new and more efficient communications systems.