History
A full set of 8 trigrams and 64 hexagrams, analogous to the 3-bit and 6-bit binary numerals, was known to the ancient Chinese through the classic text I Ching. An arrangement of the hexagrams of the I Ching, ordered according to the values of the corresponding binary numbers (from 0 to 63), and a method for generating them, was developed by the Chinese scholar and philosopher Shao Yong in the 11th century. However, there is no evidence that Shao understood binary computation; the ordering is also the lexicographical order on sextuples of elements chosen from a two-element set.
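To make the correspondence concrete, the following sketch (a modern illustration, not from the source) enumerates all 64 hexagrams as 6-bit patterns and confirms that counting from 0 to 63 in binary yields exactly the lexicographic order on sextuples drawn from a two-element set:

    from itertools import product

    # Counting order: zero-padded 6-bit binary representations of 0..63.
    counting = [format(n, "06b") for n in range(64)]

    # Lexicographic order: all sextuples over the two-element set {"0", "1"},
    # standing in for the broken and solid lines of a hexagram.
    lexicographic = ["".join(t) for t in product("01", repeat=6)]

    assert counting == lexicographic   # the two orderings coincide
    print(counting[:4])                # ['000000', '000001', '000010', '000011']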
The Indian writer Pingala (c. 200 BC) developed advanced mathematical concepts for describing prosody, and in doing so presented the first known description of a binary numeral system.[1][2]
Similar sets of binary combinations have also been used in traditional African divination systems, such as Ifá, as well as in medieval Western geomancy; the base-2 system used in geomancy had long been widely applied in sub-Saharan Africa.
In 1605 Francis Bacon discussed a system by which letters of the alphabet could be reduced to sequences of binary digits, which could then be encoded as scarcely visible variations in the font in any random text. Importantly for the general theory of binary encoding, he added that this method could be used with any objects at all: "provided those objects be capable of a twofold difference only; as by Bells, by Trumpets, by Lights and Torches, by the report of Muskets, and any instruments of like nature".[3] (See Bacon's cipher.)
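As an illustration of the idea, here is a minimal sketch of a Bacon-style biliteral encoding (a simplified modern variant using the 26-letter alphabet, not Bacon's original 24-letter table), in which each letter becomes a five-symbol sequence over the two-symbol alphabet {A, B}:

    def bacon_encode(text: str) -> str:
        """Encode letters as 5-symbol A/B groups (Bacon-style biliteral cipher)."""
        groups = []
        for ch in text.upper():
            if "A" <= ch <= "Z":
                bits = format(ord(ch) - ord("A"), "05b")   # letter index as 5 bits
                groups.append(bits.replace("0", "A").replace("1", "B"))
        return " ".join(groups)

    def bacon_decode(code: str) -> str:
        """Invert bacon_encode: read each A/B group back as a 5-bit letter index."""
        return "".join(
            chr(ord("A") + int(g.replace("A", "0").replace("B", "1"), 2))
            for g in code.split()
        )

    print(bacon_encode("BACON"))                           # AAAAB AAAAA AAABA ABBBA ABBAB
    print(bacon_decode("AAAAB AAAAA AAABA ABBBA ABBAB"))   # BACON

In Bacon's scheme, the A/B distinction would then be concealed as two subtly different typefaces in an innocuous carrier text, as the passage above describes.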
The modern binary number system was fully documented by Gottfried Leibniz in his 1703 article Explication de l'Arithmétique Binaire. Leibniz's system uses 0 and 1, like the modern binary numeral system. As a Sinophile, Leibniz was aware of the I Ching and noted with fascination how its hexagrams correspond to the binary numbers from 0 to 111111, and concluded that this mapping was evidence of major Chinese accomplishments in the sort of philosophical mathematics he admired.[4]
In 1854, British mathematician George Boole published a landmark paper detailing an algebraic system of logic that would become known as Boolean algebra. His logical calculus was to become instrumental in the design of digital electronic circuitry.
In 1937, Claude Shannon produced his master's thesis at MIT that implemented Boolean algebra and binary arithmetic using electronic relays and switches for the first time in history. Entitled A Symbolic Analysis of Relay and Switching Circuits, Shannon's thesis essentially founded practical digital circuit design.
In November 1937, George Stibitz, then working at Bell Labs, completed a relay-based computer he dubbed the "Model K" (for "Kitchen", where he had assembled it), which calculated using binary addition. Bell Labs thus authorized a full research programme in late 1938 with Stibitz at the helm. Their Complex Number Computer, completed January 8, 1940, was able to calculate with complex numbers. In a demonstration at the American Mathematical Society conference at Dartmouth College on September 11, 1940, Stibitz was able to send the Complex Number Computer remote commands over telephone lines by a teletype. It was the first computing machine ever used remotely over a phone line. Among the conference participants who witnessed the demonstration were John von Neumann, John Mauchly, and Norbert Wiener, who wrote about it in his memoirs.