Saturday, February 1, 2014

Two Ideas

In the 1930s, Claude Shannon had two ideas that gave rise to modern computing.  The first is that logic can be expressed in electrical circuitry.  In 1854, George Boole developed a form of logic in which the only possible values are true and false and the only operators are ‘and’, ‘or’, and ‘not’.  Before this, the dominant form of logic had been Aristotle’s syllogism.  (All men are mortal.  Socrates is a man.  Therefore, Socrates is mortal.)  Boole’s logic laid the groundwork for set theory and statistics.

Shannon realized that the presence or absence of a current could stand for truth or falsity, and that the Boolean operators could be implemented with simple circuits: ‘and’ by two switches in series, ‘or’ by two switches in parallel, and ‘not’ by a normally-closed relay.  These simple components are the building blocks of today’s far more complicated machines.
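
To make this concrete, here is a small sketch of my own (Python standing in for wire, nothing from Shannon’s actual thesis) that models each operator as the circuit behavior described above:

    # A closed switch / live wire is True; an open switch / dead wire is False.

    def series(a, b):
        """AND: current flows only when both switches in series are closed."""
        return a and b

    def parallel(a, b):
        """OR: current flows when either branch of the parallel circuit is closed."""
        return a or b

    def relay(a):
        """NOT: a normally-closed relay opens when energized, inverting its input."""
        return not a

    # Truth and falsity expressed as current or no current:
    for a in (False, True):
        for b in (False, True):
            print(a, b, series(a, b), parallel(a, b), relay(a))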

Shannon’s other idea is the postulate that all information can be encoded in Boolean terms, as strings of binary digits.  This postulate is the starting point of information theory.  It is now obvious to us that any number can be written in a binary number system and that any symbol can be encoded in a scheme like ASCII, but this was not at all obvious at the beginning.  Many early computers, the ENIAC among them, still used a decimal number system.
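
A quick illustration of that postulate, using Python as scratch paper (the year 1854 and the letter ‘A’ are arbitrary choices of mine):

    # Numbers reduce to bits through positional binary notation.
    n = 1854
    print(format(n, 'b'))                  # 11100111110

    # Symbols reduce to bits through an agreed-on code such as ASCII.
    c = 'A'
    print(ord(c), format(ord(c), '07b'))   # 65 1000001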

Here’s a simple example showing these two ideas at work.  A one-bit adder (a half adder) can be built from an exclusive-or (XOR) gate and an AND gate.  The truth table is:

INPUTS          OUTPUTS
A    B          SUM    CARRY
0    0          0      0
0    1          1      0
1    0          1      0
1    1          0      1

In logic gates, the wiring is direct: A and B feed an XOR gate to produce SUM, and the same A and B feed an AND gate to produce CARRY.
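
Written out in Python (my own sketch, not a circuit diagram), the half adder is just those two gates, and we can check it against the truth table above:

    def half_adder(a, b):
        """One-bit addition: SUM from an XOR gate, CARRY from an AND gate."""
        return a ^ b, a & b

    for a in (0, 1):
        for b in (0, 1):
            s, carry = half_adder(a, b)
            print(a, b, s, carry)   # prints 0 0 0 0 / 0 1 1 0 / 1 0 1 0 / 1 1 0 1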


Just stop and think about this.  Using two very simple components we have implemented an adding algorithm.  It even carries the one.  Chain a few of these together, letting each stage also take in the carry from the stage before it (which turns it into a full adder), and you can add numbers as large as you like, as the sketch below suggests.  Rig up some light bulbs, power sources, and switches, and you have your I/O.
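
Here is one way to write that chaining down, again Python of my own devising rather than anything canonical: a full adder built from two half adders, and a ripple-carry loop that adds two bit lists of any length.

    def half_adder(a, b):
        return a ^ b, a & b                # SUM from XOR, CARRY from AND

    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2                 # a carry from either half adder propagates

    def ripple_add(a_bits, b_bits):
        """Add two equal-length bit lists, least significant bit first."""
        result, carry = [], 0
        for a, b in zip(a_bits, b_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)               # the final carry-out becomes the top bit
        return result

    # 6 + 7 = 13: six is 110 and seven is 111, written LSB-first below.
    print(ripple_add([0, 1, 1], [1, 1, 1]))   # [1, 0, 1, 1], i.e. 1101 = 13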

It’s hard to appreciate now just how much of a conceptual leap Shannon's two ideas were.  I took multiple semesters of computer hardware design and never really got it.  Charles Petzold’s book Code helped.  He shows how simple the technologies at the heart of computing really are.  In fact, they had been around for a hundred years before Shannon’s work reconceptualized the ways those technologies could be used.  Petzold shows how telegraph relays can be used to build a complex adding machine.

I’m fascinated by Shannon and the history of early computing, even if it’s not directly relevant to my work.  Modern programming languages are so far abstracted from the physical machine that I doubt studying electrical engineering offers much direct benefit to programmers.  If it’s worth studying, it’s as a system of thought, like physics or chemistry.  But it’s also worth returning to these two ideas from time to time.  Are there forms of logic that cannot be captured by computers?  Are the audiophiles right that analog sound is better than digital?  What gets lost in the translation to transistors?

Finally, the birth of computing is simply an interesting moment in intellectual history.  Shannon’s two ideas let us leap over Babbage, whose difference engine was never completed in his lifetime.  Babbage left essentially no intellectual descendants.  Even if the analytical engine “weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves,” in Ada Lovelace’s words, punched cards were arrived at again independently, and the von Neumann architecture owes nothing to Babbage’s design.
