## Evolution of Technology

Edmond Hamilton's 1928 story *The Metal Giants* depicted an artificial brain that turned against its creators.

Computers descended from calculating machines, the earliest of which was the abacus. In 1642 the French mathematician and philosopher Pascal built a mechanical calculator that used the decimal system to add and subtract. In 1694 the German mathematician and philosopher Leibniz completed a "Stepped Reckoner," which was supposed to multiply, divide and take square roots. It never worked reliably, but it embodied principles later essential to modern computers: tasks were broken down into a great many simple mathematical steps using binary numbers and performed sequentially. When computers later came to be operated by electricity, binary zero and one came to be represented by off and on. In the mid-1800s George Boole developed "Boolean algebra," the mathematical logic by which computer circuits are designed. Charles Babbage and Ada Lovelace, Lord Byron's daughter, designed an "analytical engine" programmed with punched cards. The technology of their day could not construct the machine accurately enough, but a working engine based on Babbage's designs was finally built, and functioned, in the twentieth century.
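The marriage of Leibniz's binary arithmetic and Boole's logic can be sketched in a few lines: real adder circuits add numbers using nothing but logic gates. The following is an illustrative sketch, not any historical machine's design; the function names are invented here.

```python
# Adding two integers using only Boolean operations (AND, OR, XOR),
# the way hardware adder circuits combine binary digits.

def full_adder(a, b, carry_in):
    """One-bit full adder built from XOR, AND and OR gates."""
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def ripple_add(x, y, bits=8):
    """Add two integers bit by bit, as a chain of full adders."""
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(27, 15))  # 42
```

Every arithmetic task the early pioneers cared about reduces, in this fashion, to a great many simple logical steps performed in sequence.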

The first electronic computer was apparently constructed and operated in 1939 by John Vincent Atanasoff, a theoretical physicist at Iowa State University (Mackintosh, 1987). Shortly thereafter, Alan Turing and colleagues at Bletchley Park, England, designed a computer to perform all possible mathematical calculations. It was based on Turing's work proving the logical limits of computability and was used to decipher the German "Enigma" code during World War II. In a masterful presentation of key ideas previously developed by other pioneers, John von Neumann further advanced computer design by separating the machine from its problems. Prior to von Neumann, a computer had to be rewired for each new task. With enough time, memory and software, computers could solve any problem that could be broken down into a finite sequence of logical steps. Most current computers use "serial" processing based on von Neumann's design. In the 1940s, the University of Pennsylvania developed the first large-scale general-purpose electronic computer, the Electronic Numerical Integrator and Computer, or "ENIAC." It weighed 30 tons, took up 3,000 cubic feet of space, and contained 18,000 vacuum tubes, one of which failed every seven minutes. It could calculate nuclear physics problems in two hours that would have taken 100 engineers a year to complete. Today, the same capacity is available on one chip. In 1950 Remington Rand marketed UNIVAC, which dealt with words and numbers stored by their binary equivalents. Since that time, roughly four generations of computers have evolved due to increased demand and advances in design, chip size, materials and other factors. For the same reasons further advances seem inevitable.
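Von Neumann's separation of the machine from its problems can be sketched as a stored-program interpreter: the machine (a fixed execution loop) never changes, while the program lives in memory as data and can be swapped without rewiring. The tiny instruction set below is invented purely for illustration.

```python
# Illustrative sketch of the stored-program idea. The "machine" is the
# fixed loop; each task is just different data placed in memory.

def run(memory):
    """Execute instructions stored in memory; each is an (op, operand) pair."""
    acc, pc = 0, 0            # accumulator and program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            return acc

# Two different tasks, one machine: only the stored program differs.
print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))    # 5
print(run([("LOAD", 10), ("ADD", -4), ("HALT", 0)]))  # 6
```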

Von Neumann and Turing hoped that computers could duplicate our ability to think, so that our minds could be amplified just as our muscles had been by industrial machines. However, further evolution of computers using serial processing seems limited. Computers and artificial intelligence are now evolving toward parallel systems based on brain architecture and neural net models; a future step may be nanoscale, self-organizing intelligence.

Von Neumann is one of several "fathers of the computer." In the "serial" processing which he skillfully formalized, information flows in one dimension. In the 1950s and 1960s, von Neumann (1966) and Stanislaw Ulam developed the mathematics of computing in multiple dimensions. They considered two-dimensional information spaces with discrete subunits ("cells") whose states could vary depending on the states of neighboring cells. Each cell and its neighbor relations were identical. Relatively simple rules among neighbors, applied at discrete time intervals ("generations"), led to evolving patterns and self-organization which were exquisitely sensitive to initial conditions. They called these systems "cellular automata." Von Neumann described a "universal computer" automaton which could solve any problem if given sufficient area and time. Today, computer technologists are considering the profound advantages of implementing molecular scale automata (Milch, 1986).
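A minimal cellular automaton in the von Neumann/Ulam mold can be written in a dozen lines. The rule used below is Conway's later "Game of Life," chosen only as a compact illustration of the general scheme (von Neumann's own universal automaton used a far richer 29-state rule): every cell obeys the same neighbor rule, and the whole grid updates in discrete generations.

```python
# A two-dimensional cellular automaton. Live cells are stored as a set
# of (x, y) coordinates; each generation is computed from neighbor counts.

def step(live):
    """Advance one generation under Conway's rule."""
    neighbors = {}
    for (x, y) in live:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx or dy:
                    n = (x + dx, y + dy)
                    neighbors[n] = neighbors.get(n, 0) + 1
    # A cell is alive next generation if it has exactly 3 live neighbors,
    # or 2 live neighbors and is already alive.
    return {c for c, k in neighbors.items()
            if k == 3 or (k == 2 and c in live)}

# A "blinker": three cells in a row oscillate with period two.
gen0 = {(0, 1), (1, 1), (2, 1)}
print(sorted(step(gen0)))  # [(1, 0), (1, 1), (1, 2)]
```

Even this trivial rule shows the sensitivity to initial conditions the text describes: shifting one starting cell yields an entirely different history.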

Edward Fredkin of the Massachusetts Institute of Technology has considered multidimensional automata and the discreteness of time and matter. He argues that the universe is a cellular automaton whose "cells" are atomic and subatomic particles (Wright, 1985). The universe, Fredkin reasons, is made of information. Cellular automata may be generalized "primordial computers" of which all other computers and complex systems are particular examples. Cellular automata in conformational states of cytoskeletal subunits could process biological information and be the substrate of consciousness.

The current trend in computer design and artificial intelligence or "AI" is parallel connectedness, emulating the brain. Many types of problems can be solved by breaking them down into serial mathematical steps. Today's electronic computers process serially very rapidly and can solve complex mathematical problems far faster than can humans alone. However, qualitative functions which the brain performs naturally, such as recognizing patterns or making judgments, are extremely difficult for computers. Consider the letter "a." We recognize it automatically, in any typeface, in all but the worst handwriting. To our brains it's simple, quick, obvious, even when it's missing: if we see "Sally 'red' a newspaper," we mentally insert the absent "a." Computer/AI scientist Jerome Feldman (1985) cites the example of interpreting the statement "John threw a ball for charity." The inherent ambiguities of this type of statement can be resolved in a highly parallel system in which multiple simultaneous interpretations are processed and evaluated. Hurling a sphere versus hosting a dance can be resolved by the qualifier "for charity," which is much more consistent with a dance than with a sphere. Human brains commonly resolve conflicts among differing drives or inputs, although failure to do so may cause psychiatric or emotional problems. At least according to science fiction, computers can suffer similar disturbances. In Arthur C. Clarke's and Stanley Kubrick's 2001: A Space Odyssey and its sequel 2010, the computer "HAL 9000" becomes psychotic because of conflicting instructions and reacts by killing the space voyagers because their mission was too important to be entrusted to them. The brain/mind can perform "cognitive" functions, including resolution of conflict, by "subcognitive" processes such as recognizing patterns, making assumptions and performing imaginative leaps. The net effect is consciousness: a collective effect of simpler processes.
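The parallel evaluation of multiple interpretations can be caricatured in a few lines. The following toy sketch is not Feldman's actual connectionist network; the cue words and weights are invented for illustration. Both readings of "John threw a ball" stay active simultaneously, and context words push activation toward one of them.

```python
# Toy parallel disambiguation: every interpretation is scored at once
# against the context, and the most activated reading wins.

interpretations = {
    "hurl a sphere": {"pitch": 2.0, "catch": 2.0, "game": 1.0},
    "host a dance":  {"charity": 2.0, "gala": 2.0, "guests": 1.0},
}

def disambiguate(context_words):
    """Score all interpretations in parallel; return the highest-scoring one."""
    scores = {
        meaning: sum(cues.get(w, 0.0) for w in context_words)
        for meaning, cues in interpretations.items()
    }
    return max(scores, key=scores.get)

print(disambiguate(["for", "charity"]))     # host a dance
print(disambiguate(["in", "the", "game"]))  # hurl a sphere
```

A serial program would have to test each reading in turn; in a genuinely parallel system, as in the brain, all candidate interpretations accumulate evidence at the same time.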
