Information Encoding and Exchange: Towards a Unified Information-Theoretic Underpinning

The Outside-Inside Framework provides one way to organize several of the key issues and ideas surrounding the use of NBICS technology advances to enhance human performance ("make us all healthier, wealthier, and wiser"). This simple framework can be shown to be largely about understanding and controlling how, where, and what information is encoded and exchanged. For example, consider the following four loosely defined systems and the way information is encoded differently, and interdependently, in each: (a) bits and the digital environment (information), (b) brains and memes and the social environment (cogno-socio), (c) bacteria and genes and the bioenvironment (nano-bio), (d) bulk atoms, raw materials, designed artifacts, and the physical environment (nano-based).

At this point, a brief digression is in order to appreciate the scale of successful information encoding and evolution in each of these loosely defined systems. People have existed in one form or another for about 2 million years, which is roughly a hundred thousand generations (to an order of magnitude). Today, there are about six billion people on Earth. The human body is made up of about 10^13 cells, the human brain about 10^10 cells (10^27 atoms), and the human genome is about 10^9 base pairs. Humans have been good problem solvers over the generations, creating successful civilizations and businesses as well as a growing body of knowledge to draw on in solving increasingly complex and urgent problems. However, in some ways even more impressive than humans are bacteria, according to author Howard Bloom (2001). Bacteria have existed on Earth for about 3.5 billion years, which is an estimated 10^14 bacterial generations. Today, there are an estimated 10^30 bacteria on Earth (roughly tens of millions of bacteria for every human cell), living inside people, insects, and soil, deep below the surface of the Earth, in geothermal hot springs in the depths of the ocean, and in nearly every other imaginable place. Bacteria have been successful "problem solvers," as evidenced by their diversity and ever-growing bag of genetic tricks for solving new problems. People have made use of bacteria for thousands of generations in producing bread, wine, and cheese (and of electronic digital computers only very recently), but only in the past couple of generations have bacteria become both a tool kit and a road map for purposeful gene manipulation. Bacteria and viruses are both allies and threats to humans. For example, bacterial or viral plagues like the influenza pandemic of 1918 are still a major threat today. Among our best new allies in this fight are the advances in life sciences technologies enabled by more powerful digital technology.
Most recently, electronic transistors have been around for less than a century, and at best we have only a few dozen generations of manufacturing technology. Today, there are more than 10^18 transistors on Earth: very roughly 10 million transistors per microprocessor, 100 million PCs manufactured per year, and 10 billion embedded processors.
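
The order-of-magnitude figures above can be checked with a few lines of back-of-envelope arithmetic. A minimal sketch; the generation lengths used (about 20 years per human generation, about 30 minutes per bacterial generation) are illustrative assumptions, not figures from the text:

```python
import math

def order_of_magnitude(x: float) -> int:
    """Nearest power of ten, for rough comparisons."""
    return round(math.log10(x))

# Humans: ~2 million years at an assumed ~20 years per generation.
human_gens = 2e6 / 20
print(order_of_magnitude(human_gens))      # ~10^5 generations

# Bacteria: ~3.5 billion years at an assumed ~30-minute generation time
# (two generations per hour).
bacterial_gens = 3.5e9 * 365 * 24 * 2
print(order_of_magnitude(bacterial_gens))  # ~10^14 generations

# Bacteria per human cell: ~10^30 bacteria vs. ~6e9 people x ~10^13 cells.
per_cell = 1e30 / (6e9 * 1e13)
print(f"{per_cell:.0e}")                   # tens of millions per human cell
```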

Returning to the issue of understanding and controlling how, where, and what information is encoded and exchanged, consider the following milestones in human history (where GA is human generations ago), as seen through the lens of the Outside-Inside Framework:

• Speech (100,000 GA): A new skill (new use of old sensors and effectors, requires learning a new audible language), encoding information in sounds, for exchanging information between people. Probably coincides with the evolution of new brain centers, new organs.

• Writing (500 GA): A new mediator and new skill (new use of old sensors and effectors, requires learning a new visual language), encoding information in visual symbols on materials from the environment for recording, storing, and exchanging information between people. Did not require new brain centers beyond those required for spoken language.

• Libraries (400 GA): A new place and agent (organization) for collecting, storing and distributing written information.

• Universities (40 GA): A new place and agent (organization) for collecting, storing, and distributing information as social capital.

• Accurate clocks (16 GA): A new mediator (tool) for temporal information and spatial information (accurate global navigation).

• Printing (14 GA): A new mediator (tool) for distributing information by making many physical copies of written and pictorial information.

• Telephone (5 GA): A new mediator (tool) for exchanging audio information encoded electrically and transported via wires over great distances.

• Radio (4 GA): A new mediator (tool) for distributing audio information encoded electromagnetically and transported wirelessly over great distances.

• Television (3 GA): A new mediator (tool) for distributing audiovisual information encoded electromagnetically, transported wirelessly over great distances.

• Computers (2 GA): A new mediator and agent for storing, processing, creating, and manipulating information encodable in a binary language.

• Internet (1 GA): A new mediator for distributing information encodable in a binary language.

• Global Positioning System or GPS (0 GA): A new mediator for spatial and temporal (atomic clock accuracy) information.
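
The GA figures in the list above amount to a simple unit conversion. A minimal sketch, assuming roughly 20 years per human generation (the conversion factor is an assumption; the text does not state one):

```python
# "Generations ago" (GA) from approximate years before present,
# assuming ~20 years per human generation (an assumed conversion factor).
YEARS_PER_GENERATION = 20

def generations_ago(years_ago: float) -> float:
    return years_ago / YEARS_PER_GENERATION

# A few of the milestones above, in rough years before present.
milestones = {"Speech": 2_000_000, "Writing": 10_000,
              "Telephone": 100, "Internet": 20}
for name, years in milestones.items():
    print(f"{name}: ~{generations_ago(years):,.0f} GA")
```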

Stepping back even further for a moment (per Bloom 2001), we can identify six fundamental systems for encoding and accumulating information: matter, genes, brains, memes, language, and bits:

• Big Bang (12 billion years ago): New place and new material - the Universe and matter.

• Earth (4.5 billion years ago): New place and new materials - the Earth and its natural resources.

• Bacteria (3.5 billion years ago): New species and agent, encoding information in primitive genome (DNA) in cells.

• Multicellular (2.5 billion years ago): New species with multicellular chains and films.

• Clams (720 million years ago): New species with multiple internal organs with primitive nervous systems.

• Trilobites (500 million years ago): New species with simple brains for storing information (memes possible).

• Bees (220 million years ago): New species and agent; social insect with memes, collective IQ.

• Humans and Speech (2 million years ago): New species and agent, with primitive spoken language and tools, extensive memes, collective IQ.

• Writing (about 10 thousand years ago): New mediator, recordable natural language and symbolic representations.

• Computers (about 50 years ago): New mediator and agent, binary language and predictable improvement curve through miniaturization.

Of course, all these dates are very approximate. The important point is simply this: if the past is the best predictor of the future, then we can expect NBICS convergence to shed light on all of these key systems for encoding, exchanging, and evolving information. If (and this is a big if) we can (1) truly understand, from an information processing standpoint, the workings of material interactions, genes and proteins, nervous systems and brains, memes and social systems, and natural language, and translate all this into appropriate computational models, and (2) use this deep model-based understanding to control and directly manipulate their inner workings to short-cut the normal processes of evolution, then perhaps we can create improvements (solve complex, urgent problems) even faster. Of course, accelerating evolution in this way is both staggeringly difficult to do in reality and, should we succeed, potentially very empowering and dangerous.

Again, the point here is simply that NBICS convergence has zeroed in on the few key information systems that drive enhancements not only to human performance but to the universe as we know it: matter, genes, brains, memes, language, and bits. Does this mean that we have bitten off too much? Perhaps, but it does seem to be time to ask these kinds of convergence questions, much as physicists in the late 1800s began a quest to unify the known forces. In essence, the quest for NBICS convergence is looking for the Maxwell's equations or, better yet, the "unified field theory" for complex dynamic systems that evolve, but in terms of models of information encoding and exchange instead of models of particle and energy exchange. Author and scientist Richard Dawkins, in his book The Selfish Gene, foreshadows some of this thinking with his notion of a computational zoology to better understand why certain animal behaviors and not others make sense from a selfish-gene perspective. Author and scientist Stuart Kauffman, in his book At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, searches for additional mechanisms beyond evolution's natural selection that could be at work in nature. Testing and applying these theories will ultimately require enormous computational resources.

It is interesting to note that computational power may become the limiting factor in enhancing human performance in many of the scenarios described above. What happens when Moore's Law runs out of steam? To throw one more highly speculative claim into the hopper, perhaps quantum computing will be the answer. Recently, IBM researchers and collaborators controlled a vial of a billion-billion (10^18) molecules designed to possess seven nuclear spins. This seven-qubit quantum computer correctly factored the number 15 via Shor's algorithm, with its input programmed by radio-frequency pulses and its output detected by a nuclear magnetic resonance instrument. Certainly, there is no shortage of candidates for the next big thing in the world of more computing power.
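
The factoring result mentioned above can be illustrated classically. In Shor's algorithm, the hard step is finding the multiplicative order r of a base a modulo N; that is what the quantum computer does, and the rest is ordinary number theory. A sketch with the order-finding done by brute-force search (the choice a = 7 is for illustration only; the experiment's internal parameters are not given in the text):

```python
# Number-theoretic core of Shor's algorithm, with the quantum
# period-finding step replaced by brute-force classical search.
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int]:
    """Recover two factors of n from the order of a mod n."""
    assert gcd(a, n) == 1, "a must be coprime to n"
    r = order(a, n)                  # the step a quantum computer speeds up
    assert r % 2 == 0, "need an even order; retry with another a"
    p = gcd(pow(a, r // 2) - 1, n)
    q = gcd(pow(a, r // 2) + 1, n)
    return p, q

print(shor_factor(15, 7))  # -> (3, 5)
```

For N = 15 and a = 7, the order is r = 4 (7, 4, 13, 1 mod 15), giving gcd(48, 15) = 3 and gcd(50, 15) = 5.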
