Abstract Computational Device
When we talk about biocomputers, the first example that usually comes to mind is the human brain. The computational model of brain-mind interaction has gained some popularity, but the biological model differs from the engineered one in a couple of major ways that we need to be aware of.
The first is that the computer was designed as an abstract computational device: a general-purpose tool that can be used in any number of ways.
When we want to use it for something specific, we need a program designed to process specific information in very specific ways to obtain the desired output. We have to input the values we are going to use for our computation, and these values are the only thing that gives our computations meaning. The computer itself was designed without any specific meaning for its code or output. In biological systems, the situation is very different: information processing evolved around meaning to begin with. The values it processes are raw data with meaningfulness built in. It is an inside-out version of the process that produced computers.
The same kind of reorganization of system components is exemplified by the fact that the central processing unit (CPU) is anything but central: every cell does its own processing. In computers, the RAM, CPU, data storage and peripheral devices are all separate units that accomplish the various computational functions. In biological systems, cells adapt to perform all of these functions.
From basic stem cells, all other cells form according to the information activated in the DNA. It must therefore be assumed that the functionality of consciousness is contained within every cell, or at least the coded instructions that enable a cell to participate in consciousness. Whether one believes that consciousness is a purely internal affair of the brain or that it is connected to a universal source, all of the functionality that makes it possible must be contained in each and every cell.
This “inside-out” version of our computer model looks more like a network than a single machine. Yet while these cells all have their own onboard processor, they process in sync, acting very much like a single supercomputer. Instead of processing digital strings in parallel, the biocomputer processes waves of information in a nonlinear fashion, comparing incoming waves with the record of past waves.
Every component within our biological network contains its own network of nested networks of parallel wave processors. Even the “wires” that connect components act like routers, each with its own processor.
All of these processors working in sync generate a coherent electromagnetic field that is a dynamic reflection of the “state” of the organism. Each cell within the system records its own state, as well as the state of the overall organism, in holographic interference patterns. Organs, connecting subsystems and muscle groups form their own coherent fields that monitor and record their own state.
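The idea of recording state in interference patterns can be illustrated with a toy calculation from optics, not biology: when two waves meet, the recorded intensity depends on their relative phase, so a record that stores only intensities still registers phase relationships. This is a minimal sketch of the general interference principle; the function name and the particular phases are illustrative assumptions.

```python
import math

def interference_intensity(phase_a, phase_b):
    """Intensity where two unit-amplitude waves overlap.

    |e^{i*a} + e^{i*b}|^2 = 2 + 2*cos(a - b), so the recorded
    intensity encodes the relative phase of the two waves.
    """
    real = math.cos(phase_a) + math.cos(phase_b)
    imag = math.sin(phase_a) + math.sin(phase_b)
    return real ** 2 + imag ** 2

# Constructive interference: waves in phase reinforce each other.
in_phase = interference_intensity(0.0, 0.0)
# Destructive interference: a half-cycle phase shift cancels them.
out_of_phase = interference_intensity(0.0, math.pi)

print(in_phase, out_of_phase)
```

A shift in the “state” (phase) of either contributing wave moves the whole pattern, which is the sense in which an interference record can register change anywhere in the system.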
The basic process goes like this:
Input arrives as interference patterns generated by the dynamic state of the organism. Past patterns build up to indicate the shape of future patterns and are projected forward as expectations. In this way, input is compared with expectation. Deviation from expectation produces ripples in the interference patterns that define the inconsistency from every possible angle.
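The loop just described, past patterns building an expectation, incoming waves compared against it, and deviations surfacing as ripples, can be sketched as a toy program. Everything here is an illustrative assumption: “memory” is a simple running average of past waves, and the “ripple” is the pointwise deviation of a new wave from that expectation.

```python
import math

def process_wave(incoming, expectation, learning_rate=0.1):
    # The ripple is the deviation of the incoming wave from expectation;
    # the expectation then folds this wave into memory.
    ripple = [i - e for i, e in zip(incoming, expectation)]
    new_expectation = [e + learning_rate * r
                       for e, r in zip(expectation, ripple)]
    return ripple, new_expectation

samples = [2 * math.pi * k / 100 for k in range(100)]
familiar = [math.sin(x) for x in samples]
expectation = [0.0] * 100

# Past patterns build up into an expectation of future patterns.
for _ in range(200):
    expectation = process_wave(familiar, expectation)[1]

# A familiar wave produces almost no ripple.
steady_ripple, _ = process_wave(familiar, expectation)

# A changed wave produces a large ripple that flags the deviation.
novel = [math.sin(x) + 0.5 * math.sin(3 * x) for x in samples]
novel_ripple, _ = process_wave(novel, expectation)

print(max(abs(r) for r in steady_ripple))
print(max(abs(r) for r in novel_ripple))
```

The point of the sketch is only the shape of the loop: expectation, comparison, ripple; real interference patterns would carry vastly more information than a list of samples.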
We can think of this as a linear process, since that is closer to our model of a computer and makes the concept easier to grasp, but in reality these interference patterns contain all of the information available to memory for the entire organism. That is more information than a linear description can accommodate: trillions of cells participating in a dynamic field of their combined states. So when we say “compare input with expectations,” we are talking about comparing oncoming waves of information rather than digital strings.
Waves of Meaning
It would be like standing on the seashore and noticing the changing patterns of waves pounding at your feet. The position of every water molecule is affected by the changing forces of wind, tide, temperature, salinity, the shape of the sea bed, earthly vibrations and more. Imagine looking at each wave and predicting its motion according to past memory, then seeing the deviation from expectation, surmising the change that brought it about, and making corrections to restore predictability. A tough assignment, right? If every molecule in your ocean were a sensor that reported its position and “state” in one dynamic readout, the task would be much easier.
Computers capture the essentials of an information processing system, but only when those basic concepts are expanded to incorporate the distribution of targeted information in waves across a network, with feedback from the information itself controlling distribution, do we begin to reach the level of sophistication demonstrated by biological systems. These are systems that monitor themselves and propose adaptive strategies to attain goals, as intelligent human beings do.
In recent years, science has discovered things like entangled particles, wave functions, feedback loops and holographic principles. These phenomena were thought to have little to do with natural systems, to be concepts confined to the lab and to theoretical exercise. In reality, they are fundamental to systems at all levels. It is not that there are sometimes feedback loops, nor that wave functions sometimes mirror reality’s uncertainties, nor that particles sometimes become entangled, nor that holographic principles sometimes appear, but that these principles work together to produce the most sophisticated information processing system imaginable. The system is all feedback, all entangled, all holographic, all memory, all processing, with intent, with targeted goals achieved by employing strategies. This system exhibits capability that gives new depth to the meaning of the term intelligence.