(more excerpts from cRed-76)
"Information Theory" | ||||
Parallelity is the tendency for things to develop independently of (or side-by-side with) one another. I discuss this (in connection with economics) in an article that has not yet been distributed on e-mail nationally.
This tendency towards parallelity is found not just in relation to economics but everywhere in nature. For example, a molecule, in "deciding", so to speak, whether its collision with another molecule will result in a simple "bounce" or a more profound rearrangement of atoms, makes such a decision purely on the basis of local conditions. The molecule does not need to ask permission of some chemical equivalent of a Central Committee to know what to do. Similarly, in an ecosystem, a coyote chasing a rabbit does not need the authority of any all-wise ecosystem processing center in order to know whether it is allowed to catch the rabbit. The coyote's brains and strength, those of the rabbit, and local conditions are all that matter. To give a further example of what should be an obvious point, a neuron (i.e., brain cell) does not ask permission of some kind of central unit in the brain to know if it should fire (activate itself). The neuron acts on the basis of local conditions.
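To make the neuron example concrete, here is a minimal sketch in Python of a textbook threshold neuron. The function name, weights, inputs, and threshold are all invented for illustration; the point is simply that the firing decision uses nothing but the signals arriving locally:

    # A model neuron decides whether to fire using only its own inputs.
    # No central unit in the brain is consulted; the decision is local.
    # The numbers here are arbitrary illustrative choices.
    def neuron_fires(inputs, weights, threshold=1.0):
        # Fire if the locally weighted sum of inputs crosses the threshold.
        local_activation = sum(x * w for x, w in zip(inputs, weights))
        return local_activation >= threshold

    # Three incoming signals, each with its own synaptic weight:
    print(neuron_fires(inputs=[1, 0, 1], weights=[0.6, 0.9, 0.5]))  # True (1.1 >= 1.0)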
All systems in nature at any scale, from quark to quasar, conform to this general principle. To put it another way, systems in nature are all organized from "the bottom up" rather than "the top down": complex systems evolve from the simultaneous, independent, local interactions of their constituent elements.
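One standard way to see complexity emerging from purely local interactions is a cellular automaton. The sketch below is my illustration, not anything from the original; the rule number, grid size, and starting state are arbitrary choices. Each cell updates from its own small neighborhood only, with no central controller, yet intricate global patterns appear:

    # Elementary cellular automaton (Wolfram's Rule 110): each cell's next
    # state depends only on itself and its two immediate neighbors.
    RULE = 110

    def step(cells):
        n = len(cells)
        return [
            (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    cells = [0] * 40 + [1] + [0] * 40   # one "on" cell in the middle
    for _ in range(20):                 # print 20 generations
        print("".join(".#"[c] for c in cells))
        cells = step(cells)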
One notable exception to this parallel, simultaneous development of large numbers of interacting local processes is the modern computer. Most computers have a central processing unit (CPU) which, in essence, makes all decisions. Nothing much happens without being routed through the CPU, so the speed at which the computer operates is limited by the speed of the CPU. Such a computer architecture is called the "von Neumann architecture", after John von Neumann, who helped design it.
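A toy sketch of the von Neumann pattern may help: one central loop fetches, decodes, and executes every instruction in turn, so nothing happens except through that single sequential path. The miniature instruction set here is invented for illustration and does not correspond to any real machine:

    # Every operation funnels through this one fetch-decode-execute loop;
    # total speed is bounded by how fast the loop itself can run.
    program = [("LOAD", 5), ("ADD", 3), ("ADD", 2), ("HALT", None)]

    def run(program):
        accumulator = 0
        pc = 0                       # program counter: the CPU's place in the program
        while True:
            op, arg = program[pc]    # fetch
            pc += 1
            if op == "LOAD":         # decode and execute, one instruction at a time
                accumulator = arg
            elif op == "ADD":
                accumulator += arg
            elif op == "HALT":
                return accumulator

    print(run(program))  # 10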
The limitation of the von Neumann architecture (usually called the "von Neumann bottleneck") has been considered a problem for some time. One emerging alternative to the von Neumann bottleneck is what is called "massively parallel processing" (MPP). This employs very large numbers of processors that communicate with one another while still processing information simultaneously.
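As a rough illustration of the parallel idea (using Python's standard multiprocessing module; the data, the four-way split, and the function names are my own choices), several worker processes below each handle their own slice of a problem at the same time, and the partial results are then combined. This shows only the divide-and-combine side of MPP, not worker-to-worker communication:

    from multiprocessing import Pool

    def local_sum(chunk):
        # Each worker computes from its own local data only.
        return sum(chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]     # split the work 4 ways
        with Pool(processes=4) as pool:
            partials = pool.map(local_sum, chunks)  # all four run at once
        print(sum(partials))                        # same answer as sum(data)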
How does this relate to how we organize the party? Well, there are two views of how we might organize our activity and interactions. These two views can be thought of as lying somewhat at opposite poles. [...]