This is a much abbreviated discussion of some aspects of complexity theory and how those aspects of the theory apply to what the Navy must do between now and the Navy-after-next. Its contents are gleaned from or inspired by the writings of M. Mitchell Waldrop and Stuart Kauffman.6
Systems can exhibit two extremes of structure: order and chaos. An exceptionally ordered system has little interaction among its elements. There is little flexibility within it. It does what its structure allows, and no more. It does not interact constructively with new systems, and therefore it neither learns nor evolves. It tends to be rigid. It is Stalinist.
A chaotic system has the opposite problem. It has few standards. It lacks the minimum levels of stability that are needed to maintain and nurture a learning system. It constantly reacts and seldom integrates. Its lack of structure allows everything, and therefore nothing evolves beyond its current state. It tends to be utterly fluid and turbulent. It is Bosnian.
In between these two extremes, at a kind of murky, turbid phase transition called "the edge of chaos," there is complexity. In this phase transition the elements of the system never quite lock into place, yet never quite dissolve into turbulence, either. This system is both stable enough to store information and active enough to transmit it. It is American.7
Complex systems on the edge of chaos can self-organize to react to their environment. To attain the levels of spontaneity and adaptation necessary for self-organization, they must be highly interactive with other related systems (no stovepipes allowed), and very quick to absorb, integrate, and change. The United States has developed such a system to manage its society; the Navy has developed such a system to manage an active flight deck. It is the Navy's task to develop a similarly flat, adaptive, and agile system to manage a successful (which is to say, rapidly evolving) Navy in the edge-of-chaos situation it faces in the trans-industrial world.
Found in the region between order and chaos, coevolution is a process in which two or more related processes support each other in ways that cannot be foreseen before they begin to interact. It is somewhat different from random selection and survival of the fittest.
For example, the invention of the internal combustion engine led to the invention of the automobile, which began life as a rich gentleman's toy.8 Development of the automobile led to development of gas stations, better roads, motels, etc., which in turn encouraged more people to buy automobiles. The growing population of owners and operators of automobiles began to live farther from work, meet people in distant towns, and distribute products more rapidly and efficiently. Tire and rubber industries expanded, and petroleum by-products fed the development of the chemical industries. Better steels and metals were developed, engineering skills were honed and polished, and all was done quicker, cheaper, and better than was possible a few years earlier.
Demand fed competition. Competition fueled the growth of the skills and resources that were applied to further development of the internal combustion engine, constantly improving it. This process unleashed an avalanche of applications, which in turn accelerated the rate of development of the engine and of the related industries it spawned. New applications spawned whole new industries in turn. Each development fed the others in ways totally unimaginable to the inventors and early producers of the internal combustion engine and the automobile.
Simultaneously, the internal combustion engine and the automobile pushed the horse out of its central position in society. The new drove out the old. Out went blacksmiths, saddle makers, stables, carriages, and harness shops. In a reversal of the former order, the horse became the gentleman's toy and the car became a family and social necessity.
As with the internal combustion engine and the automobile, so too with the Navy and its environment. No one knows, and no one can know, what Navy will be needed in the foggy distant future. What is apparent is that the nation must have a Navy that will rapidly interact and coevolve with its changing environment. To build a Navy that thrives at the edge of chaos, some of the characteristics of complexity must first be considered.
Uncertainty is a fact of existence in the complex region between order and chaos. However, it is not the paralyzing uncertainty of chaos. As illustrated by the coevolution of the internal combustion engine and the automobile, new enabling ideas grow from new or freshly fertilized fields, and lead to other completely unexpected ideas in an avalanche of change that nothing can escape.
The Navy is now swept up in the avalanche of change that was initiated, at some unheralded and quiet moment, by the development of silicon chips and the computer. The implications of new technologies and rapid change are profound and engender uncertainties that cause severe disquiet and unease. The Navy is tempted to cling to the security blanket of its successful past, but that sort of clinging, however comforting, will not take it safely through the future, which is arriving now.
It is up to the Navy to learn, now, how to ride the avalanche of change. It must continually and quickly adapt. The consequences of the actions it takes, and whether they will succeed, cannot be known. Nonetheless, it is plain that a fixation on the past will not help the Navy. It is a fact that if the Navy does not act, it will not succeed.
What actions will help the Navy succeed? In an avalanche of rapid change, crystal balls looking far into the future are inevitably cloudy. Thus the best the Navy can do is be "locally wise," observing simple decision rules, and gathering (and digesting) relevant information to help execute its decisions wisely.9
Simple Decision Rules, Locally Applied
What the Navy Can Learn from "Boids"
Birds are not very intelligent animals; they can respond to only the simplest of rules. Nevertheless, birds flock, and as a flock they move elegantly and smoothly in complex environments. If the directions for flocking, and for moving as a flock, were transmitted from the leader to each of the members, the leader and the members of the flock would require an elaborate communications system and considerable processing power. But birds have neither. How do they do it?
In the late 1980s, a gentleman by the name of Craig Reynolds developed a computer flock of "boids." His flock "flew" beautifully, but it was not built and led by a leader from the top down. Rather, it was built from the bottom up, in a scheme in which each boid followed three simple rules of behavior, described by Waldrop as follows:10

1. Keep a minimum distance from other objects in the environment, including other boids.
2. Match velocities with the boids in the neighborhood.
3. Move toward the perceived center of mass of the boids in the neighborhood.
None of the rules was "form a flock," which would have been too hard for a bird/boid to execute. The rules were entirely local, referring only to what a boid could see and do in its own vicinity. The flock formed "from the bottom up." The boids were able to fly as a flock in a complex environment (from a boid's perspective) through the purely local application of those three simple rules.
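The three local rules can be sketched in a few lines of code. This is a minimal illustration, not Reynolds's implementation; the neighbor radius, minimum distance, and rule weights are arbitrary values chosen for the sketch.

```python
# A minimal sketch of Reynolds-style "boid" behavior in two dimensions.
# The neighbor radius, minimum distance, and rule weights below are
# illustrative values, not Reynolds's own parameters.
from dataclasses import dataclass
import math

@dataclass
class Boid:
    x: float   # position
    y: float
    vx: float  # velocity
    vy: float

def step(boids, radius=5.0, min_dist=1.0, dt=1.0):
    """Advance every boid one tick; each boid applies only local rules."""
    new = []
    for b in boids:
        # A boid sees only the neighbors inside its own local radius.
        nbrs = [o for o in boids if o is not b
                and math.hypot(o.x - b.x, o.y - b.y) < radius]
        ax = ay = 0.0
        if nbrs:
            for o in nbrs:
                d = math.hypot(o.x - b.x, o.y - b.y)
                # Rule 1 (separation): keep a minimum distance from others.
                if 0 < d < min_dist:
                    ax -= (o.x - b.x) / d
                    ay -= (o.y - b.y) / d
            n = len(nbrs)
            # Rule 2 (alignment): match the neighbors' average velocity.
            ax += (sum(o.vx for o in nbrs) / n - b.vx) * 0.1
            ay += (sum(o.vy for o in nbrs) / n - b.vy) * 0.1
            # Rule 3 (cohesion): drift toward the neighbors' center of mass.
            ax += (sum(o.x for o in nbrs) / n - b.x) * 0.01
            ay += (sum(o.y for o in nbrs) / n - b.y) * 0.01
        new.append(Boid(b.x + b.vx * dt, b.y + b.vy * dt,
                        b.vx + ax, b.vy + ay))
    return new
```

Note that nothing in the code says "form a flock." Flocking emerges because each boid repeatedly applies the three local rules while every other boid does the same.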
The same formula works for more intelligent entities in much more complex environments, even those that threaten to overwhelm their inhabitants. By applying simple decision rules on the basis of information made available to it, the Navy can flourish in the complexity of its environment while continuing to learn and adapt.
The more ideas the Navy has available to it, the more interaction, stimulation, and coevolution is possible. The more structure the Navy has, the fewer the ideas the Navy gets. An intricate system of stovepipes and bureaucracies (such as the present structure of the Navy) tends to quash ideas and stifle creative thought. Too much structure leads to a highly ordered regime that tends also to be slower, less agile, and less flexible than a less ordered regime.
To open the floodgates that are holding back ideas, the Navy must:
Ideas that do not interact cannot coevolve. For coevolution to occur, diversity of ideas must be catalyzed by communication (diversity without communication among various elements is merely divisive and counterproductive). Modern technology fosters extensive and pervasive communication and thus makes possible flat organizations that depend upon and encourage such communication.
A high degree of reactivity, the rate and intensity of interaction between the various components of a mixture (any mixture, whether chemical, biological, or social), must be sustained in the soup of diversity long enough to establish a course of development. The greater the reactivity, the shorter the time needed for interaction to evolve a new course (i.e., more options can be explored in less time).
Of course, this paragraph also applies to the Department of Defense, to the other Armed Services separately, and to all the Armed Services jointly.
6. See M. Mitchell Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos (New York: Touchstone, 1992); and Stuart Kauffman, At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (New York: Oxford University Press, 1995). Kauffman is a member of the Santa Fe Institute.
7. Kauffman, p. 293. "Stalinist" is the term Kauffman uses to describe the regime of extreme order; "Bosnian" and "American" are the author's own terms. Kauffman uses "Red Queen" to denote the regime of total chaos, and attributes the tag (p. 216) to Leigh Van Valen, a paleontologist at the University of Chicago.
8. The following discussion about the coevolution of industries associated with the internal combustion engine is an extensive expansion by the author of a theme used by the Austrian economist Joseph Schumpeter, as quoted by Kauffman, p. 279.
9. See Kauffman, p. 28.
10. Waldrop, p. 241. For further information on boids, see Mr. Reynolds's web site at http://hmt.com/cwr/boids.html.