Analog - All editorials - John Wood Campbell


There has been very little study of the relationship between individuals and the group generated by the interactions of those individuals - whether those individuals are purely mechanical units, such as relays in a computer, or human beings in a culture. The introduction of the great electronic computers, and of ever more complex systems, and systems-of-systems, has led to a beginning of the study of systems-as-such.
The most pressing aspect of systems-problems has been the obviously high-priority one of systems failures. If we have ten thousand individual units each having a fifty per cent reliability in a one-thousand-hour run, how long can we expect the system, as a whole, to operate before failure, assuming the ten thousand units are connected in series? Answer: about six minutes.
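The six-minute figure can be checked with a short computation. This sketch assumes (my assumption, not stated above) that each unit has an exponentially distributed lifetime, calibrated so that it survives a thousand-hour run exactly half the time; a series system then fails as soon as any one unit does.

```python
import math

# Each unit survives a 1000-hour run with probability 0.5, so its
# (assumed exponential) failure rate is ln(2) / 1000 per hour.
n_units = 10_000
run_hours = 1_000.0
unit_rate = math.log(2) / run_hours

# A series system fails when any single unit fails; its failure
# rate is the sum of the individual unit rates.
system_rate = n_units * unit_rate

# Median time to first system failure (survival probability 0.5).
median_hours = math.log(2) / system_rate
print(f"median time to system failure: {median_hours * 60:.1f} minutes")
```

The median works out to 1000/10000 = 0.1 hour, which is indeed the "about six minutes" of the text.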
Systems don't behave in quite as simple a way, however, when we have multiple-series-parallel connections, with crossover switching for substitution or bridging around defective units, plus feedback for internal self-checking, plus dynamic homeostasis systems, and a few of the other simpler types of arrangements the systems engineers have introduced for improved reliability. The boys in the drafting rooms are beginning to consult the biologists, and the neurologists are starting to look up from computer journals with a sudden realization of the order of, "Sooooo - so that's why the third ganglion of... hm-m-m...". Living organisms have been evolving solutions - purely pragmatic, but extremely competent after three to four billion years of field testing - to systems reliability problems too, of course. Negative and positive feedback systems - telemetering - servomechanisms - amplifier systems - miniaturization to make a miniaturization engineer tear his hair - it's all been there for a billion years or so.
Most of the naturally evolved systems are so darned highly evolved that human engineering can't figure out what in blazes the thing's built that way for. Usually the miniaturization technique has been carried down to a sub-molecular level, which makes it just a bit difficult for the engineer to trace out the circuits, even if he knew what the circuits were doing.
On the other extreme, the humanic fields are stopped just as completely because the structures they are studying are equally complex, and so huge that a man's-eye-view makes it as difficult to see the shape of the whole system-of-systems involved in a culture, as it is for a man to see the shape of the Earth. The tools for expressing the problem, moreover, are the inherently inadequate tools developed before the existence of the problem was recognized - language that's based on linear logic. Modern languages and thinking-systems work like a chain of links, and are inherently unsuitable for expression of a system-of-systems that works like a rope. Our formal method of discussion denies the use of analogical thinking, and refuses to consider that ten concurrent items, each having a truth-and-relevancy probability of fifty per cent, can constitute high-probability evidence. After all, logically each one of them can be shown to be too untrustworthy for consideration.
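The closing claim is plain arithmetic once the items are treated as independent (an assumption I am adding for the sketch): the chance that all ten fifty-per-cent items are untrustworthy at once is one-half to the tenth power.

```python
# Ten concurrent pieces of evidence, each only 50% likely to be
# sound, all pointing at the same conclusion.  If they fail
# independently, the chance that every one is spurious is tiny.
p_single = 0.5
n_items = 10

p_all_spurious = p_single ** n_items
p_at_least_one_sound = 1 - p_all_spurious

print(f"P(all ten spurious)   = {p_all_spurious:.5f}")
print(f"P(at least one sound) = {p_at_least_one_sound:.5f}")
```

So the very evidence that formal linear logic discards item by item carries, taken as a bundle, better than 99.9% weight.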
The social scientists - and in that group I include psychologists, psychotherapists, sociologists, anthropologists, linguistics specialists, and historians - are struggling with tools inadequate to their task, and struggling with the fact that the new tools can't be invented so long as the Rules of Evidence remain unmodified.
Being a science-fiction editor, I can speculate; anyone interested is invited to speculate along with me, in the full realization that no formally acceptable evidence can be adduced to establish the validity of the speculations. This is reasoning by analogy, which everyone knows is of no value in a truly logical discussion.

I suggest that two populations, each having a normal distribution of characteristic Alpha, with the peak of population A's distribution curve as little as 0.1% off the peak of that characteristic for population B, may, as systems, differ in kind, not merely in degree. Population A, in other words, may, by its interactions, produce a system of type X, while population B, in its interactions, solely because of that 0.1% difference, may produce a different kind of system, type Y.
If this proposition could be validated, it would imply that very minor shifts in the peak of a distribution curve could produce huge differences in the nature of the resultant culture.
The speculation is based on the following analogical reasoning: a human population, in its interactions, is a very complex system of information relay units. The "grapevine" communication system is a tremendously powerful force in shaping the reaction of any population - and grapevine communication involves multiple-parallel information relaying, with an almost indescribably complex system of feedbacks, cross-checkings, shunts, filtering systems, distorting forces, damping forces, and what not. An individual unit in the interacting complex may have a personal bias that causes him to block passage of information of type 1, while strongly amplifying and reinforcing information of type 2. For information of type 1 he acts as a damping filter; for type 2 he's a resonant amplifier.
Due to his interconnections, with type 2 information, he'll excite (transmit to) twenty contacts, and reinforce the input information strongly in transmitting it. Perhaps for type 3, or type 17 information, he's an inverter-amplifier - he actively denies and suppresses any such information. He will spend time and effort seeking out individuals who have the information, and seeking to destroy their belief in its validity. Other individuals may organize to establish blocks in the system seeking to make the entire system non-conductive for information of a specific type. In our current culture, information on sex and various other subjects is actively blocked by organized groups, for example.
All in all, the complex interactions of human individuals in a culture constitute an enormously complex information filtering and relaying system, with both positive and negative feedback at all stages and complex shunts around blocks - altogether, an unanalyzably involved system.
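The relay unit described above can be put in miniature code form. The per-type gains below are illustrative numbers of my own choosing, not figures from the text: a gain below one damps a message, a gain above one amplifies it, and a negative gain stands for the inverter-amplifier who actively argues the information down.

```python
# A toy model of one "grapevine" relay unit: the node passes each
# class of information through a personal gain factor.
def relay(node_gains, info_type, intensity):
    """Return the intensity this node passes on for a given info type.
    Types the node has no bias about are relayed unchanged (gain 1.0)."""
    return node_gains.get(info_type, 1.0) * intensity

biased_node = {
    1: 0.2,    # damping filter for type-1 information
    2: 3.0,    # resonant amplifier for type-2 information
    17: -1.5,  # inverter-amplifier: denies and suppresses type 17
}

for info_type in (1, 2, 17):
    print(info_type, relay(biased_node, info_type, 1.0))
```

A whole grapevine would be thousands of such nodes wired together, each output feeding many inputs; the single-node model is only the smallest piece of that.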
However, some of the general characteristics of such very complex systems have been worked out in a quite different area - in the field of nucleonics!
A standard nuclear reactor represents a complex population of different components, having different characteristics with respect to two critical phenomena: neutrons and fission reactions. Present in a nuclear reactor there will be U-235, U-238, a moderator such as graphite or heavy water, and various impurities, plus control rods, which are simply controllable impurities having neutron-absorbing characteristics.
If a neutron reaches a U-235 nucleus, it normally causes fission; the U-235 nucleus can, for our purposes, be considered a neutron-amplifier, since it gives off 2-plus neutrons for each neutron absorbed. All the other substances present are neutron-absorbers, tending to damp out the neutron-signals released by the U-235 neutron-amplifiers. Some neutrons will be lost by escape through the boundaries of the reactor.
If the net gain due to the U-235 "neutron amplifiers" is exactly equal to the total loss of neutrons to all other components, the intensity of the nuclear reaction will be constant at whatever level it happens to be. The overall situation is, under this condition, that, on the average, the birth rate of neutrons in the system equals exactly the death rate, so that the neutron population is constant. The net neutron reproduction constant is, then, 1.000000. This neutron reproduction rate is referred to as the k-factor of the reactor.
However, if the k-factor is 1.0000001, each succeeding generation of neutrons is slightly more numerous; the neutron population is rising, and the level of activity of the reactor going up. In a reactor, the time per generation of neutrons is exceedingly short; the rate of rise of activity will be decidedly noticeable, even with so minute an excess over 1.000...
On the inverse side, if k = 0.9999999..., the rate of reaction is falling, the system is being damped, and will eventually settle down to zero reaction.
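The generation-by-generation arithmetic behind those two paragraphs is simply geometric growth: after g generations the population is the starting population times k to the g-th power. The k values and generation count below are illustrative, not reactor data.

```python
# Neutron population after g generations at reproduction factor k.
def population(n0, k, generations):
    return n0 * k ** generations

n0 = 1e6  # illustrative starting neutron population

# A one-part-in-ten-million nudge either side of k = 1 sends the
# population toward zero or toward "infinity".
for k in (0.9999999, 1.0, 1.0000001):
    print(f"k = {k}: after 10^8 generations -> "
          f"{population(n0, k, 10**8):.3g}")
```

With k a hair below one the million neutrons dwindle to a few dozen; a hair above one, and they multiply ten-thousand-fold - the same minute difference of degree, opposite differences of kind.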
In such a system then, if k departs from exactly 1.000... by even a minute degree, the system, as a whole, heads either for zero, or infinity. In the atomic bomb, we have a nuclear reactor with a high k factor, and the system heads for an infinite rate of reaction at a spectacularly high rate. Yet the bomb is perfectly safe and stable until triggered, because the system has been so designed that, until triggered, the k factor is held below 1.0000, and the reaction rate is, therefore, practically zero.
Now herein lies the peculiarity of this type of system-reaction; a minute difference in degree - the k-factor - produces, because of the chain interaction system, a difference of kind. If k is less than 1.0000, the reactor does not react; if k exceeds 1.0000, the reactor does react. A tiny difference of degree becomes, in a complexly interacting system of this type, an Aristotelian difference of Yes or No.
In a nuclear reactor, the k-factor is controlled usually by inserting or withdrawing the neutron-absorbing material of the control rods. The reactor system, as a whole, is highly sensitive to very small changes in the amount of neutron-absorbing material present; a little too much neutron-absorption, and the nuclear reaction damps out completely. A little too little... and things get frantic rather suddenly; the safety rods drive home, alarm bells sound off, various automatic damping devices shut down everything, and start yelling for somebody to find out what in blazes went wrong.

But any human culture is a complexly interacting group. There are individuals who will amplify and transmit certain classes of information - and others who damp it out.
Who wants to bet that a very slight shift in the peak of the population's distribution curve can't make the whole system suddenly become highly reactive to a type of idea that, theretofore, it was totally unreactive to?
Just a few less idea-dampers, or a few more idea-amplifiers - and the system may "go critical" with respect to that idea.
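The "go critical" claim can be sketched as a branching process. In this toy model (all numbers mine except the "twenty contacts" figure from the text), each carrier of an idea exposes twenty contacts, and only the amplifier fraction of the population passes the idea on; the idea's reproduction factor is then contacts times that fraction, with the same 1.0 threshold as the reactor.

```python
# Reproduction factor of an idea: average onward transmissions
# per carrier, in a population where only a fraction of people
# act as amplifiers and the rest damp the idea out.
def idea_k(contacts, amplifier_fraction):
    return contacts * amplifier_fraction

contacts = 20  # the "twenty contacts" figure from the text

# A shift of one part in a thousand in the amplifier fraction
# carries the idea across the criticality threshold.
for frac in (0.0495, 0.0505):
    k = idea_k(contacts, frac)
    verdict = "spreads" if k > 1.0 else "dies out"
    print(f"amplifier fraction {frac:.4f}: k = {k:.2f} -> {verdict}")
```

At k = 0.99 the idea damps out of the grapevine; at k = 1.01 it saturates the culture - exactly the reactor's behavior, produced by a shift in the population curve far too small to measure individual by individual.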
Sure - it's just a matter of degree, not of kind, at the level of individual characteristics.
But it's a matter of kind at the level of system response!

October 1957