ety in R; R's moves can force down the variety in the outcomes.
If the varieties are measured logarithmically (as is almost always
convenient), and if the same conditions hold, then the theorem
takes a very simple form. Let VD be the variety of D, VR that of R,
and VO that of the outcome (all measured logarithmically).
Then the previous section has proved that VO cannot be less,
numerically, than the value of VD – VR. Thus VO's minimum is
VD – VR.
If VD is given and fixed, VD – VR can be lessened only by a
corresponding increase in VR. Thus the variety in the outcomes, if
minimal, can be decreased further only by a corresponding increase
in that of R. (A more general statement is given in S.11/9.)
This is the law of Requisite Variety. To put it more picturesquely:
only variety in R can force down the variety due to D; only variety
can destroy variety.
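As a quick illustration (not taken from the book), the logarithmic form of the law can be checked by brute force on a small outcome table. The table below is a hypothetical 9 × 3 one, constructed so that no outcome occurs twice within a column, as S.11/5 requires; every possible strategy for R is then enumerated:

```python
from itertools import product
from math import log2

# A hypothetical outcome table (not one of the book's own tables):
# rows are D's 9 moves, columns are R's 3 moves, entries are the
# outcomes 0..8.  No outcome occurs twice within any column.
n, r = 9, 3
table = [[(i + 3 * j) % 9 for j in range(r)] for i in range(n)]

V_D, V_R = log2(n), log2(r)

# A strategy for R assigns one of its r moves to each of D's moves;
# enumerate all r**n of them and find the smallest outcome variety.
best = min(len({table[i][s[i]] for i in range(n)})
           for s in product(range(r), repeat=n))
V_O_min = log2(best)

assert V_O_min >= V_D - V_R - 1e-9   # VO >= VD - VR
```

For this table the best strategy still leaves three distinct outcomes, so VO's minimum equals VD – VR exactly, as the law allows but does not require.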
This thesis is so fundamental in the general theory of regulation
that I shall give some further illustrations and proofs before
turning to consider its actual application.
(This section can be omitted at first reading.) The law is of
very general applicability, and by no means just a trivial outcome
of the tabular form. To show that this is so, what is essentially the
same theorem will be proved in the case when the variety is spread
out in time and the fluctuation incessant, the case specially
considered by Shannon. (The notation and concepts in this section
are those of Shannon's book.)
Let D, R, and E be three variables, such that each is an
information source, though "source" here is not to imply that they
are acting independently. Without any regard for how they are
related causally, a variety of entropies can be calculated, or
measured empirically. There is H(D,R,E), the entropy of the vector
that has the three as components; there is HD(E), the uncertainty
in E when D's state is known; there is HED(R), the uncertainty in R
when both E and D are known; and so on.
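These quantities are easy to compute from an empirical record of the three variables. A minimal sketch, using an invented set of observations (the values here are illustrative only, not from the book):

```python
from math import log2
from collections import Counter

def H(events):
    """Shannon entropy, in bits, of an empirical list of outcomes."""
    c = Counter(events)
    t = sum(c.values())
    return -sum(v / t * log2(v / t) for v in c.values())

# A hypothetical record of (d, r, e) observations, invented for
# illustration; each distinct triple occurs equally often.
obs = [('d1', 'r1', 'e1'), ('d1', 'r2', 'e2'),
       ('d2', 'r1', 'e2'), ('d2', 'r2', 'e1')] * 25

H_DRE = H(obs)                          # H(D,R,E), the joint entropy
H_D = H([d for d, r, e in obs])
H_DE = H([(d, e) for d, r, e in obs])
H_D_of_E = H_DE - H_D                   # HD(E): E's uncertainty given D
H_ED_of_R = H_DRE - H_DE                # HED(R): R's uncertainty given E and D
```

In this invented record HD(E) comes out to one bit, while HED(R) is zero, since R happens to be determined once E and D are both known.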
The condition introduced in S.11/5 (that no element shall occur
twice in a column) here corresponds to the condition that if R is
fixed, or given, the entropy of E (corresponding to that of the
outcome) is not to be less than that of D, i.e.

HR(E) ≥ HR(D)
Now whatever the causal or other relations between D, R and E,
algebraic necessity requires that their entropies must be related so
that

H(D) + HD(R) = H(R) + HR(D)

for each side of the equation equals H(R,D). Substitute HR(E) for
HR(D), and we get

H(D) + HD(R) ≤ H(R) + HR(E)
≤ H(R,E).
But always, by algebraic necessity,

H(R,E) ≤ H(R) + H(E)

so H(D) + HD(R) ≤ H(R) + H(E)
i.e. H(E) ≥ H(D) + HD(R) – H(R).
Thus the entropy of the E's has a certain minimum. If this
minimum is to be affected by a relation between the D- and
R-sources, it can be made least when HD(R) = 0, i.e. when R is a
determinate function of D. When this is so, then H(E)'s minimum is
H(D) – H(R), a deduction similar to that of the previous section. It
says simply that the minimal value of E's entropy can be forced
down below that of D only by an equal increase in that of R.
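The identity and the resulting bound can be checked numerically. A sketch under invented assumptions: D is uniform on eight values, R is the determinate function d mod 2 of D (so HD(R) = 0), and E is a one-one function of D for each fixed R, which satisfies the condition HR(E) ≥ HR(D):

```python
from math import log2
from collections import Counter

def H(events):
    """Empirical Shannon entropy, in bits, of a list of outcomes."""
    c = Counter(events)
    t = sum(c.values())
    return -sum(v / t * log2(v / t) for v in c.values())

# Invented sources: D uniform on 0..7, R = d mod 2 (determinate),
# and outcome e = (d + r) mod 8, a bijection of d for each fixed r.
triples = [(d, d % 2, (d + d % 2) % 8) for d in range(8)]

H_D = H([d for d, r, e in triples])
H_R = H([r for d, r, e in triples])
H_E = H([e for d, r, e in triples])
H_DR = H([(d, r) for d, r, e in triples])
H_D_of_R = H_DR - H_D    # HD(R): zero here, R being a function of D
H_R_of_D = H_DR - H_R    # HR(D)

# Algebraic identity: both sides equal H(R,D)
assert abs((H_D + H_D_of_R) - (H_R + H_R_of_D)) < 1e-9
# The law in entropy form: H(E) >= H(D) + HD(R) - H(R)
assert H_E >= H_D + H_D_of_R - H_R - 1e-9
```

For these sources H(D) is 3 bits and H(R) is 1 bit, and H(E) attains the minimum of 2 bits exactly.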
The theorems just established can easily be modified to give a
worth-while extension.
Consider the case when, even when R does nothing (i.e.
produces the same move whatever D does), the variety of outcome
is less than that of D. This is the case in Table 11/4/1. Thus if R
gives the reply α to all D's moves, then the outcomes are a, b or d,
a variety of three, less than D's variety of five. To get a manageable
calculation, suppose that within each column each element is now
repeated k times (instead of the "once only" of S.11/5). The same
argument as before, modified in that k rows may provide only one
outcome, leads to the theorem that

VO ≥ VD – log k – VR,

in which the varieties are measured logarithmically.
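This extended bound can also be verified by brute force. A sketch using an invented 12 × 2 table (not the book's Table 11/4/1) in which every outcome within a column is repeated k = 3 times:

```python
from itertools import product
from math import log2

# A hypothetical table: rows are D's 12 moves, columns are R's 2
# moves, and within each column every outcome appears k = 3 times.
n, r, k = 12, 2, 3
table = [[((i // k) + 2 * j) % 4 for j in range(r)] for i in range(n)]

V_D, V_R = log2(n), log2(r)

# Enumerate every strategy for R and find the least outcome variety.
best = min(len({table[i][s[i]] for i in range(n)})
           for s in product(range(r), repeat=n))

# The extended bound: VO >= VD - log k - VR
assert log2(best) >= V_D - log2(k) - V_R - 1e-9
```

Here the best strategy leaves two distinct outcomes, so the bound log 12 – log 3 – log 2 = log 2 is attained exactly.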