times in the water (W), sometimes under pebbles (P), and sometimes on the bank (B). Suppose that, over each unit interval of time, there is a constant probability that, being under a pebble, it will go up on the bank; and similarly for the other possible transitions. (We can assume, if we please, that its actual behaviour at any instant is determined by minor details and events in its environment.) Thus a protocol of its positions might read:
WBWBWPWBWBWBWPWBBWBWPWBWPW
BWBWBBWBWBWBWPPWPWBWBBBW
The stochastic transformation is simply an extension of the determinate (or single-valued). Thus, suppose the matrix of transition probabilities of a three-state system were:

First   ↓   A    B    C     and then   ↓   A   B   C
        A   0   0.9  0.1               A   0   1   0
        B  0.9   0    0                B   1   0   0
        C  0.1  0.1  0.9               C   0   0   1

The change, from the first matrix to the second, though small (and it could be made as small as we please), has taken the system from the obviously stochastic type to that with the single-valued transformation:

↓   A   B   C
    B   A   C

Suppose, for definiteness, that the insect's transition probabilities are

↓    B    W    P
B   1/4  3/4  1/8
W   3/4   0   3/4
P    0   1/4  1/8
These probabilities would be found (S.9/2) by observing its behaviour over long stretches of time, by finding the frequency of, say, B → W, and then finding the relative frequencies, which are
the probabilities. Such a table would be, in essence, a summary of actual past behaviour, extracted from the protocol.
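The frequency-counting procedure just described can be sketched in code. The protocol string is the one given above; the program itself is an illustrative sketch, not part of the original text:

```python
from collections import Counter

# Protocol of the insect's successive positions (from the text).
protocol = ("WBWBWPWBWBWBWPWBBWBWPWBWPW"
            "BWBWBBWBWBWBWPPWPWBWBBBW")

# Count each observed transition as a pair (from_state, to_state).
pairs = Counter(zip(protocol, protocol[1:]))

# Relative frequencies: divide the count of s -> t by the total
# number of departures from s, giving one column of the matrix per s.
departures = Counter(protocol[:-1])
matrix = {s: {t: pairs[(s, t)] / departures[s] for t in "BWP"}
          for s in "BWP"}

for s in "BWP":
    print(s, {t: round(matrix[s][t], 2) for t in "BWP"})
```

Because the protocol is only 50 states long, the estimates are rough; over longer and longer protocols they would settle toward the tabled values.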
Such a sequence of states, in which, over various long stretches, the probability of each transition is the same, is known as a Markov chain, from the name of the mathematician who first made an extensive study of their properties. (Only during the last decade or so has their great importance been recognised. The mathematical books give various types of Markov chain and add various qualifications. The type defined above will give us all we want and will not clash with the other definitions, but an important qualification is mentioned in S.9/7.)
The term “Markov chain” is sometimes applied to a particular trajectory produced by a system (e.g. the trajectory given in Ex. 1) and sometimes to the system (defined by its matrix) which is capable of producing many trajectories. Reference to the context must show which is implied.
Ex. 1: A system of two states gave the protocol (of 50 transitions):
ABABBBABAABABABABBBBABAABABBAAB
A B B A B A A A B A B B A A B B A B B A.
Draw up an estimate of its matrix of transition probabilities.
Ex. 2: Use the method of S.9/2 (with the coin) to construct several trajectories, so as to establish that one matrix can give rise to many different trajectories.
Ex. 3: Use a table of random numbers to generate a Markov chain on two states A and B by the rule:

If present state is   and random number is   then next state is
A                     0 or 1                 A
A                     2, 3, … 9              B
B                     0, 1, 2, 3, 4          A
B                     5, 6, … 9              B
Ex. 4: (Continued.) What is its matrix of transition probabilities?
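The rule of Ex. 3 can be carried out mechanically. The sketch below (not from the original text) uses Python's pseudo-random digits in place of a printed table of random numbers, and then estimates the matrix asked for in Ex. 4 by counting relative frequencies, exactly as in S.9/2:

```python
import random
from collections import Counter

random.seed(0)  # any seed; a printed table of random digits works the same

def next_state(state, digit):
    # The rule of Ex. 3: the random digit decides the transition.
    if state == "A":
        return "A" if digit <= 1 else "B"
    return "A" if digit <= 4 else "B"

# Generate a long trajectory so the frequencies settle down.
state, trajectory = "A", ["A"]
for _ in range(10_000):
    state = next_state(state, random.randrange(10))
    trajectory.append(state)

# Relative frequencies estimate the matrix: the rule uses 2 of 10
# digits for A -> A and 5 of 10 for B -> A, so the estimates should
# settle near 0.2 and 0.5 respectively.
pairs = Counter(zip(trajectory, trajectory[1:]))
for s in "AB":
    total = pairs[(s, "A")] + pairs[(s, "B")]
    print(s, "-> A:", round(pairs[(s, "A")] / total, 2),
          "  -> B:", round(pairs[(s, "B")] / total, 2))
```

This also illustrates Ex. 2: re-running with a different seed gives a different trajectory from the same matrix.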
He can deduce that if it is in water it will not stay there, for W → W has probability zero, but will go usually to the bank, for W → B has the highest probability in the column. From the bank it will probably go to the water, and then back to the bank. If under a pebble it also tends to go to the water. So clearly it spends much of its time oscillating between bank and water. Time spent under the pebbles will be small. The protocol given, which was constructed with a table of random numbers, shows these properties. Thus the matrix contains information about any particular system's probable behaviour.
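These deductions can be checked numerically. The sketch below (an addition, not a computation performed in the text) iterates the insect's matrix on an arbitrary starting distribution until the long-run proportions of time in each state settle; columns are the "from" state and rows the "to" state, as in the table above:

```python
# Insect's matrix: M[(to, frm)] is the probability of frm -> to.
states = ["B", "W", "P"]
M = {("B", "B"): 1/4, ("B", "W"): 3/4, ("B", "P"): 1/8,
     ("W", "B"): 3/4, ("W", "W"): 0.0, ("W", "P"): 3/4,
     ("P", "B"): 0.0, ("P", "W"): 1/4, ("P", "P"): 1/8}

# Start from any distribution and apply the matrix repeatedly
# (a simple power iteration); the result is the long-run occupancy.
p = {"B": 1/3, "W": 1/3, "P": 1/3}
for _ in range(200):
    p = {to: sum(M[(to, frm)] * p[frm] for frm in states) for to in states}

print({s: round(p[s], 3) for s in states})
```

The proportions come out near B: 0.45, W: 0.43, P: 0.12, confirming that the insect mostly oscillates between bank and water, with little time under the pebbles.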
Ex. 1: Had the P-column of the matrix a 1 in the lowest cell and zero elsewhere, what could be deduced about the insect's mode of life?
Ex. 2: A fly wanders round a room between positions A, B, C, and D, with transition probabilities:

↓    A    B    C    D
A   1/2   0    0   1/3
B   1/4   1    0   1/3
C   1/4   0   1/2  1/3
D    0    0   1/2   0
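The fly's matrix can be checked in the same way as the earlier ones: every column must sum to 1, and reading column B shows that the fly, once at B, never leaves it. A minimal sketch (the matrix is from the exercise; the code is an illustration added here):

```python
from fractions import Fraction as F

states = ["A", "B", "C", "D"]
# Rows are the "to" state, columns the "from" state, as in the text.
M = [[F(1, 2), 0, 0,       F(1, 3)],   # -> A
     [F(1, 4), 1, 0,       F(1, 3)],   # -> B
     [F(1, 4), 0, F(1, 2), F(1, 3)],   # -> C
     [0,       0, F(1, 2), 0      ]]   # -> D

# Each column must sum to 1 for a valid matrix of transition probabilities.
for j, frm in enumerate(states):
    assert sum(M[i][j] for i in range(4)) == 1, frm

# From B the fly goes to B with probability 1: B, once reached, is never left.
assert M[1][1] == 1
print("columns sum to 1; B is never left once reached")
```

Using exact fractions rather than floating point makes the column-sum check exact.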