language such as English. Thus: what is the probability that an s will be followed by a t? It depends much on what preceded the s; thus es followed by t is common, but ds followed by t is rare. Were the letters a Markov chain, then s would be followed by t with the same frequency in the two cases.
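As a concrete, purely illustrative check of this dependence, one can count in any sample of text how often t follows the two-letter contexts es and ds. The Python sketch below (not part of the original text) does this for an arbitrary made-up sample string; the sample text and the choice of contexts are assumptions introduced only for illustration.

    # Sketch: estimate P(next letter is t | preceding two letters) from a sample
    # text, comparing the contexts "es" and "ds". The sample string is invented
    # purely for illustration.
    from collections import Counter

    sample = ("the best estimate rests on tested words "
              "these methods and words and birds and friends tend to test rested nests")

    text = "".join(ch for ch in sample.lower() if ch.isalpha() or ch == " ")

    pair_counts = Counter()    # occurrences of each two-letter context
    t_counts = Counter()       # occurrences of the context followed by 't'

    for a, b, c in zip(text, text[1:], text[2:]):
        pair_counts[a + b] += 1
        if c == "t":
            t_counts[a + b] += 1

    for context in ("es", "ds"):
        n, k = pair_counts[context], t_counts[context]
        if n:
            print(f"P(t | {context}) is roughly {k}/{n} = {k / n:.2f}")
        else:
            print(f"the context {context!r} does not occur in the sample")

In this sample the estimate of P(t | es) comes out far larger than that of P(t | ds), which is the asymmetry the argument relies on.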


Re-coding to Markov form. When a system is found to produce trajectories in which the transition probabilities depend in a constant way on what states preceded each operand, the system, though not Markovian, can be made so by a method that is more important than it may at first seem: one re-defines the system.

Thus suppose that the system is like that of Ex. 9/7/1 (the preceding), and suppose that the transitions are such that after the two-state sequence … CC it always goes to D, regardless of what occurred earlier, that after … DC it always goes to C, that after … CD it goes equally frequently in the long run to C and D, and similarly after … DD. We now simply define new states that are vectors, having two components: the earlier state as first component and the later one as second. Thus if the original system has just produced a trajectory ending … DC, we say that the new system is at the state (D, C). If the original then moves on to state C, so that its trajectory is now … DCC, we say that the new system has gone on to the state (C, C). So the new system has undergone the transition (D, C) → (C, C). These new states do form a Markov chain, for their probabilities (as assumed here) do not depend on the earlier states; in fact the matrix is

      ↓      (C,C)   (C,D)   (D,C)   (D,D)
    (C,C)      0       0       1       0
    (C,D)      1       0       0       0
    (D,C)      0      1/2      0      1/2
    (D,D)      0      1/2      0      1/2

(Notice that the transition (C,D) → (C,D) is impossible; for any state that ends (–,D) can only go to one that starts (D,–). Some other transitions are similarly impossible in the new system.)
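The re-coding can also be carried out mechanically. The Python sketch below is a minimal illustration, not part of the original text: it simulates a long trajectory of the C/D system under the transition rules assumed above, re-codes the trajectory into the vector states, and estimates the transition probabilities between the new states. The estimated table should reproduce the matrix just given, though printed with the present state on the rows and the next state on the columns (the transpose of the arrangement above).

    # Sketch: re-code a second-order C/D process into a first-order Markov chain
    # over pair-states, then check that the estimated transition probabilities
    # match the matrix in the text. The rules below are those assumed in the text.
    import random
    from collections import Counter, defaultdict

    random.seed(0)

    # Probability of going to C next, given the last two states (earlier, later).
    RULES = {
        ("C", "C"): 0.0,   # after ...CC the system always goes to D
        ("D", "C"): 1.0,   # after ...DC it always goes to C
        ("C", "D"): 0.5,   # after ...CD it goes equally often to C and D
        ("D", "D"): 0.5,   # and similarly after ...DD
    }

    # Simulate a long trajectory of the original (non-Markovian) system.
    trajectory = ["C", "D"]
    for _ in range(100_000):
        pair = (trajectory[-2], trajectory[-1])
        trajectory.append("C" if random.random() < RULES[pair] else "D")

    # Re-code: each new state is the vector of two consecutive original states.
    pairs = list(zip(trajectory, trajectory[1:]))

    # Estimate transition probabilities between the new pair-states.
    counts = defaultdict(Counter)
    for here, there in zip(pairs, pairs[1:]):
        counts[here][there] += 1

    states = [("C", "C"), ("C", "D"), ("D", "C"), ("D", "D")]
    print("from \\ to   " + "  ".join(f"({a},{b})" for a, b in states))
    for s in states:
        total = sum(counts[s].values())
        row = "  ".join(f"{counts[s][t] / total:5.2f}" for t in states)
        print(f"({s[0]},{s[1]})      {row}")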


If, in another system, the transition probabilities depend on values occurring n steps back, then the new states must be defined as vectors over n consecutive states.
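A small sketch of this generalisation, again only illustrative (the function name recode is introduced here, not taken from the text): each new state is simply the n-tuple of the n most recent original states.

    # Sketch: re-code a sequence so that each new state is the vector (n-tuple)
    # of n consecutive original states. With n chosen to cover however far back
    # the transition probabilities reach, the re-coded sequence is Markovian.
    from typing import List, Sequence, Tuple

    def recode(sequence: Sequence[str], n: int) -> List[Tuple[str, ...]]:
        """Return the trajectory of vector states of length n."""
        return [tuple(sequence[i:i + n]) for i in range(len(sequence) - n + 1)]

    # Example: a made-up trajectory ending ...DCC, re-coded with n = 2,
    # ends with the transition (D, C) -> (C, C), as in the text.
    print(recode(list("DDCDCC"), 2))
    # [('D','D'), ('D','C'), ('C','D'), ('D','C'), ('C','C')]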

The method of re-defining may seem artificial and pointless. Actually it is of fundamental importance, for it moves our attention from a system that is not state-determined to one that is. The new system is better predictable, for its “state” takes account of the original system’s past history. Thus, with the original form, to know that the system was at state C did not allow one to say more than that it might go to either C or D. With the second form, to know that it was at the state (D,C) enabled one to predict its behaviour with certainty, just as with the original form one could predict with certainty when one knew what had happened earlier. What is important is that the method shows that the two methods of “knowing” a system, by its present state or by its past history, have an exact relation. The theory of the system that is not completely observable (S.6/21) made use of this fact in essentially the same way. We are thus led again to the conclusion that the existence of “memory” in a real system is not an intrinsic property of the system; we hypothesise its existence when our powers of observation are limited. Thus, to say “that system seems to me to have memory” is equivalent to saying “my powers of observation do not permit me to make a valid prediction on the basis of one observation, but I can make a valid prediction after a sequence of observations”.

Sequence as vector. In the earlier chapters we have often used vectors, and so far they have always had a finite and definite number of components. It is possible, however, for a vector to have an infinite, or indefinitely large, number of components. Provided one is cautious, the complication need cause little danger.

