


Then the average entropy (per step in the sequence) is

0.449 × 0.811 + 0.429 × 0.811 + 0.122 × 1.061 = 0.842 bits.

A coin spun repeatedly produces a series with entropy, at each spin, of 1 bit. So the series of locations taken by one of the insects as time goes on is not quite so variable as the series produced by a spun coin, for 0.842 is less than 1.00. In this way Shannon’s measure enables different degrees of variety to be compared.

The reason for taking a weighted average is that we start by finding three entropies: 0.811, 0.811, and 1.061; and from them we want one. Were they all the same we would obviously just use that value, but they are not. We can, however, argue thus: When the system has reached equilibrium, 45% of the insects will be at state B, 43% at W, and 12% at P. This is equivalent, as the insects circulate between all the states, to saying that each insect spends 45% of its time at B, 43% at W, and 12% at P. In other words, 45% of its transitions will be from B, 43% from W, and 12% from P. Thus 45% of its transitions will be with entropy, or variety, of 0.811, 43% also with 0.811, and 12% with 1.061. Thus, transitions with an entropy of 0.811 will be frequent (and the value “0.811” should count heavily) and those with an entropy of 1.061 will be rather rare (and the value “1.061” should count little). So the average is weighted: 88% in favour of 0.811 and 12% in favour of 1.061, i.e.

                       45 × 0.811 + 43 × 0.811 + 12 × 1.061
    weighted average = -------------------------------------
                                  45 + 43 + 12

which is, effectively, what was used above.
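The whole calculation can be checked with a short sketch in Python (a minimal illustration, assuming the insect example’s transition matrix from earlier in the chapter; the names T, entropy and eq are ours, introduced only for this sketch):

    from math import log2

    # Transition matrix of the insect example: entry T[i][j] is the
    # probability of passing to row-state i from column-state j, with
    # the states in the order B, W, P; each column sums to 1.
    T = [[1/4, 3/4, 1/8],
         [3/4, 0,   3/4],
         [0,   1/4, 1/8]]

    def entropy(ps):
        # Shannon's measure, in bits, over a set of probabilities.
        return sum(-p * log2(p) for p in ps if p > 0)

    # Entropy of each column: 0.811, 0.811 and 1.061 bits.
    col_H = [entropy([T[i][j] for i in range(3)]) for j in range(3)]

    # Equilibrial proportions found earlier (S.9/6): 0.449, 0.429, 0.122.
    eq = [22/49, 21/49, 6/49]

    # The weighted average, as in the displayed formula above.
    print(round(sum(p * h for p, h in zip(eq, col_H)), 3))  # 0.842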

Ex. 1: Show that the series of H’s and T’s produced by a spun coin has an average entropy of 1 bit per spin. (Hint: Construct the matrix of transition probabilities.)

Ex. 2: (Continued.) What happens to the entropy if the coin is biased? (Hint: Try the effect of changing the probabilities.)
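For Ex. 2, a quick numerical check (a sketch only: for a coin, each column of the transition matrix is (p, 1 − p), so the entropy per spin is the same from either state):

    from math import log2

    def H(p):
        # Entropy, in bits, of a coin showing heads with probability p.
        return sum(-q * log2(q) for q in (p, 1 - p) if q > 0)

    for p in (0.5, 0.6, 0.75, 0.9, 1.0):
        print(p, round(H(p), 3))
    # 0.5 gives 1.0 bit; any bias lowers the entropy, reaching 0 at p = 1.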

Before developing the subject further, it is as well to notice that Shannon’s measure, and the various important theorems that use it, make certain assumptions. These are commonly fulfilled in telephone engineering but are by no means so commonly fulfilled in biological work, and in the topics discussed in this book. His measure and theorems must therefore be applied cautiously. His main assumptions are as follows.

If applied to a set of probabilities, the various fractions must add up to 1; the entropy cannot be calculated over an incomplete set of possibilities.

If applied to an information source, with several sets of probabilities, the matrix of transition probabilities must be Markovian; that is to say, the probability of each transition must depend only on the state the system is at (the operand) and not on the states it was at earlier (S.9/7). If necessary, the states of the source should first be re-defined, as in S.9/8, so that it becomes Markovian.
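A minimal sketch of one common way of carrying out such a re-definition (an assumption on our part, not a quotation of S.9/8: the device of taking each pair of successive states as a single new state, so that dependence on one step of earlier history becomes dependence on the present, enlarged, state alone):

    def pair_states(seq):
        # Re-define states: each new state is the pair (previous, present).
        return list(zip(seq, seq[1:]))

    print(pair_states(list("ABABBA")))
    # [('A','B'), ('B','A'), ('A','B'), ('B','B'), ('B','A')]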

The several entropies of the several columns are averaged (S.9/12) using the proportions of the terminal equilibrium (S.9/6). It follows that the theorems assume that the system, however it was started, has been allowed to go on for a long time so that the states have reached their equilibrial densities.
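The assumption can be seen at work in a small sketch (again using the insect example’s matrix, an assumption carried over from above): however the distribution is started, repeated transition drives it to the same equilibrial densities.

    # Start with every insect at B and apply the transition repeatedly.
    T = [[1/4, 3/4, 1/8],
         [3/4, 0,   3/4],
         [0,   1/4, 1/8]]

    d = [1.0, 0.0, 0.0]
    for _ in range(50):
        d = [sum(T[i][j] * d[j] for j in range(3)) for i in range(3)]

    print([round(x, 3) for x in d])  # [0.449, 0.429, 0.122]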

Shannon’s results must therefore be applied to biological material only after a detailed check on their applicability has been made.

A similar warning may be given before any attempt is made to play loosely, and on a merely verbal level, with the two entropies of Shannon and of statistical mechanics. Arguments in these subjects need great care, for a very slight change in the conditions or assumptions may make a statement change from rigorously true to ridiculously false. Moving in these regions is like moving in a jungle full of pitfalls. Those who know most about the subject are usually the most cautious in speaking about it.

Ex. 1: Work out mentally the entropy of the matrix with transition probabilities

              A      B      C
        A    0.2     0     0.3
        B    0.7    1.0    0.3
        C    0.1     0     0.4

(Hint: This is not a feat of calculation but of finding a peculiar simplicity. What does that 1 in the main diagonal mean (Ex. 9/5/1)? So what is the final equilibrium of the system? Do the entropies of columns A and C matter? And what is the entropy of B’s column (Ex. 9/11/6)?)

Ex. 2: (Continued.) Explain the paradox: “When the system is at A there is variety or uncertainty in the next state, so the entropy cannot be zero.”
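A sketch confirming the peculiar simplicity (illustrative code, with the matrix read so that each column gives one state’s transition probabilities): the 1.0 makes B absorbing, so the terminal equilibrium is wholly at B, and the weighted average takes its value entirely from B’s column.

    from math import log2

    # Columns give transition probabilities from A, B and C respectively.
    T = [[0.2, 0,   0.3],
         [0.7, 1.0, 0.3],
         [0.1, 0,   0.4]]

    d = [1/3, 1/3, 1/3]              # any starting distribution will do
    for _ in range(100):
        d = [sum(T[i][j] * d[j] for j in range(3)) for i in range(3)]
    print([round(x, 3) for x in d])  # [0.0, 1.0, 0.0]: all at B

    # B's column has probabilities (0, 1, 0), whose entropy is zero, so
    # the average entropy of the whole matrix is 0 bits; the variety at
    # A and C gets no weight, which resolves the paradox of Ex. 2.
    print(sum(-p * log2(p) for p in [T[i][1] for i in range(3)] if p > 0))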



A little confusion has sometimes arisen because Shannon’s measure of “entropy”, given over a set of probabilities p1, p2, …

