…mit the maximal quantity of information (in given time).



Shannon has devised a measure for the quantity of variety shown by a Markov chain at each step— the entropy— that has proved of fundamental importance in many questions relating to incessant transmission. This measure is developed in the following way.

If a set has variety, and we take a sample of one item from the set, by some defined sampling process, then the various possible results of the drawing will be associated with various, corresponding probabilities. Thus if the traffic lights have variety four, showing the combinations

Red

Red and Yellow

Green

Yellow,

and if they are on for durations of 25, 5, 25 and 5 seconds respectively, then if a motorist turns up suddenly at irregular times he would find the lights in the various states with frequencies of about 42, 8, 42 and 8% respectively. As probabilities these become 0.42, 0.08, 0.42 and 0.08. Thus the state “Green” has (if this particular method of sampling be used) a probability of 0.42; and similarly for the others.
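These figures follow directly from the durations: one full cycle of the lights lasts 25 + 5 + 25 + 5 = 60 seconds, so Red, for example, is showing for 25/60 of the time, about 42%, and Red and Yellow for 5/60, about 8%.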

Conversely, any set of probabilities— any set of positive fractions that adds up to 1— can be regarded as corresponding to some set whose members show variety. Shannon’s calculation proceeds from the probabilities by the calculation, if the probabilities are p1, p2, …, pn, of

            – p1 log p1 – p2 log p2 – … – pn log pn,

a quantity which he calls the entropy of the set of probabilities and which he denotes by H. Thus if we take logs to the base 10, the entropy of the set associated with the traffic lights is

            – 0.42 log₁₀ 0.42 – 0.08 log₁₀ 0.08 – 0.42 log₁₀ 0.42 – 0.08 log₁₀ 0.08

which equals 0.492. (Notice that log₁₀ 0.42 = 1̄.6232 = – 1.0000 + 0.6232 = – 0.3768; so the first term is (– 0.42)(– 0.3768), which is + 0.158; and similarly for the other terms.) Had the logs been taken to the base 2 (S.7/7) the result would have been 1.63 bits.
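The same arithmetic is easily mechanised. As a check, the short sketch below, written in Python with the function name entropy chosen merely for illustration, computes H for the traffic-light probabilities with logs to the base 10 and to the base 2.

    import math

    def entropy(probabilities, base=2):
        # Shannon's H = - sum of p log p, skipping zero probabilities,
        # whose terms tend to 0 (cf. Ex. 6).
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    lights = [0.42, 0.08, 0.42, 0.08]
    print(entropy(lights, base=10))   # about 0.492
    print(entropy(lights, base=2))    # about 1.63 bits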

The word “entropy” will be used in this book solely as it is used by Shannon, any broader concept being referred to as “variety” or in some other way.

Ex. 1: On 80 occasions when I arrived at a certain level-crossing it was closed on … . What is the entropy of the set of probabilities?

Ex. 2: From a shuffled pack of cards one is drawn. Three events are distinguished:
    E1: the drawing of the King of Clubs,
    E2: the drawing of any Spade,
    E3: the drawing of any other card.
What is the entropy of the variety of the distinguishable events?

Ex. 3: What is the entropy of the variety in one throw of an unbiased die?

Ex. 4: What is the entropy in the variety of the set of possibilities of the outcomes (with their order preserved) of two successive throws of an unbiased die?

Ex. 5: (Continued.) What is the entropy of n successive throws?

*Ex. 6: What is the limit of – p log p as p tends to zero?

The entropy so calculated has several important properties. First, it is maximal, for a given number (n) of probabilities, when the probabilities are all equal. H is then equal to log n, precisely the measure of variety defined in S.7/7. (Equality of the probabilities, in each column, was noticed in S.9/10 to be necessary for the constraint to be minimal, i.e. for the variety to be maximal.) Secondly, different H’s derived from different sets can, with suitable qualifications, be combined to yield an average entropy.
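For instance, with four equally probable states each p is 1/4, and

            H = – 4 × (1/4) log (1/4) = log 4,

which is 2 bits with logs to the base 2 (about 0.602 with logs to the base 10), larger than the 1.63 bits (0.492) found above for the unequal traffic-light probabilities.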

Such a combination is used to find the entropy appropriate to a Markov chain. Each column (or row if written in the transposed form) has a set of probabilities that sum to 1. Each can therefore provide an entropy. Shannon defines the entropy (of one step of the chain) as the average of these entropies, each being weighted by the proportion in which that state, corresponding to the column, occurs when the sequence has settled to its equilibrium (S.9/…). Thus the transition probabilities of that section, with corresponding entropies and equilibrial proportions shown below, are

             ↓       B        W        P
             B      1/4      3/4      1/8
             W      3/4       0       3/4
             P       0       1/4      1/8
    Entropy:       0.811    0.811    1.061
    Equilibrial
    proportion:    0.449    0.429    0.122
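The weighted average can be checked with a short computation. The sketch below, again in Python (the names M and entropy are chosen only for the illustration), recomputes the column entropies in bits, finds the equilibrial proportions by letting the chain settle, and forms the weighted average, which comes to about 0.84 bits per step.

    import math

    # Transition probabilities in the layout above: columns give the present
    # state (B, W, P) and rows the next state, so each column sums to 1.
    M = [[1/4, 3/4, 1/8],
         [3/4,  0,  3/4],
         [ 0,  1/4, 1/8]]

    def entropy(probabilities, base=2):
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    columns = list(zip(*M))
    column_entropies = [entropy(c) for c in columns]   # ~0.811, 0.811, 1.061

    # Equilibrial proportions: start anywhere and iterate until the chain settles.
    v = [1/3, 1/3, 1/3]
    for _ in range(1000):
        v = [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]
    # v is now roughly (0.449, 0.429, 0.122)

    H_step = sum(prop * h for prop, h in zip(v, column_entropies))
    print(round(H_step, 2))                            # about 0.84 bits per step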
