To understand how this concept is associated with randomness, suppose that
we have seven boxes, and one of these boxes contains a fortune. We wish to select
one of the boxes in accordance with the available information. Let us distinguish
three different scenarios.
Scenario A: We know that the fortune is in Box 4.
Translated into probabilities, this means that the probability that the fortune is
in Box 4 is 1 (or 100%), while all the other probabilities are zero. The Shannon
entropy associated with this decision, or with the distribution that led us to this
decision, is given by
H = –(1)·log₂(1) = 0
By convention, the boxes with zero probability contribute nothing to the sum. The decision is taken under conditions of complete certainty; therefore, the entropy is zero.
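A minimal Python sketch, not taken from the text (the helper name shannon_entropy is ours), makes the computation concrete; it uses the same convention that zero-probability terms contribute nothing:

# Sketch only: Shannon entropy in bits, with the convention that
# boxes of zero probability contribute nothing to the sum.
import math

def shannon_entropy(probs):
    """Return H = sum of p * log2(1/p) over the nonzero probabilities."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Scenario A: the fortune is certainly in Box 4.
scenario_a = [0, 0, 0, 1, 0, 0, 0]
print(shannon_entropy(scenario_a))  # prints 0.0 -- complete certainty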
Scenario B: We do not know for certain where the fortune is hidden—however, we have some information that allows us to assign probabilities to the various boxes.
Suppose that these probabilities are (they have to sum up to 1):
p₁ = 0.05; p₂ = 0.1; p₃ = 0.4; p₄ = 0.05; p₅ = 0.2; p₆ = 0.15; p₇ = 0.05
The Shannon entropy is
H = –[0.05·log₂(0.05) + 0.1·log₂(0.1) + 0.4·log₂(0.4) + 0.05·log₂(0.05) + 0.2·log₂(0.2) + 0.15·log₂(0.15) + 0.05·log₂(0.05)] = 2.384 bits
The decision as to which box contains the fortune must be taken under conditions of uncertainty; some randomness enters our decision. The amount of this randomness is expressed by the entropy that characterizes the probability distribution we face under this scenario, and it amounts to 2.384 bits.
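To check the arithmetic, the same kind of sketch (again, not from the text) applied to the Scenario B probabilities reproduces this value:

import math

# Scenario B: the seven probabilities listed above (they sum to 1).
probs_b = [0.05, 0.1, 0.4, 0.05, 0.2, 0.15, 0.05]
H = sum(p * math.log2(1 / p) for p in probs_b)
print(round(H, 3))  # prints 2.384 -- the entropy, in bits, behind this decision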
The decision in Scenario B will be taken knowing that the entropy is not zero (neither is it maximal, as we will learn from Scenario C).