
relates to the statistical content of information. As we will later discuss, although the derivations of the two concepts of entropy were based on altogether different concepts and arguments, later developments (with some heated debates) have shown that the two concepts of entropy are in fact equivalent.
It is not our intention here to deliver a full account of the theory underlying entropy and how it is integrated in various scientific disciplines. Rather, we wish to demonstrate two realizations of this concept in seemingly unrelated areas, statistics (or information theory) and thermodynamics, with Shannon's and Boltzmann's concepts of entropy, respectively. More explicitly, it is our intention to demonstrate how the same scientific construct, entropy, is related to randomness and to the Second Law of Thermodynamics, as the latter is realized in chemical or physical processes that involve the transfer of heat.
By coincidence, both concepts of entropy seem to be associated with Hebrew words of a shared root.

          3.2  Two Concepts of Entropy

          3.2.1   Information (Shannon Entropy)



We start by explaining how Shannon entropy is associated with the concept of randomness. Suppose that we have a random phenomenon, which can be realized in N different nonoverlapping ways. For example, the outcome of throwing a die is one of six results (N = 6), conveniently displayed by the set {1, 2, 3, 4, 5, 6}. We call the quantitative value attached to each possible result of a random phenomenon a random variable (r.v.). Thus, the result shown by a thrown die may be defined as an r.v.

Suppose that each possible value of an r.v. has associated with it a certain probability, and let us denote the probability of result i by p_i. The collection of values {p_i} (i = 1, 2, …, N) defines the distribution of the r.v. According to Shannon's definition, entropy is the expected information content of the distribution, and it is defined by
    H = -\sum_{i=1}^{N} p_i \log_2(p_i)

          (The sign ∑ means summing up over all possible values of i.)
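For instance, for the fair die introduced above, each result has p_i = 1/6, and the formula gives

    H = -\sum_{i=1}^{6} \tfrac{1}{6} \log_2\!\left(\tfrac{1}{6}\right) = \log_2 6 \approx 2.585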

This entropy is measured in bits. However, if the natural logarithm is used (the logarithm to the base e = 2.7182 …, rather than to the base 2), then H is measured in nats.
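For readers who wish to experiment, a minimal Python sketch of this computation might look as follows (the helper name shannon_entropy and the fair-die list are illustrative choices, not part of the text):

    import math

    def shannon_entropy(probabilities, base=2):
        """Shannon entropy H = -sum(p_i * log(p_i)); zero-probability outcomes are skipped."""
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    # Fair die: six equally likely results, each with probability 1/6
    fair_die = [1 / 6] * 6

    print(shannon_entropy(fair_die))          # about 2.585 bits (= log2 of 6)
    print(shannon_entropy(fair_die, math.e))  # about 1.792 nats (= ln 6)

Replacing the fair-die distribution with an uneven one (a loaded die, say) yields a smaller H, reflecting the fact that the uniform distribution is the "most random" one and therefore carries the maximum entropy for a given N.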