Page 86 - Coincidences in the Bible and in Biblical Hebrew
CHAPTER 3
“Randomness” and “Cold”
3.1 Introduction
Entropy is one of the most fundamental concepts in modern physics, in cosmol-
ogy theories, information theory, statistics, and, more generally, in the way we
perceive our world. As popularly perceived, entropy is a measurable quantity
which conveys the amount of disorder that a physical system has. A related law in
physics, the Second Law of Thermodynamics, states, in its simplest form, that as a
result of processes that cause a complex system to emerge from a state of equilib-
rium, the total entropy either increases or remains the same, but never decreases.
In modern cosmological theories, the universe as a whole is seen as a system that
had minimal entropy at the moment of creation (the big bang), and the total
entropy of the universe, according to the Second Law of Thermodynamics, has
been increasing ever since, irrespective of the theorized future history of the uni-
verse (Greene 2004, 171–76, and Penrose 2004, 728).
The notion of entropy was first introduced in 1865 by Clausius, who also
stated the Second Law of Thermodynamics simply by saying that “It is not
possible for heat to flow from a colder body to a warmer body without any work
having been done to accomplish this flow.” However, it was the Austrian Ludwig
Boltzmann who, in 1877, gave entropy a more rigorous mathematical treatment
and definition, which has since affected many branches of science and technology.
Both Clausius and Boltzmann referred to physical systems when they developed
their notions of entropy.
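Boltzmann’s mathematical definition, not quoted in the text, relates the entropy S of a macrostate to the number W of microstates consistent with it: S = k ln W, where k is the Boltzmann constant. A minimal Python sketch of this relation (the function name and sample microstate counts are illustrative only):

```python
import math

# Boltzmann constant in joules per kelvin (exact value under the 2019 SI definition)
K_B = 1.380649e-23

def boltzmann_entropy(microstates):
    """Entropy S = k_B * ln(W) for a macrostate with W microstates."""
    return K_B * math.log(microstates)

# Doubling the number of accessible microstates adds exactly k_B * ln(2):
delta = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
```

Because entropy depends on the logarithm of W, it is additive: combining two independent systems multiplies the microstate counts but adds the entropies.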
In 1948, Claude Elwood Shannon introduced his concept of entropy from
a totally innovative perspective. In a landmark paper (Shannon 1948), which
was published a year later in a book (Shannon and Weaver 1949, and also
1963), Shannon expounded his mathematical theory of communication. Unlike
Boltzmann’s entropy, which relates to complex physical systems, Shannon’s entropy