Page 103 - ISCI’2017

indefinitely not due to the growth of the channel output entropy (which, on the contrary, decreases), but due to the fact that the noise entropy (the subtrahend in (19)) tends to minus infinity:

$$\lim_{N \to 0} H(\xi) = -\infty . \qquad (21)$$
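The limit in (21) follows from the closed form of the differential entropy of Gaussian noise, $H(\xi) = \tfrac{1}{2}\ln(2e\pi N)$ nats, which is negative exactly when $N < (2e\pi)^{-1}$ and diverges to $-\infty$ as $N \to 0$. A minimal numerical sketch (function name illustrative):

```python
import math

def gaussian_diff_entropy(N):
    """Differential entropy (in nats) of zero-mean Gaussian noise with variance N."""
    return 0.5 * math.log(2 * math.pi * math.e * N)

# Entropy changes sign at N = (2*pi*e)^-1 and decreases without bound as N -> 0.
threshold = 1 / (2 * math.pi * math.e)
for N in [1.0, threshold, 1e-3, 1e-6, 1e-12]:
    print(f"N = {N:.3e}   H(xi) = {gaussian_diff_entropy(N):+.4f} nats")
```

At the threshold the entropy is exactly zero; below it, the "noise entropy" enters the difference (19) with a reversed sign.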
This observation contradicts the physical meaning inherent in the definition of the difference (14). The change of sign, which turns the subtrahend into an addend to the output entropy, occurs already at "weak" noise: $N \le (2e\pi)^{-1}$. The physical meaning of this phenomenon is difficult to understand. Although in the form (20) the capacity formula does show that the function $C(N)$ is monotone as $N \to 0$, which allows the phenomenon to be explained by the difference between the definitions of differential and discrete entropy noted earlier, in the absence of a clear physical interpretation the correctness of the analytical derivation of capacity via the concepts of differential entropy and average mutual information is doubtful.
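Assuming (20) is the standard Gaussian-channel capacity form $C = \tfrac{1}{2}\ln(1 + P/N)$ (an assumption, since (20) is not reproduced on this page), the monotone, unbounded growth of $C(N)$ as $N \to 0$ can be checked numerically:

```python
import math

def capacity(P, N):
    """Capacity (nats per sample), assuming the standard form C = 0.5*ln(1 + P/N)."""
    return 0.5 * math.log(1 + P / N)

P = 1.0  # signal power (illustrative value)
for N in [1.0, 0.1, 0.01, 1e-4, 1e-8]:
    print(f"N = {N:.0e}   C = {capacity(P, N):.3f} nats")
```

In this form the divergence is attributed to the vanishing denominator rather than to a negative noise entropy, which is the contrast the text draws between forms (19)/(14) and (20).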
As we will see later, attributing to this formula the ability to determine the upper limit of the data transmission rate of the Gaussian channel is even more doubtful.

