
i.e., the effective width of the "ring" of scattering for uniformly distributed noise is, on average, about 2.5 times smaller (practically, for any space dimension n) than the same parameter for normal noise. The limiting absolute values of the dispersions of the sphere radii are:

$$\lim_{n\to\infty}\{n \cdot D[r_n]\} = \frac{N}{2}, \qquad \lim_{n\to\infty}\{n \cdot D[r_u]\} = \frac{N}{5}. \qquad (53)$$
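The limits in (53) are straightforward to check numerically. The following sketch is our own illustration (not part of the original text; the values of N, n and the number of trials are arbitrary): it generates noise vectors of equal power N for both distributions and compares n·D[r] with N/2 and N/5.

```python
# Monte Carlo check of the limits in (53): a minimal sketch (parameter values are
# illustrative).  Noise power per sample is N; n is the space dimension.
import numpy as np

rng = np.random.default_rng(0)
N = 1.0          # noise power per coordinate (as in Fig. 6)
n = 1000         # space dimension, large enough to approach the n -> infinity limits
trials = 5000    # number of noise vectors per distribution

# Normally distributed noise: x_i ~ N(0, N)
r_norm = np.linalg.norm(rng.normal(0.0, np.sqrt(N), (trials, n)), axis=1)

# Uniformly distributed noise with the same power: x_i ~ U(-a, a), a^2 / 3 = N
a = np.sqrt(3.0 * N)
r_unif = np.linalg.norm(rng.uniform(-a, a, (trials, n)), axis=1)

print("n*D[r_n] =", n * r_norm.var(), "  (limit N/2 =", N / 2, ")")
print("n*D[r_u] =", n * r_unif.var(), "  (limit N/5 =", N / 5, ")")
```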
This phenomenon is illustrated by the graphs in Fig. 6 for N = 1. For normally distributed noise, the absolute dispersion of the radius increases and tends to the limit value (53) "from below", while for uniformly distributed noise it decreases and tends to it "from above". Finally, we can draw the main and obvious conclusion:

3) the average radii of the uncertainty spheres for both types of noise PDF under consideration coincide asymptotically:

$$\lim_{n\to\infty} r_n = \lim_{n\to\infty} r_u = \sqrt{n \cdot N}. \qquad (54)$$
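As a quick numerical illustration of (54) (again a sketch with assumed parameters, N = 1 as above), the mean radius normalized by √(nN) approaches 1 for both noise types as n grows:

```python
# Numerical illustration of (54): for both noise types the mean radius approaches
# sqrt(n*N) as n grows (a sketch with the same assumed N = 1 as above).
import numpy as np

rng = np.random.default_rng(1)
N, trials = 1.0, 5000
a = np.sqrt(3.0 * N)   # half-width of the uniform PDF with power N

for n in (10, 100, 1000):
    r_n = np.linalg.norm(rng.normal(0.0, np.sqrt(N), (trials, n)), axis=1).mean()
    r_u = np.linalg.norm(rng.uniform(-a, a, (trials, n)), axis=1).mean()
    print(f"n = {n:4d}   E[r_n]/sqrt(nN) = {r_n / np.sqrt(n * N):.4f}"
          f"   E[r_u]/sqrt(nN) = {r_u / np.sqrt(n * N):.4f}")
```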
This result is a consequence of the law of large numbers. Of course, it can be generalized to any kind of centered PDF of signal and noise, i.e., to any continuous channel with additive noise that is statistically independent of the signal. As n → ∞, the parameters of the geometrical representations of the ITS are affected only by the average powers of the continuous signal and noise, not by the type of their distributions!
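To illustrate the generalization, the same check can be run with, for example, Laplace-distributed noise of power N (an example distribution not considered in the paper); the normalized radius again tends to √(nN):

```python
# Sketch of the generalization: Laplace-distributed noise (an example not used in
# the paper) with the same power N also yields a sphere radius close to sqrt(n*N).
import numpy as np

rng = np.random.default_rng(2)
N, n, trials = 1.0, 1000, 5000
b = np.sqrt(N / 2.0)                       # Laplace scale: variance 2*b^2 = N
r = np.linalg.norm(rng.laplace(0.0, b, (trials, n)), axis=1)
print("E[r]/sqrt(nN) =", r.mean() / np.sqrt(n * N))   # -> 1 as n grows
```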

For similar reasons, the radius of the hypersphere in the output space of a non-Gaussian channel also coincides with the value determined by expression (29); then, using (27), (29) and (30), we arrive at the same value of channel capacity for uniformly distributed noise, $C' = C = F\log\frac{S+N}{N}$, which contradicts the definition (43) and the statement (45). Thus, two of Shannon's works, [2] and [6], published one year apart, contradict each other when applied to a non-Gaussian channel.
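For completeness, the sphere-packing count behind this conclusion can be written out explicitly (a reconstruction of the standard geometric argument rather than a quotation of (27)-(30)): with $n = 2FT$ samples per signal of duration $T$, the number $M$ of distinguishable codewords is bounded by the ratio of the sphere volumes,

$$
M \le \left(\frac{\sqrt{n(S+N)}}{\sqrt{nN}}\right)^{\!n} = \left(\frac{S+N}{N}\right)^{\!n/2},
\qquad
C = \lim_{T\to\infty}\frac{\log M}{T} = \frac{n}{2T}\log\frac{S+N}{N} = F\log\frac{S+N}{N}.
$$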

To the question "which of the two methods of determining the capacity, the analytical (entropic) or the geometrical, is correct?" there is a definite answer: the geometrical one. The correctness of the geometrical approach can easily be verified by statistical modeling of a random code [12] (a sketch of such a simulation is given below). The analytical method gives a result that coincides with the result of the geometrical method only when the signal and noise are Gaussian processes. This is merely a coincidence, explained by the properties of the normal distribution, which has a special significance in the theory of probability and stochastic processes. For the reasons mentioned, the methodology of using "entropic power" and the bounds defined by (37) is not correct.
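A minimal sketch of such a random-code experiment (our own illustration under assumed parameters S, N, n and R, not the procedure of [12]): random codewords of power S are sent over a channel with additive uniformly distributed noise of power N and decoded by minimum Euclidean distance; at a rate below F log((S+N)/N), i.e. below (1/2) log2(1 + S/N) bits per sample, the estimated error probability stays small.

```python
# Random-coding check of the geometrical capacity with uniformly distributed noise.
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
S, N = 1.0, 0.25         # signal and noise powers; (1/2)*log2(1 + S/N) ~ 1.16 bit/sample
n = 20                   # block length (samples per codeword)
R = 0.5                  # code rate in bits per sample, well below capacity
M = 2 ** int(R * n)      # number of codewords
a = np.sqrt(3.0 * N)     # uniform noise on (-a, a) with power N

codebook = rng.normal(0.0, np.sqrt(S), (M, n))   # random Gaussian codebook of power S

errors, trials = 0, 200
for _ in range(trials):
    i = rng.integers(M)                                   # transmitted codeword index
    y = codebook[i] + rng.uniform(-a, a, n)               # received vector
    j = np.argmin(((codebook - y) ** 2).sum(axis=1))      # minimum-distance decoding
    errors += int(j != i)

print(f"M = {M} codewords, R = {R} bit/sample, "
      f"estimated error probability = {errors / trials:.3f}")
```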
The results of the analysis for both non-Gaussian and Gaussian channels (as, indeed, for any other model) have shown that these channels have the same capacity $C = C'$, whose value depends only on the signal-to-noise ratio and the channel bandwidth. Therefore, the definition of $C$ in (20) as the limit of the information transmission rate in a Gaussian channel with additive noise is not correct, to say the least.