Page 361 - Deep Learning

344                         Conversion

            with the beliefs that are active at the time the information is received, while
            the beliefs that contradict it are dormant. The resident and contender theo-
            ries grow separately and in parallel through monotonic belief formation, each
            within its own context.
               Parallel, monotonic growth can result in a latent conflict because surface
            features are poor predictors of essences. Seemingly distinct and separate phe-
            nomena and domains sometimes exhibit deep similarities. Hence, the range of
            applicability of a belief system cannot be determined by inspecting the beliefs
            themselves. Building an informal theory for domain A, the person might
            thereby unwittingly also build a theory of other domains B₁, B₂, B₃, … that do,
            in fact, share the same underlying structure, but the act of acquiring a theory
            for A does not in and of itself reveal those other applications. A person might
            form a theory that in fact applies to some domain B without being aware that
            it applies to that domain.
               How often might this seemingly unlikely event happen? Recall that a per-
            son might possess 500,000 individual beliefs. Belief systems vary in size, but
            suppose they contain an average of 100 or so beliefs; then a typical person will
            possess 5,000 distinct belief systems or local theories that guide thinking in
            particular domains (health, money, sports, etc.). A latent conflict involves two
            beliefs or belief systems, so the number of possible latent conflicts is approxi-
            mately 5,000², or 25,000,000. How often does it happen that two domains of
            experience that seem to be distinct nevertheless are related in such a way that
            a single theory can apply to both? Intuition suggests that such cases are rare.
            How rare? If only 1% of all pairs of domains share the same structure, there are
            250,000 such pairs in a person’s head. In how many of those cases are the two
            belief systems in conflict, according to some background theory? If only 1% of
            all pairs of belief systems that apply to the same domains are in conflict, there
            are 2,500 latent cognitive conflicts waiting to be noticed in the average head.
            This back-of-the-envelope calculation is not a serious mathematical model,
            but it illustrates that even though people do not (intentionally) construct mul-
            tiple theories for the same domain of experience, and even though two belief
            systems that apply to the same domain are not necessarily in conflict, it is nev-
            ertheless reasonable to believe that people carry around with them multiple
            latent conflicts between their beliefs.
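               The back-of-the-envelope estimate above can be reproduced in a few lines. This is only a sketch of the chapter's illustrative arithmetic; the input figures (500,000 beliefs, 100 beliefs per system, the two 1% rates) are the text's assumptions, not empirical data:

```python
# Back-of-the-envelope estimate of latent belief conflicts,
# using the chapter's illustrative figures (assumptions, not data).
total_beliefs = 500_000       # assumed beliefs per person
beliefs_per_system = 100      # assumed average size of a belief system

belief_systems = total_beliefs // beliefs_per_system   # 5,000 local theories
possible_pairs = belief_systems ** 2                   # ~25,000,000 pairs (rough upper bound)
shared_structure = possible_pairs * 0.01               # 1% of pairs share deep structure
latent_conflicts = shared_structure * 0.01             # 1% of those are in conflict

print(f"{belief_systems:,} belief systems")
print(f"{possible_pairs:,} possible pairs")
print(f"{shared_structure:,.0f} structurally related pairs")
print(f"{latent_conflicts:,.0f} latent conflicts")
```

Note that 5,000² counts ordered pairs, so it roughly doubles the number of distinct pairs; since the text explicitly treats the figure as an order-of-magnitude illustration rather than a model, the conclusion is unaffected.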
               In short, the process that sets the stage for belief revision is that a per-
            son responds to some domain of experience A by forming an intuitive theory,
            Th(A); that Th(A) happens to apply to some other domain B as well; and that
            Th(A) is incompatible with the person’s resident theory Th(B) for that domain
            according  to  some  background  theory  Th(0).  Due  to  the  impossibility  of