
            we can characterize the ongoing change as consisting of both the elimination
            of steps – the discovery of shortcuts – and the speedup of the remaining steps.
            The fact that change of this dual sort generates negatively accelerated improve-
            ment curves is a level-invariant characteristic.
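
            The text gives no formula for this shape, but a common formalization (an
            assumption here, not the author's own notation) is the power-law form routinely
            fitted to practice data, with T(n) standing for the time or error rate after
            n practice trials:

                T(n) = A + B n^{-c},        B > 0,  c > 0,

            so that dT/dn = -cB n^{-c-1} < 0 while d^2T/dn^2 = c(c+1)B n^{-c-2} > 0:
            performance improves on every trial, but each additional trial buys a smaller
            improvement than the one before, which is what "negatively accelerated" means
            here. A is the asymptotic floor that the curve approaches but never reaches,
            and the same shape can be read at either level, whether n counts an individual's
            trials or a collective's production runs.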

            Summary
            A collective – group, team, population or organization – can be regarded as a
            single entity that learns from error. The abstract principles of constraint-based
            learning appear to be level-invariant. That is, they apply at the collective level
            as well, even though the material implementation of the relevant processes is
            different. A collective fails, perhaps among other ways, by applying underspeci-
            fied decision rules and operating procedures, thereby performing actions in
            situations in which they should not have been performed. Errors of this sort are
              discovered on the basis of error signals that violate constraints on the appro-
            priate, correct or useful system states. When there is an opportunity to learn
            from such errors, the response is typically to specialize – make more explicit,
            extensive and precise – the applicability conditions for the relevant decision
            rules and operating procedures. If the principles of constraint-based specializa-
            tion are level-invariant, the patterns that characterize collective change over
            time  should  be  similar  to  the  patterns  in  individual  change.  Available  data
            indicate that this is indeed the case. Specifically, learning in collectives follows
            the same negatively accelerated learning curve that characterizes learning in
            individuals.


                                    Safety Implications
            Although individual error rate curves never fall all the way to zero, it is pos-
            sible to design collective systems for higher levels of safety than such curves
            suggest.  Safety  scientists  characterize  commercial  airlines,  the  U.S.  Navy’s
            nuclear submarines and the European railroads as ultrasafe systems, because
            the error rates for those systems have fallen so far that the probability of a
            disastrous accident is 0.0000005 per unit, where a unit is, for example, a unit
            of time or distance or a passenger, depending on the system.²⁸ This level of
            safety requires that errors are caught before they develop into disasters, which
            in turn requires that the participating individuals are provided with multiple
            sources of information about the current system state. In technical systems,
            instruments provide such information. Recommendations in other areas of
            experience follow the same principle of providing the operators with denser
            information and status indicators along the way, so that pre-failure