
evolved over centuries to ensure accurate navigation, and the team members – quartermasters all – are well trained and have long experience. In addition, the team's procedures provide multiple checks on the accuracy of the plot. For example, the bearings are heard by all the members of the plotting team and the plot is visible to the officer of the watch, so errors can be caught quickly. Aircraft carriers do not run aground because the piloting procedures impose a tight set of constraints on the navigation task: If there are different bearings, they should cross each other in a tight box, that is, at the same point on the line representing the course of the ship; the plotted position of the ship has to have a sensible relation to the immediately preceding position; the plotted position has to agree with the depth markings on the chart; and so on. Different team members have different roles in applying these constraints, and Hutchins argues that the roles and the procedures have evolved over a long time precisely to ensure that the team is less error prone than its individual members.
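To make the constraint idea concrete, the checks listed above might be cast as predicates over the current state of the plot, roughly as in the following Python sketch. The function names, units and tolerances are all hypothetical, chosen only for illustration; they are not drawn from Hutchins's account.

    import math

    def bearings_cross_in_tight_box(intersections, tolerance_m=50.0):
        # The pairwise intersections of the bearing lines should all
        # fall inside a tight box around a single point.
        xs = [x for x, _ in intersections]
        ys = [y for _, y in intersections]
        return (max(xs) - min(xs) <= tolerance_m
                and max(ys) - min(ys) <= tolerance_m)

    def fix_consistent_with_previous(fix, previous_fix,
                                     speed_mps, interval_s, slack=1.5):
        # The new fix cannot lie farther from the previous one than the
        # ship could have travelled in the interval (with some slack).
        return math.dist(fix, previous_fix) <= speed_mps * interval_s * slack

    def depth_agrees_with_chart(sounding_m, charted_depth_m, tolerance_m=5.0):
        # The measured sounding should agree with the depth marked
        # on the chart at the plotted position.
        return abs(sounding_m - charted_depth_m) <= tolerance_m

    # Apply every check to the (made-up) current plot state and collect
    # the names of any violated constraints.
    checks = {
        "bearings cross in a tight box":
            bearings_cross_in_tight_box([(103.0, 210.0), (104.5, 208.0),
                                         (102.0, 211.5)]),
        "fix consistent with previous fix":
            fix_consistent_with_previous((103.0, 210.0), (100.0, 180.0),
                                         speed_mps=10.0, interval_s=180.0),
        "depth agrees with chart":
            depth_agrees_with_chart(sounding_m=42.0, charted_depth_m=45.0),
    }
    violations = [name for name, ok in checks.items() if not ok]

The point of the sketch is only that each check is a yes/no test on the current system state, which is exactly the form the next paragraph attributes to operator knowledge in general.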
In short, notions of constraints and constraint violations apply to real collectives. Knowledge that operators use to detect errors and pre-failure states can typically be cast as constraints, and the detection of impending disasters depends on information about the system state that allows the operators to decide whether the relevant constraints are violated. The interesting emergent feature is that different individuals can play different roles in a collective system, some providing information (e.g., the pelorus operators) and others checking for constraint violations (e.g., the plotters). This division of labor has no counterpart in the mind of the individual.
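The division of labor itself can be sketched in the same terms: some roles only contribute observations to a shared state, while others only test that state against the constraints. The structure below is a hypothetical illustration of that separation, not a model of Hutchins's data.

    shared_state = {}

    def pelorus_operator(state, bearing_deg):
        # Information provider: contributes an observation to the shared
        # state but performs no checking of its own.
        state.setdefault("bearings", []).append(bearing_deg)

    def plotter(state, constraints):
        # Constraint checker: inspects the shared state and reports the
        # names of any violated constraints.
        return [name for name, check in constraints.items()
                if not check(state)]

    constraints = {
        "at least three bearings for a fix":
            lambda state: len(state.get("bearings", [])) >= 3,
    }

    pelorus_operator(shared_state, 47.0)
    pelorus_operator(shared_state, 132.5)
    print(plotter(shared_state, constraints))
    # -> ['at least three bearings for a fix']

Neither role knows what the other knows: the provider need not understand the constraints, and the checker need not be able to make the observations, which is why the arrangement has no analogue inside a single mind.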

The specialization of operating procedures
How are errors unlearned in collectives? What happens when a complex system encounters an accident, disaster or failure? What change processes are triggered to prevent the same negative outcome from occurring in the future? Can that process be described as a specialization of the relevant operating procedures, at least in a significant subset of cases? One difficulty in answering this question is that we do not possess a corpus of well-documented cases of successful unlearning of errors to complement the corpora of accidents, disasters and failures. It is natural that safety scientists have focused on error types and their origins. The constraint-based perspective does not assign those topics any less importance, but it adds the observation that errors are unavoidable, and that error prevention in the long run therefore depends on unlearning the errors. This requires a change in the relevant operating procedures. Such changes require knowledge about typical system responses to failures, but, paradoxically, to the extent that such responses are successful, they prevent