
            more tanks in the area than originally estimated. The violated constraint is all too
            obvious: Do not underestimate the enemy.
               In short, some collective accidents, disasters and failures can be analyzed
            in terms of underspecified decision rules, operating procedures that ignore
            situation features indicating that the action under consideration is not
            appropriate, correct or useful. If the applicability conditions for the relevant action
            type had been more precisely specified – when, under which circumstances, is
            a merger the right move for a corporation? – the disastrous action might have
            been suppressed, and some other action would have been taken instead. There
            are no data on prevalence. That is, there is no way of knowing what proportion
            of all accidents, disasters and collective failures fit this schema. For present
            purposes, it is enough that some of them do.

            The centrality of constraints
            Organizations no less than individuals detect errors and impending failures as
            deviations of experienced outcomes from desired, expected or predicted
            outcomes. Unintended final outcomes are usually obvious. Falling stock prices,
            heavy casualties, dead patients and shipwrecks leave no room for doubt that
            something went wrong. The important question is how the extended process
            of building wealth, curing patients, operating complex machinery or
            navigating is judged as being on track versus derailed before the final
            outcome. The more the relevant operators know about the way their system
            should work – what the system should look like when everything is going
            well – the higher the probability that they will catch a derailed process
            before the pre-failure state erupts into unacceptable damage.
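
               Read computationally, this paragraph describes a monitoring loop: compare
            what the process is expected to look like at each step with what is actually
            observed, and flag the process as derailed as soon as the deviation exceeds
            some tolerance, rather than waiting for the final outcome. The sketch below
            is a minimal illustration only; the function name, the numbers and the
            tolerance are all invented for the example.

                # A minimal sketch of outcome monitoring as deviation detection.
                # Expected values, readings and tolerance are invented for illustration.
                def first_deviation(expected, observed, tolerance):
                    """Return the index of the first step whose observed value strays
                    from the expected value by more than tolerance, or None if the
                    process stayed on track throughout."""
                    for i, (want, got) in enumerate(zip(expected, observed)):
                        if abs(want - got) > tolerance:
                            return i
                    return None

                expected_course = [100, 102, 104, 106, 108]  # how things should look
                observed_course = [100, 101, 104, 93, 70]    # what actually happened

                step = first_deviation(expected_course, observed_course, tolerance=5.0)
                if step is not None:
                    print(f"derailed at step {step}")        # prints: derailed at step 3

            The richer the operators' picture of the expected course, the earlier such a
            monitor can flag a derailment, which is the point of the paragraph.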
               To cast the relevant knowledge as consisting of constraints is in many
            cases straightforward. In fact, safety regulations are particularly clear
            examples of constraints. The sign “hard hat area” on the gate to a construction
            site means that if you are walking in this area, you ought to be wearing a
            hard hat (or else you have broken the rules); the announcement that “as we
            prepare to take off, all luggage should be stored under the seat in front of
            you or in the overhead compartments” can be restructured slightly to say: if
            you are in an airplane, and the airplane is about to take off, your luggage
            ought to be stowed (or else you are in violation of the safety rules).
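
               The if-then-ought pattern behind these examples can be made concrete as
            a small data structure. The sketch below is illustrative only – the names
            Constraint, relevance and satisfaction are assumptions of this example, not
            terminology from the text – and it represents a rule as a pair of tests: one
            for whether the rule is relevant to a situation, and one for whether it is
            satisfied there. A constraint is violated exactly when it is relevant but
            unsatisfied, which is what the parenthetical “or else ...” clauses express.

                # A minimal sketch of safety rules as relevance/satisfaction pairs.
                # All names here are illustrative assumptions, not the text's notation.
                from dataclasses import dataclass
                from typing import Any, Callable, Dict

                Situation = Dict[str, Any]  # a situation as a bag of observed features

                @dataclass
                class Constraint:
                    label: str
                    relevance: Callable[[Situation], bool]     # "if you are in this situation..."
                    satisfaction: Callable[[Situation], bool]  # "...then this ought to hold"

                    def violated_by(self, s: Situation) -> bool:
                        # Violated when relevant but unsatisfied; an irrelevant
                        # constraint says nothing about the situation.
                        return self.relevance(s) and not self.satisfaction(s)

                hard_hat = Constraint(
                    label="hard hat area",
                    relevance=lambda s: s.get("location") == "construction site",
                    satisfaction=lambda s: bool(s.get("wearing_hard_hat")),
                )

                stow_luggage = Constraint(
                    label="stow luggage for takeoff",
                    relevance=lambda s: bool(s.get("on_airplane"))
                                        and bool(s.get("about_to_take_off")),
                    satisfaction=lambda s: bool(s.get("luggage_stowed")),
                )

                situation = {"location": "construction site", "wearing_hard_hat": False}
                for c in (hard_hat, stow_luggage):
                    if c.violated_by(situation):
                        print("violation:", c.label)  # prints: violation: hard hat area

            Nothing in this scheme restricts the tests to properties of a single person;
            the occupancy rule discussed next fits the same mold once relevance and
            satisfaction inspect a feature of the collective, such as the head count in
            the room.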
               Some safety rules apply to each participant individually, but there are
            also safety rules that apply to collectives. A familiar example is the
            theater sign that says, “occupancy by more than N persons in this room is
            illegal.” This constraint applies to the collective as a whole without
            prescribing any particular behavior on the part of any one individual.
            Regardless of domain, doing the right thing depends to