any hierarchical organization will fail eventually. Similarly, the observation that
machinery with many components can malfunction in multiple ways and that
it is not possible to foresee all interactions among simultaneously malfunctioning components only predicts that if a complicated technical system operates
long enough, a composite breakdown for which the responsible operators are
ill prepared will occur. Such assertions, although true, provide little insight into
how error rates change over time.
The ability of the individual to learn is one of the factors that contribute
to change in the collective. If every operator or participant performs his part of
the shared overall task better, we would expect the collective to function better,
ignoring for the moment what “better” means in any one context. The question is how individual learning events project onto the collective level. Like
individual errors, a collective error can be conceptualized as an action that
was taken when it should not have been; indeed, this formulation is almost
synonymous with the meaning of error. This observation suggests that the constraint-based perspective applies to the collective itself, regarded as a single
entity that learns from its errors.
Constraint-Based Learning in Collectives
It is highly unlikely that any single perspective will cover all aspects of such
complicated events as accidents, disasters and collective failures. For the constraint-based perspective to be useful, it is enough if it applies to some significant subset of such events. Three key questions are whether collectives make
overgeneralization errors, detect errors via constraint violations and correct
errors via the specialization of operating procedures.
The nature of collective errors
Collective errors are, like individual errors, actions taken when they should
not have been. There is little point in distinguishing between errors of commission and errors of omission, because when the appropriate action A is omitted
in a continuously active system, some other action B is always performed in
its stead; hence, B is performed when it should not have been. Every error of
omission is therefore also an error of commission. This observation is as valid at the collective level as at the individual one, and it suggests that at least some
collective errors can be understood as overgeneralization errors: If the applicability conditions for B had been specified in more detail, perhaps B would not
have been performed and the probability that A would have been performed
instead would have been greater.
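To make this mechanism concrete, the following minimal sketch in Python models an operating procedure as a condition-action rule and a norm as a relevance-satisfaction constraint pair; every name in it, from the rules to the constraint to the plant states, is a hypothetical illustration rather than anything drawn from the text. The overgeneral rule for B fires in a situation where a constraint is violated, and the error is corrected by specializing the rule's applicability condition.

```python
# A toy sketch of constraint-based error correction, under the assumption
# that operating procedures can be modeled as condition-action rules and
# norms as (relevance, satisfaction) constraint pairs. All names below
# are hypothetical illustrations.

class Rule:
    """An operating procedure: if condition(state) holds, take the action."""
    def __init__(self, name, condition, action):
        self.name, self.condition, self.action = name, condition, action

class Constraint:
    """Whenever relevance(state) holds, satisfaction(state) must hold too."""
    def __init__(self, name, relevance, satisfaction):
        self.name, self.relevance, self.satisfaction = name, relevance, satisfaction

    def violated_by(self, state):
        return self.relevance(state) and not self.satisfaction(state)

def specialize(rule, constraint):
    """Correct an overgeneralization error: narrow the rule's applicability
    condition so the rule no longer fires when the constraint is at stake."""
    old = rule.condition
    rule.condition = lambda s: old(s) and not constraint.relevance(s)

# Action B is overgeneral: it fires whenever the plant is running, even
# when high pressure calls for the venting action A instead. For brevity,
# the violation is checked in the same state in which B fires.
rule_b = Rule("increase_throughput",
              condition=lambda s: s["running"],
              action="increase throughput")

safety = Constraint("vent_under_pressure",
                    relevance=lambda s: s["pressure"] == "high",
                    satisfaction=lambda s: s["vented"])

state = {"running": True, "pressure": "high", "vented": False}

assert rule_b.condition(state)       # B is taken when it should not be
assert safety.violated_by(state)     # the constraint violation reveals the error
specialize(rule_b, safety)
assert not rule_b.condition(state)   # B no longer applies in this situation
```

Specializing the condition rather than deleting the rule preserves B for the situations in which it remains appropriate; only the overlap with the constraint's relevance condition is carved away, leaving room for the appropriate action A to be performed instead.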