achieving situational awareness is one of the most challenging aspects of these operators’ jobs and is central to good decision making and performance. In that context, Endsley is writing with regard to all workers in highly complex and dynamic systems, but the application to aviation is seamless: when we understand and apply the concept of situational awareness to the system design process, we can achieve truly effective human-automation coordination.

For the system designer to safeguard against loss of situation awareness, they must understand its causation. Humans are naturally poor supervisors of highly automated systems that keep them in a state of mental underload, and it has been widely reported that both mental underload and overload can negatively influence performance (Xie & Salvendy, 2000). What this means is that in a highly automated system, the user is potentially left with too little to do in the system process and falls out of the loop.

This out-of-the-loop performance issue is suggested to occur through vigilance and complacency problems, shifting from active to passive roles in the system, and changes in feedback to the operator (Endsley, 1996). The opposite can also occur: instances of extreme mental overload can severely detract from pilot performance. An example of this is Qantas Flight 72, in which an automation error resulted in an extreme number of audio and visual warnings being sent to the pilot, some of which completely contradicted other warnings. As a result, these warnings, intended to assist the pilot, created a significant amount of workload and distraction for the flight crew (ATSB, 2008). This clearly demonstrates a lack of consideration for the human observer of the automated system.

Operators of an automated system have a diminished ability both to detect system errors and subsequently to perform tasks manually in the face of automation failures, when compared to workers who manually perform the same task as the automated system (Endsley and Kiris, 1995).

Endsley and Kiris are among many who have conducted studies into the effects of automated systems on their human supervisors, and who have documented significant decreases in human performance. Casner et al. (2014) conducted a study specifically to address concerns about pilot-skill degradation caused by reliance on automation. They found that basic skills such as instrument scanning and stick control were reasonably maintained, but that higher-level cognitive tasks such as navigation and recognising instrument system failures suffered frequent and significant problems. They hypothesise that the retention of such cognitive skills may depend on the pilot’s level of active engagement while supervising the automation. The findings of Casner et al. are consistent with the three pathways to becoming out of the loop suggested by Endsley, particularly with regard to the necessity of assuming an active role in the automated system.

Automation surprise is another vital factor for a modern system designer to be aware of. It is strongly linked to a loss of situation awareness, although subtly and distinctly different. Automation surprise does not necessarily mean the pilot has experienced any of the detrimental pathways suggested by Endsley. A pilot may believe they are fully engrossed in the system and fully aware of their current situation when, suddenly, the automated system behaves completely unexpectedly; the pilot detects but does not understand the issue (Dehais et al., 2015).

De Boer and Hurts (2017) conducted a study into automation surprise among Dutch airline pilots and concluded that automation surprise seems to be a manifestation of the system complexity and interface design choices in aviation today, nearing the bounds of what is humanly possible to comprehend. Furthermore, they concluded that a lack of knowledge or training was outweighed as a factor by the advanced sophistication of the automated systems. This means that the modern designer will