SVMIC Diagnostic Radiology: Interpreting the Risks
patient’s condition could change an AI diagnosis. For instance,
“alcohol abuse” could produce a different diagnosis than
“alcohol dependence,” and “lumbago” could produce a different
diagnosis than “back pain.”
Changing such diagnoses could harm patients. Any changes
physicians make to scans or other patient data could become
part of the patient’s permanent medical record and affect
subsequent medical decisions. If such a change is made and the
patient is injured, how is liability allocated? Right now there is
no definitive answer, in part because AI in medicine is still in
its infancy.
Another major concern is whether the huge amounts of patient
data gathered to train AI systems are safe with the hospitals,
research organizations, insurance companies, and technology
companies that hold them, particularly with respect to HIPAA
(the Health Insurance Portability and Accountability Act of 1996).
To realize the advantages of data-driven digital medicine, patient
data privacy must be taken into account, and caution must be
exercised in both the design and use of AI.
AI will take over some tasks, and radiologists must adapt
accordingly. Because of financial and liability interests, it is
unlikely that AI will step in and perform all of the tasks normally
performed by radiologists. The integration of AI into radiological
practice offers many benefits in diagnostic accuracy. This
technology will not replace radiologists, but it will replace
radiologists who do not utilize AI.
From a risk perspective, radiologic interpretation cannot be
fully mechanized or automated; it is a human enterprise based
on complex psycho-physiologic and cognitive processes.