วารสารกฎหมาย ศาลอุทธรณ์คดีชำนัญพิเศษ (Journal of Law, Court of Appeal for Specialised Cases), Special Issue 2021 (B.E. 2564), p. 87
particularly likely to falsely flag black defendants as future criminals, wrongly labelling
them this way at almost twice the rate as white defendants' and 'White defendants were
mislabelled as low risk more often than black defendants.'⁵ Some scholars have argued
that they are not against the use of predictive justice but are concerned about its level of
transparency.⁶ This is understandably so because transparency is central when it involves
algorithms that make important decisions, especially when 'scoring' is involved, because
scoring tends to be subjective and can thus lead to calls for greater scrutiny. There is also
the risk of humans developing blind reliance and accepting computer-generated outcomes
wholly without questioning them. This may arise from our belief that computers, being
machines, are accurate. How often do we question the calculations rendered by electronic
calculators or spreadsheets? We might think, 'who in their right mind would quarrel or
argue with a machine?'. Incidentally, the words of Lord Denning come to mind in
Thornton v Shoe Lane Parking Ltd,⁷ pertaining to an automated parking ticket dispenser,
where 'one may protest to the machine, even swear at it but it will remain unmoved.'
Would it be a surprise that AI will make its presence felt in arbitration and dispute
resolution, when a myriad of commercial software and technological solutions are already
being marketed and widely available? In arbitration today, parties and tribunals are
confronted with complex technical issues requiring the expertise of expert witnesses;
they are also confronted with conflicting expert views or opinions, as well as the question
of bias. Computers are thought to be unbiased since they are 'robots', and since they are
also emotionless, they are consistent and cannot be influenced.
In terms of the ability to undertake complex tasks more quickly than a human can, AI can
also be developed to perform multi-tasking functions involving vast amounts of data,
for instance when judges or arbitrators must ascertain loss of future profits, which
would require them to take into account financial calculation models,
5 Julia Angwin and others, 'Machine Bias' ProPublica (New York, 23 May 2016) <https://www.propublica.org/
article/machine-bias-risk-assessments-in-criminal-sentencing> accessed 23 March 2021
6 Cynthia Rudin, Caroline Wang and Beau Coker, ‘The Age of Secrecy and Unfairness in Recidivism
Prediction’ (2020) 2(1) Harvard Data Science Review <https://hdsr.mitpress.mit.edu/pub/7z10o269/release/4>
accessed 30 March 2021
7 Thornton v Shoe Lane Parking Ltd [1970] EWCA Civ 2, [1971] 2 QB 163