Face recognition researcher fights Amazon over biased AI
By MATT O’BRIEN

[Photo: In this Wednesday, Feb. 13, 2019, photo, Massachusetts Institute of Technology facial recognition researcher Joy Buolamwini stands for a portrait at the school, in Cambridge, Mass. Associated Press]

CAMBRIDGE, Mass. (AP) — Facial recognition technology was already seeping into everyday life — from your photos on Facebook to police scans of mugshots — when Joy Buolamwini noticed a serious glitch: Some of the software couldn’t detect dark-skinned faces like hers.

That revelation sparked the Massachusetts Institute of Technology researcher to launch a project that’s having an outsize influence on the debate over how artificial intelligence should be deployed in the real world.

Her tests on software created by brand-name tech firms such as Amazon uncovered much higher error rates in classifying the gender of darker-skinned women than for lighter-skinned men.

Along the way, Buolamwini has spurred Microsoft and IBM to improve their systems and irked Amazon, which publicly attacked her research methods. On Wednesday, a group of AI scholars, including a winner of computer science’s top prize, launched a spirited defense of her work and called on Amazon to stop selling its facial recognition software to police.

Her work has also caught the attention of political leaders in statehouses and Congress and led some to seek limits on the use of computer vision tools to analyze human faces.

“There needs to be a choice,” said Buolamwini, a graduate student and researcher at MIT’s Media Lab. “Right now, what’s happening is these technologies are being deployed widely without oversight, oftentimes covertly, so that by the time we wake up, it’s almost too late.”

Buolamwini is hardly alone in expressing caution about the fast-moving adoption of facial recognition by police, government agencies and businesses from stores to apartment complexes. Many other researchers have shown how AI systems, which look for patterns in huge troves of data, will mimic the institutional biases embedded in the data they are learning from. For instance, if AI systems are developed using images of mostly white men, the systems will work best in recognizing white men.

Those disparities can sometimes be a matter of life or death: One recent study of the computer vision systems that enable self-driving cars to “see” the road shows they have a harder time detecting pedestrians with darker skin tones.

What’s struck a chord about Buolamwini’s work is her method of testing the systems created by well-known companies. She applies such systems to a skin-tone scale used by dermatologists, then names and shames those that show racial and gender bias. Buolamwini, who’s also founded a coalition of scholars, activists and others called the Algorithmic Justice League, has blended her scholarly investigations with activism.

“It adds to a growing body of evidence that facial recognition affects different groups differently,” said Shankar Narayan, of the American Civil Liberties Union of Washington state, where the group has sought restrictions on the technology. “Joy’s work has been part of building that awareness.”

Amazon, whose CEO, Jeff Bezos, she emailed directly last summer, has responded by aggressively taking aim at her research methods.

A Buolamwini-led study published just over a year ago found disparities in how facial-analysis systems built by IBM, Microsoft and the Chinese company Face Plus Plus classified people by gender. Darker-skinned women were the most misclassified group, with error rates of up to 34.7%. By contrast, the maximum error rate for lighter-skinned males was less than 1%. The study called for “urgent attention” to address the bias.

“I responded pretty much right away,” said Ruchir Puri, chief scientist of IBM Research, describing an email he received from Buolamwini last year. Since then, he said, “it’s been a very fruitful relationship” that informed IBM’s unveiling this year of a new 1 million-image database for better analyzing the diversity of human faces. Previous systems have been overly reliant on what Buolamwini calls “pale male” image repositories.

Microsoft, which had the lowest error rates, declined comment. Messages left with Face Plus Plus weren’t immediately returned.

Months after her first study, when Buolamwini worked with University of Toronto researcher Inioluwa Deborah Raji on a follow-up test, all three companies showed major improvements.

But this time they also added Amazon, which has sold the system it calls Rekognition to law enforcement agencies. The results, published in late January, showed Amazon badly misidentifying darker-hued women.

“We were surprised to see that Amazon was where their competitors were a year ago,” Buolamwini said.

Amazon dismissed what it called Buolamwini’s “erroneous claims” and said the study confused facial analysis with facial recognition, improperly measuring the former with techniques for evaluating the latter.

“The answer to anxieties over new technology is not to run ‘tests’ inconsistent with how the service is designed to be used, and to amplify the test’s false and misleading conclusions through the news media,” Matt Wood, general manager of artificial intelligence for Amazon’s cloud-computing division, wrote in a January blog post. Amazon declined requests for an interview.

“I didn’t know their reaction would be quite so hostile,” Buolamwini said recently in an interview at her MIT lab.

Coming to her defense Wednesday was a coalition of researchers, including AI pioneer Yoshua Bengio, recent winner of the Turing Award, considered the tech field’s version of the Nobel Prize.

They criticized Amazon’s response, especially its distinction between facial recognition and analysis. “In contrast to Dr. Wood’s claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications,” they wrote.

Its few publicly known clients have defended Amazon’s system. Chris Adzima, senior information systems analyst for the Washington County Sheriff’s Office in Oregon, said the agency uses Amazon’s Rekognition to identify the most likely matches among its collection of roughly 350,000 mug shots. But because a human makes the final decision, “the bias of that computer system is not transferred over into any results or any action taken,” Adzima said.

But increasingly, regulators and legislators are having their doubts. A bipartisan bill in Congress seeks limits on facial recognition. Legislatures in Washington and Massachusetts are considering laws of their own.

Buolamwini said a major message of her research is that AI systems need to be carefully reviewed and consistently monitored if they’re going to be used on the public. Not just to audit for accuracy, she said, but to ensure face recognition isn’t abused to violate privacy or cause other harms.

“We can’t just leave it to companies alone to do these kinds of checks,” she said.
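A note on the method: the audit technique described above boils down to scoring a classifier separately for each skin-type and gender subgroup rather than reporting a single overall accuracy. The Python sketch below is a minimal illustration of that idea; the benchmark, field names and classifier are hypothetical stand-ins, not Buolamwini’s actual code or data.

from collections import defaultdict

def audit_by_subgroup(samples, predict):
    # samples: dicts with "image", "gender", "skin_type" keys (hypothetical schema)
    # predict: any gender classifier under test, e.g. a wrapper around a cloud API
    # returns: error rate per (skin_type, gender) subgroup
    errors = defaultdict(int)
    totals = defaultdict(int)
    for s in samples:
        group = (s["skin_type"], s["gender"])
        totals[group] += 1
        if predict(s["image"]) != s["gender"]:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy benchmark and a deliberately biased classifier that always answers "male".
benchmark = [
    {"image": None, "gender": "male",   "skin_type": "lighter"},
    {"image": None, "gender": "male",   "skin_type": "darker"},
    {"image": None, "gender": "female", "skin_type": "lighter"},
    {"image": None, "gender": "female", "skin_type": "darker"},
]
print(audit_by_subgroup(benchmark, lambda image: "male"))
# male subgroups: 0.0 error; female subgroups: 1.0 error

The toy classifier makes the point: its overall error rate is 50%, a single number that hides subgroups misclassified 100% of the time, which is the kind of gap the study surfaced when it reported error rates of up to 34.7% for darker-skinned women against less than 1% for lighter-skinned men.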