If it continues in its lane, it will plow into several pedestrians. If it swerves into the adjacent lane, it will ram into a concrete barrier. In either case, injuries and even loss of life are likely.

Although this scenario currently lives only on Moral Machine, a platform designed by the Massachusetts Institute of Technology to “gather a human perspective on moral decisions made by machine intelligence,” it might be coming soon to a street near you. And that has Jane Bambauer both fascinated and concerned by what she describes as the current “Wild West” of artificial intelligence.

“The coordination of a world with both driven cars and driverless cars will be incredibly complicated,” Bambauer, a University of Arizona law professor, told a group of science teachers and graduate students after her lecture in February on “Machine Influencers and Decision Makers” at Centennial Hall. The lecture was the fifth in the College of Science series on “Humans, Data and Machines,” which has focused on the convergence of the digital, physical and biological worlds.

Bambauer said the transportation industry will be turned upside down in the same way that it was when automobiles disrupted a world of horse-drawn carriages, forcing both modes to share the road. She said it will be tempting to rush in and regulate, but much better to go slowly.

“My default position with new technology is not to do much heavy-handed regulation,” she told the teachers and students.

“I’m a bit of a contrarian in my field. My impulse is to let companies figure out what’s working and what isn’t, before we regulate. There are instances where, when trying to regulate in advance, you end up missing on innovations (that follow),” she said, citing the early World Wide Web as an example.

Bambauer told her audience that it’s useless trying to fight the onslaught of algorithms. They’re pervasive, they’re not going away, and they’re assessing our credit scores, career interests, health care and more.

“We interact with machine-learning algorithms almost any time we do anything on the internet,” she said.

It’s more prudent, then, to ask probing questions such as: Is an algorithm biased? Is it manipulative? Are there hidden moral or political issues?

An example of bias, Bambauer said, is seen in how scores from the computer program COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) are used to create a risk scale for criminal recidivism. Although the program’s 137 variables do not include race or ZIP code, the scale still has bias built into its computations, as Bambauer demonstrated through a series of bar graphs.
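Bambauer’s bar graphs aren’t reproduced here, but the mechanism she described can be sketched in a few lines of Python. The sketch below is a hypothetical illustration with made-up groups, numbers and scoring rule, not COMPAS’s actual variables or formula: a risk score computed from a single “race-blind” input can still split along group lines when that input correlates with group membership.

```python
import random

random.seed(0)

# Hypothetical illustration, not COMPAS's actual model: a risk score
# built only from a "race-blind" input (recorded prior arrests) still
# splits along group lines when that input is itself correlated with
# group membership, for example through uneven policing.

def recorded_priors(group):
    # Assumed numbers: the same underlying behavior leaves more
    # recorded priors in group B than in group A.
    return max(0.0, random.gauss(2.0 if group == "A" else 4.0, 1.0))

def risk_score(priors):
    # A toy 0-10 scale computed from the race-blind feature alone.
    return min(10.0, priors * 2.0)

for group in ("A", "B"):
    scores = [risk_score(recorded_priors(group)) for _ in range(10_000)]
    flagged = sum(s >= 7.0 for s in scores) / len(scores)
    print(f"group {group}: {flagged:.0%} scored high-risk")
```

Nothing in the scoring rule mentions the group, which is the point: the disparity arrives through the correlated input.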
Even the perception of bias can turn people away from trust in our institutions, she said, showing how basic Google searches can be interpreted by some as an indication of the tech giant’s political leanings.

Manipulation can surface in something as innocuous as food reviews or as insidious as “fake news.” Facebook’s news-feed algorithm “has incredible power,” Bambauer said, adding that “the filter bubble is limiting you” in terms of big-picture perspective.
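How a feed can be “limiting” is easy to see in miniature. What follows is a hypothetical sketch, not Facebook’s actual ranking code; the topics, click rates and scoring rule are all invented. A feed that sorts stories by a reader’s accumulated clicks ends up showing a far narrower slice of the world than the reader’s own browsing would suggest.

```python
import random

random.seed(1)

# Hypothetical sketch of a "filter bubble": a feed that ranks topics
# by past engagement keeps resurfacing the same interests, even for
# a fairly curious reader.

TOPICS = ["politics", "science", "sports", "arts", "business"]
counts = {t: 0 for t in TOPICS}        # the reader's clicks per topic
shown_on_top = {t: 0 for t in TOPICS}  # which topic led the feed

for day in range(365):
    # Rank topics by accumulated clicks, with noise as a tiebreaker.
    feed = sorted(TOPICS, key=lambda t: counts[t] + random.random(),
                  reverse=True)
    shown_on_top[feed[0]] += 1
    # The reader clicks the lead story half the time and browses
    # freely the other half.
    clicked = feed[0] if random.random() < 0.5 else random.choice(TOPICS)
    counts[clicked] += 1

print("actual clicks:   ", counts)
print("lead-story slots:", shown_on_top)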
In any event, algorithms are complicated to untangle.

“All of the problems (with algorithms) are interconnected,” she said. “Accuracy might increase bias. Data gathering might affect privacy. The problems are with setting priorities among competing goals.”

Echoing what previous speakers in the series had said, she noted, “It’s easy to blame the algorithms, but algorithms do what we ask.”