Page 35 - INC Magazine-November 2018
Seventy-six percent of participants in a study published in Science said an autonomous vehicle should sacrifice a passenger to save 10 pedestrians. When participants in another study considered riding in the car themselves, up to one-third fewer were OK with that. The unfortunate message to manufacturers: There's an ethical gap between passengers and pedestrians—and selflessness may not sell.

THE JARGONATOR
Swatting the buzzwords of business since 2014.
BY BEN SCHOTT
opposite. Whatever decision the algorithm makes in that scenario would be implemented in millions of cars." If the scenario arose 100,000 times in the real world and resulted in accidents, several more—or fewer—bicyclists could lose their lives as a result of the machines' decision. That kind of tradeoff goes almost unnoticed, Awad continues, when we drive ourselves: We experience it as a one-off. But driverless cars must grapple with it at scale.

On top of that, today's artificial intelligence isn't simply a matter of precoded if-then statements. Rather, intelligent systems learn and adapt as they are fed data by humans and eventually accumulate experience in the real world. And what that means is that, over time, it's impossible to know quite how or why a machine is making the decisions it's making. When it comes to A.I. powered by deep learning, à la driverless cars, "there is no way to trace the ethical tradeoffs that were made in reaching a particular conclusion," bluntly states Sheldon Fernandez, CEO of the Toronto-based startup DarwinAI.

And what data a system learns from can introduce all kinds of unexpected problems. Fernandez cites an autonomous-vehicle company that his firm has worked with: "They noticed a scenario where the color in the sky made the car edge rightward when it should have been going straight. It didn't make sense. But then they realized that they had done a lot of training in the Nevada desert, and that they were training the car to make right turns at a time of day when the sky was that color. The computer said, 'If I see this tint of sky, that's my influencer to start turning this direction.'"

More ethically complicated are scenarios in which, say, an algorithm used for credit underwriting begins profiling applicants on the basis of race or gender, because those factors correlate with some other variable. Douglas Merrill, a former Google CIO who's now CEO of ZestFinance, which makes machine-learning software tools for the financial industry, recalls a client whose algorithm noticed that credit risk increased with the amount of mileage applicants had on their cars. It also noticed that residents of a particular state were higher risks.

"Both of those signals make a certain amount of sense," Merrill says—but "when you put the two together, it turned out to be an incredibly high indicator of being African American. If the client had implemented that system, it would have been discriminating against a whole racial group."

Merrill has made A.I. transparency ZestFinance's calling card, but ultimately he thinks the government will have to step in. "Machine learning must be regulated. It is unreasonable—and unacceptable and unimaginable—that the people who have their hands on the things that have the hands on the rudders of our lives don't have a legal framework in which they must operate."

Consider one basic question: Should driverless vehicles protect their occupants above all else, even a jaywalker? To Heck, the answer is clear: "You shouldn't kill the interior occupant over an exterior person," he says. "But you should be able to accept damage to the car in order to protect the life of someone outside of it. You don't want egotistical vehicles." That's common sense, but it's still engineered software deciding whose lives matter more.

That said, Heck, ever the philosopher, sees a moral imperative to have these debates—while not slowing down the march of technology. "We kill 1.2 million people globally every year in car accidents," he says. "Any delay we put on [automotive] autonomy is killing people." All the more reason for the industry to start thinking through these issues—now.

TOM FOSTER is an Inc. editor-at-large.

mem-chanical • adjective
Describes a "crossbred keyboard with both membranes and a spring," which gives your typing experience "the clickiness of mechanical switches with the affordability of rubber [keys]." For those who like Excel with a Hungry Hungry Hippos vibe. Source: PC Perspective

next-generation access • noun
A "Zero Trust" approach to computer network security that deploys "multifactor authentication," "correlation between accesses and users," "machine learning," and "single sign-on" technology. Shorter version: Don't plug in that thumb drive you found on the street. Source: Forrester

twaggle • noun
A cross between "toggle" and "action," twaggle is "theory and practice, practice and theory, concept and action, thinking and doing, doing and thinking, measuring and learning … all at the same time." I assume they mean "twaddle." Source: The Marketing Book

accessomorphosis • noun
Per a famed designer: "The point at which an accessory can transform into a garment." Coincidentally, also the point at which my smile can transform into a guffaw. Source: WWD

Illustrations: MICHAEL PARKIN (4)
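The proxy effect Merrill describes—two individually plausible credit signals that, taken together, become a strong stand-in for race—can be sketched with a few lines of Bayes' rule. The numbers below are entirely made up for illustration; they are not ZestFinance's data or method, just a toy showing how conditioning on two modest signals at once can sharpen an inference about a protected attribute far beyond what either signal implies alone.

```python
# Toy illustration with hypothetical numbers: how two mild signals
# can combine into a strong proxy for a protected group.

def posterior(prior, likelihood_yes, likelihood_no):
    """P(group | evidence) by Bayes' rule for a binary attribute."""
    num = prior * likelihood_yes
    return num / (num + (1 - prior) * likelihood_no)

PRIOR = 0.30                # assumed share of the group among applicants
P_MILEAGE = (0.85, 0.30)    # P(high mileage | in group), P(high mileage | not)
P_STATE = (0.85, 0.30)      # P(lives in the state | in group), P(... | not)

# Each feature alone is a modest signal...
single = posterior(PRIOR, P_MILEAGE[0], P_MILEAGE[1])

# ...but conditioning on both at once (assuming the features are
# independent within each group) multiplies the likelihoods and
# sharpens the inference dramatically.
both = posterior(PRIOR, P_MILEAGE[0] * P_STATE[0], P_MILEAGE[1] * P_STATE[1])

print(f"P(group | one feature)   = {single:.2f}")
print(f"P(group | both features) = {both:.2f}")
```

With these invented rates, one feature nudges the estimate only moderately above the prior, while the pair pushes it much higher—which is why an underwriting model can discriminate even when no single input looks objectionable on its own.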