DEPARTMENT OF “SO, YOU’RE ‘LEANING NO’?”
“You couldn’t give me enough needles to poke my eyes out.”
—Gary Erickson, the founder of Clif Bar, reacting to the prospect of taking his company public.
Stefan Heck, the CEO of Bay Area–based Nauto, is the rare engineer who also has a background in philosophy—in his case, a PhD. Heck’s company works with commercial vehicle fleets to install computer-vision and A.I. equipment that studies road conditions and driver behavior. It then sells insights from that data about human driving patterns to autonomous-vehicle companies. Essentially, Nauto’s data helps shape how driverless cars behave on the road—or, put more broadly, how machines governed by artificial intelligence make life-or-death decisions.

This is where the background in philosophy comes in handy. Heck spends his days trying to make roads safe. But the safest decisions don’t always conform to simple rules. To take a random example: Nauto’s data shows that drivers tend to exceed the posted speed limit by about 15 percent—and that it’s safer at times for drivers to go with the flow of that traffic than to follow the speed limit. “The data is unequivocal,” he says. “If you follow the letter of the law, you become a bottleneck. Lots of people pass you, and that’s extremely risky and can increase the fatality rate.”

Much chatter about A.I. focuses on fears that super-smart robots will one day kill us all, or at least take all of our jobs. But the A.I. that already surrounds us must weigh multiple risks and make tough tradeoffs every time it encounters something new. That’s why academics are increasingly grappling with the ethical decisions A.I. will face. But, among the entrepreneurs shaping the future of A.I., it’s often a topic to belittle or avoid. “I’m a unique specimen in the debate,” Heck says. He shouldn’t be. As robot brains increasingly drive decisions in industries as diverse as health care, law enforcement, and banking, whose ethics should they follow?

Humans live by a system of laws and mores that guide what we should and shouldn’t do. Some are obvious: Don’t kill, don’t steal, don’t lie. But some are on-the-fly judgment calls—and some of these present no good choice. Consider the classic philosophy riddle known as the “trolley problem.” You are the conductor of a runaway trolley car. Ahead of you is a fork in the track. You must choose between running over, say, five people on one side and one person on the other. It’s easy enough to decide to kill the fewest people possible. But: What if the five people are all wearing prison jumpsuits, while the one is wearing a graduation cap and gown? What if the single person is your child?

Consider how such dilemmas play out with driverless cars, which have attracted an estimated $100 billion in investment globally and encompass giant, established companies such as Ford, GM, and Google; giant no-longer-startups like Didi Chuxing, Lyft, and Uber; and a vast ecosystem of startups like Heck’s that create everything from mapping software to cameras, ridesharing services, and data applications. Or consider those dilemmas more than some founders in this sector do. “There’s no right answer to these problems—they’re brain teasers designed to generate discussion around morality,” a founder of a company that makes autonomous-vehicle software told me. “Humans have a hard time figuring out the answers to these problems, so why would we expect that we could encode them?” Besides, this founder contends, “no one has ever been in these situations on the road. The actual rate of occurrence is vanishingly low.”

That’s a common viewpoint among industry executives, says Edmond Awad, a postdoctoral associate at MIT Media Lab who in 2016 helped create a website called the Moral Machine, which proposed millions of driverless-car problem scenarios and asked users to decide what to do. “Most of them are missing the point of the trolley problem,” he says. “The fact that it is abstract is the point: This is how we do science. If all you focus on is likely scenarios, you don’t learn anything about different scenarios.”

He poses a trolley-problem scenario to illustrate. “Say a car is driving in the right lane, and there’s a truck in the lane to the left and a bicyclist just to the right. The car might edge closer to the truck to make sure the cyclist is safer, but that would put more risk on the occupant of the car. Or it could do the

A.I. will add as much as $15.7 trillion to the global economy by 2030. North America’s share will be $3.7 trillion, which will boost the continent’s GDP by 14.5%.
Source: PricewaterhouseCoopers
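
Awad’s lane-positioning example is, at bottom, a weighted tradeoff between one road user’s risk and another’s. Purely as an illustration of that tradeoff, here is a minimal, hypothetical Python sketch; the risk function, weights, and lane width are invented for clarity and are not drawn from Nauto, the Moral Machine, or any real planning system.

    # Toy sketch of the tradeoff Awad describes: a car choosing how far to sit
    # between a truck on its left and a cyclist on its right. Everything here
    # (the risk model, the weights, the lane width) is hypothetical.

    def collision_risk(gap_m: float) -> float:
        """Crude stand-in for a risk model: risk falls as the gap to a hazard grows."""
        return 1.0 / (gap_m + 0.1)

    def choose_lateral_offset(lane_width_m: float = 3.5,
                              cyclist_weight: float = 2.0,
                              occupant_weight: float = 1.0) -> float:
        """Sweep candidate positions across the lane and return the offset
        (meters from the truck side) that minimizes the weighted sum of risks."""
        best_offset, best_cost = 0.0, float("inf")
        steps = 100
        for i in range(steps + 1):
            offset = lane_width_m * i / steps       # 0 = hugging the truck
            truck_gap = offset                      # space protecting the car's occupant
            cyclist_gap = lane_width_m - offset     # space protecting the cyclist
            cost = (occupant_weight * collision_risk(truck_gap)
                    + cyclist_weight * collision_risk(cyclist_gap))
            if cost < best_cost:
                best_offset, best_cost = offset, cost
        return best_offset

    if __name__ == "__main__":
        # Weighting the cyclist's risk more heavily pushes the car toward the truck.
        print(f"Chosen offset from the truck side: {choose_lateral_offset():.2f} m")

Flip the weights so the occupant counts for more, and the same code steers the car away from the truck and closer to the cyclist, which is precisely the kind of value judgment someone has to decide how to encode.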