A couple of years ago, as Brian Brackeen was preparing to pitch his facial recognition software to a potential customer as a convenient, secure alternative to passwords, the software stopped working. Panicked, he tried adjusting the room's lighting, then the Wi-Fi connection, before he realized the problem was his face. Brackeen is black, but like most facial recognition developers, he'd trained his algorithms with a set of mostly white faces. He got a white, blond colleague to pose for the demo, and they closed the deal. It was a Pyrrhic victory, he says: "It was like having your own child not recognize you."

At Kairos AR Inc., his 40-person facial recognition company in Miami, Brackeen says he's improved the software by adding more black and brown faces to his image sets, but the results are still imperfect. For years the same problem has bedeviled companies including Microsoft, IBM, and Amazon and their growing range of customers for similar services. Facial recognition is being used to help India's government find missing children, and British news outlets spot celebrities at royal weddings. More controversially, it's being used in a growing number of contexts by law enforcement agencies, which are often less than forthcoming about what they're using it for and whether they're doing enough about potential pitfalls. Brackeen believes the problem of racial bias is serious enough that law enforcement shouldn't use facial recognition at all.

Microsoft, IBM, and China's Face++ misidentified darker-skinned women as often as 35 percent of the time and darker-skinned men 12 percent of the time, according to a report published by MIT researchers earlier this year. Such software can see only what it's taught to see, which has been mostly white men.

In recent months, major vendors say they've diversified their training data sets to include darker-colored faces and have made strides in reducing bias. Microsoft Corp. announced on June 26 that it would release a version of its software tool Face API that now misidentifies darker-skinned women, the group for which it's most error-prone, only 1.9 percent of the time and is 100 percent accurate for other groups. International Business Machines Corp. says its Watson Visual Recognition, which is similarly at its weakest in identifying darker-skinned women, gets it wrong 3.5 percent of the time. Both IBM and Microsoft acknowledge their results haven't been independently verified and that real-world error rates could be different from those for tests using stock images. The makers of Face++ didn't respond to requests for comment.

It's Amazon.com Inc. that may have to worry most about real-world results. On June 15 a group of Amazon shareholders sent the company a letter asking it to stop marketing its Rekognition system to police departments and other government agencies until guidelines are developed to ensure the software isn't leading to civil rights violations. In another letter the following week, Amazon workers asked Chief Executive Officer Jeff Bezos to stop selling Rekognition to law enforcement agencies given "the U.S.'s increasingly inhumane treatment of refugees and immigrants." Amazon declined to comment for this story.

Government agencies have no broadly agreed-upon standards for evaluating facial recognition systems. A 2016 Georgetown University study found that almost none of the law enforcement agencies that use facial recognition require a minimum threshold for overall accuracy, let alone racial disparities. "An inaccurate system will implicate people for crimes they didn't commit and shift the burden to innocent defendants to show they are not who the system says they are," says Jennifer Lynch, senior staff attorney for the Electronic Frontier Foundation, an advocate for civil liberties online.

And the problem isn't just in the U.S. Civil rights activists in the U.K. recently obtained records regarding law enforcement's use of facial recognition. The results were terrible. For example, the South Wales Police, which used facial recognition to screen people at public events, reported more than 90 percent of matches were erroneous. The department says on its website that the use of facial recognition had been a "resounding success." It didn't respond to an interview request.

Makers of facial recognition technology, including Microsoft and IBM, have said the software continues to be a work in progress, with engineers focused on improving accuracy and transparency around how the improvements are being made. They say the technology has helped bust sex traffickers and apprehend would-be terrorists, though they've provided few details.

Andrew Ferguson, a law professor at the University of the District of Columbia and the author of The Rise of Big Data Policing, says using the powerful technology while it's still under development with scant regulation is dangerous. Law enforcement agencies have consistently botched their adoption of novel tech. "Police are beta-testing new technologies or piloting new ideas in policing without a vetting process to think through bias or how it might affect citizens' civil rights," he says.

Engineers are improving how they train algorithms as more agencies buy the software, but they may not be able to head off growing pressure for regulation. The authors of the Georgetown report call for laws governing how police departments use