Page 32 - Harvard Business Review (November-December, 2017)

SPOTLIGHT

How Does Augmented Reality Work?

Augmented reality starts with a camera-equipped device—such as a smartphone, a tablet, or smart glasses—loaded with AR software. When a user points the device and looks at an object, the software recognizes it through computer vision technology, which analyzes the video stream.

The device then downloads information about the object from the cloud, in much the same way that a web browser loads a page via a URL. A fundamental difference is that the AR information is presented in a 3-D “experience” superimposed on the object rather than in a 2-D page on a screen. What the user sees, then, is part real and part digital.

AR can provide a view of the real-time data flowing from products and allow users to control them by touchscreen, voice, or gesture. For example, a user might touch a stop button on the digital graphic overlay within an AR experience—or simply say the word “stop”—to send a command via the cloud to a product. An operator using an AR headset to interact with an industrial robot might see superimposed data about the robot’s performance and gain access to its controls.

As the user moves, the size and orientation of the AR display automatically adjust to the shifting context. New graphical or text information comes into view while other information passes out of view. In industrial settings, users in different roles, such as a machine operator and a maintenance technician, can look at the same object but be presented with different AR experiences that are tailored to their needs.

A 3-D digital model that resides in the cloud—the object’s “digital twin”—serves as the bridge between the smart object and the AR. This model is created either by using computer-aided design, usually during product development, or by using technology that digitizes physical objects. The twin then collects information from the product, business systems, and external sources to reflect the product’s current reality. It is the vehicle through which the AR software accurately places and scales up-to-date information on the object.

HBR Reprint R1706B
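
To make the recognition-and-lookup step concrete, the sketch below mimics, in plain Python, the flow described above: a stand-in for computer vision matches what the camera sees against a small object database, and the device then fetches that object's AR content from the cloud much as a browser loads a page from a URL. The marker IDs, the database, and the endpoint are illustrative assumptions, not part of any particular AR platform.

```python
import json
import urllib.request

# Hypothetical catalog mapping recognized marker IDs to object records in the cloud.
OBJECT_DATABASE = {
    "marker-4711": {"object_id": "robot-001",
                    "content_url": "https://example.com/ar/robot-001"},
}

def recognize(frame_markers):
    """Stand-in for computer vision: in practice the AR software analyzes the
    video stream and detects a marker or matches the object's shape."""
    for marker_id in frame_markers:
        if marker_id in OBJECT_DATABASE:
            return OBJECT_DATABASE[marker_id]
    return None

def fetch_ar_experience(record):
    """Download the object's AR content from the cloud, analogous to a browser
    loading a page via a URL (the endpoint here is only a placeholder)."""
    with urllib.request.urlopen(record["content_url"]) as resp:
        return json.loads(resp.read())

record = recognize(["marker-9999", "marker-4711"])
print(record)  # {'object_id': 'robot-001', 'content_url': 'https://example.com/ar/robot-001'}
# experience = fetch_ar_experience(record)  # would load the 3-D "experience" to superimpose
```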
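
The control path works the same way whether the “stop” comes from a touched overlay button or a spoken word: the device packages the command and sends it to the cloud, which relays it to the product. The sketch below shows that packaging step under assumed names; the endpoint is a placeholder, not a real service.

```python
import json
import urllib.request

CLOUD_COMMAND_ENDPOINT = "https://example.com/ar/commands"  # placeholder endpoint

def send_command(object_id: str, command: str, source: str) -> urllib.request.Request:
    """Package a user interaction (touch, voice, or gesture) as a command for the
    cloud to relay to the physical product. Returns the prepared request; a real
    client would transmit it and await an acknowledgment."""
    payload = json.dumps({"object_id": object_id, "command": command, "source": source})
    return urllib.request.Request(
        CLOUD_COMMAND_ENDPOINT,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The same command can originate from a touched overlay button or a spoken word.
req_touch = send_command("robot-001", "stop", source="touchscreen")
req_voice = send_command("robot-001", "stop", source="voice")
```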
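
The automatic resizing as the user moves can be reduced to a simple geometric relationship: under a pinhole-camera approximation, an overlay anchored to the object appears larger on screen as the distance to the object shrinks. Real AR engines track the device's full position and orientation, but this toy calculation conveys the idea.

```python
def overlay_scale(focal_length_px: float, label_height_m: float, distance_m: float) -> float:
    """Simplified pinhole-camera relation: the on-screen height (in pixels) of a
    label anchored to the object is proportional to its physical size and
    inversely proportional to the user's distance from it."""
    return focal_length_px * label_height_m / distance_m

# A 0.2 m label seen through a camera with an 800-pixel focal length:
for d in (4.0, 2.0, 1.0):  # the user walks toward the machine
    print(d, round(overlay_scale(800, 0.2, d)))  # 40, 80, 160 pixels
```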
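
The digital twin, finally, is essentially a cloud-hosted record that pairs the object's 3-D model with its live state. The sketch below shows one way such a record might aggregate sensor readings, business data, and external sources and hand the AR software a payload to place on the object; the class and field names are hypothetical, not drawn from any specific platform.

```python
from dataclasses import dataclass, field
from typing import Any, Dict
import time

@dataclass
class SensorReading:
    """One timestamped measurement streamed from the physical object."""
    name: str
    value: float
    unit: str
    timestamp: float = field(default_factory=time.time)

@dataclass
class DigitalTwin:
    """Cloud-side stand-in for the object's digital twin: 3-D geometry plus live state."""
    object_id: str
    cad_model_url: str                                            # geometry from CAD or a 3-D scan
    sensor_state: Dict[str, SensorReading] = field(default_factory=dict)
    business_data: Dict[str, Any] = field(default_factory=dict)   # e.g. service history, work orders
    external_data: Dict[str, Any] = field(default_factory=dict)   # e.g. ambient conditions

    def ingest(self, reading: SensorReading) -> None:
        """Data from the product streams in and keeps the twin current."""
        self.sensor_state[reading.name] = reading

    def overlay_payload(self) -> Dict[str, Any]:
        """What the AR software retrieves and superimposes on the user's view."""
        return {
            "model": self.cad_model_url,
            "readouts": {k: f"{r.value} {r.unit}" for k, r in self.sensor_state.items()},
            "instructions": self.business_data.get("work_instructions", []),
        }

# Mirror the readouts shown in the illustration (units are assumed for the example).
twin = DigitalTwin("robot-001", "https://example.com/models/robot-001.glb")
twin.ingest(SensorReading("temperature", 85, "°"))
twin.ingest(SensorReading("weight", 16, "kg"))
print(twin.overlay_payload()["readouts"])  # {'temperature': '85 °', 'weight': '16 kg'}
```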

[Figure: an AR-enabled device views an INDUSTRIAL ROBOT. The AR EXPERIENCE overlays live readouts (TEMPERATURE 85°, WEIGHT 16KG) and an ASSEMBLY V2 guide. COMPUTER VISION on the device connects to the cloud-hosted DIGITAL TWIN, which combines SENSOR DATA with ANALYTICS; the labeled flows are CONNECT, VISUALIZE OR INSTRUCT/GUIDE, INTERACT, and CONTROL. Illustration: Clint Ford]

1. A device with AR software analyzes a video stream and recognizes an object either by identifying a marker on it or matching its shape with an object in a database.
2. The software connects with an interactive, 3-D digital facsimile of the object in the cloud, called a “digital twin.”
3. Data from the target streams to the twin and may be combined there with data streaming in from business systems and external sources.
4. The software retrieves information from the twin, such as performance data about the object or interaction instructions, and superimposes it on the user’s view.
5. The user can send commands to the cloud using a virtual touchscreen; voice commands; or gestures (which require enabled headsets or smart glasses).
6. Control commands, such as “stop,” are received by the cloud and sent on to the object.