INDUSTRIAL FINISHING: ROBOTICS
Sensor fusion and its ability to differentiate components of a single object are essential to identifying appropriate surfaces to be coated.
Source: https://cs.stanford.edu/~kaichun/partnet/images/teaser.png

six axes and able to adapt to surface features of objects – led Nordson to develop robotic coating solutions for a variety of customers. In one case study with OBUK, a manufacturer of architectural doors, there was a substantial improvement in powder transfer efficiency and color consistency. The system was operated through a convenient HMI, which allowed a skilled worker to execute an operation with a few simple parameters. At the same time, the system was limited to one specific type of shape, which ultimately means this type of installation is not as portable across a variety of manufacturers.

Sensor Fusion and 3D Perception
3D depth cameras have been available for many years, but recent substantial decreases in price have made them much more widely available. They work on principles similar to radar – firing out non-visible light (such as infrared) and analyzing the reflection patterns to evaluate the presence of objects in a target area.

While one sensor working on its own can be useful for certain operations, rendering entire objects requires incorporating data from multiple sensors (in multiple positions) at once in order to provide a consistent view of an object.

Furthermore, the view of this object doesn't need to be fixed in space. By moving an object past an array of scanners (usually on an overhead conveyor), its entire dimensions and features can be captured, and the fused sensor information can be passed to a central processing unit, which then uses an automation controller to infer the position of that object at any given point on the line. This effectively allows a robot to function as if it is "seeing" an object in real time, even if it is only seeing it through an image generated for it upstream of the actual process cell.

Process Know-how and Quality Goals
When humans perceive an object, we don't simply perceive the entire object at once but also its constituent parts. In a car, we know the difference between the steering wheel and the dashboard. On a production line, we can tell the foot of a chair from its arm. For a robot, being able to differentiate these parts is a matter of fixturing and programming, but for an autonomous robot, task-planning-based approaches or simple heuristic optimizations can be used to target different components and ensure a consistent finish.

For instance, when it comes to targeting different components, sensor fusion-based renderings are accurate enough to be matched not only with the CAD files of parts but also with subordinate CAD layers. This means that a robot with an appropriate assignment can target a specific layer of a part, while the use of CAD files also enables coordination between robots connected in the same system.

At the same time, these robots can also be provided with both goals and instructions about the parts to be processed. For instance – using a process model that can vary across operations like painting, powder coating or even sandblasting – a robot can target a coating thickness to apply, or a thickness to remove, based on the difficulty of the process or the abrasiveness of the coating. With parameters for its end effector
40 CANADIAN FINISHING & COATINGS MANUFACTURING MAY/JUNE 2021
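
The article's central idea – scanning a part once upstream and then inferring its position anywhere on the line from conveyor motion – can be sketched in a few lines. This is a minimal illustration only: the data layout, station coordinates, and encoder calibration below are assumptions for the sketch, not details from the article or any vendor's API.

```python
from dataclasses import dataclass

@dataclass
class ScannedObject:
    """A part scanned once at a fixed upstream station (hypothetical layout)."""
    object_id: str
    scan_station_mm: float    # conveyor position of the scan station
    scan_encoder_count: int   # encoder reading when the scan completed
    point_cloud: list         # fused 3D points captured in the scanner frame

MM_PER_ENCODER_COUNT = 0.5    # assumed conveyor encoder calibration

def infer_position_mm(obj: ScannedObject, current_encoder_count: int) -> float:
    """Infer where the object is on the line right now.

    The part has moved (current - scan) encoder counts since it was
    scanned, so its current line position is the scan-station position
    plus the travel distance.
    """
    travel = (current_encoder_count - obj.scan_encoder_count) * MM_PER_ENCODER_COUNT
    return obj.scan_station_mm + travel

def points_in_robot_frame(obj: ScannedObject, current_encoder_count: int,
                          robot_station_mm: float) -> list:
    """Shift the stored point cloud into a downstream robot's frame.

    The robot never re-scans the part; it works from the upstream image,
    offset by how far the part has travelled past the robot's station.
    """
    offset = infer_position_mm(obj, current_encoder_count) - robot_station_mm
    return [(x + offset, y, z) for (x, y, z) in obj.point_cloud]

# Example: a door scanned at station 0 mm with the encoder at 10,000 counts.
part = ScannedObject("door-42", 0.0, 10_000, [(0.0, 100.0, 50.0)])
# 4,000 counts later, the part has travelled 2,000 mm down the line.
print(infer_position_mm(part, 14_000))  # 2000.0
```

This is the sense in which the robot "sees" the part in real time: the geometry comes from the single upstream scan, and only a scalar conveyor offset is updated continuously.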