Vayyar in the News 1
Robot X-Ray Vision is Available Now, Here’s What to Do with It
WalabotDIY Sees Through Cement, Drywall and Other Materials, Taking the Guesswork out of Home Repair and Renovation Projects
By Isaac Maw posted on January 11, 2018
Robots can already have superhuman strength, speed, and precision, but what if they could be equipped with X-ray vision, too? I spoke to Ian Podkamien, Director of Business Development at Vayyar Imaging, about his company’s unique product: a radio frequency imaging sensor with potential robotics and machine vision applications. “Our product has applications very relevant to safety and functionality for Industry 4.0 environments,” said Podkamien. “The trend today is toward more and more ‘smart,’ autonomous robots. The limiting factor for how ‘smart’ robots can get is how much data they can collect on their environment.”
How it Works: Radio Frequency Imaging
Even if you aren’t familiar with this technology, you know the basic principle behind radar: a signal propagates in a cone from the emitter, and reflections return to the antenna. Vayyar has expanded on the concept by combining a high-frequency emitter with a large antenna array that can be arranged in multiple configurations. On this sensor, the array collects data along the horizontal and vertical axes and, inherently, along the depth axis, creating a real-time, high-frame-rate representation of everything happening within the sensor’s range. The sensor’s chip controls frequency and other parameters and manages the signals from the many antennas.
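The description above, a swept radio frequency plus an array of antennas whose reflections are combined into a live 3D picture, corresponds to a standard reconstruction technique known as delay-and-sum back-projection. The sketch below is a generic NumPy illustration of that principle only, not Vayyar's actual processing chain: the 4x4 array layout, the frequency sweep, the monostatic simplification (each antenna transmits and receives its own signal), and the single point reflector are all assumptions invented for the example.

```python
# Generic delay-and-sum (back-projection) imaging sketch for a small planar
# antenna array and a stepped-frequency sweep. Illustrative only; all numbers
# below are assumptions, not Vayyar hardware parameters.
import numpy as np

C = 3e8  # speed of light, m/s

# Hypothetical 4 x 4 planar array in the z = 0 plane, 2 cm element spacing.
xs = np.arange(4) * 0.02
ys = np.arange(4) * 0.02
ant = np.array([(x, y, 0.0) for x in xs for y in ys])          # (16, 3)

# Hypothetical stepped-frequency sweep: 64 tones from 6 to 9 GHz.
freqs = np.linspace(6e9, 9e9, 64)                              # (64,)

# Simulate the echo of one point reflector 30 cm in front of the array,
# assuming each antenna transmits and receives (monostatic simplification).
# Round-trip phase at antenna i and frequency f: -2*pi*f*(2*range_i)/c.
target = np.array([0.05, 0.03, 0.30])
rng = np.linalg.norm(ant - target, axis=1)                     # (16,)
echo = np.exp(-2j * np.pi * freqs[None, :] * 2 * rng[:, None] / C)   # (16, 64)

# Back-projection: for every voxel, undo the phase expected for a reflector
# at that location and sum coherently; reflectors show up as bright voxels.
gx = np.linspace(-0.05, 0.15, 21)   # 1 cm grid in x
gy = np.linspace(-0.05, 0.15, 21)   # 1 cm grid in y
gz = np.linspace(0.10, 0.50, 41)    # 1 cm grid in depth
image = np.zeros((gx.size, gy.size, gz.size))
for ix, x in enumerate(gx):
    for iy, y in enumerate(gy):
        for iz, z in enumerate(gz):
            r = np.linalg.norm(ant - np.array([x, y, z]), axis=1)
            steer = np.exp(2j * np.pi * freqs[None, :] * 2 * r[:, None] / C)
            image[ix, iy, iz] = np.abs(np.sum(echo * steer))

ix, iy, iz = np.unravel_index(np.argmax(image), image.shape)
print(f"brightest voxel at x={gx[ix]:.2f} m, y={gy[iy]:.2f} m, z={gz[iz]:.2f} m")
```

Running the sketch prints the grid cell closest to the simulated reflector. With only sixteen elements over a six-centimeter aperture the cross-range resolution is coarse, which is exactly why a commercial sensor packs many more antennas onto the chip and lets dedicated silicon sweep the frequency and manage the returns.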
These factors enable the sensor to create a high-accuracy image, amounting to a new type of camera that can see through solid objects and is unaffected by lighting or visibility conditions. It also protects privacy, since the sensor collects no visible light. The sensor is low-power and can be supplied over USB. While the device is available in diverse form factors, most fall between the size of a business card and a smartphone, says Podkamien.
Radio Frequency Imaging: Industrial Applications
As Podkamien said, the limit to what a robot can do autonomously is determined by the input data it can collect about its surroundings. Historically, the two main senses available to a robot have been touch, sensing applied forces and torques; and vision, via cameras, proximity sensors and 3D point imaging. Radio frequency imaging adds a new sense, and the potential advantages for functionality and safety are many: from maintaining a safe distance between humans and robots in motion, to locating objects, to 3D scanning with access to interior surfaces not visible to laser or optical scanners. Most sensors