Research the field of online commerce that is about to change (imagine yourself digitally squeezing a baguette from the bakery and comparing its softness to a loaf of rye). Think about computer games that allow you to hold objects and feel their texture; TV shows that immerse you not only in images and sound, but also sensation; or researchers being able to “enlarge” microscopic objects such as bacteria and touch them in order to better understand them. “What I am most looking forward to is seeing how a system like this can help in the medical world, whether it be for rehabilitation, laparoscopic surgery or training doctors,” says Zelnik-Manor. “As far as I’m concerned, that is the ultimate vision. But if we’re talking about a more practical short-term realization of the technology, it’s in gaming – giving users a multisensory experience.”

“We translate a picture into an object, then assign it physical properties”

This fascinating collaboration was born of Wolf and Zelnik-Manor’s academic curiosity. “Alon and I had a talk and decided to look for things we could work on together,” recalls Zelnik-Manor. “We had already collaborated in the past: for example, we co-advised a PhD student who researched what grabs the attention of owls, then built an algorithm for a robot whose behavior is inspired by how they navigate. Alon works in robotics and I work in computer vision, so the research incorporated both fields.

“When we thought about what to do next, we came up with digitizing touch. Alon’s specialization is understanding how people perceive things through touch; but touch is coupled with what we see, and digitizing it actually requires making it possible for people to touch things they can see virtually. In other words, the first stage is photographing the world and understanding what we see, which is my part; and the next stage is translating this into a device that enables touch, which is Alon’s part. We took Yair as our PhD student for this research.”

Today, if we want to shop online for a couch, for instance, our ability to assess its properties remotely is relatively limited. Yes, we can ask for pictures and a video. But what if we could sit at home and look at the computer screen, which would give us not only visual information about the couch, but also let us feel the texture of its upholstery? “To make this possible, the software must have a 3D model of the couch,” explains Zelnik-Manor. “My specialization is converting pictures into 3D models, and Alon and Yair worked on developing the touch devices. In research, it’s easiest to study each component separately at the highest possible level – and only later, at a more advanced stage, bring the components together.”

How do we even start planning a device that imitates touch? “Our skin has receptors that are sensitive to pressure, vibrations, shearing forces and temperature,” explains Prof. Wolf. “Our fingertips have an especially high concentration of these receptors. Lihi was in charge of teaching the software the properties of various surfaces, and we were in charge of developing a system that conveys sensation based on the information from the picture.”

Touch, warns Wolf, is a very ‘relative’ sense: “Its resolution is much coarser and more subjective than that of sight, for instance.” Therefore, he explains, expectations must be set accordingly. “If someone uses the device to try out four different comforters, the system still cannot provide exactly the same sensation that actually touching them would – but it can provide a precise ranking of softness, for example, so the shopper can tell which blanket is softest to the touch.”

[Figure: HUGO, sensory feedback system. 1. Wearable version; 2. Mouse and computer tabletop version; 3. CAD model of the tabletop version showing the system of axes in the device; 4. User interface with avatar of the user’s hand. In this example, the interface simulates an e-commerce website.]
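Wolf’s distinction between absolute sensation and relative ranking can be sketched in a few lines of code: even if a system’s softness estimates are noisy and subjective, their ordering can still be reliable. The sketch below is purely illustrative – the comforter names, the compliance scores, and the `rank_by_softness` helper are invented for this example and are not part of the HUGO system.

```python
def rank_by_softness(estimates):
    """Return item names ordered from softest to firmest.

    `estimates` maps item name -> estimated compliance (higher = softer).
    The absolute numbers are noisy stand-ins; only their order matters,
    which is exactly the kind of output Wolf says the system can deliver.
    """
    return sorted(estimates, key=estimates.get, reverse=True)

# Hypothetical compliance scores on an arbitrary scale.
comforters = {
    "down": 0.92,
    "wool": 0.55,
    "polyester": 0.70,
    "cotton": 0.48,
}

print(rank_by_softness(comforters))
# → ['down', 'polyester', 'wool', 'cotton']
```

Because only the ordering is reported, small errors in the estimated values leave the shopper’s answer – “which blanket is softest?” – unchanged, as long as the errors do not swap two items’ ranks.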
12 | MEgazine | Faculty of Mechanical Engineering