Discovery
In China, Baidu and AI Nemo’s voice chatbot ‘Little Fish’ (Xiaoyu) can help users perform a range of tasks, from making calls and checking stock prices to booking movie tickets and ordering medicine. Little Fish also comes with a screen and camera, adding a layer of authentication so that payment commands can only be executed by an authorised user.
While most of the hype is centred around customer-facing applications, the B2B space appears to have large potential. An example of this is Canada’s Chata, an emerging enterprise NLUI voice chatbot that aims to give executives easy access to data without needing to know how to actually access it. Kelly Cherniwchan, CEO and co-founder of Chata, explained that many executives feel left out and unable to do their tasks because of technological barriers. The status quo is that you either bring someone in to write and run an SQL query, or you export all the data into Excel and work from there.
Read Chata’s story and what they are doing here: http://bit.ly/2wVRczW
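To make that concrete, here is a minimal sketch of the kind of translation such a tool performs, turning a plain-English question into a database query. The question patterns, table names and function are hypothetical illustrations, not Chata’s actual implementation, which would rely on trained NLU models rather than hand-written rules.

import re

# Hypothetical question patterns mapped to SQL templates.
PATTERNS = [
    (re.compile(r"total sales (?:in|for) (\w+)", re.I),
     "SELECT SUM(amount) FROM sales WHERE region = ?"),
    (re.compile(r"how many orders (?:in|for) (\w+)", re.I),
     "SELECT COUNT(*) FROM orders WHERE region = ?"),
]

def question_to_sql(question):
    """Translate a plain-English question into (sql, parameters), or None."""
    for pattern, sql in PATTERNS:
        match = pattern.search(question)
        if match:
            return sql, match.groups()
    return None

print(question_to_sql("What were total sales in Alberta?"))
# ('SELECT SUM(amount) FROM sales WHERE region = ?', ('Alberta',))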
We are also likely to see chatbots become integrated with wearables; the Apple Watch already has Siri. Users could tell a wearable to remind them to walk 10,000 steps, or tell it what times to buzz as a reminder to take their medication. As smart homes and the Internet of Things become more entrenched, users could tell their smartphone assistant to switch the TV to a certain channel, or to warm up the oven when a location sensor detects that they are on their way home.
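As a rough illustration of that last idea, the sketch below wires a location update to a smart-oven action. The coordinates, radius and device interface are all assumptions made for the example; a real assistant would go through the vendor’s smart-home platform.

from math import hypot

HOME = (53.5461, -113.4938)   # assumed home coordinates (lat, lon)
TRIGGER_RADIUS = 0.02         # crude radius in degrees, fine for a sketch

class Oven:
    def preheat(self, temperature_c):
        print(f"Preheating oven to {temperature_c} C")

def on_location_update(lat, lon, oven):
    # Fire the rule once the phone reports a position close to home.
    if hypot(lat - HOME[0], lon - HOME[1]) < TRIGGER_RADIUS:
        oven.preheat(temperature_c=180)

on_location_update(53.5500, -113.4900, Oven())  # close enough: oven preheats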
But back to Natural Language Understanding...
Why do we need NLU?
Natural language understanding (NLU) is a subtopic of natural language processing (NLP) in artificial intelligence (AI) that deals with machine reading comprehension. Until now, NLU has been considered an AI-hard problem that remains fundamentally unsolved. An NLP system can easily look up the meaning of a word in a dictionary; however, meaning only makes sense in context, and this is where NLP needs NLU.
Read more about ‘Why Chatbots Won’t Survive Without NLU’ here: http://bit.ly/2izLj9v
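A quick way to see the problem: a dictionary lookup returns every sense of a word, and only context can pick the right one. The short sketch below uses NLTK’s WordNet interface (one possible tool; the choice is an assumption) to list the senses of ‘book’, which a context-free lookup cannot choose between.

from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

# A plain dictionary lookup returns every sense of the word;
# on its own it cannot tell "book a flight" from "read a book".
for synset in wn.synsets('book'):
    print(synset.name(), '-', synset.definition())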
So here is an experiment: try telling Siri, “Call Beth... no, John,” and wait to see what happens. You will probably leave Siri confused. That is because NLU cannot be solved without machines understanding the meaning of words in context. The next generation of chatbots must (and will) include this deeper level of understanding.
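For a sense of what handling that correction involves, here is a minimal, hypothetical parser that keeps the ‘call’ intent but lets a trailing “no, ...” override the original contact. The rules and names are illustrative assumptions, not how Siri actually works.

import re

CONTACTS = {"Beth", "John", "Maria"}  # hypothetical contact list

def parse_call_command(utterance):
    """Return the contact to call, honouring a trailing correction like 'no, John'."""
    # Collect every known contact name mentioned, in order of appearance.
    mentioned = [name for name in re.findall(r"[A-Z][a-z]+", utterance)
                 if name in CONTACTS]
    if not mentioned or not utterance.lower().startswith("call"):
        return None
    # A correction marker means the last name wins, not the first.
    if re.search(r"\bno,?\s", utterance.lower()):
        return mentioned[-1]
    return mentioned[0]

print(parse_call_command("Call Beth... no, John"))  # John
print(parse_call_command("Call Beth"))              # Beth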
Some people may say that we need to learn how to speak to Siri, OK Google or whatever the next