iNAGO founder Ron Di Carlantonio, a Tokyo- and Toronto-based computer scientist, saw an opportunity to extend his company's intelligent assistant platform, a context-aware automated dialogue management system, to the smart factory by using it to make industrial equipment such as CNC machines, presses, conveyors and industrial robots talk.
He spoke with Machine Design in a three-part video series about the unique parameters of the platform his company developed. Called Netpeople, the patent-pending intelligent assistant technology enables human-like conversation and interaction and, in combination with other technologies, delivers context-aware natural language understanding.
Di Carlantonio said his focus is to make computers more humanlike and that iNAGO focuses on B2B. “Netpeople is an assistant platform that allows other companies to create intelligent assistants for their products, in hopes that, rather than us getting to one specific set of consumers, everybody can work together to create this kind of humanlike computer for everybody.”
To simplify the technology’s purpose and scope for the layperson, Di Carlantonio will have you conjure up memories of KITT, the talking bulletproof car popularized in the television series “Knight Rider.”
In Part 1 of this series, Di Carlantonio discusses his work on a government-funded initiative that called upon OEMs to build an all-Canadian zero-emissions concept vehicle. iNAGO worked alongside companies such as Geotab, Denso, Aisin, ABC Technologies and Vehiqilla to develop a cockpit for the vehicle, named Project Arrow.
“We created this open platform for everybody to work on, and so that we could create these new innovations,” he said.
iNAGO’s Netpeople platform goes beyond speech recognition, Di Carlantonio explained. Companies like Google, Nuance and Cerence have that covered. iNAGO’s technology enhances the communication experience by responding according to the context of the car.
“What we do is, after the speech has been recognized, we are analyzing the context, understanding and then determining what the right response is,” he explained. “And then we provide tools to allow anybody—non-programmers—to be able to create that knowledge and create that experience.”
Consider, for example, the limitations of asking Google or Alexa follow-up questions. Without understanding the context, said Di Carlantonio, these AI-based applications cannot converse on a topic beyond basic prompts that must follow a specific format.
Di Carlantonio further described the need to review the unique circumstance or environment: “When we say context-aware, we mean that when interpreting what a person is saying, we need to understand all of the things we talked about—the conversational context and the context around our environment. If you were in a car, and you’re going 150 kilometers an hour, and the person next to you says, ‘What are you doing?’
“Well, that has a very specific meaning because of the context, not just because of the words,” he continued. “What we’ve tried to do is develop technologies that will understand the context and help better bridge that gap when conversing with a human.”
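The idea Di Carlantonio describes can be sketched in a few lines of code: the same utterance yields different responses depending on the vehicle's state. This is a minimal illustrative sketch, not iNAGO's actual Netpeople API; every name and threshold below is a made-up assumption for the example.

```python
# Hypothetical sketch of context-aware response selection. The class and
# function names, and the 20 km/h over-limit threshold, are illustrative
# assumptions only -- they are not part of any real Netpeople interface.

from dataclasses import dataclass


@dataclass
class VehicleContext:
    """A tiny slice of driving context available to the assistant."""
    speed_kph: float
    speed_limit_kph: float


def interpret(utterance: str, ctx: VehicleContext) -> str:
    """Choose a response from both the words and the driving context."""
    if utterance.lower() == "what are you doing?":
        if ctx.speed_kph > ctx.speed_limit_kph + 20:
            # Far over the limit: the question reads as alarm, not small talk.
            return "Slowing down -- we're well over the speed limit."
        return "Just cruising along at a comfortable pace."
    return "Sorry, I didn't catch that."


# The identical sentence gets two different interpretations.
print(interpret("What are you doing?", VehicleContext(150, 100)))
print(interpret("What are you doing?", VehicleContext(60, 100)))
```

The point of the sketch is that interpretation keys on state outside the transcript; a speech recognizer alone, which sees only the words, cannot make this distinction.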
For more coverage of emergent technologies in the manufacturing space, be sure to check out the Nov./Dec. issue of Machine Design, out now.