IBM launches experimental platform to embed Watson into any device
IBM's Watson project has been a leader in cognitive computing, enabling machines to understand the world much as humans do.
Today the company is unveiling the experimental release of Project Intu, a new, system-agnostic platform designed to let developers embed Watson functions into a range of end-user device form factors, offering a next-generation architecture for building cognitive-enabled experiences.
Using Project Intu, developers can integrate Watson services, such as Conversation, Language and Visual Recognition, with the capabilities of the device itself so that the device, in essence, acts out the interaction with the user. Instead of programming each individual movement of a device or avatar, a developer can use Project Intu to combine movements appropriate to a specific task, like assisting a customer in a retail setting or greeting a visitor in a hotel, in a way that feels natural to the end user.
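As a rough illustration of the pattern described here, the sketch below maps a high-level conversational intent to a bundle of coordinated device movements, rather than scripting each movement by hand. All names are hypothetical for illustration; this is not the Project Intu or Watson Developer Cloud API.

```python
# Hypothetical sketch of the orchestration idea behind Project Intu:
# a developer maps a high-level intent (as a conversation service might
# return it) to a pre-composed bundle of device behaviors, instead of
# programming each movement individually. Names are illustrative only.

GESTURE_BUNDLES = {
    "greet_visitor": ["face_user", "wave", "smile"],
    "assist_customer": ["face_user", "point_to_shelf", "nod"],
}

def act_out(intent):
    """Return the ordered list of device movements for a given intent,
    falling back to an idle posture for intents with no bundle."""
    return GESTURE_BUNDLES.get(intent, ["idle"])

# A single recognized intent expands into several coordinated movements.
print(act_out("greet_visitor"))   # → ['face_user', 'wave', 'smile']
print(act_out("unknown_intent"))  # → ['idle']
```

The design choice this illustrates is the one the article describes: the developer works at the level of tasks ("greet a visitor"), and the platform resolves them into device-appropriate behavior.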
"IBM is taking cognitive technology beyond a physical technology interface like a smartphone or a robot toward an even more natural form of human and machine interaction," says Rob High, IBM Fellow, VP and CTO of IBM Watson. "Project Intu allows users to build embodied systems that reason, learn and interact with humans to create a presence with the people that use them -- these cognitive-enabled avatars and devices could transform industries like retail, elder care, and industrial and social robotics."
The announcement came as part of IBM's inaugural Watson Developer Conference. Project Intu, in its experimental form, is now accessible via the Watson Developer Cloud and also available on Intu Gateway and GitHub.