Cake batter doesn’t mix well with touchscreens. A connected refrigerator with a large touchscreen in the middle of the kitchen is a blessing for a busy working mom who can organize and sync her kids’ school schedules and the weekly grocery list with her smartphone and the fridge. Yet that same “smart” appliance doesn’t feel very smart with a dollop of double chocolate cupcake mix on the touchscreen to-do list.
Zero-touch user interfaces (UIs) are coming.
“As customer experiences go to a personal universe of sensor and cloud-based devices less reliant on touch input, technology product managers must act now to prepare for a near-future market where natural, ‘zero-touch’ interfaces will be pervasive,” says Roberta Cozza, Senior Director Analyst, Gartner.
The zero-touch user interface is a paradigm in which sensory channels beyond touch shift interfaces away from the screen, pointing devices and keyboards. These inputs include voice, vision, gaze, gestures, facial recognition, sound, motion, gait and other biometrics, drawing on the entire human body as an input channel. Together, they create seamless, no-touch interactions between the user, the devices and the technology.
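One way to picture this paradigm: whichever sensory channel produces the input, it resolves to the same small vocabulary of user intents, so the interface logic never depends on a screen or keyboard. The sketch below is illustrative only; the modality names, signal strings and intents are hypothetical, standing in for the output of real speech, gaze and gesture recognizers.

```python
from dataclasses import dataclass

# Hypothetical event type; a real system would receive these from
# speech, gaze or gesture recognizers.
@dataclass(frozen=True)
class InputEvent:
    modality: str   # e.g. "voice", "gaze", "gesture"
    signal: str     # recognizer output, e.g. "show list", "nod"

# Many modalities, one intent vocabulary: the application logic stays
# independent of which sensory channel produced the input.
INTENT_MAP = {
    ("voice", "show list"): "OPEN_TODO_LIST",
    ("gesture", "swipe_up"): "OPEN_TODO_LIST",
    ("voice", "next"): "NEXT_ITEM",
    ("gesture", "nod"): "NEXT_ITEM",
}

def resolve_intent(event: InputEvent) -> str:
    """Map any recognized input, regardless of modality, to an intent."""
    return INTENT_MAP.get((event.modality, event.signal), "UNKNOWN")

print(resolve_intent(InputEvent("voice", "show list")))  # OPEN_TODO_LIST
print(resolve_intent(InputEvent("gesture", "nod")))      # NEXT_ITEM
```

The point of the table-driven design is that adding a new modality (say, gaze) means adding rows, not rewriting the interface.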
Prepare for technology that learns from humans
Gartner predicts that by 2020, two billion devices will have zero-touch UIs available, including wearables and other connected home devices such as smart speakers and security cameras. Technology product managers must prepare for this reality now.
Ideally, consumers want to communicate in natural languages with people-literate technologies and eliminate much of the need for computer literacy training. The technology should learn how to interact with users, not the other way around.
A number of hearable products are integrating natural language processing or natural language understanding (NLP/NLU) technologies to provide relevant content and advice to the user. Bragi’s Dash Pro, for example, lets users navigate menus with head gestures, and hearables from Google, Sony, Samsung and Apple tap into virtual assistants so that users can initiate calls or listen to commuting information without touching their smartphones.
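A gesture-driven menu of the kind described above can be sketched as a tiny state machine. This is a minimal illustration, not any vendor's actual implementation; the gesture names ("shake" to cycle, "nod" to confirm) and menu items are assumptions for the example.

```python
class GestureMenu:
    """Hypothetical head-gesture menu for a screenless hearable."""

    def __init__(self, items):
        self.items = list(items)
        self.index = 0  # currently highlighted (spoken-aloud) item

    def handle(self, gesture: str):
        # "shake" advances to the next menu item; "nod" confirms the
        # current one; unrecognized gestures are ignored.
        if gesture == "shake":
            self.index = (self.index + 1) % len(self.items)
            return None
        if gesture == "nod":
            return self.items[self.index]
        return None

menu = GestureMenu(["play music", "start call", "traffic update"])
menu.handle("shake")        # advance to "start call"
print(menu.handle("nod"))   # start call
```

In a real device the recognizer would announce each item by audio as the user cycles, which is what makes the interaction eyes-free as well as hands-free.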
This hands-free and eyes-free technology in vertical applications can, for example, improve frontline worker safety through guided instructions. Product management leaders must think creatively about “smart” interfaces that are independent of the device type on the road to realizing a future where consumers can easily speak to devices in natural language.
- Establish a plan to design for “tech that learns us” by investing in highly capable analytical tools, predictive analytics and conversational platforms.
- Scout for talent (in-house or as a collaboration) with experience in designing for multimodal UIs with superior experiences on zero-screen devices.
- Get involved in the development of standards and new technologies related to natural language processing.