Thursday, August 22
12:00 pm - 12:40 pm
Orchestrating Multi-Modal Interactions
About the event
The rise of Natural User Interfaces (NUIs) such as speech, gaze, and gesture recognition, along with the skyrocketing adoption of connected devices such as smart speakers and wearables, has ushered in the age of multi-modal interactions.
Multi-modal interactions allow us to create beautifully complex transitions between touchpoints, devices, and input modalities.
But tackling such interfaces can be scary and overwhelming. I would like to share with the Abstractions community the lessons I've learned from creating such experiences for medical professionals.