Orchestrating Multi-Modal Interactions

The rise of Natural User Interfaces (NUIs) such as speech, gaze, and gesture recognition, along with the skyrocketing adoption of connected devices such as smart speakers and wearables, has ushered in the age of multi-modal interactions. These allow us to create beautifully complex transitions between touchpoints, devices, and input modalities. But tackling such interfaces can feel scary and overwhelming. In this talk, I share lessons I've learned from creating such experiences for medical professionals.
