The rise of Natural User Interfaces (NUIs) such as speech, gaze, and gesture recognition, along with the skyrocketing adoption of connected devices such as smart speakers and wearables, has ushered in the age of multi-modal interactions. These interactions let us create beautifully complex transitions between touchpoints, devices, and input modalities. But designing such interfaces can feel scary and overwhelming. In this talk, I share lessons I've learned from creating these experiences for medical professionals.
Title slide for Speech & Gaze section
Title slide for Haptics section
Explanation of the Wizard-of-Oz research method, in which a human operator simulates system responses so a design can be tested before it is built
Anna "orchestrating" smart speakers