Ecosystem of Modes

Chapter 2

We start with the basics of Interaction Design and observe how patterns like Affordance, Mental Model, Mapping, and Constraints appear in everyday life. We critically reflect on the rapidly growing field of multimodal human-machine interaction.

Interaction Modes

The following six Interaction Modes represent fundamental channels through which people can interact with digital systems. Each mode uses different sensory and motor capabilities and is therefore suited to different use cases. In today’s design landscape, these modes are rarely considered in isolation — rather, the focus is on their interplay in multimodal interactions.

While we previously worked primarily with visual interfaces and simple tactile inputs (mouse/keyboard), technological advances now enable much richer forms of interaction. Understanding each individual mode forms the foundation for designing holistic experiences that engage people across their full sensory and expressive range.

Tactile

Touch, pressure, vibration, haptic feedback.

Gesture

Spatial body movements, pointing, positioning.

Visual

Light, color, visual displays, optical feedback.

Voice

Speech input and output, verbal commands.

Audio

Non-speech sounds, tones, ambient audio cues.

Spatial

Physical positioning, proximity, location-based cues.

Multimodal Patterns

Building on the Interaction Modes presented above, we can identify patterns that describe how these modes work together in multimodal systems. These Multimodal Interaction Patterns help us understand and design complex interactions that use multiple sensory and motor channels simultaneously.

In today’s digital landscape, we rarely interact through a single mode. Instead, we experience fluid transitions and combinations of different modes. The following six patterns describe how these modes interact with each other and how we as designers can shape these relationships to create more user-friendly and effective experiences.

Coherence

Unified experience across all interaction modes.

Adaptivity

Mode adaptation based on capabilities and context.

Synchronization

Real-time alignment between interaction modes.

Mode Shifting

Transitions between interaction modes.

Complementarity

Interaction modes enhance each other.

Simultaneity

Multiple interaction modes are active together.

The Multimodal Interaction Patterns are a 2025 evolution of the Patterns for Multiscreen Strategies published in 2011 by precious design studio. Since then, the slide deck has gathered 70k+ views, 3k+ downloads, and more than 600 likes.

Distinction
  • Coherence is about how it behaves and how it feels (unified experience)
  • Adaptivity is about who uses it where and when (context-awareness)
  • Synchronization is about data staying current and connected (state management)
  • Mode Shifting is about when you use a specific mode (timing)
  • Complementarity is about what a mode does in relation to another (role specialization)
  • Simultaneity is about using multiple modes together and in parallel (concurrency)

Examples

Coherence Example

Netflix interface: Same content library and consistent navigation whether you use the voice remote, touch screen, or gesture control.

Integration: Same design language and mental models across all modes. Unified experience. “This feels like the same app whether I touch, speak, or gesture”

Adaptivity Example

Smart home system: Uses touch interface during day, switches to voice control at night, and offers simplified controls for elderly users.

Context-awareness: System changes interaction mode based on user, environment, or situation. Responsive adaptation. “The system adjusts HOW I interact based on WHO I am and WHEN/WHERE I’m using it”
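The context-aware mode selection described above can be sketched as a small rule function. This is an illustrative sketch only, not any real smart-home API: the `Context` fields, mode names, and the quiet-hours threshold are assumptions chosen to mirror the example.

```python
from dataclasses import dataclass

@dataclass
class Context:
    hour: int          # 0-23, current time of day
    user_profile: str  # e.g. "default" or "simplified"

def select_mode(ctx: Context) -> str:
    """Pick the preferred interaction mode for the given context."""
    if ctx.user_profile == "simplified":
        return "simplified-touch"   # reduced controls for elderly users
    if ctx.hour >= 22 or ctx.hour < 6:
        return "voice"              # quiet hours: hands- and eyes-free
    return "touch"                  # daytime default

print(select_mode(Context(hour=14, user_profile="default")))     # touch
print(select_mode(Context(hour=23, user_profile="default")))     # voice
print(select_mode(Context(hour=14, user_profile="simplified")))  # simplified-touch
```

In a real system the context would of course come from sensors and user profiles rather than hard-coded values; the point is that the adaptation logic is explicit and inspectable.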

Synchronization Example

Spotify: Pause music on your phone, it instantly pauses on your smart speaker and car display.

State alignment: Changes in one mode instantly reflect everywhere else. Data consistency. “What I do here immediately shows up there”
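The state alignment described above is essentially an observer pattern: one shared state pushes every change to all registered devices. A minimal sketch with illustrative names (this is not Spotify's actual sync mechanism):

```python
class PlaybackState:
    """Single source of truth that broadcasts changes to all devices."""
    def __init__(self):
        self.playing = False
        self._devices = []

    def register(self, device):
        self._devices.append(device)

    def set_playing(self, playing: bool):
        self.playing = playing
        for device in self._devices:   # push the new state everywhere
            device.on_state(playing)

class Device:
    def __init__(self, name):
        self.name = name
        self.playing = None

    def on_state(self, playing):
        self.playing = playing

state = PlaybackState()
phone, speaker, car = Device("phone"), Device("speaker"), Device("car")
for d in (phone, speaker, car):
    state.register(d)

state.set_playing(True)    # press play on any device...
state.set_playing(False)   # ...then pause on the phone
print(speaker.playing, car.playing)  # False False
```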

Mode Shifting Example

Cooking app: Start reading recipe on screen → switch to voice commands when hands get messy → return to touch for timer.

Sequential: You use modes one after another. Temporal separation. “I’m switching FROM touch TO voice”
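Because Mode Shifting is sequential, it can be modeled as a small state machine: one active mode at a time, with explicit transitions. A sketch of the cooking-app flow above, with made-up names:

```python
# Allowed transitions between modes (illustrative, not a real app's rules).
ALLOWED = {
    ("touch", "voice"),   # hands get messy -> switch to voice
    ("voice", "touch"),   # hands free again -> back to touch
}

class ModeShifter:
    def __init__(self, initial: str):
        self.mode = initial

    def shift(self, target: str) -> bool:
        """Move to the target mode if the transition is defined."""
        if (self.mode, target) in ALLOWED:
            self.mode = target
            return True
        return False   # reject undefined transitions

app = ModeShifter("touch")   # start reading the recipe on screen
app.shift("voice")           # hands are messy
print(app.mode)              # voice
app.shift("touch")           # set the timer by touch
print(app.mode)              # touch
```

Making the transition table explicit is what distinguishes this pattern from Simultaneity: at any moment exactly one mode is active.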

Complementarity Example

Navigation: Map shows visual route on screen, audio provides turn-by-turn directions, haptic buzz warns of speed cameras.

Specialized roles: Each mode contributes different information types. Functional separation. “Screen SHOWS the map, voice TELLS directions”

Simultaneity Example

Gaming: Moving joystick + speaking voice commands + seeing visual feedback + feeling controller vibration all happening together.

Concurrent: Multiple modes active at the exact same moment. Temporal overlap. “I’m using joystick AND voice AND haptics RIGHT NOW”
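Simultaneity, by contrast, means events from several modes arrive in parallel and must be fused into a single moment of interaction. A minimal, illustrative sketch: in a real game the events would come from concurrent input threads, so a thread-safe queue is used here, with the events enqueued directly to keep the fusion step visible.

```python
from queue import Queue

events = Queue()  # thread-safe: parallel input handlers could feed this
events.put(("joystick", "move-left"))
events.put(("voice", "reload"))
events.put(("haptic", "recoil"))

# Fuse everything that arrived in this frame into one combined action.
frame = {}
while not events.empty():
    mode, payload = events.get()
    frame[mode] = payload   # each mode contributes to the same frame

print(sorted(frame))   # ['haptic', 'joystick', 'voice']
```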