Human-Computer Interaction – INTERACT 2013
Conference Papers, Year: 2013

Semantic Modelling in Support of Adaptive Multimodal Interface Design

Elena Tsiporkova, Anna Hristoskova, Tom Tourwé, Tom Stevens

Abstract

The design of multimodal interfaces requires intelligent data interpretation in order to guarantee seamless adaptation to the user’s needs and context. HMI (human-machine interaction) design accommodates varying forms of interaction patterns, depending on what is most appropriate for a particular user at a particular time. Such design patterns are a powerful means of documenting reusable design know-how. The semantic modelling framework presented in this paper captures the available domain knowledge in the field of multimodal interface design and supports adaptive HMIs. A collection of multimodal design patterns is constructed from a diversity of real-world applications and organized into a meaningful repository. This enables a uniform and unambiguous description of the patterns, easing their identification, comprehension and application.
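
The pattern repository described in the abstract lends itself to a machine-readable semantic representation. The following minimal sketch (not the authors' ontology) illustrates how a multimodal design pattern and its context of use could be encoded as RDF triples and queried with rdflib; the vocabulary (DesignPattern, hasModality, suitableForContext) and the example pattern are hypothetical, chosen only to illustrate the kind of semantic model the paper describes.

# Minimal sketch: semantic description of a multimodal design pattern.
# The hmi: vocabulary below is a hypothetical illustration, not the paper's ontology.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

HMI = Namespace("http://example.org/hmi#")

g = Graph()
g.bind("hmi", HMI)

# Describe one reusable multimodal design pattern and the context it suits.
g.add((HMI.VoiceConfirmation, RDF.type, HMI.DesignPattern))
g.add((HMI.VoiceConfirmation, RDFS.label,
       Literal("Voice confirmation of critical actions")))
g.add((HMI.VoiceConfirmation, HMI.hasModality, HMI.Speech))
g.add((HMI.VoiceConfirmation, HMI.suitableForContext, HMI.HandsBusy))

# Retrieve every pattern applicable when the user's hands are occupied.
query = """
PREFIX hmi: <http://example.org/hmi#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?pattern ?label WHERE {
    ?pattern a hmi:DesignPattern ;
             hmi:suitableForContext hmi:HandsBusy ;
             rdfs:label ?label .
}
"""
for pattern, label in g.query(query):
    print(pattern, label)

Such a uniform description is what allows an adaptive HMI to select, at runtime, the interaction pattern best matching the current user and context.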
Main file: 978-3-642-40498-6_54_Chapter.pdf (307.62 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01510520, version 1 (19-04-2017)

Licence

Attribution

Identifiers

  • HAL Id: hal-01510520
  • DOI: 10.1007/978-3-642-40498-6_54

Cite

Elena Tsiporkova, Anna Hristoskova, Tom Tourwé, Tom Stevens. Semantic Modelling in Support of Adaptive Multimodal Interface Design. 14th International Conference on Human-Computer Interaction (INTERACT), Sep 2013, Cape Town, South Africa. pp.627-634, ⟨10.1007/978-3-642-40498-6_54⟩. ⟨hal-01510520⟩