Welcome to P K Kelkar Library, Online Public Access Catalogue (OPAC)

The paradigm shift to multimodality in contemporary computer interfaces

By: Oviatt, Sharon [author].
Contributor(s): Cohen, Philip R. [author].
Material type: Book
Series: Synthesis digital library of engineering and computer science; Synthesis lectures on human-centered informatics, #30
Publisher: San Rafael, California (1537 Fourth Street, San Rafael, CA 94901 USA): Morgan & Claypool, 2015
Description: 1 PDF (xxii, 221 pages): illustrations
Content type: text
Media type: electronic
Carrier type: online resource
ISBN: 9781627057523
Subject(s): Multimodal user interfaces (Computer systems) | Human-computer interaction | multimodal interface | multimodal-multisensor interface | new media | mobility | expressive power | human-centered | communication interfaces | cognition and performance | multisensory perception | design prototyping | data-intensive systems | signal processing | semantic integration | time-sensitive processing | hybrid architectures | commercialization | smart phones and mobile devices | wearable technology | technology paradigm shift | theoretical foundations
DDC classification: 005.437
Online resources: Abstract with links to resource
Also available in print.
Contents:
1. Definition and types of multimodal interface --
2. History of paradigm shift from graphical to multimodal interfaces -- 2.1 Early multimodal interfaces -- 2.2 Later advanced multimodal interfaces -- 2.3 Recent mobile devices with multimodal interfaces --
3. Aims and advantages of multimodal interfaces -- 3.1 User preference and natural interaction patterns -- 3.2 Flexible interaction patterns -- 3.3 Accommodation of individual differences -- 3.4 Efficiency -- 3.5 Superior error handling -- 3.6 Minimization of cognitive load -- 3.7 Expressive power and stimulation of cognition --
4. Evolutionary, neuroscience, and cognitive foundations of multimodal interfaces -- 4.1 Evolution of multimodal perceptual and communication abilities -- 4.2 Neuroscience and behavioral research on multisensory processes -- 4.3 Cognitive science and multimodal human-computer interaction research -- 4.3.1 Frequency and occurrence of multimodal interaction -- 4.3.2 Multimodal integration and synchronization patterns -- 4.3.3 Individual and cross-cultural differences in integration patterns -- 4.3.4 Complementarity versus redundancy in multimodal integration --
5. Theoretical foundations of multimodal interfaces -- 5.1 Gestalt theory: greater multimodal coherence, stability, and robustness -- 5.2 Affordance theory: multimodal interfaces guide human communication and activity -- 5.3 Communication accommodation theory: multimodal dialogue convergence improves intelligibility -- 5.4 Working memory theory: distributed multimodal processing improves memory -- 5.5 Cognitive load theory: multimodal processing minimizes load -- 5.6 Activity theory: intensity of multimodal communication stimulates thought --
6. Human-centered design of multimodal interfaces -- 6.1 Methods for prototyping and evaluating multimodal interfaces -- 6.2 Human-centered philosophy and strategies for multimodal interface design -- 6.3 Community-based participatory design of global multimodal interfaces --
7. Multimodal signal processing, fusion, and architectures -- 7.1 Dimensions of modality fusion -- 7.1.1 Why fuse? -- 7.1.2 What to fuse? -- 7.1.3 When and how to fuse? -- 7.2 Fusion architectures -- 7.3 Multimodal emotion recognition example --
8. Multimodal language, semantic processing, and multimodal integration -- 8.1 Multimodal language -- 8.2 Semantic processing and multimodal integration -- 8.2.1 Meaning representations -- 8.2.2 Meaning combination -- 8.3 Evaluation of multimodal integration -- 8.4 Principles for strategizing multimodal integration --
9. Commercialization of multimodal interfaces -- 9.1 Automotive interfaces -- 9.2 Geospatial and design interfaces -- 9.3 Virtual assistant interfaces -- 9.4 Robotic interfaces -- 9.5 Multi-biometric interfaces -- 9.6 Educational interfaces -- 9.7 Entertainment and gaming interfaces -- 9.8 Warehouse automation interfaces -- 9.9 International standards and commercial toolkits -- 9.10 Organizational resistance to adopting new technologies --
10. Emerging multimodal research areas and applications -- 10.1 Tangible, ubiquitous, and wearable multimodal interfaces -- 10.2 Multimodal affect recognition -- 10.3 Multimodal learning analytics and education -- 10.4 Multimodal accessible interfaces -- 10.5 Multimodal virtual agents and robotic interfaces --
11. Beyond multimodality: designing more expressively powerful interfaces -- 11.1 Impact of interface support for expressing multiple representations -- 11.1.1 Ideational fluency -- 11.1.2 Problem solving -- 11.1.3 Inferential accuracy -- 11.1.4 Beyond average: reducing the performance gap between users -- 11.2 Explanatory theory and interface design principles -- 11.3 Future directions: interface support for expressing world linguistic codes --
12. Conclusions and future directions -- Bibliography -- Author biographies.
Abstract: During the last decade, cell phones with multimodal interfaces based on combined new media have become the dominant computer interface worldwide. Multimodal interfaces support mobility and expand the expressive power of human input to computers. They have shifted the fulcrum of human-computer interaction much closer to the human. This book explains the foundation of human-centered multimodal interaction and interface design, based on the cognitive and neurosciences, as well as the major benefits of multimodal interfaces for human cognition and performance. It describes the data-intensive methodologies used to envision, prototype, and evaluate new multimodal interfaces. From a system development viewpoint, this book outlines major approaches for multimodal signal processing, fusion, architectures, and techniques for robustly interpreting users' meaning. Multimodal interfaces have been commercialized extensively for field and mobile applications during the last decade. Research is also growing rapidly in areas such as multimodal data analytics, affect recognition, accessible interfaces, embedded and robotic interfaces, machine learning, and new hybrid processing approaches. The expansion of multimodal interfaces is part of the long-term evolution of more expressively powerful input to computers, a trend that will substantially improve support for human cognition and performance.
Item type: E books
Current location: PK Kelkar Library, IIT Kanpur
Status: Available
Barcode: EBKE630
Total holds: 0

Mode of access: World Wide Web.

System requirements: Adobe Acrobat Reader.

Part of: Synthesis digital library of engineering and computer science.

Includes bibliographical references (pages 173-219).

Abstract freely available; full-text restricted to subscribers or individual document purchasers.

Title from PDF title page (viewed on April 26, 2015).
