Standard Portals for Intelligent Services

  • Chapter in Multimodal Interaction with W3C Standards

Abstract

Some multimodal interpretation services natively support W3C multimodal standards, but most still use their own proprietary formats and protocols. This makes it much more difficult for developers to use different systems, because they have to learn and program to a new API for each vendor. This chapter describes how standards-based servers can wrap proprietary systems in the W3C MMI Architecture and EMMA 2.0, allowing developers to interact with modality interpretation services in a standard way even when the underlying service does not natively support the standards.
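
To make the idea of a standards-based wrapper concrete, the following is a minimal sketch (not drawn from the chapter itself) of a portal that calls a proprietary natural-language interpretation service over HTTP and repackages its JSON result as an EMMA 2.0 document carried inside an MMI Architecture DoneNotification life-cycle event. The endpoint URL, the JSON field names (intent, confidence, entities), and the helper names are hypothetical stand-ins for whatever a vendor service such as wit.ai or LUIS actually returns; only the emma: and mmi: markup follows the published W3C namespaces.

import json
import urllib.request
from xml.sax.saxutils import escape

EMMA_NS = "http://www.w3.org/2003/04/emma"
MMI_NS = "http://www.w3.org/2008/04/mmi-arch"

def interpret(utterance):
    """Call the (hypothetical) proprietary interpretation endpoint."""
    req = urllib.request.Request(
        "https://nlu.example.com/parse",  # hypothetical vendor URL
        data=json.dumps({"text": utterance}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # assumed shape: {"intent": ..., "confidence": ..., "entities": {...}}
        return json.load(resp)

def to_emma(utterance, result):
    """Re-express the proprietary JSON result as an EMMA 2.0 interpretation."""
    slots = "".join(
        "<{0}>{1}</{0}>".format(name, escape(str(value)))
        for name, value in result.get("entities", {}).items()
    )
    return (
        '<emma:emma version="2.0" xmlns:emma="{ns}">'
        '<emma:interpretation id="int1" emma:medium="acoustic" emma:mode="voice" '
        'emma:confidence="{conf}" emma:tokens="{tokens}">'
        "<intent>{intent}</intent>{slots}"
        "</emma:interpretation></emma:emma>"
    ).format(
        ns=EMMA_NS,
        conf=result.get("confidence", 1.0),
        tokens=escape(utterance),
        intent=escape(result.get("intent", "unknown")),
        slots=slots,
    )

def done_notification(context, request_id, emma_xml):
    """Wrap the EMMA payload in an MMI Architecture DoneNotification event."""
    return (
        '<mmi:mmi version="1.0" xmlns:mmi="{ns}">'
        '<mmi:DoneNotification mmi:Context="{ctx}" mmi:RequestID="{req}" '
        'mmi:Status="success" mmi:Source="nlu-portal" mmi:Target="interaction-manager">'
        "<mmi:Data>{emma}</mmi:Data>"
        "</mmi:DoneNotification></mmi:mmi>"
    ).format(ns=MMI_NS, ctx=context, req=request_id, emma=emma_xml)

if __name__ == "__main__":
    text = "book a flight to Boston on Friday"
    result = interpret(text)  # proprietary JSON in ...
    print(done_notification("ctx-1", "req-42", to_emma(text, result)))  # ... standard XML out

An interaction manager written against the MMI life-cycle events then receives the same DoneNotification shape regardless of which vendor produced the underlying interpretation, which is the portability the chapter argues for.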

References

  1. Barnett, J., Bodell, M., Dahl, D. A., Kliche, I., Larson, J., Porter, B., et al. (2012). Multimodal architecture and interfaces. World Wide Web Consortium. http://www.w3.org/TR/mmi-arch/. Accessed 20 Nov 2012.

  2. Dahl, D. A. (2013). The W3C multimodal architecture and interfaces standard. Journal on Multimodal User Interfaces, 1–12. doi:10.1007/s12193-013-0120-5.

  3. Barnett, J. (2016). Introduction to the multimodal architecture. In D. Dahl (Ed.), Multimodal interaction with W3C standards: Toward natural user interfaces to everything. New York, NY: Springer.

  4. wit.ai (2015). wit.ai. https://wit.ai/. Accessed 17 Mar 2015.

  5. api.ai (2015). api.ai. http://api.ai/. Accessed 17 Mar 2015.

  6. Microsoft (2015). Language Understanding Intelligent Service (LUIS). Microsoft. http://www.projectoxford.ai/luis. Accessed 5 June 2015.

  7. Amazon (2016). Alexa Skills Kit. Amazon. https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit. Accessed 6 Jan 2016.

  8. affectiva (2016). Affdex emotion sensing and analytics. affectiva. http://www.affectiva.com/solutions/apis-sdks/. Accessed 11 Jan 2016.

  9. EmoVu (2016). EmoVu Cloud API. Eyeris. http://emovu.com/e/developers/api/. Accessed 12 Jan 2016.

  10. Microsoft (2016). Project oxford emotion recognition. Microsoft. https://www.projectoxford.ai/demo/emotion. Accessed 12 Jan 2016.

  11. Kairos (2016). Emotion analysis API. Kairos. https://www.kairos.com/emotion-analysis-api. Accessed 11 Jan 2016.

  12. nViso (2016). nViso emotion recognition. nViso. http://www.nviso.ch/index.html. Accessed 11 Jan 2016.

  13. Burnett, D., Bergkvist, A., Jennings, C., & Narayanan, A. (2015). Media capture and streams (14th ed.). Boston, MA: World Wide Web Consortium.

  14. Johnston, M. (2016). Extensible multimodal annotation for intelligent interactive systems. In D. Dahl (Ed.), Multimodal interaction with W3C standards: Toward natural user interfaces to everything. New York, NY: Springer.

  15. Johnston, M., Baggia, P., Burnett, D., Carter, J., Dahl, D. A., McCobb, G., et al. (2009). EMMA: Extensible MultiModal Annotation markup language. W3C. http://www.w3.org/TR/emma/. Accessed 9 Nov 2012.

  16. Johnston, M., Dahl, D. A., Denny, T., & Kharidi, N. (2015). EMMA: Extensible MultiModal Annotation markup language Version 2.0. World Wide Web Consortium. http://www.w3.org/TR/emma20/. Accessed 16 Dec 2015.

  17. Barnett, J. (2016). Introduction to SCXML. In D. Dahl (Ed.), Multimodal interaction with W3C standards: Toward natural user interfaces to everything. New York, NY: Springer.

  18. Barnett, J., Akolkar, R., Auburn, R. J., Bodell, M., Burnett, D. C., Carter, J., et al. (2015). State Chart XML (SCXML): State machine notation for control abstraction. World Wide Web Consortium. http://www.w3.org/TR/scxml/. Accessed 20 Feb 2016.

  19. Fielding, R. T., Gettys, J., Mogul, J., Frystyk, H., Masinter, L., Leach, P., et al. (1999). RFC 2616 hypertext transfer protocol—HTTP/1.1. Internet Engineering Task Force (IETF). https://tools.ietf.org/html/rfc2616. Accessed 12 Jan 2016.

  20. Fette, I., & Melnikov, A. (2011). RFC 6455 The WebSocket Protocol. Internet Engineering Task Force (IETF). https://tools.ietf.org/html/rfc6455. Accessed 12 Jan 2016.

  21. Hickson, I. (2012). The WebSocket API. The World Wide Web Consortium. http://www.w3.org/TR/websockets/. Accessed 20 Nov 2012.

  22. Rodríguez, B. H., Barnett, J., Dahl, D., Tumuluri, R., Kharidi, N., & Ashimura, K. (2015). Discovery and registration of multimodal modality components: State handling. World Wide Web Consortium. https://www.w3.org/TR/mmi-mc-discovery/.

  23. Rodriguez, B. H., & Moissinac, J.-C. (2016). Discovery and registration—finding and integrating components into dynamic systems. In D. A. Dahl (Ed.), Multimodal interaction with W3C standards: Toward natural user interfaces to everything. New York, NY: Springer.

  24. Rodriguez, B. H., Wiechno, P., Dahl, D. A., Ashimura, K., & Tumuluri, R. (2012). Registration & discovery of multimodal modality components in multimodal systems: Use cases and requirements. World Wide Web Consortium. http://www.w3.org/TR/mmi-discovery/. Accessed 26 Nov 2012.

  25. Garrett, J. J. (2005). Ajax: A new approach to web applications. Adaptive Path. https://web.archive.org/web/20080702075113/http://www.adaptivepath.com/ideas/essays/archives/000385.php. Accessed 14 Jan 2016.

  26. Hilton, A. (2015). EmotionAPI 0.2.0. Coolfire solutions. https://github.com/Felsig/Emotion-API. Accessed 11 Jan 2016.

  27. Schröder, M., Baggia, P., Burkhardt, F., Pelachaud, C., Peter, C., & Zovato, E. (2014). Emotion Markup Language (EmotionML) 1.0. World Wide Web Consortium. http://www.w3.org/TR/emotionml/.

  28. Burkhardt, F., Pelachaud, C., & Schuller, B. (2016). Emotion markup language. In D. Dahl (Ed.), Multimodal interaction with W3C standards: Toward natural user interfaces to everything. New York, NY: Springer.

  29. Kliche, I., Dahl, D. A., Larson, J. A., Rodriguez, B. H., & Selvaraj, M. (2011). Best practices for creating MMI modality components. World Wide Web Consortium. http://www.w3.org/TR/2011/NOTE-mmi-mcbp-20110301/. Accessed 20 Nov 2012.

Author information

Corresponding author

Correspondence to Deborah A. Dahl.

Copyright information

© 2017 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Dahl, D.A. (2017). Standard Portals for Intelligent Services. In: Dahl, D. (ed.) Multimodal Interaction with W3C Standards. Springer, Cham. https://doi.org/10.1007/978-3-319-42816-1_11

  • DOI: https://doi.org/10.1007/978-3-319-42816-1_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-42814-7

  • Online ISBN: 978-3-319-42816-1

  • eBook Packages: Engineering, Engineering (R0)
