Instruvis: Play Music Virtually and Visualize the Data

  • Ismail Ayaz
  • Elumalai Monisha
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 852)

Abstract

The use of micro-inertial and magnetic sensors in day-to-day life is becoming more common with the spread of IoT devices. Building on these sensors, we have designed Instruvis, a virtual platform that lets musicians produce music with wearable devices: machine learning algorithms are used to train gestures representing notes, which are transmitted via MIDI to a digital audio workstation. Instruvis is built on Intel Curie chips, which integrate motion sensors with a pattern matching engine. Wearing the devices allows musicians to produce and enjoy quality music on the go instead of being confined to a fixed setup. We evaluated our prototype with music enthusiasts and are further improving Instruvis based on our findings.
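To make the gesture-to-note pipeline concrete, the listing below is a minimal sketch of how such a system could be wired on the Curie-based Arduino 101, assuming the CurieIMU library and the CuriePME library's learn/classify/noMatch interface. The 42-sample accelerometer window, the three-example training loop, the category-to-note mapping, and the serial MIDI bridge to the DAW are illustrative assumptions, not the authors' implementation.

#include <CurieIMU.h>
#include "CuriePME.h"

const int SAMPLES_PER_GESTURE = 42;   // 42 samples x 3 axes = 126 bytes, within the PME's 128-byte pattern limit
byte pattern[SAMPLES_PER_GESTURE * 3];

// Record one fixed-length accelerometer window and rescale it to the 0..255 range the PME expects.
void sampleGesture(byte v[]) {
  for (int i = 0; i < SAMPLES_PER_GESTURE; i++) {
    int ax, ay, az;
    CurieIMU.readAccelerometer(ax, ay, az);
    v[i * 3]     = map(ax, -32768, 32767, 0, 255);
    v[i * 3 + 1] = map(ay, -32768, 32767, 0, 255);
    v[i * 3 + 2] = map(az, -32768, 32767, 0, 255);
    delay(10);                        // roughly a 420 ms gesture window sampled at ~100 Hz
  }
}

// Emit a raw MIDI note-on over serial; a serial-to-MIDI bridge on the host forwards it to the DAW.
void sendNoteOn(byte note, byte velocity) {
  Serial.write(0x90);                 // note-on, channel 1
  Serial.write(note);
  Serial.write(velocity);
}

void setup() {
  Serial.begin(115200);
  CurieIMU.begin();
  CuriePME.begin();

  // Training phase: store a few examples of each gesture in the pattern
  // matching engine, one category per note.
  for (int category = 1; category <= 4; category++) {
    for (int example = 0; example < 3; example++) {
      sampleGesture(pattern);
      CuriePME.learn(pattern, sizeof(pattern), category);
    }
  }
}

void loop() {
  // Recognition phase: classify the next motion window and trigger its note.
  sampleGesture(pattern);
  int category = CuriePME.classify(pattern, sizeof(pattern));
  if (category != CuriePME.noMatch) {
    sendNoteOn(59 + category, 100);   // e.g. category 1 -> MIDI note 60 (middle C)
  }
}

In practice the training loop would be gated by user prompts (for example, a button press per example) rather than run unconditionally at startup, and the serial stream would be exposed to the DAW through a standard serial-to-MIDI utility; both details are omitted here for brevity.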

Keywords

Virtual music · Gesture recognition · Pattern recognition · Music visualization

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. University of Texas at Dallas, Richardson, USA