Haptics in Computer Music: A Paradigm Shift
Abstract. From a historical point of view combined with a bibliographic overview, this article discusses the idea that haptic force-feedback transducers correspond to a paradigm shift in our real-time digital tools for creating music. In so doing, it shows that Computer Music may be regarded as a major field of research and application for Haptics.
Introduction
The possible cross-empowerment of Haptics and Computer Music has not yet been widely investigated. Yet real-time digital musical instruments are a particularly promising field of research and application for the Haptics community. Through a historical approach to Computer Music, this article demonstrates how Haptics corresponds to a paradigm shift in our digital instruments: from the principle of interactive musical systems to the concept of instrumental interaction.
Fig. 1: Usual structure of a contemporary real-time sound system, from [1]. (Figure labels include the primary feedback, via gesture, due to the ergonomics of the transducers.)
This structure articulates three levels:
1. The gesture controller(s), whose ergonomics impact both the primary feedback and the category of the possible gestures, their dexterity, and ultimately expressivity.
2. The real-time sound processes, which usually inherit from the well-known signal-based paradigms developed up to the 1990s: additive, subtractive and FM synthesis, sampling, sound filtering, sound processing, etc.
3. The mapping stage, which is in charge of bridging the ontological gap between the gestures (or the gesture signals) and the parameters of the sound processes. Choosing an appropriate strategy is difficult, since various parameters must be varied in correlation in order to reach sufficiently fine variations in the sound (a minimal sketch of such a pipeline is given below).
Such a three-level structure does extend the possibilities of our musical tools and has enabled innovative musical uses. For example, musicians can now choose the gesture controller from a large panoply (keyboards, mouth pads, joysticks, cameras, etc.), and thus adapt their gesture to their musical needs. Ideally, they can also program the sound qualities that will be controlled or interpreted when performing: amplitude and frequency of course, but also rhythm, timbre, localization in space, morphing, etc.
However, we cannot but note that the digital systems conforming to this structure have not yet succeeded in offering expressive possibilities as interesting as those of non-digital traditional instruments (such as the violin or the electric guitar, for example) [1]. Now that this mainstream approach has reached a high level of complexity and technological efficiency, there must be some fundamental reasons that explain this remaining lack of expressivity.
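To make this three-level structure concrete, the following minimal sketch (in Python; it is not taken from any of the cited systems, and all names, the FM process and the one-to-many mapping strategy are illustrative assumptions) shows how a low-rate gesture signal can be mapped onto the parameters of a simple signal-based sound process:

import numpy as np

SR = 44100  # audio sample rate (Hz)

def fm_synth(duration, carrier_hz, mod_ratio, mod_index, amp):
    # Level 2: a classic signal-based sound process (simple FM synthesis).
    t = np.arange(int(duration * SR)) / SR
    modulator = np.sin(2 * np.pi * carrier_hz * mod_ratio * t)
    return amp * np.sin(2 * np.pi * carrier_hz * t + mod_index * modulator)

def mapping(gesture):
    # Level 3: bridge the gap between the gesture signals and the synthesis
    # parameters. A single "pressure" value drives several parameters in
    # correlation, which is precisely the difficulty pointed out above.
    pressure = gesture["pressure"]   # 0..1, e.g. from a pad or key sensor
    position = gesture["position"]   # 0..1, e.g. a slider or key position
    return {
        "carrier_hz": 110.0 * 2 ** (3 * position),   # position -> pitch
        "mod_ratio": 2.0,
        "mod_index": 1.0 + 8.0 * pressure,           # pressure -> brightness
        "amp": 0.2 + 0.8 * pressure,                 # pressure -> loudness
    }

# Level 1: one frame of (hypothetical) gesture-controller data; in a real
# system this would be a low-rate stream coming from a keyboard, pad,
# joystick, camera, etc.
gesture_frame = {"pressure": 0.6, "position": 0.25}
samples = fm_synth(duration=0.5, **mapping(gesture_frame))

Whatever the mapping chosen, the gesture here remains a command signal for the sound process: no energy is exchanged between the player and the simulated sound source, which is the point developed below.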
Fig. 2: The player is coupled to the instrument, a passive physical object, through gesture interaction and an energetic coupling; the eye receives visual cues.
As a hypothesis, we may consider that this energetic coupling, and the tactilo-proprio-kinesthetic feedback correlated with the behavior of the sound-producing structure, are important properties. They influence the sound quality and diversity and the readability of the gestures within the sound, and both are needed for a high level of sensitivity and expressivity.
¹ The organ was probably the very first machine that decoupled the data flow (on/off) from the energy. Consequently, it does not fit the analysis in this article.
The study of usual digital artifacts reinforces this hypothesis. For example, usual digital systems for sustained-excitation instruments, such as strings or winds, are still hardly satisfactory to the ear. This is not due to a lack of precision of the sound models, since these are now very accurate. Rather, with such instruments, the principles of the gesture controller and of mapping encounter their limits, because they prevent a close relationship between the player and the sound, or, more precisely, between the player and the sound production mechanism².
Fig. 3: The player's gesture interaction now passes through a virtual energetic coupling; the eye receives visual cues.
Various experiments (see for example [3, 4, 5, 6]) have by now at least partially proved the relevance of the structure of Fig. 3. The case study of the violin carried out by Florens in our laboratory, and recently improved [7], can be summarized here. In this experiment, the string was considered as a fully linear system, and the bow/string interaction implemented the simplest possible non-linear viscosity curve. In contrast with the great simplicity of the string model, the installation implemented a TGR haptic device [8] with a specific mechanical morphological adapter from the ERGOS panoply [9]. As a result, most of the relevant sound cues could be easily and naturally obtained: full excitation of the string on its first mode, harmonics, creaking, etc. This experiment showed that the use of a high-quality haptic system is, in this case, at least as important (and probably more important) than the accuracy of the computed model.
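For illustration, here is a minimal sketch (an assumption written in Python, not the authors' implementation) of the kind of model described above: the string is reduced to a single linear mass-spring-damper mode and is excited through a simple non-linear viscosity (friction) curve of the bow/string relative velocity; the reaction of this interaction force is what a force-feedback device such as the TGR would render to the player's hand at the simulation rate. All numerical values are arbitrary.

import math

def bow_friction(v_rel, pressure, mu=0.3, v_c=0.1):
    # A very simple non-linear "viscosity" curve: the force opposes the
    # bow/string relative velocity, peaks around |v_rel| = v_c, then decays
    # with the sliding speed -- the ingredient needed for stick-slip motion.
    return -pressure * mu * v_c * v_rel / (v_c ** 2 + v_rel ** 2)

def simulate(bow_velocity=0.2, bow_pressure=2.0, f0=440.0,
             steps=4410, dt=1.0 / 44100):
    # One linear string mode (mass-spring-damper), integrated with a
    # semi-implicit Euler scheme at the audio rate.
    m = 1e-3                            # modal mass (kg)
    k = (2 * math.pi * f0) ** 2 * m     # stiffness giving frequency f0
    c = 2e-3                            # light damping
    x, v = 0.0, 0.0
    out = []
    for _ in range(steps):
        f_bow = bow_friction(v - bow_velocity, bow_pressure)
        # -f_bow is the reaction force that the haptic device would display.
        a = (f_bow - k * x - c * v) / m
        v += a * dt
        x += v * dt
        out.append(x)
    return out

string_motion = simulate()

Even with such a crude string and friction model, the essential point of the experiment is preserved: the same simulation that produces the sound also computes, at each step, the force returned to the bow held by the player.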
² This analysis can be extended to the case of the piano. Whatever the accuracy of the sound signal process involved and the quality of the keyboard's touch (the primary feedback in Fig. 1), digital pianos still sound less vivid than real ones. Indeed, they still treat the gesture as a narrow-band command signal for the sound process. The fact that organ-like digital instruments are among the most comparable with their real counterparts also reinforces the hypothesis, given footnote 1.
References
[1] M. Wanderley: Contrôle gestuel de la synthèse sonore, in Interfaces Homme-Machine et Création Musicale, H. Vinet & F. Delalande (eds.), Hermès, Paris, 1999.
[2] V. Verfaille: Adaptative Digital Audio Effects, COST-G6 Conference on Digital Audio Effects (DAFx-01), Limerick, Ireland, 2001.
[3] S. O'Modhrain, C. Chafe: Incorporating Haptic Feedback into Interfaces for Music Applications, in Proceedings of ISORA, World Automation Conference, 2000.
[4] S. Rimell, D. M. Howard, A. M. Tyrrell, R. Kirk, A. Hunt: Cymatic: Restoring the Physical Manifestation of Digital Sound using Haptic Interfaces to Control a New Computer Based Musical Instrument, International Computer Music Conference (ICMC02), Göteborg, Sweden, 2002.
[5] C. Nichols: The vBow: a Virtual Violin Bow Controller for Mapping Gesture to Synthesis with Haptic Feedback, in Organised Sound, L. Landy (ed.), Leicester, United Kingdom, 2002.
[6] B. Bongers: The Use of Active Tactile and Force Feedback in Timbre Controlling Musical Instruments, Proceedings of the International Computer Music Conference (ICMC), Århus, Denmark, September 1994.
[7] J.-L. Florens: Expressive Bowing on a Virtual String Instrument, Forum Acusticum, Sevilla, September 2002.
[8] C. Cadoz, L. Lisowski, J.-L. Florens: A Modular Feedback Keyboard Design, Computer Music Journal, vol. 14/2, pp. 47-51, MIT Press, 1990.
[9] J.-L. Florens, A. Luciani, C. Cadoz, N. Castagné: ERGOS: a Multi-Degrees of Freedom and Versatile Force-Feedback Panoply, in these Proceedings of Eurohaptics 2004.
[10] D. Prytherch, B. Jerrard: Haptics, the Secret Senses; the covert nature of the haptic senses in creative tacit skills, Proceedings of the Eurohaptics 2003 Conference, Dublin, Ireland, 2003.