How Does Musical Touch Integrate With Digital Instruments?

We are a team of experienced software developers and musicians who have combined our skills to develop Musical Touch, an interactive and engaging educational tool that uses state-of-the-art LED technology to produce dazzling visual effects. It is designed for children of all ages to stimulate their senses and creativity, and it is ideal for use in schools, hospitals, day care centers, children’s nurseries and assessment centers. Musical Touch can be used in a variety of ways: pressing the buttons on the front face of the unit in the classic mode, using a sensor on the back to switch modes, or simply moving a hand across the screen. It’s simple, fun and engaging.

We believe that a fundamental challenge in digital instrument design is allowing the musician to control, and learn to control, an instrument based on the dynamics of the mechanical coupling between the player and the instrument itself. This is a different paradigm from the traditional view, which treats the instrument as the system under control and Musical Touch as the controller.

In our model, the instrument player is the controller of a dynamically coupled system determined conjointly by the biomechanics of the body and the mechanics of the instrument. The instrument player’s superior access to feedback from the dynamics of this coupled system enables them to better refine and stabilize their ancillary movement controls as they acquire skill with the instrument.
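As a rough illustration of this framing (and only an illustration: the masses, stiffness, and control gains below are invented for the example and say nothing about Musical Touch’s internals), the following Python sketch treats the player’s hand and the instrument as two masses joined by a spring-damper, with a simple PD law standing in for the player’s response to the felt dynamics of the coupling:

```python
# Illustrative sketch, not the authors' model: the player acts as the
# controller of a coupled body+instrument system, simulated here as two
# masses joined by a spring-damper. All parameters are invented.

dt = 0.001                       # simulation step (s)
m_body, m_inst = 0.4, 1.2        # hypothetical hand and key masses (kg)
k, c = 800.0, 4.0                # hypothetical coupling stiffness (N/m) and damping (N·s/m)
target = 0.01                    # desired instrument displacement (m)

x_body = v_body = 0.0            # hand position (m) and velocity (m/s)
x_inst = v_inst = 0.0            # instrument position (m) and velocity (m/s)

def player_control(x, v):
    """The 'controller': a PD law standing in for the player's sensorimotor
    response to the felt (haptic) feedback from the coupling."""
    kp, kd = 200.0, 10.0         # illustrative gains
    return kp * (target - x) - kd * v

for _ in range(2000):            # two seconds of simulated interaction
    f_couple = k * (x_body - x_inst) + c * (v_body - v_inst)
    f_player = player_control(x_inst, v_inst)

    # The player's force acts on the body; the instrument feels only the coupling.
    a_body = (f_player - f_couple) / m_body
    a_inst = f_couple / m_inst

    v_body += a_body * dt; x_body += v_body * dt
    v_inst += a_inst * dt; x_inst += v_inst * dt

print(f"instrument displacement after 2 s: {x_inst * 1000:.2f} mm")
```

In this framing, acquiring skill corresponds to the player refining a control law like player_control using the feedback the mechanical coupling provides.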

Moreover, the musician’s internal description, or internal model, of an instrument’s behavior is informed by the rapid communication between body and instrument via their mechanical coupling. The musician’s perception of the instrument’s behavior in this feedback loop is mediated by their experience with the instrument itself, built up through direct comparison of the instrument’s sound and its feel.

The dynamic ranges of the auditory and tactile modalities differ, however: auditory dynamic range typically exceeds 130 dB, while tactile dynamic range is around 50 dB. This poses a challenge when generating haptic feedback from audio signals, because the audio’s wide level range must be mapped into the much narrower band between the vibration that is just perceptible and the strongest vibration the actuator and the skin can usefully convey.
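One hedged way to handle this mismatch when deriving vibration from audio is to compress the audio level range into the tactile range before driving the actuator. The ranges and the linear dB compression in the Python sketch below are illustrative assumptions, not a prescribed algorithm:

```python
# Hedged sketch: mapping a wide audio dynamic range (~130 dB) onto a narrower
# tactile range (~50 dB) when deriving vibration from an audio signal.
# The ranges, the linear compression, and threshold_amp are assumptions.

AUDIO_RANGE_DB = 130.0    # approximate usable auditory dynamic range
TACTILE_RANGE_DB = 50.0   # approximate usable vibrotactile dynamic range

def audio_db_to_tactile_db(level_db: float) -> float:
    """Linearly compress an audio level (0 dB = just audible,
    AUDIO_RANGE_DB = maximum) into the tactile range."""
    level_db = max(0.0, min(level_db, AUDIO_RANGE_DB))
    return level_db * (TACTILE_RANGE_DB / AUDIO_RANGE_DB)

def tactile_db_to_amplitude(tactile_db: float, threshold_amp: float = 1e-3) -> float:
    """Convert a tactile level in dB above threshold to an actuator drive
    amplitude, where threshold_amp is a hypothetical just-perceptible drive."""
    return threshold_amp * 10 ** (tactile_db / 20.0)

# Example: an audio event 90 dB above the auditory threshold
tactile_db = audio_db_to_tactile_db(90.0)
print(f"{tactile_db:.1f} dB tactile -> drive amplitude {tactile_db_to_amplitude(tactile_db):.4f}")
```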

This is one of the reasons why it is critical to take into account the latency between a virtual button’s tactile and audio feedback, as well as the perceived velocity of the press. Research on touchscreen virtual-button interaction suggests that tolerable multimodal latency should not exceed 25 ms for musical applications. Unfortunately, current digital instruments often cannot meet this requirement, so they are less successful than their acoustic counterparts at supporting the musician’s development of manual skill. This is a critical point for rethinking how new digital instruments should be constructed.
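To make the 25 ms figure concrete, a designer can add up the stages between touch and feedback and compare the total against that budget. The component figures in this Python sketch are hypothetical placeholders; real values depend on the touch controller, audio driver, and haptic actuator in use:

```python
# Rough latency-budget sketch for the ~25 ms tolerance cited above.
# All component latencies below are hypothetical placeholders.

SAMPLE_RATE = 48_000          # audio sample rate (Hz)
BUFFER_FRAMES = 256           # audio callback buffer size (frames)

touch_scan_ms = 8.0           # hypothetical touchscreen scan/report latency
audio_buffer_ms = 1000.0 * BUFFER_FRAMES / SAMPLE_RATE   # ~5.3 ms of buffering
dac_and_output_ms = 2.0       # hypothetical converter/output-stage latency
haptic_drive_ms = 6.0         # hypothetical actuator rise time

total_ms = touch_scan_ms + audio_buffer_ms + dac_and_output_ms + haptic_drive_ms
budget_ms = 25.0              # tolerable multimodal latency for musical use

print(f"estimated action-to-feedback latency: {total_ms:.1f} ms "
      f"({'within' if total_ms <= budget_ms else 'over'} the {budget_ms:.0f} ms budget)")
```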
