Interactive Digital Multimedia Techniques: Final Project (Computer Science MSc)

This project aims to digitally re-imagine the traditional lyre harp, using lasers and photoresistors to replicate the strings and a consumer-grade electroencephalography (EEG) device to produce semi-active timbre and modulation controls. We propose two contrasting playstyles, each embodying a different paradigm of sonic art, and discuss and evaluate the results and their impact. We also suggest directions in which this project could be taken in future work.

Introduction

In recent years, the accessibility of electronic sensing components and development platforms has led to a surge in the reinvention of traditional instruments as digital musical instruments (DMIs). These digital re-imaginings often preserve the essence of the original instruments while redefining their form and function. Some, such as the digital piano, have even surpassed their predecessors in popularity. These advances in musical interfaces have given rise to new styles and techniques in music.

The goal of this project is to digitally re-imagine the traditional lyre harp and explore the use of emerging creative technologies to augment its playstyle. We use an array of lasers and photoresistors to replicate the strings and a consumer-grade EEG device to produce semi-active timbre and effect modulation controls. This setup transforms the harp into a frame that reflects the performer’s mind, inviting users to explore their psychophysiological states through sound and music.

Method

Physical Build

A reclaimed wooden mirror frame serves as the body of the harp, housing all of the components. Channels were cut into the back of the frame using a routing machine, and the sensor and diode modules were mounted with hot glue. Specifically, KY-008 laser diode modules and KY-018 photoresistor modules were used.

The KY-018s were chosen for their compact form and built-in voltage-divider circuitry, which eliminates the need for a separate divider board for each sensor. Conductive tape was used to lay low-profile power rails carrying 5 V, 3.3 V, and ground lines. Hand-made cables carry the signals from the photoresistor boards to the microcontroller on the side of the frame. In total, 11 of each module were used, creating a laser harp with 11 programmable strings.

Routing, mounting the components and power rails, and testing

This design was very effective, providing space for a relatively high number of strings and resulting in a clean, low-profile, and robust wiring system. During construction, great care was taken to keep the components as low-profile as possible, ensuring the frame would sit flush with any surface. Once all components were mounted, secured, and connected, a protective cover was laser cut and affixed to the back of the frame.

Protective cover and lighting effects

Arduino

To maintain the low-profile design, a Seeeduino XIAO, a compact Arduino-compatible microcontroller board, was used. The XIAO offers up to eleven digital/analog inputs in a 23.5 × 17.5 mm form factor, making it ideal for this build. It reads the sensor value at each photoresistor and records the readings in an array.

Upon powering on, the board undergoes an automatic self-calibration phase, recording the highest and lowest values seen at each photoresistor and producing a set of timer and trigger thresholds for each sensor. After 5 seconds, the harp becomes active and can send note triggers. The harp is programmed as a class-compliant MIDI controller, so no additional drivers are required.
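
As a sketch, the calibration logic on the XIAO might look like the following Arduino-style C++; the pin assignments and the 25%/75% threshold fractions are illustrative assumptions rather than the project's exact firmware.

```cpp
// Self-calibration sketch (assumptions: pins A0-A10, 10-bit readings,
// thresholds placed at 25% and 75% of each sensor's observed range).
const int NUM_STRINGS = 11;
const int sensorPins[NUM_STRINGS] = {A0, A1, A2, A3, A4, A5,
                                     A6, A7, A8, A9, A10};

int loVal[NUM_STRINGS];         // lowest reading seen per sensor
int hiVal[NUM_STRINGS];         // highest reading seen per sensor
int timerThresh[NUM_STRINGS];   // crossing this starts the velocity timer
int triggerThresh[NUM_STRINGS]; // crossing this arms the note trigger

void setup() {
  for (int i = 0; i < NUM_STRINGS; i++) { loVal[i] = 1023; hiVal[i] = 0; }
  unsigned long start = millis();
  while (millis() - start < 5000) {      // 5-second calibration window
    for (int i = 0; i < NUM_STRINGS; i++) {
      int v = analogRead(sensorPins[i]);
      loVal[i] = min(loVal[i], v);
      hiVal[i] = max(hiVal[i], v);
    }
  }
  for (int i = 0; i < NUM_STRINGS; i++) {
    int range = hiVal[i] - loVal[i];
    timerThresh[i]   = loVal[i] + range / 4;      // assumed 25% point
    triggerThresh[i] = loVal[i] + 3 * range / 4;  // assumed 75% point
  }
}

void loop() { /* pluck detection runs here; see the state machine below */ }
```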

Seeeduino XIAO Pinout
Laser harp circuit diagram

To preserve the playing feel of the lyre harp, a pressure-sensitive velocity effect was implemented. The velocity value is calculated from the time the photoresistor signal takes to pass from the timer threshold to the trigger threshold. Once the trigger threshold is crossed, the XIAO transmits a string pluck with the recorded velocity at the moment the sensor value begins to decrease. This mimics the physical behaviour of a plucked string instrument, where a string sounds only when it is released, not merely when placed under pressure.
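
For a single string, this pluck detection can be sketched as a small state machine; the threshold constants, the velocity scaling, and the assumption that blocking the beam raises the reading are all illustrative.

```cpp
// One-string pluck/velocity state machine (illustrative constants; the
// firmware runs this logic for all 11 strings using calibrated thresholds).
const int SENSOR_PIN = A0;
const int TIMER_THRESH = 400;    // assumed; set during calibration
const int TRIGGER_THRESH = 700;  // assumed; set during calibration

enum State { IDLE, TIMING, ARMED, FIRED };
State state = IDLE;
unsigned long timerStart = 0;
int velocity = 0;
int lastVal = 0;

void setup() { /* MIDI initialisation omitted */ }

void loop() {
  int v = analogRead(SENSOR_PIN);  // assumed to rise as the beam is blocked
  switch (state) {
    case IDLE:    // finger enters the beam: start the velocity timer
      if (v > TIMER_THRESH) { state = TIMING; timerStart = micros(); }
      break;
    case TIMING:  // faster travel between thresholds -> higher velocity
      if (v > TRIGGER_THRESH) {
        unsigned long elapsed = micros() - timerStart;
        velocity = constrain(127 - (int)(elapsed / 2000), 1, 127); // assumed scaling
        state = ARMED;
      }
      break;
    case ARMED:   // signal begins to fall: the "release" fires the note
      if (v < lastVal) {
        // sendPluck(velocity);  // hypothetical MIDI note-on helper
        state = FIRED;
      }
      break;
    case FIRED:   // wait until the beam is restored before re-arming
      if (v < TIMER_THRESH) state = IDLE;
      break;
  }
  lastVal = v;
}
```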

Signal timer and trigger thresholds

Processing

Processing was chosen to handle the MIDI output and to interface with the EEG device. Specifically, we use the Muse: Brain Sensing Headband to capture the performer's neural oscillation data. The Processing sketch receives string pluck/release messages from the XIAO and converts them to MIDI notes, using if and switch statements to extract the relevant information. The state of each string is stored in an array and displayed on-screen. The performer can select the tonal center and scale using the arrow keys.
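
The note mapping itself reduces to simple arithmetic over a scale table; the following standalone C++ sketch illustrates the idea (the scale tables and function names are assumptions, not the project's Processing code).

```cpp
#include <array>

// Hypothetical mapping from string index to MIDI note for a chosen
// tonal center and seven-note scale.
const std::array<int, 7> MAJOR = {0, 2, 4, 5, 7, 9, 11};
const std::array<int, 7> MINOR = {0, 2, 3, 5, 7, 8, 10};

int stringToMidiNote(int stringIndex, int tonalCenter,
                     const std::array<int, 7>& scale) {
  int octave = stringIndex / 7;  // strings beyond the 7th wrap an octave up
  int degree = stringIndex % 7;
  return tonalCenter + 12 * octave + scale[degree];
}
```

With a tonal center of 60 (middle C), string 0 maps to C4 and string 7 to C5; the arrow keys would simply change tonalCenter or swap the scale table.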

Muse EEG Headband and Electrode Placement

The Processing sketch interfaces with the EEG headset through the Mind Monitor app, which connects to the headset via Bluetooth, retrieves the raw data, applies a Fourier transform, and broadcasts the results over OSC. The oscP5 library is used to bind to a UDP port and listen for incoming EEG messages. The absolute amplitude values for each frequency band at each electrode position (TP9, AF7, AF8, TP10) are smoothed using a moving-average algorithm with a window size of 30.
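
The smoothing step is a standard moving average; a minimal C++ version with the stated window size of 30 is shown below (the class name and structure are illustrative, not the sketch's actual code).

```cpp
#include <deque>
#include <cstddef>

// Moving-average smoother with a fixed window (size 30, as described above).
class MovingAverage {
  std::deque<float> window;
  std::size_t size;
  float sum = 0.0f;
public:
  explicit MovingAverage(std::size_t n) : size(n) {}
  float update(float x) {
    window.push_back(x);
    sum += x;
    if (window.size() > size) {  // discard the oldest sample
      sum -= window.front();
      window.pop_front();
    }
    return sum / window.size();
  }
};
// One smoother per electrode/band pair, e.g. MovingAverage alphaTP9(30);
```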

To calculate the performer's emotional valence (positive vs. negative affect), the alpha activation at the right pre-frontal electrode is subtracted from that at the left pre-frontal electrode, a measurement known as frontal alpha asymmetry, defined as:

\[\text{Valence} \equiv \alpha_{\text{asym}} = \alpha_{\text{PSD}_L} - \alpha_{\text{PSD}_R}\]

where PSD (power spectral density) is the alpha-band amplitude power at the respective left or right pre-frontal electrode. To calculate the performer's activation (emotional intensity), the absolute amplitude values across all bands and all electrodes are averaged:

\[\text{Activation} = \frac{\sum_{i=1}^{n}\text{PSD}_i}{n}\]
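
In code, both measures are one-liners over the smoothed PSD values; a C++ illustration follows (variable names are assumed):

```cpp
#include <vector>

// Valence: left minus right pre-frontal alpha power (frontal alpha asymmetry).
float valence(float alphaPsdLeft, float alphaPsdRight) {
  return alphaPsdLeft - alphaPsdRight;
}

// Activation: mean of the absolute amplitude values across every
// band/electrode combination.
float activation(const std::vector<float>& psd) {
  float sum = 0.0f;
  for (float p : psd) sum += p;
  return sum / psd.size();
}
```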

To visualise the harp strings, the Processing sketch tracks when strings are held and released, drawing active strings on the screen. When a string is held, a red line appears; when released, the string fades while decreasing in width. The background color reacts to the frequency of plucking, becoming brighter with more rapid playing. An EEG signal graph is displayed in the background, allowing performers to monitor and investigate how their responses shape the sound.
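
The fade-out can be modelled as a per-frame decay of opacity and stroke width; the decay factors below are illustrative assumptions, not the sketch's actual values.

```cpp
// Per-string visual state, updated once per frame (illustrative constants).
struct StringVisual {
  bool held = false;
  float alpha = 0.0f;  // line opacity, 0-255
  float width = 0.0f;  // line stroke width in pixels

  void update() {
    if (held) {          // active string: full-strength red line
      alpha = 255.0f;
      width = 4.0f;
    } else {             // released: fade while narrowing
      alpha *= 0.92f;    // assumed opacity decay per frame
      width *= 0.95f;    // assumed width decay per frame
    }
  }
};
```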

Example Visual Component

The processed EEG data, including the valence and activation prediction values, is re-broadcast over OSC to Ableton Live and Max MSP. A virtual MIDI loopback port is used to transmit MIDI messages to Ableton.

Max for Live

The main sound production takes place in Ableton Live, where virtual instruments generate high-quality sounds controlled by the EEG data. A Max for Live device (Max for Live embeds Max MSP inside the DAW) receives OSC messages from the sketch and scales them based on their minimum and maximum values. The levels and attributes of various VST instruments are controlled by the different band and prediction values. Two contrasting implementations were created to explore the potential musical forms of the digital lyre harp.
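
The scaling stage amounts to running min/max normalisation; a C++ sketch of the logic is shown below (the running min/max approach is an assumption about how the device tracks its bounds).

```cpp
// Rescale each incoming OSC value to 0..1 using the smallest and largest
// values observed so far.
struct AutoScaler {
  float lo =  1e30f;  // running minimum
  float hi = -1e30f;  // running maximum
  float scale(float x) {
    if (x < lo) lo = x;
    if (x > hi) hi = x;
    if (hi == lo) return 0.0f;  // guard against division by zero
    return (x - lo) / (hi - lo);
  }
};
```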

Traditional

The traditional implementation preserves much of the original essence of the harp while introducing something novel. The harp still voices ‘one-shot’ notes when strings are plucked, but the timbre depends on the performer’s valence state and total alpha amplitude. These values blend voices of a music box, harp, and marimba. A sustained strings voice plays for the duration of a note hold, blending based on the performer’s focus. DSP techniques, such as a low-pass filter controlled by activation, are also applied.
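
As a rough illustration of these control mappings (all curves and ranges below are assumptions, not the actual Ableton rack settings):

```cpp
#include <algorithm>

// Valence in [0, 1] crossfades between two voices; a third voice could be
// blended the same way from the total alpha amplitude.
void blendLevels(float valence01, float& voiceA, float& voiceB) {
  float v = std::clamp(valence01, 0.0f, 1.0f);
  voiceA = 1.0f - v;  // fades out as valence rises
  voiceB = v;         // fades in as valence rises
}

// Activation in [0, 1] opens the low-pass filter (assumed 200 Hz - 8 kHz).
float lowPassCutoffHz(float activation01) {
  return 200.0f + std::clamp(activation01, 0.0f, 1.0f) * 7800.0f;
}
```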

Experimental

The experimental implementation is quite different. The harp does not sound 'one-shot' notes when plucked; instead, string plucks, durations, and velocities drive ambient, evolving VST samplers. The attributes and blends are controlled by the EEG data: the valence state shifts the balance between pleasant and unpleasant voices, while various band amplitudes produce clicking or arpeggiated note sounds. Abstracting the harp from familiar musical contexts in this way makes for a distinctly exploratory experience.

Video Demo