Lundheim: Exploring Affective Audio Techniques in an Action-Adventure Video Game
ATTENTION - This is a simplified overview of the following publication:
Sound Design for Video Games: Final Project (Sound Design BA)
Introduction
Affective computing, as discussed by Picard, is a field dedicated to creating computer systems that can recognise, respond to, and even evoke emotions. These systems aim to make technology more adaptive and user-friendly by catering to the emotional needs of users. Drawing from cognitive psychology, Russell’s circumplex model of affect provides a framework in which emotions are mapped onto a two-dimensional plane: the x-axis represents valence (pleasantness) and the y-axis represents arousal (activation level). This model is particularly useful for computer systems, as it reduces the complexity of emotion to a manageable two-value format. For instance, Griffiths et al. used this model to generate music playlists that match the listener’s emotional state.
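To make the model concrete, a system might discretise the circumplex plane into four quadrants and label each one. The sketch below is an illustrative Python example, not code from the publication; the labels, thresholds, and the assumption that both values are normalised to [-1, 1] are hypothetical.

```python
# Illustrative sketch: classify a (valence, arousal) pair into a quadrant of
# Russell's circumplex model. Both values are assumed to be normalised to [-1, 1];
# the labels and thresholds are hypothetical, not taken from the publication.

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Return a coarse emotion label for a point on the circumplex plane."""
    if valence >= 0 and arousal >= 0:
        return "excited"     # pleasant, activated
    if valence < 0 and arousal >= 0:
        return "distressed"  # unpleasant, activated
    if valence < 0 and arousal < 0:
        return "sad"         # unpleasant, deactivated
    return "calm"            # pleasant, deactivated

print(circumplex_quadrant(0.6, 0.7))    # -> "excited"
print(circumplex_quadrant(-0.4, -0.2))  # -> "sad"
```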
Affective techniques can also be applied to video games, a field sometimes referred to as affective gaming. Garner has explored using biofeedback technologies like consumer-grade EEG headsets to control audio in horror games, creating adaptive gameplay experiences. However, there’s a need to explore how these systems can be applied to other genres and contexts. Lundheim, an action-adventure game, steps into this gap by integrating affective mechanisms.
The Lundheim Experience
Game Concept and Narrative
Lundheim is set in a mystical Old Norse realm where emotions shape reality. The player wakes up in a fog-bound world and must navigate a series of emotional challenges to find their way home. The game uses an EEG headband to capture the player’s emotional state, which is displayed on screen in real time.
As the player progresses, they collect fragments of a map that reveals the emotional runes they must decode. Each rune corresponds to a specific emotional state, ranging from calm to intense (arousal) and from good to bad (valence). The player must enter these states to activate the runes, triggering in-game events such as a waterfall, a bonfire, or a colony of bats. Completing these challenges eventually raises a bridge, leading to a portal that ends the game.
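A rune challenge of this kind can be thought of as a target region on the circumplex plane that the player’s measured state has to enter. The Python sketch below is a hypothetical illustration of that idea, not the game’s actual code; the rune regions and event names are assumptions based on the description above.

```python
# Hypothetical rune definitions: each rune targets a region of the valence/arousal
# plane (values assumed to be in [-1, 1]) and fires an in-game event when the
# player's state falls inside it. Regions and names are illustrative only.
RUNES = {
    "calm":   {"valence": (0.0, 1.0),   "arousal": (-1.0, 0.0), "event": "waterfall"},
    "strong": {"valence": (-1.0, 1.0),  "arousal": (0.5, 1.0),  "event": "bonfire"},
    "bad":    {"valence": (-1.0, -0.3), "arousal": (-1.0, 1.0), "event": "bats"},
    "good":   {"valence": (0.3, 1.0),   "arousal": (-1.0, 1.0), "event": "bridge"},
}

def active_rune_events(valence: float, arousal: float) -> list:
    """Return the events whose target region contains the current state."""
    events = []
    for rune in RUNES.values():
        v_lo, v_hi = rune["valence"]
        a_lo, a_hi = rune["arousal"]
        if v_lo <= valence <= v_hi and a_lo <= arousal <= a_hi:
            events.append(rune["event"])
    return events

print(active_rune_events(0.6, -0.5))  # calm and good regions -> ['waterfall', 'bridge']
```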
Interactive Music
Music in Lundheim is not just background noise; it’s a dynamic experience that mirrors the player’s emotions. The game features a main theme with two timbre profiles — warm sounds for positive valence and cold for negative. These profiles are further divided into three intensity levels, modulated by the arousal value, creating a six-track system that blends seamlessly based on the player’s emotional state.
For instance, an excited state might call up warm, intense music, while a sad state could trigger cold, passive music. This system, implemented using Wwise’s blend containers, allows for a rich, multidimensional sound space that adapts in real time to the player’s feelings.
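One way to realise a two-axis blend of this kind, outside of Wwise’s own tooling, is to derive a gain for each of the six tracks from the current valence and arousal values. The sketch below is a generic Python illustration of that idea rather than the project’s Wwise setup; the track names, value ranges, and crossfade curves are assumptions.

```python
# Illustrative blend-weight calculation for a 2 (timbre) x 3 (intensity) track set.
# Valence and arousal are assumed to be normalised to [-1, 1]; in the actual project
# this blending is handled by Wwise blend containers driven by game parameters.

def blend_weights(valence: float, arousal: float) -> dict:
    # Timbre crossfade: warm tracks fade in with positive valence, cold with negative.
    warm = (valence + 1.0) / 2.0
    cold = 1.0 - warm

    # Intensity crossfade: map arousal onto three overlapping levels (low/mid/high).
    a = (arousal + 1.0) / 2.0        # 0 = passive, 1 = intense
    low = max(0.0, 1.0 - 2.0 * a)    # full at a=0, gone by a=0.5
    high = max(0.0, 2.0 * a - 1.0)   # starts at a=0.5, full at a=1
    mid = 1.0 - low - high           # fills the remainder

    weights = {}
    for timbre, t_gain in (("warm", warm), ("cold", cold)):
        for level, l_gain in (("low", low), ("mid", mid), ("high", high)):
            weights[f"{timbre}_{level}"] = t_gain * l_gain
    return weights

# Example: an excited state (high valence, high arousal) favours the warm, intense track.
w = blend_weights(0.8, 0.9)
print(max(w, key=w.get))  # -> "warm_high"
```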
Sound Design
Sound design in Lundheim is linked directly to the player’s progress, with each challenge triggering a unique element in the soundscape. The game uses a mix of original and processed sounds, created using tools like Studio One 5 and Audacity. For example, the bonfire sound is a layered composition of twigs, bubble wrap, and water dripping on a hot pan.
As the player completes challenges, new sounds are introduced, gradually building a dynamic soundscape. The “calm” rune triggers the sound of a waterfall, while the “strong” (intense) rune lights a bonfire, adding crackling sounds. The “bad” rune brings forth the eerie sounds of bats and demonic laughter, and the “good” rune raises a bridge with a chiming sound.
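The progressive build-up can be modelled as a set of layers that are switched on as runes are completed. The sketch below is a hypothetical Python model of that idea, not the game’s Wwise implementation; the layer names simply mirror the events described above.

```python
# Hypothetical model of the progressive soundscape: completing a rune adds its
# layer(s) to the active mix. The class and layer names are illustrative only.

RUNE_LAYERS = {
    "calm":   ["waterfall"],
    "strong": ["bonfire_crackle"],
    "bad":    ["bats", "demonic_laughter"],
    "good":   ["bridge_chime"],
}

class Soundscape:
    def __init__(self):
        self.active_layers = []

    def complete_rune(self, rune: str):
        """Add the completed rune's sounds to the soundscape (once each)."""
        for layer in RUNE_LAYERS.get(rune, []):
            if layer not in self.active_layers:
                self.active_layers.append(layer)

scape = Soundscape()
scape.complete_rune("calm")
scape.complete_rune("strong")
print(scape.active_layers)  # -> ['waterfall', 'bonfire_crackle']
```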
Environmental Ambience
The game’s environment is as responsive as its music. As the player’s emotions shift, so does the ambient sound: a positive shift brings birdsong and sunny weather, while a negative shift introduces rain and wind. This continuous adaptation ensures that the player’s emotional state is always reflected in the game world, communicating important contextual information.
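Continuous adaptation of this kind typically needs some smoothing, so that the weather does not flicker with every momentary change in the signal. The sketch below illustrates one common approach (exponential smoothing feeding a simple two-bed crossfade); the smoothing factor and bed names are assumptions, and this is not the project’s actual implementation.

```python
# Illustrative ambience controller: smooth the incoming valence signal and derive
# crossfade gains for a "sunny" bed (birdsong) and a "stormy" bed (rain and wind).

class AmbienceController:
    def __init__(self, smoothing: float = 0.05):
        self.smoothing = smoothing  # 0..1, lower = slower weather changes
        self.valence = 0.0          # smoothed valence, assumed range [-1, 1]

    def update(self, raw_valence: float) -> dict:
        # Exponential smoothing avoids abrupt weather flips on noisy readings.
        self.valence += self.smoothing * (raw_valence - self.valence)
        sunny = (self.valence + 1.0) / 2.0
        return {"birdsong": sunny, "rain_wind": 1.0 - sunny}

controller = AmbienceController()
for reading in (0.2, 0.8, 0.9, -0.5):  # simulated valence readings
    gains = controller.update(reading)
print({name: round(gain, 2) for name, gain in gains.items()})
```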
Technical Implementation
EEG and Neural Scores
Lundheim uses an Interaxon Muse EEG headband to capture the player’s emotional state. This data is processed by bespoke software called Neural Scores, which interprets the EEG signals and plots the emotional state on Russell’s circumplex model. The system communicates these values to the game via the Open Sound Control (OSC) protocol, allowing the game environment and music to adapt in real time.
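To give a sense of what such an OSC link can look like, the sketch below uses the python-osc library to stream valence and arousal values to a game client. The address pattern, port, update rate, and value ranges are assumptions for illustration, not the documented Neural Scores interface.

```python
# Illustrative sender: streams valence/arousal estimates to a game over OSC.
# Requires `pip install python-osc`. The address "/affect" and port 9001 are
# assumptions for this sketch, not the project's actual configuration.

import random
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9001)

while True:
    # Stand-in for the EEG-derived circumplex estimate, both values in [-1, 1].
    valence = random.uniform(-1.0, 1.0)
    arousal = random.uniform(-1.0, 1.0)
    client.send_message("/affect", [valence, arousal])
    time.sleep(0.1)  # roughly 10 updates per second
```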
Game Engine and Assets
The game is built in the Unity game engine, combining graphics from third-party asset packs with original sounds. The audio is implemented using the game audio middleware Wwise, which provides a suite of tools for interactive sound design. Together, Unity and Wwise allow the affective techniques to be integrated seamlessly into the gameplay.
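On the receiving side, the game only needs to listen for those OSC messages and forward the values to its audio parameters. The sketch below shows a minimal Python receiver, again with python-osc, as a stand-in for the Unity/Wwise side; in the actual project the incoming values would drive Wwise game parameters such as those controlling the blend containers.

```python
# Minimal OSC receiver sketch (stand-in for the Unity/Wwise side of the project).
# Requires `pip install python-osc`. Address and port match the sender sketch above
# and are assumptions, not the project's documented configuration.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_affect(address, valence, arousal):
    # In the actual game, this is where the audio parameters (e.g. the valence and
    # arousal values driving the music blend and ambience) would be updated.
    print(f"{address}: valence={valence:.2f}, arousal={arousal:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/affect", on_affect)

server = BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher)
server.serve_forever()
```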