Auditory Shadows is an experimental sound exhibit that explores the concept of higher-dimensional auditory perception. Drawing inspiration from the way higher-dimensional objects cast shadows in lower dimensions, this project reimagines how sound might leave traces or ‘shadows’ in a multidimensional space.
Using spatialized audio techniques such as convolution, feedback delay networks, phased speaker arrays, and granular synthesis, the exhibit creates an immersive, surreal listening environment.
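To illustrate one of the techniques named above, here is a minimal sketch of a feedback delay network: four delay lines whose outputs are cross-mixed through an energy-preserving matrix and fed back, producing the dense, diffuse tails often used for surreal spatial textures. The delay lengths, feedback amount, and wet/dry mix below are hypothetical values for illustration, not parameters from the exhibit itself.

```python
def fdn(input_samples, delays=(149, 211, 263, 293), feedback=0.6, mix=0.5):
    """Minimal 4-line feedback delay network (illustrative parameters).

    Each delay line's output is mixed into the others through a scaled
    Hadamard matrix, which is orthonormal and therefore energy-preserving,
    so the feedback stays stable for |feedback| < 1.
    """
    n = len(delays)
    # One circular buffer per delay line
    buffers = [[0.0] * d for d in delays]
    indices = [0] * n
    h = 0.5  # 4x4 Hadamard matrix scaled by 1/2 to be orthonormal
    matrix = [[h, h, h, h],
              [h, -h, h, -h],
              [h, h, -h, -h],
              [h, -h, -h, h]]
    out = []
    for x in input_samples:
        # Read the current tap of each delay line
        reads = [buffers[i][indices[i]] for i in range(n)]
        # Cross-mix the delay outputs through the feedback matrix
        mixed = [sum(matrix[i][j] * reads[j] for j in range(n)) for i in range(n)]
        # Write input plus scaled feedback back into each line
        for i in range(n):
            buffers[i][indices[i]] = x + feedback * mixed[i]
            indices[i] = (indices[i] + 1) % delays[i]
        out.append((1 - mix) * x + mix * sum(reads) / n)
    return out
```

Feeding an impulse through this network yields an exponentially decaying, increasingly dense tail; co-prime delay lengths (as above) help avoid audible periodicity in that tail.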
Physical objects in the space subtly alter the sonic experience, reinforcing the theme of dimensional ‘shadows’ and auditory presence. The sound field is designed to feel surreal rather than natural, inviting audiences to engage with sound in an abstract and introspective way.
Beyond its technical exploration, Auditory Shadows also incorporates a deeply personal element, reflecting on grief and the idea of communication across unseen dimensions. Future iterations of the project may involve AI modeling to process and reinterpret the voice of a lost friend, weaving together themes of memory, absence, and presence.
Auditory Shadows is designed to be adaptable, adjusting to different venues and acoustic environments. The project is currently evolving through different phases of development, each expanding its creative and technical potential.
The first phase focuses on developing foundational sound material and spatialization techniques, exploring how sound interacts with space and perception.
The sound is designed to be performed or played back in stereo or quadraphonic (four-speaker) setups, using a combination of these techniques as the space and feasibility allow.
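For the quadraphonic case, source placement can be sketched with equal-power panning across a square of four speakers. The function below is a hypothetical illustration, not the exhibit's actual panning law: it maps a 2-D position to per-speaker gains whose squares sum to one, so perceived loudness stays roughly constant as a sound moves through the field.

```python
import math

def quad_pan(x, y):
    """Equal-power gains for a square 4-speaker layout: FL, FR, RL, RR.

    x, y are in [-1, 1]: x runs left (-1) to right (+1),
    y runs rear (-1) to front (+1).
    """
    # Map each axis onto a pan angle between 0 and pi/2
    ax = (x + 1) / 2 * math.pi / 2
    ay = (y + 1) / 2 * math.pi / 2
    left, right = math.cos(ax), math.sin(ax)
    rear, front = math.cos(ay), math.sin(ay)
    return {
        "front_left": front * left,
        "front_right": front * right,
        "rear_left": rear * left,
        "rear_right": rear * right,
    }
```

At the center (0, 0) every speaker receives an equal gain of 0.5; at a corner such as (1, 1) all energy goes to the nearest speaker. The same gains also give a sensible stereo fold-down by summing front and rear pairs.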
This phase serves as the foundation for future development, ensuring flexibility across different sound systems and performance setups.
The second phase involves scaling up to a multi-speaker array, allowing for more immersive and dynamically shifting sound environments. This stage will explore higher-resolution spatialization and how different speaker placements can influence perception.
The final phase envisions a more interactive exhibit, where participants can move through a space, influencing the sound field dynamically. This could involve motion tracking, directional sound projection, or spatial sensors, creating a more participatory experience.
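One simple way a tracked participant could influence the sound field is by mapping their position to per-speaker gains with inverse-distance weighting. The sketch below is a hypothetical mapping, assuming a 2-D tracking coordinate and known speaker positions; the exhibit's eventual interaction design may differ.

```python
import math

def proximity_gains(listener, speakers, rolloff=1.0):
    """Map a tracked listener position to normalized speaker gains.

    Each speaker's raw gain falls off with inverse distance from the
    listener; gains are then normalized so the overall level stays
    constant as the participant moves through the space.
    """
    raw = []
    for sx, sy in speakers:
        d = math.hypot(listener[0] - sx, listener[1] - sy)
        raw.append(1.0 / (1.0 + rolloff * d))
    total = sum(raw)
    return [g / total for g in raw]
```

Standing next to a speaker emphasizes it without silencing the others, which keeps the field continuous rather than switching abruptly between zones.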