MURMUR Collective Intelligence System
Services
Immersive Storytelling
AI Generative Design
Talks & Workshops
Location
London
Year
2025
Credits
Collaboration: Kexin Jin
Info
‘Elements’ was developed as part of Murmur in 2025, bringing together sound, visuals, and embodied interaction within an immersive installation-performance. The work explored the feeling of being by the sea, using sound as a grounding force. The soundscape blended natural textures with meditative rhythms, creating a space that encouraged stillness, reflection, and intuitive movement.

The visuals were inspired by black sand drifting in the wind, forming generative patterns that continuously shifted and dissolved. These forms behaved like a living system, responding to subtle changes in the environment and creating a sense of constant movement and flow. Together, sound and image created a space that felt fluid and atmospheric, inviting participants into a shared sensory experience.

SPECIFICATIONS
Interactive Installation | Audiovisual Performance | Body Tracking | Real-time Generative Visuals | Spatial Sound | AI Integration
ROLE
I developed the interactive audiovisual system, working across sound design, generative visuals, and real-time interaction.
The project was built in TouchDesigner for real-time visual generation and system integration, alongside spatial audio workflows that let sound move and evolve within the space.
Body tracking was used to capture audience movement, allowing the visuals and sound to respond dynamically to presence and gesture. This created a feedback loop between the participant and the system, where even subtle movement could shift the environment.
I also explored the use of AI-driven processes within the system to shape visual behaviour and introduce variation over time, allowing the installation to feel less fixed and more alive.
As audiences entered the space, they became part of the work. Movement emerged naturally, shaped by the interaction between body, sound, and visuals, dissolving the boundary between performer and participant.
Through this project, I explored how interactive technologies can create shared, embodied experiences, where systems respond not just to action, but to presence, rhythm, and atmosphere.
Real-time Visual System
Generative visuals developed in TouchDesigner, responding dynamically to sound and movement.
Sound + Spatial Design
A layered soundscape designed to move through space, shaping how participants experience the environment.
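The core of moving a sound through space is a panning law. As a minimal sketch of the idea (not the installation's actual spatial audio pipeline, which likely used dedicated multichannel tools), an equal-power stereo pan keeps perceived loudness roughly constant as a sound travels from one side of the space to the other:

```python
import math

def equal_power_pan(sample, pan):
    """Equal-power stereo panning.

    pan is in [-1.0, 1.0]: -1 = fully left, 0 = centre, 1 = fully right.
    Mapping pan onto a quarter circle keeps L^2 + R^2 constant, so the
    sound does not dip in loudness as it crosses the centre.
    """
    theta = (pan + 1.0) * math.pi / 4.0  # map [-1, 1] -> [0, pi/2]
    left = sample * math.cos(theta)
    right = sample * math.sin(theta)
    return left, right

# Sweeping pan over time moves a sound across the stereo field:
positions = [equal_power_pan(1.0, p / 10.0) for p in range(-10, 11)]
```

The same principle generalises to quad or ambisonic setups, where gains are distributed across more speakers instead of two.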
Body Tracking + Interaction
Audience movement captured and translated into visual and sonic changes, creating a feedback loop between body and system.
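The feedback loop described above boils down to turning tracked positions into a movement "energy" and easing a visual parameter toward it, so even subtle gestures register without the image jumping. A minimal sketch, with hypothetical names (`turbulence` as the driven parameter; the real system's parameters and tracker format may differ):

```python
import math

class MotionMapper:
    """Maps per-frame tracked joint positions to a smoothed control value."""

    def __init__(self, gain=5.0, alpha=0.15):
        self.prev = None          # joint positions from the previous frame
        self.turbulence = 0.0     # hypothetical visual parameter in [0, 1]
        self.gain = gain          # scales raw displacement into [0, 1]
        self.alpha = alpha        # smoothing factor: lower = slower response

    def update(self, joints):
        """joints: list of (x, y) positions from the tracker this frame."""
        if self.prev is None:
            self.prev = joints
            return self.turbulence
        # Total displacement since the last frame = rough movement energy.
        energy = sum(math.dist(a, b) for a, b in zip(self.prev, joints))
        self.prev = joints
        target = min(1.0, energy * self.gain)
        # Exponential smoothing: ease toward the target instead of jumping.
        self.turbulence += self.alpha * (target - self.turbulence)
        return self.turbulence
```

Because the smoothed value decays when the participant stands still, stillness itself becomes a legible input: the visuals settle, which closes the loop between body and system.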
AI Integration
AI-driven sound design processes in Somax AI introduced variation and unpredictability, allowing the installation to evolve over time.
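Somax works with far richer corpus analysis than can be shown here, but the flavour of its variation (output that locally resembles the learned material while remaining globally unpredictable) can be illustrated with a first-order transition model over a note sequence. This is a hypothetical stand-in, not Somax's actual algorithm or API:

```python
import random

def build_model(corpus):
    """First-order transition model: for each note, which notes followed it."""
    model = {}
    for a, b in zip(corpus, corpus[1:]):
        model.setdefault(a, []).append(b)
    return model

def vary(corpus, length, seed=None):
    """Generate a sequence that echoes the corpus but never repeats it exactly."""
    rng = random.Random(seed)
    model = build_model(corpus)
    note = rng.choice(corpus)
    out = [note]
    for _ in range(length - 1):
        # Follow a learned transition; fall back to the corpus at dead ends.
        note = rng.choice(model.get(note, corpus))
        out.append(note)
    return out

# A small MIDI-note corpus yields endless local variations on its contour:
phrase = vary([60, 62, 64, 62, 60, 57], length=12, seed=42)
```

Running such a generator continuously, and reseeding it from live input, is one simple way an installation can keep evolving rather than looping.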
