THE FIRST EVER AUDIOVISUAL INSTRUMENT!
(as far as I’m aware)
The Zenoid is a project in constant evolution, the apex of my search for an audiovisual instrument. It creates audio and video in parallel, allowing the performer to build an audiovisual composition in the same way that a jazz musician can play along with a drummer.
Zenoid @ Art in Flux 2018 from Mowgli on Vimeo.
Ever since I started VJing, I’ve strived to make a tool that creates audio and video simultaneously to deliver a unique synaesthetic output. My ambition has always been to create a versatile instrument that can be picked up to instantly produce a unique audiovisual composition.
Through ongoing experimentation, I’m fine-tuning the Zenoid, trying to achieve an effortless sensory tie between audio and video.
Despite there being plenty of sound-reactive video tools out there, I’ve always found them to be “noisy” the second there’s more than one sound element playing: the moment there’s more than one sound source, the sound reaction controlling the video no longer conveys the subjective, emotional human experience of the audio. While it’s easy to achieve audio/video correlation when a single kick drum controls a video parameter, getting the same sensory experience when the audio input is a complex composition is no mean feat; direct correlation of audio and video parameters doesn’t deliver an output that the brain unifies as one.
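As a toy illustration of the problem (a hypothetical sketch, not the Zenoid’s actual mapping): driving a visual parameter directly from each raw audio sample jitters with every sample, whereas an envelope follower that smooths the amplitude tracks something closer to perceived loudness. Even the smoothed version, though, only captures volume, not the emotional content of a mix.

```python
def envelope_follower(samples, smoothing=0.9):
    """Exponentially smoothed amplitude: a one-pole low-pass of |sample|.

    A raw mapping (level = |sample|) changes on every sample; the
    smoothed envelope rises and decays gradually, closer to how
    loudness is perceived. Hypothetical toy code, not the Zenoid's.
    """
    level = 0.0
    out = []
    for s in samples:
        level = smoothing * level + (1.0 - smoothing) * abs(s)
        out.append(level)
    return out
```

With `smoothing` near 1.0 the follower reacts slowly and smoothly; near 0.0 it approaches the jittery raw mapping.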
Using my experiences as a DJ and VJ, I’m gradually working towards defining a paradigm that will unify audio and video from the subjective, emotional, human experience. BPM, pitch, timbre and volume are not enough when it comes to creating emotional equivalences between auditory and visual stimuli.
My interest in audio synthesis provided the methodology from which to start exploring the audiovisual parallels; in the same way that a modular synthesiser starts with an oscillator, I started my visual synthesis with a visual wave: not a visual representation of the sound wave, but simply “a” wave. The sound source used in the Zenoid MK 1 was a sine wave, as it is the simplest, purest waveform, and I had the means to easily create a smooth wave visually.
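That starting point can be sketched as two independent seeds (a minimal, hypothetical illustration; the names and parameters below are mine, not the Zenoid’s code): an audio seed generated by a sine oscillator, and a visual seed that is a smooth wave of points rather than a plot of the audio signal.

```python
import math

SAMPLE_RATE = 44100  # audio samples per second

def sine_oscillator(freq_hz, n_samples, sample_rate=SAMPLE_RATE):
    """Audio seed: a pure sine wave, the simplest waveform."""
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n_samples)]

def visual_wave(amplitude, wavelength, n_points, phase=0.0):
    """Visual seed: (x, y) points of a smooth curve -- not a visual
    representation of the sound wave, just "a" wave of its own."""
    return [(x, amplitude * math.sin(2 * math.pi * x / wavelength + phase))
            for x in range(n_points)]

audio_seed = sine_oscillator(440.0, 1024)           # one audio buffer
visual_seed = visual_wave(100.0, 64.0, 256)         # one on-screen curve
```

The two seeds share a generating principle (a sinusoid) but have independent parameters, which is what allows them to be paired by experimentation rather than locked together.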
From the two initial audio and video seeds, parameters affecting the audio and visual waves have been paired through ongoing experimentation to create a whole in which human perception ties both stimuli together and interprets them as one.
While this is the starting point, this methodology alone is far from achieving a complex audio or video output. Again, the process used in audio synthesis serves as an analogy for creating complexity from a basic visual seed: after the seeds are created, they are routed through a chain of audio and visual effects that builds up the layers of complexity in the audio and video outputs.
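The routing described above can be sketched as simple function composition (a hedged illustration with toy effects; the real chain would be audio and visual DSP, which this post doesn’t show):

```python
def chain(*effects):
    """Compose effects left-to-right into a single processing chain,
    mirroring a modular synth's signal path: seed -> fx1 -> fx2 -> out."""
    def run(signal):
        for effect in effects:
            signal = effect(signal)
        return signal
    return run

# Two toy effects (hypothetical stand-ins for real audio/visual effects).
def gain(factor):
    return lambda sig: [s * factor for s in sig]

def clip(limit):
    return lambda sig: [max(-limit, min(limit, s)) for s in sig]

process = chain(gain(2.0), clip(1.0))
print(process([0.2, 0.6, -0.8]))  # -> [0.4, 1.0, -1.0]
```

Each effect only needs to accept and return a signal, so layers of complexity come from stacking more stages onto the chain rather than from any single effect.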
Presenting the Zenoid MK 1 to the public for the first time at VJ London. Thankfully I’m not the only one with a massive grin!
Zenoid³ is an installation combining the Zenoid with projection mapping. It exists in a number of variants: as a single cube, as three cubes (as seen in the video), as a self-running sound-reactive installation, and as an audiovisual performance installation.
Using projection-mapping techniques, the illusion of objects floating in space is achieved, creating a mesmerising experience that always surprises viewers.
The three-cube variant currently uses cubes of 40 cm per side; a single cube of 1.5 m per side is also available. Neither is suitable for outdoor use, but outdoor versions could easily be made.
Due to the nature of the installation, it can easily be scaled up to much bigger sizes. The limiting factor is the positioning of the projector, as it needs to be far enough away to cover the whole cube area.
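As a back-of-the-envelope check on that limiting factor (a hypothetical simplification using the standard throw-ratio formula, not any measured Zenoid setup): throw ratio is defined as distance divided by image width, so the minimum projector distance grows linearly with cube size.

```python
def min_projector_distance(cube_side_m, throw_ratio):
    """Minimum projector distance for a cube of a given size.

    Throw ratio = distance / image width. Simplifying assumption:
    the projected image only needs to span one cube face; mapping
    several visible faces from one projector needs a wider image
    and therefore more distance.
    """
    return cube_side_m * throw_ratio

# e.g. a standard-throw projector (throw ratio ~1.5) on the 1.5 m cube:
print(min_projector_distance(1.5, 1.5))  # -> 2.25 metres
```

Doubling the cube size with the same projector roughly doubles the distance required, which is why projector placement, not the cube itself, caps the scale.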
Do not hesitate to contact me if you’d like to enquire about Zenoid³.