Crossed Wires: Making a Fractal You Can Hear

Artificial Noodles

Inspired by Synesthesia on Wikipedia

Built with Three.js · ShaderMaterial · Web Audio API (AudioContext, AnalyserNode) · EffectComposer (UnrealBloomPass)

Techniques: GLSL Raymarching · IQ Cosine Palette · Chromatic Aberration · Bloom

Direction: Rebuild the crossed wires of infant synesthesia — a single system where mouse position simultaneously drives Mandelbulb fractal power and pentatonic chord voicing

Result: A real-time raymarched Mandelbulb that detonates into crystalline complexity when you move fast, while eight sine oscillators swell through a reverb-drenched pentatonic drone — the visual is the audio is the shape

The Story

Synesthesia is a neurological condition where one sense triggers another involuntarily. You hear a C-sharp and see blue. You taste a word. You feel a texture when you read a number. About 4% of people experience it. For the rest of us, senses stay politely in their own lanes.

The Wikipedia article describes something fascinating: all newborns are believed to be synesthetic. The neural pathways haven’t separated yet. Every sensation is every other sensation. Then, around three months old, the wires untangle and the world splits into distinct channels. Seeing becomes just seeing. Hearing becomes just hearing. The unified experience dissolves.

We wanted to rebuild the crossed wires. Not as a metaphor or a visualizer that reacts to audio, but as a single system where the visual parameters are the audio parameters. Move the mouse and you’re not controlling two things. You’re controlling one thing that happens to express itself as both shape and sound simultaneously.


The Take

The concept is a Mandelbulb fractal rendered in real time via GLSL raymarching, coupled to a Web Audio synthesizer. One input drives both outputs. There is no separation between the visual layer and the audio layer because they read from the same state.

The experience opens with a dark screen and a title: SYNESTHESIA.OS. The Mandelbulb is already visible but dormant, a smooth blob at low iteration power, gently breathing. Click to begin. Audio initializes, the overlay fades, and the fractal awakens.

Mouse position controls everything. Horizontal movement orbits the camera around the fractal and selects which notes from a pentatonic scale are active. Vertical movement adjusts the camera elevation and sweeps a low-pass filter across the audio spectrum. Movement speed controls a third parameter, activity, which drives the Mandelbulb's iteration power (from a resting 7.5 up to 12) along with bloom intensity, chromatic aberration, master volume, and individual oscillator amplitudes.

Stop moving and the system settles. The fractal smooths out, the bloom dims, the sound fades to near-silence. A gentle sine wave modulates the power parameter so the shape breathes even at rest. Start moving again and everything surges back.
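The settle-and-surge behavior can be sketched in plain JavaScript. The names here (`state.activity`, the decay constant, the speed scale) are assumptions for illustration; the numeric targets (power resting near 7.5, surging toward 12, master volume up to 0.12) come from the post:

```javascript
// Shared state: one activity value feeds both shape and sound.
const state = { activity: 0, lastX: 0, lastY: 0 };

// Per mousemove: raw cursor speed bumps activity upward, clamped to 1.
function onMouseMove(x, y) {
  const speed = Math.hypot(x - state.lastX, y - state.lastY);
  state.lastX = x;
  state.lastY = y;
  state.activity = Math.min(1, state.activity + speed * 0.01);
}

// Per frame: activity decays toward rest, and maps to both outputs.
function update(time) {
  state.activity *= 0.96;                              // exponential settle when the mouse stops
  const breathe = 0.3 * Math.sin(time * 0.5);          // rest-state oscillation, ~7.2..7.8
  const power = 7.5 + breathe + state.activity * 4.5;  // surges toward 12 when active
  const volume = state.activity * 0.12;                // master gain, near-silent at rest
  return { power, volume };
}
```

The same `activity` scalar is the whole coupling: it is written once per frame and read by both the shader uniforms and the audio gain targets.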


The Tech

The entire experience runs on a single fullscreen ShaderMaterial quad rendered through Three.js EffectComposer with bloom and chromatic aberration passes.

The Mandelbulb raymarcher is a GLSL fragment shader running 64 marching steps with 8 fractal iterations per distance estimate. The distance estimator uses the standard spherical-coordinate formulation: convert to spherical coordinates, raise the radius to the power uPower, multiply the angles by the power, convert back. The derivative dr tracks the analytical distance estimate via dr = pow(r, uPower - 1.0) * uPower * dr + 1.0. The surface distance threshold is 0.002, with the far plane at 10.0 units.
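That distance estimator, ported to JavaScript for illustration (the real one lives in the fragment shader; the constants, 8 iterations, a bailout of 2.0, and the dr recurrence, match the text). Valid for points off the origin:

```javascript
// Mandelbulb distance estimator: returns a conservative (under-)estimate
// of the distance from point (px, py, pz) to the fractal surface.
function mandelbulbDE(px, py, pz, power) {
  let [zx, zy, zz] = [px, py, pz];
  let dr = 1.0;
  let r = 0.0;
  for (let i = 0; i < 8; i++) {                // 8 fractal iterations
    r = Math.sqrt(zx * zx + zy * zy + zz * zz);
    if (r > 2.0) break;                        // bailout radius
    // Convert to spherical coordinates.
    const theta = Math.acos(zz / r);
    const phi = Math.atan2(zy, zx);
    // Analytic derivative: dr = pow(r, power - 1) * power * dr + 1
    dr = Math.pow(r, power - 1.0) * power * dr + 1.0;
    // Raise the radius to the power, multiply angles by the power, convert back.
    const zr = Math.pow(r, power);
    const t = theta * power;
    const p = phi * power;
    zx = zr * Math.sin(t) * Math.cos(p) + px;
    zy = zr * Math.sin(t) * Math.sin(p) + py;
    zz = zr * Math.cos(t) + pz;
  }
  return 0.5 * Math.log(r) * r / dr;           // distance underestimate
}
```

The raymarcher steps the ray forward by this value each iteration and stops when it drops below the 0.002 surface threshold or the ray passes the 10.0-unit far plane.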

The uPower uniform is the key parameter. At power 2, the Mandelbulb is a smooth, almost featureless sphere. At power 8, the classic cauliflower fractal emerges with infinite self-similar detail. The experience keeps this value in constant motion, breathing between 7.2 and 7.8 at rest, surging to 12 during fast mouse movement. The visual difference is dramatic: rest produces organic, rounded forms; activity explodes into crystalline complexity.

Orbit trap coloring drives the palette. During fractal iteration, the shader tracks min(length(z.xz)) as a trap value. This feeds into an IQ cosine palette, a + b * cos(2π * (c*t + d)), where the d vector is modulated by uColorOffset, itself derived from mouse Y position and a slow time oscillation. The result is a continuously shifting color scheme that responds to where you’re looking without ever repeating.
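The cosine palette is easy to sketch outside the shader. With a = b = 0.5 and c = 1 every channel stays in [0, 1]; the phase vector d below is IQ's classic rainbow parameterization, used here only as an illustrative value, not the project's actual uniforms:

```javascript
// IQ cosine palette: color(t) = a + b * cos(2π * (c*t + d)), per channel.
// a, b, c, d are 3-component vectors (RGB).
function palette(t, a, b, c, d) {
  return a.map((ai, i) =>
    ai + b[i] * Math.cos(2 * Math.PI * (c[i] * t + d[i]))
  );
}

// Example parameterization; the shader instead shifts d via uColorOffset.
const col = palette(
  0.5,
  [0.5, 0.5, 0.5], [0.5, 0.5, 0.5],
  [1.0, 1.0, 1.0], [0.0, 0.33, 0.67]
);
```

Shifting d rotates each channel's cosine phase independently, which is why modulating it with mouse Y rolls the whole scheme through the spectrum rather than just brightening or dimming it.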

The audio engine is eight sine oscillators tuned to C major pentatonic across two octaves (130.81 Hz to 659.26 Hz). All eight run continuously through individual gain nodes into a biquad low-pass filter, through a delay-feedback reverb (300 ms delay, 0.25 feedback), and into a master gain. Mouse X position selects active notes via a bell curve: max(0, 1 - abs(i - center) * 0.4). Mouse Y sweeps the filter cutoff from 200 Hz to 5000 Hz. Activity scales master volume from near-zero to 0.12. The result is a pentatonic drone that shifts pitch center and brightness as you explore, with a natural reverb tail that fills the silence when you stop.
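The note-selection math in plain JavaScript: the bell-curve gain max(0, 1 - |i - center| * 0.4) is quoted from the post, while the exact eight-note voicing is an assumption chosen to match the stated endpoints (130.81 Hz to 659.26 Hz, all degrees of C major pentatonic):

```javascript
const C3 = 130.81; // lowest oscillator, per the text

// Assumed voicing: C3 E3 G3 C4 E4 G4 C5 E5 (semitone offsets from C3).
const SEMITONES = [0, 4, 7, 12, 16, 19, 24, 28];
const freqs = SEMITONES.map(s => C3 * Math.pow(2, s / 12));

// mouseX in 0..1 picks a center note; neighbors fade out linearly,
// so roughly 2-3 notes on each side of center are audible at once.
function noteGains(mouseX) {
  const center = mouseX * (freqs.length - 1);
  return freqs.map((_, i) => Math.max(0, 1 - Math.abs(i - center) * 0.4));
}
```

In the real graph each returned gain would be ramped onto one oscillator's GainNode (e.g. via setTargetAtTime) rather than set instantly, to avoid clicks.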

The coupling is deliberate. Mouse X controls both camera azimuth and note selection. Mouse Y controls both camera elevation and filter cutoff. Activity controls both fractal power and audio volume. There’s no mapping layer, no “audio-reactive visualization.” The same floating-point values feed both systems. The visual is the audio is the shape.

Post-processing adds two passes: UnrealBloomPass (strength 0.6-1.4, responding to activity) and a custom chromatic aberration ShaderPass (intensity 0.001-0.007). The bloom makes the fractal glow when active. The aberration adds a subtle prismatic fringe at the edges during movement, reinforcing the synesthetic theme of sensory bleed.
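A sketch of how those ranges could map from the shared activity value; the function and field names are assumptions, but the endpoints (bloom 0.6 to 1.4, aberration 0.001 to 0.007) are from the text:

```javascript
// Linear interpolation between a resting and a peak value.
const lerp = (lo, hi, t) => lo + (hi - lo) * t;

// Both post passes read the same 0..1 activity value as the fractal and audio.
function postParams(activity) {
  return {
    bloomStrength: lerp(0.6, 1.4, activity),  // fed to UnrealBloomPass.strength
    aberration: lerp(0.001, 0.007, activity), // fed to the custom ShaderPass uniform
  };
}
```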


The Experience

You open the page. A dark screen. White monospaced text: SYNESTHESIA.OS. Behind it, something smooth and dark rotates slowly, barely visible. Click.

The overlay dissolves. The shape is there now, a dark bulb floating in void, gently pulsing like something breathing. Silence, or near enough. A faint hum sits below hearing.

You move the mouse. Everything changes at once. The bulb cracks open into impossible geometry, fractal tendrils spiraling out from every surface, and a chord rises from somewhere, warm, resonant, the notes shifting as your hand drifts left. The color shifts too. What was dark gunmetal blooms into hot pink and electric blue, the palette rolling through the spectrum as you climb the screen.

Move faster. The fractal detonates into crystalline spikes, the bloom flares white at the edges, chromatic fringe splits the silhouette into RGB layers, and the chord swells, all eight oscillators singing, the filter wide open, delay trails overlapping. It sounds like a glass cathedral ringing.

Stop. Everything settles. The spikes smooth into curves. The bloom retreats. The sound fades to a whisper, a single low note humming behind the shape’s slow breath. The color cools to deep indigo.

Move again. The shape responds before you’ve finished thinking about it. There’s no lag, no interpretation layer, no “the visual reacts to audio” pipeline. Your hand moves and the fractal and the sound change because they are the same thing expressed twice.

After a few minutes, you stop noticing the boundary between what you’re seeing and what you’re hearing. The pink sounds high. The blue sounds low. Movement sounds loud. Stillness is dark. The wires have crossed.

Experience: The Colour of Sound


This blog post was AI-generated with Claude Code. Authored by Artificial Noodles.