THE MIRROR'S ECHO
The Mirror's Echo operates as an unlimited edition under open-source principles (AGPL-3.0), ensuring accessibility while sustaining ongoing development. Exhibition licenses and institutional partnerships support the continued evolution of networked digital art practices.
About
The Mirror's Echo is a networked interactive work exploring presence, temporality, and reciprocity through real-time video transmission. The piece positions participants as both observers and subjects, creating a feedback loop where actions generate visual echoes that accumulate and transform over time.
Built on open protocols (WebRTC, LiveKit, NDI), the work operates as distributed infrastructure: each interaction exists simultaneously across multiple nodes (browser, server, and optional processing environments). This architecture enables the piece to function both as a self-contained web experience and as material for live visual manipulation through tools like TouchDesigner.
The work implements a temporal access model: seven minutes of unmediated experience, followed by progressive visual markers that reveal the piece's constructed nature. This durational framework questions assumptions about digital permanence and the economics of networked art.
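The durational framework above can be sketched as a small timing function. The seven-minute unmediated threshold comes from the work's description; the linear fade-in and three-minute ramp duration are illustrative assumptions, not the work's actual implementation.

```python
# Sketch of the temporal access model: seven minutes of unmediated
# experience, then visual markers that grow progressively stronger.
# The linear ramp and 3-minute fade window are illustrative assumptions.

UNMEDIATED_SECONDS = 7 * 60    # threshold stated in the work's description
MARKER_RAMP_SECONDS = 3 * 60   # assumed fade-in duration for the markers

def marker_opacity(elapsed_seconds: float) -> float:
    """Return marker opacity in [0, 1] for a given connection time."""
    if elapsed_seconds <= UNMEDIATED_SECONDS:
        return 0.0             # unmediated phase: no markers yet
    progress = (elapsed_seconds - UNMEDIATED_SECONDS) / MARKER_RAMP_SECONDS
    return min(progress, 1.0)  # clamp once the ramp completes
```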
Technical implementation merges web-native technologies with broadcast/VJ workflows, positioning the work within lineages of video feedback art (Steina & Woody Vasulka), telepresence projects (Kit Galloway & Sherrie Rabinowitz's Satellite Arts), and contemporary networked performance practices.
Processing Pipeline
Camera/Audio Input → WebRTC/LiveKit → NVIDIA GPU Processing
  ├─ Audio Stream → Whisper (Speech-to-Text) → spaCy NLP (Entity Recognition, Semantic Analysis)
  └─ Video → TouchDesigner (Visual Processing, Effects, Generative Response)
Composite Output → NDI → OBS → WebRTC Return Path → Remote Viewer (Browser/Display)
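The stages above can be sketched as a single pass over one captured audio chunk. The Whisper and spaCy calls are stubbed (a toy keyword lexicon stands in for real entity recognition), and all function names and the output parameters are illustrative assumptions, not the piece's actual code.

```python
# Illustrative pass through the processing pipeline. The transcription and
# NLP stages are stubs standing in for Whisper and spaCy; all names here
# are assumptions for illustration.

def transcribe(audio_chunk: bytes) -> str:
    """Stub standing in for Whisper speech-to-text."""
    return "a mirror remembers every visitor"

ENTITY_LEXICON = {"mirror": "OBJECT", "visitor": "PERSON"}  # toy lexicon

def extract_entities(text: str) -> list[tuple[str, str]]:
    """Stub standing in for spaCy NER: simple lexicon lookup."""
    return [(tok, ENTITY_LEXICON[tok]) for tok in text.split()
            if tok in ENTITY_LEXICON]

def visual_response(entities: list[tuple[str, str]]) -> dict:
    """Map recognized entities to parameters a TouchDesigner patch might read."""
    return {"echo_layers": len(entities),
            "labels": [label for _, label in entities]}

def process(audio_chunk: bytes) -> dict:
    """One pipeline pass: audio → text → entities → visual parameters."""
    return visual_response(extract_entities(transcribe(audio_chunk)))
```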
Technical Infrastructure
For production-quality streaming with OBS, NDI, and TouchDesigner integration:
Complete Bidirectional Workflow:
- Publisher Page: Remote user captures webcam and publishes to LiveKit
- NDI Viewer Page: View the LiveKit stream (capture this in OBS Browser Source)
- OBS: Capture NDI viewer page and enable NDI output (Tools → NDI Output Settings)
- TouchDesigner: Use NDI In TOP to receive stream from OBS, process & stylize
- TD Output: Use NDI Out TOP to send processed video back to OBS
- OBS Capture: Add NDI Source in OBS to receive processed video from TD
- Return Viewer Page: Remote user views processed video with fullscreen option
Signal Path: Remote Camera → LiveKit → OBS → NDI → TouchDesigner (process) → NDI → back to remote viewer in real time
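For the publisher page to join a LiveKit room, it needs an access token minted server-side. Real deployments should use the official LiveKit server SDK; the stdlib sketch below only illustrates the token's JWT shape (HS256 signature, room grant in the payload), and the key/identity/room values are placeholders.

```python
# Sketch of minting a LiveKit-style access token for the publisher page.
# Real deployments should use the official LiveKit server SDK; this stdlib
# version only illustrates the JWT shape (HS256, room grant in the payload).
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def mint_token(api_key: str, api_secret: str, identity: str, room: str) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {"iss": api_key, "sub": identity,
               "video": {"roomJoin": True, "room": room}}  # LiveKit-style grant
    signing_input = (b64url(json.dumps(header).encode())
                     + "." + b64url(json.dumps(payload).encode()))
    sig = hmac.new(api_secret.encode(), signing_input.encode(),
                   hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)
```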
Interaction
The work responds to presence through multiple modes of engagement. Each interaction generates visual echoes that accumulate temporally within the system.
Mirror Interface: Direct interaction with the circular interface generates ripple formations. Accumulated interactions trigger chromatic shifts every five engagements. The Clear function returns the system to its initial state.
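A minimal model of the interface state described above: each interaction adds a ripple, every fifth engagement triggers a chromatic shift, and Clear resets the system. The 72-degree hue step is an assumption for illustration; the description specifies only that a shift occurs every five engagements.

```python
# Minimal model of the mirror interface state: every interaction adds a
# ripple, every fifth interaction advances the hue, Clear resets everything.
# The 72-degree hue step is an illustrative assumption.

class MirrorState:
    HUE_STEP = 72  # degrees per chromatic shift (assumed)

    def __init__(self) -> None:
        self.interactions = 0  # accumulated ripple count
        self.hue = 0           # current chromatic offset in degrees

    def interact(self) -> None:
        """Register one engagement; shift hue every fifth one."""
        self.interactions += 1
        if self.interactions % 5 == 0:
            self.hue = (self.hue + self.HUE_STEP) % 360

    def clear(self) -> None:
        """Return the system to its initial state."""
        self.interactions = 0
        self.hue = 0
```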
Video Transmission: WebRTC protocols enable peer-to-peer video transmission. Local and remote streams establish reciprocal viewing positions. The work's temporal markers emerge after seven minutes of sustained connection, revealing the constructed nature of the experience.
Licensing & Access
Open access (AGPL-3.0):
- Full source code access
- Unlimited non-commercial use
- Seven-minute sustained experience without watermarking
- Community & documentation support
Exhibition license:
- Commercial exhibition rights
- Professional use licensing
- Optional technical residency support ($150/hour)
  - TouchDesigner/LiveKit integration consultation
  - Custom configuration assistance
Institutional partnership:
- Complete licensing
- Curatorial consultation
- On-site residency