Image Forest pathway with light shafts
02 / Overview

Presence, Time, Listening

An immersive, AI‑assisted installation and talk format for Foresta Maestra Music Festival — designed to turn ambient signals into a shared moment of sense‑making.

Challenge
How do you make “presence” tangible — without turning the forest into data spectacle? The experience had to stay calm, legible, and emotionally truthful, while still revealing invisible dynamics (attention, rhythm, time passing).
Solution
A restrained audiovisual system that listens first. AI is used for interpretation and narrative framing (not for noise): clustering, summarising, and translating signals into a minimal visual language and prompts for collective reflection.
Output
A festival-ready installation kit (spatial layout + interaction flow), a live “guided session” script, and a visual identity system for screens and documentation. Designed for repeatability across editions.
Image Festival entrance / arrival moment
Image Installation in situ (night / soft haze)
Video 12–20s atmospheric teaser loop
Next: concept, system design, and how it was produced (space, interaction, visuals, and documentation).
03 / Concept

Making relationship perceivable

Ombre nella valle is not about presenting data as information. It's about making a relationship perceivable: how forest conditions and human presence form a continuous, reciprocal system.

The output should feel like a new sense—an immediate translation of invisible exchanges into a legible experience, without requiring a legend.

Presence
The visitor is not an observer but a perturbation within the same substrate.
Time
Meaning emerges through accumulation, not instant reaction.
Listening
Signals are sensed as a field, not as isolated inputs.
AI Sense-Making
Not decoration, but a translation engine that turns signals into form.
Installation
The system is experienced physically: space, movement, atmosphere.
Image Macro forest texture — moss / bark / leaf litter
Image Network diagram — thin branching lines, abstract
Image Human silhouette as disturbance — heatmap presence
To make “relationship” the default reading, the project adopts an intuitively readable visual language: mycelial calligraphy.

Mycelial Calligraphy

A living drawing that writes the forest's relational field over time.

A mycelial network is the right metaphor because it's real—hidden connection, distributed exchange, permeability—and visually fertile. The output isn't a chart; it's a slow calligraphy that accumulates, remembers, and changes structure when the environment changes.

Presence
Branching, new nodes, reroutes — presence changes structure, not just glow.
Wind
Curvature bias, lean — biases future direction like a force field.
Humidity
Diffusion, ink bleed, soft halos — permeability, saturation.
Noise
Edge fray, micro-jitter — granular instability at boundaries.
Light
Revelation, layer visibility — what becomes perceivable, not just brightness.
Image Ink-on-paper network — calligraphic strokes
Image Humidity diffusion — soft bleed halos
Image Layer reveal / ghost strata
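The signal-to-form mappings above can be read as a single growth rule applied each frame. The Python below is an illustrative sketch only, with made-up weights and a reduced state: humidity and light act at render time (ink bleed, layer reveal), so they do not appear in this geometry step.

```python
import math
import random

def grow_step(tips, signals, rng):
    """Advance every active tip one segment; return the new tip list.

    tips: list of (x, y, heading) triples.
    signals: dict with presence, wind, noise levels in [0, 1].
    All weights here are hypothetical, for illustration.
    """
    new_tips = []
    for x, y, heading in tips:
        # Wind biases curvature: growth leans with the force field.
        heading += 0.4 * (signals["wind"] - 0.5)
        # Noise frays edges: micro-jitter at the boundary.
        heading += signals["noise"] * rng.uniform(-0.2, 0.2)
        nx, ny = x + math.cos(heading), y + math.sin(heading)
        new_tips.append((nx, ny, heading))
        # Presence changes structure: it can spawn a branch, not just glow.
        if rng.random() < 0.3 * signals["presence"]:
            new_tips.append((nx, ny, heading + rng.choice((-0.6, 0.6))))
    return new_tips
```

Note that the rule only ever adds segments and branches; nothing is erased, which is what lets the drawing accumulate and remember.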
Next: a live 2D simulation — an interactive signal instrument to observe the system in real time or take manual control.
04 / System Demo

Signal Instrument

Mycelial calligraphy: the system "writes" a network over time. Presence creates new nodes; wind biases curvature; humidity bleeds the ink; noise frays edges; light reveals hidden strata. Colors begin legible, then mix into a more entangled palette.


Manual override is an educational layer: change primary signals and read the system's response.

LISTENING · update 1.0 s · render ink/network
Manual override (default: off)
Drive primary signals directly; secondary signals remain derived to keep coherence. The preview runs on simulated signals.
Reading
The palette begins as a codified mapping (presence/cyan, wind/blue, humidity/green, noise/magenta, light/amber, temperature/orange). As signals co-occur, strokes become mixtures — legible, but increasingly entangled.
Relational output
Mycelial field
A slow drawing: existing structure persists; the system makes only growth decisions, never erasures.
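The palette rule described under "Reading" can be sketched as a signal-weighted mix: a lone signal keeps its codified colour, while co-occurring signals entangle. The RGB values below are illustrative stand-ins, not the installation's exact palette.

```python
# Hypothetical base colours for each signal (stand-ins, not the real palette).
PALETTE = {
    "presence": (0, 255, 255),     # cyan
    "wind": (0, 90, 255),          # blue
    "humidity": (0, 200, 90),      # green
    "noise": (255, 0, 255),        # magenta
    "light": (255, 190, 0),        # amber
    "temperature": (255, 120, 0),  # orange
}

def stroke_colour(signals):
    """Mix the palette by signal strength.

    signals: dict mapping signal name to a level in [0, 1].
    A single active signal stays legible; mixtures entangle.
    """
    total = sum(signals.values())
    if total == 0:
        return (0, 0, 0)  # no signal, no ink
    r = g = b = 0.0
    for name, level in signals.items():
        pr, pg, pb = PALETTE[name]
        r += level * pr
        g += level * pg
        b += level * pb
    return (round(r / total), round(g / total), round(b / total))
```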
05 / Systems

A listening pipeline

Signals are sensed, condensed into a coherent state over time, then rendered as image, sound, and traces.

1) Inputs
Light · temperature/humidity · wind/air movement · ambient sound/music energy · presence (density / dwell time / proximity).
2) Sense‑making
An AI layer (LLM + rules) aggregates signals in a time window, compressing noise into descriptors, shifts, and thresholds — a system state.
3) Rendering
That state drives multiple renderers: visual, audio, and trace. Not a single prompt — a continuous score.
Modular by design: the sensing layer can be lightweight or rich depending on site constraints.
Image Pipeline diagram — sensors → state → renderers
Image Field capture — sensor collage / ambient readings
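The middle stage of the pipeline can be sketched in miniature. This is a minimal, assumption-laden Python outline of sense-making: readings from a time window compressed into levels, shifts past a threshold, and a dominant descriptor. Field names are illustrative; in the project this interpretation is done by an LLM-plus-rules layer, not this function.

```python
from statistics import mean

def make_state(window, previous=None, shift_threshold=0.15):
    """Compress a window of readings into one system state.

    window: list of dicts, e.g. {"light": 0.8, "presence": 0.2}.
    previous: the prior state, used to detect shifts.
    shift_threshold: an illustrative cutoff for "something changed".
    """
    keys = window[0].keys()
    # Compress noise: average each signal over the window.
    levels = {k: mean(r[k] for r in window) for k in keys}
    # Keep only shifts large enough to matter.
    shifts = {}
    if previous:
        shifts = {
            k: round(levels[k] - previous["levels"][k], 3)
            for k in keys
            if abs(levels[k] - previous["levels"][k]) >= shift_threshold
        }
    # The dominant signal names the moment.
    dominant = max(levels, key=levels.get)
    return {"levels": levels, "shifts": shifts, "dominant": dominant}
```

Every renderer then reads this one state, which is what makes the outputs a continuous score rather than a series of one-off prompts.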
06 / Outputs

What the system produces

One state, expressed through three output families — plus two runnable previews.

Visual
A live visual field (slow evolution), spatial presentation (screen/projection), and an end‑of‑day synthesis: condensed still / loop / atlas page.
Sound
A second reading of the same state: discreet diffusion zones or headphone points — not “music for visuals”, but interpretation.
Diffuse / Personal
Shareable traces derived from the state: stills, short loops, daily postcards, post‑event compilation — the system leaves an archive.
Image Output atlas — end‑of‑day synthesis / “day fingerprint”
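The "one state, three output families" idea can be sketched as a simple fan-out: each registered renderer receives the same state object, so visual, sound, and trace outputs stay readings of the same moment. The renderer names and return values below are hypothetical.

```python
class OutputBus:
    """One system state, fanned out to every registered renderer."""

    def __init__(self):
        self.renderers = []

    def register(self, fn):
        self.renderers.append(fn)
        return fn  # returned unchanged so it works as a decorator

    def emit(self, state):
        # Every output family reads the same state: a reading, not a remix.
        return {fn.__name__: fn(state) for fn in self.renderers}

bus = OutputBus()

@bus.register
def visual(state):
    # e.g. drive the live visual field
    return f"field intensity {state['presence']:.1f}"

@bus.register
def trace(state):
    # e.g. decide whether a shareable daily postcard is produced
    return {"postcard": state["presence"] > 0.5}
```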
Preview Launch simulation
Live visual simulation (auto state).
Preview Launch simulation
System instrument (manual control).
Previews are illustrative: the full installation runs on on‑site signals and evolves across hours.