Turning plants into active participants in their own representation

Exhibited at Milan Design Week 2025, Symbiosis is an interactive installation where plants become digital artists. Sensors capture their subtle biological signals, and those rhythms are transformed into real-time visuals and evolving soundscapes.

Symbiosis - Interactive Plant Installation

Client

Master project

Industry

MedTech

Role

Creative Technologist

Team Setup

3 designers

Timeline

4 days

Goal

The goal of Symbiosis was to create an installation where plants are not represented by humans but represent themselves. By using sensors and data pipelines, we wanted to listen to the hidden signals of plants and let these signals directly drive both visuals and sound. The project was also an exploration of biofeedback in design: how living systems can become co-creators of digital media. Our aim was not only to produce an aesthetically engaging installation but also to spark conversations about sustainability, interaction, and the boundaries between natural and artificial systems.

Challenge

On the technical side, the project had to:

  • Run continuously in real time during a public exhibition.
  • Integrate hardware (sensors, Arduino), cloud storage (Supabase), and a frontend system (Raspberry Pi) into a stable pipeline.
  • Ensure that both visuals and sound evolved organically without feeling repetitive or mechanical.

On the design side, we needed to balance poetry and clarity: making the experience immersive while still showing a clear link between the plant's biological state and the digital representation.

Outcome

The final installation offered visitors a multi-sensory experience where plants became digital performers. Their moisture levels and subtle movements directly shaped both colour and sound, making their hidden rhythms perceivable in a way that felt organic and alive.

Symbiosis was exhibited at Milan Design Week 2025, where it attracted strong interest from visitors and designers. Many described the experience as “watching the plant breathe” or “listening to a plant's mood.” The project succeeded in making an abstract concept tangible and emotionally engaging.

Symbiosis Outcome - Milan Design Week Exhibition
1/ Discovery & Concept Development

Initial Exploration

The idea grew from a question: What if plants could represent themselves digitally? Rather than designing visuals about plants, we wanted their own biological data to become the creative material. This conceptual shift framed the plants not as subjects but as collaborators.

Choosing the Right Signals

After experimenting with different types of sensors, we focused on two inputs:

  • Soil moisture – a slow but steady indicator of a plant's well-being. It became the main driver of colour palettes and textures.
  • Movement (accelerometer) – a more dynamic signal, sensitive to subtle vibrations or external interactions. It shaped the distortion and rhythm of visuals and sounds.

This combination gave us both a baseline rhythm (moisture) and moments of surprise (movement), resulting in an installation that felt balanced and alive.
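The two-signal split above can be sketched as a pair of mapping functions. This is a minimal illustration, not the exhibition code: the function names, the 0–1023 sensor range, and the dead-zone threshold are all assumptions.

```javascript
// Soil moisture (assumed 0–1023 from a typical analog sensor)
// picks an index into a set of colour palettes.
function mapMoistureToPalette(raw, paletteCount) {
  const normalized = Math.min(Math.max(raw / 1023, 0), 1);
  return Math.min(Math.floor(normalized * paletteCount), paletteCount - 1);
}

// Accelerometer magnitude (assumed in g) drives a distortion amount,
// with a dead zone so the visuals stay calm when the plant is at rest.
function mapMovementToDistortion(magnitudeG, threshold = 0.05) {
  const excess = Math.max(magnitudeG - threshold, 0);
  return Math.min(excess * 4, 1); // clamp to [0, 1]
}
```

The dead zone is what produces the "baseline rhythm plus moments of surprise" balance: moisture changes slowly and continuously, while movement only intervenes when something actually happens.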

2/ Data Collection & Retrieval

Hardware Layer

Each plant was connected to an Arduino Nano ESP32, chosen for its compact form, Wi-Fi capability, and ability to handle real-time sensor input.

  • The soil moisture sensor tracked hydration, generating a range of values that could be mapped to visual and sonic parameters.
  • The accelerometer detected micro-movements, often caused by human interaction, air currents, or the plant's own subtle shifts.

Database Layer

A Supabase database functioned as the central hub. It received JSON-formatted data from the Arduino at regular intervals, ensuring that no signal was lost. The database created a reliable bridge between the raw sensor readings and the responsive frontend.
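An insert into Supabase's REST layer (PostgREST) could look like the sketch below. The project URL, anon key, `readings` table name, and column names are placeholders assumed for illustration, not the actual schema; on the device itself this role was played by the Arduino's Wi-Fi stack rather than `fetch`.

```javascript
const SUPABASE_URL = "https://YOUR-PROJECT.supabase.co"; // placeholder
const SUPABASE_ANON_KEY = "YOUR-ANON-KEY";               // placeholder

// Build a POST request that inserts one sensor reading as a row.
function buildInsertRequest(reading) {
  return {
    url: `${SUPABASE_URL}/rest/v1/readings`,
    options: {
      method: "POST",
      headers: {
        apikey: SUPABASE_ANON_KEY,
        Authorization: `Bearer ${SUPABASE_ANON_KEY}`,
        "Content-Type": "application/json",
        Prefer: "return=minimal", // don't echo the inserted row back
      },
      body: JSON.stringify(reading),
    },
  };
}

// Usage:
// const { url, options } = buildInsertRequest({ moisture: 412, accel: 0.03 });
// await fetch(url, options);
```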

Frontend Layer

A Raspberry Pi acted as the installation's display engine. It continuously queried Supabase for the most recent values and used them to update the visuals and sound in real time. This three-layer architecture (Arduino → Supabase → Raspberry Pi) ensured a smooth flow from biology to digital experience.
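The Pi-side polling can be sketched as below, assuming a `readings` table with a `created_at` timestamp column (both names are placeholders). The query-string syntax (`order=…`, `limit=1`) is standard PostgREST.

```javascript
const SUPABASE_URL = "https://YOUR-PROJECT.supabase.co"; // placeholder

// Ask PostgREST for the newest row only: sort descending, take one.
function buildLatestQuery(table) {
  return `${SUPABASE_URL}/rest/v1/${table}?select=*&order=created_at.desc&limit=1`;
}

// Poll at a fixed interval and hand each fresh reading to the display.
function startPolling(onReading, intervalMs = 1000) {
  return setInterval(async () => {
    const res = await fetch(buildLatestQuery("readings"), {
      headers: { apikey: "YOUR-ANON-KEY" }, // placeholder
    });
    const rows = await res.json();
    if (rows.length > 0) onReading(rows[0]);
  }, intervalMs);
}
```

Polling a cloud table (rather than talking to the Arduino directly) is what decouples the three layers: each one can be restarted independently during a long-running exhibition.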

3/ Visual Display

The Raspberry Pi ran a web-based interface built with HTML, CSS, and JavaScript.

  • A curated archive of visuals represented different states of moisture and movement. The system dynamically selected images based on incoming values.
  • To avoid abrupt changes, a fade-in/fade-out transition was applied, creating an organic sense of continuity.
  • Movement data introduced distortions and layering effects, making the images feel fluid and responsive.

The result was a display that never played out exactly the same way twice. Visitors could watch as the plant's current state shaped an evolving digital portrait.
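The selection-plus-crossfade idea can be sketched like this. The moisture bands, filenames, and element ids are illustrative assumptions, not the curated archive used in Milan; the crossfade function is browser-only.

```javascript
// Illustrative archive: each image covers a band of moisture values.
const archive = [
  { maxMoisture: 200, src: "dry.png" },
  { maxMoisture: 600, src: "stable.png" },
  { maxMoisture: 1023, src: "hydrated.png" },
];

// Pick the first image whose moisture band contains the reading.
function selectImage(reading) {
  return (
    archive.find((img) => reading <= img.maxMoisture) ??
    archive[archive.length - 1]
  );
}

// Crossfade between two stacked <img> layers by animating opacity,
// so state changes never feel abrupt (element ids are assumed).
function crossfadeTo(src) {
  const front = document.getElementById("layer-front");
  const back = document.getElementById("layer-back");
  back.src = src;
  front.style.transition = back.style.transition = "opacity 2s ease";
  front.style.opacity = "0";
  back.style.opacity = "1";
}
```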

4/ Audio Display

Sound was generated in real time using the Web Audio API. Instead of looping pre-recorded tracks, the system synthesised audio on the spot.

  • Moisture values shifted pitch and tonal layers, creating a “mood” that reflected hydration.
  • Movement data modulated rhythm, depth, and texture — sudden vibrations could trigger more active sonic patterns.
  • Effects such as oscillators, filters, reverb, and delay gave richness and spatial depth to the soundscape.

This approach ensured the audio was not just background ambience but a living composition tied directly to the plant's behaviour.
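The synthesis idea can be sketched as a pure parameter mapping plus a small Web Audio graph. The pitch range, filter settings, and delay time are assumptions for illustration; the graph-building function runs only in a browser, where `ctx` is an `AudioContext`.

```javascript
// Map moisture (assumed 0–1023) onto a pitch range; drier soil sounds lower.
function moistureToFrequency(raw, minHz = 110, maxHz = 440) {
  const n = Math.min(Math.max(raw / 1023, 0), 1);
  return minHz + n * (maxHz - minHz);
}

// Build oscillator → lowpass filter → delay → output.
function buildVoice(ctx, moisture, movement) {
  const osc = ctx.createOscillator();
  osc.type = "sine";
  osc.frequency.value = moistureToFrequency(moisture);

  const filter = ctx.createBiquadFilter();
  filter.type = "lowpass";
  // More movement opens the filter, making the texture more active.
  filter.frequency.value = 300 + movement * 4000;

  const delay = ctx.createDelay();
  delay.delayTime.value = 0.3; // a touch of spatial depth

  osc.connect(filter).connect(delay).connect(ctx.destination);
  osc.start();
  return osc;
}
```

Because everything is synthesised from the current sensor values rather than looped, the soundscape drifts with the plant instead of cycling.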

Symbiosis Project Gallery - Image 1
Symbiosis Project Gallery - Image 2
Symbiosis Project Gallery - Image 3
Symbiosis Project Gallery - Image 4
Symbiosis Project Gallery - Image 5