Exhibited at Milan Design Week 2025, Symbiosis is an interactive installation where plants become digital artists. Sensors capture their subtle biological signals, and these rhythms are transformed into real-time visuals and evolving soundscapes.
Type: Master project
Field: MedTech
Role: Creative Technologist
Team: 3 designers
Duration: 4 days
The final installation offered visitors a multi-sensory experience where plants became digital performers. Their moisture levels and subtle movements directly shaped both colour and sound, making their hidden rhythms perceivable in a way that felt organic and alive.
Symbiosis was exhibited at Milan Design Week 2025, where it attracted strong interest from visitors and designers. Many described the experience as “watching the plant breathe” or “listening to a plant's mood.” The project succeeded in making an abstract concept tangible and emotionally engaging.
The idea grew from a question: What if plants could represent themselves digitally? Rather than designing visuals about plants, we wanted their own biological data to become the creative material. This conceptual shift framed the plants not as subjects but as collaborators.
After experimenting with different types of sensors, we focused on two inputs:

- Soil moisture, a slow and steady signal
- Movement, sudden shifts and subtle sways

This combination gave us both a baseline rhythm (moisture) and moments of surprise (movement), resulting in an installation that felt balanced and alive.
Each plant was connected to an Arduino Nano ESP32, chosen for its compact form, Wi-Fi capability, and ability to handle real-time sensor input.
A Supabase database functioned as the central hub. It received JSON-formatted data from the Arduino at regular intervals, ensuring that no signal was lost. The database created a reliable bridge between the raw sensor readings and the responsive frontend.
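As a sketch of that bridge: below is the kind of JSON row the Arduino sent, expressed with Supabase's TypeScript client for readability. The table name `plant_readings`, its columns, and the normalised value ranges are assumptions for illustration, not the installation's actual schema.

```typescript
import { createClient } from "@supabase/supabase-js";

// Hypothetical shape of one sensor reading as stored in Supabase.
interface PlantReading {
  plant_id: string;  // which plant the reading belongs to
  moisture: number;  // soil moisture, normalised to 0–1 (assumed range)
  movement: number;  // movement intensity, normalised to 0–1 (assumed range)
}

const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR-ANON-KEY");

// The Arduino posted JSON at regular intervals; this is the equivalent
// insert written with the JavaScript/TypeScript client.
async function storeReading(reading: PlantReading): Promise<void> {
  const { error } = await supabase.from("plant_readings").insert(reading);
  if (error) console.error("Failed to store reading:", error.message);
}
```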
A Raspberry Pi acted as the installation's display engine. It continuously queried Supabase for the most recent values and used them to update the visuals and sound in real time. This three-layer architecture (Arduino → Supabase → Raspberry Pi) ensured a smooth flow from biology to digital experience.
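A minimal sketch of that polling loop, reusing the hypothetical `plant_readings` table from above; the one-second refresh rate is illustrative, and `created_at` is assumed to be a timestamp column the database fills in.

```typescript
import { createClient } from "@supabase/supabase-js";

const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR-ANON-KEY");

// Fetch the single most recent reading for the display to render.
async function fetchLatestReading() {
  const { data, error } = await supabase
    .from("plant_readings")
    .select("moisture, movement, created_at")
    .order("created_at", { ascending: false })
    .limit(1)
    .single();
  if (error) throw new Error(error.message);
  return data;
}

// Poll on a fixed interval and hand the values to visuals and sound.
setInterval(async () => {
  const reading = await fetchLatestReading();
  console.log("latest reading:", reading); // replace with render/synth calls
}, 1000);
```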
The Raspberry Pi ran a web-based interface built with HTML, CSS, and JavaScript.
The result was a display that never repeated exactly the same way twice. Visitors could watch as the plant's current state shaped an evolving digital portrait.
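To make the mapping concrete, here is a small sketch of the idea: moisture picks the hue, movement perturbs position and size, and a translucent clear leaves fading trails so no two frames repeat. The specific ranges and canvas setup are illustrative, not the installation's actual code.

```typescript
interface Reading { moisture: number; movement: number } // both assumed 0–1

const canvas = document.querySelector("canvas")!;
const ctx = canvas.getContext("2d")!;

function draw(reading: Reading): void {
  // Moisture sets the hue: dry plants render warm, well-watered ones cool.
  const hue = 30 + reading.moisture * 190;
  // Movement shakes the form, so an active plant looks restless.
  const jitter = reading.movement * 40;
  const x = canvas.width / 2 + (Math.random() - 0.5) * jitter;
  const y = canvas.height / 2 + (Math.random() - 0.5) * jitter;
  const radius = 60 + reading.movement * 60;

  // A translucent wash instead of a hard clear leaves fading trails,
  // which keeps the portrait from ever repeating exactly.
  ctx.fillStyle = "rgba(0, 0, 0, 0.05)";
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  ctx.beginPath();
  ctx.arc(x, y, radius, 0, Math.PI * 2);
  ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;
  ctx.fill();
}
```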
Sound was generated in real time using the Web Audio API. Instead of looping pre-recorded tracks, the system synthesised audio on the spot.
This approach ensured the audio was not just background ambience but a living composition tied directly to the plant's behaviour.
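A simplified sketch of that kind of synthesis: a single oscillator whose pitch follows moisture while movement opens the volume. The parameter ranges are assumptions; the installation's actual patch was richer than this.

```typescript
const audioCtx = new AudioContext(); // browsers require a user gesture before audio plays

const osc = audioCtx.createOscillator();
const gain = audioCtx.createGain();
osc.type = "sine";
osc.connect(gain);
gain.connect(audioCtx.destination);
gain.gain.value = 0; // start silent
osc.start();

// Called whenever a new reading arrives; values assumed normalised 0–1.
function updateSound(moisture: number, movement: number): void {
  const now = audioCtx.currentTime;
  // Moisture sets pitch in a gentle 110–440 Hz range; setTargetAtTime
  // glides toward the target instead of jumping, so changes feel organic.
  osc.frequency.setTargetAtTime(110 + moisture * 330, now, 0.3);
  // Movement controls loudness; a still plant fades toward silence.
  gain.gain.setTargetAtTime(movement * 0.4, now, 0.3);
}
```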