The Unfolding Present – PlantWave as a Platform

We finally delivered the hardware, but the hardware was just the vessel. The true revolution was happening inside the software.

When we first released the MIDI Sprout app for iPhone in 2017, it was a simple translation tool with a single sound set. It proved that people wanted to listen to plants without complex synthesizers, but I knew the experience could be so much richer. We began building a robust sonification platform—a digital signal processing (DSP) engine that would turn the raw data of nature into a symphony.
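The core of that early translation is simple to sketch. What follows is an illustrative toy, not Data Garden's actual code: sample a plant's conductance, and when the reading changes enough to matter, emit a MIDI note chosen from a pentatonic scale (the scale, thresholds, and ranges here are all my assumptions for the example).

```python
# Toy sonification sketch: normalized conductance readings -> MIDI notes.
# All constants (scale, range, threshold) are illustrative assumptions.

PENTATONIC = [0, 2, 4, 7, 9]  # C major pentatonic intervals

def reading_to_note(reading, lo=0.0, hi=1.0, base=48, octaves=3):
    """Map a normalized conductance reading to a MIDI note number."""
    span = len(PENTATONIC) * octaves
    norm = max(0.0, min(1.0, (reading - lo) / (hi - lo)))
    step = min(int(norm * span), span - 1)
    octave, degree = divmod(step, len(PENTATONIC))
    return base + 12 * octave + PENTATONIC[degree]

def sonify(samples, threshold=0.05):
    """Emit a note only when the signal moves past the change threshold."""
    notes, last = [], None
    for s in samples:
        if last is None or abs(s - last) >= threshold:
            notes.append(reading_to_note(s))
            last = s
    return notes
```

Quantizing to a musical scale is what makes the output listenable rather than merely audible; the threshold keeps a slowly drifting signal from spamming notes.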

The Evolving Sound Engine

The PlantWave app today is the realization of a design I had been dreaming of for years. We didn’t just want to trigger notes; we wanted to map the plant’s subtle shifts in activity to the texture of the sound itself.

We started by allowing the plant’s wave to control basic elements like instrument selection and arpeggiation rates. But we kept going deeper. We built a synthesis engine where the plant’s activity level—the "delta" or rate of change—could modulate the attack, decay, sustain, and release (ADSR) of a sound. We added polyphony and glide, and allowed the data to control the wet/dry mix of reverb and delay.
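The modulation idea described above can be sketched as follows. This is a hedged illustration, not the PlantWave engine: the plant's rate of change ("delta") is smoothed, normalized to an activity level, and mapped onto envelope times and a reverb wet mix (the smoothing constant, ranges, and mapping curves are assumptions for the example).

```python
# Illustrative sketch: plant "delta" (rate of change) modulating ADSR
# and reverb wet mix. All constants are assumptions, not product values.

def smooth_delta(samples, alpha=0.3):
    """Exponential moving average of the absolute sample-to-sample change."""
    delta, prev = 0.0, samples[0]
    for s in samples[1:]:
        delta = alpha * abs(s - prev) + (1 - alpha) * delta
        prev = s
    return delta

def modulate(delta, max_delta=0.2):
    """Map normalized activity to ADSR times (seconds) and wet mix (0..1)."""
    act = max(0.0, min(1.0, delta / max_delta))
    return {
        # calm plant -> slow, pad-like envelope; active plant -> percussive
        "attack": 2.0 - 1.9 * act,
        "decay": 1.0 - 0.8 * act,
        "sustain": 0.8 - 0.5 * act,
        "release": 3.0 - 2.5 * act,
        # a quiet plant sits in a wash of reverb; activity dries the mix
        "wet": 0.9 - 0.7 * act,
    }
```

The point of the sketch is the architecture: the same scalar drives many timbral parameters at once, so the sound's texture, not just its melody, tracks the plant's activity.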

This means the plant isn't just playing a melody; it is sculpting the timbre and atmosphere of the music in real time. Today, I've designed about 40 different soundset configurations that run on this engine, allowing the device to be essentially plug-and-listen while retaining the deep, high-resolution responsiveness of that original "aha" moment in the studio.

The Growing Community

As the technology matured, so did the community. We moved far beyond our initial circle of "MIDI nerds" and electronic musicians. By 2022, we were shipping consistent units, and today, there are over 25,000 PlantWaves out in the wild.

This isn't just a customer base; it’s a global network of people tuning into nature. I see people using PlantWave for meditation, for creative inspiration, or simply as a daily ritual to connect with their houseplants. It validates what we suspected back in the museum days: humans have a deep, innate desire to connect with the subtle rhythms of the living world, and technology, when designed with intention, can be the bridge rather than the barrier.

The Vision for Broader Biosonification

While PlantWave is focused on plants, the platform we've built is capable of much more. In 2019, we worked with the HeartMath Institute on a pilot project to build a sonification engine for their heart meditation app. That project hinted at where we are going next.

My calling has always been to use technology to extend human perception into subtle realms of reality. I believe humans have sensory abilities that we have largely abandoned in our technological evolution, and that by making the invisible visible (or audible), we can reactivate these capacities.

We are now at a point where we are ready to deploy our software to sonify other data sources. I envision a future where humans have a real-time soundtrack to their lives generated from wearable data—music that is responsive to mood, tailored to taste, and optimized for our well-being.

Whether it is the galvanic response of a philodendron or the heart rate variability of a human, the goal remains the same: to help us experience the "implicate order"—the deep, interconnected field of life that surrounds us. We started by listening to plants, but we are ultimately learning to listen to life itself.