The Joshua Tree and Infrared Light: A Sensor-driven Sound Performance

I create artwork that captures live data from natural phenomena invisible to humans and translates it into soundscapes, a practice many call data sonification. I’ve worked with water quality, air quality, electrical activity in mycelium and much more. The artworks aren’t just soundtracks. They take the form of sculptural installations or even performances that others can experience viscerally.

This spring, I was invited to be an artist-in-residence at Joshua Tree National Park, which is my favorite of all the National Parks. I love this landscape so much, and feel so alive and present in the high desert.

I’ve created many site-specific sound installations in different parts of the world, including Abu Dhabi, Thailand, Slovenia and various ecosystems in the United States. Here, I knew I had to focus on the Joshua Tree itself. It’s iconic, and it’s a keystone species that hosts a variety of other organisms. Without the Joshua Tree, this desert ecosystem would be drastically different.

The challenge was that, because this was a National Park, I couldn’t damage the flora in any way, so driving nails into the plant (the Joshua Tree is a yucca), or even attaching any sort of sensor to it, wasn’t going to happen.

In the weeks before the residency, after some research and brainstorming, I designed experiments in which I would hold spectral sensors that track visible light near the Joshua Tree. These worked surprisingly well, and I could integrate them into my wireless custom hardware + software system. Then I found similar sensors that capture data from wavelengths of near-infrared light: the light just outside the visible spectrum, or at least the spectrum visible to humans.

I did research on how the Joshua Tree might react. Maybe there would be some differences in the data, maybe not. I based my experiments on articles such as this one by NASA and this one by USGS, which suggest that a high percentage of IR light is reflected (not emitted) by the leaves of healthy plants, and that the chlorophyll itself is responsible.

On my first day at the park, I did some data logging from a nearby Joshua Tree.

The wavelengths of light captured here are 730, 760, 810 and 860 nm.

The first one is of the “barky part” — the dead brown leaves of the plant.

For the scientists in the audience, the labels on the graph correspond to my sensor hardware transmission code:

988_T = 730nm

988_U = 760nm

988_V = 810nm

988_W = 860nm
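If you want to work with readings in this format yourself, here is a minimal Python sketch of the decoding step. The "LABEL:value" line format and the parse_reading helper are hypothetical stand-ins for illustration; only the label-to-wavelength mapping above comes from my hardware.

```python
# Sketch only: decode labeled channel readings like "988_T:41.2" into
# (wavelength_nm, value) pairs. The line format is a hypothetical example;
# the label-to-wavelength mapping matches the table above.
CHANNEL_TO_NM = {
    "988_T": 730,
    "988_U": 760,
    "988_V": 810,
    "988_W": 860,
}

def parse_reading(line):
    """Parse one 'LABEL:value' line; return (wavelength_nm, value) or None."""
    label, _, raw = line.strip().partition(":")
    if label in CHANNEL_TO_NM and raw:
        return CHANNEL_TO_NM[label], float(raw)
    return None

print(parse_reading("988_T:41.2"))  # -> (730, 41.2)
```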

The second one is of the live green leaves.

When I saw this, I was floored. The low readings were from the base of the tree and the high readings were from the leafy parts. This was exactly what I had thought it could do, but the difference was stark.

What’s amazing is that we can think of these sensor readings roughly as an indicator of plant health. Yes, I would expect this to work on other plants and trees, but I haven’t yet tested the project on them. That’s for another day.
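For context, remote sensing usually turns this “healthy leaves reflect more near-infrared” idea into a vegetation index such as NDVI, which compares a red-band reading against a near-infrared reading. My performance system doesn’t compute this, but as a rough sketch, assuming you also had a red reading from the visible-light sensor alongside one of the NIR channels:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values toward 1.0 suggest healthy, leafy vegetation; values near zero
    or below suggest bare or dead material."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

# Illustrative numbers only, not my logged data:
print(ndvi(nir=0.55, red=0.08))  # green leaves -> high index
print(ndvi(nir=0.20, red=0.15))  # dead "barky" leaves -> low index
```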

This was one of my early experiments. Forgive the sound quality!

Further research turned up this National Library of Medicine article, which indicates that the high near-infrared reflectance is due to “high scattering of light by the leaf mesophyll tissues”, not the chlorophyll (as I stated in the video). I find the scientific source material fascinating, and I want my sensor-driven soundscapes to be based in true signal, not just noise.

Remember that color isn’t real. Color is data that we receive and construct in our brains. What is real are photons: small particles of electromagnetic energy, each with a unique wavelength. The near-infrared spectrum is a stream of data that we cannot perceive, but it is out there, and other organisms, usually non-mammalian, can sense it. That’s how mosquitos find tasty meals. Frogs and salmon use the IR spectrum to navigate through murky waters. Vampire bats use infrared vision to locate prey.

I spent my time during this short residency, which was less than a month, building a stable electronics system and, mostly, working on a soundscape that I felt would express the Joshua Trees in the park on May 4th, 2024. I decided to “play” a few different Joshua Trees like a theremin in a performance, using a “sloth glove” that hosts the sensor on the palm. More on the sloth below…

(Photo by NPS/ Paul Martinez)

For the sound design, I created four different “instruments” that I could activate with a handheld controller with latching buttons. I could mute or unmute different tracks, creating a dynamic performance in which I could move around without touching the computer. Each instrument corresponded to one of the four wavelengths of light.
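To give a sense of how the routing worked, here is a rough Python sketch (not my actual patch) of four sensor channels feeding four instruments over OSC, with latching buttons muting and unmuting each track. The OSC addresses, the port, and the way button states and sensor values arrive are assumptions for illustration; the python-osc library handles the messaging.

```python
# Sketch only: route four NIR channels to four sound "instruments" over OSC.
# Addresses, port, and the input dictionaries are illustrative stand-ins.
from pythonosc.udp_client import SimpleUDPClient

WAVELENGTHS = [730, 760, 810, 860]            # nm, one per instrument
client = SimpleUDPClient("127.0.0.1", 57120)  # e.g. a sound engine listening for OSC

def update(sensor_values, button_states):
    """sensor_values: {wavelength_nm: reading}; button_states: four latching on/off flags."""
    for i, nm in enumerate(WAVELENGTHS):
        # A latching button toggles its track on or off.
        client.send_message(f"/instrument/{i}/mute", 0 if button_states[i] else 1)
        if button_states[i]:
            # Map the raw reflectance reading to a control value for that track.
            client.send_message(f"/instrument/{i}/level", float(sensor_values[nm]))
```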

Here are some of the live-data sound recordings from each of the tracks:

Joshua Tree National Park, 2024, High Notes

Joshua Tree National Park, 2024, Like a Theremin

Joshua Tree National Park, 2024, Water Guitar

Joshua Tree National Park, 2024, Electric Piano

About the sloth

The Shasta ground sloth (https://en.wikipedia.org/wiki/Nothrotheriops), now extinct, used to eat the seeds of the Joshua Tree and poop them out, spreading the plant over wide distances in a symbiotic relationship and dispersing the seeds over a much wider range than they can travel today.

As climate change alters the ecology of the desert environment where the Joshua Tree lives, the plant is now under environmental distress. Since it is a keystone species that hosts many organisms and is essential to its desert ecosystem, its health is all the more important to this environment.

I see this artwork as a performance in which I commune with the tree, reading its health and generating compositions from it.


And here is the final documentation video for the project.