Category: Performance

Collecting Sacred Fluids

I recently debuted a new art installation called Cybernetic Spirits at the L.A.S.T. Festival. This is an interactive electronic artwork where participants generate sonic arrangements based on various sacred fluids. These include both historical liquids-of-worship such as holy water, blood and breast milk, and more contemporary ones such as gasoline and coconut water.

My proposal got accepted. Next, I had to actually collect these fluids.

My original list included: blood, holy water, coffee, gasoline, adrenaline, breast milk, corn syrup, wine, Coca-Cola, coconut water, vaccine (measles), sweat and kombucha.

Some of these were easily procured at the local convenience store or with a trip to the gas pump. No problem.

But what about the others? I found holy water on Amazon, which didn’t surprise me, but then again this wasn’t anything I had ever thought about before.

I knew the medical ones would be the hardest: adrenaline and a measles vaccine. After hours scouring the internet and emailing with a doctor friend of mine, I realized I had to abandon these two. They were either prohibitively expensive or would require deceptive techniques that I wasn’t willing to try.

Art is a bag of failures and I expected not to be entirely successful. Corn syrup surprised me, however. After my online shipment arrived, I discovered it was sticky and too thick. It is syrup, after all. Right. My electrical probes got gunky and, more to the point, it didn't conduct any electrical current. No current = no sound.

Meanwhile, I put out feelers for the human bodily fluids: blood, sweat and breast milk. Although it was easy to find animal blood, what I really wanted was human blood (mine). I connected with a friend of a friend, who is a licensed nurse and supporter of the arts. After many emails, we arranged an in-home blood draw. I thought I’d be squeamish about watching my blood go into several vials (I needed 50ml for the installation), but instead was fascinated by the process. We used anti-coagulant to make it less clotty, but it still separated into a viscous section at the bottom.

Since I am unable to produce breast milk, I cautiously inquired with some good friends who are recent moms and found someone willing to help. So grateful! She supplied me with one baby-serving size of breast milk just a couple of days before the exhibition, so that it would preserve better. At this point, along with the human blood in the fridge, I was thankful that I live alone and didn’t have to explain what was going on to skeptical housemates.

I saved the sweat for the last minute, thinking that there was some easy way I could get sweaty in an exercise class and extract some. Once again a friend helped me, or at least tried to, by going to an indoor cycling class and sweating into a cotton t-shirt. However, wringing it out produced maybe a drop or two of sweat, nowhere close to the required 50ml for the vials.

I was sweating over the sweat and really wanted it. I made more inquiries. One colleague suggested tears. Of course, blood, sweat and tears, though admittedly I felt like I was treading into Kiki Smith territory at this point.

So, I did a calculation on the amount of tears you would need to collect 50ml, and this would mean crying a river every day for about 8 months. Not enough time and not enough sadness.
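
(Back-of-envelope, assuming a figure I am making up here: if a good cry yields something like 0.2 ml of collectable tears, then 50 ml ÷ 0.2 ml ≈ 250 crying sessions, which is roughly eight months of daily weeping.)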

Finally, just before shooting the documentation for the installation, the sweat came through. A friend's father works for a company that produces artificial sweat and gave me 5 gallons of this mixture. It was a heavy thing to carry on BART, but I made it home without any spillage.

Artificial sweat? It seems gross and weird. The truth is a lot more sensible. A lot of companies need to test the effects of human sweat on products, from wearable devices to steering wheels, and it's more efficient to make synthetic sweat than to work with actual humans. Economics carves odd channels.

My artwork often takes me on odd paths of inquiry and this was no exception. Now, I just have to figure out what to do with all the sweat I have stored in my fridge. 


Reality Remix: Salon 1

I just returned from our first Reality Remix workshop in Dundee, Scotland. The prompt we gave ourselves afterwards was to write up the things that came up for us, our recurring thoughts, and what comes next. I am writing this on the train journey back to London.

The background is that Reality Remix is part of an Arts & Humanities Research Council (AHRC) grant around Immersive Experiences, and I am one of the “collaborators” (artists) — others include Ruth Gibson and Bruno Martelli (Ruth is the Principal Investigator), Joe DeLappe, Darshana Jayemanne, Alexa Pollman and Dustin Freeman. We are also working with several “partners” in academia, industry and government, who act as advisors and contribute in various ways. These are Nicolas Lambert (Ravensbourne University), Lauren Wright (Siobhan Davies Dance), Alex Woolner (Ads Reality) and Paul Callaghan (British Council).

The short project description is:

Reality Remix will explore how we move in and around the new spaces that emergent technologies afford. Through the development and examination of a group of prototypes, initiated from notions of Memory, Place and Performance and with a team of artists, computer programmers, fashion and game designers, we aim to uncover the mysteries of these new encounters, focussing on human behaviour, modes of moving, and kinaesthetic response. Reality Remix brings a unique dance perspective in the form of somatic enquiry that questions concepts of embodiment, sensory awareness, performance strategy, choreographic patterning and the value of touch in virtual worlds.

Within this framework, each of us will be developing our own VR/AR projects and possible collaborations might arise in the process.

Some of the reasons that I was invited to be part of this project stem from core inquiries about what we call “Gibsonian” cyberspace versus a simulated cyberspace. I find it odd that we so often depict virtual reality — and for the purposes of simplicity, I will treat VR as a subset of cyberspace — as a simulation, a weak reproduction of some sort of physical reality. VR has immense possibilities that most people don't tap into. With the dominance of first-person shooter games, reproductions of museums, and non-spaces such as TiltBrush, I have often wondered about how we can conceptualize VR landscapes in new ways. And so, this was my starting point for our first session.

Improving the functionality of the headset

However, technical skills are not a prerequisite for producing compelling work and, in fact, this is part of my artistic practice: I adapt quickly. For example, in 2014 I dove into 3D printing without knowledge of any real 3D modeling package and, in the space of a few months, produced some conceptually-driven 3D print work that drew strong responses. I will easily pick up Unity, Unreal, 3ds Max or whatever else is needed.

It is with this lack of technical knowledge that I can approach concepts with a beginner's mind, a core concept of Buddhist thinking where you approach a situation without preconceptions and cultivate a disposition of openness. Without deep investment in the structures of discourse, it is here that you can ask questions about the effectiveness of the technology, such as the nature of immersive spaces, the bodiless hands of VR and hyperreality.

For this initial meeting, we each have our own project ideas that we will be researching and producing in various forms. Some of my own inquiries stem from these Gibsonian landscapes. On the train, I re-read Neuromancer and was still inspired by this seminal quote:

Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding…

I arrived with this general framework and began to ponder how to incorporate threads of previous work such as physical data-visualizations and network performance. What about the apparatus of the headset itself? How can we play with the fact that we are partially in real space and partially out of it? And, as one of our partners (Alex) pointed out, rather than being immersed in VR, we are absorbed by it. Like a fish in water, we live in full reality immersion. And when we talked about this, I chuckled to myself, remembering this David Foster Wallace joke:

There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes “What the hell is water?”

The first day was a full-on setup day: installing our Windows work machines and getting the Oculus Rifts, Google Pixel 2 phones, Unreal, Unity and everything else working. We all centralized on some specific platforms so we can easily share work with each other and invite possible collaborations. Fighting with a slow wi-fi connection was the biggest challenge here, but within a few hours I had my Alienware work machine making crude VR on the Oculus.

Goodies! This is the Oculus Rift hardware that we will all be using for project production

I yearned to visit Dundee despite the cold weather, but all of our time was spent in this workspace, supported by the NEoN Digital Arts Festival (where I showed my Strewn Fields pieces last November). In the evenings, we saw art lectures by our own Joe DeLappe and by my friend and former colleague Sarah Brin. We ate food and drank in the local speakeasy bar, chitchatting about ideas.

Lecture by Joe DeLappe

Within the first couple of days, what was previously a mystery to me became clearer. While some of the collaborators had well-developed projects (Dustin and Bruno/Ruth), others such as myself, Alexa and Joe were approaching it from a more conceptual angle with less technical aptitude.

I kept in mind that our goals for this project are to create compelling proofs-of-concept rather than finalized work, which makes it more of a forward-facing, research-oriented project rather than something that will compete in the already littered landscape of the good, the bad and the ugly of the Oculus Store.

We started each day with movement exercises led by Ruth, reminding us that we all live in “meatspace”. Our minds and bodies are not separate as we hunch over the machines and then stand with a headset on, waving our arms around. We began experimenting with the technology. I vacillated between diving into Unreal or Unity, each with their own advantages. While Unity has more generalized support and is easier to learn, Unreal undoubtedly has a better graphics engine and is Bruno's weapon of choice. So, solving the early technical challenges began to help coalesce my ideas.

Ruth leading us on some Skinner Releasing exercises

We soon entered into a vortex of artistic energy — some of us coming from performance, wearables and immersive theater with various conceptual practices, and the partners from other organizations bringing a less artistic approach but loads of experience in the gaming world, the university community and impact studies. I knew this on paper, but in reality the various talents of our Reality Remix dream team soon became apparent.

Twice a day, each of the collaborators led workshops related to their practices. Alexa treated us to her performance-based clothing, which registered AR markers, and asked us to do an exercise where we tried to perceive something through someone else. Bruno and I made a drawing where he wore the VR headset and I sketched on his back, which he replicated on paper. The process was fun and, predictably, the drawing was unimpressive.

 

Ruth is wearing fabric created by Alexa while she shows us some of her responsive AR augments

 

Our collaborative drawing in response to Alexa’s prompt

Dustin led us through a prototype for a sort of semi-immersive experience where actors jump into various avatars. With his deep background in improv, theater and role-playing, I began to shift my thinking toward how to involve the people not wearing a headset, who make up the majority of people at a VR experience, as essential players.

Darshana in the VR headset while Dustin demonstrates some ways in which “non-players” can interact in VR space

I am now wondering about intimacy and vulnerability in VR. There is a certain amount of trust in this space. You are blind and often suffused in another audio dimension. Could you, then, guide people through a VR experience like a child in a baby carriage? What can be done with multiple actors? So many questions. So many possibilities.

One thing that I was reminded of was the effectiveness of simple paper-prototyping and physical movement. Make things free from technology; keep it accessible. Stick to the concepts.

My own orientation began shifting more into virtual landscapes and thinking about data as the generator. I asked people to brainstorm various datasets and come up with some abstract representations based on that earlier quote from Neuromancer. I do want to get away from the sci-fi notion of cyberspace since it is limiting and enmeshed in VR 1.0, but will still claim this as the starting point to an antidote to the often mundane reality-simulation space.

This was useful for my own brainstorming. Alexa brought in an interesting point of view because she was thinking about time rather than landscape, and brought in conversations around anticipation, reality and memory, which reminded me of the work of Bergson. Her musings revolved around personal data and what captured her attention.

Meanwhile, Ruth made marks on the wall, translating gesture into 2D. Bruno worked with abstract visual forms. Despite my being a poor draftsman, the question arose: how can we incorporate movement into a system? My own perception is highly visual and oriented towards abstract patterns. The success of my exercise was based in the fact that some useful (to me) renderings were produced, while I also quickly learned that a line-based VR landscape doesn't resonate with everyone.

Drawings by Bruno Martelli in response to my workshop prompt

As a manifesto bullet item: the scriptedness of VR is something we would all like to break. With all the possibilities of VR, why do the dominant forms assume a feeling of immersion? Why don't we consider what can be done before rushing to produce so much content?

Where is the element of surprise? VR is a solitary experience. I’m reminded of Joe’s work with the military and what one can do with a gaming space. Could we intervene or somehow interfere with VR space?

Darshana, the theorist amongst the group, gave a beautiful summary of the session. My head was spinning with ideas at this point, so I can't even recall everything he spoke about, but certainly ideas surfaced around how to be both engaging and critical in this space. He envisioned a nexus around abstract spatialization, performance, role play and the body that tied our various projects together.

Bruno, Ruth and Alexa wearing some of Bruno + Ruth’s Dazzle Masks

There is much more to write and think about. I made progress on the technical side of things, such as getting an OSC pathway to Unreal working, so that I can begin playing with electronic interfaces into a VR world.
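
For the technically curious, here is a minimal sketch of what such a pathway can look like from the Processing side, using the oscP5 library. The address pattern, host and port below are placeholders, not the actual Reality Remix configuration; they need to match whatever the Unreal OSC receiver is set up to listen on.

// Processing sketch (Java): send a control value to Unreal over OSC via oscP5.
// Host, port and address pattern are placeholders for your own OSC setup.
import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress unreal;

void setup() {
  oscP5 = new OscP5(this, 12000);               // local listening port (unused in this sketch)
  unreal = new NetAddress("127.0.0.1", 8000);   // machine and port where Unreal is listening
}

void draw() {
  // Example: stream the mouse position as a stand-in for a sensor or knob.
  OscMessage msg = new OscMessage("/reality_remix/control");
  msg.add(map(mouseX, 0, width, 0.0, 1.0));
  oscP5.send(msg, unreal);
}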

More importantly, I feel like I've found my people with this Reality Remix team: one where we understand the history of new media and the subversion of forms, and aren't dazzled by simplicity. We got along so very well, with mutual respect and laughter. I'm excited about what comes next.

Reality Remix group photo

Music Box Village

Last week, I visited the Music Box Village in New Orleans. This is a true DIY space where artists, fabricators and more have built “houses” that make sounds/music/noise in various ways. Together, skilled musicians (a group that does not include me) can make an orchestra of cacophonous music.

John Cage would have loved this space. Any sort of noise, even silence, is music, as people witnessed with his 4’33” composition. I’ve always loved this idea, the very fact that the tension between performance and non-performance can be music. At this site, the structures become the instruments. Anyone can play them. They are rusty, brittle, gentle and beautiful at the same time.

I’ve gone to many, many DIY spaces. I’ve even helped build some of them, such as The Shipyard, which was a mass of shipping containers that I helped weld, wire and cut in 2001. But all of these felt self-serving, creating a community of those who were included and those who were somehow excluded because they didn’t speak the proper cultural language of metal-working and whiskey-drinking.

The Music Box Village felt different. I watched some of the founders present the project at the INST-INT Conference the day before and they spoke about community engagement and pairing collaborators from different socioeconomic backgrounds, skills and ages to build the houses. Their approach was organic and they finally secured a more permanent home which has metalworking facilities.

I can’t help but be inundated with the banality of architecture. Houses pretty much look alike, entirely functional and rectilinear. Our commerce spaces are branded box stores adorning cities and suburbs. As humans, we are molded by our physical environment. Our eyes conform to corners. Our minds become less imaginative as a result.

One of my favorite artists working with architecture is Krzysztof Wodiczko, who has spent many decades projecting iconography onto buildings in order to subvert the function of the building, the war memorial and the political body.

He writes: “Dominant culture in all its forms and aesthetic practices remains in gross contradiction to the lived experience, communicative needs and rights of most of society, whose labour is its sole base.”

We have so much more to offer in terms of human imagination and creativity than the buildings that surround us and are institutions of capital. I left my tour of the Music Box Village feeling rejuvenated. Then I promptly went to the airport to catch a flight back home, engaging with the odd transitional space where air travel happens.


EEG Data Crystals

I’ve had the Neurosky Mindwave headset in a box for over a year and just dove into it, as part of my ongoing Data Crystals research at Autodesk. The device is the technology backbone behind the project: EEG AR with John Craig Freeman (still working on funding).

The headset fits comfortably. Its space-age retro look is aesthetically pleasing, except that I'd cover up the logo in a final art project. The gray arm rests on your forehead and reads your EEG levels, translating them into several values. The most useful are “attention” and “meditation”, which are calculations derived from a few different brainwave patterns.

I’ve written custom software in Java, using the Processing libraries and ModelBuilder, to generate 3D models in real time from the headset. But after copious user-testing, I found out that the effective sample rate of the headset was 1 sample/second.* Ugh.
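
For the technically curious, here is a minimal sketch of the general approach. This is a simplified reconstruction, not the actual project code: the port and JSON format follow the ThinkGear Connector socket protocol as I understand it, the cube-placement rule is invented purely for illustration, and the ModelBuilder mesh export step is omitted.

// Processing sketch (Java): read eSense values from the ThinkGear Connector
// and stack a cube per reading. Connection details and placement logic are
// illustrative assumptions, not the project's real code.
import processing.net.*;

Client thinkGear;
ArrayList<PVector> attentionCubes = new ArrayList<PVector>();
ArrayList<PVector> meditationCubes = new ArrayList<PVector>();
String buffer = "";

void setup() {
  size(800, 600, P3D);
  thinkGear = new Client(this, "127.0.0.1", 13854);   // ThinkGear Connector default port
  thinkGear.write("{\"enableRawOutput\": false, \"format\": \"Json\"}\n");
}

void draw() {
  background(30);
  readHeadset();
  lights();
  translate(width / 2, height / 2, -200);
  drawCubes(attentionCubes, color(0));      // "attention" readings as black cubes
  drawCubes(meditationCubes, color(255));   // "meditation" readings as white cubes
}

// Accumulate newline-delimited JSON packets and pull out the eSense values.
void readHeadset() {
  while (thinkGear.available() > 0) {
    String chunk = thinkGear.readString();
    if (chunk != null) buffer += chunk;
  }
  int nl;
  while ((nl = buffer.indexOf('\n')) >= 0) {
    String line = buffer.substring(0, nl).trim();
    buffer = buffer.substring(nl + 1);
    if (line.length() == 0) continue;
    JSONObject packet = parseJSONObject(line);
    if (packet == null || !packet.hasKey("eSense")) continue;
    JSONObject eSense = packet.getJSONObject("eSense");
    addCube(attentionCubes, eSense.getInt("attention"));
    addCube(meditationCubes, eSense.getInt("meditation"));
  }
}

// Map a 0-100 eSense value to a radius; each new reading extends the crystal.
void addCube(ArrayList<PVector> cubes, int value) {
  float radius = map(value, 0, 100, 10, 200);
  float angle = cubes.size() * 0.3;
  cubes.add(new PVector(radius * cos(angle), cubes.size() * 2, radius * sin(angle)));
}

void drawCubes(ArrayList<PVector> cubes, int c) {
  fill(c);
  for (PVector p : cubes) {
    pushMatrix();
    translate(p.x, p.y, p.z);
    box(8);
    popMatrix();
  }
}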

This isn’t the first time I’ve used the Neurosky set. In 2010, I developed an art piece, a portable personality kit called “After Thought”. That piece, however, relied on slow activity and was more like a tarot card reading, where the headset readings were secondary to the performance.

The general idea for the Data Crystals is to translate data into 3D prints. I’ve worked with data from San Francisco’s Data Portal. However, the idea of generating realtime 3D models from biometric data is hard to resist.

This is one of my first crystals — just a small sample of 200 readings. The black jagged squares represent “attention” and the white cubes correspond to “meditation”.


Back to the sample rate…a real-time reading of 600 samples would take 10 minutes. Still, it’s great to be able to do real-time, so I imagine a dark room and a beanbag chair where you think about your day and then generate the prints.

Here’s what the software looks like. This is a video of my own EEG readings (recorded then replayed back at a faster rate).

And another view of the 3D print sample:


What I like about this 3D print is the mixing of the two digital materials, where the black triangles intersect with the white squares. I still have quite a bit of refinement work to do on this piece.

Now, the challenge is what kind of environment to create for a 10-minute “3D Recording Session”. Many colleagues immediately suggest sexual arousal and drugs, which is funny, but something I want to avoid. One thing I learned at the Exploratorium was how to appeal to a wide audience, i.e. a more family-friendly one. This way, you can talk to anyone about the work you’re doing instead of a select audience.

Some thoughts: just after crossing the finish line of an extreme mountain bike race, right after waking up in the morning, after drinking a pot of coffee (our workplace drug-of-choice), or while soaking in the hot tub!


* The website advertises a “512Hz sampling rate – 1Hz eSense calculation rate.” Various blog posts indicate that the raw values often get repeated, meaning that the effective rate is super-slow.

 

Welcome to the Party: @lenenbot

Say hello to the latest Twitterbot from the Bot Collective: @lenenbot


Lenenbot* mixes up John Lennon and Vladimir Lenin quotes. The first half of one with the second half of the other.

Some of my favorites so far are:

Communism is everybody’s business.
It’s weird not to be able to run the country.
Revolution is love.

There are more surreal ones. There are about 600 different possibilities, all randomized. Subscribe to the Twitter account here.
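
A rough sketch of the mechanic in Java (illustrative only; the quotes below are placeholders and the splitting rule is my assumption, not the bot's actual code):

// Rough sketch of the lenenbot mash-up: take the first half of one figure's
// quote and the second half of the other's. The quote lists are placeholders.
import java.util.Arrays;
import java.util.Random;

public class LenenMixer {
  static final String[] LENNON = {
    "Living is easy with eyes closed",
    "A dream you dream together is reality"
  };
  static final String[] LENIN = {
    "Give me four years to teach the children",
    "Politics begin where the masses are"
  };
  static final Random RNG = new Random();

  // Split a quote roughly in half on a word boundary.
  static String[] halves(String quote) {
    String[] words = quote.split(" ");
    int mid = words.length / 2;
    return new String[] {
      String.join(" ", Arrays.copyOfRange(words, 0, mid)),
      String.join(" ", Arrays.copyOfRange(words, mid, words.length))
    };
  }

  static String mix() {
    String lennon = LENNON[RNG.nextInt(LENNON.length)];
    String lenin = LENIN[RNG.nextInt(LENIN.length)];
    // Flip a coin for which figure supplies the opening half.
    return RNG.nextBoolean()
        ? halves(lennon)[0] + " " + halves(lenin)[1]
        : halves(lenin)[0] + " " + halves(lennon)[1];
  }

  public static void main(String[] args) {
    System.out.println(mix());
  }
}

With, say, 17 or so quotes per figure and either figure able to supply the opening half, that works out to roughly 2 × 17 × 17 ≈ 580 mash-ups, in the ballpark of the 600 mentioned above.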

 

 

* I chose the name “Lenen” to avoid confusion. Lenonbot and Lenninbot look like misspellings of Lennon and Lenin, respectively. Lenen is its own bot.

Babula Rasa with Second Front

Second Front performed Babula Rasa as part of “The Artist is Elsewhere” — a one-night performance event hosted by ZERO1 and curated by Sean Fletcher and Isabel Reichert. These are some stills from the event.

My idea was to use Google Docs, specifically a spreadsheet, as a virtual Tabula Rasa — a blank slate for performance. I had imagined word-play, formulas, formatting changes and text-upon-text revisions and edits. I’ve often found Mail Art to be a source of inspiration, where artists re-purposed communication networks for art discourse. I was hoping for a similar effect with Google Docs, a space normally reserved for business documents or household expense sheets.

However, my Second Front compatriots always surprise me, and they quickly began inserting images into Google Docs. Who knew? Apparently everyone else but me.

Projected live for 2 hours during “The Artist is Elsewhere” event, this quickly became a group collage. In the first 30 minutes, the “I Say” Shark, various blue women, Patrick Lichty’s birthday cake, and lots and lots of cats appeared.

Images from various Second Front performances popped up: Last Supper and Wrath of Kong. And lots of memes from popular culture, reminding me of How Conceptual Art Influenced the World Wide Web.

We could overhear the other performances live on a UStream channel. At one point, one of the performances seemed to be carrying on for a long time and someone (maybe me) uploaded an image of Chuck Barris from The Gong Show.

At the 1-hour mark, the Shark is still there but now with the Shaggy D.A., the Tweets in Space logo, Dr. McCoy, an evil bunny and more.

Does this embrace, reject or dry-hump the New Aesthetic? That’s for you to decide.

And like all Second Front performances, we had to bomb the virtual venue when we were done…only this time with cats.

Participating Second Front members: Yael Gilks, Bibbe Hansen, Doug Jarvis, Scott Kildall, Patrick Lichty, Liz Solo with stealth guest appearance by Victoria Scott.

Volunteer for the 2049 Hotline

Are you interested in being an emissary from the future?

For my upcoming “2049” show at the Dump in San Francisco, one of the featured artworks will be a phone booth where you can talk to someone from the year 2049. People can pick up the phone (it will be set up as a live line) and talk to an ambassador-from-the-future, who will answer questions about what life is like in the year 2049.

What is 2049 like? It is up to YOU to answer this. You can change it for each caller.

I’m gathering volunteers, and if this is something you might be interested in, please email me at: lucky (at) kildall.com — it’s a 45-minute commitment and will be a fun performance where you can pretty much do what you want.

The show runs 5-9pm, Friday, May 20th and 1-5pm, Saturday, May 21st (Pacific Time). You don’t have to be local to San Francisco to play.

Background
I am playing the role of a prospector from the future who mines the garbage heaps of a past civilization to build technologies to survive. Trawling through construction debris, discarded electronics and the scraps of people’s lives, I have etched blueprints and made imaginary devices such as an infinite battery and scent-based resource detector (a.k.a. “The Sniffer”).

Prospecting from the Future

Last week, I began a 4-month residency at Recology San Francisco (a.k.a. The Dump) where I make art solely from the refuse that people drop off in their cars and trucks. I am treating this residency as a performance.

I am playing the role of a prospector from the future who mines the garbage heaps of a past civilization to build technologies to survive. Trawling through construction debris, discarded electronics and the scraps of people’s lives, I am making blueprints and building imaginary devices such as a food synthesizer and an infinite battery.


I derived inspiration from props in films such as E.T. and The Science of Sleep. These contraptions obviously don’t work, yet they activate a child-like imagination, where we can build whatever we want from the materials at hand. In this consumer culture, where the desire for designed objects runs rampant, this project serves as critique and antidote.


From the standpoint of new media artwork, I have been grappling with how to work with technology in artworks — I use technology precisely because our economy and values are so deeply driven by it. I moved away from interactive artwork long ago for a number of reasons.

I want to celebrate the imaginary. This project lets me play the role of artist-as-mystic instead of artist-as-technologist. I am free to create narratives in which I simultaneously critique our ecological disaster course but also suggest possible futures. And, more than anything, to have fun. Without that, we have zero hope, and hope is something we need at this time.


After Thought at Art in Odd Places

Last Thursday, I exhibited After Thought, a performance-installation that I developed while at Eyebeam Art + Technology Center, at Art in Odd Places in New York (check out the AIOP website; there are some great projects there).

As the name implies, these are performances that happen in unusual spots around the city, this one being at the 14th Street Y.


We scheduled this to happen during the CSA pick-up, when folks were collecting their weekly organic veggies.


Here I am posing with my two assistants: Minha Lee and Zack Frater. We used the lab coat + eyeglasses props to reel people in.


I began with a short intake form with questions such as “What is your greatest physical fear?” I discovered that an inordinate number of people are afraid of snakes.


After completing the intake form, people wear a brainwave-reading headset — I use the Neurosky Mindset — to capture stress and relaxation levels. They turn over flashcards while I monitor their reactions.

I can’t see what they are looking at. If their stress or relaxation responses spike, I ask them for the card, then note it down on my result form. This person was especially negatively triggered by cockroaches.
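
Conceptually, the “spike” is nothing fancy: a reading jumping well above its recent average. A minimal sketch of that idea in Java (illustrative only; the window size and threshold are assumptions, not the values tuned for the piece):

// Flag a "spike" when the latest eSense reading (0-100) jumps well above
// the average of the last few readings. Window and threshold are illustrative.
import java.util.ArrayDeque;
import java.util.Deque;

public class SpikeDetector {
  private final Deque<Integer> window = new ArrayDeque<>();
  private final int windowSize = 10;   // roughly 10 seconds at 1 reading/second
  private final int threshold = 25;    // how far above the recent average counts as a spike

  public boolean isSpike(int reading) {
    double average = window.stream().mapToInt(Integer::intValue).average().orElse(reading);
    boolean spike = !window.isEmpty() && reading - average > threshold;
    window.addLast(reading);
    if (window.size() > windowSize) window.removeFirst();
    return spike;
  }
}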


And this gentleman was relaxed by the guys hanging out in the hot tub. Give me that flashcard!


Minha, who interned for me at Eyebeam, also administered tests. This subject had no reaction, good or bad, to the image of the police car.

Here you can see how the intervention occurs. People had no idea why we were there. Many were suspicious, thinking that our Scientology-style relaxation/stress test was trying to sell them something or lure them into a cult. Others were immediately intrigued. Some needed convincing. One respondent offered us a bundle of Swiss chard for barter.


Afterward, I would sit down with each respondent and we would talk about their results. “Why did you get stressed out by the cute puppy?”


In the background here, you can see one of the two curators, Yaelle Amir, who demonstrates her ambidexterity by texting while typing.


One of my last tests of the day was with Stephanie Rothenberg, a good friend of mine. I knew her too well to provide unbiased analysis. The image of the crying baby was one of her stress indicators. Hmmm.


01SJ Day 5: Public Viruses

Today we shifted to the virus-making portion of Gift Horse, where anyone can assemble a virus sculpture to be placed inside the belly of the Trojan Horse. The gesture is to gather people in real space, give them a way to hand-construct their “artwork” and to hide hundreds of the mini-sculptures inside the horse.

The first virus to go inside, the Rat of the Chinese zodiac, was The Andromeda Strain, an imaginary virus from the film. This father-daughter team cut, folded and glued the paper sculpture together and she did the honors of secreting it inside the armature.

It takes a long time to cut each virus from the printed sheet. This is where the lasercutter from the Tech Shop came in handy. In the afternoon, we traced the outlines of the Snow Crash virus and tried cutting it out. After about an hour of fiddling around with settings and alignment, I was able to get a batch done.

Hurray for mechanized production!

This halved the assembly time from 30 minutes to 15 minutes, bypassing the tedious cutting step. Perhaps this is a compromise in the process of hand-construction techniques, but I’ll gladly make the trade-off for practicality.

The next person to sit with us was Jeff who worked on one of the freshly-cut Snow Crash viruses.


Once finished, it joined The Andromeda Strain. Come on down to South Hall (435 S. Market, San Jose) and check us out — we will be holding workshops on building viruses all weekend.
