From Residencies

GPS Tracks

I am building water quality sensors which will capture geolocated data. This was my first test with this technology. This is part of my ongoing research at the Santa Fe Water Rights residency (March-April) and for the American Arts Incubator program in Thailand (May-June).

This GPS data-logging shield from Adafruit arrived yesterday and after a couple of hours of code-wrestling, I was able to capture the latitude and longitude to a CSV data file.
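For the curious, reading such a log back is straightforward. Here's a minimal Python sketch of the idea, assuming a hypothetical timestamp/latitude/longitude column layout (the real file format depends on how the shield's firmware is configured):

```python
import csv
import io

def read_gps_track(csv_text):
    """Parse a CSV log of GPS fixes into (lat, lon) tuples.

    Assumes rows of: timestamp, latitude, longitude (decimal degrees).
    Rows that fail to parse (e.g. fixes logged before GPS lock) are skipped.
    """
    track = []
    for row in csv.reader(io.StringIO(csv_text)):
        try:
            lat, lon = float(row[1]), float(row[2])
        except (IndexError, ValueError):
            continue  # incomplete or corrupt row
        track.append((lat, lon))
    return track

# Example log: two fixes near Santa Fe, plus one corrupt row
log = """2015-03-10T10:00:00,35.6870,-105.9378
2015-03-10T10:00:05,35.6871,-105.9380
no-fix,,
"""
print(read_gps_track(log))
```

From a list of points like this, plotting the track (in OpenFrameworks or anything else) is just a matter of connecting the dots.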

This is me walking from my studio at SFAI to the bedroom. The GPS signal at this range (100m) fluctuates greatly, but I like the odd compositional results. I did the plotting in OpenFrameworks, my tool-of-choice for displaying data that will be later transformed into sculptural results.

The second one is me driving in the car for a distance of about 2km. The tracks are much smoother. If you look closely, you can see where I stopped at the various traffic lights.

Now, GPS tracking alone isn’t super-compelling, and there are many mapping apps that will do this for you. But as soon as I can attach water sensor data to latitude/longitude, it transforms into something much more interesting, as the data becomes multi-dimensional.

Views from 9000 feet

9000 feet in the air gives you an entirely different perspective on the world. Last Sunday, I had the opportunity to fly in a single-engine Cessna with my old friend, Gary. His plane was from the 1970s and had similar instrumentation to my dad’s plane from the same era.

My father, coincidentally also named Gary, loved flying. When I was a kid, he took me up in his plane for countless hours. It’s been about 35 years since I’ve been in a small plane like this. It was comforting, loud, fun and magical.

Although I have no interest in being a pilot, I certainly appreciated the view. Moving slowly (130 mph) at 9000 feet gives you the opportunity to see the landscape at a gentler pace, and at such a low altitude I saw a dimensionality in the terrain unlike anything I’ve seen in a long time.

The folds of the hills, the washouts from snow melt and the various waterways fascinated me. The odd manmade structures and dirt access roads punctuated the depopulated desert terrain.

I saw the acequias — community-owned irrigation canals for family farms — which delineated the parcels of land. As they say, water is life. There is no agribusiness here, just family farms, often growing alfalfa in addition to a day job.

I gazed at the results of the San Juan-Chama Project and the Abiquiu Dam, which divert water from the Colorado River basin through the Rio Chama and into the Rio Grande so that Santa Fe and Albuquerque can have drinking water.

The most fantastic sight was the Rio Grande Gorge near Taos. Here you can see how the Earth got split apart by tectonic forces. Rather than carving its own path, the Rio Grande trickles through the gorge because it’s the easiest way for the water to flow.

After a couple hours and a lunch stop, we landed back on the ground. I was again bound by gravity as I drove back to Santa Fe, along the highway that earlier that day I had seen from the sky.

Santa Fe River Walk

When you get to a new place, take a long walk. This is essential to ground yourself in that space. Rebecca Solnit writes about it; Guy Debord speaks of diverting the stream of capitalism with it; Richard Long incorporates it into his art practice.

Just after unpacking at a new art residency, Water Rights at the Santa Fe Art Institute, I went on a walk up the Santa Fe River with two of my fellow residents, Christina Catanese and Megan Heeres.

Santa Fe is a new place with new people. Before jumping into studio practice, which can be a crutch for compulsive art-making, I wanted to engage with the physical environment. At this residency, the purpose will be to open both the mind and the art practice.

We started at Frenchy’s Field and walked up the riverbed itself towards downtown. We walked, talked and observed.

At the head of the trail was a poem kiosk with laminated sheets of poetry and a little shelf full of rocks. The riverbed here was dry and sandy.

We began walking in the bed itself. Christina is a trained hydrologist and Megan knows much about plants. I know a little bit about geology after my Strewn Fields project.

At the start of the walk, we encountered a collection of heart-shaped rocks, obviously put here by humans. I love this organically-generated “land art”.

We wondered why these large rocks were stacked this way. Was it for humans? Or for the river? Christina later determined that it was to control the river flow, as future steps required tricky traversals.

Here I am with a backpack full of branches that I collected. I’m specifically intrigued by the salt cedar, an invasive species that was brought to the area many years ago as a windbreak for agriculture. Oops: as is often the case, the introduction of a new species created more problems than it solved. The salt cedar is a water-sucker and consumes the area’s most precious resource.

Here is the “rock penitentiary.” Maybe these rocks were bad and had to be put behind fencing.

And here is a rock that escaped. Fly away, be free!

Under a bridge, we found a rope swing. Wheeee!

As we traversed further, the salt cedar thinned out and we saw various grasses along the banks of the (dry) river.

And I found my own heart-shaped rock. A beautiful specimen, which looks like two geological samples that were grafted together.

We took a side path and disturbed two birds of prey who had been feasting on this treat.

Around the mid-point of the walk, we started seeing icy formations.

I love these alluring crystalline structures surrounding various stones.

And the ground was damp. We noticed various animal prints. What was this? I still do not know. The front foot matches the hind foot, which seems like an odd walking pattern.

Finally, we began to see actual water with this miniature waterfall.

As we approached downtown, there was more and more human-generated waste.

And one shoe? Who loses a single shoe?

At the end of the walk was a patch of rainbow in the sky.

Movies about Water

A few days ago, I asked on Facebook:

What’s your favorite movie about water? We’re doing a Monday movie night at the Water Rights residency and I’m taking suggestions. Narrative or documentary, but not exceedingly lengthy.

78 responses! Here is the list, in order of posting; it has fewer than 78 entries because there were duplicates:

Blue Planet
SlingShot
Chinatown
Milagro Bean Field War
Riding Giants
Step Into Liquid
Deliverance
One Water
Jaws
Dune
Waterworld
Force 10 from Navarone
Knife in the Water
The Abyss
Darwin’s Nightmare
Marvelous Resources
Dripping Water (Joyce Wieland, Michael Snow)
Into Blue
Sharknado
Even the Rain
Salween Spring (Travis Winn)
Glass-memory of Water (Leighton Pierce)
Old Man And The Sea.
Flow
Gasland
Plagues & Pleasures on the Salton Sea
Water Warriors!
The Swimmer!
Peter Hutton (various films)
Erin Brockovich
Paddle to the Sea
H20 Film (Ralph Steiner)
Titanic
Splash
Point Break
Patagonia Rising
Civil Action
Trouble the Water
Like Water for Chocolate
Guy Sherwin’s black and white film of his daughter watering shadows. Prelude – 1996, 12 mins
Moana
The Same River Twice
The Illustrated Man
Whale Rider
The Gods Must Be Crazy
The Big Blue
The Dry Summer
Joe Versus the Volcano
Watermark
Water & Power: A California Heist
The Finest Hours
The Woman in the Dunes
Total Recall (first one)
Tears
“Water Wrackets” by Peter Greenaway
“Watersmith” by Will Hindle
My Winnipeg

Joining SETI as an artist-in-residence

The SETI Institute just announced their new cohort of artists-in-residence for 2016 and I couldn’t be happier to be joining this amazing organization for a long-term (up to 2 years!) stint.

This includes a crew of other amazing artists: Dario Robleto (Conceptual Artist, Houston), Rachel Sussman (Photographer, Artist, Writer, New York), George Bolster (Filmmaker, Artist, New York), Jen Bervin (Visual Artist, Writer, Brooklyn), David Neumann (Choreographer, New York). The SETI Air program is spearheaded by Charles Lindsay (artist) and Denise Markonish (curator at MASS MoCA). I first met Charles at ISEA 2012 in Albuquerque, New Mexico when we were on the same panel around space-related artwork.

On January 13, 2016, at 7pm, in San Francisco’s Millennium Tower, SETI Institute President and CEO Bill Diamond will formally welcome the incoming artists and our new institutional partners, as well as patrons and friends of the program. This event is invitational and seating is limited.

SETI_Logo

So, what will I be working on?

Well, this follows on the heels of a number of artworks related to space, such as Tweets in Space (in collaboration with Nathaniel Stern), Uncertain Location, Black Hole Series and Moon v Earth, which were meditations on metaphors of space and potential.

uncertainlocation_main

Roughly speaking, I will be researching, mapping and creating installations from asteroid, meteor and meteorite data, working with scientists such as Peter Jenniskens, who is an expert on meteor showers. These will be physical data-visualizations — installations, sculptures, etc. — which follow my interests in digital fabrication and code, with projects such as Water Works.

What specifically fascinates me is the relationship between outer space and the Earth: the metaphor of both danger and possibility from above. Interpretations range from the spiritual to the practical, from the extinction of the human race to the possibility of organic material from other planets being carried to our solar system. Despite appearances to the contrary, Earth is not only a fragile ecosystem but also one that could easily be transformed from outside.

And already I have begun mapping some meteor showers with my custom 3D software, working in collaboration with Dr. Jenniskens and a dataset of ~230,000 meteors recorded over Northern California in the last few years. This makes the data-space-geek in me very happy.

Stay subscribed for more.

meteor-of-Screen Shot 2015-11-18 at 8.41.10 AM

meteor-2Screen Shot 2015-10-27 at 4.55.25 PM

And I will heed Carl Sagan’s words: “Imagination will often carry us to worlds that never were, but without it we go nowhere.”

EquityBot goes live!

During my time at Impakt as an artist-in-residence, I have been working on a new project called EquityBot, which is an online commission from Impakt. It fits well into the Soft Machines theme of the festival: where machines integrate with the soft, emotional world.

EquityBot exists entirely as a networked art or “net art” project, meaning that it lives in the “cloud” and has no physical form. For those of you who are Twitter users, you can follow on Twitter: @equitybot

01_large

What is EquityBot? Many people have asked me that question.

EquityBot is a stock-trading algorithm that “invests” in emotions such as anger, joy, disgust and amazement. It relies on a classification system of twenty-four emotions, developed by psychologist and scholar, Robert Plutchik.

Plutchik-wheel.svg

How it works
During stock market hours, EquityBot continually tracks worldwide emotions on Twitter to gauge how people are feeling. In the simple data-visualization below, which is generated automatically by EquityBot, the larger circles indicate the more prominent emotions that people are Tweeting about.

At this point in time, just 1 hour after the stock market opened on October 28th, people were expressing emotions of disgust, interest and fear more prominently than others. During the course of the day, the emotions contained in Tweets continually shift in response to world events and many other unknown factors.
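Under the hood, that gauging step boils down to counting emotion words. Here is a rough Python sketch of the idea, with hypothetical stand-in tweets and a trimmed-down emotion list:

```python
from collections import Counter

# A small subset of Plutchik's emotions (stand-ins for the full wheel)
EMOTIONS = {"disgust", "interest", "fear", "anger", "joy"}

def tally_emotions(tweets):
    """Count how often each tracked emotion word appears across tweets.

    The resulting counts drive the bubble sizes in the visualization:
    more mentions, bigger circle.
    """
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            word = word.strip(".,!?#")
            if word in EMOTIONS:
                counts[word] += 1
    return counts

tweets = [
    "So much disgust at the news today",
    "Fear and disgust on the trading floor",
    "Renewed interest in clean water #interest",
]
print(tally_emotions(tweets).most_common())
```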

twitter_emotions

EquityBot then uses various statistical correlation equations to find patterns that match the changes in emotions on Twitter to fluctuations in stock prices. The details are thorny, so I’ll skip the boring stuff, but my time did involve a lot of work with scatterplots, which looked something like this.

correlation

Once EquityBot sees a viable pattern, for example that “Google” is consistently correlated to “anger” and that anger is a trending emotion on Twitter, it will issue a BUY order on the stock.

Conversely, if Google is correlated to anger, and the Tweets about anger are rapidly going down, EquityBot will issue a SELL order on the stock.
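To make that BUY/SELL logic concrete, here is a simplified Python sketch using a plain Pearson correlation. The threshold, the trend rule and the data are my illustrative assumptions, not the actual EquityBot internals:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def trade_signal(emotion_counts, stock_prices, threshold=0.7):
    """BUY if the emotion correlates with the stock and is trending up,
    SELL if it correlates but is trending down, otherwise HOLD."""
    r = pearson(emotion_counts, stock_prices)
    if abs(r) < threshold:
        return "HOLD"
    trending_up = emotion_counts[-1] > emotion_counts[0]
    return "BUY" if trending_up else "SELL"

# Hypothetical hourly data: anger mentions vs. a stock price
anger = [120, 150, 180, 220, 260]
price = [530.0, 531.5, 533.0, 536.0, 539.5]
print(trade_signal(anger, price))
```

With anger both correlated to the price and trending upward, this sketch emits a BUY; with anger collapsing, a SELL; with no clear correlation, it simply holds.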

EquityBot runs a simulated investment account, seeded with $100,000 of imaginary money.

In my first few days of testing, EquityBot “lost” nearly $2000. This is why I’m not using real money!

Disclaimer: EquityBot is not a licensed financial advisor, so please don’t follow its stock investment patterns.

account

The project treats human feelings as tradable commodities. It will track how “profitable” different emotions are over the course of months. As a social commentary, I propose a future scenario in which just about anything can be traded, including that which is ultimately human: the very emotions that separate us from a machine.

If a computer cannot be emotional, at the very least it can broker trades of emotions on a stock exchange.

affect_performance

As a networked artwork, EquityBot generates these simple data visualizations autonomously (they will get better, I promise).

Its Twitter account (@equitybot) serves as a performance vehicle, where the artwork “lives”. Also, all of these visualizations are interactive and on the EquityBot website: equitybot.org.

I don’t know if there is a correlation between emotions in Tweets and stock prices. No one does. I am working with the hypothesis that there is some sort of pattern involved. We will see over time. The project goes “live” on October 29th, 2014, which is the day of the opening of the Impakt Festival and I will let the first experiment run for 3 months to see what happens.

Feedback is always appreciated, you can find me, Scott Kildall, here at: @kildall.

 

Data-Visualizing + Tweeting Sentiments

It’s been a busy couple of weeks working on the EquityBot project, which will be ready for the upcoming Impakt Festival. Well, at least a functional prototype of my ongoing research project will be online for public consumption.

The good news is that the Twitter stream is now live. You can follow EquityBot here.

EquityBot now tweets images of data-visualizations on its own, fully autonomously. I’m constantly surprised and a bit nervous about its Tweets.

exstasy_sentiment

At the end of last week, I put together a basic data visualization using D3, which is a powerful Javascript data-visualization tool.

Using code from Jim Vallandingham, in just one evening I created dynamically-generated bubble maps of Twitter sentiments as they arrive from EquityBot’s own sentiment analysis engine.

I mapped the colors directly from the Plutchik wheel of emotions, which is why they are still a little wonky; for example, the emotion of Grief is unreadable. This will be fixed.

I did some screen captures and put them on my Facebook and Twitter feeds. I soon discovered that people were far more interested in images of the data visualizations than just text describing the emotions.

I was faced with a geeky problem: how to get my Twitterbot to generate images of the data visualizations built with D3, a front-end Javascript library? I figured it out eventually, after stepping into a few rabbit holes.

Screen Shot 2014-10-21 at 11.31.09 AM

I ended up using PhantomJS, the Selenium web driver and my own Python management code to solve the problem. The biggest hurdle was getting Google webfonts to render properly. Trust me, you don’t want to know the details.

Screen Shot 2014-10-21 at 11.31.29 AM

 

But I’m happy with the results. EquityBot will now move to other Tweetable data-visualizations such as its own simulated bank account, stock-correlations and sentiments-stock pairings.

Blueprint for EquityBot

For my latest project, EquityBot, I’ve been researching, building and writing code during my 2 month residency at Impakt Works in Utrecht (Netherlands).

EquityBot is going through its final testing cycles before a public announcement on Twitter. For those of you who are Bot fans, I’ll go ahead and slip you EquityBot’s Twitter feed: https://twitter.com/equitybot

The initial code-work has involved configuration of a back-end server that does many things, including “capturing” Twitter sentiments, tracking fluctuations in the stock market and running correlation algorithms.

I know, I know, it sounds boring. Often it is. After all, the result of many hours of work is a series of well-formatted JSON files. Blah.

But it’s like building city infrastructure: now that I have the EquityBot Server more or less working, it’s been incredibly reliable, cheap and customizable. It can act as a Twitterbot, a data server and a data visualization engine using D3.

This type of programming is yet another skill in my Creative Coding arsenal. It consists mostly of Python code that lives on a Linode server, a low-cost alternative to options like HostGator or GoDaddy, which incur high monthly costs. And there’s a geeky sense of satisfaction in creating a well-oiled software engine.

The EquityBot Server looks like a jumble of Python and PHP scripts. I cannot possibly explain it in excruciating detail, nor would anyone in their right mind want to wade through the technical details.

Instead, I wrote up a blueprint for this project.

ebot_server_diagram_v1

For those of you who are familiar with my art projects, this style of blueprint may look familiar. I adapted this design from my 2049 Series, which are laser-etched and painted blueprints of imaginary devices. I made these while an artist-in-residence at Recology San Francisco in 2011.

sniffer-blue

EquityBot: Capturing Emotions

In my ongoing research and development of EquityBot — a stock-trading bot* with a philanthropic personality, which is my residency project at Impakt Works — I’ve been researching various emotional models for humans.

The code I’m developing will try to make correlations between stock prices and group emotions on Twitter. It’s a daunting task and one where I’m not sure what the signal-to-noise ratio will be (see disclaimer). As an art experiment, I don’t know what will emerge from this, but it’s geeky and exciting.

In the last couple of weeks, I’ve been creating a rudimentary system that just captures words. A more complex system would use sentiment analysis algorithms. My time and budget are limited, so phase 1 will be a simple implementation.

I’ve been looking for some sort of emotional classification system. There are several competing models (of course).

My favorite is the Plutchik Wheel of Emotions, which was developed in 1980. It has a symmetrical look to it and apparently is deployed in various AI systems.

 

Plutchik-wheel.svg

Other models, such as the Lövheim cube of emotion, are more recent and seem compelling at first. But it’s missing something critical: sadness or grief. Really? This is such a basic human emotion, and when I saw it was absent, I tossed the cube model.

1280px-Lövheim_cube_of_emotion

Back to the Plutchik model…my “Twitter bucket” captures certain words from the color wheel above. I want enough words for a reasonable statistical correlation (about 2000 tweets/hour), but too many of one word will strain my little Linode server. For example, the word “happy” is a no-go, since there are thousands of Tweets with that word in them each minute.

Many people tweet about anger by just using the word “angry” or “anger”, so that’s an easy one. Same thing goes with boredom/boring/bored.

For other words, I need to go synonym-hunting. Take apprehension: the Twitter stream with this word is just a trickle, so I’ve mapped it to “worry” and “anxiety,” which show up more often in tweets. It’s not quite correct, but reasonably close.
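In code, that synonym-hunting amounts to a lookup table. Here is a small Python sketch of the idea (the word lists are illustrative, not the project’s actual mapping):

```python
# Maps words seen in tweets back to a canonical Plutchik emotion.
# The synonym choices below are approximations, as described above.
SYNONYM_MAP = {
    "angry": "anger",
    "anger": "anger",
    "bored": "boredom",
    "boring": "boredom",
    "boredom": "boredom",
    "worry": "apprehension",
    "worried": "apprehension",
    "anxiety": "apprehension",
    "anxious": "apprehension",
}

def classify_tweet(text):
    """Return the set of canonical emotions mentioned in a tweet."""
    words = (w.strip(".,!?#").lower() for w in text.split())
    return {SYNONYM_MAP[w] for w in words if w in SYNONYM_MAP}

print(classify_tweet("So worried and angry about the drought!"))
```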

The word “terror” has completely lost its meaning, and now only refers to political discourse. I’m still trying to figure out a good synonym-map for terror: terrifying, terrify, terrible? None is quite right. There’s not a good word to represent that feeling of absolute fear.

This gets tricky and I’m walking into the dark valley of linguistics. I am well-aware of the pitfalls.

Screen Shot 2014-10-01 at 3.18.33 PM

 

* Disclaimer:
EquityBot doesn’t actually trade stocks. It is an art project intended for illustrative purposes only, and is not intended as actual investment advice. EquityBot is not a licensed financial advisor. It is not, and should not be regarded as, investment advice or a recommendation regarding any particular security or course of action.

 

EquityBot @ Impakt

My exciting news is that this fall I will be an artist-in-residence at Impakt Works, which is in Utrecht, the Netherlands. The same organization puts on the Impakt Festival every year, which is a media arts festival that has been happening since 1988. My residency is from Sept 15-Nov 15 and coincides with the festival at the end of October.

Utrecht is a 30-minute train ride from Amsterdam and 45 minutes from Rotterdam, and by all accounts is a small, beautiful canal city with medieval origins; it also hosts the largest university in the Netherlands.

Of course, I’m thrilled. This is my first European art residency and I’ll have a chance to reconnect with some friends who live in the region as well as make many new connections.

impakt; utrecht; www.impakt.nl

The project I’ll be working on is called EquityBot and will premiere at the Impakt Festival in late October as part of their online component. It will have a virtual presence like my Playing Duchamp artwork (a Turbulence commission) and my more recent project, Bot Collective, produced while an artist-in-residence at Autodesk.

Like many of my projects this year, this will involve heavy coding, data-visualization and a sculptural component.

equity_bot_logo

At this point, I’m in the research and pre-production phase. While configuring back-end server code, I’m also gathering reading materials about capital and algorithms for the upcoming plane rides, train rides and rainy Netherland evenings.

Here is the project description:

EquityBot

EquityBot is a stock-trading algorithm that explores the connections between collective emotions on social media and financial speculation. Using custom algorithms Equitybot correlates group sentiments expressed on Twitter with fluctuations in related stocks, distilling trends in worldwide moods into financial predictions which it then issues through its own Twitter feed. By re-inserting its results into the same social media system it draws upon, Equitybot elaborates on the ways in which digital networks can enchain complex systems of affect and decision making to produce unpredictable and volatile feedback loops between human and non-human actors.

Currently, autonomous trading algorithms comprise the large majority of stock trades. These analytic engines are normally sequestered by private investment companies operating with billions of dollars. EquityBot reworks this system, imagining what it might be like if this technological attention were directed towards the public good instead. How would the transparent, public sharing of powerful financial tools affect the way the stock market works for the average investor?

kildall_bigdatadreams

I’m imagining a digital fabrication portion of EquityBot, which will be the more experimental part of the project and will involve 3D-printed joinery. I’ll be collaborating with my longtime friend and colleague, Michael Ang, on the technology — he’s already been developing a related polygon construction kit — as well as doing some idea-generation together.

“Mang” lives in Berlin, which is a relatively short train ride, so I’m planning to make a trip where we can work together in person and get inspired by some of the German architecture.

My new 3D printer — a Printrbot Simple Metal — will accompany me to Europe. This small, relatively portable machine produces decent-quality results, at least for 3D joints, which will be hidden anyway.

printrbot