Flagscape: Data-visualizing Global Economic Exchange in Virtual Reality

Overview

Scott Kildall is conducting research into data-navigation techniques in virtual reality with a project called Flagscape, which constructs a surreal world of economic exchange between nations, based on United Nations data.

The work deploys “data bodies,” which represent exports such as metal ores and fossil fuels; these bodies move through space and impart the complexities of economic relations. Viewers move through the procedurally-generated datascape rather than acting upon the data elements, inverting the common paradigm of legible and controlled data access.

Economic exchange in VR

Details

The code constructs its dataset at runtime from several databases, including population, carbon emissions per capita, military personnel per capita and a United Nations database on resource extraction. All of these get combined to construct the Flagscape data bodies. Each one represents a single datum, linked to a specific country.

The only stationary data body is each country’s population model, which scales to that country’s relative value and resembles a 3D person formed by revolving a profile around a central axis. The code positions these forms at their appropriate 3D world locations, such that China and India — the two largest population bodies — act as waypoints as their forms dwarf all others.
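As a rough sketch of this kind of scaling, the snippet below normalizes each country’s population against the largest one. The cube root and the names are my own assumptions for illustration, not Flagscape’s actual code.

```cpp
#include <cmath>

// Hypothetical scaling for a population body, relative to the largest
// population in the dataset. The cube root is an assumption: it makes the
// body's volume, rather than its height, track the datum.
double populationScale(double population, double maxPopulation) {
    return std::cbrt(population / maxPopulation);
}
```

Under a mapping like this, China and India sit near 1.0 while small nations shrink toward zero without vanishing entirely.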

Population bodies of India and China

A moshed flag skins every data body, acting as a glitched representation that subverts its own national identity. Underneath the flag is a complex set of relations of exchange that exceeds nationhood. For example, resource-extraction machines are made in one country that then get purchased by another to extract the very resources that make those machines.

Brazil flag, moshed

Flagscape reminds us that our borders are imaginary. In this idealized 3D space, there are no delineations of territory, only lines that guide trade between countries, forms magically gliding along invisible paths. What the database cannot tell us is how exactly the complex power relations move resources from one nation to another. Meanwhile, carbon emissions, the only untethered data bodies in Flagscape since they affect the entire planet, spin out of control into the distance, only to get endlessly respawned.

Carbon emissions by Canada and Australia

The primary acoustic element triggers when you navigate close to a population body. That country’s national anthem plays, filling your ears with a wash of drums, horns and militaristic melodies that flow into a state of sameness.

Initial Inspiration

The project is inspired by early notions of cyberspace described by writers such as William Gibson, where virtual reality is a space of infinity and abstraction. In Neuromancer, published in 1984, he describes cyberspace as:

“Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding…”

Neuromancer

While this text entices, most VR content recreates physical spaces, such as the British Museum with the same artwork, floor tiles and walls as the real building, or it builds militarized spaces in which “you” are a set of hands that trigger weapons as you walk through combat mazes. At some level, this is a consequence of the linear thinking embedded in our fast-paced capitalist economy, arcing towards functionality but ignoring artistic possibilities. This research project acts as an antidote to these constrained environments.

OverkillVR, a virtual reality game

It was through these initial conversations around virtual datascapes with Ruth Gibson and Bruno Martelli that I was invited to join Reality Remix, a project funded by an AHRC grant under its Next Generation Immersive Experiences call. My role is a “collaborator” (aka artist) creating my own project under these auspices.

Spatialization and Materializing Data

Unlike the 2D screen, which has a flatness and everyday familiarity, VR offers full spatialization and a new form of non-materiality, which Flagscape fully plays with. One concept that I have been working with is that since data has physical consequences, it should exist as a “real” object. This project will expand this idea but will also blur sensorial experiences, tricking the visitor into a boundary zone of the non-material.

At the same time, Flagscape is its own form of landscape, creating an entire universe of possibility. It refers to traditions of depicting landscapes as art objects, as well as iconic Earthworks pieces such as Spiral Jetty, where the Earth itself acts as a canvas. However, this type of datascape will be entirely infinite, like the boundaries of the imagination.

Spiral Jetty

Finally, Flagscape continues the stream of instruction-based work by artists such as Sol LeWitt, where an algorithm rather than the artist creates the work. Here, it accomplishes a few things: it takes the artist’s hand away from creating the form itself, and it recognizes the power of artificial intelligence to assist in creating new forms of artwork.

Alternate Conception of Space in Virtual Reality

VR offers many unique forms of interaction, perception and immersion, but one aspect that defines it is the alternate sense of space. Similar to the religious spaces that preceded the dominance of science, as described by Margaret Wertheim in The Pearly Gates of Cyberspace, this “other” space has the potential to create a set of rules that transport us to a unique imagination space.

As technology progresses and culture responds, the linearity of engineering-thinking often confines creativity rather than enhances it. Capitalist spaces get replicated and modified to adapt to the technology, validating McLuhan’s predictions of instantaneous, group-like thinking. The swipe gestures we use on our phones get encoded in muscle memory. We slyly refer to Wikipedia as the “wonder-killer”. The flying car is often cited as the most desirable future invention.

Flying car from Blade Runner

At stake with technological progress is imagination itself. Will the content of the spaces that get opened up with new technologies be ones that enhance our creativity or dull it? Who has access to technology-inspired culture? How can we use, enhance and subvert online distribution channels? These are just some of the questions and conversations that this project will ask — in the context of virtual space.

I see VR in a place similar to that of video art in the 1970s, which thrived with access to affordable camcorders. However, VR, and this specific project, can easily disseminate into homes and public spaces through various app stores. Ultimately, with this project I hope to direct conversations around access and imagination in art and technology.

Marshall McLuhan with many telephones

Work-in-progress Presentation

Our Reality Remix group will be presenting its research, proof-of-concepts and prototypes at two venues in London on July 27th and July 28th, 2018 at Ravensbourne and Siobhan Davies Studios. Both free events are open to the public.

Bibliography
Gibson, W. (1993). Neuromancer. London: HarperCollins.
McLuhan, M. (1967). The Medium is the Massage: An Inventory of Effects. Bantam Books.
Wertheim, M. (2010). The Pearly Gates of Cyberspace. New York: Norton.

Revamping Moon v Earth

My artwork occupies the space between the digital and analog as I generate physical expressions of the virtual. In the last several years, most of my work has involved transforming data into sculptures and installations.

But sometimes I return to narratives themselves. It’s not so much a lack of focus but rather a continual inquiry into technology and its social expression. Imaginary narratives seem particularly relevant these days with the subjectivity of truth magnifying an already polarized political discourse.

I recently finished revamping a project called Moon v Earth, originally presented in 2012 at the Adler Planetarium. This augmented reality installation depicts a future narrative where a moon colony run by elites declares its independence from Earth. It is now on display at the Institute of Contemporary Art in San Jose.

Here are a few augments from the 2012 exhibition that made it into the 2018 show. My favorite was this pair of newspapers, which showed two different ‘truths’. At the time, “fake news” meant nothing and the idea of seeding false stories into online outlets wasn’t remarkable.

The last augment — the ridiculous wooden catapult about to launch rocks at Earth — refers to the Robert Heinlein novel The Moon Is a Harsh Mistress, which inspired my project many years ago. In his plot line, the moon is a penal colony, much like Australia 200 years ago, and an AI is one of the three heads of the revolution. The independence-seekers achieve victory by hurling asteroids at Earth, their most effective weapon.

I created this absurd 3D model in the imaginary world of Second Life as an amateur 3D assemblage. It was quick and dirty, like much digital artwork and as we see nowadays, like the fragility of truth.

The twist of Moon v Earth, at least in the 2018 version, is that the augments aren’t virtual at all; instead they are constructed as physical augments hanging from fishing line or hot-glued against cardboard backings. At first, I tried working with AR technology, but soon discovered its compromises: device-dependence and a distance between the viewer and the experience. Instead, the physical objects show the fragile and fragmentary nature of the work in cheap cardboard facades and flimsy hanging structures distributed throughout the venue.

NextNewGames is at the San Jose ICA until September 16th, 2018

Farewell, Dinacon

I just spent 20 days on a sparsely-inhabited island in Thailand with about 80 artists, scientists and other imaginative people. Everyone worked on their own projects, ranging from jungle-foraged dinners to plant-piloted drones to batteries made from microbial energy. We had no AC for much of the day, got bitten by weaver ants, were surrounded by jungle cats and ate off each other’s plates. And I absolutely loved the experience.

Microbial Battery Workshop

The gathering was Dinacon, the first Digital Naturalism Conference, co-organized by Tasneem Khan and Andrew Quitmeyer. I was a “node leader,” which meant that I reviewed applications, organized workshops and stayed at the event longer than most.

Dinacon registration area

The site was Koh Lon, a small island just off the coast of Phuket. We stayed at a “resort,” which was actually fairly minimal, with small cabins, a common house and options for tent camping. From the main work area, you walk a few minutes in one direction and you’re on the beach; in the other direction is jungle. There were no cars on the island, a handful of scooters, two hundred or so local residents and not a single dog. The soundscape felt entirely tropical, with cicadas, birds and frogs filling the airwaves with their chatter. Our dinner was boated in each day, and at the small restaurant we could get the three essentials: wifi (when the power was on), breakfast + lunch, and beer.

Selfie with Koh Lon in the background

The participants came from all over the world and arrived and left at random times, such that there was a constant inflow of new friends and outflow of sad goodbyes. Each day, we had about 40 people on the island. I could nerd out on my project, kayak in the water, take a break on the ship that we had access to (Diva Andaman), find myself sitting in a chair sharing ideas, play with hermit crabs or get away from everyone and walk in the jungle. Helping one another was something that effortlessly emerged in our temporary community.

Saying hi to the Diva Andaman

Questions I asked myself upon arrival: what happens when you assemble a group of project-creating strangers in a natural environment, where you can take a break by putting on a pair of swim trunks and walking into the ocean? What does building things on the island, with its outdoor space and natural light, do to your creative practice? How can I prototype an artwork that collaborates with this specific place?

I quickly became a lot less efficient and much more connected to people and place. I ended up creating better work and my body was utterly relaxed. Any shoulder pain I might have in an office space dissipated quickly. There were no Google calendar invites, no afternoon soy lattes and certainly no eating at my desk.

I found myself in daily arrhythmic patterns of production, often sitting on my neighbor’s porch with headphones on and composing audio synth code, then stopping suddenly and reveling in nature. I would get interrupted to see a tree snake or find myself lost in conversation about someone’s project. In the evenings, we usually had self-organized small workshops or informal talks. I drank beer sometimes but also often went to bed early, worn out from the humidity and brain swell each day.

Arduino coding by sunset

I did make a thing! This experiment — a potential new artwork — has a working title of DinaSynth Quartet. It is a live audio-synth performance between a plant, the soil, the air and the water: an electronics installation designed to exist only outside. I connected each of these four “players” to sensors: the plant with electrodes, the soil with a moisture sensor, the water with an EC sensor and the air with a humidity sensor.

Each one used a variation on my Sonaqua boards — a kit which I am actively using for workshops — to make a dynamic audio synth track, modulating bleeps and clicks according to its sensor readings, creating a concert performance of sorts.
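A minimal sketch of one such “player,” assuming an Arduino-style 10-bit sensor reading. The particular mapping from reading to pitch and bleep rate is my guess at the approach, not the actual DinaSynth code.

```cpp
#include <string>

// One "player" in the quartet: a sensor reading modulates its synth voice.
// The reading range (0..1023) assumes a 10-bit ADC; the pitch and interval
// ranges below are illustrative placeholders.
struct Player {
    std::string name;  // "plant", "soil", "water" or "air"
    int reading;       // raw sensor value, 0..1023
};

struct Voice {
    double pitchHz;    // tone frequency for this player
    int bleepMs;       // interval between bleeps
};

Voice voiceFor(const Player& p) {
    double t = p.reading / 1023.0;        // normalize the sensor value
    return {
        220.0 + t * (1760.0 - 220.0),     // pitch rises with the reading
        static_cast<int>(600 - t * 500)   // busier bleeps at high readings
    };
}
```

Each of the four voices then runs the same mapping against its own sensor, so the composition shifts as the environment changes.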

I’m not sure exactly where the work will go next, but I’m happy with the results. It was my first audio synth project and I’m far from being an expert, calling my approach “beginner’s mind”. However, most of the participants liked the idea and the specific composition that the jungle played.

I already miss everyone there: Jen, Tina, David, Rana, Pom, Sebastian and so, so many other delightful friends. And this is one thing I love about the life I’ve created for myself as a new media artist: after events like this, I now have friends who are doing inspiring work all over the world.

Jungle-foraged dinner party

 

Kira’s birthday party

Putting on a heart rate sensor on one of the local cats

Sonaqua Workshop in the common space

Local lotus flower

 

Millipedes were everywhere

 

Soldering work in the main space

Dani doing a lizard dissection on the beach at sunset

 

Andrew holding a snake

 

Little Niko, my favorite of the Dinacon Cats

Dinacon: 2 more environmental synths

Dinacon — the Digital Naturalism Conference on the island of Koh Lon in Thailand — has been amazing. It’s been an opportunity to meet and collaborate with other artists, scientists, hackers, writers and more. The caliber of the participants has been extraordinary.

My art experiments have been around creating audio synth compositions from the environment, using low-cost sensors and custom electronics to make site-specific results.

In the last two days, I’ve made two composition-circuits: this one (below), which uses a soil sensor to track moisture in the sand.

And this one, which uses electrodes on plant leaves to simulate what the plants might be “saying”.

The GitHub repo for all my experiments is here.

Sonaqua at Currents 2018

I jokingly referred to my Sonaqua artwork as “the most annoying piece at the festival”. The exhibition was Currents New Media 2018, which was an incredible event.

It was a hit with the public and invited multi-user interaction. Kids went crazy for it. Adults seemed to enjoy the square-waves of audio glitch all night.

So yes, perhaps a tad abrasive, but it was also widely popular.

A number of people were intrigued by the water samples and electronics with what looked like a tangly mess of wires. It was actually a solid wiring job and nothing broke!

After working at the Exploratorium for a couple of years, I adjusted my approach to public engagement so that anyone can get something from this artwork.

How does it work?

The electrodes take a reading of the electrical current flowing through various water samples that I collected throughout New Mexico. If more current flows through the water, there are more dissolved minerals and salts, which is usually an indicator of less clean water.

The technical measurement is electrical conductivity, which correlates to total dissolved solids, which is one measure of water quality that scientists frequently use.

The installation plays lower tones for water that is more conductive (less pure) and higher tones for water that has fewer pollutants in it.
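That mapping can be sketched as a simple linear interpolation, inverted so a higher conductivity reading produces a lower tone. The frequency and reading ranges below are placeholders, not the installation’s actual values.

```cpp
#include <algorithm>

// Map a raw conductivity reading (higher = more dissolved solids = less
// pure) to an audible frequency. Conductive samples get low tones, cleaner
// samples high ones. All ranges here are illustrative assumptions.
double toneForConductivity(double reading, double minReading, double maxReading,
                           double lowHz = 110.0, double highHz = 880.0) {
    double t = (reading - minReading) / (maxReading - minReading);
    t = std::clamp(t, 0.0, 1.0);         // guard against out-of-range samples
    return highHz - t * (highHz - lowHz); // invert: high reading -> low pitch
}
```

Twelve samples run through a mapping like this give twelve distinct pitches, which is what lets the table read as a rough scale of water quality.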

The results are unpredictable and fun, with 12 different water quality samples.

The light table is custom-built with etchings of New Mexico rivers and waterways, indicating where the original water sample was taken.


Gun Control (revisited)

My writing (below) was originally printed as part of the Disobedient Electronics project by Garnet Hertz. It is a limited edition publishing project that highlights confrontational work from industrial designers, electronic artists, hackers and makers that disobey conventions. 

 

Gun Control (revisited)

In 2004, I created Gun Control — a set of four electromechanical sculptures, which used stepper motors, servos and cheap cameras that were controlled by AVR code. The distinguishing feature of each unit is a police-issue semi-automatic replica handgun. You can purchase these authentic-looking firearms for less than $100.

The make-believe weapons arrived in the mail a week after I ordered them. That night, I closed the blinds, drank too much whisky and danced around my apartment in my underwear waving my new guns around. The next morning, I packed them in a duffel bag and took the “L” in Chicago to my studio. During the 45-minute commute I felt like a criminal.

Each gun is connected to a stepper motor via a direct-drive shaft and flexible couplings. I used a lathe and a milling machine to make custom fittings. I hid unsightly electronics in a custom-sewn leather pouch, resembling some sort of body bag.

As people enter the Gun Control installation space, the cameras track their movement, and the guns follow their motion. Well, at least this is what I had hoped it would do. However, I had committed to using the first-gen CMUCam, whose blob-tracking software was spotty at best. I was under a deadline. It was too late to spec out new cameras. Plus, these were the right size for the artwork, which used decentralized embedded hardware. I shifted my focus to building a chaotic system.

I re-coded the installation so the guns would point at different targets. They would occasionally twirl about playfully and re-home themselves. I programmed the stepper motors to make the armatures shake and rattle when they got confusing target information. The software design embraced unpredictability, which made the whole artwork feel uncertain, embodying the primal emotion of fear.

Gun Control was my heavy-handed response to the post-9/11 landscape and the onset of the Iraq War. I exhibited it twice, then packed it up. It lacked subtlety and tension. At the time, there was not enough room for the viewer.

Just last month, I pulled the artwork out of deep storage. I brought the pieces to my studio and plugged in one of the units. It functioned perfectly. Upon revisiting this piece after 12 years, my combination of guns and surveillance seems eerily prescient.

Mass shootings have drastically increased in the last several years. Surveillance is everywhere, both with physical cameras and the invisible data-tracking from internet servers. Documentation of police shootings of unarmed African Americans is sadly, commonplace. I no longer recoil from the explicit violence of this old artwork.

I coded this using AVR microcontrollers, just before the Arduino was launched. It was tedious work just to get the various components working. I can no longer understand the lines of C code that I wrote many years ago. The younger me was technically smarter than the current me. My older self can put this historical piece into perspective. I plan to re-exhibit it in the coming years.

GitHub repo: https://github.com/scottkildall/GunControl

Collecting Sacred Fluids

I recently debuted a new art installation called Cybernetic Spirits at the L.A.S.T. Festival. This is an interactive electronic artwork, where participants generate sonic arrangements based on various sacred fluids. These include both historical liquids-of-worship such as holy water, blood and breast milk, and more contemporary ones such as gasoline and coconut water.

My proposal got accepted. Next, I had to actually collect these fluids.

My original list included: blood, holy water, coffee, gasoline, adrenaline, breast milk, corn syrup, wine, Coca-Cola, coconut water, vaccine (measles), sweat and kombucha.

Some of these were easily procured at the local convenience store and a trip to the local gas pump. No problem.

But what about the others? I found holy water on Amazon, which didn’t surprise me, but then again this wasn’t anything I had ever thought about before.

I knew the medical ones would be the hardest: adrenaline and a measles vaccine. After hours scouring the internet and emailing with a doctor friend of mine, I realized I had to abandon these two. They were either prohibitively expensive or would require deceptive techniques that I wasn’t willing to try.

Art is a bag of failures and I expected not to be entirely successful. Corn syrup surprised me, however. After my online shipment arrived, I discovered it was sticky and too thick. It is syrup, after all. Right. My electrical probes got gunky and, more to the point, the syrup didn’t conduct any electrical current. No current = no sound.

Meanwhile, I put out feelers for the human bodily fluids: blood, sweat and breast milk. Although it was easy to find animal blood, what I really wanted was human blood (mine). I connected with a friend of a friend, who is a licensed nurse and supporter of the arts. After many emails, we arranged an in-home blood draw. I thought I’d be squeamish about watching my blood go into several vials (I needed 50ml for the installation), but instead was fascinated by the process. We used anti-coagulant to make it less clotty, but it still separated into a viscous section at the bottom.

Since I am unable to produce breast milk, I cautiously inquired with some good friends who are recent moms and found someone willing to help. So grateful! She supplied me with one baby-serving size of breast milk just a couple of days before the exhibition, so that it would preserve better. At this point, along with the human blood in the fridge, I was thankful that I live alone and didn’t have to explain what was going on to skeptical housemates.

I saved the sweat for last, thinking that there was some easy way I could get sweaty in an exercise class and extract some. Once again a friend helped me, or at least tried, by going to an indoor cycling class and sweating into a cotton t-shirt. However, wringing it out produced maybe a drop or two of sweat, nowhere close to the required 50ml for the vials.

I was sweating over the sweat and really wanted it. I made more inquiries. One colleague suggested tears. Of course, blood, sweat and tears, though admittedly I felt like I was treading into Kiki Smith territory at this point.

So, I did a calculation on the amount of tears you would need to collect 50ml: it would mean crying a river every day for about 8 months. Not enough time and not enough sadness.

Finally, just before shooting the documentation for the installation, the sweat came through. A friend’s father works for a company that produces artificial sweat and gave me 5 gallons of the mixture. It was a heavy thing to carry on BART, but I made it home without any spillage.

Artificial sweat? Seems gross and weird. The truth is a lot more sensible. Many companies need to test the effects of human sweat on products, from wearable devices to steering wheels, and it’s more efficient to make synthetic sweat than to work with actual humans. Economics carves odd channels.

My artwork often takes me on odd paths of inquiry and this was no exception. Now, I just have to figure out what to do with all the sweat I have stored in my fridge. 


Sonaqua goes to Biocultura

Last month (yes, blogging can be slow) I traveled to Santa Fe with the support of Andrea Polli and taught a workshop on my Sonaqua project.

The basic idea of Sonaqua is to sonify — create sounds — based on water quality. As modules, these are Arduino-based and designed for a single user to make a sound. I’m actively teaching workshops on these and have open-sourced the software and made the hardware plans available.

Interested in a Sonaqua workshop? Then contact me.

My Sonaqua installation creates orchestral arrangements of water samples based on electrical conductivity. Here’s a link to the video that explains the installation, which I did in Bangkok this June.

Back to New Mexico. In the early part of the week, I taught a workshop on the Sonaqua circuit in one of Andrea’s classes at UNM, creating single-player modules for each student. We collected water samples and played each one separately. The students were fun and set up this small example of water samples with progressive frequencies, almost like a scale.

The lower the pitch, the more polluted the water sample, so higher-pitched samples might correspond to filtered drinking water.

Later in the week, I traveled to Biocultura in Santa Fe, which is a space that Andrea co-runs. Here, I installed the orchestral arrangement of the work, based on 12 water samples in New Mexico. She had a whole set of beakers and scientific-looking vessels, so I used what we had on hand and installed it on a shelf behind the presentation.

A physical map (hard to find!) of the sites where I took water samples.

And a close-up shot of one of the water samples + speakers. If you look closely, you can see an LED inside the water sample.

My face is obscured by the backlit screen. I presented my research with Sonaqua, as well as several other projects around water that evening to the Biocultura audience.

And afterwards, the attendees checked out the installation while I answered questions.

Data Crystals at EVA

I just finished attending the EVA London conference this week, where I did a demonstration of my Data Crystals project. What follows is the formal abstract for the demonstration; writing it helped clear up some of my ideas about Data Crystals and the digital fabrication of physical sculptures and installations.

 

Embodied Data and Digital Fabrication: Demonstration with Code and Materials
by Scott Kildall

1. INTRODUCTION

Data has tangible consequences in the real world. Accordingly, physical data-visualizations have the potential to engage with the actual effects of the data itself. A data-generated sculpture or art installation is something that people can move around, through or inside of. They experience the dimensionality of data with their own natural perceptual mechanisms. However, creating physical data visualizations presents unique material challenges, since these objects exist in stasis rather than in a virtual space with a guided UX design. In this demonstration, I will present my recent research into producing sculptures from data using my custom software code that creates files for digital fabrication machines.

2. WHAT DOES DATA LOOK LIKE?

The overarching question that guides my work is: what does data look like? Referencing architecture, my artwork such as Data Crystals (figure 2) executes code that maps, stacks and assembles data “bricks” to form unique digital artifacts. The forms of these objects are impossible to predict from the original data-mapping, and the clustering code will produce different variations each time it runs.
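A toy version of the mapping-and-stacking pass might quantize each mapped data point into a unit brick and stack colliding points upward. The real clustering code iterates further until the bricks cohere into a single crystal, so treat this only as an illustration of the idea.

```cpp
#include <cmath>
#include <map>
#include <tuple>
#include <utility>
#include <vector>

struct Point { double x, y, z; };  // a data point already mapped into 3D

// Quantize mapped data points into unit "bricks"; any points landing in the
// same (x, y) cell get stacked upward, one brick per datum. This is a sketch
// of the mapping/stacking idea, not the project's actual clustering code.
std::vector<std::tuple<int, int, int>> stackBricks(const std::vector<Point>& pts) {
    std::map<std::pair<int, int>, int> height;   // next free z per column
    std::vector<std::tuple<int, int, int>> bricks;
    for (const auto& p : pts) {
        int gx = static_cast<int>(std::floor(p.x));
        int gy = static_cast<int>(std::floor(p.y));
        int gz = height[{gx, gy}]++;             // stack upward on collision
        bricks.emplace_back(gx, gy, gz);
    }
    return bricks;
}
```

Because the output depends on iteration order and data density, even this toy pass shows why the final forms are hard to predict from the raw data-mapping.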

Other sculptures remove material through intense kinetic energy. Bad Data (figure 3) and Strewn Fields (figure 1) both use the waterjet machine to gouge data into physical material using a high-pressure stream of water. The material in this case — aluminum honeycomb panels and stone slabs — reacts in adverse ways as it splinters and deforms under the violence of the machine.

2.1 Material Expression

Physical data-visualizations act on materials instead of pixels, so there is a dialogue between the data and its material expression. Data Crystals depicts municipal data of San Francisco and has an otherworldly, ghostly quality of stacked and intersecting cubes. The data gets served from a web portal and is situated in the urban architecture, so the 3D-printed bricks are an appropriate form of expression.

Bad Data captures data that is “bad” in the shallow sense of the word, rendering datasets such as Internet Data Breaches, Worldwide UFO Sightings or Mass Shootings in the United States. The water from the machine gouges and ruptures aluminum honeycomb material in unpredictable ways, similar to the way data tears apart our social fabric. This material is emblematic of the modern era, as aluminum began to be mass-refined at the end of the 19th century. These datasets exemplify conflicts of our times such as science/heresy and digital security/infiltration.

2.2 Frozen in Time

Once created, these sculptures cannot be endlessly altered like screen-based data visualizations. This challenges the artwork to work with fixed data or to consider the effect of capturing a specific moment.

For example, Strewn Fields is a data-visualization of meteorite impact data. When a large asteroid enters the earth’s atmosphere, it does so at a high velocity of approximately 30,000 km/hour. Before impact, it breaks up into thousands of small fragments, which are meteorites. Usually they hit our planet in the ocean or at remote locations. The intense energy of the waterjet machine gouges the surface of each stone, mirroring the raw kinetic energy of a planetoid colliding with the surface of the Earth. The static etching captures the act of impact and survives as an antithetical gesture to the event itself. The actual remnants and debris (the meteorites) have been collected, sold and scattered; what remains is just a dataset, which I have translated into a physical form.

2.3 Formal Challenges to Sculpture

This sort of “data art” challenges the formal aspects of sculpture. Firstly, machine-generated artwork removes the artist’s hand from the work, building upon the legacy of algorithmic artwork by Sol LeWitt and others. Execution of this work is conducted by the stepper motor rather than by the gestures of the artist.

Secondly, the input data are unknowable forms until they are actually rendered. The patterns are neither mathematical nor random, giving a certain quality of perceptual coherence to the work. Data Crystals: Crime Incidents has 30,000 data points. Using code-based clustering algorithms, it creates forms only recently possible with the combination of digital fabrication and large amounts of data.

3. CODE

My sculpture-generation tools are custom-developed in C++ using openFrameworks, an open source toolkit. My code repositories are on GitHub: https://github.com/scottkildall. My own software bypasses any conventional modeling package. It can handle very complex geometry and, more importantly, doesn’t have the “look” that a program such as Rhino/Grasshopper generates.

3.1 Direct-to-Machine

My process of data-translation is optimized for specific machines. Data Crystals generate STL files which most 3D printers can read. My code generates PostScript (.ps) files for the waterjet machine. The conversation with the machine itself is direct. During the production and iteration process, once I define the workflow, the refinements proceed quickly. It is optimized, like the machine that creates the artwork.
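For the 3D-printed works, emitting geometry ultimately means writing triangles in the STL format that printers read. A minimal ASCII facet writer looks like the sketch below; this is the standard STL syntax, not the project’s actual exporter.

```cpp
#include <sstream>
#include <string>

// Emit one triangle facet in ASCII STL. A complete file wraps a run of
// facets in "solid <name>" ... "endsolid <name>". Minimal sketch only.
std::string stlFacet(double ax, double ay, double az,
                     double bx, double by, double bz,
                     double cx, double cy, double cz) {
    std::ostringstream out;
    out << "facet normal 0 0 0\n"   // most slicers recompute normals anyway
        << "  outer loop\n"
        << "    vertex " << ax << ' ' << ay << ' ' << az << '\n'
        << "    vertex " << bx << ' ' << by << ' ' << bz << '\n'
        << "    vertex " << cx << ' ' << cy << ' ' << cz << '\n'
        << "  endloop\n"
        << "endfacet\n";
    return out.str();
}
```

Each brick then becomes twelve such facets (two triangles per cube face), which is why generating STL directly from code, with no modeling package in between, stays simple.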

3.2 London Layering

In my demonstration, I will use various open data from London. I focus not on data that I want to acquire, but rather on data that I can acquire. I will demonstrate a custom build of Data Crystals which shows multiple layers of municipal data, and I will run clustering algorithms to create several Data Crystals for the City of London.

 

Figure 1: Strewn Fields (2016)
by Scott Kildall
Waterjet-etched stone

Figure 2:
Data Crystals: Crime Incidents (2014)
by Scott Kildall
3D-print mounted on wood

Figure 3:
Bad Data: U.S. Mass Shootings (2015)
by Scott Kildall
Waterjet-etched aluminum honeycomb panel

Playing with the e-mail scammers

When someone sends you an email scam, think of it as an opportunity for fun. In this case, the scammer stopped replying after several of my responses.

Here is the exchange:

—–

Good Day,
How is everything with you? I picked interest in your artwork and decided to write you. I will like to know if your artwork can be purchased and shipped internationally?. I can email the artwork of interest and payment will be completed in full once you confirm my purchase order with a quotation.
Kindly let me know when you are in office and ready to take my artwork order also let me know if you accept either Visa Card or Master Card for payment furthermore you can email me your recently updated website or art price list in your response.
Best Regards
Yoshida

Hi Yoshida,

Thank you for contacting me.

I’m curious which artwork you are interested in, I have available:

(1) Shoe-gazing — a 96-hour performance art video of me looking at my shoes. Audio track is optional.

(2) MDMA Buttplug — I think the title says it all. Leave it to your imagination.

(3) The Salmonella Experience — A crowdsourced experiment on Mechanical Turk, where I send people salmonella-infested eggs, which they ingest and document over a 4-day period.

Sincerely,
Scott Kildall

Hi Scott,
Good to hear from you please can you email me the cost of three available pieces

Thanks
Yoshida

Hi Yoshida,

Which one do you like best from my list?

That is the most important question. Price is secondary.

Best,
Scott

(3) The Salmonella Experience — A crowdsourced experiment on Mechanical Turk, where I send people salmonella-infested eggs, which they ingest and document over a 4-day period.

Thanks,
Yoshida

Hi Yoshida,

Thank you for choosing The Salmonella Experience.

I had thought that MDMA Buttplug would be more to your liking, for some reason. I do want to give you one last chance to reconsider. For, once we go down a financial path, then we cannot turn back and choose another artwork.

So, are you sure about The Salmonella Experience?

Question: What attracted you to this project over the other ones that were available?

Thank you,
Scott Kildall

<no response after this one…>

A friend of mine pointed me to this TED talk by James Veitch. So, obviously I’m not the first.

EquityBot goes to Vienna

EquityBot resumes its world tour (Utrecht, Vancouver, Bilbao, Amsterdam, San Francisco, Columbus) with a group show in Vienna.

MOOD SWINGS – On Mood Politics, Sentiment Data, Market Sentiments and Other Sentiment Agencies

Curated by Sabine Winkler

 

dates and times
Mar 31 to May 28, Tue to Sun 13-20:00

Press Tour: Wed, Mar 29, 10:00
Opening: Thu, Mar 30, 19:00

abstract
It is moods rather than facts that are determining perceptions, decisions and courses of action to an ever greater degree. Mood data, in turn, is a sought-after subject for analysis; emotions are being quantified and simulated. The exhibition “Mood Swings – On mood politics, sentiment data, market sentiments and other sentiment agencies”, curated by Sabine Winkler, focuses on the significance and radius of sentiment in politics, business, technology, media and art.

Artists:
Antoine Catala (FRA)*, Xavier Cha (USA), Florian Göttke (GER/NLD), Femke Herregraven (NLD), Hertog Nadler (NLD/ISR)*, Micah Hesse (USA)*, Francis Hunger (GER), Scott Kildall (USA), Barbora Kleinhamplová (CZE), Tom Molloy (IRL), Barbara Musil (AUT), Bego M. Santiago (ESP)*, Ruben van de Ven (NLD)*, Christina Werner (AUT)
*Q21/MQ Artist-in-Residence

Music Box Village

Last week, I visited the Music Box Village in New Orleans. This is a true DIY space where artists, fabricators and more have built “houses” that make sounds/music/noise in various ways. Together, skilled musicians (a group that does not include me) can make an orchestra of cacophonous music.

John Cage would have loved this space. Any sort of noise, even silence, is music, as people witnessed with his 4’33” composition. I’ve always loved this idea, the very fact that the tension between performance and non-performance can be music. At this site, the structures become the instruments. Anyone can play them. They are rusty, brittle, gentle and beautiful at the same time.

I’ve gone to many, many DIY spaces. I’ve even helped build some of them, such as The Shipyard, a mass of shipping containers that I helped weld, wire and cut in 2001. But all of these felt self-serving, creating a community of those we included and those who were somehow excluded because they didn’t speak the proper cultural language of metal-working and whiskey-drinking.

The Music Box Village felt different. I watched some of the founders present the project at the INST-INT Conference the day before and they spoke about community engagement and pairing collaborators from different socioeconomic backgrounds, skills and ages to build the houses. Their approach was organic and they finally secured a more permanent home which has metalworking facilities.

I can’t help but be struck by the banality of architecture. Houses pretty much look alike, entirely functional and rectilinear. Our commerce spaces are branded box stores adorning cities and suburbs. As humans, we are molded by our physical environment. Our eyes conform to corners. Our minds become less imaginative as a result.

One of my favorite artists who works with architecture is Krzysztof Wodiczko, who has worked for many decades projecting iconography onto buildings in order to subvert the function of the building, the war memorial and the political body.

He writes: “Dominant culture in all its forms and aesthetic practices remains in gross contradiction to the lived experience, communicative needs and rights of most of society, whose labour is its sole base.”

We have so much more to offer in terms of human imagination and creativity than the buildings that surround us and are institutions of capital. I left my tour of the Music Box Village feeling rejuvenated. Then I promptly went to the airport to catch a flight back home, engaging with the odd transitional space where air travel happens.


Orientation Week at American Arts Incubator

The first week in 2017 was orientation week for the American Arts Incubator program. I met the four other artists and soon associated their names with the respective exchange countries: Elaine Cheung (Russia), Michael Kuetemeyer (Cambodia), Nathan Ober (Colombia), and Balam Soto (Guatemala).

My exchange country will be Thailand, where I’ll be staying in the multilayered metropolis of Bangkok for 28 days in the May/June timeframe.

Thailand sounds exciting and of course it is. However, I’m approaching this not as a tourist, but rather as an arts ambassador. The issue that I’ll be addressing in my exchange is environmental health, specifically water pollution in the Chao Phraya River. This is especially relevant to Thailand, which has undergone rapid industrialization in the last couple of decades with environmental regulations lagging behind.

In Bangkok, I will engage in a dialogue of community data-collection and mapping through DIY science with a focus on water pollution, resulting in data-visualization installations and sculptures.

My time will be split about 80/20 between leading public workshops and creating my own artwork.

This ties into my current area of focus, creating physical data-visualizations such as the sculptures of the water infrastructure of San Francisco, and relates to my longstanding history of working in art and education at institutions such as the Exploratorium.

I learned many things this week, including, but not limited to: better patience for long meetings, organizational models for workshop engagement, the Drupal blogging platform, art-budgeting in a foreign country and organizational techniques.

But most of all, I learned that I have an amazing organization, ZERO1, that will be supporting my work there as well as a cohort of four other artists I can learn from. Trust.

For more information and updates, please join the American Arts Incubator Facebook page.

Machine Data Dreams @ Black & White Projects

This week, I opened a solo show called Machine Data Dreams at Black & White Projects. This was the culmination of several months of work in which I created three new series of works reflecting themes of data-mapping, machines and mortality.

The opening reception is Saturday, November 5th from 7-9pm. Full info on the event is here.

Two of the artworks came out of my artist residency with SETI and the third was funded by a San Francisco Arts Commission grant.

All of the artwork uses custom algorithms to translate datasets into physical form, which is an ongoing exploration that I’ve been focusing on in the last few years.

Each set of artwork deserves more detail but I’ll stick with a short summary of each.

Fresh from the waterjet, Strewn Fields visualizes meteorite impact data at four different locations on Earth.

Strewn Fields: Almahata Sitta

As an artist-in-residence with SETI, I worked with planetary scientist Peter Jenniskens to produce these four sculptural etchings into stone.

When an asteroid enters the Earth’s atmosphere, it does so at high velocity — approximately 30,000 km/hour. Before impact, it breaks into thousands of small fragments — meteorites — which spread over areas as large as 30 km. Usually the spatial debris falls into the ocean or hits remote locations where scientists can’t collect the fragments.

And, only recently have scientists been able to use GPS technology to geolocate hundreds of meteorites, which they also weigh as they gather them. The spread patterns of data are called “Strewn Fields”.

Dr. Jenniskens is not only one of the world’s experts on meteorites but also led the famous 2008 TC3 fragment recovery in Sudan of the Almahata Sitta impact.

With four datasets that he both provided and helped me decipher, I used the high-pressure waterjet machine at Autodesk’s Pier 9 Creative Workshops, where I work as an affiliate artist and also on their shop staff, to create four different sculptures.

Strewn Fields: Sutter’s Mill

The violence of the waterjet machine gouges the surface of each stone, mirroring the raw kinetic energy of a planetoid colliding with the surface of the Earth. My static etchings capture the act of impact, and survive as an antithetical gesture to the event itself. The actual remnants and debris — the meteorites themselves — have been collected, sold and scattered and what remains is just a dataset, which I have translated into a physical form.

A related work, Machine Data Dreams, is a series of data-etched memorials to the camcorder, a consumer device which birthed video art by making video production accessible to artists.

Machine Data Dreams: PixelVision

This project was supported by a San Francisco Individual Arts Commission grant. I did the data-collection itself during an intense week-long residency at Signal Culture, which has many iconic and working camcorders from 1969 to the present.

Sony Videorecorder (1969)
PixelVision Camera (1987)

During the residency, I built a custom Arduino data-logger which captured the raw electronic video signals, bypassing any computer or digital-signal-processing software. With custom software that I wrote, I transformed these into signals that I could then etch onto 2D surfaces. I paired each etching with its source video in the show itself.

Machine Data Dreams: PixelVision

Celebrity Asteroid Journeys is the last of the three artworks and is also a project from the SETI Artist-in-Residency program, though it is decidedly more light-hearted than Strewn Fields.

Celebrity Asteroid Journeys charts imaginary travels from one asteroid to another. There are about 700,000 known asteroids, with charted orbits. A small number of these have been named after celebrities.

Working with asteroid orbital data from JPL and estimated spaceship velocities, I charted 5 journeys between different sets of asteroids.

My software ran calculations over two centuries (2100–2300) to figure out the best path between four celebrities. I then transposed the 3D data into 2D space to make silkscreens with the dates of each stop.

Celebrity Asteroid Journey: Make Believe Land Mashup

This was my first silkscreened artwork, which was a messy antidote to the precise cutting of the machine tools at Autodesk.

All of these artworks depict the ephemeral nature of the physical body in one form or another. Machine Data Dreams is a clear memorial itself, a physical artifact of the cameras that once were cutting-edge technology.

With Celebrity Asteroid Journeys, the timescale is unreachable. None of us will ever visit these asteroids. And the named asteroids are memorials themselves to celebrities (stars) who are now dead or who, soon in the relative sense of the word, will no longer be with us.

Finally, Strewn Fields captures the potential for an apocalyptic event from above. Although these asteroids caused merely minor impacts, it is nevertheless a reality that an extinction-level event could wipe out the human species with a large rock from space. This ominous threat of death reminds us that our own species is just a blip in Earth’s history of life.

 

Asteroids and Celebrities

Asteroids! Planetary scientists have found and mapped about 700,000 of them and some estimate upwards of 150 million asteroids in our solar system. Most of them are in the asteroid belt, between Mars and Jupiter.

David Bowie has one named after him. Prince does not, though both have songs about being in space. Recently, Freddie Mercury was awarded one on what would have been his 70th birthday, which seems a fitting tribute to a star whose life was cut short by AIDS.

In this 1985 file photo, singer Freddie Mercury of the rock group Queen performs at a concert in Sydney, Australia. Queen guitarist Brian May says the International Astronomical Union’s Minor Planet Center has designated an asteroid discovered in 1991, the year of Mercury’s death, as “Asteroid 17473 Freddiemercury.” (AP Photo/Gill Allen, File)

Most asteroids have provisional designations. The full list of human-named asteroids is here. A few pets and fictional characters have even made it onto the list.

I saw this as an opportunity, as part of my SETI artist residency, to work with asteroid orbital data from JPL and estimated spaceship velocities* to create a new work called Celebrity Asteroid Journeys, which charts imaginary travels from one asteroid to another as silkscreen prints on wood panels.

Celebrity Asteroid Journey: Make Believe Land Mashup

I will be presenting the Celebrity Asteroid Journeys as part of my Machine Data Dreams solo show at Black and White Projects. The reception is on Saturday, November 5th, 7-9pm.

Representation is important, and the list of asteroids named after people is no exception. Even though the majority of the asteroids are named after Western men, I worked to balance the selection as much as possible.

Celebrity Asteroid Journey: Singers

And how are asteroids named? According to my research, they are first given a provisional name. Then, when the orbit is determined, it is assigned a sequential number. The discoverer of the asteroid can then request from the International Astronomical Union to give the asteroid a formal name.

*The spaceship speeds do not use true acceleration and deceleration (the math was beyond my skills), but I did work with the best numbers I could find: about 140,000 km/hour using a nuclear-electric engine.

Display at Your Own Risk by Owen Mundy

I get a lot of press for my artwork. These articles often gloss over the nuances, distilling the essence of a story.

Well-written academic articles about my artwork are what thrill me the most.

Such is the case with Owen Mundy’s article, Display at Your Own Risk, which looks at 3D printing, copyright and photogrammetry in art.


The work he is referring to, in our case, is Chess with Mustaches, which is detailed here.


What Mundy homes in on is that our original Duchamp chess set is not like ‘ripping’ music from physical media to a computer, but rather a “hand” tracing from a set of photographs to create a 3D model. It is essentially a translation rather than a crude copy.

These are the sorts of comparisons and nuances that garner my appreciation.


Waterjet Etching Tests

For the last several weeks, I have been conducting experiments with etching on the waterjet — a digital fabrication machine that emits a 55,000 psi stream of water, usually used for precision cutting. The site for this activity is Autodesk Pier 9 Creative Workshops. I continue to have access to their amazing fabrication machines, where I work part-time as one of their Shop Staff.

My recent artwork focuses on writing software code that transforms datasets into sculptures and installations, essentially physical data-visualizations. One of my new projects is called Strewn Fields, which is part of my work as an artist-in-residence with the SETI Institute. I am collaborating with the SETI research scientist, Peter Jenniskens, who is a leading expert on meteor showers and meteorite impacts. My artwork will be a series of data-visualizations of meteorite impacts at four different sites around the globe.

While the waterjet is normally used for cutting stiff materials like thick steel, it can also etch by running at lower water pressure rather than piercing the material. OMAX — the company that makes the waterjet that we use at Pier 9 — does provide a simple etching software package called Intelli-ETCH. The problem is that it will etch the entire surface of the material. This is appropriate for some artwork, such as my Bad Data series, where I wanted to simulate raster lines.

Meth Labs in Albuquerque (data source: http://www.metromapper.org)

The technique that I apply to my artistic practice is to write custom software that generates specific files for digital fabrication machines: laser-cutters, 3D printers, the waterjet and CNC machines. The look-and-feel is unique, unlike that of the conventional tools that artists often work with.

For meteorite impacts, I first map data like the pattern below (this is from a 2008 asteroid impact). For these impacts, it doesn’t make sense to etch the entire surface of my material, but rather, just pockets, simulating how a meteorite might hit the earth.


I could go the route of working with a CAM package and generating paths that work with the OMAX Waterjet. Fusion 360 even offers a pathway to this. However, I am dealing with four different datasets, each with 400-600 data points. It just doesn’t make sense to go from a 2D mapping, into a 3D package, generate 3D tool paths and then back to (essentially) a 2D profiling machine.

So, I worked on generating my own tool paths using openFrameworks, which outputs simple vector shapes based on the size of each data point. For the tool paths, I settled on spirals rather than left-to-right traverses, which spend too much time on the outside of the material and blow it out. The spirals produce very pleasing results.

My first tests were on some stainless steel scrap and you can see the results here, with the jagged areas where the water eats away at the material, which is the desired effect. I also found that you have to start the etching from the outside of the spiral and then wind towards the inside. If you start from the inside and go out, you get a nipple, like on the middle right of this test, where the water-jet has to essentially “warm-up”. I’m still getting the center divots, but am working to solve this problem.

This was a promising test, as the non-pocketed surface doesn’t get etched at all and the etching is relatively quick.


I showed this test to other people and received many raised eyebrows of curiosity. I became more diligent with my test samples and produced this etch sample of 8 spirals, with interior path spacing ranging from 2mm to 9mm, to test on a variety of materials.


I was excited about this material, an acrylic composite that I had leftover from a landscape project. It is 1/2″ thick with green on one side and a semi-translucent white on the other. However, as you can see, the water-jet is too powerful and ends up shattering the edges, which is less than desirable.


And then I began to survey various stone samples. I began with scavenging some material from Building Resources, which had an assortment of unnamed, cheap tiles and other samples.

Forgive me…I wish I hadn’t sat in the back row of “Rocks for Jocks” in college. Who knew that a couple of decades later, I would actually need some knowledge of geology to make artwork?

I began with some harder stone — standard countertop stuff like marble and granite. I liked seeing how the spiral breaks down along the way. But, there is clearly not enough contrast. It just doesn’t look that good.


I’m not sure what stone this is, but like the marble, it’s a harder stone and doesn’t have much of an aesthetic appeal. The honed look makes it still feel like a countertop.


I quickly learned that thinner tile samples would be hard to dial in. Working with 1/4″ material like this often results in blowing out the center.


But, I was getting somewhere. These patterns started resembling an impact of sorts and certainly express the immense kinetic energy of the waterjet machine, akin to the kinetic energy of a meteorite impact.


This engineered brick was one of my favorite results from this initial test. You can see the detail on the aggregate inside.


And I got some weird results. This material, whatever it is, is simply too delicate, kind of like a pumice.


This is a cement compound of some flavor and for a day, I even thought about pouring my own forms, but that’s too much work, even for me.

 


I think these two are travertine tile samples and I wish I had more information on them, but alas, that’s what you get when you are looking through the lot. These are in the not-too-hard and not-too-soft zone, just where I want them to be.

 


I followed up these tests by hitting up several stoneyards and tiling places along the Peninsula (south of San Francisco). This basalt-like material gave one of my favorite results, but is probably too porous for accuracy. Still, the fissures that it opens up in the pockets are amazing. Perhaps if I could tame the waterjet further, this would work.


This rockface/sandstone didn’t fare so well. The various layers shattered, producing unusable results.


Likewise, this flagstone was a total fail.


The non-honed quartzite gets very close to what I want, starting to look more like a data-etching. I just need to find one that isn’t so thick. This one will be too heavy to work with.


Although this color doesn’t do much for me, I do like the results of this limestone.

IMG_0298

Here is a paver that I got, but I can’t remember which kind it is. Better notes next time! Anyhow, it is clearly too weak for the water-jet.


This is a slate. Nice results!


And a few more, with mixed results.


And if you are a geologist and have some corrections or additions, feel free to contact me.

Art in Space: the First Art Exhibition in Space

Art in Space is the first art exhibition in space, created in conjunction with Autodesk’s Pier 9 Creative Workshops and Planet Labs, a company which deploys many fast-orbiting imaging satellites that document rapid changes on the Earth’s surface.

For this exhibition, they selected several Pier 9 artists to create artworks, which were then etched onto the satellites’ panels. Though certainly not the first artwork in space*, this is the first exhibition of art in space. And, if you consider that several satellites are constantly orbiting Earth on opposite sides of the planet, this would be the largest art exhibition ever.

My contribution is an artwork called Hello, World! It is the first algorithmically-generated artwork sent to space and also the first art data-visualization in space. The artwork was deployed on August 19th, 2015 on the satellite Dove 0C47. The artwork will circle the Earth for 18 months until its orbit decays and it burns up in our atmosphere.

 

The left side of the satellite panel depicts the population of each city, represented by squares proportional to the population size. The graphics on the right side represent the carbon footprint of each city, with circles proportional to carbon emissions. By comparing the two, one can make correlations between national policies and their effects on the atmosphere. For example, even though Tokyo is the most populated city on Earth, its per-capita carbon emissions are very low, making its carbon footprint much smaller than that of Houston, Shanghai or Riyadh, which have disproportionately large footprints.

The etched panel resembles a constellation of interconnected activity and inverts the viewpoint of the sky with that of the earth. It is from this “satellite eye” that we can see ourselves and the effect of humans on the planet. The poetic gesture of the artwork burning up as the satellite re-enters the Earth’s atmosphere serves as a reminder of the fragile nature of Earth.

Also consider this: the Art in Space exhibition is something you can neither see nor keep. After only 18 months, the satellite, as well as the artwork, vaporizes. I thought of this as an opportunity to work with ephemerality and sculpture. And this is the first time I have had the chance for a natural destruction of my work. Everything dies and we need to approach life with care.

A few people have asked me where my title came from. Anyone who has written any software code is familiar with the phrase “Hello, World!” This is the first test program that almost any tutorial has you write. It shows the basic syntax for constructing a working program, which is helpful since every programming language has different constructions. By making this test code work, you also verify that your development environment is working properly.

“Hello, World!” C implementation.

/* Hello, World! program */
#include <stdio.h>

int main(void) {
    printf("Hello, World!\n");
    return 0;
}

And here is a video that explains more about the Art in Space exhibition.

 

* There has been plenty of other art in space, including more recent projects such as my collaboration with Nathaniel Stern for Tweets in Space (2012) and Trevor Paglen’s The Last Pictures.

Joining SETI as an artist-in-residence

The SETI Institute just announced its new cohort of artists-in-residence for 2016 and I couldn’t be happier to be joining this amazing organization for a long-term (up to 2 years!) stint.

This includes a crew of other amazing artists: Dario Robleto (Conceptual Artist, Houston), Rachel Sussman (Photographer, Artist, Writer, New York), George Bolster (Filmmaker, Artist, New York), Jen Bervin (Visual Artist, Writer, Brooklyn), David Neumann (Choreographer, New York). The SETI Air program is spearheaded by Charles Lindsay (artist) and Denise Markonish (curator at MASS MoCA). I first met Charles at ISEA 2012 in Albuquerque, New Mexico when we were on the same panel around space-related artwork.

On January 13, 2016, at 7pm, in San Francisco’s Millennium Tower, SETI Institute President and CEO Bill Diamond will formally welcome the incoming artists and our new institutional partners, as well as patrons and friends of the program. This event is invitational and seating is limited.

SETI_Logo

So, what will I be working on?

Well, this follows on the heels of a number of artworks related to space, such as Tweets in Space (in collaboration with Nathaniel Stern), Uncertain Location, Black Hole Series and Moon v Earth, which were meditations on metaphors of space and potential.


Roughly speaking, I will be researching, mapping and creating installations of asteroid, meteor and meteorite data and working with scientists such as Peter Jenniskens, who is an expert on meteor showers. These will be physical data-visualizations — installations, sculptures, etc. — which follow my interests in digital fabrication and code with projects such as Water Works.

What specifically fascinates me about the relationship between outer space and the Earth is the metaphor of both danger and possibility from above. Interpretations range from the spiritual to the practical, from the extinction of the human race to the possibility of organic material from other planets being carried to our solar system. Despite appearances to the contrary, Earth is not only a fragile ecosystem but also one that could easily be transformed from outside.

And already I have begun mapping some meteor showers with my custom 3D software, working in collaboration with Dr. Jenniskens and a dataset of ~230,000 meteors recorded over Northern California in the last few years. This makes the data-space-geek in me very happy.

Stay subscribed for more.


And I will heed Carl Sagan’s words: “Imagination will often carry us to worlds that never were, but without it we go nowhere.”

EquityBot World Tour

Art projects are like little kids. You have grand aspirations but never know how they’re going to turn out. And no matter what, you love them.


It’s been a busy year for EquityBot. I didn’t expect at all last year that my stock-trading algorithm Twitterbot would resonate so well with curators, thinkers and general audiences. I’ve been very pleased with how well this “child” of mine has been doing.

This year, from August to December, it has been exhibited in 5 different venues in 4 countries. They include MemFest 2015 (Bilbao), ISEA 2015 (Vancouver), MoneyLab 2: Economies of Dissent (Amsterdam) and Bay Area Digitalists (San Francisco).

Of course, it helps the narrative that EquityBot is doing incredibly well, with a return rate (as of December 4th) of 19.5%. I don’t have the exact figures, but the S&P for this time period, according to my calculations, is in the neighborhood of -1.3%.


 

The challenge with this networked art piece is how to display it. I settled on making a short video, with the assistance of a close friend, Mark Woloschuk. This does a great job of explaining how the project works.

And, accompanying it is a visual display of vinyl stickers, printed on the vinyl sticker machine at the Creative Workshops at Autodesk Pier 9, where I once had a residency and now work (part-time).


Cistern Mapping Project…with Bikes

Do you like riding bikes and mapping urban space?

On October 11th, 2015, I will be leading the Cistern Mapping Project, which will be an urban treasure hunt, where we document and geolocate all of the 170 (or so) Cisterns of San Francisco.

The easiest way to let me know you want to participate is to sign up with this contact form. This will email me (Scott Kildall) and I can give you some more exact details.

The plan
We will meet at a specific location in the Mission District at 11am on Sunday, October 11th. I am hoping to gather about 20 riders, paired up in groups. Each will be provided with a map of approximate locations of several cisterns.

Each pair will search for the exact location of each brick circle, photo-document it and record the geolocation (latitude + longitude) of the cistern using an app on their iPhone or Android. Plan for 4-5 hours or so of riding, mapping, documenting and tweeting.

The background story
Underneath our feet, usually marked by brick circles, are cisterns. There are 170 or so of them spread throughout the city. They’re part of the AWSS (Auxiliary Water Supply System) of San Francisco, a water system that exists entirely for emergency use and is separate from both the potable drinking water supply and the sewer system.

cistern-mapping

In the 1850s, after a series of Great Fires in San Francisco tore through the city, 23 (or so) cisterns were built. These smaller cisterns were all in the city proper, at that time between Telegraph Hill and Rincon Hill. They weren’t connected to any other pipes; the fire department intended to use them as a backup water supply in case the water mains broke.

They languished for decades. Many people thought they should be removed, especially after incidents like the 1868 Cistern Gas Explosion.

However, after the 1906 Earthquake, fires once again decimated the city. Many water mains broke and the neglected cisterns helped save portions of the city. Afterward, the city passed a $5,200,000 bond and began building the AWSS in 1908. This included the construction of many new cisterns and the rehabilitation of other, neglected ones. Most of the new cisterns could hold 75,000 gallons of water. The largest one is underneath the Civic Center and has a capacity of 243,000 gallons.

cistern_main

Augmenting an Existing Map
Last year, as part of the Creative Code Fellowship between Stamen Design, Gray Area and Autodesk, I worked on a project called Waterworks, which mapped the San Francisco water infrastructure as a series of 3D prints and web maps.

As part of this project, I created an interactive web map of the San Francisco Cisterns (the only one of its kind), based on the intersections listed in the SFFD water supplies manual. However, this map is less-than-complete.

The problem is that the listed intersections are approximate and sometimes a block or so off. Also, there are very few photographs of the brick circles that mark the San Francisco cisterns. I think it would be an urban service to map these out for anyone to look at.

The goal will be to geolocate (lat + long) each cistern, photograph the bricks that (usually) mark them and produce a dataset that anyone can use.

cistern-web-map
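Since the point is a dataset anyone can use, GeoJSON is a natural target format. Here is a minimal sketch of how the day’s records might be assembled — the field names and coordinates below are hypothetical, not the actual schema of my cistern map:

```python
import json

def cisterns_to_geojson(records):
    """Convert (lat, lon, photo) cistern records into a GeoJSON FeatureCollection.

    Note: GeoJSON stores coordinates in [longitude, latitude] order.
    """
    features = []
    for lat, lon, photo in records:
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"photo": photo},
        })
    return {"type": "FeatureCollection", "features": features}

# Hypothetical records from one pair of riders in the Mission District
records = [
    (37.7599, -122.4148, "cistern_001.jpg"),
    (37.7648, -122.4194, "cistern_002.jpg"),
]
geojson = cisterns_to_geojson(records)
print(json.dumps(geojson)[:60])
```

A file like this drops straight into most web-mapping tools, which is the whole appeal of publishing the dataset in an open format.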

A Live Twitter Performance…on bikes

This will be a live Twitter event, where we update each cistern location live using Twitter and Google docs, adding photographs and building out the cistern map in real-time.

Bikes are the perfect mode of transport. Parking won’t be an issue and we can conveniently hit up many parts of the city.

Will we map all of these cisterns? This is up to you. Contact me here if you would like to join.

 

Press for Chess with Mustaches: the response to the Duchamp Estate

Press coverage is like an improv performance. It’s unpredictable and erratic; it sometimes works and sometimes falls on its face, usually for lack of press.

I’ve seen my work get butchered, my name get dragged in the mud. I’ve been called a “would-be performance artist”, an “amateur cartographer” and even Cory Doctorow recently called me a “hobbyist”.

But as long as my name is spelled right, I’m happy.

We recently went public with our response to the Duchamp Estate and the Chess with Mustaches artwork.

We soon received coverage from three notable press sources: Hyperallergic, 3DPrint.com and The Atlantic, and this was soon followed up by Boing Boing and later 3ders.com, plus a mention in Fox News (scroll down) and then Tech Dirt.

These are arts blogs, 3D printing blogs, tech rags — and, well, The Atlantic, a well-read political and cultural news source — so there’s a wide audience for this story.

The press has certainly reached the critical threshold for the work. The cat is out of the bag, after being inside for nearly a year…a frustrating process where we kept silent about the cease-and-desist letter from the Duchamp Estate.

This is perhaps the hardest part of any sort of potential legal conflict. You have to be quiet about it, otherwise it might imperil your legal position. The very act of saying anything might make the other party react in some sort of way.

But the outpouring of support has been amazing, both on a personal and a press level. Sure, some of the articles have overlooked certain aspects of the project.

And as always #dontreadthecomments. But overall, it has been such a relief to be able to be talk about the Duchamp Estate and the chess pieces, and to devise an appropriate artistic response.

 

cwm_fullset_adjusted

What Happened to the Readymake: Duchamp Chess Pieces?

Over the last several months, we (Scott Kildall and Bryan Cera) have been contacted by many people asking the same question: What happened to the Readymake: Duchamp Chess Pieces?

cwm_orig_set

The answer is that we ran into an unexpected copyright concern. The Marcel Duchamp Estate objected to the posting of our reconstructed 3D files on Thingiverse, claiming that our project was an infringement of French intellectual property law. Although the copyright claim never went to legal adjudication, we decided that it was in our best interests to remove the 3D-printable files from Thingiverse – both to avoid a legal conflict, and to respect the position of the estate.

For those of you who are unfamiliar with Readymake: Duchamp Chess Set by Scott Kildall and Bryan Cera, this was our original project description:

Readymake: Duchamp Chess Set is a 3D-printed chess set generated from an archival photograph of Marcel Duchamp’s own custom and hand-carved game. His original physical set no longer exists. We have resurrected the lost artifact by digitally recreating it, and then making the 3D files available for anyone to print.

We were inspired by Marcel Duchamp’s readymade — an ordinary manufactured object that the artist selected and modified for exhibition — the readymake brings the concept of the appropriated object to the realm of the internet, exploring the web’s potential to re-frame information and data, and their reciprocal relationships to matter and ideas. Readymakes transform photographs of objects lost in time into shared 3D digital spaces to provide new forms and meanings.

Just for the sake of clarity, what we call a “readymake” is a play on the phrase “readymade”. It is ready-to-make, since it can be physically generated by a 3D printer.

Our Readymake project was not to exist solely as the physical 3D prints that we made, but rather as the gesture of posting the 3D-printable files for anyone to download, as well as the initiation of a broader conversation around digital recreation in the context of artwork. We chose to reconstruct Duchamp’s chess set, specifically, for several reasons.

The chess set, originally created by Duchamp in 1917-18, was a material representation of his passion for the game. Our intention was not to create a derivative art work, but instead to re-contextualize an existing non-art object through a process of digital reconstruction as a separate art project.

What better subject matter to speak to this idea than a personal possession of the father of the Readymade, himself?  Given the artifact’s creation date, we believed it would be covered under U.S. Copyright Law. We’ll get back to that in a bit.

cwm_bw_duchamp_set

 cw_duchamp_pieces

On April 21st, 2014, we published this project on our website and also uploaded the 3D (STL) files onto Thingiverse, a public online repository of free 3D-printable models.  We saw our gesture of posting the files not only as an extension of our art project, but also as an opportunity to introduce the conceptual works of Duchamp, specifically his Readymades, to a wider audience.

cwm_makerbot_grouping

The project generated a lot of press. By encouraging discussion between art-oriented and technology-oriented audiences, it tapped into a vein of critical creative possibilities with 3D printing. And perhaps, with one of Marcel Duchamp’s personal belongings as the context, the very notions of object, ownership and authenticity were brought into question among these communities.

Unfortunately, the project also struck a nerve with the Duchamp Estate. On September 17th, 2014, we received a cease and desist letter from a lawyer representing the heirs of Marcel Duchamp. They were alleging intellectual property infringement on grounds that they held a copyright to the chess pieces under French law.

Gulp.

cwm_170914-letter-blackedout-p1

cwm_170914-letter-blackedout-p2

cwm_170914-letter-blackedout-p3

We assessed our options and talked to several lawyers. Yes, we talked to the Electronic Frontier Foundation…and others. We were publicly quiet about our options, as one needs to with legal matters such as this. The case was complex since jurisdiction was uncertain. Does French copyright law apply? Does that of the United States? We didn’t know, but had a number of conversations with legal experts.

Some of the facts, at least as we understand them

1)  Duchamp’s chess pieces were created in 1917-1918. According to US copyright law, works published before 1923 are in the realm of “expired copyright”.

2) The chess pieces themselves were created in 1917-1918 while Duchamp was in Argentina. He then brought the pieces back to France where he worked to market them.

3)  According to French copyright law, copyrighted works are protected for 70 years after the author’s death.

4)  Under French copyright law, you can be sued for damages and even serve jail time for copyright infringement.

5)  The only known copy of the chess set is in a private collection. We were originally led to believe the set was ‘lost’ – as it hasn’t been seen, publicly, for decades.

6) For the Estate to pursue us legally, the most common method would be to get a judgment in French court, then get a judgment in a United States court to enforce it.

7) Legal jurisdiction is uncertain. As United States citizens, we are protected by U.S. copyright law. But, since websites like Thingiverse are global, French copyright could apply.

Our decision to back off

Many people have told us to fight the Estate on this one. This, of course, is an obvious response. But our research indicated this would be a costly battle. We pursued pro-bono representation from a variety of sources, and while those we reached out to agreed it was an interesting case, each declined. We even considered starting a legal defense fund or crowdsourcing legal costs through an organization such as Kickstarter. However, deeper research showed us that people were far more interested in funding technology gadgets than legal battles.

Finally we ascertained, through various channels, that the Estate was quite serious. We wanted to avoid a serious legal conflict.

And so, without proper financial backing or pro-bono legal representation, we backed off — we pulled the files from Thingiverse. This was painful – it was incredible to see how excited people were to take part in our project, and when we deleted the Thingiverse entry and with it the comments and photo documentation shared by users, we did so with much regret. But we didn’t see any other option.

Initially, we really struggled to understand where the estate was coming from. Since part of the estate’s task is to preserve Duchamp’s legacy, we were surprised that our project was seen by them as anything other than a celebration, and in some ways a revitalization, of his ideas and artworks. Despite the strongly-worded legal letter, we heard that the heirs were quite reasonable.

The resolution was this: we contacted the estate directly. We explained our intention for the project: to honor the legacy of Duchamp, and notified them that we had pulled the STL files from online sources.

We were surprised by the amicable email response — written sans lawyers — directly from one of the heirs. Their reply highlighted an appreciation for our project, and an understanding of our artistic intent. It turns out that their concern was not that we were using the chess set design, but rather that the files – then publicly available — could be taken by others and exploited.

We understand the Estate’s point-of-view – their duty, after all, is to preserve Duchamp’s legacy. Outside of an art context, a manufacturer could easily take the files and mass produce the set. Despite the fact that we put the files under a Creative Commons license stipulating that the chess set couldn’t be used for commercial purposes, we understand the concern.

If we had chosen to stand our ground, we would have had various defenses at our disposal. One of them is that French law wouldn’t have applied since we are doing this from a U.S. server. But, the rules around this are uncertain.

If we had been sued, we would have defended on two propositions: (1) our project would be protected under U.S. law; (2) notwithstanding this, under U.S. law, we have a robust and widely-recognized defense under the nature of Fair Use.

We would make the argument that our original Duchamp Chess Pieces added value to these objects. We would consider invoking Fair Use in this case.

But, the failure of a legal system is that it is difficult to employ these defenses unless you have the teeth to fight. And teeth cost a lot of money.

Parody: Our resolution

We thought about how to recover the intent of this project without inviting another copyright infringement claim from the Duchamp Estate, and we realized that one approach would likely guarantee the work’s status as commentary: parody.

Accordingly, we have created Chess with Mustaches, which is based on our original design but adds a mustache to each piece. The pieces no longer look like Duchamp’s originals; instead, each one is adorned with a mustache.

chesswithmustaches_fullset

cwm_plastic_set

The decorative mustache references vandalized work, including Duchamp’s own adornment of the Mona Lisa.

cwm_mona_lisa

Coming out with this new piece is risky. We realize the Duchamp Estate could try to come back at us with a new cease-and-desist. However, we believe that this parody response and retitled artwork will be protected under U.S. Copyright Law (and perhaps under French law as well). We are willing to stand up for ourselves with the Chess with Mustaches.

Also for this reason, we decided not to upload the mustachioed-pieces to Thingiverse or any other downloadable websites. They were created as physical objects solely in the United States.

cwm_king

Final thoughts

3D printing opens up entire new possibilities of material production. With the availability of cheap production, the very issue of who owns intellectual property comes into play. We’ve seen this already with the endless reproductions on sites such as Thingiverse. Recently, Katy Perry’s lawyers demanded that a 3D Print of the Left Shark should be removed from Shapeways.

And in 2012, Golan Levin and Shawn Sims provided the Free Universal Construction Kit, a set of 3D-printable files for anyone to print connectors between Legos, Tinker Toys and many other construction kits for kids. Although they seem to have dodged legal battles, this was perhaps a narrow victory.

Our belief is that our project of reviving Duchamp’s chess set is strong as both a conceptual and an artistic gesture. It is unfortunate that we had to essentially delete this project from the Internet. What copyright law has done in this case is to squelch an otherwise compelling conversation about Duchamp’s notion of the readymade in the context of 3D printing.

Will our original Duchamp Chess pieces, the cease-and-desist letter from the Duchamp Estate and our response of the Chess with Mustaches be another waypoint in this conversation?

We hope so.

And what would Marcel Duchamp have thought of our project? We can only guess.

   cwm_knight

Scott Kildall’s website is: www.kildall.com
Twitter: @kildall

Bryan Cera’s website is: http://bryancera.com
Twitter: @BryanJCera

BOOM! WaterWorks

My Water Works project recently got coverage in BOOM: A Journal of California and I couldn’t be more pleased.

Screen Shot 2015-08-25 at 9.06.28 AM

A few months ago, I was contacted by the editorial staff to write about the 3D printed maps and data-visualization for Water Works.

What most impressed me is the context for this publication, which is a conversation about California, in their own words: “to create a lively conversation about the vital social, cultural, and political issues of our times, in California and the world beyond.”

So, while my Water Works project is an artwork, it is having the desired effect of a dialogue outside of the usual art world.

EquityBot got clobbered

Just after the Dow Jones dropped 1000 points on Aug 24th (yesterday), I checked out how EquityBot was doing. An annual rate of return worse than -50%.

Screen Shot 2015-08-24 at 11.20.25 PM

Crazy! Of course, this is like taking the tangent of any curve and making a projection. A day later, EquityBot is at -32%.

Screen Shot 2015-08-25 at 8.57.06 AM

Still not good, but if you were to invest yesterday, you could be much richer today.

I’m not that much of a gambler, so I’m glad that EquityBot is just a simulated (for now) bank account.

EquityBot Goes to ISEA

EquityBot will be presented at this year’s International Symposium on Electronic Art at Vancouver. The theme is Disruption. You can always follow EquityBot here: @equitybot.

EquityBot is an automated stock-trading algorithm that uses emotions on Twitter as the basis for investments in a simulated bank account.

This art project poses the question: can an artist create a stock-trading algorithm that will outperform professional managed accounts?

The original EquityBot, what I will call version 1, launched on October 28th via the Impakt organization, which supported the project last fall during an artist residency.

I intended for it to run for 6 months and then to assess its performance results. I ended up letting it run a little bit longer (more on this later).

I revamped EquityBot about a month ago. The new version is doing *great*, with an annual rate of return of 10.86%. Most of this is due to some early investments in Google, whose stock price has been doing fantastically.

equitybot-isea-8emotions-1086percent

How does EquityBot work? During stock market hours, EquityBot scrapes Twitter to determine the frequency of eight basic human emotions: anger, fear, joy, disgust, anticipation, trust, surprise and sadness.

equitybot-8emotions

The software code captures fluctuations in the number of tweets containing these emotions. It then correlates them to changes in stock prices. When an emotion is trending upwards, EquityBot will select a stock that follows a similar trajectory. It deems this to be a “correlated investment” and will buy this stock.
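I haven’t posted the EquityBot source, but the core correlation step can be sketched roughly like this — the tickers and numbers below are invented for illustration; the real code runs on live Twitter and stock data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def most_correlated_stock(emotion_counts, stock_changes):
    """Return the ticker whose price changes best track the emotion's tweet counts."""
    return max(stock_changes, key=lambda t: pearson(emotion_counts, stock_changes[t]))

# Hypothetical data: hourly tweet counts for "joy" and hourly % price changes
joy = [120, 135, 150, 160, 155, 170]
stocks = {
    "GOOG": [0.1, 0.3, 0.5, 0.6, 0.5, 0.7],    # rises along with joy
    "XYZ":  [0.5, 0.2, -0.1, -0.3, 0.0, -0.4], # moves against it
}
print(most_correlated_stock(joy, stocks))  # prints "GOOG"
```

In this toy example, “joy” is trending upwards and GOOG tracks it most closely, so GOOG would be the correlated investment.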

equitybot_correlation_graph

The ISEA version of EquityBot will run for another 6 months or so. In version 1, I tracked 24 different emotions, all based on the Plutchik wheel.

Plutchik-wheel.svg_1

 

The problem I found was that this was too many emotions to track. Statistically speaking, there were too few tweets for many of the emotions for the correlation code to function properly.

The only change with the ISEA version (what I will call v1.1) is that it now tracks eight emotions instead of 24.

popular-emotions

How did v1 of EquityBot perform? It came out of the gates super-strong, hitting a high point of 20.21%. Wowza. These are also some earlier data-visualizations, which have since improved slightly.
equitybot-nov26-2021percent

But 1 month later, by December 15th, EquityBot dipped down to -4.58%. Yikes. These are the vicissitudes of the market and a short time-span.

equitybot-dec15-minus-458percent

 

By January 21st 2015, EquityBot was almost back to even at -0.96%.

 

equitybot-jan21-minus096percent

Then by February 4th, 2015, EquityBot was back at a respectable 5.85%.

equitybot-feb4-585percent

And on March 1st, doing quite well at 7.36%

equitybot-march1-736percent

I let the experiment run until June 11th. The date was arbitrary, but -9.15% was the end result. This was pretty terrible.

equitybot-jun11-minus915percent

And which emotions performed the “best” — the labels aren’t on this graph, but the ones that were doing well were Trust and Terror. And the worst…was Rage (extreme Anger).

equitybot-investing-results-jun11

 

How do other managed accounts perform? According to the various websites, these are the numbers I’ve found.

Janus (Growth & Income): 7.35%
Fidelity (VIP Growth & Income): 4.70%
Franklin (Large Cap Equity): 0.46%
American Funds (The Income Fund of America): -1.23%
Vanguard (Growth and Income): 4.03%

This would put EquityBot v1.0 dead last. Good thing this was a simulated bank account.

I’m hoping that v1.1 will do better. Eight emotions. Let’s see how it goes.

 

Machine Data Dreams: Barbie Video Girl Cam

One of the cameras they have here at the Signal Culture Residency is the Barbie Video Girl cam. This was a camera embedded inside a Barbie doll, produced in 2010.

The device was discontinued, most notably after the FBI accidentally leaked a warning about possible predatory misuses of the camera, a concern that is patently ridiculous.

The interface is awkward. The camera can’t be remotely activated. It’s troublesome to get the files off the device. The resolution is poor, but the quality is mesmerizing.

 

barbie_disassembly_1

The real perversion is the way you have to change the batteries for the camera, by pulling down Barbie’s pants and then opening up her leg with a screwdriver.

b-diss

I can only imagine kids wondering if the idealized female form is some sort of robot.

barbie_disassembly_3

The footage it takes is great. I brought it first to the local antique store, where I shot some of the many dolls for sale.

 

 

And, of course, I had to hit up the machines at Signal Culture to do a live analog remix using the Wobbulator and Jones Colorizer.

In the evening, as dusk approached, I took Barbie to the Evergreen Cemetery in Owego, which has gravestones dating from the 1850s and is still an active burial ground.

Here, Barbie contemplated her own mortality.

barbie_cemetery barbie_close_cross barbie_good barbie_gravestone_1 barbie_headstore barbie_warren

barbie_cemetery_mother

It was disconcerting for a grown man to be holding a Barbie doll with an outstretched arm to capture this footage, but I was pretty happy with the results.

I made this short edit.

And remixed with the Wobbulator. I decided to make a melodic harmony (life), with digital noise (death) in a move to mirror the cemetery — a site of transition between the living and the dead.

How does this look in my Machine Data Dreams software?

You can see the waveform here — the 2nd channel is run through the Critter & Guitari Video Scope.

Screen Shot 2015-08-04 at 10.43.09 AM

And the 3D model looks promising, though once again, I will work on these post-residency.

Screen Shot 2015-08-04 at 10.45.04 AM

Machine Data Dreams: Critter & Guitari Video Scope

Not to be confused with Deleuze and Guattari, this is a company that makes various hardware music synths.

For my new project, Machine Data Dreams, I’m looking at how machines might “think”, starting with the amazing analog video machines at Signal Culture.

signal_culture-fullsetup

This morning, I successfully stabilized my Arduino data logger. This captures the raw video signal from any device with RCA outputs and stores values at a sampling rate of ~3600 Hz.

It obviously misses a lot of the samples, but that’s the point: a machine-to-machine listener, bypassing any standard digitizing software.
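The host side of a logger like this is simple enough to sketch. The snippet below is a simplified stand-in, not my actual protocol: it assumes one raw 8-bit ADC byte per sample, arriving at the nominal ~3600 Hz rate.

```python
SAMPLE_RATE_HZ = 3600  # nominal logger rate; the real rate drifts a bit

def decode_samples(raw, vref=5.0):
    """Decode raw 8-bit ADC bytes into (time_seconds, volts) pairs.

    Each byte is one sample, with 0..255 mapped linearly onto 0..vref volts.
    """
    dt = 1.0 / SAMPLE_RATE_HZ
    return [(i * dt, (b / 255.0) * vref) for i, b in enumerate(raw)]

# A fake burst: a few sync-level lows followed by mid-level video
burst = bytes([10, 10, 10, 128, 140, 150, 128])
samples = decode_samples(burst)
print(samples[0])  # first sample, at t = 0
```

The real version would read this stream from the Arduino’s serial port; the decoding step stays the same either way.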

data_logger

For my first data-logging experiment, I decided to focus on this device, the Critter & Guitari Video Scope, which takes audio and converts it to a video waveform.

critterguitari_3 critterguitari_2 critterguitari_1

Using the synths, I patched and modulated various waveforms. I’ve never worked with this kind of system until a few days ago, so I’m new to the concept of control voltages.

audio_sythn

This is the 15-minute composition that I made for the data-logger.

Critter & Guitari Videoscope Composition (below)

And the captured output, in my custom OpenFrameworks software.

 

Screen Shot 2015-08-02 at 10.56.15 PM

The 3D model is very preliminary at this point, but I am getting some solid waveform output into a 3D shape. I’ll be developing this in the next few months. But since I only have a week at Signal Culture, I’ll tackle the 3D-shape generation later.

Screen Shot 2015-08-02 at 11.02.25 PM

My data logger can handle 2 channels of video, so I’m experimenting with outputting the video signal as sound and then running it back through the C&G Videoscope.

This is the Amiga Harmonizer output, which looks great by itself. The audio, however, rendered as a video signal, comes out sounding like noise, as expected.

But the waveforms are compelling. There is a solid band at the bottom, which is the horizontal sync pulse. This is the signature of any composite (NTSC) device.

2000px-Composite_Video.svg

 

So, every device I log should have this signal at the bottom, which you can see below.
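That sync signature can even be checked programmatically: the horizontal sync tips sit at the lowest signal level, so a composite capture should show a persistent band of low values. A rough sketch (the threshold here is a guess, not a calibrated value):

```python
def sync_fraction(samples, threshold=0.1):
    """Fraction of samples sitting in the lowest band of the waveform.

    For NTSC composite video, sync tips are the lowest part of the signal,
    so a composite capture should show a consistent chunk of samples down
    near the minimum. Non-video signals generally won't.
    """
    lo = min(samples)
    hi = max(samples)
    span = (hi - lo) or 1.0  # avoid dividing by zero on a flat signal
    low_count = sum(1 for s in samples if (s - lo) / span <= threshold)
    return low_count / len(samples)

# Fake capture: periodic low "sync" dips among mid-level "video"
fake = ([0.05] * 8 + [0.5] * 92) * 10  # 8% sync-level, 92% video
print(sync_fraction(fake))
```

Run against a real capture, a value well above zero that stays stable from frame to frame would be the telltale sync band.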

Screen Shot 2015-08-02 at 10.58.12 PM

Once again, the 3D forms I’ve generated in OpenFrameworks and then opened up in Meshlab are just to show that I’m capturing some sort of raw waveform data.

Screen Shot 2015-08-02 at 11.00.14 PM

Atari Adventure Synth

Hands down my favorite Atari game when I was a kid was Adventure (2). The dragons looked like giant ducks. Your avatar was just a square and a bat wreaks chaos by stealing your objects.

In the ongoing research for my new Machine Data Dreams project, beginning here at Signal Culture, I’ve been playing with the analog video and audio synths.

Yesterday afternoon, I explored the town of Owego. I ran across a used DVD, CD & electronics store and bought an Atari Flashback Console for $25. I didn’t even know these existed.

blog_pic

I can plug it directly into their video synth system. After futzing around with the various patch cables, I came up with this 5-minute composition, which shows me playing the game. The audio sounds like marching with dirty noise levels.

Also, here is the latest 3D model from my code, which now has a true 3D axis for data-plotting.

Time is one axis, video signal is another, audio signal is the third.
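As a rough illustration of that mapping — this is a stand-in sketch, not my actual OpenFrameworks code — each paired sample becomes one 3D vertex, which could be dumped as an OBJ point cloud that tools like Meshlab can open:

```python
def to_obj_points(video, audio, dt=1.0 / 3600):
    """Map paired (video, audio) samples into OBJ vertex lines.

    x = time, y = video level, z = audio level -- one vertex per sample.
    """
    lines = []
    for i, (v, a) in enumerate(zip(video, audio)):
        lines.append(f"v {i * dt:.6f} {v:.4f} {a:.4f}")
    return "\n".join(lines)

# Tiny hypothetical capture: three paired video/audio samples
video = [0.2, 0.5, 0.8]
audio = [0.1, 0.4, 0.3]
print(to_obj_points(video, audio))
```

The actual project builds real meshes rather than bare points, but the axis assignment is the core idea.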

Screen Shot 2015-07-31 at 9.26.05 PM

And a crude frequency plot.

Screen Shot 2015-08-01 at 3.03.24 PM

 

Van Gogh Wobbulator

In the first full day of the residency at Signal Culture, I played around with the video and audio synthesizers. It’s a new world for me.

While my focus is on the Machine Data Dreams project, I also want to play with what they have and get familiar with the amazing analog equipment.

I started with this 2 minute video, which I shot earlier this summer at Musee d’Orsay. I had to document the odd spectacle: visitor after visitor would take photos of this famous Van Gogh self-portrait…despite the fact you can get a higher-quality version online.

I ran this through a few patches and into the Wobbulator, which affects the electronic signal on the CRT itself.

20150730_200415

 

 

20150730_152742

Ewa Justka, who is the toolmaker-in-residence here, and who is building her own audio synthesizer, spruced up the accompanying audio. I captured a 20-minute sample.

ewa-blog-trash

What I love about the result is that the repetitive 2-minute video takes on its own life, as the two of us tweaked knobs, made live patches and laughed a lot.