
Machine Data Dreams @ Black & White Projects

This week, I opened a solo show called Machine Data Dreams at Black & White Projects. This was the culmination of several months of work, where I created three new series of works reflecting themes of data-mapping, machines and mortality.

The opening reception is Saturday, November 5th from 7-9pm. Full info on the event is here.

Two of the artworks come from my artist residency with SETI, and the third was supported by a San Francisco Arts Commission grant.

All of the artwork uses custom algorithms to translate datasets into physical form, an ongoing exploration that I’ve been focusing on over the last few years.

Each series deserves more detail, but I’ll stick with a short summary of each.

Fresh from the waterjet, Strewn Fields visualizes meteorite impact data at four different locations on Earth.

Strewn Fields: Almahata Sitta

As an artist-in-residence with SETI, I worked with planetary scientist Peter Jenniskens to produce these four sculptural etchings in stone.

When an asteroid enters the Earth’s atmosphere, it does so at high velocity — approximately 30,000 km/hour. Before impact, it breaks into thousands of small fragments — meteorites that spread over areas as large as 30 km. Usually the space debris falls into the ocean or lands in remote locations where scientists can’t collect the fragments.

Only recently have scientists been able to use GPS technology to geolocate hundreds of meteorites, which they also weigh as they gather them. The spread patterns of the data are called “strewn fields”.

Dr. Jenniskens is not only one of the world’s experts on meteorites, but also led the famous 2008 TC3 fragment recovery in Sudan, the Almahata Sitta impact.

With four datasets that he both provided and helped me decipher, I used the high-pressure waterjet machine at Autodesk’s Pier 9 Creative Workshops, where I work as an affiliate artist and also on their shop staff, to create four different sculptures.

Strewn Fields: Sutter’s Mill

The violence of the waterjet machine gouges the surface of each stone, mirroring the raw kinetic energy of a planetoid colliding with the surface of the Earth. My static etchings capture the act of impact, and survive as an antithetical gesture to the event itself. The actual remnants and debris — the meteorites themselves — have been collected, sold and scattered and what remains is just a dataset, which I have translated into a physical form.

A related work, Machine Data Dreams, is a series of data-etchings: memorials to the camcorder, the consumer device that birthed video art by making video production accessible to artists.

MACHINE DATA DREAMS: PIXELVISION

This project was supported by a San Francisco Individual Arts Commission grant. I did the data collection itself during an intense week-long residency at Signal Culture, which has many iconic, working camcorders from 1969 to the present.

SONY VIDEORECORDER (1969)
PIXELVISION CAMERA (1987)

During the residency, I built a custom Arduino data-logger which captured the raw electronic video signals, bypassing any computer or digital-signal processing software. With custom software that I wrote, I transformed these into signals that I could then etch onto 2D surfaces. I paired each etching with its source video in the show itself.
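
To give a sense of that pipeline, here is a minimal sketch, not my original code, of the kind of transformation involved: taking raw logged samples and turning them into a 2D vector path that a fabrication workflow could etch. The file name, scaling and sample values are all stand-ins.

// Minimal sketch (not the original code): map raw logged signal samples
// to a 2D polyline and write it out as an SVG path that a CNC/waterjet
// workflow could import. File name and scaling are hypothetical.
#include <cstdio>
#include <vector>

int main() {
    // Pretend these are raw 10-bit ADC samples from the Arduino logger.
    std::vector<int> samples = {512, 600, 340, 720, 480, 510, 300, 650};

    const float widthMM  = 200.0f;   // etch area width, assumed
    const float heightMM = 50.0f;    // etch area height, assumed

    FILE* f = std::fopen("etch_path.svg", "w");
    std::fprintf(f, "<svg xmlns='http://www.w3.org/2000/svg' width='%.0fmm' height='%.0fmm'>\n",
                 widthMM, heightMM);
    std::fprintf(f, "<polyline fill='none' stroke='black' points='");
    for (size_t i = 0; i < samples.size(); ++i) {
        float x = widthMM * i / (samples.size() - 1);          // time axis
        float y = heightMM * (1.0f - samples[i] / 1023.0f);    // signal amplitude
        std::fprintf(f, "%.2f,%.2f ", x, y);
    }
    std::fprintf(f, "'/>\n</svg>\n");
    std::fclose(f);
    return 0;
}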

MACHINE DATA DREAMS: SONY VIDEORECORDER

Celebrity Asteroid Journeys is the last of the three artworks and is also a project from the SETI Artist in Residency program, though it is decidedly more light-hearted than Strewn Fields.

Celebrity Asteroid Journeys charts imaginary travels from one asteroid to another. There are about 700,000 known asteroids with charted orbits. A small number of these have been named after celebrities.

Working with asteroid orbital data from JPL and estimated spaceship velocities, I charted 5 journeys between different sets of asteroids.

My software ran calculations over two centuries (2100–2300) to figure out the best path between four celebrity asteroids. I then transposed the 3D data into 2D space to make silkscreens with the dates of each stop.
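
For the curious, here is a rough sketch of the kind of search involved, heavily simplified and not my actual code: brute-forcing departure dates over the 2100–2300 window and picking the shortest hop, assuming straight-line travel at a constant spaceship velocity. The ephemeris function is a stand-in for the real JPL orbital data.

// Hypothetical sketch of the search described above: scan departure dates
// from 2100 to 2300 and pick the shortest straight-line hop between two
// asteroid positions, assuming a constant spaceship velocity. The ephemeris
// below is a circular-orbit stand-in, not real JPL data, and the distance is
// measured at departure time (a deliberate simplification).
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979323846;

struct Vec3 { double x, y, z; };

// Stand-in ephemeris: a circular orbit in the ecliptic plane (AU).
Vec3 asteroidPosition(double semiMajorAU, double periodYears, double tYears) {
    double theta = 2.0 * PI * tYears / periodYears;
    return {semiMajorAU * std::cos(theta), semiMajorAU * std::sin(theta), 0.0};
}

int main() {
    const double shipSpeedAUPerYear = 1.0;   // assumed spaceship velocity
    double bestDeparture = 0, bestTravel = 1e9;

    // Scan departure dates from 2100 to 2300 in 30-day steps.
    for (double t = 2100.0; t <= 2300.0; t += 30.0 / 365.25) {
        Vec3 a = asteroidPosition(2.2, 3.3, t);   // hypothetical asteroid "A"
        Vec3 b = asteroidPosition(2.8, 4.7, t);   // hypothetical asteroid "B"
        double d = std::sqrt((a.x-b.x)*(a.x-b.x) + (a.y-b.y)*(a.y-b.y) + (a.z-b.z)*(a.z-b.z));
        double travelYears = d / shipSpeedAUPerYear;
        if (travelYears < bestTravel) { bestTravel = travelYears; bestDeparture = t; }
    }
    std::printf("Best departure: %.2f, travel time: %.2f years\n", bestDeparture, bestTravel);
    return 0;
}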

CELEBRITY ASTEROID JOURNEY: MAKE BELIEVE LAND MASHUP

This was my first silkscreened artwork, which was a messy antidote to the precise cutting of the machine tools at Autodesk.

All of these artworks depict the ephemeral nature of the physical body in one form or another. Machine Data Dreams is a clear memorial itself, a physical artifact of the cameras that once were cutting-edge technology.

With Celebrity Asteroid Journeys, the timescale is unreachable. None of us will ever visit these asteroids. And the named asteroids are themselves memorials to celebrities (stars) who are now dead or who, in the relative sense of the word, will soon no longer be with us.

Finally, Strewn Fields captures the potential for an apocalyptic event from above. Although these particular impacts were minor, it is nevertheless a reality that an extinction-level event could wipe out the human species with a large rock from space. This ominous threat of death reminds us that our own species is just a blip in Earth’s history of life.

 

Waterjet Etching Tests

For the last several weeks, I have been conducting experiments with etching on the waterjet — a digital fabrication machine that emits a 55,000 psi stream of water, usually used for precision cutting. The site for this activity is Autodesk Pier 9 Creative Workshops. I continue to have access to their amazing fabrication machines, where I work part-time as one of their Shop Staff.

My recent artwork focuses on writing software code that transforms datasets into sculptures and installations, essentially physical data-visualizations. One of my new projects is called Strewn Fields, which is part of my work as an artist-in-residence with the SETI Institute. I am collaborating with the SETI research scientist, Peter Jenniskens, who is a leading expert on meteor showers and meteorite impacts. My artwork will be a series of data-visualizations of meteorite impacts at four different sites around the globe.

While the waterjet is normally used for cutting stiff materials like thick steel, it can also etch at lower water pressure, abrading the surface rather than piercing the material. OMAX — the company that makes the waterjet we use at Pier 9 — does provide a simple etching software package called Intelli-ETCH. The problem is that it will etch the entire surface of the material. This is appropriate for some artwork, such as my Bad Data series, where I wanted to simulate raster lines.

Meth Labs in Albuquerque (Data source: http://www.metromapper.org)

The technique that I apply to my artistic practice is to write custom software that generates specific files for digital fabrication machines: laser-cutters, 3D printers, the waterjet and CNC machines. The look-and-feel is unique, unlike the results of the conventional tools that artists often work with.

For meteorite impacts, I first map data like the pattern below (this is from a 2008 asteroid impact). For these impacts, it doesn’t make sense to etch the entire surface of my material, but rather, just pockets, simulating how a meteorite might hit the earth.

strewn_field_15scaled_no_notation

I could go the route of working with a CAM package and generating paths that work with the OMAX Waterjet. Fusion 360 even offers a pathway to this. However, I am dealing with four different datasets, each with 400-600 data points. It just doesn’t make sense to go from a 2D mapping, into a 3D package, generate 3D tool paths and then back to (essentially) a 2D profiling machine.

So, I worked on generating my own tool paths using OpenFrameworks, which outputs simple vector shapes scaled by the data values. For the tool paths, I settled on spirals rather than left-to-right traverses, which spend too much time at the edges of the material and blow it out. The spirals produce very pleasing results.
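
As an illustration of the approach, here is a minimal sketch, in plain C++ rather than my actual OpenFrameworks code, of generating an outside-in Archimedean spiral toolpath for a single pocket, with the outer radius scaled by a data value. The pitch, mass value and scale factors are placeholder assumptions.

// A minimal sketch (assumed, not the original OpenFrameworks code) of
// generating an outside-in Archimedean spiral toolpath for one data point,
// where the outer radius is scaled by a hypothetical meteorite mass.
#include <cmath>
#include <cstdio>
#include <vector>

struct Pt { float x, y; };

std::vector<Pt> spiralOutsideIn(float cx, float cy, float outerR, float pitchMM) {
    const float PI = 3.14159265f;
    std::vector<Pt> path;
    float turns = outerR / pitchMM;                   // revolutions that fit in the pocket
    for (float a = turns * 2*PI; a > 0; a -= 0.05f) { // start at the outside, wind inward
        float r = pitchMM * a / (2*PI);
        path.push_back({cx + r * std::cos(a), cy + r * std::sin(a)});
    }
    path.push_back({cx, cy});                         // finish at the center
    return path;
}

int main() {
    // One pocket: radius grows with the square root of a mass value (assumed scaling).
    float massGrams = 250.0f;
    float radius = 2.0f * std::sqrt(massGrams);       // mm, arbitrary scale factor
    std::vector<Pt> path = spiralOutsideIn(100.0f, 100.0f, radius, 1.5f);

    std::printf("Generated %zu toolpath points, outer radius %.1f mm\n",
                path.size(), radius);
    return 0;
}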

My first tests were on some stainless steel scrap and you can see the results here, with the jagged areas where the water eats away at the material, which is the desired effect. I also found that you have to start the etching from the outside of the spiral and then wind towards the inside. If you start from the inside and go out, you get a nipple, like on the middle right of this test, where the water-jet has to essentially “warm-up”. I’m still getting the center divots, but am working to solve this problem.

This was a promising test, as the non-pocketed surface doesn’t get etched at all and the etching is relatively quick.

IMG_0286

I showed this test to other people and received many raised eyebrows of curiosity. I became more diligent with my test samples and produced this etch sample of 8 spirals, with interior paths ranging from 2 mm to 9 mm, to test on a variety of materials.

sprial_paths.png

I was excited about this material, an acrylic composite that I had leftover from a landscape project. It is 1/2″ thick with green on one side and a semi-translucent white on the other. However, as you can see, the water-jet is too powerful and ends up shattering the edges, which is less than desirable.

IMG_0303

I then began to survey various stone samples, starting by scavenging some material from Building Resources, which had an assortment of unnamed, cheap tiles and other samples.

Forgive me…I wish I hadn’t sat in the back row of “Rocks for Jocks” in college. Who knew that a couple decades later, I would actually need some knowledge of geology to make artwork?

I began with some harder stone — standard countertop stuff like marble and granite. I liked seeing how the spiral breaks down along the way. But, there is clearly not enough contrast. It just doesn’t look that good.

IMG_0280

IMG_0294

I’m not sure what stone this is, but like the marble, it’s a harder stone and doesn’t have much of an aesthetic appeal. The honed look makes it still feel like a countertop.

IMG_0295

I quickly learned that thinner tile samples would be hard to dial in. Working with 1/4″ material like this often results in blowing out the center.

IMG_0282

But, I was getting somewhere. These patterns started resembling an impact of sorts and certainly express the immense kinetic energy of the waterjet machine, akin to the kinetic energy of a meteorite impact.

white_tile_detail

This engineered brick was one of my favorite results from this initial test. You can see the detail on the aggregate inside.

IMG_0290brick_all

And I got some weird results. This material, whatever it is, is simply too delicate, kind of like a pumice.

IMG_0289

This is a cement compound of some flavor and for a day, I even thought about pouring my own forms, but that’s too much work, even for me.

 

IMG_0291

I think these two are travertine tile samples and I wish I had more information on them, but alas, that’s what you get when you are looking through the lot. These are in the not-too-hard and not-too-soft zone, just where I want them to be.

 

IMG_0274

IMG_0292

I followed up these tests by hitting up several stoneyards and tiling places along the Peninsula (south of San Francisco). This basalt-like material is one of my favorite results, but is probably too porous for accuracy. Still, the fissures that it opens up in the pockets are amazing. Perhaps if I could tame the waterjet further, this would work.

IMG_0275basalt-detail

basalt-more-detail

This rockface/sandstone didn’t fare so well. The various layers shattered, producing unusable results.

IMG_0299discolored_slate

Likewise, this flagstone was a total fail.

IMG_0302flagstone-shatter

The non-honed quartzite gets very close to what I want, starting to look more like a data-etching. I just need to find one that isn’t so thick. This one will be too heavy to work with.

IMG_0284  quartzite_close_IMG_0340

Although this color doesn’t do much for me, I do like the results of this limestone.

IMG_0298

Here is a paver that I got, but I can’t remember which kind it is. Better notes next time! Anyhow, it clearly is too weak for the waterjet.

IMG_0297

This is a slate. Nice results!

IMG_0296

And a few more, with mixed results.

IMG_0300 IMG_0301

And if you are a geologist and have some corrections or additions, feel free to contact me.

Art in Space: the First Art Exhibition in Space

Art in Space is the first art exhibition in space, which was created in conjunction with Autodesk’s Pier 9 Creative Workshops and Planet Labs, a company which dispatches many fast-orbiting imaging satellites that document rapid changes on the Earth’s surface.

For this exhibition, they selected several Pier 9 artists to create artworks, which were then etched onto the satellites’ panels. Though certainly not the first artwork in space*, this is the first exhibition of art in space. And, if you consider that several satellites are constantly orbiting Earth on opposite sides of the planet, this would be the largest art exhibition ever.

My contribution is an artwork called Hello, World! It is the first algorithmically-generated artwork sent to space and also the first art data visualization in space. The artwork was deployed on August 19th, 2015 on the satellite Dove 0C47. It will circle the Earth for 18 months until the satellite’s orbit decays and it burns up in our atmosphere.

 

The left side of the satellite panel depicts the population of each city, represented by squares proportional to the population size. The graphics on the right side represent the carbon footprint of each city, with circles proportional to carbon emissions. By comparing the two, one can make correlations between national policies and their effects on the atmosphere. For example, even though Tokyo is the most populated city on Earth, its carbon emissions per capita are very low, making its carbon footprint much smaller than those of Houston, Shanghai or Riyadh, which have disproportionately large footprints.
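
A small sketch of how such glyphs could be scaled, assuming area-proportional sizing so that a square’s side grows with the square root of population and a circle’s radius with the square root of emissions. The numbers and scale factors below are illustrative, not the actual dataset or panel code.

// Hypothetical sketch: size city glyphs so that *area* tracks the data value,
// i.e. square side ~ sqrt(population), circle radius ~ sqrt(CO2 emissions).
#include <cmath>
#include <cstdio>

int main() {
    struct City { const char* name; double populationM; double co2Mt; };
    City cities[] = {
        {"Tokyo",   38.0,  70.0},   // example values only
        {"Houston",  6.5, 100.0},
    };

    const double sqScale = 1.0, cScale = 0.5;   // assumed mm-per-sqrt-unit factors
    for (const City& c : cities) {
        double side   = sqScale * std::sqrt(c.populationM);  // square for population
        double radius = cScale  * std::sqrt(c.co2Mt);        // circle for emissions
        std::printf("%-8s square side %.1f mm, circle radius %.1f mm\n",
                    c.name, side, radius);
    }
    return 0;
}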

The etched panel resembles a constellation of interconnected activity and inverts the viewpoint of the sky with that of the earth. It is from this “satellite eye” that we can see ourselves and the effect of humans on the planet. The poetic gesture of the artwork burning up as the satellite re-enters the Earth’s atmosphere serves as a reminder of the fragile nature of Earth.

Also consider this: the Art in Space exhibition is something you can neither see nor keep. After only 18 months, the satellite, along with the artwork, vaporizes. I thought of this as an opportunity to work with ephemerality and sculpture. And this is the first time I have had the chance for a natural destruction of my work. Everything dies, and we need to approach life with care.

A few people have asked me where my title came from. Anyone who has written any software code is familiar with the phrase: "Hello, World!" This is the first test program that nearly every tutorial has you write. It shows the basic syntax for constructing a working program, which is helpful since every programming language has its own constructions. By making this test code work, you have also verified that your development environment is working properly.

“Hello, World!” C implementation.

/* Hello World program */
#include <stdio.h>

int main(void) {
    printf("Hello, World!\n");
    return 0;
}

And here is a video that explains more about the Art in Space exhibition.

 

* There has been plenty of other art in space, including recent projects such as my collaboration with Nathaniel Stern, Tweets in Space (2012), and Trevor Paglen’s The Last Pictures.

EquityBot World Tour

Art projects are like birthing little kids. You have grand aspirations but never know how they’re going to turn out. And no matter what, you love them.

20151125 125225

It’s been a busy year for EquityBot. I didn’t expect at all last year that my stock-trading algorithm Twitterbot would resonate so well with curators, thinkers and general audiences. I’ve been very pleased with how well this “child” of mine has been doing.

This year, from August to December, it has been exhibited in 5 different venues in 4 countries. They include MemFest 2015 (Bilbao), ISEA 2015 (Vancouver), MoneyLab 2: Economies of Dissent (Amsterdam) and Bay Area Digitalists (San Francisco).

Of course, it helps the narrative that EquityBot is doing incredibly well, with a return rate (as of December 4th) of 19.5%. I don’t have the exact figures, but the S&P 500 for this time period, according to my calculations, is in the neighborhood of -1.3%.

Screen Shot 2015-12-05 at 9.13.20 AM

 

The challenge with this networked art piece is how to display it. I settled on making a short video, with the assistance of a close friend, Mark Woloschuk. This does a great job of explaining how the project works.

And, accompanying it is a visual display of vinyl stickers, printed on the vinyl sticker machine at the Creative Workshops at Autodesk Pier 9, where I once had a residency and now work (part-time).

EquityBot_installation_screen_c

 

from-columbus-show

EquityBot Goes to ISEA

EquityBot will be presented at this year’s International Symposium on Electronic Art in Vancouver. The theme is Disruption. You can always follow EquityBot here: @equitybot.

EquityBot is an automated stock-trading algorithm that uses emotions on Twitter as the basis for investments in a simulated bank account.

This art project poses the question: can an artist create a stock-trading algorithm that will outperform professional managed accounts?

The original EquityBot, what I will call version 1, launched on October 28th via the Impakt organization, which supported the project last fall during an artist residency.

I intended for it to run for 6 months and then to assess its performance results. I ended up letting it run a little bit longer (more on this later).

I revamped EquityBot about a month ago. The new version is doing *great*, with an annual rate of return of 10.86%. Most of this is due to some early investments in Google, whose stock price has been doing fantastically well.

equitybot-isea-8emotions-1086percent

How does EquityBot work? During stock market hours, EquityBot scrapes Twitter to determine the frequency of eight basic human emotions: anger, fear, joy, disgust, anticipation, trust, surprise and sadness.

equitybot-8emotions

The software code captures fluctuations in the number of tweets containing these emotions. It then correlates them to changes in stock prices. When an emotion is trending upwards, EquityBot will select a stock that follows a similar trajectory. It deems this a “correlated investment” and will buy the stock.
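
As a rough illustration of the correlation step, and not EquityBot’s actual implementation, here is a minimal Pearson-correlation sketch between a series of hourly tweet counts for one emotion and a stock’s price changes, with an assumed threshold for calling something a correlated investment.

// A minimal sketch of the correlation step: Pearson correlation between
// hourly tweet counts for one emotion and a stock's hourly price changes.
// Illustrative data and threshold only, not EquityBot's actual code.
#include <cmath>
#include <cstdio>
#include <vector>

double pearson(const std::vector<double>& a, const std::vector<double>& b) {
    size_t n = a.size();
    double meanA = 0, meanB = 0;
    for (size_t i = 0; i < n; ++i) { meanA += a[i]; meanB += b[i]; }
    meanA /= n; meanB /= n;
    double cov = 0, varA = 0, varB = 0;
    for (size_t i = 0; i < n; ++i) {
        cov  += (a[i] - meanA) * (b[i] - meanB);
        varA += (a[i] - meanA) * (a[i] - meanA);
        varB += (b[i] - meanB) * (b[i] - meanB);
    }
    return cov / std::sqrt(varA * varB);
}

int main() {
    // Illustrative data: hourly "anger" tweet counts vs. hourly % stock changes.
    std::vector<double> angerTweets = {120, 150, 180, 160, 210, 240};
    std::vector<double> stockChange = {0.1, 0.3, 0.5, 0.4, 0.8, 0.9};

    double r = pearson(angerTweets, stockChange);
    std::printf("correlation = %.2f\n", r);
    if (r > 0.7)   // assumed threshold for a "correlated investment"
        std::printf("-> candidate for a correlated investment\n");
    return 0;
}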

equitybot_correlation_graph

The ISEA version of EquityBot will run for another 6 months or so. In version 1, I tracked 24 different emotions, all based on the Plutchik wheel.

Plutchik-wheel.svg_1

 

The problem I found was that this was too many emotions to track. Statistically speaking, there were too few tweets for many of the emotions for the correlation code to function properly.

The only change with the ISEA version (what I will call v1.1) is that it now tracks eight emotions instead of 24.

popular-emotions

How did v1 of EquityBot perform? It came out of the gates super-strong, hitting a high point of 20.21%. Wowza. These are also some earlier data-visualizations, which have since improved slightly.
equitybot-nov26-2021percent

But a month later, by December 15th, EquityBot had dipped down to -4.58%. Yikes. These are the vicissitudes of the market over a short time-span.

equitybot-dec15-minus-458percent

 

By January 21st 2015, EquityBot was almost back to even at -0.96%.

 

equitybot-jan21-minus096percent

Then by February 4th, 2015, EquityBot was back at a respectable 5.85%.

equitybot-feb4-585percent

And on March 1st, it was doing quite well at 7.36%.

equitybot-march1-736percent

I let the experiment run until June 11th. The date was arbitrary, but -9.15% was the end result. This was pretty terrible.

equitybot-jun11-minus915percent

And which emotions performed the “best”? The labels aren’t on this graph, but the ones that were doing well were Trust and Terror. The worst…was Rage (extreme Anger).

equitybot-investing-results-jun11

 

How do other managed accounts perform? According to the various websites, these are the numbers I’ve found.

Janus (Growth & Income): 7.35%
Fidelity (VIP Growth & Income): 4.70%
Franklin (Large Cap Equity): 0.46%
American Funds (The Income Fund of America): -1.23%
Vanguard (Growth and Income): 4.03%

This would put EquityBot v1.0 dead last. Good thing this was a simulated bank account.

I’m hoping that v1.1 will do better. Eight emotions. Let’s see how it goes.

 

Machine Data Dreams: Barbie Video Girl Cam

One of the cameras they have here at the Signal Culture Residency is the Barbie Video Girl cam. This was a camera embedded inside a Barbie doll, produced in 2010.

The device was discontinued, most notably after the FBI accidentally leaked a warning about possible predatory misuses of the camera, which is patently ridiculous.

The interface is awkward. The camera can’t be remotely activated. It’s troublesome to get the files off the device. The resolution is poor, but the quality is mesmerizing.

 

barbie_disassembly_1

The real perversion is the way you have to change the batteries for the camera, by pulling down Barbie’s pants and then opening up her leg with a screwdriver.

b-diss

I can only imagine kids wondering if the idealized female form is some sort of robot.

barbie_disassembly_3

The footage it takes is great. I brought it first to the local antique store, where I shot some of the many dolls for sale.

 

 

And, of course, I had to hit up the machines at Signal Culture to do a live analog remix using the Wobbulator and Jones Colorizer.

In the evening, as dusk approached, I took Barbie to the Evergreen Cemetery in Owego, which has gravestones dating from the 1850s and is still an active burial ground.

Here, Barbie contemplated her own mortality.

barbie_cemetery barbie_close_cross barbie_good barbie_gravestone_1 barbie_headstore barbie_warren

barbie_cemetery_mother

It was disconcerting for a grown man to be holding a Barbie doll with an outstretched arm to capture this footage, but I was pretty happy with the results.

I made this short edit.

And remixed with the Wobbulator. I decided to make a melodic harmony (life), with digital noise (death) in a move to mirror the cemetery — a site of transition between the living and the dead.

How does this look in my Machine Data Dreams software?

You can see the waveform here — the 2nd channel is run through the Critter & Guitari Video Scope.

Screen Shot 2015-08-04 at 10.43.09 AM

And the 3D model looks promising, though once again, I will work on these post-residency.

Screen Shot 2015-08-04 at 10.45.04 AM

Machine Data Dreams: Critter & Guitari Video Scope

Not to be confused with Deleuze and Guattari, Critter & Guitari is a company that makes various hardware music synths.

For my new project, Machine Data Dreams, I’m looking at how machines might “think”, starting with the amazing analog video machines at Signal Culture.

signal_culture-fullsetup

This morning, I successfully stabilized my Arduino data logger. This captures the raw video signal from any device with RCA outputs and stores values at a sampling rate of ~3600 Hz.

It obviously misses a lot of the samples, but that’s the point: a machine-to-machine listener, bypassing any sort of standard digitizing software.
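
For anyone curious what such a logging loop might look like, here is a hypothetical Arduino-style sketch, not the actual logger firmware: sample the conditioned composite signal on an analog pin at roughly the target rate and stream raw values over serial. The pin, interval and serial format are assumptions.

// Hypothetical Arduino-style sketch of the logging loop. The pin, the ~3600 Hz
// target rate and the serial format are assumptions, not the real firmware.
const int SIGNAL_PIN = A0;                 // RCA signal, conditioned to 0-5V (assumed)
const unsigned long INTERVAL_US = 278;     // ~3600 samples per second

unsigned long lastSample = 0;

void setup() {
    Serial.begin(115200);                  // fast enough to keep up with the sampling
}

void loop() {
    unsigned long now = micros();
    if (now - lastSample >= INTERVAL_US) {
        lastSample = now;
        int value = analogRead(SIGNAL_PIN);  // 10-bit raw sample
        Serial.println(value);               // one sample per line
    }
}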

data_logger

For my first data-logging experiment, I decided to focus on this device, the Critter & Guitari Video Scope, which takes audio and converts it to a video waveform.

critterguitari_3 critterguitari_2 Crittcritterguitari_1

Using the synths, I patched and modulated various waveforms. I had never worked with this kind of system until a few days ago, so I’m new to the concept of control voltages.

audio_sythn

This is the 15-minute composition that I made for the data-logger.

Critter & Guitari Videoscope Composition (below)

And the captured output, in my custom OpenFrameworks software.

 

Screen Shot 2015-08-02 at 10.56.15 PM

The 3D model is very preliminary at this point, but I am getting some solid waveform output into a 3D shape. I’ll be developing this in the next few months; since I only have a week at Signal Culture, I’ll tackle the 3D-shape generation later.

Screen Shot 2015-08-02 at 11.02.25 PM

My data logger can handle 2 channels of video, so I’m experimenting with outputting the video signal as sound and then running it back through the C&G Videoscope.

This is the output of the Amiga Harmonizer, which looks great by itself. The video signal, however, output as sound, comes out like noise, as expected.

But the waveforms are compelling. There is a solid band at the bottom, which is the horizontal sync pulse. This is the signature of any composite (NTSC) device.

2000px-Composite_Video.svg

 

So, every device I log should have this signal at the bottom, which you can see below.

Screen Shot 2015-08-02 at 10.58.12 PM

Once again, the 3D forms I’ve generated in OpenFrameworks and then opened up in Meshlab are just to show that I’m capturing some sort of raw waveform data.

Screen Shot 2015-08-02 at 11.00.14 PM

Atari Adventure Synth

Hands down, my favorite Atari game when I was a kid was Adventure (2). The dragons looked like giant ducks, your avatar was just a square, and a bat wreaked chaos by stealing your objects.

In the ongoing research for my new Machine Data Dreams project, beginning here at Signal Culture, I’ve been playing with the analog video and audio synths.

Yesterday afternoon, I explored the town of Owego. I ran across a used DVD, CD & electronics store and bought an Atari Flashback Console for $25. I didn’t even know these existed.

blog_pic

I can plug it directly into their video synth system. After futzing around with the various patch cables, I came up with this 5-minute composition, which shows me playing the game. The audio sounds like marching with dirty noise levels.

Also, here is the latest 3D model from my code, which now has a true 3D axis for data-plotting.

Time is one axis, video signal is another, audio signal is the third.
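
Here is a minimal, plain-C++ stand-in for that plotting step, not my actual OpenFrameworks code: map each pair of samples to a 3D point (x for time, y for the video level, z for the audio level) and write an ASCII .ply point cloud that Meshlab can open. The sample values are illustrative.

// Plain-C++ stand-in for the OpenFrameworks plotting code: each sample becomes
// a 3D point (x = time, y = video level, z = audio level), written as an ASCII
// .ply point cloud that Meshlab can open. Data below is illustrative.
#include <cstdio>
#include <utility>
#include <vector>

int main() {
    // Pairs of (video, audio) samples from the two logger channels.
    std::vector<std::pair<int,int>> samples = {
        {512, 300}, {600, 340}, {420, 500}, {700, 280}, {480, 450}
    };

    FILE* f = std::fopen("waveform_cloud.ply", "w");
    std::fprintf(f, "ply\nformat ascii 1.0\nelement vertex %zu\n"
                    "property float x\nproperty float y\nproperty float z\nend_header\n",
                 samples.size());
    for (size_t i = 0; i < samples.size(); ++i) {
        float x = (float)i;                          // time axis
        float y = samples[i].first  / 1023.0f;       // video signal axis
        float z = samples[i].second / 1023.0f;       // audio signal axis
        std::fprintf(f, "%f %f %f\n", x, y, z);
    }
    std::fclose(f);
    return 0;
}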

Screen Shot 2015-07-31 at 9.26.05 PM

And a crude frequency plot.

Screen Shot 2015-08-01 at 3.03.24 PM

 

Bad Data: SF Evictions and Airbnb

The inevitable conversation about evictions comes up at every San Francisco party…art organizations closing, friends getting evicted…the city is changing. It has become a boring topic, yet it is absolutely, completely 100% real.

For the Bad Data series — 12 data-visualizations depicting socially-polarized, scientifically dubious and morally ambiguous datasets, each etched onto an aluminum honeycomb panel — I am featuring two works for exactly this reason: 18 Years of Evictions in San Francisco and 2015 Airbnb Listings. These two etchings are the centerpieces of the show.

evictions_airbnb

This is the reality of San Francisco: it is changing, and the data is ‘bad’ — not in the sense of being inaccurate, but rather in the deeper sense of cultural malaise.

By the way, the reception for the “Bad Data” show is this Friday (July 24, 2015) at A Simple Collective, and the show runs through August 1st.

The Anti-Eviction Mapping Project has done a great job of aggregating data on this discouraging topic, hand-cleaning it and producing interactive maps that animate over time. They’re even using the Stamen map tiles, which are the same ones that I used for my Water Works project.

Screen Shot 2015-07-23 at 4.52.36 PM

When I embarked on the Bad Data series, I reached out to the organization and they assisted me with their datasets. My art colleagues may not know this, but I’m an old-time activist in San Francisco, which helped in getting the data; I know that the story of evictions is not new, though it has certainly never been at this scale.

In 2001, I worked with a now-defunct video activist group called Sleeping Giant, making short videos in the era when Final Cut Pro made video-editing affordable and anyone with a DV camera could make their own videos. We edited our work, sold DVDs and held local screenings, stirring up the activist community and telling stories from the point-of-view of people on the ground. Sure, now we have Twitter and social media, but at the time, this was a huge deal in breaking apart the top-down structures of media dissemination.

Here is No Nos Vamos, a hastily-edited video about evictions in San Francisco. Yes, this was 14 years ago.

I’ve since moved away from video documentary work and towards making artwork: sculpture, performance, video and more. The video-activist work and documentary video in general felt overly confining as a creative tool.

My current artistic focus is to transform datasets using custom software code into physical objects. I’ve been working with the amazing fabrication machines at Autodesk’s Pier 9 facility to make work that was not previously possible.

This dataset (also provided through the SF Rent Board) includes all the no-fault evictions in San Francisco. I got my computer geek on…well, I do try to use my programming powers for non-profit work and artwork.

I mapped the data into vector shapes using the open-source C++ toolkit OpenFrameworks, writing code that transformed the ~9300 data points into plottable shapes, which I could then open in Illustrator. I did some work tweaking the strokes and styles.
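
As a sketch of that kind of transformation, in plain C++ rather than the original OpenFrameworks code, the snippet below projects a few latitude/longitude points into page coordinates and writes them as small circles in an SVG file that Illustrator can open. The bounding box, page size and sample points are illustrative.

// Sketch of the data-to-vector-shapes step (plain C++, not the original code):
// project lat/lon eviction points into page coordinates and write small
// circles to an SVG file for Illustrator. Coordinates below are illustrative.
#include <cstdio>
#include <vector>

struct LatLon { double lat, lon; };

int main() {
    std::vector<LatLon> evictions = {
        {37.7749, -122.4194}, {37.7599, -122.4148}, {37.7833, -122.4090}
    };
    // Rough bounding box around San Francisco (assumed).
    const double latMin = 37.70, latMax = 37.82, lonMin = -122.52, lonMax = -122.35;
    const double pageSize = 508.0;  // 20 inches at 25.4 mm/in

    FILE* f = std::fopen("evictions.svg", "w");
    std::fprintf(f, "<svg xmlns='http://www.w3.org/2000/svg' width='%.0fmm' height='%.0fmm'>\n",
                 pageSize, pageSize);
    for (const LatLon& p : evictions) {
        double x = pageSize * (p.lon - lonMin) / (lonMax - lonMin);
        double y = pageSize * (latMax - p.lat) / (latMax - latMin);  // flip y for page space
        std::fprintf(f, "<circle cx='%.2f' cy='%.2f' r='1.0' fill='none' stroke='black'/>\n", x, y);
    }
    std::fprintf(f, "</svg>\n");
    std::fclose(f);
    return 0;
}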

sf_evictions_20x20

This is what the etching looks like from above, once I ran it through the waterjet. There were a lot of settings and tests to get to this point, but the final results were beautiful.

waterjet-overhead

The material is a 3/4″ aluminum honeycomb. I tuned the high pressure of the waterjet to pierce the top layer, but not the bottom layer. However, the water has to go somewhere. The collisions against the honeycomb produce unpredictable results.

…just like the evictions themselves. We don’t know the full effect of displacement, but can only guess as the city is rapidly becoming less diverse. The result is below, a 20″ x 20″ etching.

Bad Data: 18 Years of San Francisco Evictions

baddata_sfevictions

The Airbnb debate is a little less clear-cut. Yes, I do use Airbnb. It is incredibly convenient. I save money while traveling and also see neighborhoods I’d otherwise miss. However, the company’s effect on city economies is a contentious one.

For example, there is the 14% hotel tax owed to the city of San Francisco, which, after 3 years, they finally consented to paying. Note: this was only after they had a successful business.

There also seems to be a long-term effect on rent. Folks, and I’ve met several who do this, are renting out their places as tenants on Airbnb. Some don’t actually live in their apartments any longer. The effect is to take a unit off the rental market and turn it into a vacation rental. Some argue that this also skirts rent-control law in the first place, which was designed as a compromise between landlords and tenants.

There are potential zoning issues, as well…a myriad of issues around Airbnb.

BAD DATA: 2015 AIRBNB LISTINGS, etching file

airbnb_sf

In any case, the locations of the Airbnb rentals (self-reported, not a complete list) certainly fit the premise of the Bad Data series. It’s an amazing dataset. Thanks to darkanddifficult.com for this data source.

BAD DATA: 2015 Airbnb Listings

baddata_airbnb

EquityBot goes live!

During my time at Impakt as an artist-in-residence, I have been working on a new project called EquityBot, which is an online commission from Impakt. It fits well into the Soft Machines theme of the festival: where machines integrate with the soft, emotional world.

EquityBot exists entirely as a networked art or "net art" project, meaning that it lives in the "cloud" and has no physical form. For those of you who are Twitter users, you can follow it on Twitter: @equitybot

01_large

What is EquityBot? Many people have asked me that question.

EquityBot is a stock-trading algorithm that "invests" in emotions such as anger, joy, disgust and amazement. It relies on a classification system of twenty-four emotions developed by psychologist and scholar Robert Plutchik.

Plutchik-wheel.svg

How it works
During stock market hours, EquityBot continually tracks worldwide emotions on Twitter to gauge how people are feeling. In the simple data-visualization below, which is generated automatically by EquityBot, the larger circles indicate the more prominent emotions that people are Tweeting about.

At this point in time, just 1 hour after the stock market opened on October 28th, people were expressing emotions of disgust, interest and fear more prominently than others. During the course of the day, the emotions contained in Tweets continually shift in response to world events and many other unknown factors.

twitter_emotions

EquityBot then uses various statistical correlation equations to find pattern matches between the changes in emotions on Twitter and fluctuations in stock prices. The details are thorny, so I’ll skip the boring stuff. My time did involve a lot of work with scatterplots, which looked something like this.

correlation

Once EquityBot sees a viable pattern, for example that "Google" is consistently correlated to "anger" and that anger is a trending emotion on Twitter, EquityBot will issue a BUY order on the stock.

Conversely, if Google is correlated to anger, and the Tweets about anger are rapidly going down, EquityBot will issue a SELL order on the stock.
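
Put as a simplified rule, and with thresholds that are my own assumptions rather than EquityBot's actual values, the decision logic looks something like this:

// A simplified sketch of the trading rule described above: given a stock
// that correlates with an emotion and the emotion's recent trend on Twitter,
// decide BUY, SELL or HOLD. The thresholds are assumptions for illustration.
#include <cstdio>

enum class Action { BUY, SELL, HOLD };

Action decide(double correlation, double emotionTrend) {
    const double corrThreshold  = 0.7;    // assumed: strong enough correlation
    const double trendThreshold = 0.05;   // assumed: 5% change in tweet volume
    if (correlation < corrThreshold) return Action::HOLD;
    if (emotionTrend >  trendThreshold) return Action::BUY;   // emotion rising
    if (emotionTrend < -trendThreshold) return Action::SELL;  // emotion falling
    return Action::HOLD;
}

int main() {
    // e.g. a stock correlated 0.82 with "anger", and anger tweets up 12% today.
    Action a = decide(0.82, 0.12);
    std::printf("%s\n", a == Action::BUY ? "BUY" : a == Action::SELL ? "SELL" : "HOLD");
    return 0;
}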

EquityBot runs a simulated investment account, seeded with $100,000 of imaginary money.

In my first few days of testing, EquityBot “lost” nearly $2000. This is why I’m not using real money!

Disclaimer: EquityBot is not a licensed financial advisor, so please don’t follow its stock investment patterns.

account

The project treats human feelings as tradable commodities. It will track how "profitable" different emotions are over the course of months. As a social commentary, I propose a future scenario in which just about anything can be traded, including that which is ultimately human: the very emotions that separate us from a machine.

If a computer cannot be emotional, at the very least it can broker trades of emotions on a stock exchange.

affect_performance

As a networked artwork, EquityBot generates these simple data visualizations autonomously (they will get better, I promise).

Its Twitter account (@equitybot) serves as a performance vehicle, where the artwork "lives". Also, all of these visualizations are interactive and available on the EquityBot website: equitybot.org.

I don’t know if there is a correlation between emotions in Tweets and stock prices. No one does. I am working with the hypothesis that there is some sort of pattern involved. We will see over time. The project goes "live" on October 29th, 2014, the day of the opening of the Impakt Festival, and I will let the first experiment run for 3 months to see what happens.

Feedback is always appreciated, you can find me, Scott Kildall, here at: @kildall.