From Data Visualization

Data Crystals at EVA

I just finished attending the EVA London conference this week, where I did a demonstration of my Data Crystals project. This is the formal abstract for the demonstration; writing it helped clear up some of my ideas about Data Crystals and the digital fabrication of physical sculptures and installations.


Embodied Data and Digital Fabrication: Demonstration with Code and Materials
by Scott Kildall


Data has tangible consequences in the real world. Accordingly, physical data-visualizations have the potential to engage with the actual effects of the data itself. A data-generated sculpture or art installation is something that people can move around, through or inside of. They experience the dimensionality of data with their own natural perceptual mechanisms. However, creating physical data visualizations presents unique material challenges since these objects exist in stasis, rather than in a virtual space with a guided UX design. In this demonstration, I will present my recent research into producing sculptures from data using my custom software code that creates files for digital fabrication machines.


The overarching question that guides my work is: what does data look like? Referencing architecture, my artwork such as Data Crystals (figure 2) executes code that maps, stacks and assembles data “bricks” to form unique digital artifacts. The forms of these objects are impossible to predict from the original data-mapping, and the clustering code will produce different variations each time it runs.
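As a rough sketch of the idea (not the actual Data Crystals algorithm, which is more involved), data “bricks” on an integer lattice can be pulled step by step toward their common centroid until they lock together:

```cpp
#include <set>
#include <tuple>
#include <vector>

// One data "brick": a unit cube snapped to an integer lattice.
struct Brick { int x, y, z; };

// One clustering pass: pull each brick one lattice step toward the
// current centroid, unless the target cell is already occupied.
// Returns how many bricks moved; call repeatedly until it returns 0.
int clusterStep(std::vector<Brick>& bricks) {
    double cx = 0, cy = 0, cz = 0;
    for (const Brick& b : bricks) { cx += b.x; cy += b.y; cz += b.z; }
    cx /= bricks.size(); cy /= bricks.size(); cz /= bricks.size();

    std::set<std::tuple<int, int, int>> occupied;
    for (const Brick& b : bricks) occupied.insert({b.x, b.y, b.z});

    int moved = 0;
    for (Brick& b : bricks) {
        // Step of -1, 0 or +1 along each axis, toward the centroid.
        int dx = (cx > b.x) - (cx < b.x);
        int dy = (cy > b.y) - (cy < b.y);
        int dz = (cz > b.z) - (cz < b.z);
        std::tuple<int, int, int> target{b.x + dx, b.y + dy, b.z + dz};
        if ((dx || dy || dz) && !occupied.count(target)) {
            occupied.erase({b.x, b.y, b.z});
            occupied.insert(target);
            b.x += dx; b.y += dy; b.z += dz;
            ++moved;
        }
    }
    return moved;
}
```

Repeated calls converge once every brick is blocked by its neighbors; because the centroid shifts as bricks move, the final clustered form is hard to predict from the input coordinates alone.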

Other sculptures remove material through intense kinetic energy. Bad Data (figure 3) and Strewn Fields (figure 1) both use the waterjet machine to gouge data into physical material using a high-pressure stream of water. The material in this case — aluminum honeycomb panels and stone slabs — reacts in adverse ways as it splinters and deforms due to the violence of the machine.

2.1 Material Expression

Physical data-visualizations act on materials instead of pixels, and so there is a dialogue between the data and its material expression. Data Crystals depict municipal data of San Francisco and have an otherworldly, ghostly quality of stacked and intersecting cubes. The data is served from a web portal and is situated in the urban architecture, so the 3D-printed bricks are an appropriate form of expression.

Bad Data captures data that is “bad” in the shallow sense of the word, rendering datasets such as Internet Data Breaches, Worldwide UFO Sightings or Mass Shootings in the United States. The water from the machine gouges and ruptures aluminum honeycomb material in unpredictable ways, similar to the way data tears apart our social fabric. This material is emblematic of the modern era, as aluminum began to be mass-refined at the end of the 19th century. These datasets exemplify conflicts of our times such as science/heresy and digital security/infiltration.

2.2 Frozen in Time

Once created, these sculptures cannot be endlessly altered like screen-based data visualizations. This challenges the artwork to work with fixed data or to consider the effect of capturing a specific moment.

For example, Strewn Fields is a data-visualization of meteorite impact data. When a large asteroid enters the earth's atmosphere, it does so at a high velocity of approximately 30,000 km/hour. Before impact, it breaks up into thousands of small fragments, which are meteorites. Usually they hit our planet in the ocean or at remote locations. The intense energy of the waterjet machine gouges the surface of each stone, mirroring the raw kinetic energy of a planetoid colliding with the surface of the Earth. The static etching captures the act of impact, and survives as an antithetical gesture to the event itself. The actual remnants and debris (the meteorites) have been collected, sold and scattered; what remains is just a dataset, which I have translated into a physical form.

2.3 Formal Challenges to Sculpture

This sort of “data art” challenges the formal aspects of sculpture. Firstly, machine-generated artwork removes the artist’s hand from the work, building upon the legacy of algorithmic artwork by Sol LeWitt and others. Execution of this work is conducted by the stepper motor rather than by gestures of the artist.

Secondly, the input data remains an unknowable form until it is actually rendered. The patterns are neither mathematical nor random, giving a certain quality of perceptual coherence to the work. Data Crystals: Crime Incidents has 30,000 data points. Using code-based clustering algorithms, it creates forms only recently possible with the combination of digital fabrication and large amounts of data.


My sculpture-generation tools are custom-developed in C++ using Open Frameworks, an open source toolkit. My code repositories are on GitHub. My software bypasses any conventional modeling package. It can handle very complex geometry, and more importantly doesn’t have the “look” that a program such as Rhino/Grasshopper generates.

3.1 Direct-to-Machine

My process of data-translation is optimized for specific machines. Data Crystals generates STL files, which most 3D printers can read. My code generates PostScript (.ps) files for the waterjet machine. The conversation with the machine itself is direct. During the production and iteration process, once I define the workflow, the refinements proceed quickly. The process is optimized, like the machine that creates the artwork.
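As a simplified sketch of this direct-to-machine approach (not my production code), a few lines of C++ are enough to emit a bare-bones PostScript file with one stroked circle per data point:

```cpp
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Emit a minimal PostScript file that strokes one circle per data
// point. A machine controller that accepts .ps input can follow
// these paths directly; no CAD/CAM package is involved.
std::string makePostScript(const std::vector<std::pair<double, double>>& pts,
                           double radius) {
    std::ostringstream ps;
    ps << "%!PS-Adobe-3.0\n";
    for (const auto& p : pts) {
        // x y r 0 360 arc draws a full circle at (x, y)
        ps << "newpath " << p.first << " " << p.second << " "
           << radius << " 0 360 arc stroke\n";
    }
    ps << "showpage\n";
    return ps.str();
}
```

The point of writing the file directly is control: every mark on the page traces back to a data point, with nothing added by an intermediate modeling tool.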

3.2 London Layering

In my demonstration, I will use various open data from London. I focus not on data that I want to acquire, but rather, data that I can acquire. I will demonstrate a custom build of Data Crystals which shows multiple layers of municipal data, and I will run clustering algorithms to create several Data Crystals for the City of London.


Figure 1: Strewn Fields (2016)
by Scott Kildall
Waterjet-etched stone

Figure 2:
Data Crystals: Crime Incidents (2014)
by Scott Kildall
3D-print mounted on wood

Figure 3:
Bad Data: U.S. Mass Shootings (2015)
by Scott Kildall
Waterjet-etched aluminum honeycomb panel

GPS Tracks

I am building water quality sensors which will capture geolocated data. This was my first test with this technology. This is part of my ongoing research at the Santa Fe Water Rights residency (March-April) and for the American Arts Incubator program in Thailand (May-June).

This GPS data-logging shield from Adafruit arrived yesterday and after a couple of hours of code-wrestling, I was able to capture the latitude and longitude to a CSV data file.

This is me walking from my studio at SFAI to the bedroom. The GPS signal at this range (100m) fluctuates greatly, but I like the odd compositional results. I did the plotting in OpenFrameworks, my tool-of-choice for displaying data that will be later transformed into sculptural results.

The second one is me driving in the car for a distance of about 2km. The tracks are much smoother. If you look closely, you can see where I stopped at the various traffic lights.

Now, GPS tracking alone isn’t super-compelling, and there are many mapping apps that will do this for you. But as soon as I can attach water sensor data to latitude/longitude, then it can transform into something much more interesting as the data will become multi-dimensional.

Machine Data Dreams @ Black & White Projects

This week, I opened a solo show called Machine Data Dreams, at Black & White Projects. This was the culmination of several months of work where I created three new series of works reflecting themes of data-mapping, machines and mortality.

The opening reception is Saturday, November 5th from 7-9pm. Full info on the event is here.

Two of the artworks are from my artist-in-residency with SETI and the third is a San Francisco Arts Commission Grant.

All of the artwork uses custom algorithms to translate datasets into physical form, which is an ongoing exploration that I’ve been focusing on in the last few years.

Each set of artwork deserves more detail but I’ll stick with a short summary of each.

Fresh from the waterjet, Strewn Fields visualizes meteorite impact data at four different locations on Earth.

Strewn Fields: Almahata Sitta

As an artist-in-residence with SETI, I worked with planetary scientist, Peter Jenniskens to produce these four sculptural etchings into stone.

When an asteroid enters the earth's atmosphere, it does so at high velocity — approximately 30,000 km/hour. Before impact, it breaks into thousands of small fragments — meteorites — which spread over areas as large as 30 km. Usually the space debris falls into the ocean or hits at remote locations where scientists can’t collect the fragments.

And, only recently have scientists been able to use GPS technology to geolocate hundreds of meteorites, which they also weigh as they gather them. The spread patterns of data are called “Strewn Fields”.

Dr. Jenniskens is not only one of the world’s experts on meteorites but also led the famous 2008 TC3 fragment recovery in Sudan of the Almahata Sitta impact.

With four datasets that he both provided and helped me decipher, I used the high-pressure waterjet machine at Autodesk’s Pier 9 Creative Workshops, where I work as an affiliate artist and also on their shop staff, to create four different sculptures.

Strewn Fields: Sutter’s Mill

The violence of the waterjet machine gouges the surface of each stone, mirroring the raw kinetic energy of a planetoid colliding with the surface of the Earth. My static etchings capture the act of impact, and survive as an antithetical gesture to the event itself. The actual remnants and debris — the meteorites themselves — have been collected, sold and scattered and what remains is just a dataset, which I have translated into a physical form.

A related work, Machine Data Dreams, is a series of data-etched memorials to the camcorder, a consumer device which birthed video art by making video production accessible to artists.


This project was supported by a San Francisco Individual Arts Commission grant. I did the data-collection itself during an intense week-long residency at Signal Culture, which has many iconic and working camcorders from 1969 to the present.

SONY VIDEORECORDER (1969)
PIXELVISION CAMERA (1987)

During the residency, I built a custom Arduino data-logger which captured the raw electronic video signals, bypassing any computer or digital-signal processing software. With custom software that I wrote, I transformed these into signals that I could then etch onto 2D surfaces. I paired each etching with its source video in the show itself.


Celebrity Asteroid Journeys is the last of the three artworks and is also a project from the SETI Artist in Residency program, though it is definitively more light-hearted than Strewn Fields.

Celebrity Asteroid Journeys charts imaginary travels from one asteroid to another. There are about 700,000 known asteroids, with charted orbits. A small number of these have been named after celebrities.

Working with asteroid orbital data from JPL and estimated spaceship velocities, I charted 5 journeys between different sets of asteroids.

My software code ran calculations over 2 centuries (2100–2300) to figure out the best path between four celebrities. I then transposed the 3D data into 2D space to make silkscreens with the dates of each stop.
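One simple way to do such a transposition (a sketch, not necessarily the projection I actually used) is an orthographic projection: tilt the scene about one axis and then drop the depth coordinate. The tilt angle is an aesthetic choice, picked per print:

```cpp
#include <cmath>

struct Point2D { double x, y; };

// Flatten a 3D heliocentric position onto the 2D picture plane:
// rotate about the x-axis by tiltRadians, then discard depth.
Point2D transpose3Dto2D(double x, double y, double z, double tiltRadians) {
    Point2D p;
    p.x = x;
    p.y = y * std::cos(tiltRadians) - z * std::sin(tiltRadians);
    return p;
}
```

With a tilt of zero you get a straight top-down view; tilting exposes the out-of-plane inclination of each asteroid's orbit.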


This was my first silkscreened artwork, which was a messy antidote to the precise cutting of the machine tools at Autodesk.

All of these artworks depict the ephemeral nature of the physical body in one form or another. Machine Data Dreams is a clear memorial itself, a physical artifact of the cameras that once were cutting-edge technology.

With Celebrity Asteroid Journeys, the timescale is unreachable. None of us will ever visit these asteroids. And the named asteroids are memorials themselves to celebrities (stars) who are now dead or who soon, in the relative sense of the word, will no longer be with us.

Finally, Strewn Fields captures the potential for an apocalyptic event from above. Although these are merely minor impacts, it is nevertheless the reality that an extinction-level event could wipe out the human species with a large rock from space. This ominous threat of death reminds us that our own species is just a blip in Earth’s history of life.


Waterjet Etching Tests

For the last several weeks, I have been conducting experiments with etching on the waterjet — a digital fabrication machine that emits a 55,000 psi stream of water, usually used for precision cutting. The site for this activity is Autodesk Pier 9 Creative Workshops. I continue to have access to their amazing fabrication machines, where I work part-time as one of their Shop Staff.

My recent artwork focuses on writing software code that transforms datasets into sculptures and installations, essentially physical data-visualizations. One of my new projects is called Strewn Fields, which is part of my work as an artist-in-residence with the SETI Institute. I am collaborating with the SETI research scientist, Peter Jenniskens, who is a leading expert on meteor showers and meteorite impacts. My artwork will be a series of data-visualizations of meteorite impacts at four different sites around the globe.

While the waterjet is normally used for cutting stiff materials like thick steel, it can also etch by using lower water pressure, so that it does not pierce the material. OMAX — the company that makes the waterjet that we use at Pier 9 — does provide a simple etching software package called Intelli-ETCH. The problem is that it will etch the entire surface of the material. This is appropriate for some artwork, such as my Bad Data series, where I wanted to simulate raster lines.

Meth Labs in Albuquerque

The technique that I apply to my artistic practice is to write custom software that generates specific files for digital fabrication machines: laser-cutters, 3D printers, the waterjet and CNC machines. The look-and-feel is unique, unlike that of the conventional tools that artists often work with.

For meteorite impacts, I first map data like the pattern below (this is from a 2008 asteroid impact). For these impacts, it doesn’t make sense to etch the entire surface of my material, but rather, just pockets, simulating how a meteorite might hit the earth.


I could go the route of working with a CAM package and generating paths that work with the OMAX Waterjet. Fusion 360 even offers a pathway to this. However, I am dealing with four different datasets, each with 400-600 data points. It just doesn’t make sense to go from a 2D mapping, into a 3D package, generate 3D tool paths and then back to (essentially) a 2D profiling machine.

So, I worked on generating my own tool paths using Open Frameworks, which outputs simple vector shapes based on the size of the data. For the tool paths, I settled on using spirals rather than left-to-right traverses, which spend too much time on the outside of the material and blow it out. The spirals produce very pleasing results.
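The core of the spiral generator can be sketched in a few lines (simplified; the real code also sizes each spiral from its data point). Note that the path starts at the outer radius and winds in toward the center, the order that worked best in my tests:

```cpp
#include <cmath>
#include <vector>

struct PathPoint { double x, y; };

// Generate an Archimedean spiral tool path for one data "pocket",
// starting at the outer radius and winding in toward the center.
// `spacing` is the radial distance between successive passes.
std::vector<PathPoint> spiralPath(double cx, double cy, double rOuter,
                                  double spacing, int stepsPerTurn) {
    const double PI = 3.14159265358979323846;
    std::vector<PathPoint> path;
    double turns = rOuter / spacing;              // one pass per `spacing`
    int n = static_cast<int>(turns * stepsPerTurn);
    for (int i = 0; i <= n; ++i) {
        double t = static_cast<double>(i) / n;    // 0 = outside, 1 = center
        double r = rOuter * (1.0 - t);
        double theta = 2.0 * PI * turns * t;
        path.push_back({cx + r * std::cos(theta), cy + r * std::sin(theta)});
    }
    return path;
}
```

The polyline this returns is what gets written into the vector file for the machine.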

My first tests were on some stainless steel scrap and you can see the results here, with the jagged areas where the water eats away at the material, which is the desired effect. I also found that you have to start the etching from the outside of the spiral and then wind towards the inside. If you start from the inside and go out, you get a nipple, like on the middle right of this test, where the water-jet has to essentially “warm-up”. I’m still getting the center divots, but am working to solve this problem.

This was a promising test, as the non-pocketed surface doesn’t get etched at all and the etching is relatively quick.


I showed this test to other people and received many raised eyebrows of curiosity. I became more diligent in my test samples and produced this etch sample with 8 spirals, with an interior path ranging from 2mm to 9mm, to test on a variety of materials.


I was excited about this material, an acrylic composite that I had leftover from a landscape project. It is 1/2″ thick with green on one side and a semi-translucent white on the other. However, as you can see, the water-jet is too powerful and ends up shattering the edges, which is less than desirable.


And then I began to survey various stone samples. I began with scavenging some material from Building Resources, which had an assortment of unnamed, cheap tiles and other samples.

Forgive me…I wish I hadn’t sat in the back row of “Rocks for Jocks” in college. Who knew that a couple decades later, I would actually need some knowledge of geology to make artwork?

I began with some harder stone — standard countertop stuff like marble and granite. I liked seeing how the spiral breaks down along the way. But, there is clearly not enough contrast. It just doesn’t look that good.



I’m not sure what stone this is, but like the marble, it’s a harder stone and doesn’t have much of an aesthetic appeal. The honed look makes it still feel like a countertop.


I quickly learned that thinner tile samples would be hard to dial in. Working with 1/4″ material like this often results in blowing out the center.


But, I was getting somewhere. These patterns started resembling an impact of sorts and certainly express the immense kinetic energy of the waterjet machine, akin to the kinetic energy of a meteorite impact.


This engineered brick was one of my favorite results from this initial test. You can see the detail on the aggregate inside.


And I got some weird results. This material, whatever it is, is simply too delicate, kind of like a pumice.


This is a cement compound of some flavor and for a day, I even thought about pouring my own forms, but that’s too much work, even for me.



I think these two are travertine tile samples and I wish I had more information on them, but alas, that’s what you get when you are looking through the lot. These are in the not-too-hard and not-too-soft zone, just where I want them to be.




I followed up these tests by hitting up several stoneyards and tiling places along the Peninsula (south of San Francisco). This basalt-like material is one of my favorite results, but is probably too porous for accuracy. Still, the fissures that it opens up in the pockets are amazing. Perhaps if I could tame the waterjet further, this would work.


This rockface/sandstone didn’t fare so well. The various layers shattered, producing unusable results.


Likewise, this flagstone was a total fail.


The non-honed quartzite gets very close to what I want, starting to look more like a data-etching. I just need to find one that isn’t so thick. This one will be too heavy to work with.


Although this color doesn’t do much for me, I do like the results of this limestone.


Here is a paver; I can’t remember which kind it is. Better notes next time! Anyhow, it is clearly too weak for the waterjet.


This is a slate. Nice results!


And a few more, with mixed results.


And if you are a geologist and have some corrections or additions, feel free to contact me.

Cistern Mapping Project Reportback

On October 11th, 2015, 18 volunteer bike and mapping aficionados gathered at my place to work on the Cistern Mapping Project — an endeavor to physically document the 170 (or so) Cisterns in San Francisco. There exists no comprehensive map of these unique underground vessels. The resulting map is here.

I personally became fascinated by them, when working on my Water Works project*, which mapped the water infrastructure of San Francisco.

The history of the cisterns is unique, and notably incomplete.


The cisterns are part of the AWSS (Auxiliary Water Supply System) of San Francisco, a water system that exists entirely for emergency use and is separate from the potable drinking water supply and the sewer system.

In the 1850s, after a series of Great Fires in San Francisco tore through the city, about 23 cisterns were built. These smaller cisterns were all in the city proper, at that time between Telegraph Hill and Rincon Hill. They weren’t connected to any other pipes and the fire department intended to use them in case the water mains were broken, as a backup water supply.

They languished for decades. Many people thought they should be removed, especially after incidents like the 1868 Cistern Gas Explosion.

However, after the 1906 Earthquake, fires once again decimated the city. Many water mains broke and the neglected cisterns helped save portions of the city. Afterward, the city passed a $5,200,000 bond and began building the AWSS in 1908. This included the construction of many new cisterns and the rehabilitation of other, neglected ones. Most of the new cisterns could hold 75,000 gallons of water. The largest one is underneath the Civic Center and has a capacity of 243,000 gallons.

The original ones, presumably rebuilt, hold much less, anywhere from 15,000 to 50,000 gallons.

Cistern109 22nd Dolores

Armed with a list of intersections of potential cistern locations, we planned to bike to each intersection, record the exact latitude and longitude, and photograph each of the cistern markers — either the circular bricks or the manholes themselves.

We had 18 volunteers, which is a huge turnout for a beautiful Sunday morning. I provided coffee and bagels, and soon folks from my different communities (the bike team, the Exploratorium and other friends) were chatting with one another.

Cistern Prep Meeting 3

One way to thank my lovely volunteers was to provide gifts. What I made for everyone were a series of moleskine notebooks with vinyl stickers of the cisterns and bikes. I was originally planning to laser-etch them, but found out that they were on the “forbidden materials” list at the Creative Workshops at Autodesk Pier 9, where I made them. Luckily, I always have a Plan B and so I made vinyl stickers instead.

Cistern Prep Meeting 4
Cistern Prep Meeting 1

Here I am, in desperate need of a haircut, greeting everyone and explaining the process. I grouped the cisterns into 10 different sets of about 10-20 each. This covered most of them, and then we paired off riders in groups of two to map out the best route for their ride.

Cistern Prep Meeting 2

Some of the riders were friends beforehand and others became friends during the course of riding together. Here, you can see two riders figuring out the ideal route for their morning. Some folks were smart and brought paper maps, too!

Cistern Prep Meeting 7

Here are the bike-mappers just before embarking on their day-of-mapping. Great smiles all around!


Cistern Group 1

I would have preferred to ride, but instead was busy arranging the spreadsheet and verifying locations. Ah, admin work.

How did we do this? Simple: each team used a GPS app and emailed me the coordinates of the cistern marker, along with a photo of the cistern: the bricks, manhole or fire hydrant. I would coordinate via email and confirm that I got the right info and slowly fill out the spreadsheet. It was a busy few hours.


The hills were steep, but fortunately we had a secret weapon: some riders from the Superpro Racing team! Here is Chris Ryan crushing the hills in Pacific Heights.

Cistern chris uphill

One reason that we traveled in pairs is that documentation can be dangerous. Sometimes we had to put folks on the edge of, or actually in, the street so they could get some great documentation.

Cisterns chris

So, how many cisterns did we map? The end result was 127 cisterns, which is about 75% of them, all in one day. We missed a few and then there were a series of outliers such as ones in Glen Park, Outer Sunset and Bayview that we didn’t quite make.

And the resulting map is here. There are still some glitches, but what I like about it is that you can now see the different intersections for each cistern. These have not been documented before, so it’s exciting to see most of them on the map.

Cistern web map

What did we discover?

Most of the cisterns are not actually marked with brick circles and just have a manhole that says “Cisterns” or even just “AWSS” on it.


What I really enjoyed, especially being in the “backroom”, was how the cyclists captured the beautiful parts of the city in the background of the photos, such as the cable car tracks.


Also, the green-capped fire hydrants usually are nearby. These are the ones that get used to fill up the cisterns by the SF Fire Department.

Green hydrant

A few were almost like their own art installations, with beautiful brickwork.

Cistern118 24th Noe

The ones in the Sunset and Richmond districts are newer and are actually marked by brick rectangles.

Cistern135 46th Geary

Thanks to the AMAZING volunteers on this day.

* Water Works is supported by a Creative Code Fellowship through Stamen Design, Autodesk and Gray Area.

EquityBot got clobbered

Just after the Dow Jones dropped 1000 points on Aug 24th (yesterday), I checked out how EquityBot was doing: an annual rate of return worse than -50%.


Crazy! Of course, this is like taking the tangent of any curve and making a projection. A day later, EquityBot is at -32%.
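The arithmetic behind that headline number is simple compounding of a short-window return out to a full year. A sketch (the 252-trading-day year is my assumption here, not necessarily EquityBot's actual method):

```cpp
#include <cmath>

// Compound a single-period return over a year's worth of periods.
// This is how a one-day crash becomes a headline "annual rate of
// return": the projection assumes the same loss repeats every
// trading day.
double annualize(double periodReturn, double periodsPerYear) {
    return std::pow(1.0 + periodReturn, periodsPerYear) - 1.0;
}
```

Under this projection, even a small daily loss, repeated for a year, compounds into a catastrophic annual figure, which is exactly why a one-day snapshot is such a misleading tangent line.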


Still not good, but if you were to invest yesterday, you could be much richer today.

I’m not that much of a gambler, so I’m glad that EquityBot is just a simulated (for now) bank account.

Bad Data: SF Evictions and Airbnb

The inevitable conversation about evictions happens at every San Francisco party: art organizations closing, friends getting evicted, the city changing. It has become a boring topic, yet it is absolutely, completely 100% real.

For the Bad Data series — 12 data-visualizations depicting socially-polarized, scientifically dubious and morally ambiguous datasets, each etched onto an aluminum honeycomb panel — I am featuring two works, 18 Years of Evictions in San Francisco and 2015 AirBnb Listings, for exactly this reason. These two etchings are the centerpieces of the show.


This is the reality of San Francisco: it is changing, and the data is ‘bad’ — not in the sense of inaccurate, but rather in the deeper sense of cultural malaise.

By the way, the reception for the “Bad Data” show is this Friday (July 24, 2015) at A Simple Collective, and the show runs through August 1st.

The Anti-Eviction Mapping Project has done a great job of aggregating data on this discouraging topic, hand-cleaning it and producing interactive maps that animate over time. They’re even using the Stamen map tiles, which are the same ones that I used for my Water Works project.

Screen Shot 2015-07-23 at 4.52.36 PM

When I embarked on the Bad Data series, I reached out to the organization and they assisted me with their data sets. My art colleagues may not know this, but I’m an old-time activist in San Francisco. This helped me with getting the datasets, for I know that the story of evictions is not new, though certainly never on this scale.

In 2001, I worked in a now-defunct video activist group called Sleeping Giant, which worked on short videos in the era when Final Cut Pro made video-editing affordable and when anyone with a DV camera could make their own videos. We edited our work, sold DVDs and had local screenings, stirring up the activist community and telling stories from the point-of-view of people on the ground. Sure, now we have Twitter and social media, but at the time, this was a huge deal in breaking apart the top-down structures of media dissemination.

Here is No Nos Vamos, a hastily-edited video about evictions in San Francisco. Yes, this was 14 years ago.

I’ve since moved away from video documentary work and towards making artwork: sculpture, performance, video and more. The video-activist work and documentary video in general felt overly confining as a creative tool.

My current artistic focus is to transform datasets using custom software code into physical objects. I’ve been working with the amazing fabrication machines at Autodesk’s Pier 9 facility to make work that was not previously possible.

This dataset (also provided through the SF Rent Board) includes all the no-fault evictions in San Francisco. I got my computer geek on…well, I do try to use my programming powers for non-profit work and artwork.

I mapped the data into vector shapes using OpenFrameworks, the open source C++ toolkit, and wrote code which transformed the ~9300 data points into plottable shapes, which I could open in Illustrator. I did some work tweaking the strokes and styles.


This is what the etching looks like from above, once I ran it through the waterjet. There were a lot of settings and tests to get to this point, but the final results were beautiful.


The material is a 3/4″ honeycomb aluminum. I tuned the high-pressure stream of the waterjet to pierce through the top layer, but not the bottom layer. However, the water has to go somewhere. The collisions against the honeycomb produce unpredictable results.

…just like the evictions themselves. We don’t know the full effect of displacement, but can only guess as the city is rapidly becoming less diverse. The result is below, a 20″ x 20″ etching.

Bad Data: 18 Years of San Francisco Evictions


The Airbnb debate is a little less clear-cut. Yes, I do use Airbnb. It is incredibly convenient. I save money while traveling and also see neighborhoods I’d otherwise miss. However, the organization's effect on city economies is a contentious one.

For example, there is the hotel tax in San Francisco, which, after 3 years, they finally consented to paying — 14% to the city of San Francisco. Note: this was after they already had a successful business.

There also seems to be a long-term effect on rent. Folks, and I’ve met several who do this, are renting out places as tenants on Airbnb. Some don’t actually live in their apartments any longer. The effect is to take a unit off the rental market and mark it as a vacation rental. Some argue that this also skirts rent-control law in the first place, which was designed as a compromise solution between landlords and tenants.

There are potential zoning issues, as well…a myriad of issues around Airbnb.

BAD DATA: 2015 AIRBNB LISTINGS, etching file


In any case, the location of the Airbnb rentals (self-reported, not a complete list) certainly fit the premise of the Bad Data series. It’s an amazing dataset. Thanks to for this data source.

BAD DATA: 2015 Airbnb Listings


Selling Bad Data

The reception for my solo show “Bad Data”, featuring the Bad Data series is this Friday (July 24, 2015) at A Simple Collective.

Date: July 24th, 2015
Time: 7-9pm
Where: ASC Projects, 2830 20th Street (btw Bryant and York), Suite 105, San Francisco

The question I had when pricing these works was: how do you sell Bad Data? The material costs were relatively low. The labor time was high. And the data sets were (mostly) public.

We came up with this price list, subject to change.

///  Water-jet etched aluminum honeycomb:

18 Years of San Francisco Evictions, 2015 | 20 x 20 inches | $1,200
Data source: The Anti-Eviction Mapping Project and the SF Rent Board

2015 AirBnB Listings in San Francisco, 2015 | 20 x 20 inches | $1,200
Data source:

Worldwide Haunted Locations, 2015 | 24 x 12 inches | $650
Data source: Wikipedia


Worldwide UFO Sightings, 2015 | 24 x 12 inches | $650
Data source: National UFO Reporting Center (NUFORC)


Missouri Abortion Alternatives, 2015 | 12 x 12 inches | $150
Data source: (U.S. Government)


Southern California Starbucks, 2015 | 12 x 8 inches | $80
Data source:


U.S. Prisons, 2015 | 18 x 10 inches | $475
Data source: Prison Policy Initiative (via Josh Begley’s GitHub page)

///  Water-jet etched aluminum honeycomb with anodization:


Albuquerque Meth Labs, 2015 | 18 x 12 inches | $475
Data source:


U.S. Mass Shootings (1982-2012), 2015 | 18 x 10 inches | $475
Data source: Mother Jones


Blacklisted IPs, 2015 | 20 x 8 ½  inches | $360
Data source: Suricata SSL Blacklist


Internet Data Breaches, 2015 | 20 x 8 ½ inches | $360
Data source:

Bad Data, Internet Breaches, Blacklisted IPs

In 1989, I read Neuromancer for the first time. The thing that fascinated me the most was not the concept of “cyberspace” that Gibson introduced. Rather, it was the physical description of virtual data. The oft-quoted line is:

“The matrix has its roots in primitive arcade games. … Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts. … A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding.”

What was this graphic representation of data, which struck me at first and has stuck with me ever since? I could only imagine what it could be. This concept of physicalizing virtual data later led to my Data Crystals project. Thank you, Mr. Gibson.


In Neuromancer, the protagonist Case is a freelance “hacker”. The book was published well before Anonymous, back in the days when KILOBAUD was the equivalent of Spectre for the BBS world.

At the time, I thought that there would be no way that corporations would put their data in a central place that anyone with a computer and a dial-up connection (and, later T1, DSL, etc) could access. This would be incredibly stupid.

And then, the Internet happened, albeit more slowly than people remember. Now hacking and data breaches are commonplace.

My “Bad Data” series — waterjet etchings of ‘bad’ datasets onto aluminum honeycomb panels — capture two aspects of internet hacking: Internet data breaches and Blacklisted IPs.

In these examples, ‘bad’ has a two-layered meaning. Violating the accepted norms of Internet behavior is widely considered a legal, though not always a moral, crime. The data is also ‘bad’ in the sense that it is incomplete. Data breaches are usually not advertised by the entities that get breached; that would be poor publicity.

For the Bad Data series, I worked not necessarily with the data I wanted, but rather with the data that I could get. From Information Is Beautiful, I found this dataset of Internet data breaches.


What did I discover? …that Washington DC is the leader in breached information. I suspect this is mostly because the U.S. government is the biggest target, rather than because of lax government security. The runner-up is New York City, the center of American finance. Other notable cities are San Francisco, Tehran and Seoul. San Francisco makes sense: the city is home to many internet companies. And Tehran is the target of Western internet attacks, government or otherwise. But Seoul? They claim to be targeted by North Korea. However, as we found out with the Sony Pictures Entertainment hack, North Korea is an easy scapegoat.



Conversely, there are many lists of banned IPs. The one I worked with is the Suricata SSL Blacklist. This may not be the best source, as there are thousands of IP Blacklists, but it is one that is publicly available and reasonably complete. As I’ve learned, you have to work with the data you can get, not necessarily the data you want.

I ran these two etched panels both through an anodization process, which further created a filmy residue on the surface. I’m especially pleased with how the Banned IPs panel came out.

Bad Data: BLACKLISTED IPs (below)


Genetic Portraits and Microscope Experiments

I recently finished a new artwork — called Genetic Portraits — which is a series of microscope photographs of laser-etched glass that data-visualize a person’s genetic traits.

I specifically developed this work as an experimental piece for the Bearing Witness: Surveillance in the Drone Age show. I wanted to look at an extreme example of how we have freely surrendered our own personal data for corporate use. In this case, 23andMe provides an extensive (paid) genetic sequencing package. Many people, including myself, have sent in saliva samples to the company, which it then processes. From their website, you can get a variety of information, including the projected likelihood that you might be prone to specific diseases based on your genetic traits.

Following my line of inquiry with other projects such as Data Crystals and Water Works, where I wrote algorithms that transformed datasets into physical objects, this project processes an individual’s genetic sequence to generate vector files, which I later use to laser-etch onto microscope slides. The full project details are here.


Concept + Material
I began my experiment months earlier, before the project was solidified, by examining the effect of laser-etching on glass underneath a microscope. This stemmed from conversations with colleagues about the effects of laser-cutting materials. When I looked at the etched glass underneath a microscope, I saw amazing results: an erratic universe accentuated by curved lines. Even with the same file, each etching is unique; the glass cracks in different ways. Digital fabrication techniques still result in distinct analog effects.

When the curators of the show, Hanna Regev and Matt McKinley, invited me to submit work on the topic of surveillance, I considered how to leverage various experiments of mine, and came back to this one, which would be a solid combination of material and concept: genetic data etched onto microscope slides and then shown at a macro scale as 20” x 15” digital prints.

Surrendering our Data
I had so many questions about my genetic data. Is the research being shared? Do we have ownership of this data? Does 23andMe even ask for user consent? As many articles point out, the answers are exactly what we fear. Their user agreement states that “authorized personnel of 23andMe” can use the data for research. This official-sounding text simply means that 23andMe decides who gets access to the genetic data I submitted. 23andMe is not unique: other gene-sequencing companies have similar provisions, as the article suggests.

Some proponents suggest that 23andMe is helping the research front while still making money. It’s capitalism at work. This article in Scientific American sums up the privacy concerns. Your data becomes a marketing tool: people like me handed a valuable dataset to a corporation, which can then sell us products based on the very data we provided. I completed the circle, and I even paid for it.

However, what concerns me even more than 23andMe selling or using the data — after all, I did provide my genetic data, fully aware of its potential use — is the statistical accuracy of genetic data. Some studies have reported a Eurocentric bias to the data, and the FDA has also battled with 23andMe regarding the health data they provide. The majority of the data (with the exception of Bloom’s Syndrome) simply wasn’t predictive enough. Too many people had false positives with the DNA testing, which not only causes worry and stress but could lead to customers taking pre-emptive measures, such as getting a mastectomy, if they mistakenly believe they are genetically predisposed to breast cancer.

A deeper look at the 23andMe site shows a variety of charts that make it appear as if you might be susceptible (or immune) to certain traits. For example, I have lower-than-average odds of having “Restless Leg Syndrome”, which is probably the only neurological disorder that makes most people laugh when hearing about it. My genetic odds of having it are simply listed as a percentage.

Our brains aren’t very good with probabilistic models, so we tend to inflate and deflate statistics. Hence, one of many problems of false positives.

And, as I later discovered, from an empirical standpoint, my own genetic data strayed far from my actual personality. Our DNA simply does not correspond closely enough to reality.


Data Acquisition and Mapping
From the 23andMe site, you can download your raw genetic data. The resulting many-megabyte file is full of rsid data and the actual allele sequences.
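For reference, the raw export is plain text: comment lines prefixed with `#`, followed by tab-separated columns of rsid, chromosome, position and genotype. A minimal parser for that layout might look like this (a sketch; the function name is mine, not from the project):

```python
# Minimal parser for a 23andMe-style raw data export.
# Assumes tab-separated columns (rsid, chromosome, position, genotype)
# and '#'-prefixed header comments, as described above.

def parse_raw_genome(path):
    """Return a dict mapping rsid -> genotype, e.g. 'rs4988235' -> 'AA'."""
    genotypes = {}
    with open(path) as f:
        for line in f:
            if line.startswith('#') or not line.strip():
                continue  # skip header comments and blank lines
            rsid, _chromosome, _position, genotype = line.rstrip('\n').split('\t')
            genotypes[rsid] = genotype
    return genotypes
```

From the resulting dictionary, looking up the handful of rsids of interest is a simple key access.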


Isolating useful information from this was tricky. I cross-referenced some of the rsids used for common traits from 23andMe with the SNP database. At first I wanted to map ALL of the genetic data. But, the dataset was complex — too much so for this short experiment and straightforward artwork.

Instead, I worked with some specific indicators that correlate to physiological traits such as lactose tolerance, sprinter-based athleticism, norovirus resistance, pain sensitivity, the “math” gene and cilantro aversion — 15 in total. I avoided genes that might correlate to general medical conditions like Alzheimer’s and metabolism.

For each trait I cross-referenced the SNP database with 23andMe data to make sure the allele values aligned properly. This was arduous at best.

There was also a limit on physical space for etching the slide: more than 24 marks or etchings on one plate would be chaotic. Through days of experimentation, I found that 12-18 curved lines made for compelling microscope photography.

To map the data onto the slide, I modified Golan Levin’s decades-old Yellowtail Processing sketch, which I had been using to generate curved lines on my test slides. I found that he had developed an elegant data-storage mechanism for capturing gestures. From the isolated rsids, I then wrote code that assigned weighted numbers to allele values (i.e. AA = 1, AG = 2, GG = 3, depending on the rsid).
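In sketch form, that weighting step is just a per-rsid lookup table. The two rsids below are illustrative examples, not the project's actual tables (rs4988235 is commonly associated with lactose tolerance, rs1815739 with the ACTN3 “sprinter” variant), and the weight orderings are arbitrary:

```python
# Per-rsid weight tables: each rsid has its own allele -> weight mapping,
# since the relevant allele pairs differ from gene to gene.
# The rsids and orderings here are illustrative examples only.
ALLELE_WEIGHTS = {
    'rs4988235': {'AA': 1, 'AG': 2, 'GG': 3},  # lactose tolerance (example)
    'rs1815739': {'CC': 1, 'CT': 2, 'TT': 3},  # ACTN3 'sprinter' variant (example)
}

def allele_weight(rsid, genotype):
    """Weighted number for a genotype; 0 if the rsid or genotype is unknown."""
    return ALLELE_WEIGHTS.get(rsid, {}).get(genotype, 0)
```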


Based on the rsid numbers themselves, my code generated (x, y) anchor points and curves with the allele values changing the shape of each curve. I spent some time tweaking the algorithm and moving the anchor points. Eventually, my algorithm produced this kind of result, based on the rsids.
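The idea can be sketched as follows: the rsid's numeric part seeds a pseudo-random generator, so each rsid deterministically yields the same anchor points on every run, while the allele weight scales the vertical spread of its curve. This is a simplified reconstruction of the approach, not the project's actual algorithm:

```python
import random

def curve_anchors(rsid, weight, n_points=4, width=800, height=600):
    """Deterministic (x, y) anchor points for one curve.

    The rsid's digits seed the generator, so a given rsid always
    produces the same curve; the allele weight (1-3) scales the
    vertical amplitude, changing the curve's shape.
    """
    seed = int(''.join(ch for ch in rsid if ch.isdigit()))
    rng = random.Random(seed)
    amplitude = (height / 2) * (weight / 3.0)
    mid_y = height / 2
    return [(rng.uniform(0, width), mid_y + rng.uniform(-amplitude, amplitude))
            for _ in range(n_points)]
```

The anchor points would then be fed to a curve-drawing routine (in the project's case, the modified Yellowtail sketch) to produce the final vector file.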


The question I always get asked about my data-translation projects is about legibility. How can you infer results from the artwork? It’s a silly question, like asking a Kindle engineer to analyze a Shakespeare play. A designer of data-visualization will try to tell a story using data and visual imagery.

My research and work focus on deep experimentation with the formal properties of sculpture — or physical forms — based on data. I want to push the boundaries of what art can look like, continuing the lineage of algorithmically-generated work by artists such as Sol LeWitt, Sonya Rapoport and Casey Reas.

Is it legible? Slightly so. Does it produce interesting results? I hope so.


But, with this project, I’ve learned so much about genetic data — and even more about the inaccuracies involved. It’s still amazing to talk about the science that I’ve learned in the process of art-making.

Each of my 5 samples looks a little bit different. This is the mapping of actual genetic traits of my own sample and that of one other volunteer named “Nancy”.


Genetic Traits for Scott (ABOVE)

We both share a number of genetic traits such as the “empathy” gene and curly hair. The latter seems correct — both of us have remarkably straight hair. I’m not sure about the empathy part. Neither one of us is lactose intolerant (also true in reality).

But the test-accuracy breaks down on several specific points. Nancy and I do have several differences including athletic predisposition. I have the “sprinter” gene, which means that I should be great at fast-running. I also do not have the math gene. Neither one of these is at all true.

I’m much more suited to endurance sports such as long-distance cycling and my math skills are easily in the 99th percentile. From my own anecdotal standpoint, except for well-trodden genetics like eye color, cilantro aversion and curly hair, the 23andMe results often fail.

The genetic data simply doesn’t seem to support the physical results. DNA is complex. We know this; it is non-predictive. Our genotype results in different phenotypes, and the environmental factors are too complex for us to understand with current technology.

Back to the point about legibility. My artwork is deliberately non-legible based on the fact that the genetic data isn’t predictive. Other mapping projects such as Water Works are much more readable.

I’m not sure where this experiment will go. I’ve been happy with the results of the portraits, but I’d like to pursue this further, perhaps with scientists who would be interested in collaborating around the genetic data.



