Polycon in Berlin

This week I traveled to Berlin for Polycon. No… it’s not a convention on polyamory, but a project developed by my longtime friend, Michael Ang (aka Mang). The Polygon Construction Kit (aka Polycon) is a software toolkit for converting 3D polygon models into physical objects.

IMG_0246

I wanted an excuse to visit Berlin, to hang out with Mang, and to open up some possibilities for physical data visualization behind EquityBot, the project I’m working on for my artist residency at Impakt Works and their upcoming festival.

I brought my recently purchased Printrbot Simple Metal, which I had disassembled into this travel box.

IMG_0281

After less than 30 minutes, I had it reassembled and working. Victory! Here it is, printing one of the polygon connectors.

IMG_0248

How does Polycon work? Mang shared the details with me. You start with a simple 3D model from whatever program you like. He uses SketchUp to create physical models of his large-scale sculptures. I prefer OpenFrameworks, which is powerful and will let me easily manipulate shapes from data streams.

Here’s a simple screenshot from OpenFrameworks of two polyhedrons. I just wrote this the other day, so there’s no UI for it yet.

Screen Shot 2014-09-25 at 6.10.14 PM

And here is how it looks in MeshLab. It’s water-tight, meaning that it can be 3D-printed.

Screen Shot 2014-09-25 at 6.10.59 PM

My goal is to do larger-scale data visualizations than some of my previous works such as Data Crystals and Water Works. I imagine room-sized installations. I’ve had this idea for many months of using the 3D printer to create joinery from datasets and to skin the faces using various techniques, TBD.

How it works: Polycon loads a 3D model and, using Python scripts in FreeCAD, generates 3D joints that, together with wooden dowels, can be assembled into polygonal structures.

Screen Shot 2014-09-25 at 6.09.00 PM
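To make that concrete, here’s a minimal sketch of the kind of FreeCAD Python script involved. This is my own illustration, not Polycon’s actual code: the joint dimensions, function name, and vertex coordinates are all made up, and a real script would pull the vertices and edges from the loaded model.

```python
# Sketch (hypothetical, run inside FreeCAD): build one connector joint at a
# mesh vertex, with a cylindrical socket aimed at each neighboring vertex so
# a wooden dowel can seat in it. All dimensions are invented for illustration.
import Part
from FreeCAD import Vector

DOWEL_DIAMETER = 6.0   # mm, assumed dowel size
SOCKET_DEPTH = 15.0    # mm, how far each dowel seats into the joint
WALL = 3.0             # mm, plastic wall around each socket

def make_joint(vertex, neighbors):
    """Return a solid joint at `vertex` with one socket per connected edge."""
    hub = Part.makeSphere(DOWEL_DIAMETER / 2 + WALL + 6.0, Vector(*vertex))
    solid = hub
    holes = []
    for n in neighbors:
        direction = Vector(*n) - Vector(*vertex)
        direction.normalize()  # unit vector pointing along the edge
        # Outer sleeve that holds the dowel...
        sleeve = Part.makeCylinder(DOWEL_DIAMETER / 2 + WALL, SOCKET_DEPTH,
                                   Vector(*vertex), direction)
        # ...and the socket hole that gets cleaned out with a drill after printing.
        hole = Part.makeCylinder(DOWEL_DIAMETER / 2, SOCKET_DEPTH,
                                 Vector(*vertex), direction)
        solid = solid.fuse(sleeve)
        holes.append(hole)
    for hole in holes:
        solid = solid.cut(hole)
    return solid

# Example: a vertex with three edges radiating from it.
joint = make_joint((0, 0, 0), [(100, 0, 0), (0, 100, 0), (0, 0, 100)])
joint.exportStl("joint_000.stl")
```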

The Printrbot makes adequate joinery, but it’s nowhere near as pretty as the Vero prints on the Objet 500 at Autodesk. It doesn’t matter that much, since my digital joinery will be hidden in the final structures.

IMG_0272

Mang guided me through the construction of my first Polycon structure. There’s a lot of cleanup work involved, such as drilling out the holes in each of the joints.

IMG_0274

It took a while to assemble the basic form. There are vertex-numbering improvements that we’ll both make to the software. Together, Mang and I brainstormed ideas for how to make the assembly go more quickly.

IMG_0259

After about 15 minutes, I got my first polygon assembled.

IMG_0265

It looks a lot like… the 3D model. I plan to be working on these forms over the next several months, so I felt great after a successful first day.

IMG_0268

And here is a really nice image of one of Mang’s pieces: a sculpture of mountains, made from memories of flying high in a glider. I like where he’s going with his artwork: making models based on nature, with ideas of recording these spaces and playing them back in various urban settings. You can check out Michael Ang’s work here on his website.
IMG_0278


A Starting Point: Distributed Capital

I’m doing more research on EquityBot, the project for my Impakt Works residency, which I just started a couple of days ago.

EquityBot is a stock-trading algorithm that explores the connections between collective emotions on social media and financial speculation. It will be presented at the Impakt Festival at the end of October.

It will also include a sculptural component (presented after the festival), which is the more experimental form.

Many of you are familiar with Paul Baran’s work on designing a distributed network, but many others may not be. Working at RAND on research for the U.S. Air Force, he determined that a centralized communications network would be vulnerable to attack and suggested that the United States build a distributed network instead.
baran

Interestingly, there is a widespread myth that the Internet, derived from ARPANET, was designed to withstand a nuclear attack using this model. That isn’t the case; the architects of the Internet’s transmission protocols simply heard of RAND’s work and adapted it for packet switching. Yet the myth persists.

On a side note, perhaps military technology could be useful for the public good. If only we could declassify the technology, like Baran did.

The distributed network reminds me of a 3D polygon mesh. I think this could be a good source for a 3D data visualization: Distributed Capital. I’ll research this more in the future.

But EquityBot isn’t about networks in the formal sense; it is a project about constructing a predictive model of stock changes, based on the idea that Twitter sentiments correlate with fluctuations in stock prices.

Screen Shot 2014-09-17 at 6.08.23 AM
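As a first sanity check, the correlation test I have in mind is simple enough to sketch. This is a hypothetical illustration rather than EquityBot’s actual code: the CSV files and column names are invented, and any serious version would have to handle trading days, time lags, and plenty of confounders.

```python
# Sketch: does a daily Twitter sentiment score correlate with the next day's
# stock return? Filenames and column names are hypothetical placeholders.
import pandas as pd

sentiment = pd.read_csv("twitter_sentiment.csv", parse_dates=["date"], index_col="date")
prices = pd.read_csv("stock_prices.csv", parse_dates=["date"], index_col="date")

# Daily percentage change in closing price, shifted back one day so that each
# day's sentiment lines up with the *following* day's price move.
next_day_return = prices["close"].pct_change().shift(-1).rename("next_day_return")

combined = pd.concat([sentiment["score"], next_day_return], axis=1).dropna()

# Pearson correlation: a value near 0 would mean no linear relationship.
print(combined["score"].corr(combined["next_day_return"]))
```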

Do I know there is a correlation? Not yet, but I think there is a good possibility. One of my reading sources, The Computational Beauty of Nature, sums up the value of simulated models in its introduction: the predictive model might fail in its results, but it will likely reveal a greater truth about the economic system it is trying to predict. Thus, knowing the uncertainty ahead of time provides a sense of certainty. EquityBot may not “work,” but then again, it may.

compbeautyofnature

My source of dissent is the excellent book The Signal and The Noise: Why So Many Predictions Fail — but Some Don’t by Nate Silver. After reading it last summer, I was convinced that any predictive analysis would simply be noise. I was disheartened and halted the EquityBot project (previously called Grantbot) for many months.

la-ca-nate-silver

However, now I’m not so sure. It seems likely that people’s moods would affect financial decisions, which in turn would affect stock prices. With studies such as this one by Vagelis Hristidis, which found some correlation between Twitter chatter and stock prices, I think there is something to this, which is why I’ve revisited the EquityBot project.

I’ll follow the Buddhist maxim with this project and embrace its uncertainty.


Life of Poo

I’ve been blogging about my Water Works project all summer, and after the Creative Code presentation at Gray Area on September 10th, the project is done. Phew. Except for some of the residual documentation.

In the hours just before I finished my presentation, I also managed to get Life of Poo working. What is it? Well, an interactive map of where your poo goes based on the sewer data that I used for this project.

Huh? Try it.

Screen Shot 2014-09-16 at 6.42.06 AM

This is the final piece of the web-mapping portion of Water Works and uses Leaflet with animated markers, all in Javascript, which is a new coding tool in my arsenal (I know, late to the party). I learned the basics in the Gray Area Creative Code Immersive class, which was provided as part of the fellowship.

The folks at Stamen Design also helped out and their designer-technicians turned me onto Leaflet as I bumbled my way through Javascript.

How does it work?

On the Life of Poo section of the Water Works website, you enter a San Francisco address such as “Twin Peaks, SF” or “47th & Judah, SF” and then press Flush Toilet.

This will begin an animated poo journey down the sewer map to the wastewater treatment plant.

Screen Shot 2014-09-16 at 6.50.17 AM

Not all of the flushes work as you’d expect. There are still glitches and bugs in the code. If you type in “16th & Mission”, the poo just sits there. Hmmm.

Why do I have these bugs? I have some ideas (see below), but I really like the chaotic results, so I’ll keep them for now.

Screen Shot 2014-09-16 at 6.54.32 AM


I think the erratic behavior is happening because of a utility I wrote, which does some complex node-trimming and doesn’t take gravity into account in its flow diagrams. The sewer data has about 30,000 valid data points, and Leaflet can only handle about 1,500 or so without taking forever to load and refresh.

The utility parses the node data tree and recursively prunes it down to a more reasonable number, combining upstream and downstream nodes. Technically speaking, in an overflow situation there are nodes where waste might be directed away from the wastewater treatment plant.

However, my code isn’t smart enough to determine which are overflow pipes and which are pipes to the treatment plants, so the node-flow doesn’t work properly.
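For the curious, here’s a simplified sketch of the kind of pruning that utility does. It is not the actual code: the node names are hypothetical, and, like the real thing, it knows nothing about gravity, elevation, or which branch is an overflow pipe.

```python
# Sketch: collapse long chains of pass-through sewer nodes so the map layer
# only has to draw a fraction of the ~30,000 points.

def prune_chains(downstream, keep):
    """downstream: dict mapping each node to its single downstream node (or None).
    keep: set of nodes to preserve (junctions, outfalls, the treatment plant).
    Returns a new dict where runs of unkept pass-through nodes are collapsed."""
    pruned = {}
    for node in downstream:
        if node not in keep:
            continue
        # Walk downstream, skipping over nodes we aren't keeping.
        nxt = downstream[node]
        while nxt is not None and nxt not in keep:
            nxt = downstream[nxt]
        pruned[node] = nxt
    return pruned

# Tiny example: A -> b -> c -> D -> e -> PLANT, keeping only the junctions.
network = {"A": "b", "b": "c", "c": "D", "D": "e", "e": "PLANT", "PLANT": None}
print(prune_chains(network, {"A", "D", "PLANT"}))
# {'A': 'D', 'D': 'PLANT', 'PLANT': None}
```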

In case you’re still reading, here’s an illustration of a typical combined sewer system that shows how the pipes might look. A sewer outfall doesn’t happen very often in reality, but when your model ignores gravity, it sure will.

CombineWasteWaterOverflow

The 3D print of the sewer, the one that uses the exact same data set as Life of Poo, looks like this.

sewerworks_front sewerworks_top