Introducing Machine Data Dreams
Earlier this year, I received an Individual Artist Commission grant from the San Francisco Arts Commission for a new project called Machine Data Dreams.
I was notified months ago, but the project sat on the back burner until now, when I'm beginning some initial research and experiments at a residency called Signal Culture. I expect full immersion in the fall.
The project description
Machine Data Dreams will be a large-scale sculptural installation that maps the emerging sentience of machines (laptops, phones, appliances) into physical form. Using the language of machines — software program code — as linguistic data points, Scott Kildall will write custom algorithms that translate how computers perceive the world into physical representations that humans can experience.
The project’s narrative proposition is that machines are currently prosthetic extensions of ourselves, and in the future they will transcend into something sentient. Computer chips run not only our laptops and phones, but increasingly our automobiles, our houses, our appliances and more. They are ubiquitous and yet often silent. The key to understanding their perspective is to envision how machines view the world, in an act of synthetic synesthesia.
Scott will write software code that performs linguistic analysis on machine syntax from embedded systems — human-programmable machines that range from complex, general-purpose devices (laptops and phones) to specific-use machines (refrigerators, elevators, etc.). Scott’s code will generate virtual 3D geometric monumental sculptures. More complex structures will reflect the higher-level machines, and simpler structures will be generated from lower-level devices. He is intrigued by the experimental nature of what form the work will take — this is something he will not be able to plan.
Machine Data Dreams will utilize 3D printing and laser cutting, digital fabrication techniques that are changing how sculpture can be created — entirely from software algorithms. Simple, hidden electronics will control LED lights to imbue the artwork with a sense of consciousness. Plastic joints will be connected via aluminum dowels to form an armature of irregular polygons. The exterior panels will be clad in semi-translucent acrylic, adhered magnetically to the large-sized structures. The various installations can easily be disassembled and reassembled.
The project will build on my experiments with the Polycon Construction Kit by Michael Ang, where I’m doing some source-code collaboration. This collaboration will heat up in the fall.
At Signal Culture, I have one week of residency time. It’s short and sweet. I get to play with devices such as the Wobbulator, originally built by Nam June Paik and video engineer Shuya Abe.
The folks at Signal Culture built their own from the original designs.
What am I doing here, with analog synths and other devices? Well, I’m working with a home-built Arduino data logger that captures raw analog video signals (I will later modify it for audio).
I’ve optimized the code to capture about 3600 signals/second. The idea is to get a raw data feed of what a machine might be “saying”, or the electronic signature of a machine.
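To give a sense of what the capture loop is doing, here is a minimal host-side C++ model of a logger like this. The function names and the synthetic signal source are my own stand-ins (on the real Arduino, the read would be an `analogRead()` on the tapped video line), so treat this as a sketch of the idea, not the actual logger code:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical stand-in for the Arduino ADC read (0..1023 on a 10-bit ADC).
// A synthetic sawtooth keeps this sketch self-contained; the real logger
// would sample the analog video signal here instead.
uint16_t readVideoSignal() {
    static uint16_t t = 0;
    return (t++ * 7) % 1024;
}

// Capture a fixed window of raw samples into a buffer, the way the
// data logger streams its ~3600 signals/second to a log.
std::vector<uint16_t> captureWindow(std::size_t sampleCount) {
    std::vector<uint16_t> buffer;
    buffer.reserve(sampleCount);
    for (std::size_t i = 0; i < sampleCount; ++i) {
        buffer.push_back(readVideoSignal());
    }
    return buffer;
}
```

At 3600 samples/second, a 30-second capture like the Amiga test below would yield a buffer of roughly 108,000 values, which is plenty of raw material to feed into the model-generation stage.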
Does it work? Well, I hooked it up to a Commodore Amiga (yes, they have one).
I captured about 30 seconds of video and ran it through a crude version of my custom 3D data-generation software, which produces models. Here is what I got. Whoa…
It is definitely capturing something.
It’s early research, and the forms are flat 3D cube-plots, but it’s very promising.
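For readers curious what "flat 3D cube-plots" might mean in code, here is a speculative C++ sketch of one way to map a sample buffer to cube geometry: samples laid out on a grid, each cube scaled by its signal value, with z held at zero so the plot stays flat. The struct and function names are my own illustration, not the actual software:

```cpp
#include <cstdint>
#include <vector>

struct Cube { float x, y, z, size; };

// Lay samples out on a square grid and scale each cube by its
// normalized 10-bit ADC value. z stays 0, so the plot is flat.
std::vector<Cube> cubePlot(const std::vector<uint16_t>& samples, int gridWidth) {
    std::vector<Cube> cubes;
    cubes.reserve(samples.size());
    for (std::size_t i = 0; i < samples.size(); ++i) {
        float x = static_cast<float>(i % gridWidth);
        float y = static_cast<float>(i / gridWidth);
        float size = 0.1f + 0.9f * (samples[i] / 1023.0f);
        cubes.push_back({x, y, 0.0f, size});
    }
    return cubes;
}
```

A mapping like this is trivially exportable to a mesh format such as OBJ, which is one route from captured signal data toward the 3D-printed and laser-cut forms the larger project describes.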