Since I'm a big fan of physical art forms, mainly drawings and paintings, I thought it a good idea to convert data into a painted picture somehow. What better way than with the help of a robotic arm?
I spent the few evenings leading up to the event constructing the mechanised appendage. It's Friday, my hack bag is packed, and the Tate greets me with this view:
Let the hacking commence!
For my hack, the data was simply going to be bitmapped image data. The public could tweet a photo to my robot; the photo gets converted to black and white, and the robot then draws it for them on canvas.
Needless to say, I very quickly ran into problems hacking the arm. It had no fine-grained control: you had the option of rotating in various directions for a certain amount of time, and if you turned left for a second, then right for a second, it never ended up where it started. Stephen (a frequent co-hackathoner) laughed at me and said I should have bought a more grown-up robotic arm. Oh, well.
My mate Mat then suggested the temperamental arm could perhaps create Jackson Pollock-esque paintings, basing the position and intensity of each splatter on Twitter sentiment data. (Ah, Twitter - the hackathon favourite.)
By this point, I had wasted so much time using my Raspberry Pi's GPIO pins to detect arm over-run that there wasn't an awful lot of time left.
So, I resorted to what I do best and made a web app...
The data portion was now going to come from data input on the app, which was along the lines of
Without thinking much, what three words come to mind when looking at this piece of art?
and then displaying the most-used words in a data visualisation.
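The tallying behind that visualisation could be sketched with Python's standard library (the function name and sample responses here are illustrative, not from the actual app):

```python
from collections import Counter

# Hypothetical sketch of the word-tallying step: pool the visitors'
# three-word responses and surface the most-used words.

def top_words(responses, n=5):
    """responses: list of free-text answers; returns [(word, count), ...]."""
    counter = Counter()
    for text in responses:
        # Normalise case and strip trailing punctuation before counting
        counter.update(word.strip(".,!?").lower() for word in text.split())
    return counter.most_common(n)

responses = ["bold red chaos", "Chaos, motion, red", "red light depth"]
print(top_words(responses, 3))
```

`Counter.most_common` does the heavy lifting; a word-cloud or bar chart on the front end then just consumes the (word, count) pairs.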
I was immensely inspired by the artist hackers in that room that night. If I could do it again, I would just give the arm a Sharpie, put cardboard up all around it, and unleash something akin to a SETI data feed at it. It would be messy, but at least it would get done. And I would still get to use the robot arm!