Notes on Readings & Exercises

My biggest takeaway from this week is how hard it is to conceive of “interesting” forms of interaction when limited to nothing but a servo and its tiny range of motion. I found myself huffing and puffing, making mental leap after mental leap, to arrive at something practicable. As with the Implicit Interactions framework, a bit of anthropomorphizing creeps in at the edges of this design work - I started, unprompted, to think of the servo as an arm, or a finger, or perhaps a crank, and in doing so I could feel my horizon of possibilities shrinking.

In part I think this has to do with the fact that there is nothing for the servo to “do”. There’s no real problem, unless it’s invented - and where’s the fun in that? There are no stakes in the game, nothing to push the mind past the obvious into the creative. I feel the same way about screens, and maybe software in general. It’s too wide a field, and it became even wider when we collapsed display and interaction into one pane of glass - there are no real stakes in the actual “design” of the thing, because it’s trivial to make a system that solves all the specified requirements perfectly on paper. The system might be a disaster in human hands, but who cares - the boxes were ticked and we have quantifiable data to back up our claims.

And in a way this is also my issue with the whole Implicit Interactions framing. Speaking of design as somehow separate from and above the very real constraints and assumptions imposed on us by hardware means we’re really just playing a disjointed game. The logistical and economic weight of things means that by the time we actually get to think about interaction, we’re mostly patching the limitations we’ve been handed. Failing to acknowledge this as we work does a disservice to how we think about interaction, because it validates the status quo of interactive art and/or design.

For a brief moment a couple of years ago, Graham Harman’s OOO seemed to provide a way out of this trap - a real framework for questioning not only the nature of objects but the relations between them. Since then it has devolved into a lot of academic noise, but there was a point in there that shouldn’t be lost. Our role in thinking about interaction can’t be limited to the last step of the sequence; it should really be about the first - and often the most mature answer will mean shunning electronics altogether in favor of objects that actually make sense in the first place.

Of course, this is all well and good, but I still only managed to come up with a shitty finger puppet.

Experiments in Tone

I spent the first day of this week looking at how one might play something resembling music over the 1Amp speaker from our kit. It soon became abundantly clear that this was a significantly harder endeavor than I had imagined. Building the circuit was easy, but working with real-time audio format conversions was a bit of a nightmare.
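
The circuit itself can be sanity-checked with nothing but a bare tone() sweep - a minimal sketch, assuming the amp input is wired to pin 8 (use whatever pin you actually have):

    // Minimal check: sweep a tone to confirm the speaker circuit works.
    const int SPEAKER_PIN = 8;   // assumed pin for the amp input

    void setup() {
      pinMode(SPEAKER_PIN, OUTPUT);
    }

    void loop() {
      // Step through a rough octave, ~200 ms per note.
      for (int freq = 220; freq <= 440; freq += 20) {
        tone(SPEAKER_PIN, freq, 180);
        delay(200);
      }
    }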

The closest I got to making something happen was storing mp3 data in a Processing buffer and streaming it over serial to the Arduino, where each byte was passed straight into the tone() function. This is of course the most naive way of doing it - the output was horrendous but interesting; in a way, the processor was playing a “live” reading of what an mp3 file actually contains.
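
The receiving side looked roughly like this - reconstructed from memory, so the pin and baud rate are assumptions, and the “decoding” is exactly as naive as it sounds: every incoming byte is abused as a frequency.

    // Naive serial-to-tone playback: Processing streams raw bytes,
    // and each byte is treated as a frequency in Hz.
    const int SPEAKER_PIN = 8;    // assumed pin for the amp input

    void setup() {
      Serial.begin(115200);       // must match the Processing sketch
      pinMode(SPEAKER_PIN, OUTPUT);
    }

    void loop() {
      if (Serial.available() > 0) {
        int b = Serial.read();    // one byte of the mp3 stream, 0-255
        if (b > 30) {
          tone(SPEAKER_PIN, b);   // tone() wants >= 31 Hz on AVR boards
        } else {
          noTone(SPEAKER_PIN);
        }
      }
    }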

I was bad about documenting this.

Servos and Puppets

The servo exercises were more interesting to me - I wanted to see how easy or difficult it is to make servos behave somewhat more organically, and what it takes to synchronize them. I carved a couple of test pieces out of wood, a cheap imitation of a jointed finger, then built a little scaffold for the control armature and connected the servos.

Getting this to work was at once surprisingly difficult and surprisingly easy. Because both servos were attached to the same structure, whenever one moved independently of the other the puppet threatened to tear itself apart.

My solution was to map each servo to its own rotary encoder. With this setup I could pose the finger into its “keyframes” and note the corresponding servo values. I could then interpolate between these values, and because each pair was recorded together, a full range of motion could be attained without the servos fighting each other - roughly as sketched below.
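
The gist of it in code - the keyframe angles and pins here are made up for illustration, but the structure is the same: a single parameter t drives both channels through their recorded keyframes in lockstep.

    #include <Servo.h>

    Servo servoA, servoB;

    // Hypothetical angles noted down during the encoder session;
    // each column is one "keyframe" pose of the finger.
    const int NUM_KEYS = 4;
    const float keysA[NUM_KEYS] = { 20, 55, 90, 140 };
    const float keysB[NUM_KEYS] = { 160, 120, 85, 40 };

    // Map t in [0,1] onto the keyframe sequence, linearly
    // interpolating between neighboring keyframes.
    float sampleKeys(const float *keys, float t) {
      float pos = t * (NUM_KEYS - 1);
      int i = constrain((int)pos, 0, NUM_KEYS - 2);
      float frac = pos - i;
      return keys[i] + (keys[i + 1] - keys[i]) * frac;
    }

    // Both servos always move together through their own keyframes,
    // so the linkage never gets pulled out of shape.
    void moveFinger(float t) {
      servoA.write(sampleKeys(keysA, t));
      servoB.write(sampleKeys(keysB, t));
    }

    void setup() {
      servoA.attach(9);    // pins are assumptions
      servoB.attach(10);
    }

    void loop() {
      // Test sweep: run the whole gesture back and forth.
      for (float t = 0.0; t <= 1.0; t += 0.01) { moveFinger(t); delay(15); }
      for (float t = 1.0; t >= 0.0; t -= 0.01) { moveFinger(t); delay(15); }
    }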

The next step was to map this continuous scale to a potentiometer, which let me drive the full range of motion with a single input.
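
On top of the sketch above this is just a new loop() - the pot on A0 is an assumption:

    // Replace the test sweep: one knob now drives the whole gesture.
    void loop() {
      int raw = analogRead(A0);   // potentiometer wiper on A0 (assumed)
      float t = raw / 1023.0;     // normalize 0-1023 to [0,1]
      moveFinger(t);              // both servos track their keyframes
      delay(15);                  // small settle time between updates
    }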

The idea here was to eventually replace the potentiometer with a flex sensor on my own finger, but Adafruit was a bit slow with the delivery.

Finally, I incorporated an interpolation library and used its quartic easing mode to have the finger move autonomously in a more natural fashion - tellingly, a friend pointed out, unprompted, that the motion did look more natural and fluid once the easing was in place.
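
I don’t have the library’s exact API at hand, so here is the easing written out by hand - a standard quartic ease-in-out curve feeding the same moveFinger() from the earlier sketch:

    // Quartic ease-in-out: slow at the ends, fast in the middle.
    float quarticInOut(float t) {
      if (t < 0.5) return 8 * t * t * t * t;
      float u = t - 1.0;
      return 1.0 - 8 * u * u * u * u;
    }

    void loop() {
      // One full open/close cycle roughly every 3 seconds.
      float phase = (millis() % 3000UL) / 3000.0;             // 0..1 sawtooth
      float tri = phase < 0.5 ? phase * 2 : 2 - phase * 2;    // 0..1..0 triangle
      moveFinger(quarticInOut(tri));
      delay(15);
    }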

Adjusting the finger parameters: