The P5 Glove is a consumer wired glove (tactile but not haptic). I bought one boxed as-new on eBay a while ago for not very much, and I’m glad I did as they now seem to be increasingly hard (and expensive) to get hold of.
It contains five analog bend sensors and three buttons, plus (in theory) x, y and z coordinates and yaw, pitch and roll: it emits IR which is picked up by a big USB IR tower, so it knows where your hand is in space.
Here’s the P5 Glove intro movie…
I say in theory because while the p5osc Mac drivers handle the bend sensors very well, the x/y/z output is jittery and yaw/pitch/roll is sadly non-existent.
I’ve been experimenting with bridging the outputs for the buttons, fingers and thumb into MIDI custom controls so that I can mess around with them in ControllerMate. Here’s a demo of a simple setup which detects whether each digit is straight or bent, and uses that to determine whether your hand is describing a rock, paper or scissors shape. For now, it just displays ‘Rock’, ‘Paper’ or ‘Scissors’ in large type on the screen, but it would be pretty straightforward to turn this into a simple game.
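The detection logic itself is simple enough to sketch in a few lines of Python. This is just an illustration of the idea, not the actual ControllerMate patch: the bend range and the `BENT_THRESHOLD` cutoff are my assumptions, and `classify` is a made-up name.

```python
# Sketch of the straight-or-bent digit logic behind the rock/paper/
# scissors demo. Assumption: each of the five digits reports a bend
# reading, and anything over BENT_THRESHOLD counts as "bent".

BENT_THRESHOLD = 30  # made-up cutoff between straight and bent

def classify(thumb, index, middle, ring, pinky):
    """Return 'Rock', 'Paper' or 'Scissors' for five bend readings."""
    bent = [v > BENT_THRESHOLD for v in (thumb, index, middle, ring, pinky)]
    if all(bent):
        return "Rock"       # fist: every digit bent
    if not any(bent):
        return "Paper"      # flat hand: every digit straight
    if not bent[1] and not bent[2] and bent[0] and bent[3] and bent[4]:
        return "Scissors"   # index and middle straight, the rest bent
    return None             # anything else is ambiguous

print(classify(50, 50, 50, 50, 50))  # Rock
print(classify(50, 0, 0, 50, 50))    # Scissors
```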
Here’s the ControllerMate patch I made to do it (click through for the annotated version on Flickr).
Lots more fun to be had here with virtual pianos and guitar strings too; arpeggiating the MIDI guitar, for example.
Tangible interfaces strike again. Not content with playing music with a yo-yo, I’ve knocked up a first pass at an augmented reality game of pong on a whiteboard.
Here, I play pong with two whiteboard erasers. On a whiteboard. A camera watches the scene and a quick hacky Processing app (via reacTIVision and some OSC messages) bounces a ball around. For the full effect, a projector will render the virtual ball right there on your desk. Or you could play it on the surface of a flat-panel monitor I guess.
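The ball logic in the Processing app amounts to a per-frame step function. Here’s a rough Python equivalent of the idea, with the playing field normalised to 0–1 and the eraser positions assumed to arrive as fiducial coordinates from reacTIVision (the `PADDLE_HALF` size and function name are mine, not from the actual app):

```python
# Minimal sketch of a pong ball update. Assumptions: the field is
# normalised to 0..1 on both axes, and each eraser's y position is a
# fiducial coordinate delivered over OSC.

PADDLE_HALF = 0.1  # assumed half-height of an eraser, in field units

def step(ball, vel, left_y, right_y):
    """Advance the ball one frame; bounce off walls and erasers."""
    x, y = ball[0] + vel[0], ball[1] + vel[1]
    vx, vy = vel
    if y <= 0.0 or y >= 1.0:                         # top/bottom walls
        vy = -vy
    if x <= 0.0 and abs(y - left_y) <= PADDLE_HALF:  # left eraser hit
        vx = -vx
    if x >= 1.0 and abs(y - right_y) <= PADDLE_HALF: # right eraser hit
        vx = -vx
    return (x, y), (vx, vy)

# Ball heading right into a well-placed eraser: vx flips sign.
print(step((0.95, 0.5), (0.1, 0.0), 0.5, 0.5))
```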
The nice thing about working with whiteboards: it’s fun to change the version number when v0.2 is done.
Also worth mentioning: I mentioned on Flickr that “now I just need a projector” and within minutes my brainy friend Dave came round to my office with a spare one for me to borrow. The joy of declarative living!
I’ve been thinking about augmented reality and tangible stuff in relation to music recently. A while ago I hacked together an RFID reader and rotary encoder (using cheap-ish off-the-shelf USB kit from Phidgets) into a virtual knob. Ian captured me giving a quick (“it’s me knob demo!“) demonstration in the office a couple of weeks ago. The idea there was that one rotary encoder could act as more than one controller if it knew it had moved between different positions — in this case, using RFID tags.
More recently, I have been playing with reacTIVision (the software behind the reactable, and incidentally what the SLorpedo team used in their Hack Day entry). It’s incredibly fun and ridiculously easy. To keep my hands from getting in the way of the tags, I threw together a rig from a picture frame, a cardboard box, a 25-watt table lamp and a Logitech quickcam (actually, the LEGO Vision Command camera, which is a nasty manual-focus quickcam with nifty LEGO extrusions).
For tangible objects, I grabbed three things relatively close at hand. Here’s a video of me having fun in C major with a red wooden yo-yo, a tin of Altoids and a hidden surprise.
Having installed reacTIVision, I run it like this:
reacTIVision.exe -m midi\demo.xml --midi-device 10
An argument of -l midi will list all the available MIDI devices. Something like MIDI Yoke is handy here (which is device 10 for me). The MIDI output is optional, and the default OSC output is more flexible, but for today I wanted to play directly with MIDI and this made it really easy.
To use the three different inputs shown in the video, I first described the controls I wanted in the XML configuration file. I just edited demo.xml to include
<map fiducial="0" type="hfader" channel="1" control="1"/>
<map fiducial="0" type="note" note="72"/>
<map fiducial="1" type="knob" channel="1" control="2"/>
<map fiducial="1" type="note" note="76"/>
<map fiducial="2" type="vfader" channel="1" control="3"/>
<map fiducial="2" type="note" note="79"/>
Which, as you’d expect, means…
- tag 0 (the tea bag) plays a C when visible, as well as treating its vertical position as MIDI controller 1 on channel 1,
- tag 1 (the yo-yo) plays an E, as well as treating its angle as controller 2, and
- tag 2 (the Altoids) plays a G, as well as treating its horizontal position as controller 3.
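Under the hood, that mapping is just a scaling job: reacTIVision reports each fiducial’s position and angle, and the hfader/vfader/knob types squash those onto 7-bit controller values. Here’s a rough Python sketch of the scaling — the 0–1 input ranges and the function names are my assumptions, not reacTIVision internals:

```python
# Rough sketch of how hfader/vfader/knob-style mappings scale fiducial
# data onto MIDI controller values. Assumptions: positions arrive
# normalised to 0..1 and angles in radians; names here are made up.
import math

def to_cc(value, lo=0.0, hi=1.0):
    """Clamp value into [lo, hi] and scale to a 0-127 MIDI CC value."""
    value = min(max(value, lo), hi)
    return round((value - lo) / (hi - lo) * 127)

def angle_to_cc(radians):
    """Map a fiducial's rotation (0..2*pi) onto 0-127, knob-style."""
    return to_cc((radians % (2 * math.pi)) / (2 * math.pi))

print(to_cc(0.0), to_cc(0.5), to_cc(1.0))  # 0 64 127
```

So tag 1’s rotation would go through something like `angle_to_cc` before landing on controller 2, and tags 0 and 2 would go through `to_cc` on their y and x positions respectively.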
Next, I needed something to handle the MIDI notes and commands. I love Reaper for this kind of thing. (Annotated screenshot which explains a little more of what is going on here). For an instrument, I used the lovely Tapeworm from Tweakbench.
Finally, I trained Reaper (though you might prefer Ableton or whatever) with the controls I planned to manually twiddle. I set controller 2 to affect the fine-tune knob in Tapeworm and controller 1 to change the volume for the track.
The possibilities here seem endless. Throw in a Monome 40h, a couple of Wiimotes, a P5 Glove and the RFID reader / rotary encoder knob I was playing with before, and I have more physical controllers than I could ever need. All of which talk MIDI and/or OSC. Expect more demos (hopefully with some actual music rather than just proofs of concept) as I continue to experiment. I already like reacTIVision a lot, and it makes me want to buy a better camera.
Thanks to Ian for recording this: