BlinkyTape bike indicators

I made a bike light indicator system out of a BlinkyTape, a PowerMonkey rechargeable battery and some Loom Bands.

BlinkyTape bike indicators

It starts off with the central front LEDs lit up bright white, and pressing the button on the BlinkyTape switches between steady/left/right indicator modes.

BlinkyTape bike indicators

The PowerMonkey is a simple little 5v rechargeable battery, with a variety of adaptors for charging various phones etc. It makes an ideal portable power source for the BlinkyTape.

BlinkyTape bike indicators

Here’s a video of it in action.

And I’ve put the source code online too.

I’ve also been experimenting with using the BlinkyTape PatternPaint app to do some light painting.

BlinkyTape Light painting with the BlinkyLight

Lots more fun to have here.

Custom KSP controller and display

Here’s my custom controller and display for Kerbal Space Program.

Fitted

Last year, after seeing this custom controller, I was suitably inspired. I wanted to build a simple physical device to control launch/stage, throttle, landing gear, lights, and maybe some custom stages. I knocked up a quick hack just to get a feel for how well it worked, using cardboard, a handful of switches I already had lying around, and a Teensy development board which can act as a USB keyboard.


Using a simple controller with physical switches and buttons as alternatives to keyboard keys was fun, but I was soon annoyed every time my hands had to go across to the keyboard – and especially the mouse – to check things like the radar altimeter, periapsis, time to apoapsis etc.

I soon wanted not just switches but screens and dials I could glance at. In particular, I knew I needed a physical radar altimeter. (Landing safely is hard!) What I needed was a way to get the data out of KSP.

Ideally, I thought, someone would have written a KSP plugin to give me easy access to live data about altitude, fuel levels, periapsis, apoapsis, time to periapsis and apoapsis, height from terrain, velocity, surface speed, vertical speed, sensor data etc. Ideally something simple, lightweight, readable by a hacky little program that could pass the data on through USB serial to the controller.

I was really looking for a CSV or JSON plugin for KSP. It took a bit of digging to find it, and I feared I might have to write it, but I was delighted to find the Telemachus plugin, which adds a nice simple JSON API to KSP and has a fully featured web interface built on that API. I don’t use the web interface, but the JSON API is great. Getting live data out of KSP and into Ruby was a nice moment.

Look at all the lovely data
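
Here’s a minimal sketch (not my actual script) of the kind of Ruby polling loop I mean. The datalink URL, the port and the query names (v.altitude, o.PeA, o.ApA, v.verticalSpeed) are my assumptions about the plugin’s defaults, so check them against your own install.

# Poll the Telemachus datalink API and print a few values once a second.
# Endpoint and query names are assumptions -- adjust for your install.
require 'net/http'
require 'json'
require 'uri'

base  = 'http://localhost:8085/telemachus/datalink'
query = 'alt=v.altitude&pe=o.PeA&ap=o.ApA&vs=v.verticalSpeed'
uri   = URI("#{base}?#{query}")

loop do
  data = JSON.parse(Net::HTTP.get(uri))
  puts format('alt=%.0fm  pe=%.0fm  ap=%.0fm  vs=%.1fm/s',
              data['alt'], data['pe'], data['ap'], data['vs'])
  sleep 1
end

From there it’s a small step to reformat each value and pass it down a USB serial connection to the controller.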

Now that I had an approach I knew would work, I started putting together a wishlist of parts and a simple paper prototype: a rough sketch of which components I wanted where.

KSP controller paper prototype

Having seen various voltmeter clock projects, I knew I wanted to use the analog outputs on an Arduino to drive voltmeters displaying live data about altitude, fuel, velocity etc.

So I started playing with LCD screens and voltmeters to work out how to display different things simultaneously.


Next, I went shopping for a good range of switches.

Switches

A higher-fidelity prototype came next, with holes punched in the cardboard where I thought the switches, screens and meters needed to be. At this stage, I learned a lot about what felt comfortable, and moved a few things around.

KSP controller prototype

Starting to put it all together.

Displays

The displays all go into the base


Feels satisfying already


Testing the displays

Preparing to drill the holes

Preparing to drill

Drilled and Dremelled

Drilled and Dremelled

Everything in place

Fitted

Source code

  • Teensy code for creating key presses from switches
  • Simple Arduino code for controlling LCD screen and voltmeters
  • Beginnings of a Ruby script for passing values from the Telemachus plugin to the Arduino

Components


I’ve subsequently seen this astonishing mission control desk which I now very badly want to make for my son / self.

Things meter

I’ve been using the Things app for a while for tracking projects and next actions with the goal of Getting Things Done. I wanted something to help me pay attention to the things I need to get done, and decided that a physical representation of daily progress would be an interesting thing to try.

The hardware build was really easy. More of a bodging together of components than anything. I dremelled out the back of the voltmeter to create a bit more room, fitted it to a small enclosure box, and squeezed the dev board into the remaining space, with the ground pin and an analog output connected to the voltmeter.


The code is pretty straightforward. The Teensy runs a small program that listens for lines of text via the USB serial port and simply sets the output of the voltmeter to whatever percentage value arrives. At this stage I’ve got a simple multi-purpose percentage meter controlled trivially over USB.

Next is a Ruby script that listens for changes to the Things app, works out how many of the tasks in the ‘Today’ screen have been marked as completed today, and sends that percentage to the USB serial port. It’s like a physical progress bar for things I want to get done today. A done dial for life.
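
The serial side of that script boils down to a few lines. Here’s a minimal sketch, assuming the Teensy expects a bare percentage per line; the device path is a placeholder, and the completed/total counts stand in for whatever the Things ‘Today’ query returns.

# Send a completion percentage to the meter as a line of text over USB serial.
# The device path is hypothetical; the protocol is just "<percentage>\n".
require 'serialport'   # gem install serialport

PORT = '/dev/tty.usbmodem12341'   # placeholder device path
BAUD = 9600

def send_percentage(completed, total)
  percent = total.zero? ? 0 : (100.0 * completed / total).round
  serial = SerialPort.new(PORT, BAUD)
  serial.puts(percent)
  serial.close
end

send_percentage(3, 8)   # moves the needle to about 38%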

I’m going to try it for a while and see how it works. There are probably lots of other things that a progress meter would help with too.


Components:

Jargone

I made a thing.

Jargone is a bookmarklet for highlighting jargon words on any web page.

The list comes from the Government Digital Service GOV.UK style guide, specifically the plain English and words to avoid sections of the guide, which has this to say about avoiding jargon:

We lose trust from our users if we write government ‘buzzwords’ and jargon. Often, these words are too general and vague and can lead to misinterpretation or empty, meaningless text. We need to be specific, use plain English and be very clear about what we are doing.

While the guide is very helpful, and includes alternative suggestions for many of the words to avoid, I wanted to be able to spot jargon more easily on the web.

The bookmarklet is very simple. It just adds a bit of CSS styling and JavaScript to the page and then checks all the words on the page against a list of known jargon words and phrases. Once you run it on a page, offending words are highlighted and, borrowing heavily from the design of Gmail’s handy spellcheck feature, any entries which have notes associated with them (suggestions for alternatives, for example) can also be clicked to see the suggestion. It doesn’t (yet) let you replace a word with a suggestion, mainly because it doesn’t even pretend to be clever enough to get it right. In fact, the implementation is so simple that it’s quite likely to think there’s jargon on a page even when there isn’t. For ‘impact’ it gives the advice “Don’t use it as a verb” even when you haven’t used it as a verb. It could probably be made a bit cleverer, but as a quick automatic highlighting of things to watch out for, it’s hopefully already quite useful.
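
The bookmarklet itself is JavaScript, but the matching idea is simple enough to sketch in a few lines of Ruby, which also shows why the false positives happen: it’s a plain word-list lookup with no grammatical awareness at all. The two entries below are just examples, not the full GDS list.

# Naive jargon matching: split text into words, flag any that appear in the
# list. No attempt to work out how each word is actually being used.
JARGON = {
  'impact'   => "Don't use it as a verb",
  'leverage' => 'Say what you actually mean',   # placeholder advice
}

def find_jargon(text)
  text.scan(/[a-z']+/i).map(&:downcase).uniq.select { |word| JARGON.key?(word) }
end

p find_jargon('We will leverage this work to impact delivery.')
# => ["leverage", "impact"]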

Although I based the list on the GDS style guide, I’ve already accepted several contributions from elsewhere. Thanks to everyone who has already contributed.

You can find out more about Jargone or just install it here if you want to try it in your own browser. Enjoy.

Inky-Linky

I made a thing.

Inky-Linky makes web pages 100% more useful and irritating when printed. It’s a bookmarklet that adds a QR code to the margins for each external link in the page.

It came about because I wanted to make it easy to visit a link from a printed page, and also wanted to see if I could find an actual useful use for the much (rightly) maligned QR code.

Although it just about works, there are, of course, quite a few things wrong with it.

  • It really doesn’t work very well on very busy pages with lots of links.
  • The layout algorithm could be a bit smarter when deciding which margin to use (e.g. links on the right of the page should ideally prefer to be shown in the right margin, rather than blindly alternating).

If that doesn’t put you off, and you want to try it for yourself, here’s the Inky Linky repository and installation page. Enjoy.

(Oh, hello Boingboing!)

P5 Glove – Rock Paper Scissors and other fun

The P5 Glove is a consumer wired glove (tactile but not haptic). I bought one boxed as-new on eBay a while ago for not very much, and I’m glad I did as they now seem to be increasingly hard (and expensive) to get hold of.

P5 Glove

P5 Glove    P5 Glove (Rock!)

It contains five analog bend sensors and three buttons, plus, in theory, x, y and z coordinates and yaw, pitch and roll (it emits IR which is picked up by a big USB IR tower, so it knows where your hand is in space).

Here’s the P5 Glove intro movie…

I say ‘in theory’ because, while the p5osc Mac drivers handle the bend sensors very well, the x/y/z output is jittery and yaw/pitch/roll is sadly non-existent.

I’ve been experimenting with bridging the outputs for the buttons, fingers and thumb into custom MIDI controls so that I can mess around with them in ControllerMate. Here’s a demo of a simple setup which detects whether each digit is straight or bent, and uses that to determine whether your hand is describing a rock, paper or scissors shape. For now, it just displays ‘Rock’, ‘Paper’ or ‘Scissors’ in large type on the screen, but it would be pretty straightforward to turn this into a simple game.
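
For anyone curious, the classification is just thresholding. The real version lives in the ControllerMate patch shown further down, but here’s the same idea sketched in Ruby; the 0–63 sensor range and the threshold of 32 are assumptions about the glove’s output.

# Classify a hand shape from the five bend sensor values.
# Assumes each sensor reads 0 (straight) to 63 (fully bent); threshold is a guess.
BEND_THRESHOLD = 32

def bent?(value)
  value > BEND_THRESHOLD
end

# fingers = [thumb, index, middle, ring, little]
def hand_shape(fingers)
  bent = fingers.map { |v| bent?(v) }
  if bent.all?
    'Rock'
  elsif bent.none?
    'Paper'
  elsif !bent[1] && !bent[2] && bent[3] && bent[4]
    'Scissors'   # index and middle straight, ring and little bent
  end
end

puts hand_shape([40, 5, 3, 50, 55])   # => Scissors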

P5 Glove – MIDI Rock Paper Scissors from rooreynolds on Vimeo.

Here’s the ControllerMate patch I made to do it (click through for the annotated version on Flickr).

ControllerMate VR Glove MIDI Rock-Paper-Scissors

Lots more fun to be had here with virtual pianos and guitar strings too; arpeggiating the MIDI guitar, for example.

Yo-yo tuning; tangible audio

I’ve been thinking about augmented reality and tangible stuff in relation to music recently. A while ago I hacked together an RFID reader and a rotary encoder (using cheap-ish off-the-shelf USB kit from Phidgets) into a virtual knob. Ian captured me giving a quick (“it’s me knob demo!“) demonstration in the office a couple of weeks ago. The idea there was that one rotary encoder could act as more than one controller if it knew it had moved between different positions; in this case, by using RFID tags.

More recently, I have been playing with reacTIVision (the software behind the reactable, and incidentally what the SLorpedo team used in their Hack Day entry). It’s incredibly fun and ridiculously easy. To avoid my hands getting in the way of the tags, I threw together a picture frame, a cardboard box, a 25 watt table lamp and a Logitech quickcam (actually, the LEGO Vision Command camera, which is a nasty manual-focus quickcam with nifty LEGO extrusions).

Yo-yo, camera, Altoids tin
Side-on
Fiducials viewed from below
Yo-yo tuning (video thumbnail)
The (trivial) setup in Reaper

For tangible objects, I grabbed three things relatively close at hand. Here’s a video of me having fun in C major with a red wooden yo-yo, a tin of Altoids and a hidden surprise.

Having installed reacTIVision, I run it like this:

reacTIVision.exe -m midi\demo.xml --midi-device 10

An argument of -l midi will list all the available MIDI devices. Something like MIDI Yoke is handy here (which is device 10 for me). The MIDI output is optional, and the default OSC output is more flexible, but for today I wanted to play directly with MIDI and this made it really easy.

To use the three different inputs shown in the video, I first described the controls I wanted in the XML configuration file. I just edited demo.xml to include

<map fiducial="0" type="hfader" channel="1" control="1"/>
<map fiducial="0" type="note" note="72"/>
<map fiducial="1" type="knob" channel="1" control="2"/>
<map fiducial="1" type="note" note="76"/>
<map fiducial="2" type="vfader" channel="1" control="3"/>
<map fiducial="2" type="note" note="79"/>

Which, as you’d expect, means…

  • tag 0 (the tea bag) plays a C when visible, as well as treating its vertical position as MIDI controller 1 on channel 1,
  • tag 1 (the yo-yo) plays an E, as well as treating its angle as controller 2, and
  • tag 2 (the Altoids) plays a G, as well as treating its horizontal position as controller 3.

Next, I needed something to handle the MIDI notes and commands. I love Reaper for this kind of thing. (Annotated screenshot which explains a little more of what is going on here). For an instrument, I used the lovely Tapeworm from Tweakbench.

Finally, I trained Reaper (though you might prefer Ableton or whatever) with the controls I planned to manually twiddle. I set controller 2 to affect the fine-tune knob in Tapeworm and controller 1 to change the volume for the track.

The possibilities here seem endless. Throw in a Monome 40h, a couple of Wiimotes, a P5 Glove and the RFID reader / rotary encoder knob I was playing with before, and I have more physical controllers than I could ever need. All of which talk MIDI and/or OSC. Expect more demos (hopefully with some actual music rather than just proofs of concept) as I continue to experiment. I already like reacTIVision a lot, and it makes me want to buy a better camera.

It’s me knob demo! – RFID reader + rotary encoder = virtual physical control

Thanks to Ian for recording this:

I used a Phidgets RFID kit, a rotary encoder and Reaper.
