For many years, the relative genius of the mouse as an input device seemed truly innovative. It enabled us to interact with iconography on our desktops and so transformed the personal computer from something a bit geeky into something that, through a few strange metaphor-inspired icons, was usable by millions of people across the world. Importantly, the metaphors of the user interface – the desktop, trash can, folders and files – went hand-in-hand with its success in helping users make the move from the analogue world to a digital one.
The mouse has quite an interesting history, from Douglas Engelbart’s early prototypes (he named it) through the Xerox Alto and, ultimately, the Apple Lisa and Macintosh (a decent-enough history is available on Wikipedia). It has ruled as the main input device ever since, with many slight variations in its design. But it’s still a mouse.
Of course, Apple really blew our minds when it introduced the iPhone and its touch screen, which ultimately led to almost all smartphones being touch screen, to the first viable tablets, and even to touch-screen laptops and desktop PCs. To a large degree, controlling our devices with the touch of a finger, a pinch and zoom, or a swipe has become second nature to many of us. In fact, for some young thundercats, it’s all they’ve ever really known.
It wasn’t until around 2010 that we saw Leap Motion – a gesture-based controller that lets us wave our hands in front of the screen to do what we would typically use a mouse for. It was all a bit exciting and Minority Report-like. I still find it fascinating how that movie is seen as almost the goal of technological development.
The Leap is now quite affordable (about £60 on Amazon) but, for one reason or another, I think it’s still seen as a gimmicky device that hasn’t had anywhere near the impact on computing, or even educational technology, that I thought it might. Which is a shame.
The latest gesture-based interaction is something I’ve just come across (and which actually inspired me to write this post) – an app called ControlAir. Well, actually, it’s the webcam that is the device; the clever software does the donkey work. It enables the user (on a Mac) to control certain apps (iTunes, Spotify, etc.) with gestures. For example, to mute the volume, bring your index finger up to your mouth (as though you’re telling it to shush!). Genius. To raise or lower the volume, or to skip back and forward, raise your index finger and then click in mid-air. Check out the marketing clip below.
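Under the hood, the basic idea is simple enough to sketch: a recognizer watches the webcam and emits gesture events, and a dispatch table maps each gesture to an app action. Here’s a toy Python sketch of that wiring – the gesture names and actions are my own illustrative assumptions, not ControlAir’s actual code.

```python
# A hypothetical sketch of gesture-driven control (not ControlAir's code):
# a recognizer emits gesture names, and a dispatch table maps each name
# to a media-player action.

def mute(player):
    player["muted"] = True

def volume_up(player):
    player["volume"] = min(100, player["volume"] + 10)

def volume_down(player):
    player["volume"] = max(0, player["volume"] - 10)

# Gesture names here are assumptions for illustration.
GESTURE_ACTIONS = {
    "finger_to_lips": mute,        # the "shush" gesture
    "finger_click_up": volume_up,  # index finger raised, click in mid-air
    "finger_click_down": volume_down,
}

def handle_gesture(gesture, player):
    """Look up the gesture and run its action; ignore unknown gestures."""
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action(player)

player = {"volume": 50, "muted": False}
handle_gesture("finger_to_lips", player)   # shush -> mute
handle_gesture("finger_click_up", player)  # volume 50 -> 60
print(player)
```

The hard part, of course, is the recognizer itself – turning raw webcam frames into reliable gesture events – which is exactly the donkey work the clever software does.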
I downloaded it over the weekend and love it already. Yes, it’s hugely gimmicky, but it offers an insight into how we might be controlling our devices in the future. It’s easy to see how we might point and click to select items on screen, pinch and zoom, fast-forward through movies or swipe things away. Apply this to scenarios where students can interact with virtual skeletons, muscles and organs, and it’s actually quite exciting.