It seems we are in search of ever better input devices. After being satisfied with the mouse for many years, we now want gesture input - and the easiest way to get it may be to tap into a user's muscles.
Gesture input is already possible using a Kinect depth camera and, in the very near future, perhaps with a Leap 3D input device. However, it might just be much cheaper and simpler to read the electrical signals in the user's arm. The MYO input device, being built by Canadian firm Thalmic Labs, measures the myoelectric activity in the arm. You might think that this would limit the input to crude movements, such as clenching a fist, but the demo video suggests that it is refined enough to distinguish between the activities of smaller muscle groups. For example, different muscle activities are associated with rotating the arm and with flexing the fingers.
What this means is that the device could, in principle, sense gestures you make without having to track your hand or arm and without an external device such as a camera. Of course, the gestures and arm positions would all be relative to your current body position, but this isn't a big problem as long as absolute positioning isn't required.
The device has a set of muscle activity sensors and a 6-axis accelerometer; putting the two sets of measurements together might just be enough for all sorts of input. The company also claims that it can provide input just ahead of the user's real movement, because the muscles are activated slightly before the actual movement. This might make its use in game playing very interesting. As the company's website says:
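To see how a device might pick up intent before the movement itself, here is a minimal sketch of detecting muscle-activation onset from a stream of raw EMG samples by watching the windowed RMS amplitude. The sample values, window size and threshold are illustrative assumptions - nothing about MYO's actual API or signal processing has been published.

```python
def rms(window):
    """Root-mean-square amplitude of a window of EMG samples."""
    return (sum(s * s for s in window) / len(window)) ** 0.5

def detect_onset(samples, window_size=4, threshold=0.5):
    """Return the index where windowed RMS first exceeds the threshold,
    i.e. the moment the muscle activates, or None if it never does."""
    for i in range(len(samples) - window_size + 1):
        if rms(samples[i:i + window_size]) > threshold:
            return i
    return None

# Quiet baseline followed by a burst of myoelectric activity:
signal = [0.02, -0.03, 0.01, 0.04, 0.02, 0.9, -1.1, 1.2, -0.8]
print(detect_onset(signal))  # → 3
```

Because the burst in the electrical signal precedes the visible movement, an onset detector like this could, in principle, fire a fraction of a second before a camera would see anything happen.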
"Unleash your inner jedi"
Take a look at the concept video, which shows the armband in action:
The device isn't ready to be delivered just yet, but it is available for pre-order. As delivered, it is claimed to work as a mouse replacement, and it communicates via Bluetooth with almost any PC, Mac, iOS or Android device. Don't take the applications in the video too seriously - the snowboard, for example, is just a mockup.
The company is relying on us programmers to get excited enough about the idea of the MYO to make their dreams reality, and to come up with dreams of our own. There is an open API, but very little detail at the moment. As the publicity says:
"Developers, we get it. You can think of ways to use MYO that we haven’t even dreamt of."
The key factor in its success is going to be how good the gesture recognition is. There is also the question of how often false positives will trigger unwanted actions. For example, in the video a soldier is shown guiding a remote vehicle - imagine what might happen if arm movements connected with some other activity were interpreted as inputs to guide it. Success might be as much a matter of how well a human can learn to use the input device as of how good the software is.
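One standard way to guard against the stray misclassification described above is to act on a gesture only after it persists for several consecutive recognition frames. This is a generic debouncing sketch, not anything from MYO's (unpublished) API; the gesture names and frame count are made up for illustration.

```python
def confirmed_gestures(frames, required=3):
    """Return gestures that persisted for `required` consecutive frames,
    so an isolated misclassification triggers nothing."""
    confirmed = []
    current, run = None, 0
    for g in frames:
        if g == current:
            run += 1
        else:
            current, run = g, 1
        # Fire exactly once, the moment the run reaches the threshold.
        if run == required and g != "rest":
            confirmed.append(g)
    return confirmed

# A stray "fist" frame amid resting frames is ignored; a sustained
# "wave_left" is accepted exactly once.
frames = ["rest", "fist", "rest",
          "wave_left", "wave_left", "wave_left", "wave_left"]
print(confirmed_gestures(frames))  # → ['wave_left']
```

The trade-off is latency: requiring more frames makes accidental triggers rarer but adds a delay before any gesture takes effect, which matters for something as unforgiving as steering a vehicle.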
You can pre-order the MYO for $149 and it should be with you in late 2013.