Myoelectric Input - Better Than Kinect?
Written by Harry Fairhead   
Tuesday, 05 March 2013

It seems we are in search of ever better input devices. After being satisfied with a mouse for many years, we now want gesture input, and the easiest way to get it may be to tap into the user's muscles.

Gesture input is already possible using a Kinect depth camera and, in the very near future, perhaps with a Leap 3D input device. However, it might just be much cheaper and simpler to read the electrical signals in the user's arm. The MYO input device, being built by Canadian firm Thalmic Labs, measures the myoelectric activity in the arm. You might think that this would limit the input to crude movements such as clenching a fist, but the demo video suggests that it is refined enough to distinguish between the activities of smaller muscle groups. For example, different muscle activities are associated with rotating the arm and with flexing the fingers.


What this means is that the device could, in principle, sense gestures that you make without having to track your hand or arm and without using an external device such as a camera. Of course, the gestures and arm positions would all be relative to your current body position, but this isn't a big problem as long as absolute positioning isn't required.

The device has a set of muscle activity sensors and a 6-axis accelerometer; putting the two sets of measurements together might just be enough for all sorts of input - see the sketch after the quote below. It also claims to be capable of providing input just ahead of the user's real movement, because the muscles are activated slightly before the movement itself. This might make its use in game playing very interesting. As the company's website says:

"Unleash your inner jedi"

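With so little technical detail available any code is guesswork, but the following Python sketch gives a feel for how a recognised muscle-activity pose and an accelerometer reading might be combined into a higher-level input action. Everything in it - GestureEvent, the pose names and map_to_action - is invented for illustration and is not part of any Thalmic Labs API.

# A minimal, purely illustrative sketch (not the Thalmic API) of combining
# a recognised hand pose with the armband's accelerometer reading.
import math
from dataclasses import dataclass

@dataclass
class GestureEvent:
    pose: str        # e.g. "fist", "fingers_spread" - hypothetical pose names
    accel: tuple     # (x, y, z) acceleration in g from the armband

def roll_degrees(accel):
    """Estimate forearm roll from the direction of gravity."""
    x, y, z = accel
    return math.degrees(math.atan2(y, z))

def map_to_action(event):
    """Turn a pose plus arm orientation into a higher-level input action."""
    roll = roll_degrees(event.accel)
    if event.pose == "fist" and roll > 30:
        return "rotate_clockwise"
    if event.pose == "fist" and roll < -30:
        return "rotate_anticlockwise"
    if event.pose == "fingers_spread":
        return "open_menu"
    return None

# A clenched fist with the arm rolled to the right becomes a rotate command
print(map_to_action(GestureEvent("fist", (0.0, 0.6, 0.8))))  # rotate_clockwise

The point is simply that a handful of distinguishable poses, multiplied by the orientation data, could cover a surprisingly rich set of commands.
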
Take a look at the concept video, which shows the armband in action:

 

The device isn't ready to be delivered just yet, but it is available for pre-order. It is claimed to work as a mouse replacement as delivered and it communicates via Bluetooth with almost any PC, Mac, iOS or Android device. Don't take the applications in the video too seriously - the snowboard, for example, is just a mockup.

The company is relying on us programmers getting excited enough about the idea of the MYO to make its dreams reality and to come up with dreams of our own. There is an open API, but very little detail about it at the moment - a speculative sketch of what it might look like follows the quote below. As the publicity says:

"Developers, we get it. You can think of ways to use MYO that we haven’t even dreamt of."

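Until Thalmic publishes the details we can only speculate about what that API will look like. Purely as a thought experiment, here is a Python sketch of the kind of callback-style listener an armband SDK might expose, together with a fake device so that application logic could be tested without hardware. None of these names come from Thalmic's documentation.

# Speculative sketch of a callback-style armband API - all names are invented.
class MyoListener:
    def on_pose(self, pose):
        """Called when the armband recognises a new hand pose."""
        print(f"pose: {pose}")

    def on_orientation(self, roll, pitch, yaw):
        """Called with the arm's orientation relative to where it started."""
        print(f"orientation: roll={roll:.1f} pitch={pitch:.1f} yaw={yaw:.1f}")

class FakeMyo:
    """Stand-in for the real device, driven by a recorded list of events."""
    def __init__(self, listener):
        self.listener = listener

    def replay(self, events):
        for kind, payload in events:
            if kind == "pose":
                self.listener.on_pose(payload)
            elif kind == "orientation":
                self.listener.on_orientation(*payload)

if __name__ == "__main__":
    FakeMyo(MyoListener()).replay([
        ("orientation", (10.0, -5.0, 0.0)),
        ("pose", "wave_out"),
    ])

If something along these lines turns up, writing a gesture-driven application should be no harder than handling mouse or touch events.
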
The key factor in its success is going to be how good the gesture recognition is. There is also the question of how often false positives will trigger unwanted actions. For example, in the video a soldier is shown guiding a remote vehicle - imagine what might happen if arm movements connected with some other activity were interpreted as inputs to guide the vehicle. In the end it may be as much a matter of how well a human can learn to use the input device as how good the software is.
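
One obvious defence against false positives, sketched below in Python, is to ignore gestures until the user performs a deliberate unlock gesture and to re-lock after a period of inactivity. The double_tap unlock pose and the five-second timeout here are pure invention rather than anything Thalmic has announced.

# Illustrative lock/unlock gate to reduce accidental gesture triggers.
import time

class GestureGate:
    UNLOCK_POSE = "double_tap"   # hypothetical deliberate unlock gesture
    TIMEOUT = 5.0                # seconds of inactivity before re-locking

    def __init__(self):
        self.unlocked_at = None

    def accept(self, pose, now=None):
        """Return the pose if the gate is open, otherwise None."""
        now = time.monotonic() if now is None else now
        if self.unlocked_at is not None and now - self.unlocked_at > self.TIMEOUT:
            self.unlocked_at = None          # auto-lock after the timeout
        if pose == self.UNLOCK_POSE:
            self.unlocked_at = now           # open, or refresh, the gate
            return None
        return pose if self.unlocked_at is not None else None

gate = GestureGate()
print(gate.accept("fist", now=0.0))        # None - still locked
print(gate.accept("double_tap", now=1.0))  # None - the unlock gesture itself
print(gate.accept("fist", now=2.0))        # fist - now accepted
print(gate.accept("fist", now=10.0))       # None - timed out, locked again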

You can pre-order the MYO for $149 and it should be with you in late 2013.

More Information

MYO armband

www.thalmic.com

Related Articles

PrimeSense Give Details Of Tiny Depth Sensor

Leap Motion In Asus Deal

PrimeSense Imagines A 3D Sensor World

Leap 3D Sensor - First Demos

 
