What this means is that you can write Kinect-using gadgets that work from any web page. At the moment the software is pre-alpha; it provides a low-level interface to the Kinect and a high-level gesture recognition API.
The high-level API provides robust hand detection but needs work on more general gesture recognition. The API can recognize the following:
Presence of hand (registration)
Removal of hand (unregistration)
Large swipe up/down/left/right
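To give a flavour of what a high-level API like this involves, here is a minimal sketch of how large swipes might be classified from a hand's start and end positions. The function name, event names, and threshold are all hypothetical illustrations, not the project's actual API:

```javascript
// Hypothetical sketch: classify a large swipe from the hand's start and
// end positions (normalized coordinates). Names and threshold are
// illustrative only, not the project's real API.
function classifySwipe(start, end, threshold = 0.3) {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  // Ignore small movements so ordinary hand jitter isn't a "large swipe"
  if (Math.max(Math.abs(dx), Math.abs(dy)) < threshold) return null;
  // Pick the dominant axis, then the direction along it
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? "swipe-right" : "swipe-left";
  }
  return dy > 0 ? "swipe-down" : "swipe-up";
}

// A web gadget would typically subscribe to gesture events, e.g.:
//   kinect.on("handRegistered", showCursor);     // presence of hand
//   kinect.on("handUnregistered", hideCursor);   // removal of hand
//   kinect.on("swipe", dir => scrollPage(dir));  // large swipes
```

The point of a high-level API is exactly this kind of abstraction: the page author deals in named gestures, not raw depth data.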
If you want to see the sort of thing it might be used for, take a look at the video below:
In my opinion it looks good, but I foresee lots of arm ache, and perhaps even a new ailment to displace carpal tunnel syndrome as the number one computer-use hazard.
The code is open source and you can get it from GitHub.