We are still waiting for a next-generation Kinect with improved resolution and other features, but meanwhile Microsoft Research is already hard at work on new software that could be more important.
Microsoft Research is currently holding its TechFest at Redmond, where it is showing off a lot of new work. Given how small it is compared to the rest of Microsoft, it is remarkably effective as a generator of new ideas and products - like the Kinect. This probably means that Microsoft will kill it off or downsize it any minute, but for the moment it seems to be unstoppable.
The latest work on the Kinect uses the same sort of machine-learning approach to distinguish between an open hand and a clenched fist. Although few details have been released, the general method was to use a large number of images of people's hands and supervised training to distinguish between open and closed hands. The learning algorithm is based on forests of decision trees, which is the same general method used to implement the skeleton tracking.
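To give an idea of what supervised training with a decision forest looks like, here is a minimal sketch using scikit-learn's `RandomForestClassifier` as a stand-in for the forests-of-decision-trees approach described above. This is not Microsoft's code; the "depth patches" are synthetic random data standing in for real labelled hand images, and the spread-based feature is only an illustrative proxy.

```python
# Sketch only: a decision-forest classifier labelling hand patches as
# "open" vs "closed". Synthetic data stands in for real depth images.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Fake 16x16 depth patches: pretend "open" hands show a wider spread of
# depth values than "closed" fists (a crude stand-in for real features).
def make_patches(n, spread):
    return rng.normal(scale=spread, size=(n, 16 * 16))

X = np.vstack([make_patches(200, 1.0), make_patches(200, 3.0)])
y = np.array([0] * 200 + [1] * 200)  # 0 = closed fist, 1 = open hand

forest = RandomForestClassifier(n_estimators=50, random_state=0)
forest.fit(X, y)

print(forest.score(X, y))  # accuracy on the (synthetic) training set
```

In the real system the features would of course be computed per pixel from Kinect depth frames, just as in the skeleton-tracking pipeline, rather than from flattened raw patches.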
Being able to detect an open or closed hand might not seem to be much of an advance, and certainly not as good as a multi-gesture touch screen interface, but it is enough to allow the user interface to recognize a "pick up" or "grip" gesture. So you can move both hands within an image, close them to grip two points, and move them apart to zoom. This is much easier to understand by watching the following video:
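The two-handed grip-to-zoom interaction just described can be sketched in a few lines. This is a hypothetical illustration, not the actual implementation: once both hands are detected as closed, the zoom factor simply scales with how far the hands have moved apart since they gripped.

```python
# Hypothetical sketch of "grip to zoom": the zoom factor is the ratio of
# the current hand separation to the separation when both hands closed.
import math

def hand_distance(left, right):
    return math.hypot(right[0] - left[0], right[1] - left[1])

def zoom_factor(grip_left, grip_right, left, right):
    """Current separation divided by the separation at the moment
    both hands first closed (gripped the image)."""
    return hand_distance(left, right) / hand_distance(grip_left, grip_right)

# Both hands grip 100px apart, then move to 200px apart -> 2x zoom.
print(zoom_factor((0, 0), (100, 0), (-50, 0), (150, 0)))  # 2.0
```

Releasing either hand (an open-hand detection) would simply end the gesture, which is why the binary open/closed signal is enough to drive this kind of interface.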
It seems that we might not need a higher depth-field resolution to make more sophisticated use of the Kinect interface. As well as the basic paint program and other interactions shown in the video, it works with other off-the-shelf software via mouse-click emulation.
You can't get the software at the moment, but it has been promised for the next version of the Kinect for Windows SDK. There is no news of any plans to make use of it within the Xbox version of the hardware.