PrimeSense has just released a video showing how it thinks its 3D sensor could change things. It is a bit over the top in places, but it does give you a clear idea of how important 3D cameras are when coupled with the right software. For example, who needs a touch screen when the machine can see what you are touching...
The video has been made as part of PrimeSense's presence at this year's CES, where the company also hopes to launch its miniaturized depth camera. This might be small enough and cheap enough to put 3D sensors into just about any device you care to think of.
Take a look at what this might mean, and if you think that some of it is slightly dystopian then you might well be right:
What is surprising is that many of the ideas shown in the video already have real-world implementations. For example, the "turn any screen into a touch screen" idea is already in production as part of CoVii's Almost Touch Digital Signage. ShopPerception uses PrimeSense sensors on store shelves to analyze shoppers' behavior in real time. Personify changes the way people communicate with video for online meetings, video calling, or desktop remote presence: using the 3D sensor, Personify Live extracts a live image of you from your surroundings and puts you right in front of your content. Matterport's scanning unit uses two PrimeSense 3D sensors to accurately capture a scene and, within minutes, provide a full 3D model of the room for a virtual walk-through. Ayotle uses the PrimeSense 3D sensor to turn any projector into an interactive one, so presenters can use gesture controls to flip through slides, zoom in and out, and perform other presentation tasks.
The bottom line is that we can't yet buy things directly from the shop windows we walk past, and the doctor doesn't yet manipulate your x-rays with a wave of the hand, but... it's just a matter of software and ubiquitous 3D cameras.