At this week's Intel Developer Forum, Intel unveiled a virtual reality headset that allows nearby objects from the real world to be integrated into its computer-generated views. Project Alloy is at an early stage and combines depth sensing camera technology with immersive graphics in what Intel terms "merged reality".
Microsoft calls it Mixed Reality; for others it is Augmented Reality. Whichever term you prefer, it is the next really big thing and something that developers are already embracing.
In the case of Project Alloy this concept is virtual reality without hardware constraints and with added freedom to interact with the real world:
Through merged reality, see your hands, see your friends … see the wall you are about to run into. Using Intel RealSense technology, not only can you see these elements from the real world, but you can use your hands to interact with elements of your virtual world, merging realities.
The new head-mounted display consists of a set of goggles that resembles, say, Facebook’s Oculus Rift, HTC’s Vive or Sony’s upcoming PSVR. The main difference is that this is an "untethered experience" - you don't need to be connected by cables to a computer. Instead, the computing power needed to run the headset is in the headset itself. Nowhere in the Intel presentation is mention made of the new Joule device with its quad-core 1.7GHz processor - but that's probably just because the Project Alloy announcement came first.
Our report of the Joule states that it is:
part of Intel's new Augmented Reality push. The Joule API includes support for Intel's RealSense depth camera and these two technologies are key to the merged reality headset, Project Alloy
Most other VR headsets require a series of cameras, sensors and accessories to be placed around a room or held in the hand. The use of RealSense, with its depth-sensing capabilities, allows objects, including the user’s hands, to be tracked and brought into the virtual world displayed within the goggles without the need for additional sensors.
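The compositing idea behind this can be sketched in a few lines: wherever the depth camera reports something close to the wearer, show the real-world pixel instead of the virtual one. Everything here - the function names, the 0.6 m threshold and the toy 4x4 "depth frame" - is an illustrative assumption, not Intel's actual pipeline, which would stream frames from the RealSense SDK.

```python
# Hedged sketch of depth-based "merged reality" compositing. A pixel
# whose depth reading is both valid (non-zero) and within the threshold
# is assumed to belong to a nearby real object such as a hand.

def segment_near_pixels(depth_m, max_distance_m=0.6):
    """Mark pixels closer than max_distance_m; a depth of 0.0 means no reading."""
    return [[0.0 < d < max_distance_m for d in row] for row in depth_m]

def composite(virtual, real, mask):
    """Show the real-world pixel wherever the mask is True, else the virtual one."""
    return [[r if m else v for v, r, m in zip(vrow, rrow, mrow)]
            for vrow, rrow, mrow in zip(virtual, real, mask)]

# Toy 4x4 depth frame in metres: a "hand" at 0.4 m in the top-left
# corner, everything else 2 m away.
depth = [[2.0] * 4 for _ in range(4)]
for y in range(2):
    for x in range(2):
        depth[y][x] = 0.4

mask = segment_near_pixels(depth)
virtual = [["V"] * 4 for _ in range(4)]  # the rendered virtual scene
real = [["R"] * 4 for _ in range(4)]     # the pass-through camera feed
merged = composite(virtual, real, mask)
print(merged[0])  # ['R', 'R', 'V', 'V'] - the hand shows through
print(merged[3])  # ['V', 'V', 'V', 'V'] - virtual scene everywhere else
```

The same mask can drive interaction as well as display: once the hand pixels are segmented, a gesture recogniser can operate on them to let the wearer manipulate virtual objects.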
In his keynote at IDF, Intel’s CEO Brian Krzanich said:
“Merged reality delivers virtual-world experiences more dynamically and naturally than ever before – and makes experiences impossible in the real world now possible.
“Pick up your real-world tennis racket in your living room and step virtually on to the court at Wimbledon. Be the ultimate concert master – fully unplugged. Plan your virtual visit to the Sistine Chapel while never leaving the office. Experience a sporting event, a concert or a film scene from any point of view – and from any position.”
To show the audience what the experience would feel like Craig Raymond joined Krzanich on stage to demo the view from inside the headset:
According to the information available so far, merged reality hinges on the following five technological advances, said to be "soon to be available to developers, makers and inventors":
- 6 Degrees of Mobility: Freedom of movement in 3-D virtual spaces with real-world awareness. Said another way, it’s about sensing technologies that help make sure that while you’re experiencing your virtual world, you’re not colliding with real-world objects.
- Integrated Tracking: Attaching sensing technologies, such as Intel RealSense cameras, to the headset and other smart, connected devices reduces the need for elaborate and costly sets of external sensors that translate real-world environments into digital representations.
- More Natural Manipulation: Brings your real-life hands into your simulated experiences thanks to readily available new sensing technologies.
- Go Untethered: Experience virtual worlds across larger spaces without pesky cords. No more being jolted out of your VR experience because you have reached the end of your cable.
- Digitized Real-World Content: Rather than a single point of view, advancements like Intel’s Replay 360-degree technologies use encoded video and advanced composition algorithms, captured from an array of cameras, to digitize whole playing fields and venues, viewable from any position and any point of view, with an enhanced ability to interact. This is a game changer for the entire category of virtual and augmented reality. You choose the experience, and you get to navigate real-world content in new ways.
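The real-world awareness behind the first of those points - "see the wall you are about to run into" - amounts to scanning each depth frame for the nearest obstacle and warning before the wearer reaches it. A minimal sketch, assuming an invented 0.75 m safety margin and toy depth frames (a real headset would stream depth from its on-board RealSense camera):

```python
# Hypothetical obstacle check for an untethered VR headset: find the
# closest valid depth reading and flag it if it falls inside the margin.

WARN_DISTANCE_M = 0.75  # assumed safety margin, not an Intel figure

def nearest_obstacle(depth_m, warn_distance_m=WARN_DISTANCE_M):
    """Return the distance to the closest valid reading if it is inside
    the warning margin, otherwise None (all clear)."""
    readings = [d for row in depth_m for d in row if d > 0]  # 0 = no reading
    if not readings:
        return None
    nearest = min(readings)
    return nearest if nearest < warn_distance_m else None

clear_room = [[3.0] * 3 for _ in range(3)]   # everything 3 m away
wall_ahead = [row[:] for row in clear_room]
wall_ahead[1][1] = 0.5                        # wall half a metre in front

print(nearest_obstacle(clear_room))  # None - keep playing
print(nearest_obstacle(wall_ahead))  # 0.5 - time to show the real wall
```

When a warning fires, a merged reality headset has a gentler option than a hard stop: it can simply fade the offending real-world surface into the virtual view.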
Devs already have access to Microsoft HoloLens, which shares some of these features and adds the ability to bring remote people into the mixed reality environment by holographic projection. However, rather than treating the HoloLens as competition, Intel is collaborating with Microsoft to bring immersive experiences to Project Alloy.
The other point worth noting in the Intel Chip Shot, aka press release, is that Project Alloy will be offered as an open platform in 2017.