|Throw A Paper Plane Around The World And Catch One!|
|Written by David Conrad|
|Saturday, 24 September 2016|
Is this VR, AR or something else? The Google blog explaining the idea doesn't seem clear on the matter either. It all starts with a simple thought, "What if you could throw a paper plane from one screen to another?"
From this idea comes Paper Planes, an Android experiment that lets you "throw" a paper plane using your phone to launch it. The plane zooms off the phone's screen and appears on the screen of any desktop viewing the accompanying website, where you see it join all the planes thrown by other users. You can catch a plane and see where it came from by viewing the passport it was given when it was made, and you can add your own stamp to show where it has been.
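The passport idea is easy to picture as a small data record attached to each plane. Here is a minimal sketch, assuming a simple JSON shape - the field names are invented for illustration and are not the app's actual schema:

```javascript
// Hypothetical shape of a plane's passport: a list of location stamps.
// Field names are illustrative, not the app's real data model.
function createPassport(originCity, originCountry) {
  return {
    createdAt: Date.now(),
    stamps: [{ city: originCity, country: originCountry, time: Date.now() }]
  };
}

// Catching a plane adds a new stamp recording where it has been.
function addStamp(passport, city, country) {
  passport.stamps.push({ city, country, time: Date.now() });
  return passport;
}

// A caught plane's passport then reads like a travel history.
const passport = createPassport("Berlin", "Germany");
addStamp(passport, "Tokyo", "Japan");
console.log(passport.stamps.map(s => s.city).join(" -> ")); // "Berlin -> Tokyo"
```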
Check out the video to see it in action:
The app was demoed at this year's Google I/O, but now you can download it from Google Play and try it yourself.
It is fun, and it is a new sort of social interaction mediated by computer technology. Watch the world throw paper airplanes and discover geography - at least, the geography of the most technologically advanced parts of the world.
Perhaps the most important idea here is the use of all the screens as one big display. Why not a paper airplane that flies across all the screens in a house or school or whatever?
The technologies used are fairly predictable. The 3D rendering is via WebGL - what else? - the communication is via web sockets, and the plane cloud is computed using web workers. The basic architecture is a WebView augmented with some Java code - an approach that isn't used often enough as a way of integrating web behaviour with an app's UI and native behaviour. As the blog says:
This approach worked extremely well for us, enabling an experience that was smooth and captivating across platforms and form factors, connecting people from all over the world. Extending the web with native capabilities has proven to be a valuable avenue to deliver high quality experiences going forward.
After all, it is the basic approach used by Cordova and other web-as-native frameworks.
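The web worker part of that stack is the easiest to sketch: heavy per-frame work, such as computing positions for the cloud of in-flight planes, can be moved off the main thread. Below is the kind of pure computation such a worker might run - interpolating a plane's position along a great-circle arc between origin and destination. This is an illustrative sketch, not the project's actual code:

```javascript
// Sketch of per-plane work a web worker could do for the plane cloud:
// find the position at fraction t (0..1) along the great circle joining
// two lat/lon points. Illustrative only - not the app's implementation.

const toRad = d => d * Math.PI / 180;
const toDeg = r => r * 180 / Math.PI;

function greatCirclePoint(lat1, lon1, lat2, lon2, t) {
  const f1 = toRad(lat1), l1 = toRad(lon1);
  const f2 = toRad(lat2), l2 = toRad(lon2);
  // Angular distance between the endpoints (spherical law of cosines is
  // fine at sketch precision).
  const d = Math.acos(Math.sin(f1) * Math.sin(f2) +
                      Math.cos(f1) * Math.cos(f2) * Math.cos(l2 - l1));
  // Standard slerp-style interpolation weights along the arc.
  const A = Math.sin((1 - t) * d) / Math.sin(d);
  const B = Math.sin(t * d) / Math.sin(d);
  const x = A * Math.cos(f1) * Math.cos(l1) + B * Math.cos(f2) * Math.cos(l2);
  const y = A * Math.cos(f1) * Math.sin(l1) + B * Math.cos(f2) * Math.sin(l2);
  const z = A * Math.sin(f1) + B * Math.sin(f2);
  return { lat: toDeg(Math.atan2(z, Math.hypot(x, y))),
           lon: toDeg(Math.atan2(y, x)) };
}

// Midpoint of a flight from London to New York.
console.log(greatCirclePoint(51.5, -0.1, 40.7, -74.0, 0.5));
```

In the real architecture the main thread would post plane origin/destination pairs to the worker and receive interpolated positions back via `postMessage`, keeping the WebGL render loop responsive.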
Slightly less obvious technologies include Firebase Cloud Messaging (FCM), used to pass notifications between instances of the app, and a network of servers on Google's Cloud Platform to handle the web socket traffic. The WebGL rendering was done with three.js.
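The division of labour is roughly that FCM wakes up the target app instance while the web socket servers relay the live events. A hypothetical "plane thrown" envelope, as it might travel over that relay, could look like the following - the schema and names here are invented for illustration; the real wire format is in the project's code:

```javascript
// Hypothetical event envelope for the web socket relay. The schema is
// invented for illustration and is not the project's actual wire format.
function throwEvent(planeId, lat, lon) {
  return JSON.stringify({
    type: "plane_thrown",
    planeId,
    origin: { lat, lon },
    thrownAt: Date.now()
  });
}

// A receiving client parses the envelope and dispatches on its type.
function handleMessage(raw) {
  const msg = JSON.parse(raw);
  if (msg.type === "plane_thrown") {
    return `plane ${msg.planeId} launched from ${msg.origin.lat},${msg.origin.lon}`;
  }
  return "ignored";
}

console.log(handleMessage(throwEvent("abc123", 51.5, -0.1)));
// "plane abc123 launched from 51.5,-0.1"
```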
Not only can you get the app from Google Play, you can also get the code from GitHub.
How about a real VR version running on Google Cardboard?
|Last Updated ( Saturday, 24 September 2016 )|