Posted by Sam Weiss on 4/13/2014
“Our goal is to give mobile devices a human scale understanding of space and motion”
Google is aiming to create a unique mobile device that senses space and movement in real time. The current Project Tango prototype is a 5” Android phone with highly customized hardware and software that tracks the device’s full 3D motion as you hold it, while simultaneously building a map of the environment. Google has also pointed out the device’s potential for gaming.
The project is intended to re-imagine how we use smartphones. The device combines a 4 MP camera, two “Computer Vision Processors,” integrated depth sensing, and a motion-tracking camera. Johnny Chung Lee, Project Lead at Google’s Advanced Technology and Projects (ATAP) group, said the phone’s sensors make over a “quarter million 3D measurements every second,” updating the device’s position and orientation in real time; those measurements are then used to build a 3D model of the surrounding environment.
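Google hasn’t published details of how Tango fuses its data, but the basic idea described above, combining the device’s tracked pose with depth samples to accumulate a world-frame 3D model, can be sketched roughly. Everything below is a hypothetical illustration, not Tango’s actual API; real 6-DoF tracking would use a full rotation matrix or quaternion rather than a single yaw angle:

```python
import math

def transform_point(point, pose):
    """Rotate a camera-frame point by the device's yaw, then add the
    device's position, giving the point in the world frame.
    'pose' is a hypothetical (x, y, z, yaw) tuple."""
    px, py, pz = point
    x, y, z, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    # Rotate about the vertical axis, then translate by the device position.
    wx = c * px - s * py + x
    wy = s * px + c * py + y
    wz = pz + z
    return (wx, wy, wz)

def fuse_depth_frame(cloud, depth_points, pose):
    """Append one frame's depth samples, transformed into the world
    frame, onto the accumulated point cloud."""
    cloud.extend(transform_point(p, pose) for p in depth_points)
    return cloud

# Example: the device has moved 1 m along x and turned 90 degrees,
# and sees a point 2 m straight ahead (camera +x axis).
cloud = []
fuse_depth_frame(cloud, [(2.0, 0.0, 0.0)], (1.0, 0.0, 0.0, math.pi / 2))
# The point lands at roughly (1.0, 2.0, 0.0) in the world frame.
```

At a quarter million measurements per second, a real implementation would batch this work on dedicated hardware (Tango’s “Computer Vision Processors”) rather than loop in Python, but the transform itself is the same.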
Google is already collaborating with universities, research labs, and industrial partners across nine countries to concentrate the past ten years of robotics and computer vision research into a single mobile phone. “Imagine that you scan a small section of your living room and then are able to generate a little game world in it,” said one of the developers who collaborated on the project. “I don’t know of any other controller or gaming device that can do that.”
What exactly can you do with it? Walk into a store and see exactly where the item you need is shelved, play hide-and-seek in your home with a favorite game character, or help the visually impaired navigate to a destination on their own, and the team imagines plenty more. You could probably even walk into someone’s house and navigate straight to the jewelry; kidding aside, the possibilities are endless.
The early prototypes aren’t ready for everyone yet; the device is still being polished. But units are already in the hands of developers eager to explore the new technology, test its limitations, and join in its evolution. Collaborators on the project include Bosch, BSQUARE, CompalComm, ETH Zurich, Flyby, The George Washington University, hiDOF, MM Solutions, Movidius, University of Minnesota, JPL, Ologic, OmniVision, Open Source Robotics Foundation, Paracosm, and Sunny Optical Technology. We should hear more as the finishing touches come together, so keep an ear out.