It’s rare for Google itself to make much noise around MWC. Whilst a host of OEMs push new Android wares at the show in Barcelona, Google usually sits at home working on something for its own I/O conference, which takes place a few months later. This year was different, though, as Google unveiled something exciting which it calls ‘Project Tango’.
So what exactly is Project Tango? Well, it’s the culmination of over a decade of research into 3D mapping, virtual worlds and more by Google, universities and a company called Movidius, which is responsible for much of the tech that goes into Project Tango. On the surface, the Project Tango phone itself is a chunky, square-ish white device with Android soft keys at the bottom of the screen. There’s a camera on the back and another smaller lens on the front, giving the impression that this is an ordinary, unbranded smartphone.
However, this phone utilises much the same technology that was used in NASA’s Mars Exploration Rovers a little over a decade ago, thanks to a multitude of sensors which help it map the world in unique 3D visuals. A 4-megapixel camera, a low-resolution motion-tracking camera and RGB and IR sensors allow the phone to understand its surroundings in 3D much as the human eye does, including a sense of real depth perception. Movidius developed its own Myriad processor, which sits inside the phone and draws little power whilst analysing the data fed to it by the phone’s sensors.
This data can then be output to any app or API, which is where Project Tango’s real purpose comes in: developers. Google believes that the extra data recorded by this unique array of sensors means that devs can create a whole new type of app which understands the world in the same way we do. In fact, 200 of these prototype phones will be given away to developers in the near future, so that apps can be created and the phone’s capabilities put to real use.
As with all new tech, it can be tricky to see the purpose behind it at first, so Google came up with some demo apps to show off what Project Tango can do. One such example was an app which analysed the world around the phone and drew objects in 3D on the screen within seconds – point the phone at a desk and you’ll see that desk rendered in 3D on the phone, and so on. Other apps showed the ability to render a 3D space and then apply heat-map colours to objects within a scene – red for the closest objects and blue for those further away – very clever stuff.
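The heat-map demo boils down to mapping each depth reading onto a colour ramp from red (near) to blue (far). Google hasn’t published the demo’s code, so the depth range and linear colour blend below are purely illustrative assumptions – a minimal sketch of the idea rather than the actual app:

```python
def depth_to_heatmap(depth_m, near=0.5, far=4.0):
    """Map a depth reading in metres to an (R, G, B) colour.

    Closest objects (<= near) come out pure red, the furthest
    (>= far) pure blue, with a linear blend in between. The
    `near`/`far` range is a hypothetical choice, not Tango's.
    """
    # Clamp the reading into the working range, then normalise to [0, 1].
    t = (min(max(depth_m, near), far) - near) / (far - near)
    # Blend from red (t = 0, closest) to blue (t = 1, furthest).
    return (int(255 * (1 - t)), 0, int(255 * t))

# Colour a toy depth map, one reading per point in the scene.
depths = [0.6, 1.5, 3.9]
colours = [depth_to_heatmap(d) for d in depths]
```

In a real app the same function would run per pixel over the depth sensor’s output before compositing onto the camera image.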
We’re likely to see this technology implemented in 3D mapping and design apps in the future if it takes off, but Movidius also believes it could be used to help the visually impaired. For example, the phone could build an accurate 3D rendering of the objects in the user’s immediate vicinity from its camera and sensors. Audio cues could then be fed through earphones to warn the holder of the phone when they move too close to an obstacle, preventing bumps and collisions. Further still, this tech could be put into a purpose-built device worn around the neck, sounding a beep or alarm whenever an obstacle comes within close range.
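The warning system described above amounts to thresholding the distance to the nearest detected obstacle and choosing an audio cue. The article only sketches the idea, so the threshold values and cue names in this snippet are hypothetical:

```python
def obstacle_cue(nearest_m, warn_at=1.0, alarm_at=0.4):
    """Pick an audio cue for the nearest obstacle distance in metres.

    Assumed behaviour: a continuous alarm when something is dangerously
    close, a gentle beep when it is merely near, silence otherwise.
    The 1.0 m / 0.4 m thresholds are illustrative guesses.
    """
    if nearest_m <= alarm_at:
        return "alarm"   # dangerously close: urgent cue
    if nearest_m <= warn_at:
        return "beep"    # approaching: gentle cue
    return None          # clear path: stay silent

# Each frame, take the minimum depth across the sensor's readings.
frame_depths = [2.1, 0.9, 3.4]
cue = obstacle_cue(min(frame_depths))
```

A wearable version would run the same loop continuously, lowering the beep interval as the distance shrinks.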
You won’t be able to buy a Project Tango phone, and we suspect it will be a while before any manufacturer decides to put such a complex array of sensors and features into a smartphone for the consumer market. For now, check out the video below and let us know what you think.