September 16, 2020 – Facebook Reality Labs (FRL) has today unveiled Project Aria — a new research project that the company states will help it build its first generation of wearable augmented reality devices.
The team at Facebook Reality Labs envisions a time when people can enjoy all the benefits of connectivity without needing to keep their heads and eyes down to look at a device. Instead, the device itself blends seamlessly with everyday life. FRL acknowledged, though, that much of this is still the domain of science fiction, noting: “To actually build glasses flexible enough to work for most face shapes and sizes, and create the software to support them, we still need several generations of breakthroughs, like systems to enhance audio and visual input, contextualized AI, and a lightweight frame to house it all. This kind of AR requires a foundational shift in computing technology that mirrors the leap from libraries and landlines to personal computers and smartphones.”
However, Project Aria marks a step in that direction. FRL has built a research device that will help the company understand how to build the software and hardware necessary for AR glasses by:
- Using sensors to capture video and audio from the wearer’s point of view;
- Capturing eye movement and location data to help FRL engineers and programmers figure out how AR can work in practice;
- Encrypting, compressing, and storing data until it’s uploaded to a separate, designated back-end storage space.
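The on-device handling described above, compressing and encrypting captured data before it is held locally and later uploaded to a designated back end, can be sketched as follows. This is a toy illustration, not FRL's actual pipeline: the XOR "cipher" is a stand-in for real encryption, and all function names are hypothetical.

```python
import zlib

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption; never use XOR in production.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def stage_capture(raw: bytes, key: bytes) -> bytes:
    """Compress, then encrypt, a captured sensor frame for local storage."""
    return xor_cipher(zlib.compress(raw), key)

def recover_at_backend(staged: bytes, key: bytes) -> bytes:
    """After upload, the back end reverses the pipeline: decrypt, then decompress."""
    return zlib.decompress(xor_cipher(staged, key))

key = b"session-key"
frame = b"sensor frame bytes " * 100
staged = stage_capture(frame, key)
assert recover_at_backend(staged, key) == frame
assert len(staged) < len(frame)  # compressing before encrypting keeps storage small
```

Note the ordering: compression must happen before encryption, because well-encrypted data looks random and no longer compresses.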
The Project Aria glasses are, however, neither a consumer product nor a prototype, and will not be released for sale to the general public. As a result:
- The glasses won’t display any information on the inside of the lens;
- Research participants will not be able to view or listen to the raw data captured by the device.
Instead, Project Aria has been designed as a way to help Facebook develop the safeguards, policies, and even social norms necessary to govern the use of AR glasses and future wearable devices.
Starting in September, the glasses will be made available to a limited group of about 100 Facebook employees and contractors in the United States (primarily located in the San Francisco Bay Area and Seattle), who will be “trained in both where and when to use the device, and where and when not to”, according to the company. As participants wear the devices while going about their day, at home, on Facebook campuses, and in public, the data gathered will support the development of the head-tracking, eye-tracking, and audio algorithms that will one day make AR glasses a reality.
To better understand how this technology can benefit people with varying physical abilities, FRL is also starting a pilot program with Carnegie Mellon University’s Cognitive Assistance Laboratory to build 3D maps of museums and airports that will have multiple applications, including helping people with visual impairments better navigate their surroundings.
Naturally, with Facebook come privacy questions, and the company has published a list of FAQs on the topic. There is too much to dig into for this article, but the company's FAQ covers the subject in full. To summarize, Facebook stated: “Any data we collect today is done in service of finding a way to do the least intrusive data capture with our hardware tomorrow, and to make sure whatever data we do collect provides as much user value as possible.”
FRL stated that despite technical challenges, the Project Aria glasses include the full sensor suite used in VR headsets for spatial awareness, and are able to compute location from GPS, take high-res pictures, and capture multichannel audio and eye images.
As a research device, the current sensor configuration is not set in stone, but FRL will be focusing on whether it can build products that use a collection of sensors for multiple tasks in a power-efficient manner. FRL stated that ideally, such a product would be able to use all of its sensors in tandem to better understand a user’s intent and offer information only when it’s useful. One example offered: if a user is at a grocery store, the front-facing cameras might scan the room and identify the store’s contents, while the eye-tracking cameras recognize that the user’s gaze has fallen on a piece of fruit, and the display function pops up an identification, its price, its nutritional value, potential recipes, and so on.
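The grocery-store scenario boils down to sensor fusion: only surface an overlay when the scene cameras and the eye tracker agree on what the user is looking at. A minimal sketch of that gating logic, where the catalog, frame fields, and function names are all hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical product data a store-aware assistant might expose.
CATALOG = {
    "apple": {"price": 0.50, "calories": 95},
    "banana": {"price": 0.25, "calories": 105},
}

@dataclass
class Frame:
    scene_objects: list  # labels from the front-facing cameras
    gaze_target: object = None  # label from the eye-tracking cameras, or None

def contextual_overlay(frame: Frame):
    """Show information only when gaze and scene recognition agree."""
    if frame.gaze_target in frame.scene_objects and frame.gaze_target in CATALOG:
        return {"item": frame.gaze_target, **CATALOG[frame.gaze_target]}
    return None  # nothing useful to show, so stay out of the way

assert contextual_overlay(Frame(["apple", "banana"], "apple")) == {
    "item": "apple", "price": 0.50, "calories": 95}
assert contextual_overlay(Frame(["apple"], None)) is None
```

Requiring both sensors to agree before displaying anything is one way to realize "offer information only when it's useful": either signal alone is too noisy to trigger an overlay.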
The company is also exploring how the device’s head-tracking sensors can be used to add virtual objects to physical environments, as well as the ability to share virtual objects with other people. This brings with it its own set of technical challenges, and for now, FRL is going to rely on existing map data – specifically, its LiveMaps solution.
By using computer vision to construct a virtual, 3D representation of the parts of the world that are relevant to a user, LiveMaps will let future glasses effectively mix the virtual world and the real world. AR glasses will download the most recent data from the 3D map and then only need to detect changes, such as new street names or a newly built parking garage, and update the 3D map with those changes. The Project Aria device is testing how this can work in practice too.
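The update scheme described above, download a cached map, detect what changed, and merge only the deltas, can be sketched like this. It is a simplification under assumed names: a real system would diff 3D geometry and localization features, not string labels.

```python
def diff_map(cached: dict, observed: dict) -> dict:
    """Return only the entries that changed since the last map download."""
    return {k: v for k, v in observed.items() if cached.get(k) != v}

# Cached 3D map from the last download vs. what the glasses observe now.
cached = {"elm_st": "street", "lot_12": "empty lot"}
observed = {"elm_st": "street", "lot_12": "parking garage"}

delta = diff_map(cached, observed)
assert delta == {"lot_12": "parking garage"}  # only the change is reported

cached.update(delta)  # update the 3D map with those changes
assert cached["lot_12"] == "parking garage"
```

Transmitting only the delta rather than re-mapping the whole scene is what makes the approach practical on a power- and bandwidth-constrained wearable.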
It will be interesting to see the results of FRL’s research and pilot programs. For the full breakdown on Project Aria, visit the project’s website.
Video / image credit: Facebook Reality Labs / Facebook
About the author
Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he has been covering XR industry news for the past five years.