In General XR News
May 16, 2025 – This week, Meta announced several accessibility-focused updates to its products and research initiatives, including new features for Ray-Ban Meta glasses and expanded deployment of assistive AI tools.
Starting this month, users in the U.S. and Canada will gain access to a new Meta AI feature on Ray-Ban Meta smart glasses that provides more detailed descriptions of the user’s surroundings. The feature is designed to deliver real-time contextual information and is particularly useful for users who are blind or have low vision. It can be activated in the Meta AI app under Accessibility settings, with wider availability planned for additional markets.
In partnership with Be My Eyes, Meta also stated that its Call a Volunteer feature will roll out in all 18 countries where Meta AI is currently supported. The service connects individuals who are blind or have low vision with a network of sighted volunteers who assist with everyday tasks via live video support.
Advancing Accessibility Through sEMG Research
Beyond consumer-facing features, Meta also highlighted its ongoing research into improving human-computer interaction for users with physical disabilities. The company is continuing development of its surface electromyography (sEMG) wristband, which interprets muscle signals at the wrist to enable device control even for users with tremor, paralysis, or limited mobility. The wristband is being tested as part of Meta’s Orion AR glasses prototype.
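Meta has not published implementation details for the wristband, but the general principle behind sEMG input is well established: electrical activity from wrist muscles is sampled, features are extracted over short windows, and a classifier maps those features to input commands. The minimal Python sketch below illustrates that idea only; the channel count, window size, gestures, and synthetic data are all illustrative assumptions, not Meta’s implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

RNG = np.random.default_rng(0)
CHANNELS = 8   # hypothetical number of wrist electrodes
WINDOW = 200   # samples per analysis window (e.g. 100 ms at 2 kHz)

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel: a classic sEMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def synth_window(gesture: int) -> np.ndarray:
    """Synthetic stand-in signal: each 'gesture' activates one channel more."""
    scale = np.ones(CHANNELS)
    scale[gesture % CHANNELS] = 3.0
    return RNG.normal(0.0, scale, size=(WINDOW, CHANNELS))

# Build a small labeled training set of feature vectors.
X, y = [], []
for gesture in range(3):          # three hypothetical gestures
    for _ in range(100):
        X.append(rms_features(synth_window(gesture)))
        y.append(gesture)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Classify a new incoming window as one of the learned gestures.
print(clf.predict([rms_features(synth_window(1))]))  # -> [1]
```

Real systems are considerably more involved (per-user calibration, filtering, neural decoders), but the window-features-classifier pipeline above is the common skeleton.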
Meta confirmed it has completed recent data collection with a clinical research organization to evaluate sEMG use by individuals with Parkinson’s disease and essential tremor. In addition, the company has an active research collaboration with Carnegie Mellon University focused on enabling sEMG-based control for users with spinal cord injuries. According to Meta, participants in that program were able to begin using the system for interaction on their first day.
Improving Communication Tools Across XR and Social Platforms
In a recent blog post recognizing Global Accessibility Awareness Day, Meta also noted several ongoing efforts to improve communication accessibility across its extended reality ecosystem. Live captions are available at the Quest system level, as well as in Horizon calls and Horizon Worlds. A live speech feature that converts typed text into synthetic audio has also seen high adoption, with recent updates adding message personalization and quick access to frequently used phrases.
Finally, Meta pointed to use cases for its open-source Llama AI models in supporting accessibility. One example is Sign-Speak’s integration of Llama into a WhatsApp chatbot that enables communication between Deaf and hearing individuals by translating American Sign Language into text, and rendering text back into sign language via an avatar.
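Sign-Speak’s pipeline is not public, but a sketch can show where an open Llama model fits in such a system: a dedicated computer-vision model recognizes signing and emits a text representation (often an ASL "gloss"), and the language model turns that gloss into fluent English. The snippet below is a hedged illustration using Hugging Face’s transformers library; the model choice and the upstream gloss are assumptions, and the sign-recognition and avatar steps are deliberately stubbed out.

```python
# Illustrative sketch only: Sign-Speak's actual pipeline is not public.
# Assumes an upstream sign-recognition model has already produced an
# ASL gloss string; Llama turns that gloss into fluent English.
from transformers import pipeline

# Hypothetical choice of an open Llama instruct model (gated; requires access).
generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",
)

def gloss_to_english(asl_gloss: str) -> str:
    """Ask the model to render an ASL gloss as natural English."""
    messages = [
        {"role": "system",
         "content": "You translate ASL gloss into fluent English."},
        {"role": "user", "content": asl_gloss},
    ]
    out = generator(messages, max_new_tokens=60)
    # Chat-format input returns the full message list; take the new reply.
    return out[0]["generated_text"][-1]["content"]

# Example: a gloss produced upstream by a (hypothetical) vision model.
print(gloss_to_english("STORE I GO NEED MILK"))
```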
Meta stated that it remains committed to investing in features and products that make connection easier for users.
Image/video credit: Meta
About the author
Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he has been covering XR industry news for the past seven years.