Category: Platform

  • XRHealth launches Virtual Reality telehealth clinic to provide VR therapy to patients

    In Virtual Reality News

    February 14, 2020 – XRHealth (formerly VRHealth), a provider of extended reality and therapeutic applications, has announced this week the launch of its first virtual reality telehealth clinic that will provide VR therapy to patients.

    VR telehealth clinicians providing care are currently certified in Massachusetts, Connecticut, Florida, Michigan, Washington D.C., Delaware, California, New York, and North Carolina and will be expanding their presence in additional states in the coming months, according to the company. The XRHealth telehealth services are covered by Medicare and most major insurance providers.

    XRHealth utilizes the power of virtual reality to rehabilitate patients via an immersive and engaging experience in the comfort of their own home. The company combines therapeutic software with VR technology solutions that can treat a variety of health conditions. The VR therapy helps transport patients to an environment where they can view and experience treatment as a fun activity, increasing patient participation in prescribed therapeutic treatments.

    “XRHealth is modernizing and revolutionizing the way healthcare is operating today,” said Eran Orr, CEO of XRHealth. “We are utilizing the most advanced forms of technology like virtual reality to provide patients with optimal care in the comfort of their own homes while providing top-notch clinicians with ongoing status of their progress. Patients can now ‘go’ to a virtual clinic without the need to leave their homes at all.”

    The XRHealth VR telehealth clinicians will provide an initial patient assessment, ship a VR headset to patients who do not currently have access to one, train them on how to use the technology, provide ongoing telehealth care and remote monitoring, using video call and VR technology, and manage the insurance billing for patients.

    As the patient is using the XRHealth VR technology for therapeutic treatment, the clinical staff can control the unit remotely and see exactly what the patient is viewing and adjust the settings and treatment in real-time. After the initial training session, the patient can then use the headset independently while data from the therapy is stored and analyzed so that clinicians can monitor patient status regularly while in compliance with the HIPAA privacy rules. Once a week, a report will be generated to the payer/provider that referred the patient.
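
    The weekly reporting step described above amounts to aggregating stored session telemetry by calendar week. A minimal sketch in Python, using a hypothetical, simplified record shape (XRHealth has not published its actual schema):

```python
from collections import defaultdict
from datetime import date

def weekly_report(sessions):
    """Summarize therapy sessions per ISO week.

    `sessions` is a list of (date, minutes) tuples -- an assumed,
    simplified stand-in for the richer telemetry a real clinic stores.
    """
    weeks = defaultdict(lambda: {"sessions": 0, "minutes": 0})
    for day, minutes in sessions:
        iso_year, iso_week, _ = day.isocalendar()
        weeks[(iso_year, iso_week)]["sessions"] += 1
        weeks[(iso_year, iso_week)]["minutes"] += minutes
    return dict(weeks)

report = weekly_report([
    (date(2020, 2, 10), 25),
    (date(2020, 2, 12), 30),
    (date(2020, 2, 18), 20),  # falls in the following ISO week
])
```

    Each per-week entry could then be serialized into the report sent to the referring payer/provider.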

    Patients who want virtual reality therapeutic care from the XRHealth telehealth clinic can seek treatment for the following conditions:

    • Traumatic brain injury and stroke rehabilitation;
    • Stress, anxiety, memory decline;
    • Chronic pain, acute pain, pain distraction, pain syndromes;
    • Hot flashes and night sweats;
    • Neck, shoulder, spinal cord injuries and neurological disorders.

    The XRHealth VR telehealth clinics will open on March 1st and patients can join by submitting a request to enroll for the XRHealth services on the company website.

    Image credit: XRHealth

  • Arvizio launches XR Connect platform, enabling multi-location edge and cloud collaboration for enterprise XR

    In Augmented Reality, Virtual Reality and Mixed Reality News

    February 13, 2020 – Arvizio has today announced the launch of XR Connect, a suite of services for secure multi-location edge and cloud collaboration for enterprise XR (Extended Reality). The Arvizio XR Platform offers a complete solution for optimization, visualization and collaboration using 3D CAD, BIM, LiDAR, 3D scans and photogrammetry models.

    Arvizio states that its XR Connect services will allow multiple XR users to interact over wide area networks including both fixed broadband and mobile 4G/5G connections. Arvizio’s specialized servers offer WAN-optimized data transfers, spatial anchor sharing and synchronization of shared content together with real-time audio and video communications. Edge and cloud hybrid architecture means that sensitive 3D model data is stored securely on location and relayed to XR devices on demand over encrypted wide area services.

    The Arvizio XR platform has three key elements:

    1. The Arvizio XR Immerse app that runs on Augmented Reality (AR), Mixed Reality (MR) or standalone Virtual Reality (VR) devices;
    2. Arvizio XR Director which is used to prepare, optimize and serve 3D models and associated content from a PC or edge server; and
    3. Arvizio XR Connect which provides connectivity and sharing services across locations. 

    Arvizio XR Director allows fully automated or user-guided workflows to reduce the size of large 3D models and point clouds for display on AR devices and headsets. In addition, hybrid rendering capabilities allow models to be streamed to the headsets from GPU equipped PCs and edge servers using Nvidia, AMD or Intel GPUs. Along with seamless integration with Autodesk Revit, BIM 360 and Fusion 360, users can combine CAD/BIM models with point clouds in the same scene to visualize placement of a model in a real-world LiDAR scan or photogrammetry model. Real time IoT data can be displayed and augment a 3D scene to provide an XR digital twin.
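
    The size-reduction workflow described above is, in essence, point-cloud decimation. A generic voxel-grid downsampling sketch in Python (illustrative only; Arvizio's actual pipeline is proprietary and certainly more sophisticated):

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Collapse all points falling in the same voxel to their centroid.

    A generic sketch of the kind of automated decimation applied to
    large point clouds before display on AR devices and headsets.
    """
    # Map each point to an integer voxel index.
    idx = np.floor(points / voxel_size).astype(np.int64)
    # One representative (the centroid) per occupied voxel.
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    out = np.empty((counts.size, points.shape[1]))
    for dim in range(points.shape[1]):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

cloud = np.random.rand(100_000, 3)      # dense synthetic scan in a unit cube
reduced = voxel_downsample(cloud, 0.1)  # at most 10^3 = 1,000 points survive
```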

    “New augmented reality services with AR content streaming will require scalable edge and cloud architectures in order to take advantage of the power of 5G networks,” said Jonathan Reeves, CEO at Arvizio. “Our XR Connect services, along with advanced tools, hybrid rendering, and support of multiple devices leverage new 5G deployments and enable our customers to visualize and implement project solutions faster than they’ve been able to in the past. This is particularly important in the AEC, mining, energy and manufacturing industries where accuracy and time to completion directly impact the bottom line.”

    Arvizio has also released three new tiers of the XR Platform to allow customers to choose a solution scaled to best fit their organization. All three packages include XR Director, the Immerse XR app and Immerse Mobile AR.

    The XR Platform Innovator Edition provides the ability to walk through designs at life size, align models with the real world for AR and MR digital twin scenarios and synchronize the view for participants at a single location. The Pro and Pro Plus Editions extend this set of capabilities to conduct virtual design reviews and collaborative meetings with audio bridging and two-way video for multiple participants across locations, according to the company.

    Arvizio will be offering live demonstrations of its XR Platform at the Industrial VR/AR Forum, February 25 – 26 in Houston. The company had also been due to showcase the platform in conjunction with Qualcomm at MWC in Barcelona this year; however, as of yesterday, GSMA, the organizer of MWC Barcelona, announced that this year’s conference has been cancelled due to fears over the spread of Coronavirus.

    Image credit: Arvizio

  • Vuzix and TensorMark collaborate to integrate facial and object recognition into Vuzix Blade Smart Glasses

    In Augmented Reality News

    February 13, 2020 – Vuzix Corporation has announced a partnership with US based TensorMark, a cloud-based AI and computer vision technology provider, to integrate their two solutions on the Vuzix Blade Smart Glasses.

    TensorMark has been developing an interface specifically for the Vuzix Blade Smart Glasses that takes advantage of the physical attributes of the device, including the color display and on-board camera, to integrate TensorMark’s cloud-based AI and computer vision technology and more specifically facial and object recognition. The company’s AI and computer vision technology has applications developed for verticals including loyalty and retail; corporate and personal security and access control; and banking services and Fintech.

    With the TensorMark technology and the Vuzix Blade Smart Glasses, customers will be able to ID facial and object images hosted on a cloud database that can be customized and catered specifically for every client. Security camera output, as well as drone aerial footage, can be analyzed by the TensorMark backend system to provide important information and send alerts directly to the display on the Vuzix Blade, providing real-time digital intelligence to customers. TensorMark is also working on adding the ability to predict behavior patterns using emotion detection to provide clients with even more vital information with its AI algorithm.

    “We are thrilled about our partnership with Vuzix to bring our AI and computer vision technology suite to the Vuzix Blade Smart Glasses,” said J.P. Weston, CEO and co-founder of TensorMark. “The combination of the Vuzix Blade with our facial and object recognition backend will open up very significant business opportunities for both companies across numerous market verticals that include border patrol, first responders, hospitality, retail, and banking.”

    Paul Travers, CEO and president of Vuzix, commented: “The TensorMark solution, working in unison with the Vuzix Blade Smart Glasses and leveraging the ongoing developments in 5G and edge computing, will be a recipe that will disrupt the personal and professional security marketplace.” Travers added, “We are excited to partner with TensorMark to address the growing interest in the Vuzix Blade Smart Glasses across the security market.”

    The companies are collaborating on numerous facial and object recognition proof of concept demos centered around the Vuzix Blade Smart Glasses, including one with a Fortune 50 company that is evaluating the solution for deployment across their customer base, leveraging AI and edge computing to provide enhanced intelligence to security personnel.

    Image credit: TensorMark

  • Magic Leap announces Developer Access Programs

    In Mixed Reality News

    February 12, 2020 – Magic Leap has today announced its ‘Access Hardware’ program, the first of a series of formalized programs aimed at providing more access to hardware, funding, enterprise customers, and support for developers across the globe. 

    Access Hardware is an initiative to put Magic Leap devices in the hands of developers who are serious about publishing applications with Magic Leap. Developers will get support from Magic Leap’s developer relations team, and once an app is ready it will be prioritized for publishing, according to the company. 

    During the application process, Magic Leap states that it will look for things like technical experience, project feasibility, and the overall quality of submissions. The company is particularly interested in ideas that solve problems and create opportunities for enterprise markets and customers. Interested developers can apply here for the Access Hardware program.

    Magic Leap has also announced the following today:

    LEAP Developer Days
    On May 19-20 and 21-22, the company will host two groups of developers at its HQ in Florida to provide behind-the-scenes access to help accelerate the development and growth of their businesses. Developers will meet face-to-face with members of the company’s developer relations, studios, and go-to-market teams, and will have the chance to interact with the engineers and designers building the core Magic Leap hardware and platform. Developers will have direct access to Magic Leap technical, creative, and business leads and hands-on workshops. Applications for LEAP Developer Days will open later this month.

    Access Enterprise Customers
    Later this year, Magic Leap will be rolling out programs for agencies who specialize in building custom applications for enterprise customers or brands, and for those developers creating their own software solutions to sell directly to business customers. The focus of these Access Enterprise programs will be to help developers sell their solutions to businesses or leverage their creative talents in building the next brand experience or custom application.

    Access Funding and Support
    Magic Leap has stated that later this year it will announce more details about how it will be providing access to funding with the next evolution of its grant program. The company has also stated that it will unveil programs to support enterprise-focused developers and agencies in connecting with customers and clients.

    Based on the success of Magic Leap’s Independent Creator Program, which saw the deployment of USD $10 million in funding to 33 developers worldwide, the program will be evolving this year to help financially support more developers in its ecosystem, according to the company.

    Image credit: Magic Leap

  • Oxford VR announces GBP £10 million in Series A funding for its Virtual Reality therapy solution

    In Virtual Reality News

    February 12, 2020 – Oxford VR has today announced the closure of a landmark GBP £10 million Series A funding round led by Optum Ventures and supported by Luminous Ventures. Existing investors including Oxford Sciences Innovation, Oxford University Innovation and GT Healthcare Capital Partners also participated in the investment round. The capital infusion will enable Oxford VR to accelerate the US expansion of its automated VR therapy solution and to continue to expand its treatment pipeline.

    CEO Barnaby Perks said: “We are tremendously excited to close this investment round and to be working with Optum Ventures to drive our next level of growth. We would not be at this exciting tipping point without the collective efforts of the team at OVR, in particular Katie Bedborough, our CFO & COO. Together with Optum Ventures and Luminous Ventures, and with the continued support from our existing investors, we can expand our clinical leadership footprint and accelerate our pipeline of automated VR therapy treatments.”

    Founded in 2017, Oxford VR is a spinout from Oxford University’s Department of Psychiatry and builds on two decades of research by Professor Daniel Freeman investigating VR’s potential to create powerful, automated psychological treatments that revolutionise the way people experience therapy. Oxford VR’s first clinical trial for fear of heights, which was published in The Lancet Psychiatry, showed how automated VR therapy can produce large clinical benefits. The trial also demonstrated automated VR therapy’s capacity to transform mental healthcare by helping overloaded providers expand access and standardise clinical excellence, ensuring adherence to treatment protocols.

    According to the 2019 Lancet Commission report on Global Mental Health, mental disorders are on the rise in every country around the world and will cost the global economy USD $16 trillion by 2030. From access challenges, poor outcomes, the high cost of care, low patient engagement rates and a shortage of skilled clinicians, there are huge unmet needs in mental healthcare. Oxford VR states that the investment recognises the “significant potential” of its automated VR therapy to address many of these challenges, and attests to the company’s existing work delivering automated VR therapy in a real-world setting in the UK’s NHS.

    Along with today’s investment announcement, Oxford VR has also announced that Ash Patel, Principal at Optum Ventures will be joining the company’s Board of Directors. “Oxford VR has taken a technology-led approach to create evidence-based solutions that will make treatment more accessible to patients who need it,” said Patel. “We believe Oxford VR’s solutions will benefit those who need access to high quality, effective cognitive behavioural therapy.”

    In addition to providing VR therapy to NHS patients, Oxford VR is participating in several trials. In the UK, the NHS-funded gameChange project is the first large-scale multi-site trial to use VR therapy to treat patients with serious and complex mental health conditions.

    In Asia, Oxford VR has partnered with AXA Hong Kong and The Chinese University of Hong Kong (CUHK) in a pilot to test VR’s potential to support better mental health outcomes. In the US, OVR has established a strategic partnership with the National Mental Health Innovation Centre (NMHIC) where it is running multiple pilots using VR therapy treatment programmes to advance mental health outcomes in the US.

    Izzy Fox, Principal of Luminous Ventures, commented: “Oxford VR has taken world-class science from Oxford University and applied cutting-edge immersive technology to create a transformational mental health solution which can deliver significant value for overloaded healthcare programs globally. Immersive therapy is accessible, engaging and effective, and has demonstrated exceptional clinical outcomes and we are thrilled to be partnering with the Oxford VR team.”

    Video credit: Oxford VR/Vimeo

  • ThirdEye partners with 3D Media to improve training for U.S. Air Force flight crews

    In Augmented Reality and Mixed Reality News

    February 11, 2020 – ThirdEye, a provider of augmented reality (AR) and mixed reality (MR) enterprise solutions and devices, has today announced its partnership with 3D Media, a technology development firm that specializes in virtual and augmented reality solutions for enterprise.

    The partnership will see the two companies work together to improve training and human performance for flight line maintainers and flight crews working with the B-1 Lancer aircraft in the U.S. Air Force’s 7th Bomb Wing. ThirdEye states that its X2 MR Glasses, which weigh just 300 grams, will provide personnel with improved safety, efficiency and proficiency when working on the mechanical structures of the aircraft, known as ‘airframes’.

    In November 2019, 3D Media was awarded a USD $1 million Small Business Innovation Research (SBIR) grant from AFWERX to build AR tools to improve training for airmen. According to ThirdEye, its track record of execution and success in military environments was a key reason that its X2 MR Glasses were chosen for research and development purposes, along with a larger deployment into the Air Force and U.S. Department of Defense.

    The X2 MR Glasses’ industrial capabilities, as well as their platform built for working applications, and ability to connect with subject matter experts in harsh environments, are all features that ThirdEye states align with the military’s needs. By wearing the X2 MR Glasses, flight crews and flight line maintainers can connect with experts in any environment via the glasses’ built-in proprietary 3D SLAM (simultaneous localization and mapping) system and CAD modeling and overlay, where step-by-step technical instructions and drawings can be projected onto the X2 MR Glasses display. The glasses also take screenshots and enlarge images for better visibility. Flight crews and maintainers are then able to open and view documents via voice command, while working hands-free, without worrying about Wi-Fi connection, thanks to the X2’s 5G capabilities.
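
    The overlay step described above reduces, at its core, to projecting CAD geometry into the wearer’s view using the camera pose that the SLAM system estimates. A generic pinhole-projection sketch (not ThirdEye’s actual system; the intrinsics below are illustrative):

```python
import numpy as np

def project_points(points_world, R, t, fx, fy, cx, cy):
    """Project 3D points (world frame) to pixel coordinates.

    R, t: camera pose from the SLAM tracker (world -> camera).
    fx, fy, cx, cy: pinhole intrinsics (the values used below are
    made up; a real headset camera is individually calibrated).
    """
    cam = points_world @ R.T + t          # world frame -> camera frame
    u = fx * cam[:, 0] / cam[:, 2] + cx
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.stack([u, v], axis=1)

# A CAD anchor point 2 m straight ahead of an identity-pose camera
pix = project_points(np.array([[0.0, 0.0, 2.0]]),
                     np.eye(3), np.zeros(3),
                     500.0, 500.0, 320.0, 240.0)
# lands on the principal point (320, 240)
```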

    “As more of the world’s military adopt AR and MR solutions to better support the training and work of personnel, we’re honored to be working with 3D Media to contribute to the U.S. Air Force’s efforts and increase the safety of our airmen,” said Nick Cherukuri, Founder and CEO of ThirdEye. “From our previous work in the military, ThirdEye’s X2 MR Glasses have undergone drop tests, so we know it’s ready to support airmen however needed, no matter where they are.”

    The X2 MR Glasses feature a wide field of view and sensors that provide advanced MR features that are not available on a monocular device. Additionally, the glasses are entirely hands free, which the company states is important for being out in the field where wires can be a potential hazard. The X2 MR Glasses also run on the latest Android operating system, which allows software to be easily ported onto the glasses.

    Commenting on the announcement, Daryl Roy, CEO and Founder of 3D Media, said: “When we go into a project, we never assume what the outcome is going to be – the answer comes from collaborating with other experts. By partnering with ThirdEye, we’re arming airmen with X2 MR Glasses to significantly improve their day-to-day”. He added, “I believe that augmented reality’s largest opportunity is in the area of human performance, where there’s no margin for error. ThirdEye has proven to be successful in military environments, and with life and death literally on the line, it’s important to arm our military personnel with the best.”

    Earlier this year, ThirdEye announced that its X2 Mixed Reality Glasses were mass shipping worldwide, along with the availability of hand tracking and gesture controls for the X2 glasses – features that will no doubt come in useful for flight line maintainers using the devices as part of today’s partnership.

    Image credit: ThirdEye

  • U.S. Army using Virtual Reality to help Soldiers shape hypersonic weapon prototype

    In Augmented Reality, Virtual Reality and Mixed Reality News

    February 10, 2020 – Using virtual reality, U.S. Army Soldiers from Fort Sill, Oklahoma are getting a rare look at components of the Army’s new prototype Long Range Hypersonic Weapon (LRHW) and influencing how the system is designed.

    The U.S. Army stated that through a mix of virtual reality, augmented reality and mixed reality technologies, Soldiers last month were able to walk around and “touch” the Army’s new prototype LRHW system as an interactive, true-to-scale, three-dimensional model. 

    Inside the mixed reality lab, known as the Collaborative Human Immersive Laboratory, or CHIL, the Soldiers could view the equipment from any angle, at any distance and manipulate it as needed in order to better understand its operation and recommend improvements.

    “We were able to stand as a group around an area called ‘the cave,’ which allowed all of us to see, in 3D, through special eyewear, the Transporter Erector Launcher and missile as one,” said LTC Aaron Bright, the chief of the Operational Training Division of the Directorate of Training and Doctrine at Fort Sill. “I was able to grab pieces of the LRHW with my hands and move them weightlessly to the side to get a better look at another part, and to better understand how the system as a whole works. The kinds of things that would take hours with a crane, and several more hours with tools, we were doing on our own in seconds.”

    While hypersonics is often considered a futuristic, complex technology, the input received focused on seemingly low-tech items that are critical to Soldiers’ operational experience, such as generator placement and access, excess equipment that could be removed to save weight, generator exhaust routing, and specific locations for skid plates.

    As the prototype is built, this early Soldier feedback will help the U.S. Army identify any quick-fix flaws, as well as offer ways to improve the operational capacity. The system consists of a 40-foot Transporter Erector Launcher (TEL) with missiles and a Battery Operations Center (BOC). The truck and trailer combination, and the BOC, are all taken from existing Army stock, and are in the process of being modified to create new equipment that’s never been used in this way before.

    “You can apply virtual reality and augmented reality to almost any concept the Army or other component has and gain vital feedback,” said 1st Sergeant Michael Weaver, with the 1-31st Field Artillery Battalion, 434th Field Artillery Brigade at the Fort Sill Fires Center of Excellence. “Identifying potential issues early on in the development process is crucial because it is easier and cheaper to adjust design during the concept phase as opposed to production.”

    The Army Rapid Capabilities and Critical Technologies Office (RCCTO) is charged with delivering the prototype LRHW to a battery no later than fiscal year 2023. This aggressive prototyping schedule, which pushes the Army’s initial hypersonic capability delivery ahead by two years, leaves no time to wait for the hardware to be modified and integrated before gathering Soldier feedback. Virtual reality fills that void by enabling Soldier touch points on an early and regular basis.

    “We have a very tight timeline with the LRHW,” said COL Ian Humphrey, integration project manager for the RCCTO’s Army Hypersonic Project Office. “We have to make it safe and we must meet very hard requirements. Although the LRHW is a prototype, the Soldier feedback we get here provides operational input early in the process. This is not only to help inform the LRHW, but also aid in the development of the Army’s hypersonics program of record.”

    The mixed reality CHIL enables real-time collaboration through equipment including virtual reality headsets, 3D glasses, holograms, and handheld controllers. The facility is owned by Lockheed Martin, which is under contract to deliver the All Up Round plus Canister (AUR+C), which includes the missile stack, the Common Hypersonic Glide Body, and canister. The company also serves as the LRHW prototype system integrator.

    Soldiers will be involved throughout the process and as more integrated and modified hardware becomes available, they’ll get a chance to walk around the real system. Plans are also in the works to create a CHILNET, which would allow remote sites to utilize the simulations and interact in real-time from multiple locations.

    Image credit: U.S. Army/Lockheed Martin

  • LlamaZOO announces new appointments to its board of directors

    In Virtual Reality News

    February 10, 2020 – Spatial business intelligence and 3D data visualization company, LlamaZOO Interactive Inc., has today announced that it is preparing for the next phase of its growth with new appointments to its board of directors.

    The new additions to the board of the Victoria-based company, which uses 3D visualization to help industries such as mining and forestry make better decisions from complex spatial data, are Hannes Blum, Venture Partner at Acton Capital Partners, Jeff Booth, former President and CEO of BuildDirect, and Edoardo De Martin, General Manager of Microsoft Vancouver.

    LlamaZOO CEO, Charles Lavigne, said: “The expertise and insight that Hannes, Jeff and Edoardo bring to LlamaZOO will be invaluable for guiding the company as it continues its rapid growth and expansion into new markets.” Lavigne added, “We are very fortunate and appreciative that they have agreed to join the board to lend their experience to our operations.”

    Edoardo De Martin has worked extensively across gaming and visualization technology, with roles at BlackBox Games, Electronic Arts and Next Level Games. As Studio Manager for Microsoft Victoria, his team worked on the then-secret Hololens project, and in 2015 he was appointed General Manager of Microsoft Vancouver, the firm’s only Canadian development centre. De Martin is heavily involved in British Columbia’s tech ecosystem, and is on the boards of Synthiam, Canada’s Digital Technology Supercluster, and the BC Tech Association. Commenting on his appointment, Edoardo De Martin said: “I am very excited to serve on the board alongside a world class team with a real vision to deliver a 3D spatial analytics platform”.

    Hannes Blum has spent his career immersed in the business of new technologies, beginning with Boston Consulting Group in 1996. He went on to found JustBooks, which later merged with Abebooks, where as CEO and President he structured and navigated an acquisition by Amazon.com. In addition to his role as Venture Partner at Acton Capital Partners, Hannes has been involved at the board level with a number of companies including Mobify and Chefs Plate, which was acquired by HelloFresh.

    Jeff Booth is the former President and CEO of BuildDirect, a Vancouver company that went from connecting home-reno buyers and sellers with 6,000 products to over 150,000 products after launching Home Marketplace in 2016. Today Jeff is on several boards of growing companies including Terramera, which recently closed a USD $45m funding round, and is a founding partner at Oitolabs in Bengaluru, India – a global in-house product development centre which builds world-class teams for clients.

    Image credit: LlamaZOO

  • NexTech launches 3D AR Ad Network

    In Augmented Reality News

    February 7, 2020 – NexTech AR Solutions has announced this week that it has launched its 3D/AR Ad Network, offering an end-to-end solution for both advertisers and brands.

    NexTech’s end-to-end solution includes the creation of 3D assets, online 3D/AR display ads, WebAR 3D product views, and education & training, all in a frictionless and seamless 3D virtual environment, which the company anticipates will lead to an uptick in ROI.

    Evan Gappelberg, CEO of NexTech AR, commented: “We believe NexTech’s first mover advantage in 3D ads, can help us capture market share in the $240 billion online advertising market – where global brands constantly seek an edge in this highly competitive space.” He added, “Our team is super excited to continue to leverage our technology into new and exciting multi-billion dollar industries like advertising. Our existing customers have been enthusiastic about working with us on 3D/AR ads and so we expect to hit the ground running with campaigns”.

    NexTech’s 3D/AR advertising platform created for brands, publishers, and developers helps its customers power immersive advertising across all browsers and devices, on the web and mobile.

    The ad network is part of the company’s AR omni-channel platform approach which includes: AR for eCommerce, AR in Chat, its ARitize App and AR University for education and training. With this new ad network NexTech will continue to leverage all its current 3D asset creation technology and relationships into 3D/AR ads, which the company hopes will open up a new revenue channel in 2020.

    According to NexTech, the biggest challenge in creating 3D ads is the creation of the 3D asset itself. NexTech’s solution will therefore offer both the creation of 3D assets as well as serving up the ads. The company also added that interactive 3D ads deliver consistently better click through rates than flat 2D ads, and work especially well on mobile phones.

    Image credit: NexTech

  • Google announces Glass Enterprise Edition 2 now available for developers

    In Augmented Reality News

    February 5, 2020 – Google has announced that its Glass Enterprise Edition 2 is now available for developers. Glass Enterprise Edition 2 has helped people working in logistics, manufacturing, field services and a variety of other industries do their jobs more efficiently through hands-free access to the information and tools they need. According to Google, enterprises that have deployed Glass with experiences built by its network of solution providers have seen faster production times, improved quality, and reduced costs.

    Since Glass Enterprise Edition 2 launched last May, Google states that it has seen strong demand from developers and businesses who are interested in building new, helpful enterprise solutions for Glass. In order to make it easier for them to start working with Glass, they can now purchase devices directly from Google hardware resellers, such as CDW, Mobile Advance or SHI.

    Glass Enterprise Edition 2 is built on Android, which allows developers to work with a familiar platform, and businesses to integrate the services and APIs (application programming interfaces) they already use. Google has also shared new open source applications and code samples, including sample layouts and UI components that may be helpful examples for those just getting started developing for Glass.

    Image credit: Google

  • French startup Lynx announces the Lynx R-1 XR headset powered by Qualcomm XR2 chipset

    French startup Lynx announces the Lynx R-1 XR headset powered by Qualcomm XR2 chipset

    In Augmented Reality, Virtual Reality and Mixed Reality News

    February 4, 2020 – Yesterday at Photonics West 2020, French startup, Lynx, announced the launch of its new standalone mixed reality headset – the Lynx R-1.

    The R-1, which is powered by the Qualcomm Snapdragon XR2 chipset, is primarily aimed at enterprise usage and is capable of both virtual and augmented reality immersion. Lynx has been working with Qualcomm for the last six months to integrate the XR2 chipset into its hardware, and the company is preparing for the R-1 to be the first XR device that ships with the XR2 on board, according to Stan Larroque, CEO at Lynx. Compared with other devices that utilize Qualcomm’s Snapdragon 835 chipset, the XR2 offers substantial performance gains.

    In terms of design and specifications, the R-1 differs slightly from other VR headsets currently on the market in that it uses neither a Fresnel lens nor a bi-convex lens. Instead, Lynx has opted for a ‘4-fold catadioptric freeform prism’. In the context of virtual reality, this is not just a lens but a full optoelectronic system, with eye-tracking hardware embedded within the prism itself.

    There are four elements to the display optics, with a ‘microlens’ at the center. This is where the eye-tracking hardware is hidden, in plain sight so to speak. The prism actually displays four images; however, the user sees only a single image. Some pixels at the center of these four images are also duplicated on the screen, allowing for super-sampling at the center of the image as well as along the four borders between images.

    As a result, the R-1 offers a 90° field of view (FoV) with 18 pixels per degree (PPD) on two 1,600 x 1,600 LCD panels running at a refresh rate of 90Hz. The R-1 also features on-board cameras, and it is these that enable the device to offer augmented reality experiences by way of AR passthrough, wherein a live image feed of a user’s surroundings is projected onto the headset’s display and augmented elements are overlaid on top of this feed. Other design features include a battery pack located at the rear of the headset, which improves comfort through a more even weight distribution of the device.
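    The passthrough compositing described above, in which rendered elements are overlaid on a live camera feed, amounts to a per-pixel alpha blend. The sketch below is purely illustrative (it is not Lynx’s actual pipeline, and real headsets do this on the GPU per frame):

    ```python
    # Illustrative sketch of AR passthrough compositing (not Lynx's actual
    # pipeline): a rendered overlay is alpha-blended onto a live camera frame.

    def composite(frame, overlay, alpha):
        """Blend an overlay onto a camera frame, pixel by pixel.

        frame, overlay: lists of (r, g, b) tuples of equal length.
        alpha: per-pixel opacity of the overlay, 0.0 (transparent) to 1.0.
        """
        out = []
        for (fr, fg, fb), (vr, vg, vb), a in zip(frame, overlay, alpha):
            out.append((
                round(fr * (1 - a) + vr * a),
                round(fg * (1 - a) + vg * a),
                round(fb * (1 - a) + vb * a),
            ))
        return out

    # A 2-pixel "frame": one pixel fully covered by the overlay, one untouched.
    frame = [(100, 100, 100), (100, 100, 100)]
    overlay = [(255, 0, 0), (0, 0, 0)]
    alpha = [1.0, 0.0]
    print(composite(frame, overlay, alpha))  # [(255, 0, 0), (100, 100, 100)]
    ```

    Pixels where nothing virtual is rendered keep an alpha of 0, so the camera feed shows through unchanged; that is what makes passthrough read as “augmented” rather than fully virtual.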

    Talking in a product announcement speech at Photonics West, Lynx CEO, Stan Larroque, commented: “The kind of device we are bringing to the market is fully ready for the B2B market”, adding, “Every industrial company I talked to, they already have a use case for our device”. Larroque also noted that the best use cases he has seen to date for the R-1 include applications for surgery, and also gaming. However, he anticipates that the biggest market share will be with industry, and stated that the R-1 will be an alternative to the Microsoft HoloLens, as it offers some use cases that the HoloLens does not.

    You can in fact see some similarities between the R-1 and the HoloLens: the R-1 also has a flip-up design, allowing the wearer to quickly switch between immersive experiences and real life (although, presumably, the passthrough feature would accomplish the same goal). Other features of the R-1 include:

    Sensors
    Positional tracking – 2 B&W cameras
    Eye tracking – 2 IR cameras
    RGB Cameras – 2 visible light cameras
    Inertial Measurement Unit (IMU) – Accelerometer, gyroscope, magnetometer

    Audio
    Speakers – 2 stereo speakers
    Microphone array – 2 channels

    Tracking
    6DoF tracking – SLAM with world anchors
    Hand tracking – Two-handed gesture recognition
    Eye tracking – Low latency tracking

    Processing System
    SoC – Qualcomm Snapdragon XR2
    Memory – 6GB LPDDR5
    Storage – 128GB

    Connectivity
    Wi-Fi 6 (802.11ax)
    Bluetooth 5.0
    USB type C

    The Lynx R-1 will be available in summer 2020, and will cost USD $1,500. Preorders are available today, and require a USD $150 deposit. Larroque has expressed interest in partnering with games studios, with one big name partnership apparently already confirmed by Lynx. More info on this will be announced in the summer, according to the CEO.

    Video & image credit: Lynx/YouTube/Qualcomm

  • Virtualitics releases its ‘Virtualitics Immersive Platform’ 2020

    Virtualitics releases its ‘Virtualitics Immersive Platform’ 2020

    In Virtual Reality News

    January 31, 2020 – Virtualitics, Inc., a data analytics company that specializes in machine learning, artificial intelligence and 3D visualization, has this week announced the next major release of its flagship product, Virtualitics Immersive Platform (VIP). VIP 2020 enables users to obtain actionable insights from complex data.

    The cornerstone feature of VIP 2020 is its ‘Network Graph’ package. This capability allows users to automatically build interactive reports on trends, anomalies, and relationships hidden in unstructured data. Virtualitics is able to render 3D visualizations of network graphs and compute insights thanks to the platform’s algorithms.
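    As a rough illustration of the kind of analysis a network-graph package automates, here is a minimal Python sketch (purely illustrative; the function names and data are hypothetical, not Virtualitics’ API) that builds a graph from co-occurrence records and surfaces the most-connected entities as a simple “insight”:

    ```python
    # Illustrative sketch (not Virtualitics' actual API): build a network graph
    # from unstructured co-occurrence records and surface the most-connected
    # nodes as a simple "insight".
    from collections import defaultdict
    from itertools import combinations

    def build_graph(records):
        """Each record is a set of entities mentioned together; every pair
        that co-occurs gets an undirected edge with a co-occurrence weight."""
        edges = defaultdict(int)
        for record in records:
            for a, b in combinations(sorted(record), 2):
                edges[(a, b)] += 1
        return edges

    def top_nodes(edges, n=3):
        """Rank nodes by weighted degree (sum of incident edge weights)."""
        degree = defaultdict(int)
        for (a, b), w in edges.items():
            degree[a] += w
            degree[b] += w
        return sorted(degree, key=degree.get, reverse=True)[:n]

    # Hypothetical records extracted from unstructured text.
    records = [
        {"acme", "supplier-x", "port-of-la"},
        {"acme", "supplier-x"},
        {"acme", "port-of-la", "customs"},
    ]
    edges = build_graph(records)
    print(top_nodes(edges))  # ['acme', 'port-of-la', 'supplier-x']
    ```

    A product like VIP would go much further (3D layout, anomaly detection, interactive reports), but the underlying step of turning loose records into a weighted graph and ranking its nodes is the same shape of computation.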

    Ciro Donalek, CTO and co-founder of Virtualitics, said: “Working with unstructured data has been a pain point for business intelligence software users – until now. Virtualitics is the first software that offers a suite of analytics and visualization tools that are complete, easy-to-use and can render results in 3D. This translates into actionable insights within seconds.”

    VIP 2020 offers additional upgrades designed to enhance and streamline the user experience for both desktop and Virtual Reality users. The software now includes a column calculator, full control over colors and playback for compelling storytelling, tools to clean up data before running analyses, and connectors to all major data sources. Virtualitics states that its patented technology also helps deliver faster plotting, particularly for larger datasets. Further optimization and a revised user interface help to simplify running machine learning routines for all users, regardless of their analytic background. Furthermore, collaboration across desktop and VR allows multiple users to analyze data together in real time.

    “With VIP 2020, Virtualitics is revolutionizing the data analytics market,” said Michael Amori, CEO and co-founder of Virtualitics. “Our Fortune 500 and government clients will continue to benefit from VIP’s blend of features that emphasize innovation, usability and speed.”

    Image credit: Virtualitics

  • Amey launches Virtual Reality health & safety training program developed by Edg VR

    Amey launches Virtual Reality health & safety training program developed by Edg VR

    In Virtual Reality News

    January 31, 2020 – Public services provider Amey has announced the launch of a new safety training program for its utilities employees, based on a fully interactive virtual reality platform.

    The program has been developed in partnership with Edg VR and focuses on a series of virtual scenarios to enhance Amey’s induction process, including site safety, driver awareness and hazard perception.

    The platform uses 360-degree photos, videos, CGI features and interactive platforms to deliver bespoke health and safety training modules. It also provides an insight into employee performance using live scoring, assessment and analytics.

    Several training sessions have already taken place across the UK on the company’s contract with UK water company, Severn Trent, and feedback among Amey’s employees has so far been consistently positive.

    Andy Halsall, Managing Director of Amey’s Utilities Division, said: “I’m delighted to announce this new training package for the business. Edg VR’s technology can help us re-invent traditional methods of delivering health and safety training at work and help protect our people as they carry out their vital work across the UK.”

    Matt Smith, Managing Director of Edg VR, commented: “Edg VR’s range of health and safety training harnesses cutting-edge technology to transform safety in the workplace. The solution allows multiple users to learn, be assessed in real-time and receive industry recognised accreditation. We look forward to working with Amey to deliver this exciting new program.”

    Amey states that over the coming year, more than 1,600 of its employees will be enrolled in the course. It is hoped that the training will increase retention, reduce costs from site defects, fines and vehicle damage, and, most importantly, protect employees on the front line.

    Image credit: Amey/Edg VR

  • SA Photonics releases SA-62/E electronic see-through Augmented Reality HMD

    SA Photonics releases SA-62/E electronic see-through Augmented Reality HMD

    In Augmented Reality News

    January 30, 2020 – SA Photonics, a provider of photonics solutions for military and commercial applications, has announced the release of its SA-62/E augmented reality (AR) head-mounted display (HMD) with electronic see-through and almost no peripheral obscuration. The AR display provides high-contrast, bright imagery even when viewed outdoors, as well as full occlusion of objects that are behind other objects, according to the company.

    The video processing system, developed in partnership with SL Process (Paris, France), has low latency and provides simultaneous localization and mapping (SLAM), as well as hand tracking. The video can be stored for review or shared with teammates.

    SA Photonics states that the eyepieces of the SA-62/E were specifically designed to obscure less than 10% of the human visual field, allowing users to interact with people and objects around them while operating safely, since they can still see their feet and the ground.

    Dr. Michael Browne, SA Photonics’ General Manager, commented: “The SA-62/E provides a seamless transition between the real and augmented worlds. Our system will provide great benefits to users who require occlusion and operation in high ambient brightness environments.” Stan Larroque, CEO of SL Process, added: “We are excited to integrate our low latency video processing, SLAM and hand tracking architectures with the SA-62/E HMD. I believe this will be a revolutionary AR display.”

    The SA-62/E HMD will be demonstrated at the SPIE AR/VR/MR Conference in San Francisco on February 3-4 in Booth 9.

    Image credit: SA Photonics

  • Blippar appoints AR veteran Keith Curtin as Chief Commercial Officer, US

    Blippar appoints AR veteran Keith Curtin as Chief Commercial Officer, US

    In Augmented Reality News

    January 30, 2020 – Augmented Reality technology company, Blippar, has today named Keith Curtin as its Chief Commercial Officer for the US.

    Curtin started his AR career at Blippar in 2013 and then led Zappar’s commercial business in North America before founding his own consultancy firm, See Digital. According to Blippar, Curtin has consulted with hundreds of Fortune 1000 global brands to help them develop and execute their first augmented reality and mixed reality campaigns, including P&G, Coca-Cola, PepsiCo, Nestle, Disney, LVMH, Diageo and Kellogg’s.

    He will now be responsible for driving commercial growth in the US for Blippar’s AR creation and publishing platform, Blippbuilder, which enables brands, educators and individuals to create and publish their own AR experiences with no coding knowledge or skills required.

    Commenting on his appointment, Curtin said: “Blippar has become a global leader and pioneer in AR that spans across advertising, retail, FMCG, healthcare, education and content creation tools for the next generation of digital engagement. As we gear up for our next phase of growth, I’m thrilled to be returning to the company and look forward to moving our business and the AR ecosystem to the next level.”

    Blippar’s CEO, Faisal Galaria, commented: “I’m delighted to welcome Keith back to Blippar. His experience in the AR space is unprecedented. He helped Blippar scale through several early phases of its growth lifecycle and was instrumental in educating global brands about how to use pioneering AR in their campaigns. Blippar remains the pre-eminent brand in the industry and with Keith onboard we are building a truly world-class team to deliver our ambitious growth plans as the market for AR becomes increasingly mature.”

    Blippar’s mission is to make AR easy to create and accessible for everyone, and since launching in the UK in 2011, the company’s technology has been used by brands such as PepsiCo, Porsche, Pearson, Burberry, Cadbury, L’Oréal, GSK, Rugby World Cup and Procter & Gamble.

    Image credit: Blippar