Category: Location

  • Sense of Space launches ‘Sense XR’ platform for WebAR Hologram publishing

    Sense of Space launches ‘Sense XR’ platform for WebAR Hologram publishing

    In Augmented Reality News 

    June 2, 2021 – Sense of Space, a Helsinki-based company focused on the development of software for post-production and streaming of volumetric video-based holograms, has announced the launch of its ‘Sense XR’ platform, which allows creators with all levels of technical knowledge to edit hologram experiences and publish them to WebAR.

    The new release also brings to the market a WebAR player with native adaptive streaming for volumetric videos, which Sense of Space states guarantees that experiences run smoothly across different types of network connections, allowing end-users to see holograms with the best visual quality for their context.

    Sense XR is a WebAR hologram experience editing and publishing tool that helps designers, advertisers, artists, and content creators to develop immersive experiences that blend the real and the virtual world without the need to hire expensive and hard-to-find technical teams.

    “The most fascinating aspect about holograms is that it’s the next step for merging the real and the virtual worlds. With this technology, you can get things from the real world, like humans and animals, and put them in the virtual world exactly as they are, while also getting digital elements from the virtual world into the real one,” said Tuomo Paavilainen, Chief Technology Officer at Sense of Space.

    “It’s a completely new way of bringing someone into the virtual world. It’s different from avatars, as avatars are not really us, but holograms are. When we use holograms, we don’t simulate people, we are people,” added Victor Pardinho, Sense of Space founder and CEO.

    Sense XR includes features such as easy editing of holograms, audio, and sequential 3D assets; WebAR publishing with a single click; compression capabilities to make experiences smooth in most devices without compromising visual quality; and a WebAR player with native adaptive streaming for volumetric videos. The user interface helps to make the editing of holograms as straightforward as editing a 2D video, and the content created in Sense XR can be distributed to third-party tools, including real-time game engines like Unity, Unreal, and more.

    Sense of Space is aiming to tackle the challenges behind both hologram post-production and streaming. Volumetric video-based experiences are limited by the capabilities of existing wireless networks, and as most hologram experiences are notoriously data-heavy, they are best consumed over 5G. As a result, Sense of Space has developed a compression and optimization system, paired with its adaptive streaming feature, that allows creators to share their content with users on WiFi and 4G as well, enabling widespread accessibility to hologram experiences.
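
    To illustrate the idea behind adaptive streaming, the sketch below shows how a WebAR player might pick a volumetric video quality level from a measured download rate. It is a minimal, hypothetical TypeScript example; Sense of Space has not published its player API, and the rendition names and bitrates here are assumptions.

    ```typescript
    // Hypothetical bandwidth-based rendition selection for a volumetric video
    // stream. Illustrative only – not Sense of Space's implementation.

    interface Rendition {
      label: string;        // e.g. "high", "medium", "low"
      bitrateKbps: number;  // average encoded bitrate of the hologram stream
    }

    // Ordered from highest to lowest quality.
    const renditions: Rendition[] = [
      { label: "high", bitrateKbps: 25000 },  // roughly 5G-class connections
      { label: "medium", bitrateKbps: 8000 }, // good WiFi / strong 4G
      { label: "low", bitrateKbps: 2500 },    // constrained 4G
    ];

    // Estimate throughput from the last segment download, then pick the best
    // rendition that fits within a safety margin of the measured bandwidth.
    function pickRendition(lastSegmentBytes: number, downloadSeconds: number): Rendition {
      const measuredKbps = (lastSegmentBytes * 8) / 1000 / downloadSeconds;
      const safeKbps = measuredKbps * 0.8; // keep 20% headroom to avoid stalls
      const fitting = renditions.filter(r => r.bitrateKbps <= safeKbps);
      // Fall back to the lowest rendition if even that exceeds the estimate.
      return fitting.length > 0 ? fitting[0] : renditions[renditions.length - 1];
    }

    // Example: a 4 MB segment fetched in 1.5 s suggests ~21 Mbps, so "medium" is chosen.
    console.log(pickRendition(4_000_000, 1.5).label);
    ```

    Keeping some bandwidth headroom (20% in this sketch) is what lets playback degrade gracefully on 4G or congested WiFi instead of stalling.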

    Sense XR users also have access to a web editor that allows them to create a full WebAR experience with their published holograms. Users can add images, logos, 3D models, and other elements needed in a more complex experience, as well as create interactions between the assets, extending the possibilities of what can be created with Sense XR.

    “Sense of Space’s mission is to build tools for 3D content creation that are accessible both in terms of workflow and pricing, enabling creators from all walks of life to adopt holograms in their digital experiences. We are very close to living in a world where recording ourselves and others volumetrically will be as easy as making a video with our smartphone, and we want to support the endless creative possibilities that come with it. We want everyone to be able to create immersive and interactive experiences with their own realities,” said Sense of Space founder, Raquel Cardoso.

    For more information on Sense of Space and its Sense XR platform for WebAR hologram experiences, visit the company’s website.

    Image / video credit: Sense of Space

  • LEGOLAND Windsor Resort incorporates Augmented Reality into its new LEGO MYTHICA attraction

    LEGOLAND Windsor Resort incorporates Augmented Reality into its new LEGO MYTHICA attraction

    In Augmented Reality News 

    June 2, 2021 – LEGOLAND Windsor Resort has incorporated augmented reality (AR) as part of its launch of ‘LEGO MYTHICA: World of Mythical Creatures’, a brand-new IP and area of the park. Created using technology from Zappar, the AR functionality added to the LEGOLAND Windsor Resort app will help visitors immerse themselves in the new world as they roam around the area and learn more about the backstory of LEGO MYTHICA.

    Zappar enables brands to create and deploy immersive AR experiences to native apps, as well as to the mobile web where users do not need to download an app. According to the company, Zappar’s AR technology underpins the new LEGOLAND experience, and the incorporation of such immersive technologies in the park showcases how AR can further enhance the resort experience, with visitors using their smartphone cameras to embark on a virtual journey alongside their real-world trip.

    LEGOLAND Windsor Resort’s aim is to drive awareness of the new IP, increasing the audience’s understanding of the backstory and cast of characters, with AR activities helping to complement the physical installations throughout the park.

    The AR experiences are activated exclusively via the LEGOLAND Windsor Resort app, which users are encouraged to download before they enter the site. When roaming around the LEGO MYTHICA area, visitors will come across four LEGO statues of mythical creatures: a Chimera, an Alicorn, a Hydra, and a Sky Lion. Scanning the special plaques with branded MYTHICA codes that accompany these LEGO statues reveals the creatures’ real-world characters in AR, which appear through a vortex, bringing the MYTHICA realm to life before visitors’ eyes. Users can take selfies with the creatures, collect digital cards about the characters, and learn more about their backstory via the app.

    In addition, 3D portals can be placed either at home or in the park using the AR function, allowing users to visit four different locations and discover more about the LEGO MYTHICA world.

    Ash Tailor, VP Global Brand and Marketing Director for LEGOLAND explained: “Mythica is all about championing creativity and firing the imagination of kids who are our real heroes. Through the incredible flying theatre experience, a first for the UK, and the magic of AR we’re able to bring this brand new world to life for kids like never before at LEGOLAND Windsor and really show the journey from LEGO characters to mythical creatures.”

    Caspar Thykier, Co-founder and CEO at Zappar, commented: “This really is a dream project to work on with such an iconic brand, venue and new IP that lends itself so beautifully to augmented reality with its interplay between the real world and LEGO MYTHICA realm accessed via portals and vortexes. The challenge was to deliver spatial storytelling that could enhance and complement the incredible physical experience that is the cornerstone of a great day out at LEGOLAND Windsor Resort and really immerse family and kids in this wonderful new IP in a new way. We hope it surprises and delights everyone in equal measure.” 

    For more information on Zappar and its augmented reality experience creation platform, please visit the company’s website.

    Image credit: The LEGO Group

  • Vodafone Spain customers now able to purchase the Nreal Light Augmented Reality glasses

    Vodafone Spain customers now able to purchase the Nreal Light Augmented Reality glasses

    In Augmented Reality and Mixed Reality News

    May 31, 2021 – Vodafone Spain has announced that, starting tomorrow, June 1, its customers will be able to purchase the new Nreal Light augmented reality (AR) glasses. The glasses, which are designed for everyday use, connect to a smartphone via a USB-C cable in order to take advantage of the power of a next-generation Android device for processing tasks and providing an immersive AR experience for the wearer. 

    Vodafone added that it has developed, in collaboration with Virtual Voyagers and Optiva Media, ‘Vodafone 5G Reality AR’, an augmented reality application that allows users to extend their own reality. The solution offers a set of application widgets, allowing users to create their own virtual dashboard combining 2D and 3D screens. If an app is not available, users can simply open a browser window and view it through the glasses’ display instead.

    The Nreal Light smart glasses offer consumers an accessible and high-quality augmented reality and mixed reality experience, with a sophisticated knowledge of the wearer’s surrounding environment thanks to the glasses’ six degrees of freedom (6DoF) tracking, plane detection and image tracking.

    The glasses utilize two cameras, two microphones and two integrated speakers, allowing users to watch videos, surf the Internet and arrange dozens of screens within their field of view, with information being displayed on what appears to the wearer as the equivalent of a large transparent screen. Plus, users are able to carry on moving around whilst wearing the glasses.

    The Vodafone 5G Reality AR app will be available completely free of charge from the Google Play store and users will access and interact with it through Nebula, Nreal’s native 3D system for the Light glasses.

    Today’s announcement means that Vodafone becomes the first mobile operator in Spain to market a consumer AR device with real use cases thanks to an exclusivity agreement with Nreal. The mobile operator also stated that: “The advantages provided by the Vodafone 5G network are essential for the operation of these glasses, since speeds of up to 1Gbps and latencies of less than 10ms make the experience smooth and comfortable for the user.”

    Vodafone added that it will continue to develop other use cases to enhance the advantages of its 5G network that will be implemented in the coming months. 

    Nreal Light will be available from tomorrow, June 1 at vodafone.es/nreal. Customers can purchase the glasses at Vodafone physical and digital points of sale. The Nreal Light glasses can either be associated with any Vodafone rate for one of the compatible smartphones on offer from Vodafone, or purchased on their own and linked to the customer’s existing smartphone.

    Offers include:

    • Free Nreal Light glasses on a mobile plan costing EUR €16 a month for 36 months; or
    • Oppo Find X3 Pro 5G device and Nreal Light glasses for EUR €42.50 a month for 36 months.

    For more information on Nreal and its Light augmented reality glasses, please visit the company’s website.

    Image / video credit: Vodafone Spain / YouTube

  • DPVR launches its ‘Starlink’ Virtual Reality headset group training system

    DPVR launches its ‘Starlink’ Virtual Reality headset group training system

    In Virtual Reality News

    May 24, 2021 – DPVR, a Shanghai, China-based company that specialises in virtual reality (VR) device design and manufacturing, has recently announced that it has launched its newest virtual reality hardware offering with the introduction of the ‘Starlink’ group system solution. 

    According to DPVR, Starlink is designed to overcome some of the problems surrounding a setup that involves multiple tethered PC VR headsets for group training sessions, which often requires lots of computer hardware in the room, connecting hubs, room cabling, and which can also lessen the sense of immersion for users.

    Starlink removes the need to use individual computers for individual users, and instead employs multiple wireless VR headsets—specifically DPVR’s all-in-one ‘P Series’ headsets—helping to cut costs and save setup time when fitting out classrooms or group training facilities.

    With the Starlink system, only a single computer and tethered PC VR headset (either DPVR’s E34K or E3C headsets) are required. The setup is then able to synchronise with up to 30 of DPVR’s P-Series wireless VR headsets and can smoothly share training or educational content from the presenter, according to the company.

    DPVR’s P1 VR headset.

    The setup also means that all the processing work is done by the wireless headsets themselves, which helps to reduce issues arising due to poor computer processing power or operating system crashes.

    Depending on the room size and wireless router hardware, up to 30 people can use Starlink within 10 meters and up to 40 people within 8 meters. The company added that even more users are possible, depending on the hardware being used.
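
    DPVR has not published the protocol Starlink uses, but the general pattern described here – a single presenter machine keeping many standalone headsets in step over a local wireless network – can be sketched as a simple state broadcast. The TypeScript example below is purely illustrative; the message format, port and content IDs are assumptions, not DPVR’s implementation.

    ```typescript
    // Hypothetical one-to-many playback-state sync: a presenter PC broadcasts
    // lightweight cues over the local wireless network to standalone headsets.
    // Illustrative only – not DPVR's Starlink protocol.

    import { WebSocketServer, WebSocket } from "ws";

    interface PlaybackState {
      contentId: string;   // which training or educational module is being shown
      positionMs: number;  // presenter's current playback position
      paused: boolean;
      sentAt: number;      // presenter clock, lets headsets compensate for latency
    }

    const server = new WebSocketServer({ port: 8080 }); // reachable via the local router only

    server.on("connection", () => {
      console.log(`Headset connected (${server.clients.size} total)`);
    });

    // Broadcast the presenter's state a few times per second. Each headset
    // renders the shared content locally and only follows these cues.
    function broadcast(state: PlaybackState): void {
      const message = JSON.stringify(state);
      for (const client of server.clients) {
        if (client.readyState === WebSocket.OPEN) {
          client.send(message);
        }
      }
    }

    setInterval(() => {
      broadcast({
        contentId: "training-module-01",
        positionMs: Date.now() % 600_000, // placeholder for the real player position
        paused: false,
        sentAt: Date.now(),
      });
    }, 250);
    ```

    Because only lightweight playback cues cross the network, the rendering workload stays on the headsets themselves, which matches DPVR’s point that the system does not depend on a powerful central PC.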

    Another bonus of the Starlink setup, according to DPVR, is that as well as saving on PC/tethered headset hardware costs, it also helps to reduce the cost of software licensing. Instead of purchasing individual licenses for each user to access the same software on multiple PCs, the content is streamed to multiple users with a single license and no additional hardware requirements.

    DPVR states that its VR headsets are currently used by consumers and commercial businesses around the world in over 100 countries, with its products used in a wide range of applications including education, training, simulation and entertainment. The company also noted that it has already started rolling out production trials of Starlink through educational and training content suppliers such as Veative Labs, with more than 20 other companies ready to test Starlink. 

    DPVR’s VR headset offerings include its P-Series range of wireless devices (available in 3DoF or 6DoF versions), as well as its E3 range of tethered headsets. The Starlink system is compatible with the company’s P-Series headsets (P1, P1 Pro, P1 Pro EDU, and P1 Pro 4K) for the wireless units. For the tethered VR aspect of the system, customers can use DPVR’s E-Series headsets (E34K and E3C), as well as the HTC Vive, HTC Vive Pro, HTC Cosmos and Oculus Quest 2 devices, according to the company.

    For more information on DPVR and its virtual reality hardware and software solutions, please visit the company’s website.

    Image credit: DPVR

  • King Crow Studios awarded $6.5 million contract by US Department of Defense to support B-52 Virtual Reality training

    King Crow Studios awarded $6.5 million contract by US Department of Defense to support B-52 Virtual Reality training

     

    In Virtual Reality and Mixed Reality News

    May 19, 2021 – King Crow Studios, a virtual reality (VR) training and game development company, has today announced that it has been awarded a USD $6.5M SBIR (Small Business Innovation Research) Phase III contract with the United States Government to support B-52 pilot training over the next four years.

    The award comes after the company spent 2020 collaborating with the US Air Force (USAF) to develop a B-52 Virtual Reality Procedures Trainer (VRPT). The B-52 VRPT is a custom virtual training solution that creates digital twins of aircraft and equipment. This allows B-52 pilots and mechanics to train on virtual procedures prior to their interaction with physical aircraft, and helps to reduce the cost of on-the-job training, fuel costs, and equipment downtime.

    “We are excited to continue collaborating with the DoD and Air Force to use mixed reality training solutions to increase safety, productivity, and efficiency for Air Force pilots, maintainers, and support staff,” said Cody Louviere, King Crow Studios Founder and President. 

    According to the company, the contract began on May 17, 2021 and extends through to 2025. King Crow Studios was supported and advised throughout the bid process by several Louisiana-based partners including Nexus Louisiana, Precision Procurement Solutions (PPS) and the USAF. 

    “King Crow Studios continues to Bridge The Gap for the American Warfighter. Their success and determination will continue to create jobs, have a positive return on investment for the United States Air Force and grow the economy in the state of Louisiana,” said Grant Rogers, Precision Procurement Solutions CEO.

    As well as working with the US Department of Defense, King Crow Studios also produces VR games, with its latest title ‘Hive Slayer’ currently available on Steam for free – although the company is currently accepting donations to help raise money for hurricane relief in Lake Charles, Louisiana. The game will also be available on the Oculus Store in Summer 2021. 

    For more information on King Crow Studios and its virtual reality training solutions and games, please visit the company’s website.

    Image credit: King Crow Studios

  • Immersive LBE Virtual Reality experience ‘War Remains’ comes to National WWI Museum and Memorial this month

    Immersive LBE Virtual Reality experience ‘War Remains’ comes to National WWI Museum and Memorial this month

    In Virtual Reality News

    May 11, 2021 – War Remains, an immersive experience from MWM Interactive (MWMi), will this month debut at its new home, the National WWI Museum and Memorial located in Kansas City, US. Presented by ‘Hardcore History’ podcaster Dan Carlin, War Remains is an immersive virtual reality (VR) experience that transports viewers to the Western Front of the First World War, where they can witness history unfold from a soldier’s point of view.

    War Remains is a location-based experience (LBE) that places attendees in a 25ft x 25ft space and equips them with a VR headset. The space effectively acts as a physical set, allowing viewers to interact in the physical world with what they are witnessing virtually, adding to the immersion of the experience. Although the space is relatively small, according to Brandon Padveen, Associate Producer at MWM Interactive, the experience feels enormous thanks to the techniques used to trick users into thinking they are in the vast trench networks on the frontlines of WWI.

    The VR experience was produced by MWMi, directed by Brandon Oldenburg, and developed by Flight School Studio, with audio designed by Skywalker Sound. Throughout the experience, Dan Carlin leads audiences into the trenches as an active battle scene rages on around them. Through a combination of visual effects, sound engineering, and the guidance of Carlin’s voice, audiences get the opportunity to experience a moment in history. 

    “Virtual Reality creates other dimensions. The medium allows the storyteller to engage the audience in a way that previous storytelling genres haven’t been able to tap into. The engagement level is so much higher because the audience is 100% involved. It’s an active not passive experience,” said Carlin. 

    As well as the digital animation and physical interaction aspects of the experience, another hugely important part is the sound. Ethan Stearns, Executive Vice President of Content at MWM Interactive, said that originally the team wanted the experience to be “so loud and uncomfortable that people wouldn’t really want to be in a headset.” Understandably, MWMi couldn’t push the volume to the levels the real soldiers of WWI endured, so the team instead looked at other ways to emulate how loud the trenches would have been. This was achieved through clever sound design, including the integration of speakers into the walls of the physical set, thanks to the work carried out by Skywalker Sound.

    MWMi has gifted the War Remains LBE to the National WWI Museum and Memorial. Stearns added: “We want War Remains to continue to be experienced, and there is no better permanent home than the National WWI Museum and Memorial.” 

    “We are extremely grateful to MWMi for the gift of War Remains. This experience is unlike anything that Kansas City has hosted before,” said Matthew Naylor, President and CEO of the National WWI Museum and Memorial. “War Remains will allow the viewer to immerse themselves in the trenches of WWI and experience it with all of their senses, reaching them on both an educational and emotional level.” 

    War Remains premiered to international acclaim at the Tribeca Film Festival in 2019 and later opened for a limited run in Austin, Texas. It went on to win the “Out-of-home VR Entertainment of the Year” award at the VR Awards. The experience will be hosted in the National WWI Museum and Memorial’s Memory Hall and will be open to the public on May 27, 2021. Due to the graphic nature of the content, viewers must be at least 14 years of age. 

    For more information on the War Remains experience, click here.

    Image / video credit: www.worldwar1centennial.org / MWM Interactive / YouTube

  • Pico Interactive announces its newest enterprise 6DoF Virtual Reality headsets with the Neo 3 Pro and Neo 3 Pro Eye

    Pico Interactive announces its newest enterprise 6DoF Virtual Reality headsets with the Neo 3 Pro and Neo 3 Pro Eye

    In Virtual Reality News

    May 10, 2021 – Pico Interactive, a global tech company that develops virtual reality (VR) and enterprise solutions, has today announced the launch of its latest 6DoF headsets in its Neo line – the ‘Neo 3 Pro’ and ‘Neo 3 Pro Eye’, which are built for businesses. The headsets follow last month’s launch of the Neo 3 – the company’s consumer-only headset exclusively available in China. Pico stated that the Neo 3 Pro and Neo 3 Pro Eye devices will also be available in the West, including North America and Europe.

    Henry Zhou, CEO of Pico Interactive, commented: “For the Neo 3 line, we implemented the latest technologies to meet the ever-changing demands of the enterprise. For instance, as hybrid and remote work continues, more companies are seeking remote collaboration solutions, like Pico Interactive’s VR headsets with tools and applications available through our software partners, to allow employees to increase productivity and sales to boost revenue. From training firefighters to decreasing the impact of social isolation on seniors to measuring brain health, Pico’s headsets are now being used in a wide range of industries.”

    The new Pico Neo 3 Pro device.

    Both 6DoF models were built for the enterprise and are powered by the Snapdragon XR2 Platform. According to Pico, the headsets have a single 5.5” display with 3664 x 1920 resolution, a PPI of 773 and a refresh rate of up to 90Hz. The Neo 3 Pro and Neo 3 Pro Eye keep the same counterbalanced design as their predecessor, with a reasonably weighted front HMD and the battery pack located at the rear to help provide a more comfortable experience. The headsets also have a replaceable, sterilizable polyurethane face cushion that’s hygienic and washable.

    The controllers have been updated from the Neo 2 models as well, transitioning to two 6DoF optical controllers for improved tracking. With the Neo 3 headsets having four cameras instead of two, the guardian system has also been improved from the last generation. Pico states that the guardian system now offers a more robust, flexible, and open user tracking system, which allows for a larger range of commercial use-cases without requiring additional devices or setup costs.

    The Pico Neo 3 Pro Eye comes with built-in eye tracking technology from Tobii.

    Pico is once again partnering with Tobii, a provider of eye tracking solutions. The Neo 3 Pro Eye, with built-in Tobii Eye Tracking, will help enterprises to gain a deeper understanding of customers, enhance training efficiency, and improve productivity by being able to track a user’s gaze whilst wearing the device.

    The Pico Neo 3 Pro and Neo 3 Pro Eye will also support NVIDIA’s Direct Mode, as they are DisplayPort (DP) supported and equipped with DP connectors and cables, which provide a native 4K@90Hz high-bandwidth wired connection for Pico VR Streaming. When a Pico device is connected to a PC via a DP cable, the Pico VR streaming assistant can work in Direct Mode, which is a high-performance, low-latency way to render PC VR content to Pico headsets.

    Finally, the entire Neo 3 family of headsets supports NVIDIA CloudXR, delivering VR wirelessly across 5G and WiFi networks and enabling enterprises to integrate VR into workflows to help drive design reviews, virtual production, location-based entertainment and more.

    With Pico’s focus on the enterprise, the headsets will be sold directly to companies via a dedicated sales team spread across the globe. The Neo 3 Pro is available for pre-order at USD $699 and the Neo 3 Pro Eye at USD $899. The headsets will be available in Q3 2021. For more information on Pico Interactive and its VR headset offering, visit the company’s website.

    Image credit: Pico Interactive

  • GridRaster selected to US Air Force’s SBIR program to provide immersive XR simulations and training

    GridRaster selected to US Air Force’s SBIR program to provide immersive XR simulations and training

    In Augmented Reality, Virtual Reality and Mixed Reality News

    May 6, 2021 – GridRaster Inc., a provider of cloud-based XR platforms that power scalable augmented, virtual and mixed reality (AR/VR/MR) experiences on mobile devices for enterprises, has announced that it has been selected for the Small Business Innovation Research (SBIR) program of the United States Air Force (USAF) to provide large-scale, hyper-realistic immersive simulations and training for pilots and support crew.

    According to GridRaster, the traditional pilot training environment is expensive, inflexible, offline, not easily scalable, and comes with inherent risks. Aircrew must train on multi-million-dollar aircraft, expending fuel and other consumables and incurring wear, tear and maintenance. The rate at which pilots can be trained is limited by the availability of physical aircraft for training purposes, which severely constrains the amount of practice trainee pilots can get.

    AR and VR can provide a flexible platform to conduct training that does not require the use of expensive aircraft, costly consumables, or additional maintenance costs. Furthermore, AR/VR applications can be configured to provide a near real-world combat operation environment, and the hardware can be used for many combat training missions and environments in which pilots and staff can have more mission rehearsals. As a result, GridRaster anticipates that the USAF will be able to substantially reduce its training budget and still get effective results.

    “AR and VR technologies save money, create a much safer environment for trainees and help them gain better operational and situational awareness, access to more rehearsals, decreased time required to model varied and customizable terrains and provide a highly engaging mission planning to students,” said Rishi Ranjan, CEO of GridRaster. “We look forward to working with the stakeholders at USAF to adapt our commercial GridRaster platform to address national security needs.”

    GridRaster states that its technology provides a platform for the USAF to unlock the full potential of AR/VR/MR that can rapidly adapt to the advancements and changing landscape of modern warfare by facilitating large scale hyper-realistic immersive simulations and training of pilots and support crew.

    The GridRaster XR platform utilizes distributed cloud computing/on-premises remote servers with Graphics Processing Units (GPUs) to achieve high-fidelity rendering of complex 3D content, and real-time alignment and tracking of virtual models/scenes over the real-world objects. Furthermore, the platform supports a unified and shared synthetic training environment that is flexible enough to allow for mission rehearsals of most types and intuitive enough to make training effective, according to the company.
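
    As a rough illustration of the remote-rendering pattern described above – a lightweight device streaming its pose to a GPU server and receiving rendered frames back – the TypeScript sketch below shows the core client loop. GridRaster has not published its APIs; every type, function and message format here is hypothetical.

    ```typescript
    // Hypothetical client loop for server-side (remote) rendering.
    // Illustrative only – not GridRaster's API.

    interface DevicePose {
      position: [number, number, number];             // meters, in the device's coordinate frame
      orientation: [number, number, number, number];  // quaternion (x, y, z, w)
      timestampMs: number;                            // used to match frames to poses
    }

    interface RenderedFrame {
      poseTimestampMs: number;  // pose this frame was rendered for
      encodedFrame: Uint8Array; // e.g. a compressed video frame to hand to a hardware decoder
    }

    // Transport and presentation are injected so the loop stays self-contained.
    async function remoteRenderLoop(
      getPose: () => DevicePose,
      sendPose: (pose: DevicePose) => Promise<void>,
      receiveFrame: () => Promise<RenderedFrame>,
      present: (frame: RenderedFrame, latestPose: DevicePose) => void,
    ): Promise<void> {
      for (;;) {
        const pose = getPose();
        await sendPose(pose);               // upstream: a tiny pose packet per frame
        const frame = await receiveFrame(); // downstream: the server-rendered frame
        // Reproject the received frame against the newest local pose to hide
        // the network round-trip before presenting it to the display.
        present(frame, getPose());
      }
    }
    ```

    The trade-off in this pattern is that upstream traffic is negligible while the downstream video stream and round-trip latency determine the experience, which is why late-stage reprojection against the newest local pose is commonly used to mask that latency.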

    The SBIR program encourages US-based small businesses to engage in Federal research and development efforts with the potential for commercialization. Through a competitive awards-based program, SBIR enables small businesses to explore their technological potential and provide the incentive to profit from its commercialization. For more information on the program, click here.

    For more information on GridRaster and its platform for scalable XR experiences, please visit the company’s website.

    Image credit: GridRaster / Business Wire

  • STIM deploys RealWear’s HMT-1 Assisted Reality headset to aid its fish farming operations

    STIM deploys RealWear’s HMT-1 Assisted Reality headset to aid its fish farming operations

    In Augmented Reality News 

    May 5, 2021 – RealWear, Inc., a provider of hands-free head-mounted computers for industry, has today announced that STIM, a provider of fish health services, has deployed RealWear’s HMT-1 assisted reality wearable solution into the aquaculture industry.

    By utilizing the voice-activated HMT-1 wearable tablet device, STIM Norway’s fish health teams have been able to increase operational efficiency whilst reducing travel, according to RealWear.

    In Norway, fish farming sites and facilities span the coastline, often in remote locations. The sites are required by law to host fish-health inspections several times a year. For STIM, honouring this requirement means that its fish health teams are constantly on the road, traveling to and from production sites, all of which comes with both an economic and environmental cost.

    To address these challenges, STIM researched remote inspection systems including handhelds, laptops, wearables and AR smart glasses. Following an in-depth consultation with RealWear partner and IT infrastructure specialist ATEA Norge, STIM opted to standardize on RealWear’s HMT-1 wearable device due to features such as voice activation and hands-free functionality, as well as its ruggedized design that allows for operation in wet, windy and rough offshore conditions.

    RealWear stated that since the first deployment of the HMT-1 in 2019, STIM has unlocked additional capabilities of the devices through the STIM PRO app, the company’s system for remote communication with wellboats, service boats and aquaculture facilities. STIM PRO allows STIM’s customers to log in and book an appointment for a remote inspection. In the same portal, customers can upload documents that STIM’s fish health staff may need to review prior to the inspection. The system enables immediate access to STIM’s fish health personnel who, regardless of geographical distance, can oversee and assess sampling and test results, provide advice, and contribute important assessments related to specific fish health problem solving.

    With real-time video and audio transmission, teams are able to provide visibility to any situation and ask for real-time advice on any identified issues. The HMT-1 has also helped to reduce response time in cases where important decisions have to be made quickly, according to RealWear.

    To date, STIM has deployed 75 HMT-1 headsets. The company initially rolled out RealWear devices for use in the hygiene inspections that it conducts on wellboats used to transport live fish or to provide onboard treatments. These simple hygiene tests, which must be done to ensure that the boats are sufficiently cleaned between missions, previously required the on-site presence of an independent veterinarian or fish health biologist. However, with RealWear’s hands-free wearable computers running STIM PRO, the process has been transformed, with crew members now able to perform the tests themselves whilst STIM’s fish health professionals remotely oversee the procedure and either approve the test results or decide that further cleaning is necessary.

    “Since deploying RealWear’s solution, we have been able to provide the same quality to our customers with this method and even offer a discounted inspection. Whilst it’s a very different process to physically sending someone to a site, it has resulted in improved safety, efficiency and cost savings,” said Henrik Hareide, COO Knowledge Services, STIM Norway. “The hardware is excellent, and RealWear’s solution is perfectly tailored to our unique needs.” 

    RealWear added that STIM has plans to add a Learning System to its STIM PRO platform, to ensure quality and consistency across teams.

    Jon Arnold, VP of Sales, EMEA at RealWear, commented: “Yet again, we are witnessing further scope for RealWear’s solution to be used in multiple applications within another industry sector,” adding, “Wearable technology is now widely accepted as a tool for efficient communication, offering efficiency benefits both for operations and environmentally.”

    For more information on RealWear and its assisted reality devices, please visit the company’s website.

    Image credit: STIM / Werner Juvik / RealWear

  • Signature Production Group upgrades its Chicago area LED studio into a full Extended Reality studio

    Signature Production Group upgrades its Chicago area LED studio into a full Extended Reality studio

    In Augmented Reality, Virtual Reality and Mixed Reality News

    May 4, 2021 – Signature Production Group (SPG), a company that specializes in providing production technology solutions for meetings and live events, has announced that it has finalized the transformation of its LED studio into a full Extended Reality (XR) studio. The transformation includes a variety of technology investments including a Disguise One VX4 media server and RX rendering solution, Mo-Sys camera tracking, and Notch motion graphics and interactive visual effects engine.

    As a result of the upgrades, Signature Production Group is now also the latest company to be added to the Disguise list of worldwide XR studio partners. SPG’s media server, camera tracking and visual effects engine work together to deliver any XR combination, be it augmented, virtual, or mixed reality (AR/VR/MR – collectively XR).

    The company’s Chicago-area XR studio is powered by a Disguise media server, Mo-Sys camera tracking and Notch visual effects. Through the use of this technology, SPG provides audio, video, and lighting for meetings and events, with XR primarily used to place live presenters into virtual environments. 

    Additionally, the company employs the use of LED walls for virtual production and as an alternative to location filming. Virtual production has been made popular by shows such as Disney’s The Mandalorian, and the production world continues to find new applications for this emerging technology. SPG’s LED wall uses high-resolution THOR EDGE tiles controlled by Brompton Tessera SX40 LED processors.

    “We’re excited about the upgrade to full XR capabilities,” said Tristen Crow, SPG’s lead video engineer. “Originally our LED wall could do all the same things as a wall or projection surface that we would set up in a ballroom or conference facility.  But by adding interactive 3D environments, set extensions and a combination of XR technologies, the possibilities are endless.”

    SPG’s owner, Dave Schwarz, also commented: “When you put on VR goggles, you’re immersed in a virtual world… As a meeting and event technology company, we wanted to give presenters the ability to stand on a physical stage and present from any virtual environment. As they’re presenting live from our studio, we can add speaker support graphics, remote presenters, interviewees or panelists. They can also interact with apps, websites, sales aids, or virtual 3D models of their products.”

    When live events were cancelled in March 2020, the SPG team built broadcast control rooms to stream live meetings to online attendees – something that SPG stated will also help to meet the evolving needs of the event industry. The company noted that recent business success, even amidst the pandemic, has allowed it to expand its staff and invest in new capabilities as SPG prepares for the next phase of live events.

    For more information on Signature Production Group and its virtual and XR production capabilities, please visit the company’s website.

    Image credit: Signature Production Group

  • BUNDLAR awarded United States Air Force AFWERX SBIR Phase I contract

    BUNDLAR awarded United States Air Force AFWERX SBIR Phase I contract

    In Augmented Reality News 

    May 4, 2021 – BUNDLAR, the provider of a web-based platform offering a no-code, drag-and-drop interface for creating, editing, and publishing augmented reality (AR) experience bundles to any supported mobile device, has recently announced that it has won an AFWERX Small Business Innovation Research (SBIR) Phase I contract.

    BUNDLAR submitted proposed solutions that utilize its augmented reality platform to enhance existing training for service men and women, improving time to proficiency in equipment maintenance procedures and pilot continuing education programs.

    Established in 2017, AFWERX is the Air Force’s team of innovators who encourage and facilitate connections across industry, academia, and the military to create transformative opportunities and foster a culture of innovation. AFWERX’s mission is to solve problems and enhance the effectiveness of the service by enabling thoughtful, deliberate, ground-up innovation.

    “The benefits of augmented reality are real and measurable, and we are honored to be a part of the AFWERX mission to bring innovative technology to the armed services,” said BUNDLAR Co-founder & CTO, Matthew Wren. “The goal here is simple, to improve warfighter safety and effectiveness. Our platform can modernize training materials and allow them to be delivered in a cost effective way, at the scale and speed required by the DoD.”

    The United States Armed Forces have been among the earliest adopters of AR technology, and BUNDLAR’s mission-ready tool can support critical initiatives and improve results, according to the company.

    The company did not disclose a dollar amount for the SBIR Phase I contract. For more information about BUNDLAR’s no-code AR platform, visit the company’s website.

    Image credit: BUNDLAR

  • SimX receives new US Air Force contracts totaling over $1.5 million to advance Virtual Reality training programs

    SimX receives new US Air Force contracts totaling over $1.5 million to advance Virtual Reality training programs

    In Virtual Reality News

    May 3, 2021 – SimX, a provider of a virtual reality (VR) medical simulation platform, has announced that it has landed four new contracts totaling over USD $1.5 million as it continues to expand its partnership with the US Air Force (USAF) and US Space Force (USSF) to develop training solutions for special operations forces.

    According to SimX, the contracts, which are part of the USD $2.5 million ‘Virtual Advancement of Learning and Operational Readiness’ (VALOR) research and development program, seek to further develop fielded capabilities for training the USAF’s medical personnel with the SimX Virtual Reality Medical Simulation System.

    SimX stated that the new funding is targeted primarily towards enhancing the capabilities of the system to train operational medical handoffs between roles of care, train missions involving multiple simultaneous caregiving teams, train in dynamic and realistic environments (such as night and weather operations), and provide more customizable and adaptable training capabilities. Additional funding has also been allocated to adapting VR medical simulation training for in-flight medicine during aerial and space operations with the Air Force and Space Force.

    As a result of the partnerships, special operations medical personnel of the 24th Special Operations Wing will be able to train through simulated medical scenarios based on real-world experience and reinforce learning of the relevant medical techniques, tactics and protocols. The overall objective is to enable the wing’s Special Tactics operators, including pararescuemen and combat controllers, as well as their unit medics and Special Operations Surgical Teams, to “train how they fight.”

    A pararescueman participates in a virtual reality medical scenario.

    SimX noted that the capabilities of its platform include a broad array of situations, including Tactical Combat Casualty Care-based scenarios as well as routine medical care. All newly developed capabilities will be fielded for operational testing and evaluation at the existing SimX deployments at installations across the US, as well as USAF installations in Europe and Asia.

    “The USAF and USSF’s continued investment in the VALOR program will enable us to continue to push the envelope of VR medical training by ensuring that we can train for these critical interactions,” said Karthik V. Sarma, VALOR Principal Investigator and Chief Technology Officer at SimX.

    Col. John R. Dorsch, who heads the effort for the US Air Force, also commented: “The VALOR program is helping to increase overall medical capability and has the potential to improve survival rates in combat casualties. Expanding and innovating capabilities is critical for ensuring the highest level of combat trauma and austere medical care is provided by our special operators and medical personnel.”

    In addition to the distribution to the 24th SOW’s units stationed around the country, the new capabilities will also be fielded at the new Special Operations Center for Medical Integration and Development (SOCMID), a collaboration between the USAF and the University of Alabama-Birmingham designed to provide the next generation of standardized training to Special Operations Surgical Team members, pararescuemen and independent duty medical technicians.

    The projects are made possible through the Small Business Innovation Research (SBIR) program, in collaboration with AFWERX, a team of innovation specialists within the USAF, and the Air Force Research Laboratory (AFRL). AFRL and AFWERX have partnered to streamline the SBIR process in an attempt to speed up the experience, broaden the pool of potential applicants and decrease bureaucratic overhead. Beginning with SBIR 18.2, the Air Force has offered ‘Special’ SBIR topics that are faster, leaner and open to a broader range of innovations.

    For more information on SimX and its virtual reality medical simulation platform, visit the company’s website.

    Image credit: SimX / US Air Force photo by Tech. Sgt. Sandra Welch

  • Publications Office of the European Union tenders €4.9 million contract that includes the provision of Augmented and Virtual Reality applications

    Publications Office of the European Union tenders €4.9 million contract that includes the provision of Augmented and Virtual Reality applications

    In Augmented Reality and Virtual Reality News

    April 29, 2021 – The Publications Office of the European Union has recently gone to tender on a contract for EUR €4,900,000 that includes the provision of applications based on augmented reality (AR) and virtual reality (VR) technologies.

    The contract is split into four lots, with the VR and AR component having a total value of EUR €700,000, and involves the analysis, design, production, maintenance and assistance in the publishing of AR and VR-based applications.

    The lot is technology orientated, based on hardware, applications, and techniques involving some degree of spatial tracking. According to the tender documents, the following products may be requested under the AR/VR lot:

    • Virtual reality products such as simulated experiences/activities/training, 3D visualisations of environments and places, virtual visits or tours in 3D, photo-realistic 3D worlds, VR simulations of processes, fully immersive training/educational experiences, games in a VR environment, etc. to be disseminated via one or several of the following channels:
      • As mobile applications published on mobile app stores (e.g. Apple App Store, Google Play Store, Oculus store or other VR stores/gaming platforms);
      • As browser-based experiences for desktop and mobile operating systems (e.g. WebVR, WebXR, etc.);
    • Augmented Reality products or experiences to be published via the following channels:
      • As mobile apps published on mobile app stores;
      • Through a browser via WebAR;
    • Back-end solutions for VR/AR products if ordered together with said VR/AR products (but not as stand-alone back-end solutions);
    • Websites included in complex projects based on VR or AR technologies.

    As part of the tender process, bidders must demonstrate their ability to deliver a successful solution by submitting two case studies within the fields of virtual and augmented reality. For the VR aspect, bidders must submit a proposal for the development of a VR application that demonstrates how VR can be used to present the European Green Deal strategy to 18-30 year olds, and that is designed in a way that allows for multilingualism, with the look and feel of the application taking into account the visual identity of the European Commission.

    For the AR aspect, bidders must propose an AR application that allows users to place European architectural landmarks in their own environment (e.g. the Eiffel Tower, the Colosseum in Rome, the Leaning Tower of Pisa, etc.). The application should target a young audience aged 12-15. Bidders are free to select four to five landmarks that they want to make available in AR and to suggest features focusing on educational goals. Furthermore, the landmarks need to be displayed in comparison to each other and at the correct scale. The proposed AR mobile app should work in all 24 EU languages and be operational on mobile phones and tablets, covering both iOS and Android devices. As with the VR case study, the look and feel of the application should take into account the visual identity of the European Commission.
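
    On the requirement that landmarks be displayed in comparison to each other and at the correct scale, one common approach in a three.js-based WebAR app is to normalise each model against its real-world height and then apply a single shared miniaturisation factor. The TypeScript sketch below is purely illustrative; the tender does not prescribe an implementation, and the helper names and height values are assumptions.

    ```typescript
    // Illustrative landmark scaling for a three.js scene. The real heights are
    // approximate and the function name is hypothetical.

    import * as THREE from "three";

    const realHeightsMeters: Record<string, number> = {
      eiffelTower: 330,        // approximate heights in meters
      colosseum: 48,
      leaningTowerOfPisa: 57,
    };

    // Scale a loaded model so one scene unit equals one meter of the real
    // landmark, then apply a shared miniaturisation factor so all landmarks
    // remain directly comparable when placed side by side.
    function scaleLandmark(model: THREE.Object3D, name: string, miniaturisation = 1 / 1000): void {
      const box = new THREE.Box3().setFromObject(model); // current bounding box
      const size = new THREE.Vector3();
      box.getSize(size);
      const toRealScale = realHeightsMeters[name] / size.y; // assumes `name` is a key above
      model.scale.multiplyScalar(toRealScale * miniaturisation);
    }
    ```

    Because every landmark shares the same miniaturisation factor, relative proportions are preserved: the Eiffel Tower placed next to the Colosseum in a user’s room still appears roughly seven times taller.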

    Note that the above proposals are simply the requirements of the case studies needed for companies to demonstrate technical competency within the fields of VR and AR. The actual requirements of the products that will be needed once the contract has been awarded have not yet been disclosed; however, it is likely that the above case studies give some clues as to what may generally be required of the successful bidder.

    The duration of the contract is noted as being 48 months (4 years), with no extension options. Tenderers may submit a bid for one, several, or all four lots. The other three lots cover:

    • Production of audiovisual publications (total value of EUR €3,000,000);
    • Production of mobile applications (total value of EUR €600,000);
    • Production of electronic publications based on HTML (total value of EUR €600,000).

    The contract for lot 4 is expected to be signed in the first quarter of 2022. The deadline for receipt of bids is June 22, 2021 at 1:00pm Central European Summer Time. For more details on how to bid and for tender documentation, please get in touch via contact@auganix.org.

    Image credit: Publications Office of the European Union

  • Varjo and VRM Switzerland’s Virtual Reality aviation training device first to receive approval from European Union Aviation Safety Agency

    Varjo and VRM Switzerland’s Virtual Reality aviation training device first to receive approval from European Union Aviation Safety Agency

    In Virtual Reality News

    April 28, 2021 – Varjo, a provider of professional-grade virtual and mixed reality (VR/MR – collectively XR) hardware devices, has recently announced that the European Union Aviation Safety Agency (EASA) has for the first time ever officially qualified a virtual reality–based training solution for aviation training.

    Developed and built by VRM Switzerland, the helicopter pilot simulator features Varjo’s latest human-eye resolution virtual reality headset, the Varjo VR-3, as well as unique technology developed by the Swiss flight training solution provider.

    Varjo stated that with the first EASA approved VR simulator, pilots can now have the virtual training time credited to their flight training as well as benefit from the high training value and flexibility of the immersive solution. Since the qualification is done directly by the aviation authority, any customers of VRM Switzerland can start using the VR training solution without further national certification according to EASA regulations.

    This will enable flight schools and helicopter companies to use the latest technology to help increase flight safety, offer more cost-effective training solutions, and to train in a more environmentally friendly way, at any time.

    “Varjo is honored to be a part of the world’s first EASA authorized pilot simulator with our highest-fidelity virtual reality device, the Varjo VR-3. This is a truly pivotal moment for the entire VR/XR industry, proving that immersive simulations can bring very tangible benefits for pilot training. Together with VRM Switzerland, we look forward to providing and scaling cost-effective, photorealistic virtual simulation training to pilots worldwide,” said Seppo Aaltonen, Chief Commercial Officer at Varjo.

    The VR simulator provides pilots with a full-body immersive experience, and gives the feeling of sitting in a real helicopter through Varjo’s human-eye resolution VR headsets, a dynamic motion platform, numerical simulation of the flight behavior, haptic perception, and a 3D cockpit model that is an exact replica of the real helicopter, according to Varjo.

    Fabi Riesen, CEO at VRM Switzerland, said: “Pilots should receive realistic training on simulators. This allows helicopter operators and flight schools to fly more efficiently and safely. Thanks to our cooperation with the authority, we can offer training that is directly creditable. Our VR concept, which includes the Varjo VR-3 Head Mounted Display (HMD), Pose Tracking combined with a VR Haptic Cockpit on a highly dynamic 6DoF [six degrees of freedom] motion platform, provides a training device with the highest possible visual fidelity, allowing pilots to be fully immersed.”

    Varjo VR-3 features an extremely high resolution of over 70 pixels per degree in the center of the field of view, as well as a wide field of view of 115 degrees, allowing pilots in training to see and read the smallest of details in VR with perfect clarity.

    According to Varjo, the suitability of the VR solution was verified through a training evaluation program together with EASA involving pilots of various nationalities from industry and aviation authorities, including helicopter flight instructors and test pilots.

    Commenting on the announcement, David Solar, Head of General Aviation (GA) and Vertical Takeoff and Landing (VTOL) Department at EASA, said: “Virtual Reality Simulation has been identified as a real enabler and potential game changer for helicopter training. I’d like to congratulate VRM Switzerland Team for the outstanding work as well as EASA teams for their commitment to support this qualification, which so far as I’m aware is a first in the world. Well done to all!”

    The fact that something as highly specialized and scrutinized as pilot training now allows VR technology to factor into officially credited training hours marks yet another positive step forward for the tech, and helps to demonstrate the readiness of VR to meet real-world industry needs.

    This is also not the first time that Varjo’s technology has been utilized for immersive helicopter flight training. Last year, Bohemia Interactive Simulations announced that it was integrating Varjo’s XR-3 headset into a mixed reality Apache helicopter simulator demo.

    For more information on Varjo and its virtual reality HMD solutions, please visit the company’s website.

    Image credit: Varjo

  • Husson University announces new degree program for students wanting to specialize in Extended Reality

    Husson University announces new degree program for students wanting to specialize in Extended Reality

    In Augmented Reality, Virtual Reality and Mixed Reality News

    April 22, 2021 – Husson University has announced that its Board of Trustees has approved the creation of a new degree program for students interested in careers that specialize in the design, programming, and technology associated with augmented and virtual reality (AR/VR) applications. Known as the ‘Bachelor of Science in extended reality’ (XR), this new degree program will enroll its first class starting in the fall of 2021.

    “We’ve seen an enormous amount of interest in a degree like this from high school students enrolled in our Early College Access Program,” said Robert A. Clark, PhD, CFA, President of Husson University. “A pilot course called ‘XR 177 Augmented and Virtual Reality’ had strong enrollments. This new program puts Husson University on the leading edge of technology education.”

    Students enrolled in the new Bachelor of Science in extended reality will be working extensively in the new iEX Center, where they will collaborate with other students, faculty, and outside partners. The iEX Center will serve as an innovation hub for the creation of interactive and immersive XR research and design projects. 

    Students on the course will use emerging XR technology to help them develop practical solutions to real-world challenges, and will study disciplines such as digital media design, computer programming, audio engineering, and visual design, all within the context of extended reality. According to the degree course’s webpage, projects that students could end up working on may include:

    • Building three-dimensional models for a virtual reality training platform;
    • Programming an augmented reality application;
    • Experimenting with interactive projected media hardware;
    • Using a motion capture system to analyze body movements or create real-time avatars;
    • Programming a game engine to create interactive spatial audio-triggered effects;
    • Experimenting with augmented reality glasses.

    The University added that given the current level of industry demand, the kinds of presentations developed in the Center will have practical applications for professionals working in education, entertainment, business, healthcare and a variety of other fields. 

    “Students studying in the XR program will learn to solve real-world problems using advanced technologies,” said Marie Hansen, JD, PhD, SHRM-SCP, Dean of the College of Business and New England School of Communications. “This new degree will include augmented reality, virtual reality, three-dimensional design, spatial mapping, and spatial sound design. After graduation, they’ll be fully prepared to be hired as extended reality professionals.”

    As part of this rollout, Husson University will be launching a new school to oversee the degree program. The School of Technology and Innovation in the College of Business will oversee the XR degree’s curriculum development and the capital purchases of equipment and software associated with this program. This likely means that there will be various XR hardware and software related contract awards on the horizon to be handed out by the University too.

    Launching at the end of the Spring 2021 semester, the new School of Technology and Innovation will join the four other schools that already comprise Husson University’s College of Business.

    “There is strong evidence that industry, educational institutions, research organizations, and others will increasingly require the consultation of an extended reality professional,” said Brave Williams, the Director of the iEX Center and associate professor. “We believe our students will have many exciting career opportunities that integrate three dimensional immersion and interaction into family and economic life. Our goal is to prepare professionals who will create seamless extended reality opportunities for everyone.”

    For more information on Husson University’s new Bachelor of Science in extended reality degree program, please visit the University’s website.

    Image credit: Husson University