BeBop Sensors and Talon Simulations receive Epic Games MegaGrant to develop immersive full motion VR training solution


September 9, 2020 – BeBop Sensors, a provider of haptic glove technology, has partnered with industrial gaming simulation provider Talon Simulations, and the two companies have announced that they received an Epic Games MegaGrant to develop an economical “Hands-on Immersive Full-Motion VR Trainer” for the Unreal Engine. According to the companies, the joint development will provide military and commercial aviation training organizations with a full-motion simulator that offers a heightened sense of realism to benefit training, at a “significantly lower price point.”

The training platform will showcase the integration of BeBop Sensors’ Forte Data Gloves with Talon Simulations’ A3 Full-Motion Simulator, which offers two degrees of freedom of motion. Integrated glove haptics, such as force feedback and virtual cockpit control interaction, are intended to improve immersion and training realism.

The grant will allow both companies to co-develop SDKs/APIs/haptic editors for the Unreal Engine, making it easier for developers to integrate haptic gloves and motion into VR-based training solutions. The units will be lightweight, portable, and reconfigurable with various flight and driver controllers at a price point far below traditional full-motion simulators offered today, according to BeBop Sensors.

“By virtualizing the cockpit in VR and using haptic gloves to interact, we allow instructors and courseware designers to economically build more aircraft/vehicle variants without additional hardware costs, and to switch seamlessly between variants to improve student throughput. At the same time, we’re improving muscle memory and reducing muscle scarring by allowing them to use their hands naturally as they would in the real world – this changes the way simulator training will be delivered going forward,” noted Jerry Kurtze, VP of Sales & Marketing for BeBop Sensors.

Brandon Naids, CEO at Talon Simulations, added: “Static flight simulators are fatiguing for students over time, as they become disconnected from what their eyes are seeing in VR but not feeling in their inner ear, or through the haptic responses in their hands or torso, as they work the controls flying the aircraft. Working together with Unreal software developers, we can offer a robust, portable, and reliable simulator that can reduce costs for training applications.”

The partnership’s solution will feature:

  • Unreal Engine SDKs and APIs, which will simplify the integration of haptic gloves, haptic responses, flight controls, and flight dynamics into Unreal developers’ training applications;
  • Haptic gloves to provide full hand translation, precision location, and tactile interaction, offering precise control and feedback of virtualized cockpit/dashboard controls;
  • A simulator that instructors can easily reconfigure for different variants, using VR to reproduce any virtualized cockpit with attached flight/driving controls;
  • Controller swapping in under 10 minutes with a variety of aircraft and driving controllers – sticks, yokes, wheels, thrusters, and pedals;
  • A compact 2′ × 4′ footprint that allows multiple systems to be networked in smaller classrooms;
  • A portable, lightweight chassis that can be crated, or moved and reconfigured to match a training space.

BeBop Sensors only recently announced that it had been selected to present its haptic Forte Data Glove as part of the US Air Force’s AFWERX ‘Base of the Future’ challenge. The company uses smart fabrics to create sensor solutions that detect force, location, size, weight, bend, twist, and presence across any size, resolution, and geometry. BeBop Sensors states that its sensors are available for a wide variety of applications, including military, industrial, medical, human factors, virtual reality, gaming, design, automotive, sports, and more.

Image credit: BeBop Sensors

About the author

Sam Sprigg

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he covers news articles on both the AR and VR industries. He is also interested in human augmentation technology as a whole, and does not limit his learning specifically to the visual experience side of things.