Category: Defence & Security

  • Kognitiv Spark partners with Mack Defense and Valcom to provide Mixed Reality remote support from Canada to Europe

    In Augmented Reality and Mixed Reality News

    May 20, 2020 – Kognitiv Spark, a provider of augmented and mixed reality applications for industrial remote worker support, has today announced that it has partnered with Mack Defense Canada and Valcom Consulting Group to provide mixed reality remote support to field service representatives operating in Europe.

Mack Defense and Valcom are collaboratively using Kognitiv Spark’s ‘RemoteSpark’ mixed reality remote support tool, built for use on Microsoft HoloLens, to deliver task guidance from system experts based in Canada to field service technicians based in Europe. Furthermore, the solution helps to support travel reduction and physical distancing measures that are being implemented in several work environments, while allowing subject matter experts (SMEs) to continue to guide and support field service staff.

    The RemoteSpark software platform enables remote workers to establish a secure, low-bandwidth mixed reality connection with SMEs, located anywhere in the world. The platform creates a shared audio and video connection while allowing SMEs to support the task with relevant documents, images and 3D assets which appear as holograms in the end-user’s real-life environment. 

Built for use in data-sensitive industries, Kognitiv Spark states that the RemoteSpark system is secure, featuring end-to-end encryption, firewalls and tamper detection, and the ability to operate on-premises and air-gapped, among additional layers of security. The system is in use by the Royal Canadian Navy, Royal Canadian Air Force, Canadian Army and other defence and industrial organizations, according to Kognitiv Spark. 

    A Valcom worker repairs a Mack Defense vehicle using a Microsoft HoloLens Mixed Reality HMD

    The trial is focused on secure inspection and repair of Mack Defense systems by Valcom field service technicians. After an extensive, multi-party security audit, the first remote support call was initiated by a Valcom technician in early 2020. Since then, Valcom and Mack Defense have reported positive results from the trans-Atlantic mixed reality calls between field service representatives in Europe and experts in Canada.

    “The RemoteSpark system has allowed us to troubleshoot problems from overseas quickly and accurately without traveling, which means Mack Defense systems can continue to operate efficiently,” said Senior Business Development and Procurement Manager for Valcom, Charles Richer. “Compare this to having to fly a system expert overseas, into remote locations, from Canada and you’re looking at weeks of delays and equipment that can’t be used. The capability shows a lot of promise.”

The companies state that in the span of a month, numerous remote support calls have been made as part of the project, each one representing a saved overseas flight for a system SME and a Mack Defense system returned to service faster than traditional repair processes would allow. Richer added that one day of training was all that was required for technicians to be able to operate the HoloLens hardware and RemoteSpark software. 

    “The trial of the mixed reality system is proving successful in ensuring that we maintain a high level of operational efficiency. Working with our service partners at Valcom, we will be continuing to monitor the outcomes of the trial and assess opportunities for expansion,” said Alain Gauthier, VP of Mack Defense Canada.

    Image credit: Kognitiv Spark

  • US Department of Defense looking into using Virtual Reality to help “enhance small unit lethality”

    In Virtual Reality News

    May 20, 2020 – The US Department of Defense (DoD) is looking into using virtual reality (VR) to support the enhancement of “small unit lethality”, and is seeking to solicit technologies from private industry, government research & development organizations, and academia for inclusion in an upcoming capability demonstration and experimentation event focused on the use of VR tools.

    Titled ‘Thunderstorm 21-1’, the demonstration and experimentation event will focus on VR tools that leverage autonomy and artificial intelligence/machine learning (AI/ML) technologies capable of providing operators with ways to improve planning, decision making and efficiency; as well as reduce cost, workload, and risk; improve the safety of operators and operating equipment; and enhance situational awareness in both the rapid response planning and execution of small unit operations. 

    The Thunderstorm 21-1 event will primarily focus on developmental and mature technologies and will be structured to:

    • Execute a variety of rapid response-related scenarios to examine the capabilities and limitations of VR tools to support planning and rehearsal of platoon-level and other small unit operations. The scenarios will be designed in collaboration with the United States Military Academy (USMA) and other interested federal government stakeholders;
    • Invite industry, academia and government labs to demonstrate relevant capabilities by either providing end-to-end solutions or proposing a solution to one of the three components of the challenge:
      • Collection
      • Processing and VR Products Development
      • VR Applications/Experience;
    • Document the results with a level of granularity such that the reporting can inform future DoD investment decisions.

    The DoD is inviting respondents to submit technology demonstration and experimentation applications related to an end-to-end solution, or to one or more of the following three capabilities:

    1 – Collection capabilities, which may include autonomous operation of air and ground systems: an Unmanned Aerial System or Unmanned Ground Vehicle operating alone, as part of a group/swarm, or as a manned/unmanned team, capable of tactical photogrammetry and Measurement and Signature Intelligence collection to develop three-dimensional products (interior/exterior) in support of a platoon-sized objective. Capabilities for this step may also include several other characteristics, such as:

    • Low cost and disposable;
    • On-board real-time analytics;
    • Live streaming of data;
    • Remote launch and recovery; and many more.

    2 – Time-critical processing, under field conditions, of the data collected in step 1 to provide VR products in support of a platoon-level mission, including Denied, Disrupted, Intermittent, and Limited (DDIL) bandwidth considerations. Desirable characteristics include:

    • Capabilities that use low space, weight and power solutions;
    • Local or private cloud;
    • On premises processing;
    • Ability to import and export to community defined standard formats.

    3 – VR applications/experiences that may enhance a platoon or other small unit’s ability to rapidly gain situational awareness, complete their planning process and conduct mission rehearsals. The following characteristics are desirable:

    • Intuitive, multi-user operational environments that virtually recreate a real-world location of interest that is representative of a current operational situation;
    • Application that can ingest standard data formats;
    • Distributed collaboration capabilities using shared virtual user experiences;
    • Artificial Intelligence (AI) enabled representations of situations that the unit has never experienced;
    • Capabilities to collect metrics and measures of effectiveness while a team is using a VR experience to demonstrate improvements in decision making;
    • Geolocation-based data integration capability that enables current, proximal information to be represented virtually in situ.

    The DoD states that after a review of applications, the government may invite candidates to demonstrate and experiment with their capability during the Thunderstorm 21-1 event. The aim of the DoD’s investigation is to accelerate the delivery of innovative capabilities to US warfighters by demonstrating and experimenting with any capabilities that a VR tool might offer in an operationally relevant environment. In addition, candidates may be selected for follow-on events at facility-specific locations in support of future acquisition activities. However, this does not imply that the government will endorse, procure or purchase equipment at a later date.

    The results will then be used to inform key decision makers of emerging and available technology solutions that could potentially enhance or improve operational capabilities.

    Interested parties are able to submit applications to participate in the Thunderstorm 21-1 demonstration and experimentation event. 

    The application deadline is 5:00 PM EDT on June 19, 2020, and the event is planned to be conducted in the autumn of 2020 at the United States Military Academy (USMA) in West Point, New York.

    For more information on how to submit an application please contact Auganix.

    Image credit: U.S. Army

  • NNTC and Vuzix announce upgrade to iFalcon Face Control Mobile AI facial recognition platform on Vuzix Blade Smart Glasses

    In Augmented Reality News 

    May 12, 2020 – Vuzix Corporation, a supplier of smart glasses and augmented reality (AR) technology and products, together with NNTC, a Dubai-based software developer and solution provider, has today announced that iFalcon Face Control Mobile, a fully autonomous AI-powered face recognition system that is integrated with the Vuzix Blade Smart Glasses, has been upgraded to a more powerful and robust solution ready for 4G/5G city-wide deployments.

    iFalcon Face Control Mobile Release 2.0 is designed for law enforcement officers and security guards on patrol to screen crowds and match faces against a database of missing or suspected people, without requiring a network of CCTV cameras. According to the companies, the system can now manage hundreds of wearable devices and thousands of stationary cameras from a single interface.

    Detections and alerts are now propagated to nearby security personnel and operational centers. For field units in 4G/5G-covered communities, video streams from Vuzix Blade AR glasses or bodycams can be transmitted directly to a central server for video processing and face matching. The upgraded software uses GPS location data and makes it available to users, so that they can receive patrol and system notifications, as well as access online information about locations, status, alerts and situations from other team members.
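The article does not describe NNTC’s internal algorithm, but the matching step in systems of this kind is commonly a nearest-neighbour search over face embeddings: the probe face captured on the glasses is compared against every enrolled face by cosine similarity, and an alert fires only if the best match clears a threshold. The sketch below is purely illustrative under those assumptions; the embedding values, gallery names and threshold are hypothetical, not NNTC’s implementation.

```python
import numpy as np

def best_match(probe, gallery, threshold=0.6):
    """Return (identity, score) of the closest gallery embedding,
    or (None, score) if no enrolled face clears the threshold."""
    names = list(gallery)
    mat = np.stack([gallery[n] for n in names]).astype(float)
    # cosine similarity between the probe and every enrolled face
    mat = mat / np.linalg.norm(mat, axis=1, keepdims=True)
    p = probe / np.linalg.norm(probe)
    scores = mat @ p
    i = int(np.argmax(scores))
    if scores[i] < threshold:
        return None, float(scores[i])  # nobody on the watchlist
    return names[i], float(scores[i])

# toy 3-dimensional embeddings; real systems use 128-512 dimensions
gallery = {
    "subject_a": np.array([1.0, 0.0, 0.0]),
    "subject_b": np.array([0.0, 1.0, 0.0]),
}
name, score = best_match(np.array([0.9, 0.1, 0.0]), gallery)
```

In a deployment like the one described, the probe embedding would come from frames streamed off the headset camera, and a positive `name` would drive the wearer’s in-display alert.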

    As a result of its autonomous facial recognition system, earlier this year NNTC became a member of the Intel Internet of Things Solutions Alliance, an ecosystem of more than 900 companies working within the IoT industry. The 2.0 release of iFalcon Face Control Mobile uses the 8th Generation Intel Core processor (i7-8650U 1.90 GHz to 4.20 GHz Turbo, Quad Core, 8MB Cache, 15W TDP) as the heart of the wearable mobile server.

    Additionally, the new iFalcon Face Control setup is now 500 grams lighter and features only one battery instead of two, making the system more mobile, according to NNTC. The company is also considering adding OpenVINO, Intel’s AI and computer vision toolkit, into future releases of the product.

    “The world is changing and political, economic and even medical situations bring new challenges to police and security forces around the globe. Authorities are turning to emerging technologies to address those issues. Video analytics and facial biometrics are some of the best available response options,” said Dmitry Doshaniy, General Manager at NNTC. “Following the production rollout of our unique wearable face recognition solution, we have received invaluable feedback from the field. The NNTC team is glad to present the pinnacle of our efforts to bring facial recognition and video analytics where it is needed.”

    Paul Travers, President and Chief Executive Officer at Vuzix, commented: “The overall design and see-through waveguide optics of the Vuzix Blade are critical must-have features for deployments into security operations. We’re excited about the iFalcon Face Control Mobile 2.0 release, which offers new functionality and features that will help unify security forces to monitor their communities using Vuzix smart glasses technology.”

    The companies first announced their partnership and integration of the iFalcon Face Control Mobile with Vuzix’ Blade smart glasses in June 2019.

    Video credit: NNTC / YouTube

  • U.S. Army ERDC utilizing Augmented Reality solutions in the fight against COVID-19

    In Augmented Reality and Mixed Reality News

    May 8, 2020 – Scientists and engineers at the U.S. Army Engineer Research and Development Center (ERDC) are using augmented reality technology to assist peers throughout the U.S. Army Corps of Engineers (USACE) in virtually conducting site assessments of alternate care facilities (ACFs) across the country.

    Augmented reality technology, developed by researchers at the ERDC Information Technology Laboratory (ITL), offers a way for the USACE to assess potential ACF locations while assisting with social distancing and safety considerations.

    “The ERDC team is forward thinking in terms of how immersive computing can be applied to solve real-world engineering science and defense-related challenges,” said Jonathan Boone, an ITL research civil engineer.

    Using live-streaming and mixed-reality overlays, smaller groups of engineers located on-site have the capability to share information with subject-matter experts working remotely. In addition to the safety benefits of leveraging the technology, real-time collaboration of assessment results has expedited the delivery of information to FEMA, according to the U.S. Army. Images shared by the U.S. Army in relation to the ERDC assessments show a user appearing to wear a Microsoft HoloLens device.

    “Facility assessments are critical to the success of the ACF mission,” Boone said. “Having reachback, live-stream capabilities allows engineers and architects who are leading efforts from a ‘boots on the ground’ team perspective to get virtual support from other USACE subject-matter experts.”

    Using augmented reality for ACF assessments is just one of the ways the ITL team has used, and continues to use, innovative technology to address real-world challenges. The U.S. Army states that “Innovate, immerse and inspire” is the mission of the ERDC team members delivering solutions that utilize augmented reality technologies.

    “Augmented and virtual reality is the connective tissue for all things the ERDC is doing with artificial intelligence and robotics to leverage more informed decision-making for the nation and the warfighter,” Boone said. “In a virtual world, our stakeholders can practice, fail, learn and improve through repetition in a safe environment. That way, they’ll be better suited to perform their duties.”

    Currently five USACE districts are prototyping the technology, with two more districts planning to use the technology soon.

    Image credit: U.S. Army Engineer Research and Development Center

  • Kopin receives USD $2.7 million follow-on order for its Augmented Reality display technology for F-35 fighter aircraft

    In Augmented Reality News 

    May 2, 2020 – Kopin Corporation, a provider of wearable computing technologies and solutions, has this week announced that it has received an approximately USD $2.7 million follow-on order of its high-brightness liquid crystal display for the F-35 Joint Strike Fighter program. With the F-35 scheduled for production through 2030, Kopin anticipates additional orders over the life of the program.

    The F-35 jet fighter is a single-seat, single-engine, all-weather, day and night stealth, multi-role combat aircraft. The jet is designed to perform both air superiority and strike missions while also providing electronic warfare and intelligence, surveillance, and reconnaissance. Much of the functionality is enabled through an augmented reality (AR) helmet, which provides pilots with vast quantities of flight, tactical, and sensor information for advanced situational awareness, precision and safety. The extensive functionality and extreme conditions require unique display technology and Kopin is the sole supplier to this production program.

    The largest procurement program in the Department of Defense (DOD), the F-35 strike fighter aircraft is being procured in different configurations for multiple arms of the DOD, including the United States Air Force, Marine Corps, and Navy. The 500th production F-35 was delivered in March 2020 and Kopin reiterates that the current DOD plan is to acquire a minimum of 2,400 jets over the life of the program. In addition, US Allies are expected to purchase hundreds of F-35s with eight nations cost-sharing the program with the United States. With replacements, the total number of displays for the F-35 program is very substantial, according to Kopin.

    “This follow-on order extends our backlog of scheduled deliveries into the second quarter of 2021, providing good visibility for our manufacturing plan,” stated Bill Maffucci, Kopin’s Vice President of Government Programs. “As the sole provider of displays to the F-35 production program, we benefit from our strong relationship with the DOD in obtaining valuable feedback, which we leverage to continuously improve our display technology. These enhancements have implications for the F-35 program as well as many future opportunities with the military as well as enterprise customers.”

    Image credit: Kopin Corporation

  • Vuzix enters into agreement with “Major US Defense Contractor” to develop a customized Waveguide-based optics engine

    In Augmented Reality News

    March 13, 2020 – Vuzix Corporation, a supplier of Smart Glasses, Augmented Reality (AR) technology and products for the consumer and enterprise markets, has today announced that the company has signed an agreement with a new major US defense contractor to build a customized waveguide-based optics engine. Under the terms of the first phase of the agreement, Vuzix and its new customer have agreed upon an upfront payment and phase-gated development milestones and payments.

    Phase 1 is expected to generate initial non-recurring engineering (NRE) revenue for Vuzix over the next six months, with potentially greater NRE revenue in subsequent phases; acceptance of a final product design is then expected to lead to a volume production order.

    Commenting on the announcement, Paul Travers, President and Chief Executive Officer at Vuzix, said: “We are excited to enter into this partnership and believe it represents a strong vote of confidence in our capabilities and recognition of our leading position within the waveguide optics technology space”. He added, “Additionally, the agreement demonstrates how Vuzix is able to leverage our industry leading optics technology and partner with top US Defense contractors across a variety of vertical markets.”

    Image credit: Vuzix Corporation

  • Varjo introduces real-time green screen and marker tracking feature for its XR-1 Developer Edition headset

    In Virtual Reality and Mixed Reality News

    March 12, 2020 – Varjo, a provider of industrial-grade VR/XR headsets, has introduced real-time chroma keying and marker tracking as early access features for its XR-1 Developer Edition headset. An industry-standard technique known as ‘green-screening’ and used in broadcasting and film, Varjo states that it is the first company to deliver chroma keying in real-time for mixed reality devices. With marker tracking, professional users can anchor any virtual objects to the real world using printable visual markers.

    Together, these two features allow enterprise customers to integrate virtual and real worlds, interact with photorealistic virtual content as they would in real life, and achieve high levels of accuracy and occlusion inside mixed reality. The video above showcases the power of chroma keying and object tracking.

    “Since its commercial launch in December 2019, Varjo’s XR-1 Developer Edition has quickly become the most demanded mixed reality product for professional users, transforming the way companies train, design and conduct research in immersive environments,” said Urho Konttori, Chief Product Officer and co-founder of Varjo. “When our customers asked us to create a seamless solution for blending the real and virtual worlds, we immediately jumped to the challenge. We’re excited to introduce real-time chroma keying and object tracking to our customers just three months after the first deliveries of the XR-1, enabling absolute immersion inside mixed reality.”

    Varjo states that chroma keying is particularly beneficial for professional workflows where aligning virtual content accurately with the physical world is crucial. Users can now define parts of reality, identify them with color and replace them with virtual models or scenery without heavy development costs. With chroma key, virtual content is also occluded by real-world objects or hands, allowing for intuitive interactions. Using Varjo’s object tracking with visual markers, professionals can make virtual objects appear exactly where they want them in their surroundings. Example use cases include:

    Training and simulation: A pilot can sit in a replica of a plane or helicopter cockpit and look outside to see themselves flying in ultra-immersive visual scenery, all while operating physical cockpit controls for realistic training. Chroma keying also enables multi-user training scenarios.

    Design: An automotive designer can sit in a car and replace parts of the interior with designs that are not yet built in reality. Designers can also collaborate in an immersive mixed reality space, interacting with virtual models and making changes to them in real-time, or virtually ‘dress’ 3D prints to look like material-finished products.

    Research: Academic, clinical, and commercial researchers can conduct studies inside life-like mixed reality, simultaneously combining virtual and real-world elements into the research environment. Subjects can hold virtual products or instruments in their hands and interact with them.
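At its core, the chroma keying technique described above replaces camera pixels close to a known key colour with rendered content, while pixels that don’t match the key (hands, cockpit controls) stay real and naturally occlude the virtual scene. The snippet below is a minimal NumPy sketch of that idea only, not Varjo’s real-time implementation; the key colour, tolerance and test frame are arbitrary assumptions.

```python
import numpy as np

def chroma_key_mask(frame_rgb, key_rgb=(0, 255, 0), tol=60):
    """Boolean mask of pixels within `tol` of the key colour.

    True marks pixels to replace with virtual content; False
    pixels stay real and occlude the rendered scene."""
    diff = frame_rgb.astype(np.int32) - np.array(key_rgb, dtype=np.int32)
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return dist < tol

def composite(frame_rgb, virtual_rgb, mask):
    """Replace keyed pixels with the virtual layer."""
    out = frame_rgb.copy()
    out[mask] = virtual_rgb[mask]
    return out

# a 2x2 test frame: two green-screen pixels, two 'real' pixels
frame = np.array([[[0, 255, 0], [10, 20, 30]],
                  [[200, 50, 50], [0, 250, 10]]], dtype=np.uint8)
virtual = np.full_like(frame, 128)  # stand-in rendered layer
mask = chroma_key_mask(frame)
merged = composite(frame, virtual, mask)
```

A production headset pipeline would run an equivalent per-pixel test on the GPU for every camera frame, with soft edges and colour-spill suppression rather than a hard threshold.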

    “With chroma key, Varjo took an industry-standard technique and turned it into a useful new feature for dynamic mixed reality simulations,” said Bob Vaughn, Product Manager at FlightSafety International, a provider of aviation training. “We look forward to further exploring the feature applied to a variety of simulation opportunities. We highly value our collaborative relationship with Varjo, and are excited to continue to push the boundaries of mixed reality.”

    Both chroma keying and marker tracking are available in early access to all users of the XR-1 Developer Edition headset, which is capable of streaming high-resolution video and virtual reality to the user without any observable latency.

    Varjo demonstrated the new features on the XR-1 headset at the DSET (Defence Simulation Education and Training) trade show this week in Bristol, UK.

    Video credit: Varjo/Vimeo

  • Alion to demonstrate its Mixed Reality ‘Virtual Sandbox’ tactical wargaming system to U.S. Air Force’s AFWERX

    In Augmented Reality, Virtual Reality and Mixed Reality News

    February 19, 2020 – Alion Science and Technology has today announced that it has been selected to demonstrate its ‘Virtual Sandbox’ technology at the USAF Future of Wargaming Showcase in Las Vegas, Nevada from February 25 – 26, 2020. Alion works with Defense and Intelligence communities to design and deliver advanced engineering solutions to meet current and future demands, and is one of 24 companies selected to participate from 78 submissions. 

    Alion will demonstrate its Virtual Sandbox, a mixed-reality (MR) tactical wargaming system. This product, along with the Navy Continuous Training Environment (NCTE), managed and operated by Alion, will support development of the cloud-based, DevSecOps driven, wargaming Novel Distributed Integrated Concept Environment (NoDICE).

    The Alion solution creates live virtual simulations for wargaming by integrating with numerous data sets, including: Radio Frequency; Intelligence, Surveillance, and Reconnaissance; Full Motion Video; and Live, Virtual, and Constructive. The team has integrated augmented reality (AR), virtual reality (VR), and artificial intelligence (AI) to enable real world experiences from the users/operators as well as real time model development. Alion states that the solution’s architecture helps enable rapid integration of commercially developed capabilities.

    “Alion’s solution for wargaming aligns to the DoD’s DevSecOps Reference Design while pulling in lessons learned from the DoD’s Big Data Platform and the Navy Continuous Training Environment,” said Alion Senior Vice President of Cyber Network Solutions Katie Selbe. “Alion pairs off-the-shelf technology from the government and commercial sectors with Open Source Software to take advantage of innovations and best practices for wargaming. This has allowed the team to rapidly create 2D, 3D, AR, VR, and other solutions for wargaming and training.”

    Demonstrations will be presented to AFWERX, U.S. Air Force, and the U.S. Joint Forces. The AFWERX program’s goal is to foster Air Force engagement across industry, academia and non-traditional contributors to create transformative opportunities and help bring about an Air Force culture of innovation, with an ultimate aim of solving problems and enhancing the effectiveness of the Air Force.

    Image credit: Alion Science and Technology

  • Vuzix and TensorMark collaborate to integrate facial and object recognition into Vuzix Blade Smart Glasses

    In Augmented Reality News

    February 13, 2020 – Vuzix Corporation has announced a partnership with US based TensorMark, a cloud-based AI and computer vision technology provider, to integrate their two solutions on the Vuzix Blade Smart Glasses.

    TensorMark has been developing an interface specifically for the Vuzix Blade Smart Glasses that takes advantage of the physical attributes of the device, including the color display and on-board camera, to integrate TensorMark’s cloud-based AI and computer vision technology, and more specifically its facial and object recognition. The company’s AI and computer vision technology has applications developed for verticals including loyalty and retail; corporate and personal security and access control; and banking services and Fintech.

    With the TensorMark technology and the Vuzix Blade Smart Glasses, customers will be able to ID facial and object images hosted on a cloud database that can be customized and tailored specifically to every client. Security camera output, as well as drone aerial footage, can be analyzed by the TensorMark backend system to provide important information and send alerts directly to the display on the Vuzix Blade, providing real-time digital intelligence to customers. TensorMark is also working on adding the ability to predict behavior patterns using emotion detection to provide clients with even more vital information with its AI algorithm.

    “We are thrilled about our partnership with Vuzix to bring our AI and computer vision technology suite to the Vuzix Blade Smart Glasses,” said J.P. Weston, CEO and co-founder of TensorMark. “The combination of the Vuzix Blade with our facial and object recognition backend will open up very significant business opportunities for both companies across numerous market verticals that include border patrol, first responders, hospitality, retail, and banking.”

    Paul Travers, CEO and president of Vuzix, commented: “The TensorMark solution, working in unison with the Vuzix Blade Smart Glasses and leveraging the ongoing developments in 5G and edge computing, will be a recipe that will disrupt the personal and professional security marketplace.” Travers added, “We are excited to partner with TensorMark to address the growing interest in the Vuzix Blade Smart Glasses across the security market.”

    The companies are collaborating on numerous facial and object recognition proof of concept demos centered around the Vuzix Blade Smart Glasses, including one with a Fortune 50 company that is evaluating the solution for deployment across their customer base, leveraging AI and edge computing to provide enhanced intelligence to security personnel.

    Image credit: TensorMark

  • ThirdEye partners with 3D Media to improve training for U.S. Air Force flight crews

    In Augmented Reality and Mixed Reality News

    February 11, 2020 – ThirdEye, a provider of augmented reality (AR) and mixed reality (MR) enterprise solutions and devices, has today announced its partnership with 3D Media, a technology development firm that specializes in virtual and augmented reality solutions for enterprise.

    The partnership will see the two companies work together to improve training and human performance for flight line maintainers and flight crews working with the B-1 Lancer aircraft in the U.S. Air Force’s 7th Bomb Wing. ThirdEye states that its X2 MR Glasses, which weigh just 300 grams, will provide personnel with improved safety, efficiency and proficiency when working on the mechanical structures of the aircraft, known as ‘airframes’.

    In November 2019, 3D Media was awarded a USD $1 million Small Business Innovation Research (SBIR) grant from AFWERX to build AR tools to improve training for airmen. According to ThirdEye, its track record of execution and success in military environments was a key reason that its X2 MR Glasses were chosen for research and development purposes, along with a larger deployment into the Air Force and U.S. Department of Defense.

    The X2 MR Glasses’ industrial capabilities, their platform built for working applications, and their ability to connect users with subject matter experts in harsh environments are all features that ThirdEye states align with the military’s needs. By wearing the X2 MR Glasses, flight crews and flight line maintainers can connect with experts in any environment via the glasses’ built-in proprietary 3D SLAM (simultaneous localization and mapping) system and CAD modeling and overlay, with step-by-step technical instructions and drawings projected onto the X2 MR Glasses display. The glasses can also take screenshots and enlarge images for better visibility. Flight crews and maintainers are then able to open and view documents via voice command while working hands-free, without worrying about Wi-Fi connection, thanks to the X2’s 5G capabilities.

    “As more of the world’s military adopt AR and MR solutions to better support the training and work of personnel, we’re honored to be working with 3D Media to contribute to the U.S. Air Force’s efforts and increase the safety of our airmen,” said Nick Cherukuri, Founder and CEO of ThirdEye. “From our previous work in the military, ThirdEye’s X2 MR Glasses have undergone drop tests, so we know it’s ready to support airmen however needed, no matter where they are.”

    The X2 MR Glasses feature a wide field of view and sensors that provide advanced MR features that are not available on a monocular device. Additionally, the glasses are entirely hands free, which the company states is important for being out in the field where wires can be a potential hazard. The X2 MR Glasses also run on the latest Android operating system, which allows software to be easily ported onto the glasses.

    Commenting on the announcement, Daryl Roy, CEO and Founder of 3D Media, said: “When we go into a project, we never assume what the outcome is going to be – the answer comes from collaborating with other experts. By partnering with ThirdEye, we’re arming airmen with X2 MR Glasses to significantly improve their day-to-day”. He added, “I believe that augmented reality’s largest opportunity is in the area of human performance, where there’s no margin for error. ThirdEye has proven to be successful in military environments, and with life and death literally on the line, it’s important to arm our military personnel with the best.”

    Earlier this year, ThirdEye announced that its X2 Mixed Reality Glasses were mass shipping worldwide, along with the availability of hand tracking and gesture controls for the X2 glasses – features that will no doubt come in useful for flight line maintainers using the devices as part of today’s partnership.

    Image credit: ThirdEye

  • U.S. Army using Virtual Reality to help Soldiers shape hypersonic weapon prototype

    U.S. Army using Virtual Reality to help Soldiers shape hypersonic weapon prototype

    In Augmented Reality, Virtual Reality and Mixed Reality News

    February 10, 2020 – Using virtual reality, U.S. Army Soldiers from Fort Sill, Oklahoma are getting a rare look at components of the Army’s new prototype Long Range Hypersonic Weapon (LRHW) and influencing how the system is designed.

    The U.S. Army stated that through a mix of virtual reality, augmented reality and mixed reality technologies, Soldiers last month were able to walk around and “touch” the Army’s new prototype LRHW system as an interactive, true-to-scale, three-dimensional model. 

    Inside the mixed reality lab, known as the Collaborative Human Immersive Laboratory, or CHIL, the Soldiers could view the equipment from any angle, at any distance and manipulate it as needed in order to better understand its operation and recommend improvements.

    “We were able to stand as a group around an area called ‘the cave,’ which allowed all of us to see, in 3D, through special eyewear, the Transporter Erector Launcher and missile as one,” said LTC Aaron Bright, the chief of the Operational Training Division of the Directorate of Training and Doctrine at Fort Sill. “I was able to grab pieces of the LRHW with my hands and move them weightlessly to the side to get a better look at another part, and to better understand how the system as a whole works. The kinds of things that would take hours with a crane, and several more hours with tools, we were doing on our own in seconds.”

    While hypersonics is often considered a futuristic, complex technology, the input received focused on seemingly low-tech items that are critical to Soldiers’ operational experience, such as generator placement and access, excess equipment that could be removed to save weight, generator exhaust routing, and specific locations for skid plates.

    “You can apply virtual reality and augmented reality to almost any concept the Army or other component has and gain vital feedback”

    As the prototype is built, this early Soldier feedback will help the U.S. Army identify any quick-fix flaws, as well as offer ways to improve the operational capacity. The system consists of a 40-foot Transporter Erector Launcher (TEL) with missiles and a Battery Operations Center (BOC). The truck and trailer combination, and the BOC, are all taken from existing Army stock, and are in the process of being modified to create new equipment that’s never been used in this way before.

    “You can apply virtual reality and augmented reality to almost any concept the Army or other component has and gain vital feedback,” said 1st Sergeant Michael Weaver, with the 1-31st Field Artillery Battalion, 434th Field Artillery Brigade at the Fort Sill Fires Center of Excellence. “Identifying potential issues early on in the development process is crucial because it is easier and cheaper to adjust design during the concept phase as opposed to production.”

    The Army Rapid Capabilities and Critical Technologies Office (RCCTO) is charged with delivering the prototype LRHW to a battery no later than fiscal year 2023. This aggressive prototyping schedule, which pushes the Army’s initial hypersonic capability delivery ahead by two years, means the Army can’t wait until the hardware is modified and integrated to gather Soldier feedback. Virtual reality fills the void by enabling Soldier touch points on an early and regular basis.

    “We have a very tight timeline with the LRHW,” said COL Ian Humphrey, integration project manager for the RCCTO’s Army Hypersonic Project Office. “We have to make it safe and we must meet very hard requirements. Although the LRHW is a prototype, the Soldier feedback we get here provides operational input early in the process. This is not only to help inform the LRHW, but also aid in the development of the Army’s hypersonics program of record.”

    The mixed reality CHIL enables real-time collaboration through equipment including virtual reality headsets, 3D glasses, holograms, and handheld controllers. The facility is owned by Lockheed Martin, which is under contract to deliver the All Up Round plus Canister (AUR+C), which includes the missile stack, the Common Hypersonic Glide Body, and canister. The company also serves as the LRHW prototype system integrator.

    Soldiers will be involved throughout the process and as more integrated and modified hardware becomes available, they’ll get a chance to walk around the real system. Plans are also in the works to create a CHILNET, which would allow remote sites to utilize the simulations and interact in real-time from multiple locations.

    Image credit: U.S. Army/Lockheed Martin

  • SA Photonics releases SA-62/E electronic see-through Augmented Reality HMD

    SA Photonics releases SA-62/E electronic see-through Augmented Reality HMD

    In Augmented Reality News

    January 30, 2020 – SA Photonics, a provider of photonics solutions for military and commercial applications, has announced the release of its SA-62/E augmented reality (AR) head mounted display (HMD) with electronic see-through and almost no peripheral obscuration. The AR display provides high contrast, bright imagery even when viewed outdoors, as well as full occlusion of objects that are behind other objects, according to the company.

    The video processing system, developed in partnership with SL Process (Paris, France), has low latency and provides simultaneous localization and mapping (SLAM), as well as hand tracking. The video can be stored for review or shared with teammates.

    SA Photonics states that the eyepieces of the SA-62/E were specifically designed to have less than 10% obscuration of the entire human visual field, allowing users to interact with people and other objects, while still affording safe operation given that users can still see their feet and the ground.

    Dr. Michael Browne, SA Photonics’ General Manager, commented: “The SA-62/E provides a seamless transition between the real and augmented worlds. Our system will provide great benefits to users who require occlusion and operation in high ambient brightness environments.” Stan Larroque, CEO of SL Process, added: “We are excited to integrate our low latency video processing, SLAM and hand tracking architectures with the SA-62/E HMD. I believe this will be a revolutionary AR display.”

    The SA-62/E HMD will be demonstrated at the SPIE AR/VR/MR Conference in San Francisco on February 3-4 in Booth 9.

    Image credit: SA Photonics

  • Vuzix announces ONVIF security camera support for its full line of Augmented Reality Smart Glasses

    Vuzix announces ONVIF security camera support for its full line of Augmented Reality Smart Glasses

    In Augmented Reality News

    January 15, 2020 – Vuzix Corporation, a supplier of smart glasses and augmented reality technology and products, has today announced that it has made available an ONVIF-compatible security application on the Vuzix App Store, along with drivers and source code access for third-party integration, for the Vuzix Blade Smart Glasses, with planned follow-on support for its full M-series line of Smart Glasses.

    ONVIF (Open Network Video Interface Forum) is a global and open corporate forum that was founded to standardize the IP-based surveillance camera industry. The IP security camera market is projected to grow from its current market value of more than $8 billion to over $20 billion by 2025, according to a study by Global Market Insights.

    The Vuzix Blade Smart Glasses resemble ordinary glasses, and make for an ideal augmented reality accessory for security personnel who are using ONVIF security cameras across a wireless network. With the Vuzix ONVIF application, security personnel can access multiple cameras on the wireless network and toggle between IP surveillance camera feeds. With the available drivers and source code, third parties can easily integrate Vuzix Smart Glasses into existing solutions.
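    The feed-toggling workflow described above can be sketched in a few lines. The following is an illustrative sketch only, not Vuzix code: the `FeedSwitcher` class and the RTSP URIs are hypothetical, standing in for the camera addresses that an ONVIF discovery step would return on a real network.

```python
# Illustrative sketch only: a minimal feed switcher such as a smart-glasses
# security app might use to cycle between IP camera streams. The class name
# and the RTSP URIs below are hypothetical, not part of any Vuzix API.

class FeedSwitcher:
    """Cycle through a fixed list of camera stream URIs."""

    def __init__(self, uris):
        if not uris:
            raise ValueError("at least one camera feed is required")
        self._uris = list(uris)
        self._index = 0

    @property
    def current(self):
        """URI of the feed currently being viewed."""
        return self._uris[self._index]

    def next(self):
        """Toggle to the next feed, wrapping around at the end."""
        self._index = (self._index + 1) % len(self._uris)
        return self.current


# Example: three hypothetical ONVIF camera endpoints on a local network.
cameras = FeedSwitcher([
    "rtsp://192.168.1.10/stream1",
    "rtsp://192.168.1.11/stream1",
    "rtsp://192.168.1.12/stream1",
])
```

    In a real deployment the URI list would come from ONVIF device discovery and each `current` URI would be handed to a video player on the glasses; the wrap-around in `next()` is what lets a single gesture or voice command loop through all available feeds.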

    “The introduction of the ONVIF security application for the Vuzix Blade provides a new use case for the Blade in enterprise and opens up new segments of the market related to security especially within big box retail stores, public safety, smart cities and others,” said Paul Travers, Vuzix President and Chief Executive Officer.

    Image credit: Vuzix

  • Intevac, Inc. receives USD $8.1 million contract award for development of night vision goggles with Augmented Reality capabilities

    Intevac, Inc. receives USD $8.1 million contract award for development of night vision goggles with Augmented Reality capabilities

    In Augmented Reality News

    January 14, 2020 – Intevac, Inc. has today announced that it has received a USD $8.1 million contract award from the U.S. Army for the 24-month development of the Delta I fused digital night vision goggle incorporating advanced augmented reality capabilities. The program is in support of a Coalition Warfare Program for the special operations forces of the United States, Australia, Canada and the United Kingdom.

    “This will be the first goggle system to incorporate Intevac’s newly-developed ISIE 19 EBAPS sensor,” commented Timothy Justyn, Executive Vice President and General Manager of Intevac Photonics. “This award continues to demonstrate Intevac’s commitment to delivering the latest digital night vision technology to our Warfighters.”

    “We are very proud to have received this system development order for our digital night vision technology,” added Wendell Blonigan, President and Chief Executive Officer of Intevac.

    Intevac’s digital night-vision sensors, based on its patented Electron Bombarded Active Pixel Sensor (EBAPS) technology, provide state-of-the-art capability to the most advanced avionic fighting platforms in the U.S. Department of Defense inventory, according to the company.

    Intevac was founded in 1991 and has two businesses: Thin-film Equipment and Photonics. The Thin-film Equipment business focuses on the design and development of high-productivity thin-film processing systems for substrates with precise thin-film properties, such as hard drive media, display cover panels, and solar photovoltaic products. The Photonics business develops high-sensitivity digital sensors, cameras and systems that primarily serve the defense industry. Intevac states that it is the provider of integrated digital imaging systems for most U.S. military night vision programs.

    Image credit: Intevac, Inc.

  • VRgineers unveils the next generation of its XTAL professional 8K Virtual Reality headset

    VRgineers unveils the next generation of its XTAL professional 8K Virtual Reality headset

    In Augmented Reality, Virtual Reality and Mixed Reality News

    January 7, 2020 – VRgineers has today introduced the latest generation of its XTAL professional VR headset. Already in use with the U.S. Department of Defense, major national aircraft simulators and leading global automobile manufacturers, this latest version of the XTAL headset incorporates 8K resolution along with a set of new features, and is optimized for powerful GPUs such as the NVIDIA Quadro RTX 8000.

    “Our customers are using the latest cutting edge technologies, so they expect the best from their VR solutions,” said Marek Polčák, VRgineers CEO & co-founder. “That means solutions optimized for the latest NVIDIA RTX cards with VirtualLink embedded. The latest generation of XTAL VR headset is the only solution today that meets these needs with 8K resolution.”

    VRgineers introduced the original XTAL headset two years ago, which was designed specifically for the needs of professional users such as designers, engineers, pilots, doctors and trainers, and allowed them to virtually design, create, prototype, teach or train with high image quality and a wide field-of-view. The company states that its XTAL headset offers the visual quality needed for even the most demanding tasks.

    VRgineers’ new XTAL headset includes the following technologies:

    • High-density LCD displays with 8K resolution;
    • Foveated rendering capabilities;
    • AR mixed reality module add-on;
    • Improved lenses with 180° FOV;
    • Eye tracking capable of running up to 210 fps;
    • Highly accurate Leap Motion sensors;
    • Unique VirtualLink cable implemented;
    • Capable of embedding to helmets for simulations.

    This latest generation of XTAL features an embedded VirtualLink cable manufactured by BizLink. VirtualLink is an open, standards-based USB Type-C industry standard developed to meet the connectivity requirements of current and next-generation virtual reality headsets, designed to deliver the power, display, and data required to drive a VR headset through a single USB Type-C connector, providing a durable and simple connection. Instead of multiple cables, VirtualLink-enabled systems like XTAL require only a single cable.

    The XTAL headset is currently in use at a variety of U.S. Air Force airbases for pilot training, according to VRgineers. The first recipient of the new XTAL model will be Vance Air Force Base in Oklahoma, which VRgineers states has ordered XTAL headsets as part of a complete upgrade to its training center. VRgineers is also participating in U.S. Navy and NAVAIR R&D initiatives to create next-generation simulator solutions.

    “The feeling that I got while flying the F18A in full VR mode in XTAL is really astonishing. It was so close to reality that I felt I was inside the F18A. As a pilot, that is exactly what I need to feel for training purposes,” said Capt. Taimeir, a former F18 pilot from the Swiss Air Force and CEO of Mirage Technologies.

    VRgineers will be showcasing its new generation of the XTAL professional VR headset at this year’s CES 2020 in Las Vegas, January 7-10. Demos can be arranged upon request with the company.

    Image credit: VRgineers