New university research applies biometrics to flight training
A joint project between Southern Methodist University (SMU) and simulator manufacturer CAE explored the use of biometric measurements during flight training, with a view to investigating the benefits of a personalized training regimen, AINonline reports.
The new approach combines biometrics with machine learning techniques to measure situational awareness and cognitive load of pilots in various scenarios.
Biometric measurements include visual gaze patterns, pupil size, and heart rate to determine a pilot’s level of engagement, workload, situational awareness, stress, or fatigue.
For example, a “bad” gaze pattern, depending on the phase of flight, could indicate a high workload, while a “correct” gaze pattern would indicate a higher level of attention and performance.
Similarly, fewer blinks or blinks of shorter duration could be correlated with tasks requiring greater attention, and heart rate variability could be used to track exertion.
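As a rough illustration of how such signals are typically quantified (the article does not describe the project's actual metrics), heart rate variability is often summarized with RMSSD over inter-beat intervals, and blink behavior with average blink duration. The function names and sample readings below are hypothetical:

```python
from statistics import mean

def rmssd(ibi_ms):
    """Root mean square of successive differences between consecutive
    inter-beat intervals (ms) -- a common heart rate variability metric."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def mean_blink_duration(blinks):
    """Average blink duration in seconds from (start, end) timestamps."""
    return mean(end - start for start, end in blinks)

# Illustrative sensor readings, not project data
ibi = [812, 795, 830, 801, 788, 820]            # ms between heartbeats
blinks = [(1.20, 1.35), (4.02, 4.12), (7.50, 7.61)]  # blink start/end, s

print(f"RMSSD: {rmssd(ibi):.1f} ms")
print(f"Mean blink duration: {mean_blink_duration(blinks):.3f} s")
```

Lower RMSSD and shorter blinks would, under the correlations described above, be read as markers of higher attentional demand.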
Biometric data was collected using a virtual reality headset with an integrated eye-tracker and a wrist-worn device, then correlated via computer analysis.
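Correlating streams from two independent devices generally requires aligning their samples in time. The article does not say how the project did this; a minimal sketch of one common approach, nearest-timestamp matching within a tolerance window, might look like this (all names and the tolerance value are assumptions):

```python
import bisect

def align_nearest(eye_ts, wrist_ts, tolerance=0.05):
    """Pair each eye-tracker timestamp with the nearest wrist-device
    timestamp within `tolerance` seconds. Both lists must be sorted.
    Returns a list of (eye_time, wrist_time) pairs."""
    pairs = []
    for t in eye_ts:
        i = bisect.bisect_left(wrist_ts, t)
        # Only the neighbors on either side of the insertion point
        # can be the nearest match in a sorted list.
        candidates = wrist_ts[max(0, i - 1):i + 1]
        if candidates:
            nearest = min(candidates, key=lambda w: abs(w - t))
            if abs(nearest - t) <= tolerance:
                pairs.append((t, nearest))
    return pairs

# Illustrative timestamps (seconds); the third eye sample has no
# wrist sample close enough and is dropped.
print(align_nearest([0.00, 0.10, 0.20], [0.01, 0.12, 0.50]))
```

Once aligned, the paired samples can be fed to downstream analysis such as the machine learning models described above.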
The project is now entering its fourth year and, according to the researchers, some of the early results of the automated biometric tests largely reflect the assessments of highly experienced human evaluators.
“Our theory is that biometrics during the simulation will result in much more objective and accurate measurements than asking users a few questions after the simulation to gauge their experience,” says Suku Nair, director of the Center for Virtualization at SMU.
This, in turn, could make flight training more personalized, effective, and efficient.
However, accelerating learning with biometric sensing is a difficult and unproven hypothesis, according to the study’s lead researcher, Eric Larson.
“This research aims to understand how sensing can be used to understand a learner’s level of proficiency in a difficult task, such as flying an airplane. We hope to advance the field of research by being the first group to show whether personalized and automated learning can be effective in a real-world learning scenario.”
For context, the research was originally designed to support a 2019 Department of Defense project focused on accelerating complex skills training and multi-domain warfare support.
In the same year, SMU and CAE (then part of L3Harris Technologies) proved for the first time that machine learning based on biometric data could deliver accurate performance results in real time. L3Harris’ flight training technology business was later acquired by CAE, after Leidos acquired the company’s biometrics and security business in 2020.
Fast forward to 2022, and much of the data collected by the project was based on a repeated-measures experiment with 40 test subjects of varying backgrounds and experience levels, each flying a mixed reality (MR) flight simulator in a controlled environment.
SMU and CAE demonstrated the feasibility and utility of the physiological sensor system through 33 actual flight maneuvers performed by pilots at Edwards Air Force Base.
Eye-tracking technologies have also recently been used by Microsoft as part of its Flight Simulator game.