As commercial and military aircraft become more technologically complex, the University of Iowa Operator Performance Laboratory (OPL) is conducting research to better understand how much a pilot can handle in the cockpit before becoming overwhelmed and stressed—a situation that can turn deadly in seconds.
In a separate study, but one that also aims to keep pilots and passengers safe, the lab is testing new 3-D technology to help helicopter pilots land amidst blowing dirt, sand, and snow.
OPL director Tom Schnell recently traveled to Europe to fly a Swiss Army helicopter equipped with the technology, which uses a pulsed laser to measure distances and locate obstacles. Also participating in the test were members of the NATO Industrial Advisory Group, who work together to improve military technology. In Europe, the testing environment was a snowy mountaintop, but last September Schnell and his team traveled to Arizona to evaluate the equipment in a dusty patch of desert. The Arizona trip was sponsored by Hensoldt (a spin-off of Airbus’ defense electronics activities), the maker of the 3-D technology.
The projects are good examples of the work OPL does to improve the efficiency and safety of flight-management systems and other airborne sensor systems. Housed in a cluster of hangars and office buildings at the Iowa City Municipal Airport, OPL is a member of the UI’s Center for Computer Aided Design (CCAD). Both OPL and CCAD are part of the UI College of Engineering.
“Pilots have to perform a lot of headwork when they’re in the cockpit, whether it’s the cockpit of a medevac helicopter or the cockpit of a commercial jetliner,” says Schnell. “At OPL, we provide an academic research environment that allows for rigorous testing of new technology, as well as the study of pilot behavior. The goal of many of the studies we undertake is to keep pilots and passengers safe.”
Avoiding cockpit confusion
To better understand pilot function in the cockpit, OPL researchers recently partnered with Rockwell Collins, an avionics and information technology systems company, to equip one of the lab’s two jet fighter training aircraft with an additional computer and touch screen to virtually augment pilot workload in flight. NeuroTracker software by CogniSens Inc., a Canadian firm that uses neurological technologies to enhance cognitive capabilities, produces images of spheres, each with an ID number, that move across the screen.
As the study participants perform flight maneuvers of varying difficulty—from an easy 30-degree right turn to a timed, 1,000-foot descent with a 360-degree turn—they also must keep track of the moving spheres. A camera attached to the pilot’s helmet records eye movement, and a heart monitor tracks heart rate. This information is recorded, along with other physiological readings, using software created by OPL called the Cognitive Avionics Tool Set, or CATS.
Schnell recruited pilots with a low number of flight hours because he wanted to measure how quickly their flight performance deteriorated as a result of increased workload. (Experienced pilots are more adept at juggling activities.) Some study participants came from a flight school in Dubuque, Iowa, and others volunteered in exchange for flying time in the jet.
“I’ve never flown a plane like this, so it’s pretty exciting,” says study participant Bryce Richards, of Polk City, Iowa.
Dressed in a flight suit, Richards puts on the camera-equipped helmet and climbs into the dual-cockpit jet with Schnell. Upon reaching an altitude of roughly 10,000 feet, Richards takes control of the aircraft, running various maneuvers over the flat, Midwestern terrain.
Inside the hangar, researchers Chris Reuter and Maxime Montariol, a French Air Force Academy student who sought work in the lab after watching a YouTube video spotlighting OPL’s work, monitor a set of computer screens that depict Richards’ situation in real time. Communicating via radio with the jet and interacting with the onboard computers through a digital data link, Reuter and Montariol carefully record the ID numbers of the moving spheres as Richards reads them.
Although test participants practice on the NeuroTracker software for weeks before takeoff, it’s much more difficult to track the moving spheres while flying. These high-pressure multitasking moments are exactly what Schnell and his team want to observe.
“Basically, we’re trying to see how much we can put on them and still have them stay within a safe flight margin,” says Reuter, a UI graduate student in industrial engineering from Coralville, Iowa.
Later, after the jet has returned to the hangar, Schnell talks about the future of air warfare and the strains that “complex, large-force engagements” put on military pilots. He explains that pilots today have to do a lot of button pushing in the cockpit and that the use of infrared search systems—used to spot enemy planes—means that pilots must take their eyes away from the horizon to monitor a screen.
“All of these technologies require more headwork, more coordination, and more multitasking,” says Schnell. Commercial pilots are under similar pressure, he says.
Aerospace companies such as Rockwell Collins are looking to develop better flight training and simulation technology, including the use of biometrics and cognitive elements, Schnell says. He is hopeful that the research he’s doing on pilot workload in the cockpit will help the company bring enhanced simulation and training tools to the commercial market.
“The degradation from a situation where all is well to where aircraft control is lost can be a few seconds,” Schnell says. “Tracking cockpit automation takes an analytic mind and is a task that must be learned in order to be mastered.”
Augmented reality in the cockpit
Even the most masterful pilot can run into trouble when landing an aircraft in degraded visual environments, situations that might involve flying dust, sand, or snow. That’s why Hensoldt developed technology that fuses digital and sensor information with natural vision to create “augmented reality.”
Although virtual reality has become common in the gaming industry, augmented reality—the fusion of virtual and real-world content—is still relatively new in aerospace, says Schnell. The goal of the research partnership with Hensoldt is to take this emerging technology and make it more accessible to aerospace firms. Early tests confirm that the technology holds great promise, especially in terms of improving air safety.
During the testing at the U.S. Army’s Yuma Proving Ground in Arizona, Schnell and his team attached a SferiSense 500 lidar (light detection and ranging) sensor to the outside of an Mi-2 helicopter. The device emits laser pulses that reflect off obstacles, sending back information about the terrain—including the location of boulders, poles, electrical wires, and buildings that could be hidden by a “brownout” (flying dust) or a “whiteout” (flying snow).
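The ranging principle behind such a pulsed lidar is straightforward: the sensor times how long each laser pulse takes to bounce back, then converts that round-trip time to a distance using the speed of light. The sketch below illustrates the idea; the function name and the example timing value are illustrative, not taken from the SferiSense hardware.

```python
# Minimal sketch of pulsed-lidar time-of-flight ranging:
# fire a laser pulse, time the reflection, convert to distance.
C = 299_792_458.0  # speed of light in a vacuum, m/s


def pulse_range_m(round_trip_s: float) -> float:
    """Distance to an obstacle given the pulse's round-trip time."""
    # Divide by 2 because the pulse travels out and back.
    return C * round_trip_s / 2.0


# A reflection returning after ~200 nanoseconds implies an
# obstacle roughly 30 meters away.
print(round(pulse_range_m(200e-9), 1))
```

Repeating this measurement millions of times per second while sweeping the beam yields the 3-D terrain map the helicopter system builds before dust or snow obscures the view.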
This information is relayed electronically to the pilot, who wears a special helmet. The helmet uses binocular technology to project 2-D and 3-D symbols onto the inside of the helmet visor (similar to Tony Stark’s Iron Man mask). Cameras inside the cockpit track the pilot’s head movement and synchronize the computer-generated images to match the pilot’s view, no matter which direction the pilot turns.
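Keeping a symbol “attached” to the real-world obstacle it marks means counter-rotating the overlay as the head turns. The toy sketch below shows the core 2-D transform under assumed conventions (x forward, y to the pilot’s left, positive yaw to the left); the function name and values are hypothetical, not part of the Hensoldt system.

```python
import math


def to_helmet_frame(x: float, y: float, head_yaw_rad: float):
    """Rotate a world-frame point into the helmet's frame of reference.

    Rotating by the *opposite* of the tracked head yaw keeps the
    projected symbol aligned with the fixed obstacle it marks.
    """
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    return (c * x - s * y, s * x + c * y)


# An obstacle 10 m dead ahead; the pilot turns 90 degrees to the
# right, so in the helmet frame the symbol shifts to the pilot's left.
print(to_helmet_frame(10.0, 0.0, math.radians(-90)))
```

A real helmet-mounted display does this in three dimensions with full head pose (yaw, pitch, roll, and position), but the same idea applies: the rendered symbols move opposite to the tracked head motion so they appear fixed in the world.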
“When a brownout starts, you feel like you’ve stopped, even if you know you’re still moving,” Schnell says. “A pilot can totally lose track of his or her position, and things can literally turn upside down very quickly.”
Just a few days after the desert test, Schnell and his crew hosted 10 U.S. Air Force test pilot students at OPL headquarters to run similar flight vignettes. Such a quick turnaround is a testament to the expertise of OPL’s researchers, who are capable of MacGyver-esque feats of engineering. Basically, they can take any avionics prototype and make it work in just about any aircraft in the OPL hangar.
“OPL is unique in that we have flying testbeds that can do things that no one else can do,” Schnell says. “The ability of these testbeds to work together to test new equipment and the resulting human response is something pretty remarkable.”