Maintenance of modern aircraft is a complex process. The use of AR smart glasses with 3D information visualization should make repair and maintenance work easier for aircraft mechanics. Virtual elements such as instructions, displays and technical tools can be operated by way of gesture, voice and gaze control. With the help of augmented reality systems, researchers at the Fraunhofer Institute for Communication, Information Processing and Ergonomics FKIE have developed concepts and solutions for maintaining the Airbus A400M.
Modern aircraft are fitted with countless mechanical and electronic devices and components. In aircraft such as the Airbus A400M, the engine itself is made up of more than 10,000 individual parts. Servicing and maintenance of the aircraft are therefore demanding tasks. To date, printed operating manuals have guided aircraft mechanics in their challenging work. However, these systematically structured manuals are cumbersome and can therefore be difficult to use—especially when several manuals need to be referred to at the same time.
In cramped areas like the cockpit, juggling documents results in space issues and a lack of clarity. In addition, the two-dimensional depictions of complex assembly tasks are not always self-explanatory and can be misleading. In the worst case, this can lead to maintenance errors. Virtual 3D guides, which are overlaid in the wearer's field of vision when using AR smart glasses, could solve these problems and replace two-dimensional maintenance instructions in the long term.
Researchers working on the "Ariel" project in the "Human-Machine Systems" department at Fraunhofer FKIE have evaluated how augmented reality can assist aircraft mechanics with maintenance work, using two use cases—"Installation of a display unit in the cockpit" and "Maintenance of a battery in the workshop"—by way of example.
The prototype concepts for the Airbus A400M were tested with two types of AR glasses—Microsoft HoloLens 2 and Epson Moverio BT-300. The focus was on the design of suitable 3D information visualizations and interaction techniques such as gesture, gaze and voice control. Five aircraft mechanics were involved in the tests, which took into account the issues of usability, user experience and comfort.
Dedicated interaction concept for each pair of AR smart glasses
"The concept development included many aspects, such as taking into account the knowledge of domain experts, analyzing the approaches taken by the test persons in the workplace, checking their personal gear and the compatibility thereof with the AR smart glasses, and developing a unique interaction concept for each pair of glasses, based on the various requirements in the categories of use, organization and design," explains Martin Mundt, a scientist in the Human-Machine Systems' department, summarizing the course of action.
The concepts were also adapted to the two strongly contrasting areas of application. The first was the cockpit, where the confined space restricts extensive gestural interaction and the elevated noise level interferes with speech recognition. The second was the workshop, whose controlled lighting makes it well suited to the automatic detection of objects.
"We expanded the existing regulations, implemented them in 3D and supplemented them with animations visualizing certain steps directly on the virtually displayed component. That could be, for example, measuring the resistance of the battery or special notations on the model of the cockpit," says the researcher.
Highly versatile interaction elements
The team relied on multimodal input techniques, allowing the user to interact regardless of the constraints imposed by the task at hand or the environment. Interaction elements that could be operated by gaze as well as gesture control were designed for this purpose. Several approaches were considered for avoiding unwanted input during gaze control.
Ultimately, the team decided to trigger an input by having the user gaze at an interaction element for a short time. An indicator (similar to a progress bar) fills up, showing how long the user still has to look at the element in order to trigger it. Alternatively, the user can activate the interaction element by pressing it with their index finger.
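As a rough illustration of this dwell-based gaze activation, the sketch below models an interaction element that triggers once the gaze has rested on it long enough, exposes a progress value that could drive the fill indicator, and accepts a finger press as the direct alternative. It is a minimal Python sketch under stated assumptions: the class, its methods and the reset-on-look-away behavior are illustrative choices, not the project's actual implementation, which runs on AR hardware such as the HoloLens 2.

```python
# Illustrative sketch of dwell-based gaze activation (not the FKIE implementation).
import time
from typing import Optional


class DwellButton:
    """An interaction element that triggers after the user gazes at it
    for `dwell_time` seconds, or immediately when pressed with a finger."""

    def __init__(self, name: str, dwell_time: float = 1.5):
        self.name = name
        self.dwell_time = dwell_time          # assumed dwell threshold in seconds
        self._gaze_start: Optional[float] = None

    @property
    def progress(self) -> float:
        """Fill level of the progress-bar-like indicator, from 0.0 to 1.0."""
        if self._gaze_start is None:
            return 0.0
        elapsed = time.monotonic() - self._gaze_start
        return min(elapsed / self.dwell_time, 1.0)

    def update_gaze(self, is_gazed_at: bool) -> bool:
        """Call once per frame with the current gaze state.
        Returns True when the dwell threshold is reached and the element triggers."""
        if not is_gazed_at:
            self._gaze_start = None           # looking away resets the dwell timer
            return False
        if self._gaze_start is None:
            self._gaze_start = time.monotonic()
        if self.progress >= 1.0:
            self._gaze_start = None           # reset so the element can be used again
            return True
        return False

    def press(self) -> bool:
        """Fallback: direct activation by pressing the element with the index finger."""
        self._gaze_start = None
        return True
```

In a real AR runtime, `update_gaze` would be called once per rendered frame with the result of the eye-tracking ray cast, and `progress` would be mapped onto the visual fill state of the element.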
Intuitive interaction with smart glasses
Overall, the interaction was rated as "intuitive" by the subjects. Gaze control was praised, and the majority of the testers preferred it to gesture or voice control. The direct overlaying of relevant information during battery maintenance helped to identify specific components. Animations used for additional illustration of assembly tasks were rated positively.
Gesture control, on the other hand, caused some uncertainty among the testers. In response, the research team developed a learning environment in addition to the demo applications, which used animations to teach users basic gestures and interaction techniques.
Remote maintenance as a further option
The next step is to enable documentation and notes on the status of the work to be recorded and digitized. It should also be possible to call in experts. "We want to address remote maintenance tasks that primarily focus on interaction between the participants," says Mundt. Beyond the use cases already analyzed, the researchers want to investigate further areas of application, given that other sectors can also benefit from the use of augmented reality for maintenance and repair work.
The shortage of skilled workers in the energy sector, for example, could be offset to some extent in this way: AR could assist less experienced technicians with the installation of heat pumps, since three-dimensional step-by-step instructions guide even beginners safely through the maintenance tasks.
Provided by Fraunhofer-Gesellschaft