After HOTAS, HMDs, touchscreens and gesture control – is ‘thought control’ the final evolution of the human-machine interface for pilots? TIM ROBINSON talks to Honeywell Aerospace about its cutting-edge research into neural technology.

The fictional Firefox needed Clint Eastwood's character to 'think in Russian' to fire the missiles. (YouTube) 

Aviation film buffs may well remember the 1982 film Firefox, in which Clint Eastwood steals a top-secret MiG-31 fighter to escape to the West. The fictional ‘Firefox’ was ahead of its time in featuring stealth – but, interestingly, another technology that was science fiction some 30 years ago, ‘thought control’, is now being made into science fact. Only last year Honeywell Aerospace conducted neural interface tests using a 737 simulator and actual flight tests with a King Air – with a pilot controlling the aircraft’s manoeuvres using thought alone – and it is currently working on the next evolution of this cutting-edge research. Bob Witwer, VP Advanced Technology, Honeywell Aerospace, is at pains to explain that ‘thought control’ or ‘mind control’ (which conjures up images of computers able to literally read minds) is really the wrong description – the correct term being ‘neurotech’ (or transcranial neural sensing).

In Honeywell's advanced research lab, the company already explores future flight interfaces like touch, gesture, voice and eye tracking.  

Those worried that an airliner might soon be able to ‘read minds’ or explore a pilot's secret fantasies may be relieved: “You can't read people's deepest, darkest thoughts – it just doesn't work that way,” says Witwer. Instead, neural sensing is about recognising brain signals – particularly when patterns are involved. It works by sensing tiny, microvolt-level traces of brain activity, explains Witwer: “We sense these signals outside of the brain, do some spectacular signal processing with those signals, then we can indicate what part of the brain is active. We even look for certain patterns that you can then tie to certain activities.”
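
What Witwer describes is, in outline, a classic scalp-EEG processing chain. Purely as an illustration of the principle (and emphatically not Honeywell's actual pipeline), a minimal Python sketch might estimate which scalp site shows the most activity in a chosen frequency band – the channel names, sampling rate and band below are all assumptions:

```python
# Illustrative sketch: sense microvolt-level signals at a few scalp sites,
# do some spectral processing, and infer which region is most active.
# Channel names, sampling rate and band are assumptions, not Honeywell's design.
import numpy as np

FS = 256  # assumed sampling rate (Hz)

def most_active_site(channels, lo=8.0, hi=12.0):
    """Return the scalp site with the most spectral power in the lo-hi Hz band."""
    scores = {}
    for name, samples in channels.items():
        spectrum = np.abs(np.fft.rfft(samples)) ** 2
        freqs = np.fft.rfftfreq(samples.size, d=1 / FS)
        band = (freqs >= lo) & (freqs <= hi)
        scores[name] = spectrum[band].mean()
    return max(scores, key=scores.get)

# Two seconds of fake data: a 10 Hz rhythm over the visual cortex, noise elsewhere.
t = np.arange(0, 2, 1 / FS)
data = {
    "occipital": 3.0 * np.sin(2 * np.pi * 10 * t) + np.random.normal(0, 5, t.size),
    "frontal": np.random.normal(0, 5, t.size),
}
print(most_active_site(data))  # expected: 'occipital'
```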

Honeywell has been working on neural sensing technology for around 12 years. It was originally developed for a US DARPA AugCog project for footsoldiers, which would use neural and biomedical sensors to transmit soldiers' status and attention/saturation levels to squad leaders. In those days the bulky cranial sensors required a very short haircut, as well as a gel to provide the contact between sensor and skull.

The company then worked on another DARPA project for intelligence analysts, which found that humans could ‘subconsciously’ detect patterns (or targets of interest in satellite imagery) far more quickly than by conscious study. Wiring analysts up to a machine that measured the ‘aha’ moment – the signal produced when the brain recognises a visual pattern – allowed them to process and match images at superhuman speeds: some seven to ten times faster, while still maintaining the same (or even greater) accuracy. Says Witwer: “Humans are fabulous graphics processors.” By then, the neural sensors had become more sensitive and now used dry contacts.

Brain control in the cockpit


The company used one of its flight test fleet, a King Air, for the 'thought control' tests. (Honeywell)

This research has led to Honeywell's latest project – putting neural sensors in the cockpit – which builds on its other work on future flightdeck HMIs with voice, touch, gesture control and eye tracking. Witwer stresses that neurotech is no mere fad for him and his team: “We’re not looking to do these as gimmicks.” Instead, he says: “If you deeply understand the mission and how the human is going to interface with the system and then you match the modality to best suit the mission, then you are going to have a system that is intuitive, unambiguous and easy for the pilot to interact with.”

Honeywell began investigating neural interfaces and the flightdeck using a 737 simulator in Redmond, Washington. The test subject pilots were wired up with transcranial neural sensors to detect the 'event-related potential' – the image-matching 'aha' signal. Witwer stresses that this visual input was for the demonstration test only, and not necessarily how it would be used in a real aeronautical application – “it wasn't a very cockpit-friendly way of doing it,” he admits. For this experimental setup, four flashing lights were installed in the 737 flightdeck, each flashing at a different frequency (from two to eight times a second) and arranged around the pilot's field of vision: up, down, left and right.

Says Witwer: “This meant that as the pilot looked at the left-hand light, flashing at 8Hz, the signals we picked up from the primary visual cortex would be eight-times-a-second signals.” He adds: “We would sense that, and once we were sure that is what he was thinking of, then the aircraft would bank left.”
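
The article does not give Honeywell's decoding algorithm, but the scheme Witwer describes works like a steady-state visually evoked potential (SSVEP) interface: the visual-cortex response follows the flicker rate of whichever light the pilot fixates. A hedged sketch of how such a four-light setup could be decoded – the frequencies, windowing and command names below are illustrative assumptions, not the demonstration software:

```python
# Illustrative SSVEP-style decoder for a four-light setup like the one Witwer
# describes. Frequencies, channel handling and command names are assumptions.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate (Hz)
LIGHTS = {2.0: "PITCH_DOWN", 4.0: "PITCH_UP", 6.0: "BANK_RIGHT", 8.0: "BANK_LEFT"}

def decode_command(occipital_uv):
    """Pick the flash frequency with the strongest visual-cortex response."""
    freqs, psd = welch(occipital_uv, fs=FS, nperseg=FS * 4)
    best_freq = max(LIGHTS,
                    key=lambda f: psd[(freqs >= f - 0.5) & (freqs <= f + 0.5)].mean())
    return LIGHTS[best_freq]

# Simulate a pilot fixating the 8 Hz light for four seconds.
t = np.arange(0, 4, 1 / FS)
eeg = 3.0 * np.sin(2 * np.pi * 8 * t) + np.random.normal(0, 5, t.size)
print(decode_command(eeg))  # expected: 'BANK_LEFT'
```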

From these simulator tests, Witwer and his team went one step further in 2015 with actual flight tests using one of Honeywell's company demonstrators, a King Air, flown in the Seattle area. For this testing, the team fitted a refined, non-flashing single visual display of a '9-block grid'. Instead of looking at a flashing light, the pilot would think about the single 'block' (which moved quickly around the grid display) being in the corresponding position. Once the block was in the correct position, a recognition pattern would be detected by the neural sensors and transmitted to the aircraft's controls. The sensors are now advanced enough to work through hair and are less obtrusive than a swimming cap.
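
Honeywell has not published the grid decoder, but the behaviour Witwer describes matches an event-related potential selector: short EEG epochs time-locked to each grid position are averaged, and the position whose average shows the strongest 'recognition' response a few hundred milliseconds after the stimulus is taken as the pilot's choice. A rough sketch under those assumptions:

```python
# Hedged sketch of an event-related-potential selector for a nine-block grid.
# Window lengths, the 250-450 ms 'recognition' latency and data layout are
# assumptions for illustration, not the flight-test implementation.
import numpy as np

FS = 256                                      # assumed sampling rate (Hz)
WIN = int(0.6 * FS)                           # 600 ms epoch after each block visit
ERP = slice(int(0.25 * FS), int(0.45 * FS))   # assumed recognition window

def select_block(epochs_by_pos):
    """epochs_by_pos maps grid position (0-8) to an array of shape (n_visits, WIN)."""
    scores = {pos: ep.mean(axis=0)[ERP].mean() for pos, ep in epochs_by_pos.items()}
    return max(scores, key=scores.get)

# Simulate: position 4 is the block the pilot intends, so its epochs carry a
# small positive deflection in the recognition window; the rest are noise.
rng = np.random.default_rng(0)
epochs = {pos: rng.normal(0, 5, (20, WIN)) for pos in range(9)}
bump = np.zeros(WIN)
bump[ERP] = 4.0
epochs[4] = epochs[4] + bump
print(select_block(epochs))  # expected: 4
```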

All told, Honeywell flew ten test flights using this experimental set-up. Says Witwer: “It actually worked very well.” The next phase of neural research currently under way, reveals Witwer, will use signals from the supplementary motor cortex (where the brain plans body movement), so that a pilot, merely by thinking about using their arm to pull back on a yoke or stick, could make the aircraft pitch up. As with the visual input neural tests, Honeywell will use a ground-based simulator first.
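
Honeywell has not detailed its approach for this next phase, but one well-known property of motor planning is that imagining a movement suppresses the sensorimotor 'mu' rhythm (roughly 8-12Hz) over the motor cortex. A speculative sketch of how an 'imagined pull on the yoke' might be flagged – the baseline, band and threshold here are assumptions, not Honeywell's design:

```python
# Speculative motor-imagery sketch: a drop in 8-12 Hz 'mu' power over the motor
# cortex, relative to a resting baseline, flags an imagined movement.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate (Hz)

def mu_power(motor_channel_uv):
    """Mean spectral power in the assumed 8-12 Hz mu band."""
    freqs, psd = welch(motor_channel_uv, fs=FS, nperseg=FS * 2)
    return psd[(freqs >= 8) & (freqs <= 12)].mean()

def imagined_pull(window_uv, resting_mu, drop=0.5):
    """Flag an imagined 'pull back' if mu power falls below drop x baseline."""
    return mu_power(window_uv) < drop * resting_mu

# Usage sketch: compare a live two-second window against a resting baseline.
rng = np.random.default_rng(1)
t = np.arange(0, 2, 1 / FS)
resting = 4.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 2, t.size)  # strong mu rhythm
imagining = rng.normal(0, 2, t.size)                                   # mu suppressed
print(imagined_pull(imagining, mu_power(resting)))  # expected: True
```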

Possible applications

Using neurotech controls to operate or view secondary systems in high workload situations could be one application. (MoD)

So what could this advanced technology be used for? Witwer is sceptical that the first use of the technology will be to control the aircraft itself: “The way our industry works (and very rightly so), on safety first, above all, probably means it is going to be a long time before we use neurotechnology to actually control an aircraft itself.”

However, he foresees that it could very well have applications in, for instance, operating secondary systems or controls – especially in abnormal or emergency conditions.

Says Witwer: “One of the applications we thought of was for a helicopter pilot. There are certain times when a helicopter pilot is flying and both of their hands are busy on the controls, their feet are busy, the environment is noisy (particularly for military types) – so touch, gesture or voice may be very difficult.” In this maxed-out situation, “all that pilot wants to do is get a quick look at a non-flight critical, but important synoptic page on a display that shows the electrical system. This is a case where that modality [neurotech] could really be the right modality to help him with his mission.” 

The fictional Firefox's ‘You must think in Russian’ missile firing may be fiction, but it is not hard to think of other examples (deploying chaff/flares, perhaps?) where another potential human-machine modality (or input method) could be useful when a pilot is in a highly task-saturated situation. “It's not about gimmicks,” says Witwer, “we think there is a lot of potential here.” One barrier for the commercial world, he says, could be the need to wear the sensor skull cap itself, but he observes that it is now common to see people wearing Bluetooth headsets all the time – even when not on a phone call. It may be that, as the technology shrinks and consumer applications increase, a neural interface could easily be incorporated into a pilot's helmet or even standard headphones one day.


Monitoring fatigue?

An EU project 'Brainflight' has also looked at neural technology to control UAVs and aircraft (Technical University of Munich).

As well as neural technology applications to help out aircrew when they are overloaded, monitoring pilots' awareness levels when they are underloaded, bored or tired could also be valuable in long-haul airline operations, where fatigue is a concern. Having a neural monitor that alerts pilots that their concentration levels are dipping could be a useful way of reducing the ‘startle factor’ when abnormal situations develop. Says Witwer: “Attentiveness is a relatively characterisable neural signature.” In fact, he reveals, around five years ago the TSA was interested in using this emerging technology for airport baggage screeners – where keeping attention levels high is critical. “Attentiveness is a very real place for us to focus this.”
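
One commonly cited heuristic for the kind of 'characterisable neural signature' Witwer mentions is the ratio of slow (theta) to fast (beta) EEG activity, which tends to rise as vigilance drops. The sketch below uses that heuristic purely as an illustration – the bands, threshold and alerting logic are assumptions, not Honeywell's metric:

```python
# Illustrative vigilance monitor: theta/beta power ratio as a crude attention
# index. Bands, threshold and alert wording are assumptions for the sketch.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate (Hz)

def vigilance_index(eeg_uv):
    """Theta (4-7 Hz) to beta (13-30 Hz) power ratio: higher suggests waning attention."""
    freqs, psd = welch(eeg_uv, fs=FS, nperseg=FS * 2)
    theta = psd[(freqs >= 4) & (freqs <= 7)].mean()
    beta = psd[(freqs >= 13) & (freqs <= 30)].mean()
    return theta / beta

def attention_alert(eeg_uv, threshold=3.0):
    """Return a crude alert string when the index drifts above an assumed threshold."""
    return "CONCENTRATION DIPPING" if vigilance_index(eeg_uv) > threshold else "OK"

# Usage sketch on 30 seconds of (fake) theta-heavy, drowsy-looking EEG.
rng = np.random.default_rng(2)
t = np.arange(0, 30, 1 / FS)
drowsy = 6.0 * np.sin(2 * np.pi * 5 * t) + rng.normal(0, 2, t.size)
print(attention_alert(drowsy))  # expected: 'CONCENTRATION DIPPING'
```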

There may also be training and research applications for this type of neural sensing. Has the student really understood the lesson and taken it all in? Are they concentrating on learning, or is their mind wandering elsewhere? Flightdeck and display design, as well as passenger cabin optimisation, could also be areas where neural sensing brings new insights into the HMI. There may even be spin-offs into other fields – Honeywell, for example, has already worked on using its expertise in this technology to help people with traumatic brain injury strengthen and rebuild lost neural connections, without overstressing the patient.

Finally, this technology could also one day allow those who are disabled or have limited mobility to enjoy the wonder of flight through neural interfaces. A DARPA project has already seen a paralysed woman fly an F-35 simulator using just her mind, while other research projects have seen drones controlled using neural technology. Meanwhile, an EU-funded project, Brainflight, has used a Diamond DA42 simulator at the Institute for Flight System Dynamics of the Technische Universität München (TUM).

Summary

These are very early days and the technology is still very much in its infancy. Says Witwer: “What we may find is that we get it more mature [on the ground] before we take it into the cockpit.” He adds: “Seeing where it works and where it doesn't, rather than saying ‘would you like to fly an aeroplane with brain control?’, is key. It's matching the modality to the mission.” However, as the cranial sensors become lighter, cheaper and more sensitive (and enter the consumer world), and the processing becomes more refined and accurate, this technology could bring pilot and machine closer together than we could ever imagine.


12 April 2016