The synthetic vision system developed by Airbus exploits modern screen technology. Credit: Airbus
As shipping makes its journey into digitisation and automation, it is wise to look for inspiration and pitfalls from other sectors that are farther ahead on the path. Discussing automation in aviation at the recent Transas Global Conference in Vancouver on 6-8 March,
Harry Nelson, director at CL Max Consulting and a former Airbus operational adviser on product safety, said, “We focus on what pilots do wrong, and it has worked. We are down to around two accidents a year.”
This is the kind of impressive statistic that prompted David Christie, senior vice-president for corporate maritime quality assurance at Carnival Corporation, to look to aviation for new approaches to safety. Christie explained that he had spent time and taken “a lot of advice” from airlines as to how Carnival should operate its fleet operation centres (FOCs) and learnt from how they share data between airlines and land-based flight control stations.
“At an aviation industry event I asked what I thought was a difficult question: what makes pilots in the air honest and follow regulations? One pilot replied simply: ‘flight operations quality assurance’.”
More commonly known as FOQA, it is the process by which data from a plane’s digital systems, either recorded when the pilot moves the controls or collected when systems are automatically operated by the aircraft itself, is sent back to land, where it can be used for analysis of deviation and trends. Airlines can then use the data to observe trends in particular kinds of unsafe manoeuvre or misuse of equipment and develop policies on safe flight operations and guidelines before these trends lead to accidents.
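The trend-spotting step FOQA enables can be pictured as a simple exceedance count over recorded flight parameters. The sketch below is illustrative only; the parameter names and limits are hypothetical, not an airline's actual monitoring scheme.

```python
from collections import Counter

# Hypothetical FOQA-style limits: parameter name -> airline-defined threshold.
THRESHOLDS = {"bank_angle_deg": 35.0, "descent_rate_fpm": 1100.0}

def count_exceedances(records):
    """Tally how often each monitored parameter exceeded its limit,
    so recurring unsafe trends stand out across many flights."""
    tally = Counter()
    for rec in records:
        for param, limit in THRESHOLDS.items():
            if rec.get(param, 0.0) > limit:
                tally[param] += 1
    return tally

# Each dict stands in for the data downloaded from one flight.
flights = [
    {"bank_angle_deg": 38.2, "descent_rate_fpm": 900.0},
    {"bank_angle_deg": 30.1, "descent_rate_fpm": 1250.0},
    {"bank_angle_deg": 36.0, "descent_rate_fpm": 1300.0},
]
print(count_exceedances(flights))
```

In practice the analysis runs over millions of records, but the principle is the same: repeated exceedances of the same parameter flag a fleet-wide trend before it causes an accident.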
This creates a virtuous circle, where pilots can be made aware of old, unsafe practices, and can modify their behaviour, ensuring that safety is at the forefront of their decision-making. It has the added benefit of improving maintenance and overall operational efficiency.
Under regulations introduced by the International Civil Aviation Organization in 2005, flight data monitoring and FOQA became mandatory for most large aircraft operators.
“So we went to look and build our own version, which we called BOQA [bridge operations quality assurance] but is now affectionately known as Neptune,” said Christie. “Now with 104 cruise ships in the fleet we can call up exactly what is happening and what parameters we’ve set.”
Automated data collection via Carnival Maritime’s Neptune platform, built on cloud-based technology from Microsoft, has been in use at the Carnival Maritime Fleet Operations Centre in Germany since October 2015. First piloted with its European cruise line brands in Hamburg and Southampton, it has since been rolled out to monitor the company’s ships in the United States and Caribbean through operations centres in Miami and Seattle. The goal in developing Neptune was to achieve aviation-level safety standards. From the Neptune ‘dashboard’, Christie said, fleet operations centres can get live data from any ship, downloading about 2 million records a day from each vessel. And while the safety element was the main driver of the project, it has also provided “gold dust” in the form of engine analytics that can drive energy efficiencies.
“There are other parameters built in, such as rate of turn and rudder usage, so that if aggressively used during a watch we would be notified in our operation centre and understand what happened,” explained Christie. “If the captain is not on the bridge he will also be alerted and the officer on duty will have to give reasons for deviating off track.”
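A BOQA-style check of the kind Christie describes can be sketched as threshold alerts on manoeuvring parameters, plus a notification when the captain is off the bridge. The limits and rules here are illustrative assumptions, not Neptune's actual parameters.

```python
def check_watch_alerts(rate_of_turn_deg_min, rudder_angle_deg,
                       captain_on_bridge,
                       rot_limit=20.0, rudder_limit=25.0):
    """Return the alerts a shore operations centre would raise for
    aggressive manoeuvring during a watch (illustrative limits only)."""
    alerts = []
    if abs(rate_of_turn_deg_min) > rot_limit:
        alerts.append("rate of turn exceeded: notify operations centre")
    if abs(rudder_angle_deg) > rudder_limit:
        alerts.append("aggressive rudder use: notify operations centre")
    # If anything tripped while the captain was off the bridge, alert the
    # captain too; the officer on duty must then account for the deviation.
    if alerts and not captain_on_bridge:
        alerts.append("captain off bridge: alert captain, log officer's reasons")
    return alerts

print(check_watch_alerts(28.0, 10.0, captain_on_bridge=False))
```

The same pattern scales to any parameter the operations centre chooses to set: each one is just a recorded value compared against a fleet-defined limit.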
This kind of live monitoring technology being fed back to shore does raise concerns over who will ultimately make decisions and take final responsibility for the safety of the ship. However, Christie stressed that the master still had “responsibility at all times”; the operation centres, which always have a senior master or officer with access to the live data, are simply a “third eye” (along with the two officers of the watch on the bridge) to assist. “If there was a problem it used to mean a lot of phone calls to ship staff. Now there is just one call to make to the fleet operations centre saying ‘we are going to be late, please make appropriate arrangements’,” said Christie. “It is another support system: the ship can call the fleet operations centre to ask what it makes of the data it is seeing, but the final decision remains with the master at all times.”
When discussing hacking vulnerabilities, Christie said that as all information was just going from ship to shore at present this was not much of an issue but “one can foresee there will be two-way communication and then it will have to be further protected”.
For now, though, it appears that the development is reaping rewards for Carnival, which is collecting and studying the data to find ways to improve operations. “We analysed commonly travelled routes between Marseille and Ibiza of four different operating companies, including Costa, P&O, and Carnival, looking into speeds, fuel burns, and routes. From that we can make well-judged safety related decisions as to what is best speed and route to use.”
Hidden automation risks
While automation has led to safer operations for airlines, a discussion on the relationship between the human and the machine at the Transas conference prompted Nelson to warn that “99.99% of the time pilots perform very professionally but if we don’t understand the small percentage that don’t then going full autonomous is very risky in my view.”
As shipping knows with ECDIS, the introduction of technology can be a blessing and a curse: the term ‘ECDIS-assisted groundings’ has been coined to describe accidents caused by seafarers’ overreliance on or misuse of ECDIS. The introduction of ‘smarter’ technology, designed to improve operations and safety, may inadvertently end up creating unforeseen safety issues. While we should not shy away from technological progress, there is a real need to understand how humans work best in order to avoid technology-assisted incidents.
Rama Myers, vice-president of aviation at Seeing Machines, a company that develops computer vision technologies to enable machines to see, understand, and assist people, explained how pilots’ roles have changed as flying has become increasingly automated. “Today the average pilot spends around three minutes on each journey actually manually flying an aircraft, whether short- or long-haul. Pilots’ roles are less hands-on and more about monitoring, scanning the screens in front of them. It is more passive. We [humans] are not excellent at that. It has been found that poor monitoring is a contributing factor in up to 50% of aviation accidents. The industry lacks knowledge of how to train for monitoring.”
For aircraft pilots, as well as for seafarers, spending hours monitoring screens rather than engaging with machines and your surroundings brings a risk of boredom and fatigue. A survey carried out by pilots’ union Balpa in 2013 found that 56% of pilots admitted to involuntarily falling asleep in the cockpit during flight; Nelson and Myers said a lack of engagement with flying and constant scanning of screens will contribute to this.
The solution to such issues can also come from technology. Seeing Machines has developed computer vision technology that can look at eye behaviour, pupil diameter, and eyelid behaviour and detect signs of fatigue. When detected, the system will deliver a haptic alert, for example vibrating the pilot seat, to make sure the subject is alert. “Some other solutions being introduced in the aviation sector involve sensors monitoring heart or brain activity,” said Myers.
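A common way eyelid-based systems quantify drowsiness is PERCLOS: the fraction of time the eyes are mostly closed over a rolling window. The sketch below assumes per-frame eye-openness scores and illustrative thresholds; it is not Seeing Machines' actual algorithm.

```python
def perclos(eye_openness, closed_below=0.2):
    """PERCLOS: fraction of samples in which the eye is judged closed.
    eye_openness holds per-frame openness values in [0, 1]."""
    closed = sum(1 for o in eye_openness if o < closed_below)
    return closed / len(eye_openness)

def should_alert(eye_openness, perclos_limit=0.15):
    """Trigger a haptic alert (e.g. vibrate the pilot seat) when PERCLOS
    over the monitoring window exceeds the limit."""
    return perclos(eye_openness) > perclos_limit

# Ten frames of simulated eye-openness; three are effectively closed.
window = [0.9, 0.8, 0.1, 0.05, 0.85, 0.9, 0.1, 0.95, 0.9, 0.9]
print(perclos(window), should_alert(window))
```

Real systems combine this with pupil diameter, blink dynamics, and head pose, but the core idea is the same: a continuously computed fatigue score gates an alert to the subject.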
When asked what maritime should learn from aviation’s experience with automation, Nelson was quick to reply that it needs to be “applied appropriately and understood”.
“Airline pilots will say they understand how to use a system but it must be emphasised through the training process,” he said, adding that in the long term automation can lead to de-skilling and cognitive arousal issues. “If you are operating at a low level of arousal and then something dangerous happens that needs attention, the pilot needs to switch quickly to maximum cognitive load, which we do not always do well.”
Ultimately, Nelson said, there are unintended consequences of automation that should be thought about and planned for in training. He warned that an automated system “is only as good as the person who designed the system”. He also stressed that humans are much better than machines at risk management decisions, and that without knowing how many accidents such decisions avert, going fully autonomous is a risk. “If I come into work and someone says the weather is bad I might decide to take an extra tonne of fuel that could save me later if I need to divert course. Humans carry out these risk management decisions all the time.”
Importantly, when it comes to discovering new technology and learning vital safety lessons about its use, Nelson stressed that collaboration was key. “I’ve been impressed and seen some amazing tech in maritime. We have things to learn from each other.”
Contact Tanya Blake or follow her on Twitter @Tanya_Blake