Guest Blog: The Case For Keeping Two Pilots

Jeremiah Felix is a pilot for a Canadian airline with concerns about the apparent move toward single-pilot airline operations.


This is a companion discussion topic for the original entry at https://www.avweb.com/insider/the-case-for-keeping-two-pilots

Case against: two pilots keep pilot wages down. A single pilot will be able to claim at least 25% more for the added responsibility…
And that Air France plane that crashed actually had three highly qualified pilots on the flight deck. Arguably one who knew what he was doing would have been better than three, with two not realising the third was pulling back on the stick with all his might…

So if single pilot is ok, when will single engine be ok?

I have noticed no mention of SMS (Safety Management Systems) in any of the airline single-pilot operations discussions I have read. What would a FRAT (Flight Risk Assessment Tool) score be in this situation? Not sure about the “qualified pilots” in the Air France crash mentioned above. I do agree with the author’s reasoning for keeping 2 pilots in the cockpit for airline ops. Single-engine airline ops in the US would require Congress to change the rule, since it mandated at least 2 engines for airline ops back in the 1920s. Considering how “fast” Congress works, I’m not worried about that happening in my lifetime.
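For readers who haven’t used one: a FRAT is essentially a weighted checklist, where each risk factor carries a point value, the points are summed, and the total is compared against go/no-go thresholds. The sketch below is a rough illustration only; the factors, weights, and thresholds are hypothetical, not any operator’s actual tool.

```python
# Illustrative-only sketch of how a FRAT (Flight Risk Assessment Tool) works.
# Factors, point values, and thresholds below are hypothetical examples.

FRAT_FACTORS = {
    "crew_rest_less_than_10h": 4,
    "night_operation": 3,
    "unfamiliar_airport": 3,
    "weather_near_minimums": 5,
    "mel_items_open": 2,
    "single_pilot_operation": 6,   # the kind of factor the comment above is asking about
}

# (upper score limit, release decision) pairs, checked in order
THRESHOLDS = [
    (10, "GO"),
    (20, "GO with mitigation / supervisor review"),
    (999, "NO-GO"),
]

def frat_score(applicable_factors):
    """Sum the point values of the factors that apply to this flight."""
    return sum(FRAT_FACTORS[f] for f in applicable_factors)

def frat_decision(score):
    """Map a total score to a release decision."""
    for limit, decision in THRESHOLDS:
        if score <= limit:
            return decision

if __name__ == "__main__":
    factors = ["night_operation", "weather_near_minimums", "single_pilot_operation"]
    score = frat_score(factors)
    print(score, frat_decision(score))  # 14 -> GO with mitigation / supervisor review
```

The point of the example is simply that single-pilot operation would be one more weighted line on the sheet, and a heavy one at that.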


It’s tiring to hear the same drumbeat of “automation reduces workload.” This is only partially true.

The main reason for the development of the FMS and related automation was not to reduce pilot workload, but to increase fuel and operational efficiency. It does reduce workload at times, and in a perfect world it would allow for drastic changes like single-pilot operation. In a perfect world one could program an entire flight into the FMS, take off, engage the autopilot in LNAV and VNAV, and sail on to the destination hands-off. But we don’t live in a perfect world. Here’s where it is appropriate to use the word “never”: I have NEVER flown a flight where the planned route, planned altitude, planned approach, and planned runway all remained unchanged.

There are times when automation actually increases workload, particularly during arrival and approach. One has to decide what level of automation is appropriate to the situation, use the other pilot to make changes (especially at low altitude), watch for traffic, and make the correct changes at the correct time. “Downloading” automation to an appropriate level has become an important part of operations in high-traffic environments. This all becomes second nature to experienced and proficient professional crews. Take one pilot out of the picture and “the automation” will become overwhelming.
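To make “downloading” automation concrete, here is a rough sketch of the kind of judgment being described: stepping down from fully FMS-coupled flight to simpler tactical modes, or to hand flying, as the situation gets more dynamic. The phase names, mode labels, and decision rules are illustrative assumptions, not any aircraft’s or operator’s actual logic.

```python
# Illustrative sketch of "downloading" automation to a level that fits the situation.
# Mode labels and the decision rules are hypothetical examples only.

AUTOMATION_LEVELS = [
    "FMS-coupled (LNAV/VNAV)",          # highest: fly the programmed plan
    "tactical modes (HDG / FLCH / V-S)",  # middle: hand the autopilot direct targets
    "hand flying with flight director",
    "raw-data hand flying",             # lowest: no automation at all
]

def choose_level(phase, changes_pending, automation_behaving):
    """Pick an automation level for the current situation (illustrative only)."""
    if not automation_behaving:
        # If the automation is doing something you don't understand, turn it off.
        return AUTOMATION_LEVELS[3]
    if phase in ("arrival", "approach") and changes_pending:
        # Reprogramming the box at low altitude costs more attention
        # than simply spinning a heading bug or selecting a new altitude.
        return AUTOMATION_LEVELS[1]
    return AUTOMATION_LEVELS[0]

print(choose_level("approach", changes_pending=True, automation_behaving=True))
```

With two pilots, one can fly while the other reprograms or steps the automation down; the comment’s point is that a single pilot has to do both at once.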

Yours is an excellent comment, made I am guessing by someone who has personal experience of automation failures.
One should first ask, “What is the advantage of a single-pilot operation?” The knee-jerk answer would be to save a pilot’s salary. That is a nonsensical argument when one considers the cost of an accident caused by an automation failure. The Qantas Airbus A380 incident should be a wake-up call for all those who think that automation can do a pilot’s job: it took five guys struggling together to get that aircraft back on the ground.

What should always be understood is that automation, computers, and AI can only react to situations that a computer engineer can imagine happening. Sullenberger’s decision to ditch in the Hudson was the result of an unimaginable collision with multiple geese. If the situation can’t be imagined, then it can’t be programmed into the system. AI is not a magical solution, because as far as anyone can forecast at the moment, a computer system can only follow a logical progression to arrive at a solution for a situation. Far from decreasing the workload, when things start going wrong the flight management system can INCREASE the workload tremendously.

The Air France accident had only two pilots in the cockpit, and they quite definitely demonstrated that they were NOT highly skilled. They turned what should have been an annoyance into a catastrophe.

The recent loss of an F-35 with its highly automated cockpit should be a heads-up for those who believe that automation is the answer. That accident will probably be written off as pilot error, but from what has been published, it was pilot error resulting from the failure of cockpit instrumentation.

Flight time with a highly experienced command pilot and a less experienced copilot… grows that copilot in the knowledge, operating skill, and ‘intangibles’ that come from long in-cockpit experience. Especially given the wide variety of cockpit crew ‘mixes’ that occur routinely in commercial aviation.

Sooo… how would an AI system provide this human-related training to a single pilot entering the cockpit?

Would an AI flight system potentially try to ‘fight’ the human pilot(s) when the unexpected happens?

Also, each aircraft and each flight has subtle peculiarities that require unique ‘situational awareness’ [intuition?] of the unpredictable/unexpected elements of flight. Sounds, vibrations, operations, etc… that are second nature to experienced pilots.

Also… ever try to have a verbal conversation with an AI chat-bot when you simply can’t make yourself understood? Imagine AI trying to understand ATC, air-to-air, and other communications that may be odd, indistinct, or scrambled.

SAE has a long list of ‘problems to be solved with AI driving vehicles of all types’. It is scary and profound… and helps you realize what an ordinarily skilled human is vastly capable of under extraordinary circumstances.

Then there is the utterly human element of aircraft command over crew and passengers and cargo and whatever else unexpectedly happens… in flight… and we have all been there.

I suspect that corporate and short-haul single-pilot flights… which are starting to become routine… might benefit the most.

I wonder how insurance companies… responsible for the large payouts in the aftermath of aviation catastrophes… are responding to these technologies and their eventual adoption?

I’m afraid your understanding of AI is somewhat antiquated, Navyjock. The (somewhat scary) difference between “automation” and “artificial intelligence” is the latter’s ability to synthesize novel responses unanticipated by its “computer engineers”.

I fear that the airlines are only a few years away from having AI in the cockpit, and that the timeline is that long only because of bureaucracy, not technology. At that point, every left seat will contain an experienced Captain monitoring the decisions of the onboard AI in collaboration with its counterparts in other airliners and the tower. In the right seat will be the proverbial dog.

When was the last time you saw a caboose?

I’ll bet the passengers who lived through potential disasters were grateful for the second pilot in the cockpit, especially when that pilot had to take over control.

This is why there are thousands of self-driving cars on the road.

Automation will ultimately replace the second pilot. It’s only a matter of when. Automation is now capable of landing an aircraft with United 232-type failures, and in the event of pilot incapacitation it can simply land the plane, something it has been capable of for over 50 years. Even “Captain Kirk” took a fully autonomous ride into space in a vehicle that had NO pilot.

Aviatrexx, your reply leads me to believe that you are comfortable with AI actually controlling and/or making decisions. You are quite correct that my appreciation or understanding of AI may well be antiquated. I think what negates your view of the future is situations that no one could have forecast. The Qantas situation is an excellent example, in that communication from various bits and pieces such as the engines and flaps to the ‘brain’ was interrupted. I don’t think any amount of AI could have figured out a solution to the many problems the crew coped with if, because of physical damage, communication from the broken parts simply wasn’t there.

I concede that my view of AI may well be antiquated, but I have been flying for 67 years in military, airline, and general aviation, and I can think of occasions, particularly in the military, where I had to recover from a situation where I can’t imagine how AI would have coped. One unfortunate example was keeping an engine running that was on fire in order to attempt a landing. Presumably AI would have followed the procedure, shut the engine down, fired the extinguisher into it, and left me sitting there. At the time I was unable to eject due to other factors.

I think this is a discussion where those better educated than I am in computer development are willing to allow a computer (AI) to make rather important, possibly irreversible, decisions for them. To me and other aviation greybeards, a computer (a flight management system) is a very useful tool, like a screwdriver or a hammer, but not to be trusted completely.