‘Automation Addiction’ And The Asiana Crash: What Happens When We Trust Computers Too Much?

Had the pilots of the doomed Asiana jetliner that crashed in San Francisco last weekend looked out the window, they might well have surmised much earlier that they were flying too low. Instead, they appear to have behaved as people are increasingly prone to do these days: They entrusted their fate to computers, a trust that trumped basic judgment and yielded disaster.

The initial investigation suggests one possible explanation for how a mechanically sound Boeing 777 — its equipment functioning, flying as it should on a clear summer day — missed the runway and hit a seawall, killing two passengers and injuring dozens of others.

Beyond the particularities of aviation, the crash underscores the growing pains plaguing our transition from human intelligence to artificial intelligence. Though our machines are not quite advanced enough to completely substitute for human guidance in many complex tasks, people are getting worse at operating without help from software. When crisis strikes and the computers can’t handle it, we are all too often operating with skills and judgment that have been eroded by our increasing reliance on technology.

“Automation is designed to make our worlds easier and simplify our mental demands, but we’ve become reliant on it, so that when technology fails, it makes it particularly challenging for the human,” said David Strayer, a professor at the University of Utah’s Department of Psychology specializing in cognition and distracted driving. “You’ve let people’s skills erode and then given them a problem that’s a hard problem to solve to begin with. That combination is really the biggest concern for automation.”

People are apt to assume the computer knows best. Drivers, for example, will believe navigation systems over their own eyes, turning the wrong way down one-way streets, driving miles into remote locales or trusting an app over their sense of direction. In Australia, several motorists blindly following directions from Apple Maps — which had misplaced their destination — had to be rescued from the blistering outback by police, who issued a warning about the app’s “potentially life-threatening” information. And as automakers introduce more automation, including collision-avoidance systems and technology that keeps cars from straying outside their lane, it is “inevitable that some people will misunderstand the limitations of those systems and rely on them too much,” predicted Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future.

This dependence on technology seems particularly acute in the airliner cockpit, where improved aviation technology now allows pilots to hand their planes off to a computer for all but about three minutes of a flight: takeoff and landing. While this advancement has been an enormous boon to airplane safety, experts said, it has also given rise to a new issue now being explored as a possible cause of the Asiana crash — so-called “automation addiction,” the phenomenon of pilots so dependent on their computers that they have essentially forgotten how to fly.

Airplanes’ increasingly sophisticated computer systems have allowed manual flying skills to fall by the wayside, experts and safety officials warned, producing a new breed of pilots who are out of practice when it comes to hand-flying. They may be incapable of properly handling malfunctions during a flight and are likely to be studying their screens instead of, say, taking a look at the physical world and recognizing that their plane is about to plunge into the San Francisco Bay.

“Automation has reduced certain types of human errors. But in a way, it’s introduced new ones,” said Bill Waldock, a professor of safety science at Embry-Riddle Aeronautical University in Arizona. “You’re trusting automation to fly the airplane, and in a lot of respects, that makes you not pay attention to” the plane.

“I’ve heard stories of people falling asleep flying airplanes,” Waldock added. “Not one person, but the whole crew. That scares me.”

Though investigators and aviation safety experts are still piecing together what caused the Asiana 777 to crash in San Francisco following its 11-hour flight from Seoul, experts have focused on the moments just before impact. Capt. Lee Kang-guk took over from the autopilot and waited until seconds before the crash to attempt to increase the plane’s speed, which was about 40 mph slower than it should have been. Lee had logged just 43 hours flying a 777 (though thousands in Boeing’s 747), and it was his first time landing the plane in San Francisco.

Recordings of the pilots’ conversation should help reveal why no one acted. Were they tired? Distracted? Reluctant to second-guess the captain in command? Some see similarities between the Asiana Airlines flight and other accidents in which pilots had trouble taking over for automated systems. In the 2009 Air France disaster over the Atlantic Ocean, for example, the pilots’ actions stalled the plane, which plummeted more than 35,000 feet in just 3 1/2 minutes.

In a safety warning to airlines issued in January of this year, the Federal Aviation Administration cautioned that a review of flight data had revealed an “increase in manual handling errors” the agency blamed on pilots’ regular reliance on auto-flight systems. The FAA asked airlines to update their training policies to ensure pilots had opportunities to “exercise manual flying skills.” A 2011 study by the FAA — which examined data from more than 9,000 flights, accidents and incidents — concluded that more than 60 percent of accidents had involved pilots who had difficulty hand-flying their aircraft or using their autopilots.

“There have been accidents we didn’t see before because of poor piloting skills and poor airmanship,” said Hans Weber, president of the aviation consulting firm TECOP International, who has worked closely with the FAA for more than two decades. “An airplane can be flown in a highly automated way, and therein lies one of the problems with automated airplanes — they don’t provide much opportunity for pilots to maintain their piloting skills.”

While no one would argue we should scrap auto-pilots altogether in favor of human captains — or Google Maps for paper maps, or calculators for slide rules — the concern over pilots’ complacency in the skies highlights problems that may arise as our cars start to drive themselves, our physicians turn to algorithms when diagnosing patients and smartwatches tell us how to sleep better.

History has shown that certain skills — like repairing horse-drawn carriages — will inevitably become obsolete as technology improves. Yet the trend of “automation addiction” suggests a more worrisome potential outcome in which basic common sense is outsourced to a machine.

When things go awry on airplanes, for example, pilots increasingly have a tendency to study their computers for answers instead of trusting their instincts, air safety experts said.

“One of the consequences of highly automated airplanes and younger pilots, who grow up very computer literate, is that they tend to focus exclusively on the computer, punching buttons and trying to get the airplane to do the right thing, rather than focusing on the fundamental requirement of the pilot, which is, fly fast enough and maintain altitude,” explained Weber.

It may be only a matter of time before humans are eliminated from the cockpit entirely and replaced by computers that never lose their cool — which may give us even more reason to trust their instincts over our own.

To Andrew McAfee, a research scientist at the MIT Sloan School of Management and co-author of Race Against the Machine, the machines deserve our confidence.

“As people rely on technology more and more, there’s always hand-wringing from some quarters about our over-reliance on technology,” said McAfee. “Then technology proves it’s more than up to the job, and that worry fades into the rearview mirror.”