

Safety First: From medicine to air travel, leading researchers find ways to manage human error


To err, they say, is human. This truism is comforting when vases are smashed or foul shots come bounding off the backboard. But it isn’t so comforting when a surgeon leans over a body in the operating room or a patient is rushed into the emergency room. To err, in these cases, can be deadly.

In 1999 the Institute of Medicine reported that medical error was responsible for up to 98,000 hospital deaths each year in the United States. The medical industry took note, and improving patient safety quickly became a priority. Many in the industry looked to Dr. Robert Helmreich and the researchers in his Human Factors Research Project at The University of Texas at Austin for help.

[Photo: Robert Helmreich sitting in the cockpit of an airplane]
Social psychologist Dr. Robert Helmreich is a leading expert in aviation safety. His success in developing ways to manage error has led other industries, including medicine, to ask for his assistance.

Helmreich is one of the leading experts in aviation safety in the country, and his team has been studying human error and teamwork in high-risk environments for more than 20 years. They’ve conducted research on everyone from Antarctic expeditioners to airline pilots, astronauts to heart surgeons. And not one of the group would say that the ultimate goal of their work is to prevent people from ever making mistakes.

“Error is part of the human condition,” says Helmreich. “We’ve got limited capacity, we don’t function super well under stress, and we get into situations where we make mistakes.”

Learning to contain the consequences of those mistakes and training teams to detect and recover from their errors more effectively are central to the work of Helmreich and his group. They have a proven record in aviation and have expanded their work to include medicine as well.

“The medical world has sort of awakened in the past three or four years to the whole idea of system level error and human error in general,” says Dr. Dave Musson, a researcher at the lab whose background includes a decade practicing medicine in Canada.

“It’s a field that for years was driven by two things: high individual standards of perfection and tremendous fear of litigation. Both of these things have led to an intolerance toward error and suppression or denial of it when it happens.”

Many consider the aviation industry a close parallel to medicine when it comes to safety. Both fields involve high-risk environments, highly trained individuals working in teams, heavy regulation and hierarchical structures. Looking at how aviation has made strides in safety over the past two decades may therefore offer clues for improving medicine.

Aviation has shifted from a reactive to a proactive approach. Instead of waiting for an accident to occur, the industry tries to identify vulnerabilities within operations and address them before an accident can happen.

The Line Operations Safety Audit (LOSA) has helped airlines understand those vulnerabilities. In a LOSA, trained observers sit in the cockpit with the pilots and crew and watch them work during normal operations on regularly scheduled flights. Early on, the goal was to examine team performance: How did the crew communicate with one another? How did information flow between the captain and first officer?

Since 1996, LOSA has shifted its focus to the threat and error side of crew performance, including how threats such as bad weather and equipment malfunctions are handled and how mistakes are managed. Thousands of observations have been conducted, and the International Civil Aviation Organization recently named LOSA a recommended practice for airlines around the world.

Bruce Tesmer, a captain at Continental Airlines and manager of two of its safety programs, says the practice has enabled the industry to improve by identifying the precursors to accidents.

[Photo: Faces of four doctors in an operating room]
Medical personnel work in teams, but they are rarely trained in teamwork and communication.

“What they’ve done is offer practical research that has an answer at the end of it,” Tesmer says. “The whole industry is a winner in this endeavor.”

LOSA findings have helped aviation understand its own culture and how it contributes to or hinders safety. For example, where once a clearly defined hierarchy often prevented a first officer from even pointing out an error to the captain, now teamwork is much more the norm. And training for teamwork and communication is not just common; it is mandatory.

Helmreich and his team helped develop Crew Resource Management (CRM) training to teach pilots, crews and other airline personnel to work as a team to reduce errors. Simulator training began to emphasize teamwork as well as technical flying skills. The dangers inherent in hierarchy were confronted head-on. CRM training is now required of airlines around the world.

Aviation has embraced the approach, and the hope is that the medical industry will follow.

“In the early days, CRM programs had the reputation of being like charm school,” says Bill Taggart, who has been developing programs and training aviation personnel for 20 years. “Today in commercial aviation it is totally accepted. Delta is one of the strongest proponents of it. We’ve been working with Southwest for over 15 years.”

Through LOSA, CRM and other approaches, including confidential error reporting for pilots and research into the differences in aviation protocol across cultures, the aviation industry has created a culture of safety that other industries hope to emulate.

In 1994, physicians at the University of Basel in Switzerland invited Helmreich to adapt his cockpit approaches to the operating room, and the Human Factors Research Project began its work in the medical field. Helmreich spent a year in Basel as a visiting professor, where he began determining how medical personnel perceive safety issues. He quickly discovered that the medical industry is even more complex than aviation, though the stakes are every bit as high, if not higher.

“In the U.S. a lot more people die as a result of medical error than die in aviation accidents,” he says, “but medical death is much less visible than a hole in the ground with smoking wreckage.”

The high incidence of death from medical error is what galvanized the industry, including patient safety groups and insurance companies, behind making changes. Ultimately, this may mean overhauling the system.

“Problems in the medical industry span the spectrum from how you name drugs—a lot of drugs have similar names that are easily confused—to how you train physicians in medical school and acculturate them into the profession,” explains Musson. “Fixing problems requires changing the way the medical system works, which has always been beyond the ability of one individual.”

Take, for example, the situation at the average hospital. A quality assurance or safety department may exist, but such departments tend to deal with low-level personnel. Physicians rarely receive any training in their hospitals, in part because they don’t work for the hospital itself but rather act as independent contractors. The culture of medical school has taught the physician that responsibility for patient safety rests entirely on his or her shoulders, encouraging a strict hierarchy in patient care. Meanwhile, a different team may form for each patient.

Or consider the not atypical story of an anesthesiologist who confused the drugs he was using on a patient. Because the pharmacy had purchased drugs from a vendor the doctor was not familiar with, the medicine to wake the patient was packaged almost identically to the paralytic, and the anesthesiologist re-paralyzed the patient. Fortunately, he caused no long-term harm.

[Photo: Doctors performing an operation in a hospital room]
Improving patient safety is a critical issue in the medical industry. An overhaul of the system may prove necessary.

“There’s no consistency in the system,” says Musson. “It’s like every town invents its own stop signs, its own color lights. When you’re passing through Waco, the red light means go, and in the next town the yield sign means stop. Nothing is the same anywhere.”

Yet improving patient care is critical. One way to do so is to help the industry shift from a culture of individuality to one of communication and teamwork. The Human Factors Research Project is adapting the CRM training developed for aviation to medicine.

Taggart has trained medical personnel in communication, safety and teamwork at several academic medical centers, including Johns Hopkins, and at integrated care programs. At Johns Hopkins, a four-hour training program has become the backbone of Hopkins Hospital’s safety training. At Kaiser Permanente, clinical teams at facilities across the country have received human factors training. One key has been determining how to approach the training with tools appropriate to the industry.

“One valuable piece of this work is that it has taught me how to engage professional culture in a way that is productive,” says Dr. Michael Leonard, an anesthesiologist who is one of the patient safety leaders at Kaiser.

Leonard says the process is still in its early stages, but he hopes to see the medical industry reach a point where teamwork is the focus and effective communication is in place.

“Medicine needs to move from a culture of individuals to one of collaborative, team-based care,” he says.

Creating measures for assessing the effectiveness of such training will be a crucial next step, says Taggart.

“Our aviation expertise lets us measure whether or not a particular training intervention brings about change in terms of attitudes toward safety and teamwork,” he says.

At the same time, studies are underway to try to untangle the medical industry at a more systemic level. Musson is undertaking a project that will look at emergency room operations in a number of hospitals across the country to determine where mistakes are typically made and what types of interventions might help minimize those mistakes.

Human error research in the medical industry is admittedly in its early stages. And it faces tremendous challenges.

“Medicine as an industry is difficult to study because people are busy, our kind of research can be intrusive, and it’s a world that has tremendous fears of litigation and bad press,” says Musson.

However, the motivation is there. Just as every pilot wants to deliver passengers safely to their destination, every doctor wants to be part of healing—not harming—a patient. Taking a hard look at the industry and ways to improve it becomes the necessary corollary to that desire.

“People in medicine in general are pretty altruistically motivated,” says Musson. “They go into it because they want to make things better. People actually want to deliver good health care. We’re one of the few places that has a formal history and research agenda in managing human error. This is why people want to work with our lab.”

Vivé Griffith

Photo of Dr. Helmreich: Marsha Miller
