Consider human factors engineering when designing your patient safety projects

When Barbara Wilson, PhD, RNC, begins any new patient safety project, she first examines the principles of human factors engineering (HFE). 

Wilson, assistant professor at Arizona State University’s College of Nursing and Health Innovation, Center for Improving Health Outcomes in Children, Teens, & Families, says that to ensure her staff members’ success, it’s imperative to examine how current processes may fail.

“Every time someone makes a mistake … there are processes that failed before that for it to ever get to that place,” says Wilson, who worked as a hospital administrator and manager at Intermountain Healthcare in Salt Lake City. “It’s rarely just that one person wasn’t vigilant. It’s almost always a systems problem in the process.”

HFE is defined by the Human Factors and Ergonomics Society as the “scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and other methods to design in order to optimize human well-being and overall system performance.” Although other high-reliability industries such as aviation and nuclear power have utilized HFE principles for decades, healthcare only recently began looking to HFE when designing processes and systems. 

Creating a “culture of safety” is a concept that many healthcare facilities have become familiar with in the past five years. A culture of safety is one that deemphasizes individual blame and looks at errors from a systems perspective. James Reason, one of the best-known thought leaders on human error, brought his “Swiss cheese model” to healthcare to explain how errors can occur in high-reliability organizations.

“He talks about the ‘blunt end and the sharp end’—the blunt end are the organizational factors: staffing, turnover, poor policies, poor leadership, poor management,” says Wilson. “All of those lead to the sharp end problems, which are the nurse and the patient, or doctor and the patient. It’s the person who interacts directly with the patient that is the recipient of all of those blunt end problems.”

 

Designing with human factors in mind

Utilizing certain basic human factors principles when starting a new project is essential to its success, says Wilson. Referencing a 2001 Journal of Healthcare Risk Management article, “Safety by design: Ten lessons from human factors research,” she says using lessons such as reducing reliance on memory, managing fatigue, and reducing the need for calculations will help these initiatives succeed.

Jacob Seagull, PhD, assistant professor and director of education research in the Division of General Surgery at the University of Maryland in Baltimore, says organizations need a preoccupation with safety.

“Everyone is involved in safety,” Seagull says. “There is an unwillingness to simplify things and dumb things down. With a high-reliability organization, culture of safety has to be a pervasive issue.”

Aim for system designs that are uncomplicated and require the least amount of cognitive processing, especially in emergency situations, says Seagull. He gives the example of a master mechanic’s garage in which all of the tools are neatly laid out and organized. By comparison, the tubes and tools in a crash cart are often disorganized and can be a hindrance to workers trying to use them.

“A number of people have taken their crash carts and redesigned them so they are as usable as a box of wrenches,” says Seagull. This means that when the caregiver opens the crash cart drawer, he or she only has the necessary tools laid out, not an overabundance. The medications are arranged so that their labels are easy to read, and those medications that are used more often are easier to reach. This approach has shown positive results.

“Something like organizing the physical environment to support the work at hand” is easily accomplished, says Seagull, who applies human factors to medical care. 

Wilson has studied the implementation of HFE in the mock code process and will begin a study on the use of HFE principles in designing clinical response teams. Prior to each project, she creates a table that outlines the basic HFE principles to consider for that specific process and the actions being taken to incorporate HFE thinking.

“We have a column that says, ‘Is this amenable to incorporating human factors engineering?’ and then we talk about what in the current process creates confusion,” says Wilson. 
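
Wilson’s table isn’t published as a template, but the idea can be sketched as a simple data structure. The field names and example rows below are hypothetical illustrations, not drawn from her actual projects:

```python
# A rough sketch of the kind of HFE planning table Wilson describes.
# Field names and entries are hypothetical, for illustration only.
hfe_review = [
    {
        "process_step": "Select emergency drug dose during a code",
        "source_of_confusion": "Dose recalled from memory under stress",
        "amenable_to_hfe": True,
        "planned_action": "Post a weight-based dose chart on the cart",
    },
    {
        "process_step": "Summon the code team",
        "source_of_confusion": "Staff must remember the paging number",
        "amenable_to_hfe": True,
        "planned_action": "",  # not yet decided
    },
]

# Flag any step judged amenable to HFE that still lacks a planned fix.
for row in hfe_review:
    if row["amenable_to_hfe"] and not row["planned_action"]:
        print("Needs an HFE action:", row["process_step"])
```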

For example, reducing reliance on memory is one important factor that can be applied to nearly any initiative. Wilson and her team will go through every step of a process and find where reliance on memory can be lessened through checklists, protocols, or automated reminders.

“I think back to when I worked as a nurse,” Wilson says. “At three in the morning, when you’re tired and haven’t slept much, the last thing I want is to have a staff who has to rely on their memory when I know they’re not functioning at 100%.”
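
As a toy illustration of taking a step off of memory, a checklist can live in the system rather than in the clinician’s head. The steps below are invented for the example:

```python
# Hypothetical checklist: completion is verified by the system,
# not recalled by a tired clinician at three in the morning.
REQUIRED_STEPS = [
    "Verify patient identity with two identifiers",
    "Confirm documented allergies",
    "Confirm consent form is signed",
]

def missing_steps(completed):
    """Return every required step not yet marked complete."""
    return [step for step in REQUIRED_STEPS if step not in completed]

done = {"Verify patient identity with two identifiers"}
for step in missing_steps(done):
    print("Reminder:", step)
```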

Considering these factors and the vulnerabilities inherent in any staff, Wilson suggests that facilities examine which processes can be put in place to minimize the risk of error. Doing so will require input from the facility’s top clinicians; it’s important to involve the people who use the process every day and know where HFE principles could best be applied.

 

HFE and patient safety technology

HFE principles are increasingly being applied to the design of healthcare information technology (HIT). Ross Koppel, PhD, thinks that most clinicians are amenable to using HIT, but that current systems make it difficult to do so effectively. Koppel is professor of sociology at the University of Pennsylvania in Philadelphia and principal investigator on the study of the hospital as a workplace and medication errors at the Center for Clinical Epidemiology and Biostatistics at the UPenn School of Medicine.

“People in IT often denigrate the clinical staff as being incompetent people,” says Koppel. “There’s a share of blame, but my experience has been that clinicians actually want to do very good work and they find the IT to be a barrier too often.”

Koppel was the lead author on a 2008 study published in the Journal of the American Medical Informatics Association titled “Workarounds to Barcode Medication Administration Systems: Their Occurrences, Causes, and Threats to Patient Safety.” He was intrigued by vendors’ claims that barcodes reduced medication errors to virtually 0%.

He and his colleagues examined the reasons that some clinicians used work-arounds with barcoded medication administration (BCMA) systems, which are supposed to help ensure timely administration of the correct type and amount of medication to the correct patient. To do this, they characterized each work-around as a specific type and compiled the data.

Koppel and his colleagues identified 15 reasons why clinicians resorted to work-arounds, which occurred for 4.2% of patients. Examples included technology-related causes (too few scanners to read barcodes, scanner-connected computers that could not fit into patients’ rooms) and task-related causes (a clinician accidentally throwing away packaging, preventing the scan), among others.
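
The study’s method of typing each observed work-around and compiling the data amounts to a simple tally. The event records below are invented paraphrases, not the study’s actual 15-category data:

```python
from collections import Counter

# Invented example records; the real study defined 15 work-around types.
observed_events = [
    "technology: too few scanners on the unit",
    "technology: scanner computer does not fit in the room",
    "task: packaging discarded before the scan",
    "technology: too few scanners on the unit",
]

# Tally how often each work-around type occurred.
tally = Counter(observed_events)
for workaround_type, count in tally.most_common():
    print(f"{count}x {workaround_type}")
```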

“All of the work-arounds—and all of them are justified given the problems with the software—all of those are work-arounds that severely deteriorate the patient safety protections,” says Koppel. By analyzing where and why clinicians were using the work-arounds and deciding on system repairs, four of the hospitals included in the study were able to dramatically reduce the number of overrides.

“For instance, if a doctor ordered 20 mg of a pill, and the pharmacy sends up two 10 mg tablets,” says Koppel. “The nurse scans the first 10 mg tablet and the bar code says, ‘That’s no good; I’m looking for a 20,’ and the nurse is screaming, ‘But 10 and 10 equals 20!’ ” The facility could write a software patch that tells the scanner that the two pills equal the total amount prescribed, he said.
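
Koppel’s 10-plus-10 example boils down to letting the system total partial doses against the order instead of demanding a one-to-one match. A minimal sketch of that logic, with hypothetical names and no claim about any vendor’s actual software:

```python
def check_scan(ordered_mg, scanned_mg):
    """Accept partial doses while the running total is below the order;
    alert only if the total would exceed it. Hypothetical logic."""
    total = sum(scanned_mg)
    if total < ordered_mg:
        return f"Partial dose: {total} of {ordered_mg} mg scanned"
    if total == ordered_mg:
        return "Dose complete"
    return f"Alert: {total} mg exceeds the ordered {ordered_mg} mg"

# Two 10 mg tablets together fulfill a 20 mg order.
print(check_scan(20, [10]))       # Partial dose: 10 of 20 mg scanned
print(check_scan(20, [10, 10]))   # Dose complete
```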

 

Recommendations for utilizing HFE

Although Seagull recommends enlisting the support of a trained HFE professional, there are some actions that can be taken at a unit or hospital level with existing staff. They include “understanding the personal responsibility of not just making sure your patients are doing well, but that all patients in a similar situation will do well, and working on improving the system because problems usually don’t happen because of individuals, but because of systems and the way they’re designed,” says Seagull.

In addition, ask why clinicians may stray from using certain technologies, says Emily Patterson, PhD, assistant professor at Ohio State University’s College of Medicine, School of Allied Medical Professions, Health Information Management and Systems Division in Columbus. Also, when introducing a new system, observe whether staff continue to use the old system for certain features, determine what those features are, and investigate why staff may be doing this.

“We looked at the ED and asked: if electronic whiteboards are starting to take off, why is there still a manual whiteboard in most EDs? What about that functionality is not in the new system yet?” asked Patterson.