Traditionally, an unexpected adverse event was equated with an error. In turn, an error was equated with incompetence or even negligence. Consequently, punishing the guilty was considered the only way to improve patient safety. We felt that we could solve human error problems by telling people to be more careful, by reprimanding the miscreants, or by issuing a new rule or procedure. This is ‘The Bad Apple Theory’: the belief that your system would be basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere.
In fact, this “name, blame, and shame” approach has a toxic effect. Not only does it fail to improve safety, it also pushes the issue of medical errors further into secrecy.

The new perspective

The new discipline of Patient Safety acknowledges that risk is inherent in medicine and error is inherent in the human condition. As Dr Lewis Thomas said eloquently, “We are built to make mistakes, coded for error.” We now understand that a human error problem is actually an organizational problem. Finding a ‘human error’ by any other name, or by any other human, is only the beginning of your journey, not a convenient conclusion. The new view recognizes that systems embody inherent trade-offs between safety and other pressures, such as time. For example, in an understaffed hospital, a rushed doctor may take shortcuts which jeopardize the patient’s safety, not because he is careless, but because he has many other patients to see. The major contribution of the patient safety movement has been to propagate the insight that medical error is the result of “bad systems,” not “bad apples”.
Errors can be reduced by redesigning systems and improving processes so that doctors and nurses (caregivers) can produce better results. While the discipline of Patient Safety can learn a lot from other high-hazard industries, such as aviation and nuclear power, the uniqueness of health care must not be lost. Health care is much more unpredictable, complex, and nonlinear than even the most complex nuclear power plant. Machines respond in a predictable way to a set of commands and processes; patients don’t—their response to medications and clinical interventions is far more variable and unpredictable. Machines don’t have families, feelings, language barriers, or psychosocial issues; patients do. While it is vitally important for us to learn techniques and lessons from other industries, the healthcare industry must produce leaders and champions from within the clinical community to face up to this challenge and devise solutions unique to the clinical environment.
Humans as heroes

While humans can cause problems, they are the solution as well. After all, only humans can recognize errors, and prevent and correct them. We need to balance both these views of human ability and experience: one uses technology, design, standardization and simplicity to reduce human fallibility, while the other stresses human adaptability, foresight and resilience as a shield against errors. Systems and processes are important, but in the end people make the difference. We need to think not in terms of humans as hazards, but rather in terms of humans as heroes. In reality, what’s amazing is that, in spite of the chaos, constraints, and limitations under which hospitals in India function, doctors and nurses are able to deliver safe care to their patients the vast majority of the time. This is on account of the hard work, individual vigilance, resourcefulness and problem-solving ability which medical staff bring to work every single day. Sadly, these cardinal virtues and abilities of clinical staff are being squandered on filling in forms. Their job seems to be to fix administrative and organizational inefficiencies, rather than being put at the service of patients. Clinical staff maintain safety by adapting to and working around these inefficiencies. If we truly want safer healthcare, front-line staff may have to complain more and demand action on these inefficiencies, on behalf of themselves and their patients.
Hospitals are complex adaptive systems, which means they do not respond in predictable ways to rules and policies. It also means that efforts to improve safety must combine rules and standards with messier activities that respect the importance of culture, innovation, and iterative learning in the clinical setting. A variety of strategies have been employed to create safer systems, including:

- Simplification
- Standardization
- Building in redundancies
- Using checklists
- Improving teamwork
- Communication
- Learning from past mistakes