
“Leading with Eyes Wide Open” Series: #1

August 24, 2020 | by Kathy Scott, PhD, RN, and Bridget Sarikas




Seeing and Understanding Risks and Opportunities in the Crisis


This is the first blog in our series titled “Leading with Eyes Wide Open.” It is for those who want to step up their game as leaders in the crisis, with increased awareness and understanding of the many opportunities and risks in our midst. For it is through this understanding that we build our effectiveness, as well as our resilience.


Most of us have experienced those moments when we suddenly see something for the first time – something that’s been there all along. We wonder how it was possible to NOT notice it before because it is so glaringly obvious – the chipped paint, the sagging couch, the long chin hair (yikes), to name a few on the negative side. Positive eye-openers happen as well – the talent of a young employee in our midst, the beauty of a piece of artwork we pass daily, a manicured yard in the middle of the unkempt, a perfect rose on a bush that appears dead. Why not provide us with a few of your own?


It’s similar to a first snorkeling experience. Imagine swimming for several days in beautiful, clear ocean waters. Then one day you decide to try snorkeling. You find the equipment stand, rent fins, a snorkel, and a mask, wade out to your knees, and don your gear. As you shove off and dip your head under the cool surface, you suddenly see a whole new world – an amazing ecosystem that is alive with vibrant colors, choreographed coral reefs, fish of all shapes and sizes, creepy-looking prehistoric creatures that scare the bejesus out of you, and so much more! (Sadly, no mermaids, talking lobsters, or singing flounder – but just about as good.) It is astounding to realize that you had been in the midst of all this beauty and scariness, and yet were so unaware!


Awareness often occurs when the unexpected happens – either when something beyond our control intervenes, or when something we do produces a different outcome than we anticipated. We suddenly become aware of a risk that could produce grave consequences or new opportunities, depending on our ability to see, understand, and adapt to it.


If the pandemic has taught us anything about today’s world, it is this:


  1. The unexpected happens. Retreat from changing conditions is not an option.

  2. We do not operate in a vacuum. We are continuously impacted by other people and systems (e.g., travel systems, work systems, school systems, supply-chain systems, etc.).

  3. Life is not totally random. In fact, much of it is predictable. (There is kind of an odd comfort in this.)

  4. Just because we can’t see it, doesn’t mean it isn’t there.

  5. None of us are perfect – sad, but true. We should expect ourselves and others to make mistakes.

  6. Humans, especially Americans, are not rule-followers by nature. Rules will never be enough to keep us on track. This one is worth repeating!

  7. Our biases (and we all have them) can blind us to solutions right in front of us.

  8. Relying on our past successes can be dangerous.

  9. We can help ourselves and others through small and large changes. In other words – it is okay to lean on each other.

  10. To be effective and resilient, we have to be able and willing to see and understand our risks and opportunities.

So how can we help ourselves and our organizations deal with the unexpected? It begins with seeing what’s below the surface with eyes wide open. It begins with a better understanding of our fallibility.


A landmark report by the Institute of Medicine (1999) examined errors in healthcare and estimated that as many as 98,000 people die in hospitals each year as a result of medical errors that could have been prevented. Right on the heels of this report, in 2000, came research into human perceptions of fallibility, conducted by a University of Texas human factors research group (Sexton, Thomas & Helmreich) that looked at perceptions and attitudes about error, stress, and teamwork. They compared the views of healthcare workers and airline cockpit crewmembers in four categories: 1) perceptions of stress and fatigue; 2) attitudes toward teamwork and hierarchy; 3) differing perspectives of teamwork in medicine; and 4) attitudes about mistakes, error, and safety. The study revealed the following results:

Perceptions of stress and fatigue.

In response to the item “Even when fatigued, I perform effectively during critical times,” 60% of all medical respondents (physicians and nurses) agreed, ranging from 70% (surgeons) down to 47% (anesthesiologists). The rate of agreement was much higher in medicine than in aviation, where only 26% of pilots agreed. In response to the item “My decision-making ability is as good in medical emergencies as in routine situations,” 70% of all medical respondents agreed. In general, only a minority of respondents openly recognized the effects of stress on performance. Kudos to the minority who were brave enough to be aware of their limitations.

Attitudes toward teamwork and hierarchy.

Most respondents (70%) did not agree with the statement that junior team members should not question the decisions made by senior team members, but there were differences by position and discipline. Surgeons were least likely to advocate flat hierarchies (55%). By contrast, 94% of cockpit crewmembers and intensive care staff advocated flat hierarchies. Flat hierarchies have a much greater tendency to engage the decision-making of the entire team. Take a moment and consider how this relates to your organization.

Differing perspectives of teamwork in healthcare.

Responses to the item “Rate the quality of teamwork and communication or cooperation with consultant surgeons” showed a variety of perspectives. Surgeons and residents rated the teamwork they experienced with other surgeons the highest, while anesthesia residents, anesthesia nurses, and surgical nurses rated interactions with surgeons the lowest. Overall, surgeons reported good teamwork with anesthesia, but anesthesia staff did not hold reciprocal perceptions. Differences between physicians and nurses were also found in intensive care: 77% of doctors reported high levels of teamwork with nurses, but only 40% of nurses reported high levels of teamwork with physicians. Hmmm…how much self-awareness was going on here?

Attitudes about mistakes, error, and safety.

Over 94% of the intensive care staff disagreed with the statement “Errors committed during patient management are not important, as long as the patient improves.” Twenty percent also believed that confidential systems for reporting errors were important for patient safety. One out of three intensive care respondents did not acknowledge that they made errors, and over half of the respondents reported that decision-making should include more input from team members. The study also indicated that respondents found it difficult to discuss mistakes, acknowledging several barriers: personal reputation (76%), the threat of malpractice suits (71%), high expectations of patients’ families or society (68%), possible disciplinary actions by licensing boards (64%), threat to job security (63%), and the expectations (61%) and egos (60%) of other team members. Just on the surface this is alarming – let alone if we were to dig deeper. It’s difficult to learn from mistakes if you don’t acknowledge and reflect on them.

This 2000 study demonstrated that the majority of pilots and physicians across cultures have a sense of personal invulnerability. More recent studies of errors in healthcare tell us that these perceptions persist in this high-risk industry and that progress has been very slow. Problems such as hand-hygiene noncompliance, patient falls with injury, medication errors, and wrong-site surgery still occur routinely in our health systems today (Chassin, 2019).

So, let us find the courage to look below the surface and recognize that we are fallible and that:

  1. Our decision-making isn’t as good in emergencies as in normal situations;

  2. Our performance is affected by personal biases and problems;

  3. We do make more errors under high stress; and

  4. Professionals often have a difficult time acknowledging and examining their failures.

This understanding points to actions we can take to overcome our vulnerabilities and recognize our opportunities. (We can do it!) One way to help ourselves is through the design of systems that help keep us on track. This will be the subject of our next blog.


Titter Time: BRAIN FOG

“Due to intense brain fog, all of my thoughts have been grounded until further notice.”

~ Anonymous


References


Chassin, M. (2019, November 18). To err is human: The next 20 years. The Joint Commission High Reliability Healthcare [blog post]. Retrieved from https://www.jointcommission.org/resources/news-and-multimedia/blogs/high-reliability-healthcare/2019/11/to-err-is-human-the-next-20-years/


Institute of Medicine, Kohn, L. T., Corrigan, J. M., & Donaldson, M. S. (Eds.). (1999). To err is human: Building a safer health system. Washington, DC: National Academy Press.


Sexton, J., Thomas, E., & Helmreich, R. (2000, March). Error, stress, and teamwork in medicine and aviation: Cross sectional surveys. British Medical Journal, 320, 745-749.
