Catastrophic events like Deepwater Horizon, Exxon Valdez, major hurricanes, and other such anomalies have a tendency to overwhelm the initial crisis management leadership due to the chaotic nature of the event. The inability to quickly and accurately make critical assessments about the magnitude and complexity of the emerging catastrophe can spell disaster for crisis managers long before the response ever truly takes shape.

This paper argues for the application of metacognitive models for sense and decision-making. Rather than providing tools and checklists as a recipe for success, this paper endeavors to provide awareness of the cognitive processes and heuristics that tend to emerge in crises including major oil spills, making emergency managers aware of their existence and potential impacts. Awareness, we argue, leads to recognition and self-awareness of key behavioral patterns and biases. The skill of metacognition—thinking about thinking—is what we endeavor to build through this work.

Using a literature review and cogent application to oil spill response, this paper reviews contemporary theories on metacognition and sense-making, as well as concepts of behavioral bias and risk perception in catastrophic environments.

When catastrophes occur—and history has proven they will—the incident itself and the external pressures of its perceived management arguably emerge simultaneously, but not necessarily in tandem with one another. Previous spills have demonstrated how a mismanaged incident can result in an unwieldy and caustic confluence of external forces. This paper provides an awareness of the biases that lead to mismanagement and applies, for the first time, a summary of concepts of sense-making and metacognition to major oil spill response.

The views and ideas expressed in this paper are those of the author and do not necessarily reflect the views of the U.S. Coast Guard or Department of Homeland Security.

Catastrophic events like Deepwater Horizon, Exxon Valdez, major hurricanes, and other such anomalies have a tendency to overwhelm the initial crisis management leadership due to the chaotic nature of the event. The inability to quickly and accurately make critical assessments about the magnitude and complexity of the emerging catastrophe can spell disaster for crisis managers long before the response ever truly takes shape.

This paper argues for the application of metacognitive models for sense and decision-making. Metacognition is simply thinking about one’s thinking. Metacognitive models are the models that reflect the patterns of one’s thinking. Rather than providing tools and checklists as a recipe for success, this paper aims to provide awareness of the mental shortcuts and decision processes that tend to emerge in crises including major oil spills, making emergency managers aware of their existence and potential impacts. Awareness, we argue, leads to recognition and self-awareness of key behavioral patterns and biases. The skill of metacognition—thinking about thinking—is what we endeavor to build through this work.

This paper will review contemporary theories on metacognition and sense-making, as well as concepts of behavioral bias and risk perception in catastrophic environments. Sense-making can best be thought of as a subset of meta-cognition and is the process whereby one assigns meaning to one’s experiences. This paper will also review the work of psychologists Dr. Amos Tversky and Dr. Daniel Kahneman, and others, to discuss how these principles can affect crisis managers during the initial hours of oil spill response.

When a catastrophe occurs, the incident itself and the external pressures of its perceived management emerge simultaneously, but not necessarily in tandem with one another. Previous spills have demonstrated how a mismanaged incident can result in an unwieldy and caustic confluence of external forces. This paper will provide an awareness of biases that lead to mismanagement and apply for the first time a summary of concepts of sense-making and metacognition to major oil spill response.

This paper is not a leadership critique of major oil spills, nor a reflective analysis of failed crisis response. Rather, it aims to create a culture of metacognition, in which leaders are more acutely aware of their decision-making and sense-making propensities as an event unfolds. Arguably, when we know, or have had the opportunity to reflect on, our biases and weaknesses, we are better prepared to recognize and address them as they emerge later.

In considering metacognition applications, this paper will focus on three theories of importance not only to oil spill response but to the broader crisis response community. These theories are prospect theory, the Cynefin Framework for sense-making, and the theory of meta-leadership as proposed by Harvard’s National Preparedness Leadership Initiative (NPLI). Understanding these three theories and how they interact can strengthen expectations of how a Unified Command will make important operational decisions.

Prospect Theory.

Prospect theory, proposed by psychologists Dr. Amos Tversky and Dr. Daniel Kahneman, suggests that our psychology of choice when faced with losses and gains tends to be unreliable and highly prone to bias. This idea contrasts with traditional utility theories of economy that suggest people on the whole make predictable choices to maximize personal gain. The following summarizes the characteristics of prospect theory relevant to this paper:

  • In situations where there is mixed risk, e.g., there is potential for loss as well as gain, loss aversion tends to result in extremely risk-averse choices.

  • Where there is certainty of loss, behavior becomes risk-seeking rather than risk-averse: people gamble on a low-probability chance of avoiding loss rather than accept a moderate, certain loss. “People become risk seeking when all their options are bad” (Kahneman and Tversky, 2011).

Three distinct cognitive features of prospect theory relevant to spill response include:

  1. Adaptation level. Our reference point calibrates losses and gains. For example, if a colleague is offered a bonus, our reference point of gain and loss becomes anchored to the colleague’s fortunate position. If we get less of a bonus, we see it as a loss; more of a bonus would be a gain (Kahneman and Tversky, 2011).

  2. Diminishing sensitivity. Losses and gains are perceived in proportion to one’s reference point. An increase from $100 to $200 feels larger than an increase from $800 to $900, even though both are $100 changes. The same holds for losses: a potential loss growing from $100 to $200 is weighted differently than one growing from $800 to $900 (Kahneman and Tversky, 2011).

  3. Loss aversion. “Losses loom larger than gains.” Consider a coin toss that risks a loss of $100 against a gain of $150. The gain exceeds the loss by $50, and the expected value of the wager is positive, so it is in our interest to take it. However, because losses loom larger than gains, it is typical to reject the wager anyway. The loss-aversion ratio typically puts the gain needed to accept such a gamble at roughly 1.5 to 2.5 times the potential loss (Kahneman and Tversky, 2011).
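The three features above can be combined into the standard prospect-theory value function. The sketch below is illustrative only: the curvature and loss-aversion parameters are commonly cited experimental estimates, not values from this paper.

```python
# Sketch of the Kahneman-Tversky value function (illustrative parameters).
# alpha models diminishing sensitivity; lam (lambda) models loss aversion.

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain/loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# The coin toss from the text: lose $100 or gain $150.
# The expected monetary value is positive...
ev = 0.5 * 150 + 0.5 * (-100)

# ...but the loss-averse subjective value is negative, so the wager is refused.
sv = 0.5 * value(150) + 0.5 * value(-100)

print(ev)      # 25.0
print(sv < 0)  # True: the loss looms larger than the gain
```

Running the sketch shows how a monetarily favorable gamble can still be subjectively unattractive once losses are weighted more heavily than gains.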

The idea of weighting gains differently than losses is a compelling issue that warrants our attention as responders. Our tendency toward loss and risk aversion, or toward risk-seeking in circumstances of sure loss, distorts normal and rational decision-making. In the most extreme cases, when the problem presents overwhelming odds or there are too many options to choose from, we choose nothing; the default decision is to make no decision at all.

Cynefin Framework.

The Cynefin Framework (pronounced kuh-nev-in) is a sense-making framework designed to aid decision makers in making sense of a wide array of unspecified problems (Boone and Snowden, 2007). The framework “helps people break out of old ways of thinking and consider intractable problems in new ways” (Boone and Snowden, 2007). It is composed of four transitional domains, each representing a separate place in our sense-making of a particular event. Figure 1 represents the framework (Boone and Snowden, 2007).

Figure 1. Cynefin Framework


Because the Cynefin Framework is a tool for sense-making, the purpose of its domains is to help give meaning to our experiences by applying recognizable patterns to our circumstances. The right-hand domains are those that are ordered, meaning simply that order and pattern can be discerned from the emergent event—cause and effect is apparent. The left-hand domains exist in an unordered state, which has no immediately recognizable pattern. When most oil spills occur, there is a cause and effect pattern that can be readily identified. In these events, we analyze, categorize and respond by selecting the appropriate response strategy. This is the ordered domain. However, when causality cannot be determined, we cannot easily select a response strategy. The link between cause and effect is not apparent; therefore, by selecting a particular strategy, we cannot assume a particular result. This is the unordered domain (Kurtz and Snowden, 2013).

Major catastrophes like the 1989 oil spill from the tank vessel Exxon Valdez, the devastation caused by Hurricane Katrina in 2005, or the 2010 oil spill from the Gulf of Mexico Macondo well blowout ultimately dwell in the unordered complex and chaotic domains, although this is not always where they begin. Using the Macondo well blowout as an example, spudding the well and well completion were normal operations employing standard procedures in early April of 2010. Consequently, for decision-making purposes, these types of operations can be categorized in the simple domain. However, situations can arise that require deviation from standard operations—this is the complicated domain. For example, pressure anomalies were detected on the Deepwater Horizon semisubmersible on April 20. Later that day, mud overflowed the flow-line onto the drilling rig floor ten minutes before an explosion occurred. As events leading up to the blowout progressed, the situation became increasingly complex—cause and effect became more and more obscured. The transition from the complicated and complex domains to the chaotic domain was nearly seamless. In events that provide no warning, e.g., earthquakes, terrorism, or human error, the emergence of chaos occurs directly from the simple domain.

Mary Douglas, known for her work on symbolism and social psychology, offers an important insight about our perception and recognition of patterns (1966):

“…As perceivers we select from all the stimuli falling on our senses only those which interest us, and our interests are governed by a patternmaking tendency.... In a chaos of shifting impressions, each of us constructs a stable world in which objects have recognizable shapes, are located in depth and have permanence . . . As time goes on and experience builds up, we make greater investment in our systems of labels. So a conservative bias is built in. It gives us confidence” (Douglas, 1966).

In other words, we look for familiar patterns that reinforce what we believe and discount information that doesn’t fit our existing models. Douglas makes an important observation about sense-making and alludes to a tendency in how we perceive events as they occur.

Meta-Leadership.

If metacognition is “thinking about one’s thinking”, then meta-leadership can be thought of as thinking about one’s leadership. The National Preparedness Leadership Initiative (NPLI) was established by the Harvard School of Public Health’s Division of Policy Translation and Leadership Development and the Harvard Kennedy School’s Center for Public Leadership. NPLI combined the theories of meta-cognition and organizational leadership to create the discipline of meta-leadership. NPLI has looked retrospectively across major contemporary crises to define five domains of crisis leadership deemed essential to successful response. In this paper, we contend these domains can be further separated into two response categories—Awareness and Execution:

Awareness

  1. Understanding yourself as a leader (understanding your bias, willingness to accept risk, capabilities, limits, et cetera).

  2. Understanding the situation (recognizing your Cynefin domain, knowns, unknowns, et cetera).

Execution

  3. Leading the silo (support and inspire your team).

  4. Leading up (know your boss’s expectations).

  5. Leading laterally (leveraging networks).

The meta-leader will have successfully developed the aforementioned skills, which are essential to combating the biases and heuristics discussed in this paper. Honed meta-leadership in major spill response becomes particularly necessary in the unordered complex and chaotic domains. It is here that confident, even-handed, rational, and inspiring leadership is essential. Meta-leadership is the thread that binds metacognition and sense-making together.

The following section applies the above theories to oil spills and crisis response. Recall that the biases described by prospect theory influence our evaluations of risk and subsequent decision-making. The Cynefin framework is then applied to aid in categorizing the way we understand the event and our subsequent response options. Finally, NPLI’s meta-leadership provides a leadership model, framed by awareness and response activities.

Prospect Theory applied to Response.

The distinct features of prospect theory applied to oil spill response include:

  1. Adaptation level. Our reference point calibrates losses and gains. An example here might be looking to the last spill response as our standard and expectation for future responses, or letting the success of a given response strategy calibrate our expectation of future events. A major oil spill of a magnitude we have not personally experienced will establish an anchor—a mental reference point—by which all our subsequent decisions will be made. This is the idea of “fighting the last war”: future response patterns will be based predominantly on historical events. Good or bad, the patterns we have learned guide our decision-making like mileposts.

  2. Diminishing sensitivity. Losses and gains are perceived in proportion to the reference point. A change in spill volume from 1,000 to 2,000 gallons is perceived differently, and appears more severe, than a change from 15,000 to 16,000 gallons, although both represent a change of 1,000 gallons. This, of course, influences when we choose to select certain tactics and when we do not.

  3. Loss aversion. “Losses loom larger than gains.” In this case, an investment in response tactics where even moderate risk of failure is involved requires a large return on the investment.
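As a rough numerical illustration of diminishing sensitivity applied to spill volumes, the sketch below uses a logarithmic perception function; the function choice is an assumption of this sketch, not a model from the paper.

```python
import math

# Illustrative only: a logarithmic "perceived severity" function is one
# simple way to model diminishing sensitivity to spill volume.

def perceived_change(old_gal, new_gal):
    """Perceived magnitude of a change in spill volume (log scale)."""
    return math.log(new_gal) - math.log(old_gal)

small_spill = perceived_change(1_000, 2_000)    # volume doubles
large_spill = perceived_change(15_000, 16_000)  # same 1,000-gal increase

# The identical 1,000-gallon change registers far larger at the lower
# reference point.
print(small_spill > large_spill)  # True
```

Under this assumed model, the jump from 1,000 to 2,000 gallons is perceived roughly ten times as strongly as the jump from 15,000 to 16,000 gallons.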

Table 1 is a crosswalk of prospect theory, resulting bias, and application of the Cynefin framework:

Table 1. Crosswalk of Sense-Making and Bias


Cynefin Framework applied to Response.

The most prevalent example of bias in oil spill response appears in the initial stages, where preparation and initial actions are vital: the simple and complicated domains of Cynefin. Optimism bias and over/underestimation, both planning biases, are those most likely to emerge. Optimism bias is common in contingency planning (underestimating the probability of certain spills), whereas estimation bias is prevalent early in spill response operations (initially estimating the volume of a spill). The latter is a trap that repeatedly ensnares Unified Commands. The absence of accurate information in the initial hours of a spill forces gross estimations by the Unified Command. These estimations, consistent with prospect theory, typically misjudge losses and gains; the result is an underestimation of spill volume and an overestimation of recovery capabilities and timeframes. This situation is understandable—remember that losses loom larger than gains. Consequently, we underestimate the loss and overestimate the gain.

The impacts of perceived risk on decision-making in oil spill response can be seen repeatedly. Consider the debate over the use of dispersants and other alternative response technologies. All response options are on the table when a major spill first occurs; everything from booming to alternative response technologies is available to the Unified Command to help mitigate the adverse impacts of the spill. However, some response options tend to be removed early in the response. Decisions to use dispersants or in-situ burning are often constrained by windows of opportunity created by the location and environmental conditions of the spill. Additionally, these response tools carry potential tradeoffs in environmental effects. So why is there sometimes an aversion to using dispersants and in-situ burning?

The answer might be found in cognitive bias, particularly in the fact that our decision-making is impacted by our weighted perception of risks and gains, perceptions that tend to be shaped by our observations and sense-making (McDermott, Fowler, and Smirnov, 2008). The tendency to weight losses more heavily than gains, as described by Kahneman and Tversky, may not be a condition we can avoid; it may instead be an intuitive reaction to crises as they emerge (McDermott, Fowler, and Smirnov, 2008). In our dispersant example, in an effort to protect sensitive environments in the nearshore or on the water surface, dispersants may be an acceptable solution that could arguably mitigate significant irreparable damage, changing the course of the spill response—a potential long-term gain. But this benefit is eclipsed by the idea of applying dispersants to the already marred environment and by the sociopolitical ramifications of using the method—a short-term loss. Consequently, Unified Commands often retreat to the conservative position of traditional containment countermeasures and on-water mechanical recovery. This approach may mitigate a very small percentage of the total spill volume at greater cost, potentially placing the environment at far greater risk.

Considering the above scenario, when making decisions in the chaotic and complex domains of Cynefin, where information is unclear, more information is not always better. This is especially true when faced with high degrees of risk. Research shows that when faced with risk, too many choices, and thus too much information, may result in no choice at all, a condition termed decision distortion or decision aversion (Redelmeier and Shafir, 1995). This creates a paradoxical conflict with the “ambiguity aversion” proposed by Daniel Ellsberg (1961), which suggests that people tend to prefer a wager whose probabilities are known over one about which little is known (Pulford and Coleman, 2008). This is further supported by research by Tversky and Fox, which found that people tend to wager on vague beliefs where they feel particularly knowledgeable, “but will bet on chance when they do not” (Pulford and Coleman, 2008).

Each domain of the Cynefin framework requires a different leadership style and application of action. The simple domain requires only best practices and tested methodologies. Here, responders collect information (sense), categorize based on spill type, volume, product, environment, etc., to determine appropriate actions, then respond. The sensible application of lawful authority and the provisions of the National Oil and Hazardous Substances Pollution Contingency Plan are enough in this domain. As a spill response becomes more severe, the complicated domain emerges. Here, myriad variables whose cause-and-effect relationships may not be readily discernable come into play, and expert advice becomes necessary—this is the domain where bias tends to emerge, including the dismissal of new, novel approaches offered by “non-experts.” In the complicated domain, good practices are acceptable. Again, responders sense, analyze to determine viable options, then respond appropriately.

In the complex domain, patterns are not readily discernable and there is no “right answer” for the response; therefore, novel practices are fully applicable here. Sun Tzu’s The Art of War observes, “There are no set rules. The rules can only be established according to the circumstances” (Griffith, 1964). This succinctly defines the complex domain. Flexibility and an environment of creativity and experimentation are essential. In this domain, diminishing sensitivity, loss aversion, and adaptation—all features of prospect theory—emerge. Finally, the chaotic domain is the domain of the unknowable. The cacophony of the fog of war dominates this space, and no clear cause and effect is apparent. Risk expert Patrick Lagadec suggests that parallel teams, established to think through intractable issues as they emerge, are essential (Lagadec, 1993). Novel techniques are key. In this domain, it is necessary to act immediately and establish command quickly to restore some degree of order.
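The domain guidance above can be summarized in a small lookup table. The sense-making sequences follow the standard Cynefin formulation; the structure and field names here are illustrative, not an established response tool.

```python
# A minimal lookup encoding the domain guidance summarized above.
CYNEFIN_DOMAINS = {
    "simple":      {"state": "ordered",
                    "process": "sense -> categorize -> respond",
                    "practice": "best practices"},
    "complicated": {"state": "ordered",
                    "process": "sense -> analyze -> respond",
                    "practice": "good practices (expert advice)"},
    "complex":     {"state": "unordered",
                    "process": "probe -> sense -> respond",
                    "practice": "novel/emergent practices"},
    "chaotic":     {"state": "unordered",
                    "process": "act -> sense -> respond",
                    "practice": "novel practices; act to restore order"},
}

def guidance(domain):
    """Return a one-line summary of the recommended approach for a domain."""
    d = CYNEFIN_DOMAINS[domain.lower()]
    return f"{domain}: {d['process']} ({d['practice']})"

print(guidance("complex"))
```

A Unified Command working from such a summary would first ask which domain the event currently occupies, then select the matching sense-making process.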

The following is a simple guide to applying the Cynefin framework to spill response along with the most applicable type of practice, level of understanding and proficiency, and sense-making process.

Table 2. Sense-Making and Spill Response


Connecting Prospect Theory & Cynefin to Meta-leadership.

Plenty of examples of bias and stress-impacted decision-making exist in spill response. Frankly, the introduction and investigation of these propensities should be at the forefront of any crisis leadership course; however, this is the exception rather than the norm. As a result, we continue to fall into the same traps and biases repeatedly, despite our best intentions to put forth the leadership that matters at the time when it matters most.

Tools for the Meta-Leader.

  1. Understanding our biases, both intentional and unintentional, is a process of metacognitive behavior and an absolutely essential skill for the crisis manager.

  2. We view problems through the lens of our experiences.

  3. Be aware of discounting: discarding what doesn’t fit our mental model.

  4. In best cases, formulate an initial range for your estimates, then revisit it as new data or estimates arrive to further hone the range, providing a starting point that is far more reliable than a wild guess (Hubbard, 2010).

  5. Develop an executive team that can check decisions for blindness and bias.

  6. Develop strong self-awareness and practice bias-recognition and introspection.
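Tool 4 above (the range-refinement approach Hubbard describes) can be sketched as a simple interval-narrowing loop. The spill-volume figures and estimate sources below are hypothetical, used only to show the mechanic.

```python
# A minimal sketch of Tool 4: start with a deliberately wide range, then
# narrow it as observations arrive by intersecting it with each new
# estimate's range.

def refine(current, new):
    """Intersect the current (low, high) range with a new estimated range."""
    low = max(current[0], new[0])
    high = min(current[1], new[1])
    if low > high:  # ranges disagree entirely: fall back to the new evidence
        return new
    return (low, high)

estimate = (1_000, 100_000)            # initial wide range, gallons
for observed in [(5_000, 60_000),      # e.g., overflight estimate
                 (8_000, 40_000),      # e.g., sheen-coverage calculation
                 (10_000, 30_000)]:    # e.g., metering/transfer records
    estimate = refine(estimate, observed)

print(estimate)  # (10000, 30000)
```

Each pass keeps the Unified Command working from an honest range rather than a single point guess, and the range tightens only as evidence actually warrants.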

This paper has explored the concept of thinking about one’s thinking through the biases described by prospect theory, and has introduced sense-making as a framework for categorizing one’s environment. The next step is applying meta-leadership to tie the concepts firmly together. The first steps of meta-leadership establish awareness through self-examination and examination of our environment. It is in this context that we execute our response.

Step 1: Awareness. Understanding yourself (understanding your bias, willingness to accept risk, capabilities, limits, et cetera).

It should be the immediate task of the crisis leader to balance intuitive and creative tendencies with process-oriented decision-making. The “emotional basement” is a term coined by the National Preparedness Leadership Initiative; it is the equivalent of Kahneman and Tversky’s system one thinking, where intuition is the predominant driver and more calculated, process-driven thought (system two, analytical thinking) takes the back seat. The danger in intuitive thinking, often foisted on us by our evolutionary hunter-gatherer instincts (fight or flight), is that this is where heuristics and biases (mental shortcuts) are, by necessity of survival, developed and employed (McDermott, Fowler, and Smirnov, 2008). Upon the transition to system two thinking—often via a series of pre-established checklists—we tend to “box” ourselves into a standard calculus of thinking, hindering the ability to think creatively (Marcus, Heshkanazi, et al., 2015).

Step 2: Awareness. Understanding the situation (recognizing your Cynefin domain, knowns, unknowns, et cetera).

In the chaotic and unknowable domain, there are no discernable patterns and information emerges faster than it can be processed. These dynamics further deepen biased thinking and keep initial crisis leaders in the basement. Adaptive systems like the Incident Command System (ICS) can be used as a tool to transition from the unknowable domain of chaos, where there is no discernable pattern, to the complex domain, where, although unknowable in advance, patterns become evident in retrospect. Systems like these use feedback loops within their planning cycle, allowing new environmental information gained through action and probing to build increased situational awareness.
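The feedback-loop idea can be sketched as a minimal act-sense-update cycle: each operational period folds new field information into a common situational picture. All names and values below are hypothetical illustrations, not an actual ICS implementation.

```python
# An illustrative planning-cycle loop: successive operational-period
# reports update the shared situational picture, which is how probing and
# action gradually convert chaos into recognizable patterns.

def planning_cycle(picture, field_reports):
    """Fold successive operational-period reports into the common picture."""
    for report in field_reports:
        picture.update(report)  # new information gained through action/probing
    return picture

picture = {"domain": "chaotic"}
reports = [
    {"source_secured": False, "volume_range_gal": (1_000, 100_000)},
    {"source_secured": True, "volume_range_gal": (5_000, 20_000)},
    {"domain": "complex"},  # patterns now evident in retrospect
]
final = planning_cycle(picture, reports)
print(final["domain"])  # complex
```

The point of the sketch is the loop itself: situational awareness is not gathered once but accreted, period by period, as actions generate new information.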

The execution phase of NPLI’s meta-leadership involves the application of leadership: leading the silo (support and inspire your team), leading up (know your boss’s expectations), and leading laterally (leveraging networks). In some extreme cases, leaders make certain decisions during responses because they are ordered to do so by their superiors. In these cases, leading up is essential. Speaking truth to power, communicating effectively, and being a great subordinate help to manage the boss’s expectations (Marcus, Heshkanazi, et al., 2007).

Transitioning a major response effort—whether a major oil spill or another national crisis—from the unknowable to the knowable domains requires the skills of the meta-leader. It is our contention that it is within the unknowable domains that meta-leadership emerges and is most valuable in its application. To this point, one of the key initial priorities of any crisis leader should be to gain situational awareness as quickly as possible, even if it is only partially accurate. The challenge becomes distilling the correct information from the streams of disparate data that may be coming in during the initial phases of the event.

When the phone rings in the middle of the night with news that a major disaster has struck, there will be an absence of information. It is important for the Incident Commander to take five minutes, right up front, to ask a series of introspective questions:

  • What information do I have (what do I know)?

  • What is familiar to me (what have I seen before)?

  • What have I never seen before (what is totally unfamiliar)?

  • What do I want to do?

  • What do I need to do?

  • What are my potential biases?

  • What sense-making domain is this?

Taking the time to answer these few questions will better enable the leader to establish his or her meta-leadership and avoid the pitfalls of the basement and rapid system one thinking. It helps to answer the most important question: what is it we want to accomplish? The first domain of meta-leadership requires a dynamic situational perspective that is regularly reevaluated. Failure to conduct the necessary introspection will inevitably result in myopic leadership based on bias and heuristics rooted in subjective analysis—action without context.

This paper has been written to challenge leaders who might find themselves in charge of a major spill or other crisis to take the time to develop their skills as meta-leaders: developing emotional intelligence, the ability to create mutually agreeable priorities, and the ability to think and reason clearly, aware of the potential biases that prevail in crisis environments.