By Stacey Largent, senior consultant, ValSource, Inc.
Anyone in the pharmaceutical industry can attest to the number of regulatory requirements and internal procedures they must comply with to produce a safe and efficacious drug product. Add the complication of stressful conditions such as the COVID-19 pandemic, and many companies struggle to make both the decisions they would face on any other day and those imposed by the current environment, e.g., how to maintain compliance with an environmental monitoring program with a reduced workforce and/or limited supplies.
Decisions that are made within the first hours, days, and even weeks of the start of a crisis are the most critical to ensuring success and preventing realization of risks. Bruce T. Blythe, chairman of the R3 Continuum, once said that decision making while in a crisis is “located somewhere between analysis and intuition.” Intuition equates to making the right decision without necessarily knowing the reason why.
This is the third of three articles from ValSource authors on some of the challenges of making decisions in difficult times. In the other two articles, we looked at how to combat stress with quality risk management (QRM) and how to ensure the production of safe and effective drug products in times of staff shortage. Here we examine decision-making practices used in the military to provide insight into how appropriate conclusions can be reached during the pandemic or other stressful conditions.
Powell’s 40-70 Rule
Colin Powell, retired four-star U.S. Army general and former National Security Advisor, commander of the U.S. Army Forces Command, and chairman of the Joint Chiefs of Staff, has his own interpretation of the paradigm presented by Blythe. A well-respected leader in the history of the U.S. military, Powell has been forced to make numerous decisions under stressful conditions and has often been quoted about what is now known as the 40-70 Rule: “Use the formula P = 40 to 70, in which P stands for the probability of success and its numbers indicate the percentage of information acquired. Once the information is in the 40 to 70 range, go with your gut.”1
So, what does this mean?
Judgment is more important than additional data; if you wait until you are 100 percent sure of success, it will often be too late. The rule identifies the point at which intuition, combined with a workable level of information, outperforms the success obtained by waiting for more knowledge. After estimating P for a situation, if it falls within that sweet spot, intuition should drive the decision.
Individuals put in charge of critical decisions must be decisive and may not always have all the necessary information available at the time. If P is less than 40, i.e., less than 40 percent of the needed information is available, the decision is likely being made too swiftly and will not be well informed; if P exceeds 70, the decision maker may be viewed as indecisive. If time is of the essence, P can sit closer to 40, because waiting for additional information may result in missed deadlines or even the wrong decision as a result of “analysis paralysis” (too much information leads to overthinking, and no decision is made). Powell has said that in the military, officers are taught to use only about 30 percent of the time allotted for decision making; with the objective of reducing risk, this principle minimizes the time during which a risk can be realized.
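The 40-70 heuristic can be sketched as a simple decision function. This is an illustrative reading of the rule, not a formal algorithm from Powell; the threshold messages are my own wording.

```python
def decide(info_pct: float) -> str:
    """Apply Powell's 40-70 heuristic to an information estimate.

    info_pct is the decision maker's rough estimate (0-100) of how much
    of the relevant information is in hand; thresholds follow the rule.
    """
    if info_pct < 40:
        return "wait: too little information for an informed call"
    if info_pct > 70:
        return "decide now: further delay risks analysis paralysis"
    # In the 40-70 band, the rule says to go with informed intuition.
    return "decide: combine available data with intuition"

print(decide(55))  # in the sweet spot: go with your gut
```

The point of encoding it this way is that the inputs are estimates, not measurements; the function only structures the judgment, it does not replace it.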
In addition to the amount of information available, the presence of stressful conditions during decision making can affect the decision itself. Prospect theory was first introduced in 1979 by psychologists Daniel Kahneman and Amos Tversky and was further evaluated in the military setting by U.S. Army Lieutenant Colonel James Schultz. The theory, as explained by Schultz, suggests that “the decision maker’s reference point determines the domain in which he makes a decision.” In other words, a person’s decision is based on the perception of gains or losses relative to their current environment or situation rather than on the actual final outcome. Because each individual values gains and losses differently, decisions are typically driven by these perceptions rather than by objective results. Three biases are built into prospect theory: (1) certainty (placing more weight on an option that is certain, i.e., choosing an assured “win” even if it yields a smaller gain); (2) the isolation effect (focusing on the elements that are unique to an option); and (3) loss aversion (losses provoke stronger negative reactions than equivalent gains provoke positive ones).2
While prospect theory does not predict the choice that will be made, it can reveal the decision maker’s risk tolerance. Essentially, an individual will be more risk seeking in a negative state, i.e., one of perceived losses, and more risk averse in a positive state of perceived gains.
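Loss aversion has a standard quantitative form. The sketch below uses the Kahneman-Tversky value function with the median parameter estimates from their 1992 follow-up study; the parameters are illustrative, not universal constants, and the function is a textbook formulation rather than anything specific to this article.

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Kahneman-Tversky value function relative to a reference point of 0:
    concave for gains, convex and steeper for losses (loss aversion).

    alpha and lam are the median estimates from Tversky & Kahneman (1992).
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Loss aversion in action: a loss of 100 "feels" more than twice as bad
# as a gain of 100 feels good.
print(abs(prospect_value(-100)) > 2 * prospect_value(100))  # True
```

The asymmetry is what shifts behavior with the reference point: framed in the domain of losses, a decision maker accepts more risk to avoid the steep downside of the curve.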
A case study of the decision to launch the failed World War II Operation Market Garden tested the theory. Operation Market Garden was intended to seize key bridges in Holland using the Allied Airborne Army under Supreme Commander General Dwight D. Eisenhower. The plan would reuse principles that had proven effective at Normandy, and its success would raise troop morale, but it carried great potential for Allied losses, e.g., a delay in establishing a base at Antwerp. Eisenhower approved the operation despite having a less risky alternative available and a history of risk-averse military decision making. Acting to improve logistics would delay a more decisive action against the Germans (ultimately framed as a gain), while delaying would leave his military force relatively weaker (framed as a loss). With his domain one of loss, as the opportunity to seize the bridgeheads was slipping away, the shift in his reference point and framing likely contributed to proceeding with the risky operation. Ultimately, Eisenhower chose the option offering the largest potential gain and the greatest risk, and the operation ended in failure for the Allies.3
Recognition Planning Model
Military planners have historically used the military decision-making process (MDMP) to facilitate and expedite decision making so they can get ahead of and neutralize threats before the opposition does. The MDMP is often abbreviated to speed it up, since it typically requires the evaluation of three courses of action (COA), but even then it is not ideal. This is where the recognition-primed decision (RPD) model, presented by Gary A. Klein et al., can be of benefit, as it speeds up decision making through quick mental simulations. The RPD model is a paradigm in which one probable COA is identified through intuition based on knowledge, training, and experience, rather than through an analytical comparison of alternatives. The shortcomings of the MDMP and RPD models led Klein and John F. Schmitt to develop the recognition planning model (RPM), which has been shown to increase planning tempo by approximately 20 percent. The basic RPM can be described in the following steps:
- “Identify mission” with the guidance of situational information or tasking from higher ranks
- “Test/operationalize COA” that may identify weaknesses that require evaluating consequences of an alternative COA
- “Wargame COA” to test whether it will be effective against the opposition
- “Develop orders,” which are then disseminated, executed, and potentially improvised.4
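The steps above can be sketched as a single-COA loop. Everything in this toy example, including the function names, the sample COA, and the revision loop, is a hypothetical illustration of the flow, not code drawn from Klein and Schmitt's work.

```python
from dataclasses import dataclass, field

@dataclass
class COA:
    """A single course of action with known weaknesses to work off."""
    name: str
    weaknesses: list = field(default_factory=list)

def recognize_coa(situation: str) -> COA:
    # "Identify mission": one probable COA via experience and intuition,
    # not a side-by-side comparison of three alternatives.
    return COA(name="envelop the objective", weaknesses=["supply line"])

def wargame(coa: COA) -> bool:
    # "Wargame COA": effective only once known weaknesses are addressed.
    return not coa.weaknesses

def recognition_planning(situation: str) -> str:
    coa = recognize_coa(situation)
    while not wargame(coa):      # "Test/operationalize COA"
        coa.weaknesses.pop()     # revise to address an identified weakness
    return f"orders: execute '{coa.name}'"  # "Develop orders"

print(recognition_planning("bridgeheads at risk"))
```

The key structural feature is that only one COA exists at any time; speed comes from revising it in place rather than generating and comparing alternatives.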
Risk-based Thinking And QRM Tools
One of the purposes of quality risk management (QRM) and risk-based thinking is to help with making science-based decisions with respect to risk (ICH Q9).5 While QRM is typically thought of in the pharmaceutical and medical device industries, its basic principles, along with risk-based thinking, are universally applicable, including in the military. Risk-based thinking requires companies to evaluate risk when developing processes, controls, and continuous improvements. The previously discussed theories and models all include some element of risk-based thinking. However, the use of risk management tools (both formal and informal) can help with analysis paralysis when too much information is available. These tools can provide structure and guidance on sorting through what is useful, and when information is lacking, tools can help identify which decisions should be prioritized. Ultimately, risk-based thinking (and the optional use of a formal tool) offers a method to think about things more strategically when the unknowns are greater than the knowns during a stressful time.
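A minimal example of the kind of informal risk-ranking tool described above: score each pending decision by the severity and probability of the risk it addresses, then work the highest scores first. The line items and scores here are invented for illustration; this is one simple reading of risk-based prioritization, not a method prescribed by ICH Q9.

```python
# Hypothetical pending decisions, each scored 1-5 for the severity and
# probability of the associated risk.
decisions = [
    {"item": "EM sampling frequency",     "severity": 4, "probability": 3},
    {"item": "gowning requalification",   "severity": 5, "probability": 2},
    {"item": "logbook review cadence",    "severity": 2, "probability": 2},
]

# Risk score = severity x probability, a common informal ranking approach.
for d in decisions:
    d["risk"] = d["severity"] * d["probability"]

# Prioritize: highest-risk decisions get attention first.
for d in sorted(decisions, key=lambda d: d["risk"], reverse=True):
    print(f'{d["item"]}: risk score {d["risk"]}')
```

Even a structure this simple counters analysis paralysis: it forces a finite list, a shared scale, and an explicit order of attack when the unknowns outnumber the knowns.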
Wartime and military decisions occur under some of the most taxing conditions, yet there are methods to turn those conditions to one’s advantage that can be applied universally. While researchers still do not fully understand the relationship between stress and decision making, stress can narrow the focus of one’s attention. Many hypothesize that stress may not necessarily have a negative effect, as it can actually simplify things during a crisis by helping the decision maker concentrate on critical issues or activities.6 Risk-based thinking and tools can further focus thoughts for decision making. It is important to understand which decisions should be made quickly and which can be delegated or need not be made at all. When insufficient information is available, individuals should rely on intuition, prior experience, and training to make as educated a decision as possible. In many cases, decisions can be reversed should additional information provide further clarity, allowing for any necessary course correction.
While the pharmaceutical industry is not the military, and producing drug products is not war, the methods described in this article can be applied in a variety of situations. The use of risk-based thinking and the structured approach of using QRM tools can provide the framework needed to make decisions during the unprecedented times we are currently undergoing.
The author would like to thank Jim Vesper, Ph.D., and Chris Smalley for their input on this article.
1. General Colin Powell – A Leadership Primer. https://www.slideshare.net/guesta3e206/colin-powells-leadership-presentation
2. Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263-291.
3. Schultz, J.V. (1997). A Framework for Military Decision Making under Risks. [Master’s thesis, School of Advanced Airpower Studies, Maxwell Air Force Base, Alabama]. Retrieved from: https://media.defense.gov/2017/Dec/29/2001862135/-1/-1/0/T_SCHULTZ_FRAMEWORK_FOR_MILITARY_DECISION.PDF
4. Ross, K., Klein, G., Thunholm, P., Schmitt, J., & Baxter, H. (2004). The Recognition-Primed Decision Model.
5. ICH Q9: Quality Risk Management. Geneva: International Conference on Harmonization.
6. Kowalski-Trakofler, K.M., & Vaught, C. (2003). Judgment and decision making under stress: an overview for emergency managers. International Journal of Emergency Management, 1(3), 278-289. Retrieved from: https://www.cdc.gov/niosh/mining/UserFiles/works/pdfs/jadmu.pdf
About The Author:
Stacey Largent is a senior consultant at ValSource, Inc. She specializes in quality risk management (QRM), aseptic processing, sterility assurance, and supporting the needs of vaccine/biopharmaceutical product development and manufacturing. She holds a B.S. in biomedical engineering from Drexel University and an M. Eng. in biological chemical engineering from Lehigh University. With almost 20 years in the pharmaceutical and medical device industry, she spent over 16 years at Merck and most recently served as head of its global QRM Center of Excellence. Largent also has experience in evaluating new technologies, tech transfer, leading complex investigations, and method development. She can be reached at firstname.lastname@example.org.