
Risk management in development projects, Prof. Dr. Halman – my thoughts on the lecture from a decision-making perspective

During the course Safety by Design I attended an interesting guest lecture by Prof. Dr. Halman, who introduced his field of expertise: risk management in innovation projects and how to optimize decision-making in these projects. From a decision-making point of view in particular, relevant conclusions can be drawn. In the following I present my own thoughts on Prof. Halman’s lecture and connect them to other literature findings in decision-making. The slides of Prof. Halman’s presentation can be found here:

https://safety.productions/2018/12/04/risk-management-in-development-projects/

Compared to the psychological viewpoint on decision-making, decisions in the business environment are dynamic in nature, as the environment is rapidly changing. This holds especially true in complex systems such as those in the systems engineering domain. Business decisions therefore have four distinctive characteristics compared to other decisions: they involve a series of decisions, the decisions are interdependent, the environment changes constantly, and the decisions must be made in the correct order and at a specific moment in time (Brehmer, 1992). That is why, in the business environment, human decision-makers are particularly challenged to make the correct decision and should be as objective as possible. But human decision-makers are not free of subjectivity when they make decisions. For example, there is a difference between actual and perceived risk for decision-makers. Often we make decisions about risks based on our assumptions, not on empirical data. In this context, Prof. Halman discussed the perceived risk of dying in a plane accident versus the actual risk of dying in a car accident. We, as humans, are inclined to perceive risk differently from what it actually is; our assumptions therefore play an essential part in the decision-making process.
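To make the gap between perceived and actual risk concrete, a back-of-the-envelope comparison helps. The sketch below uses rough, order-of-magnitude fatality rates per distance travelled; the rates and the trip length are illustrative assumptions of mine, not data from the lecture.

```python
# Illustrative, order-of-magnitude comparison of actual fatality risk per
# distance travelled. The rates and trip length are rough assumptions for
# illustration only; they are not data from the lecture.

deaths_per_billion_km = {
    "car": 3.1,        # assumed order of magnitude for car travel
    "airliner": 0.05,  # assumed order of magnitude for commercial aviation
}

trip_km = 500  # a hypothetical medium-distance trip

for mode, rate in deaths_per_billion_km.items():
    risk = rate * trip_km / 1e9
    print(f"{mode:>8}: ~{risk:.1e} fatality risk for a {trip_km} km trip")

ratio = deaths_per_billion_km["car"] / deaths_per_billion_km["airliner"]
print(f"Per kilometre, driving is roughly {ratio:.0f}x riskier than flying.")
```

Even with generous uncertainty on the assumed rates, the per-kilometre gap spans more than an order of magnitude, which is exactly the kind of discrepancy our intuition fails to register.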

Moreover, in his lecture Prof. Halman stressed the findings of the well-known researchers Kahneman and Tversky, who demonstrated that how a statement is framed influences how the decision-maker perceives it. Generally, people tend to avoid losses (Kahneman, 2011). This suggests that by framing a decision in a positive way, the decision-maker is more inclined to think about it critically, which is especially useful if the decision involves the mitigation of risks (Kahneman, 2011). This can be used to a decision support tool’s advantage.

This is also supported by other researchers’ findings, as known reasons why decisions fail are human mental models and psychological traps (Chermack, 2004; Hall, 2010). Mental models in particular are how we understand the world; they include the assumptions we use to explain the world to ourselves. The mind constructs these models from perception, imagination, knowledge, and discourse comprehension (Johnson-Laird, 2006). Besides the mental models of humans, Chermack (2004) identified three more characteristics of human decisions that can lead to failure: bounded rationality, stickiness of information (friction of knowledge), and the tendency to rely only on outside variables instead of past performance data (ignoring iterative feedback loops). An interesting aspect to consider further is friction of knowledge, which implies that people from different backgrounds have a different understanding of the information they receive. Depending on the cost of knowledge transfer, they are inclined not to share information, which can lead to bad decisions made in ignorance. The cost of knowledge transfer increases when team members speak different languages, come from different backgrounds, or interact in different social environments (Chermack, 2004).

According to Halman, the risks of projects can roughly be categorized into three areas: technological, organizational, and market & business risks. Organizational risks in particular are often overlooked in decision-making. In this context, Prof. Halman introduced “the Challenger case”, an important lesson in terms of multi-stakeholder decision-making and groupthink. Groupthink describes the desire for harmony in a group, which can lead to irrational decision-making (Janis, 1991). Callaway & Esser (1984) discussed the eight symptoms of groupthink; when several of them are present, bad team decisions can result. The eight symptoms are the following:

  1. Illusion of invulnerability: the feeling that nothing can happen to the group.
  2. Belief in the inherent morality of the group: ignoring ethical consequences.
  3. Collective rationalization: discrediting and explaining away warnings.
  4. Out-group stereotypes: a social group with which the individual does not identify, e.g. managers vs. engineers.
  5. Self-censorship: controlling what one says in order to avoid criticism.
  6. Illusion of unanimity: thinking everyone agrees with the solution, interpreting silence as agreement.
  7. Direct pressure on dissenters: pressure from outside and from the group to conform (time and cost vs. risks).
  8. Self-appointed mind guards: members who shield the group from contrary viewpoints.

Groupthink often occurs in groups that are rather homogeneous, isolated from opposing opinions, or organized in a strong hierarchy. On top of that, if there is no formal decision-making process, the risk of groupthink increases (Callaway & Esser, 1984). Many of the symptoms of groupthink were present in “the Challenger case”, which resulted in a decision that led to a catastrophic disaster (Janis, 1991). Even though the symptoms of groupthink were discovered several decades ago, they can still be found in today’s group decisions and should be kept in mind when designing a decision support tool for multi-organizational decision-making.
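Because the warning signal is several symptoms occurring together, a decision support tool could include a simple screening step over this symptom list. The following is a minimal sketch in Python; the symptoms follow Callaway & Esser (1984) as listed above, while the warning threshold of three is my own illustrative assumption, not an empirical cut-off.

```python
# Minimal sketch of a groupthink screening step for a decision support tool.
# The eight symptoms follow Callaway & Esser (1984); the warning threshold
# of three is an illustrative assumption, not an empirical cut-off.

GROUPTHINK_SYMPTOMS = {
    "illusion of invulnerability",
    "belief in inherent morality",
    "collective rationalization",
    "out-group stereotypes",
    "self-censorship",
    "illusion of unanimity",
    "direct pressure on dissenters",
    "self-appointed mind guards",
}

def screen_for_groupthink(observed: set[str], threshold: int = 3) -> bool:
    """Return True if enough symptoms are observed to warrant a warning."""
    unknown = observed - GROUPTHINK_SYMPTOMS
    if unknown:
        raise ValueError(f"Unknown symptoms: {unknown}")
    return len(observed) >= threshold

# Example: a review of meeting minutes flags three of the eight symptoms.
flags = {"self-censorship", "illusion of unanimity", "direct pressure on dissenters"}
if screen_for_groupthink(flags):
    print("Warning: several groupthink symptoms present; revisit the decision process.")
```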

In his lecture Halman argued that valuing the minority viewpoint in particular is essential for mitigating the risk of groupthink. It would be interesting to investigate how the different interests of stakeholders can influence this approach, as reaching stakeholder consensus in decision-making is an important aspect of more recent literature. For example, Kim, MacDonald, & Andersen (2013) suggest that establishing a collective mental model for team decisions with the help of simulation-based dynamic hypothesis testing improves decision-making in complex environments.
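Kim et al. (2013) work in the system dynamics tradition, where a dynamic hypothesis is a causal story about feedback loops that the team then tests against simulation runs. As a minimal sketch of the idea, consider a team whose hypothesis is that more schedule pressure finishes a project sooner, tested against a simple rework loop; the model structure and all parameters are invented here for illustration and are not taken from Kim et al.:

```python
# Minimal sketch of simulation-based dynamic hypothesis testing in the spirit
# of Kim et al. (2013). Model structure and parameters are invented for
# illustration: a rework loop in which schedule pressure inflates the error
# rate, so "apply more pressure" does not simply finish the project sooner.

def weeks_to_finish(pressure: float, horizon: int = 200) -> float:
    work_left = 100.0                    # tasks remaining in the backlog
    base_rate = 2.0                      # tasks completed per week at zero pressure
    error_frac = 0.1 + 0.4 * pressure    # assumed: pressure inflates the error rate
    for week in range(horizon):
        completed = min(work_left, base_rate * (1 + pressure))
        rework = completed * error_frac  # flawed tasks flow back into the backlog
        work_left += rework - completed
        if work_left <= 0.01:
            return week + 1              # project finished in this week
    return float("inf")                  # not finished within the horizon

# Test the team's dynamic hypothesis "more pressure finishes the project sooner":
for p in (0.0, 0.5, 1.0):
    print(f"pressure={p:.1f} -> finished in {weeks_to_finish(p)} weeks")
```

In this toy model, moderate pressure helps but high pressure erodes the gain because the rework loop dominates; simulating the hypothesis thus refines the team’s shared mental model rather than merely confirming or rejecting it.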

The conclusions that can be drawn mainly relate to the psychological aspect of human decisions. This aspect is sometimes underestimated because the focus is on mitigating technological risks; a good balance between organizational and technological risks should be aimed for in decision-making. Several important lessons can be learnt, some of which are suggested by Halman as well (Keizer, Halman, & Song, 2002):

  1. Use a systematic approach to identify as many alternatives as possible (e.g. different risk categories, project management tools and project charters, checklists, etc.); a minimal risk-register sketch follows this list.
  2. Incorporate expert opinions in a multidisciplinary team decision (train the team in multidisciplinary thinking).
  3. Use past performance data, lessons learnt, and iterative feedback loops to make the best use of the knowledge available.
  4. Value the minority viewpoint by maximizing individual and group contribution.
  5. Expose groups to external and conflicting viewpoints.
  6. As a team leader, avoid being directive; rather, be encouraging.
  7. Use positive statements to facilitate critical thinking.
  8. Use scenarios and simulation models to facilitate thinking in terms of consequences (Chermack, 2004) and to evaluate dynamic hypotheses (Kim et al., 2013).
  9. Let the team bear the costs and benefits of a decision directly; it is then more inclined to make a well-thought-through decision (Hall, 2010).
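Lesson 1, combined with Halman’s three risk areas, suggests a concrete building block for a decision support tool: a categorized risk register in which every identified risk is filed and scored, so that organizational risks surface next to technological ones. A minimal sketch follows; the likelihood-times-impact score is an illustrative placeholder of mine, not the scoring scheme of the risk diagnosing methodology in Keizer, Halman, & Song (2002).

```python
# Minimal sketch of a categorized risk register using the three risk areas
# from the lecture. The likelihood-times-impact score is an illustrative
# placeholder, not the scoring of Keizer, Halman, & Song (2002).

from dataclasses import dataclass
from enum import Enum

class RiskArea(Enum):
    TECHNOLOGICAL = "technological"
    ORGANIZATIONAL = "organizational"
    MARKET_AND_BUSINESS = "market & business"

@dataclass
class Risk:
    description: str
    area: RiskArea
    likelihood: float  # 0..1, subjective estimate from the team
    impact: float      # 0..1, relative severity if the risk materializes

    @property
    def score(self) -> float:
        return self.likelihood * self.impact

register = [
    Risk("Seal fails at low temperature", RiskArea.TECHNOLOGICAL, 0.3, 1.0),
    Risk("Schedule pressure silences dissenting engineers", RiskArea.ORGANIZATIONAL, 0.5, 0.8),
    Risk("Competitor launches a similar product first", RiskArea.MARKET_AND_BUSINESS, 0.4, 0.5),
]

# Rank across ALL areas so organizational risks surface next to technological ones.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:.2f}  [{risk.area.value}]  {risk.description}")
```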

To what extent these lessons can, and should, be applied in a decision support tool and to what extent more recent literature agrees with the findings must still be investigated.

References:

Brehmer, B. (1992). Dynamic decision making: Human control of complex systems. Acta Psychologica, 81(3), 211–241. https://doi.org/10.1016/0001-6918(92)90019-A

Callaway, M. R., & Esser, J. K. (1984). Groupthink: Effects of cohesiveness and problem-solving procedures on group decision-making. Social Behavior and Personality: An International Journal, 12(2), 157–164. https://doi.org/10.2224/sbp.1984.12.2.157

Chermack, T. J. (2004). Improving decision-making with scenario planning. Futures, 36(3), 295–309.

Hall, W. L. (2010). Barriers to achieving sustainability-based decision making. In Sustainable Land Development and Restoration (1st ed.). Elsevier. https://doi.org/10.1016/B978-1-85617-797-9.00004-6

Janis, I. L. (1991). Groupthink. In A First Look at Communication Theory (pp. 235–246). New York: McGraw-Hill.

Johnson-Laird, P. N. (2006). Mental Models, Sentential Reasoning, and Illusory Inferences. In C. Held, M. Knauff, & G. Vosgerau (Eds.), Mental Models and the Mind: Current Developments in Cognitive Psychology, Neuroscience, and Philosophy of Mind (Vol. 138, pp. 27–51).

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Keizer, J. A., Halman, J. I. M., & Song, M. (2002). From experience: applying the risk diagnosing methodology. The Journal of Product Innovation Management, 19, 213–232.

Kim, H., MacDonald, R. H., & Andersen, D. F. (2013). Simulation and Managerial Decision Making: A Double-Loop Learning Framework. Public Administration Review, 73(2), 291–300. https://doi.org/10.1111/j.1540-6210.2012.02656.x