The logic of inconsistencies: a formal model for the analysis of human error

Keywords: Diagnosis, Cognitive biases, Belief revision, Logic

Speaker: Valentin Fouillard

=========================================================

Jury composition

  • Andreas Herzig (Research Director, IRIT-CNRS, Université Paul Sabatier) – Reviewer
  • Michel Occello (Professor, LCIS, Université Grenoble Alpes) – Reviewer
  • Carole Adam (Associate Professor, LIG-CNRS, Université Grenoble Alpes) – Examiner
  • Emmanuelle Grislin-Le Strugeon (Associate Professor, HDR, LAMIH-CNRS, INSA Hauts-de-France) – Examiner
  • Christine Paulin-Mohring (Professor, LMF-CNRS, Université Paris-Saclay) – Examiner
  • Nicolas Sabouret (Professor, LISN-CNRS, Université Paris-Saclay) – Thesis supervisor
  • Safouan Taha (Associate Professor, LMF-CNRS, Université Paris-Saclay) – Thesis co-supervisor
  • Frédéric Boulanger (Professor, LMF-CNRS, Université Paris-Saclay) – Thesis co-supervisor

Abstract

In this thesis, we are interested in the use of formal methods to guide the diagnosis of human error in accident situations. Applying formal methods in such a context raises several difficulties. The first is to explain, using mathematical logic, situations that are incoherent and therefore in contradiction with that very logic. The second is to be able to compare the different diagnoses. Indeed, an incorrect decision is never the result of chance: it is rooted in the beliefs, desires and intentions of the operator. Thus, not all errors are equal, and it is necessary to formalize and define what makes a good diagnosis.

The first part of the thesis presents a state of the art of work in the humanities and social sciences (HSS) on human error. We show that two aspects must be distinguished: determining the causes of an erroneous decision, and understanding those causes through the identification of cognitive biases. We then present the main computational models for modeling reasoning and studying human error, and we show that consistency-based diagnosis combined with AGM belief revision is a good way to explain human errors.
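
To make the idea concrete, here is a minimal sketch of consistency-based diagnosis over a toy propositional belief base: a candidate diagnosis is a smallest set of beliefs whose retraction restores consistency with the observed facts. The literal encoding, the is_consistent check and the example beliefs are illustrative assumptions, not the formalism used in the thesis.

    # Sketch of consistency-based diagnosis (toy propositional encoding).
    # Beliefs and observations are sets of literals ("p" or "~p"); a set is
    # consistent if it contains no literal together with its negation.
    from itertools import combinations

    def is_consistent(literals):
        """True if no literal occurs together with its negation."""
        return not any(
            (lit[1:] if lit.startswith("~") else "~" + lit) in literals
            for lit in literals
        )

    def diagnoses(beliefs, observations):
        """Smallest (cardinality-minimal) subsets of the belief base whose
        retraction makes it consistent with the observations."""
        beliefs = list(beliefs)
        for k in range(len(beliefs) + 1):
            found = [set(removed)
                     for removed in combinations(beliefs, k)
                     if is_consistent((set(beliefs) - set(removed)) | set(observations))]
            if found:
                return found
        return []

    # Example: the operator believed the runway was clear and that the landing
    # would proceed, but the recorded data show that it did not.
    print(diagnoses({"clear", "land"}, {"~land"}))   # -> [{'land'}]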

The second part of the thesis deals with the modeling of an accident situation and the diagnosis of human errors in that situation. We base our work on a belief logic inspired by BDI logic to model accident situations. We have developed an iterative diagnosis algorithm built on a minimal belief revision operator that satisfies the AGM postulates. This iterative algorithm has the advantage of making it easier to distinguish errors of different natures; moreover, it is correct and complete with respect to a minimal diagnosis algorithm.
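
The iterative scheme can be pictured with the same toy encoding as in the previous sketch: replay the observations of the accident in order, incorporate each one by a minimal revision, and record the beliefs retracted at each step as the diagnosis for that step. The revise and iterative_diagnosis functions below are hypothetical names for this sketch and reuse the diagnoses function defined above; they are not the algorithm proved correct in the thesis.

    def revise(beliefs, fact):
        """Minimal-change revision: retract one smallest conflicting subset,
        then add the new fact (in the spirit of the AGM success postulate)."""
        retracted = min(diagnoses(beliefs, {fact}), key=len)
        return (set(beliefs) - retracted) | {fact}, retracted

    def iterative_diagnosis(initial_beliefs, timeline):
        """Replay the observed facts in order and collect, step by step, the
        beliefs that had to be abandoned for the facts to remain consistent."""
        beliefs, report = set(initial_beliefs), []
        for step, fact in enumerate(timeline):
            beliefs, retracted = revise(beliefs, fact)
            if retracted:
                report.append((step, retracted))
        return report

    # The operator starts out believing the approach is stable and the gear is
    # down; the flight data then contradict the second belief.
    print(iterative_diagnosis({"stable", "gear_down"}, ["~gear_down", "go_around"]))
    # -> [(0, {'gear_down'})]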

The third contribution of the thesis is a formal definition of the plausibility of a diagnosis. We base this work on the human-sciences literature, and more precisely on cognitive biases. To this end, we have developed a first formal taxonomy of biases, which lets us identify logical characteristics shared by several biases. From this taxonomy, we were able to formally define eight cognitive biases corresponding to biases described in the literature. We then consider that the more a diagnosis can be explained by these biases, the more plausible it is.
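
One possible reading of this plausibility criterion, again as a sketch rather than the definition used in the thesis: score each candidate diagnosis by the number of (belief, bias) pairs in which a bias pattern accounts for a retracted belief, and rank candidates by that score. The two bias predicates below are simplified stand-ins for the eight biases formalized in the thesis, and the context structure is an assumption for illustration.

    def confirmation_bias(belief, context):
        # the retracted belief matched what the operator expected beforehand
        return belief in context.get("expected", set())

    def anchoring_bias(belief, context):
        # the retracted belief was formed early on and never re-evaluated
        return belief in context.get("initial", set())

    BIASES = [confirmation_bias, anchoring_bias]

    def plausibility(diagnosis, context):
        """Count the (belief, bias) pairs accounting for the retracted beliefs:
        the more the biases explain, the more plausible the diagnosis."""
        return sum(bias(b, context) for b in diagnosis for bias in BIASES)

    def rank(candidates, context):
        """Order candidate diagnoses from most to least plausible."""
        return sorted(candidates, key=lambda d: plausibility(d, context), reverse=True)

    context = {"expected": {"gear_down"}, "initial": {"stable"}}
    print(rank([{"clear"}, {"gear_down"}], context))   # -> [{'gear_down'}, {'clear'}]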

We then studied the validity of this computational model on two civil aviation accidents. We show that it recovers the explanations proposed by the Bureau d’Enquêtes et d’Analyses, as well as explanations not considered by the investigators.

Finally, we propose several perspectives to improve our approach. In particular, we intend to take emotions and social interactions into account when modeling the accident situation, in order to increase the variety of possible diagnoses. We also wish to extend the evaluation of diagnoses with a meta-evaluation of the cognitive biases and by taking the intention of action into account.