Le blog d'Archiloque

Reading notes: “Being bumpable: Consequences of resource saturation and near-saturation for cognitive demands on ICU practitioners”

The text in question is a study of the work of anesthesiologists.

It is complex work in a complex, changing environment, one that resembles software development a little and the investigation of production incidents a lot (the author has also written on that subject).

Three things set their work apart from other complex domains I have come across several times, such as aviation safety or firefighting:

  • Practices can vary from one institution, or even one person, to another,

  • Practices evolve regularly as new studies and new products become available,

  • Outside standard situations, there is not necessarily a single way to approach a procedure but several, depending for example on how a given risk is assessed.

Discussions to improve their practices

To improve, anesthesiologists hold mandatory weekly meetings to discuss together the procedures that were difficult and/or went badly (fortunately these are rarely post-mortems in the literal sense).

These meetings have a twofold objective:

  • improve everyone's practices,

  • more specifically, help the least experienced people progress.

During these meetings, much of the discussion of complicated cases consists of weighing the pros and cons to see whether lessons can be drawn from them. Reaching a consensus, however, is not an end in itself.

While in theory the discussions are supposed to remain professional even when going over a mistake, the author notes that in practice mockery and other personal attacks are not absent from them.

The wrong mental model

Human biology is complex: people are in different situations, do not react to treatments in the same way, and anesthesia needs vary from one operation to another.

During an operation, decisions must be made reasonably fast, and sometimes very fast. To do this, anesthesiologists keep in mind a simplified version of their patient, called a mental model.

Studying the medical record and asking questions before the operation make it possible to choose which mental model to use, but unfortunately the chosen model is not necessarily the right one.

The text examines in detail the fact of staying stuck on a given mental model during an operation, even though available evidence should make it possible to determine that it is no longer appropriate, which can lead to bad decisions. Indeed, someone intervening in an emergency must both choose among the possible options of the mental model they are currently using and keep enough perspective to change it if necessary.

Unfortunately, the text offers no obvious solution to the problem.

There is a strong parallel with the handling of abstractions in computing, with the difference that over time our systems tend to become easier to investigate (even if, conversely, they also tend to become more complex).

Partial information

The information available to the anesthesiologist is partial: some pathologies are not visible before the operation begins, and during the operation only some data is available regularly and reliably.

Obtaining additional information can mean more examinations before the operation and/or extra procedures that can take time, interfere with the operation, or even be dangerous for the patient (for example placing a probe). Knowing which data they need is therefore already a difficult exercise.

During operations, the anesthesiologist must monitor what is happening in order to detect problems. While some signals are obvious and can trigger automatic alarms, others are harder to identify and depend on the context: the person being operated on, the operation, and the point reached in the operation.

This means they must keep a constant eye on the various indicators, even while performing a medical procedure, and must be able to determine that something unexpected is happening.

This is all the more important because some problems appear gradually, initially as weak signals, and the earlier they are detected, the easier they are to deal with.
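To make the parallel with production monitoring concrete, here is a minimal sketch in Python (purely illustrative, not taken from the paper; all names and thresholds are invented) of a context-dependent check: the same metric value can be a weak signal or nothing at all depending on the context, which is why fixed automatic alarms only catch the obvious cases.

```python
# Purely illustrative sketch, not from the paper: the same raw value can be
# harmless or a weak signal depending on the context, so a fixed alarm only
# catches the obvious cases. All names and thresholds are invented.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Context:
    deploy_in_progress: bool  # a deploy legitimately raises latency for a while
    baseline_p99_ms: float    # this service's usual p99 latency


def latency_signal(p99_ms: float, ctx: Context) -> Optional[str]:
    """Return a message when the latency is unusual *for this context*."""
    # Obvious case: a fixed alarm would catch this regardless of context.
    if p99_ms > 2_000:
        return f"strong signal: p99 at {p99_ms} ms"
    # Weak signal: well below any fixed threshold, but twice this service's
    # baseline outside of a deploy, which is worth a look.
    if not ctx.deploy_in_progress and p99_ms > 2 * ctx.baseline_p99_ms:
        return f"weak signal: p99 at {p99_ms} ms vs baseline {ctx.baseline_p99_ms} ms"
    return None  # nothing notable in this context


print(latency_signal(300.0, Context(deploy_in_progress=False, baseline_p99_ms=120.0)))
# -> weak signal: p99 at 300.0 ms vs baseline 120.0 ms
```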

This overlaps with the mental model issue: realizing that one's mental model is wrong is harder when the signal is weaker.

The proposed solution is to have habits, in the sense of trying to work the same way in the same situations, for example always placing the monitors in the same spot, because this makes it possible to spot unusual things, such as a wrong value on a monitor.

The danger of hindsight

The author puts a lot of emphasis on the risk that the source of a problem seems obvious in hindsight, once the final outcome of the operation is known, when it was not obvious at the time.

During the discussions, people therefore take care to follow the chronological progression of the case: at each moment, what happened and what was it possible to know?

Because for a problem not to happen again, the anesthesiologist must be able to determine that they are in an at-risk situation at the moment the information is still actionable. Yet it is sometimes impossible to identify these situations, and the outcome of the operation, sometimes dramatic, could not reasonably have been avoided.

A few quotes

In anesthesiology, as in other complex dynamic worlds, single factors rarely cause accidents. Rather accidents develop or evolve through a conjunction of several small factors, each by itself insufficient to create an incident. One important role of the anesthesiologist is to prevent these single factors from developing and combining to produce an accident. The progression of an incident towards or away from a bad outcome rests, often literally, in the hands of the practitioner. Skilled human actions can steer the evolving incident towards satisfactory outcomes. Much of anesthesia training is directed towards developing precisely this kind of skill. When disasters do occur, they often involve the conjunction of multiple factors which overwhelm, sometimes overtly and sometimes subtly, the ability of the physician to cope with the disturbance. Because the anesthesiologist is the final common pathway for incidents, he or she is often regarded as the source of error when anesthesia accidents come to attention. As Rasmussen points out,

if a system performs less satisfactorily than it normally does the cause will very likely be identified as human error… Because of human complexity, it is very difficult to “pass through” a person in causal explanations. Ultimately, a thorough analysis will always end up with human error (Rasmussen, 1986).

Note that this is ultimately a process of directing attention to different parts of the stimulus world based on the pattern of cues and on previously activated knowledge. Focusing attention is critical to practitioners' ability to function in multi-signal, interleaved task domains. There are so many possible stimuli that paying attention to all of them is impossible as well as unproductive. Note, however, that it is not simply the loudness or brightness of the stimulus that draws attention but rather these features coupled with the experience of the practitioner at gleaning from many stimuli those which have significance in the current context. As experience in these complex domains grows, so does the ability to discriminate between the meaningful and meaningless stimuli. Thus the novice focuses seemingly at random on each source of data, each bright thing, while experts' attention shifts only to significant findings.

For example, one senior anesthesiologist asked about the policy of an institution regarding the care for emergent C-sections replied: “Our policy is to do the right thing”.

Many phenomena are similarly infrequent. For example, the incidence of the unintubatable, unmaskable patient is quite low, and the condition may be difficult to predict. Yet, expertise in anesthesia, as in similar high consequence domains, consists in large part of being able to avoid these situations and deal with them when they arise. That is to say, expertise is largely concerned with infrequent or unusual situations. It is not acceptable for the anesthesiologist to say, well, this is really infrequent and so I couldn’t handle the situation; the function of training and study is to prepare for these rare events. Note, however, that the nature of experience generally is contrary to training: it reinforces the typical, high frequency situation. One may give two milligrams of midazolam repeatedly without complication and learn (in the sense that the cognitive cycle prompts particular schemata to be activated) that doing so is acceptable.

One consequence of the cognitive cycle is that particular perceptual stimuli arouse specific but varied items of knowledge. Some items require very particular stimuli in order to be activated; it is quite possible for individuals to “know” something in one setting and not in another. Knowledge important to a situation but not active is called “inert knowledge”.

Thus the demonstration that a practitioner has the knowledge in the sense that he or she can answer questions, does not guarantee that the same knowledge will be activated under appropriate circumstances.

Failure to activate relevant knowledge is a frequent occurrence in complex domains and frequently plays a role in the cases comprising the corpus.

Limiting cognitive workload can also be accomplished by reducing the variability of the world in order to reduce the dimensions of the problem space. There are two main methods of reducing the variety that confronts the anesthesiologist. The first is by actually simplifying the world itself. For example, the practitioner may arrange syringes on the backstand in a certain way to reduce the effort necessary to find them. Organization of the workspace in general is a means of reducing the variability in the world. These strategies are particularly useful because the simplification usually requires effort at low workload times (e.g. setting up before the case).

The other method of reducing the variability of the world is by simplifying the cognitive tasks themselves, usually by employing default values (assumptions) or simplified models which are cognitively easier to manipulate. For example, Patel, et al. (1989) have found that many practitioners have an inaccurate model of congestive heart failure but this model may actually be quite useful because it is simpler and more easily manipulated than a more accurate model. The value of a model of the world depends mostly on what results one can derive from it. Successful practitioners must, by definition, have fairly reliable models even if these models can be shown to be incorrect in some theoretical sense. The potential for error lies in the non-standard case, in which the model or assumption is inadequate.

Note that the simplified versions of the cognitive tasks are not likely to be developed unless they (a) reduce cognitive effort and (b) are usually correct. In a case reported to us one year before the corpus data collection began, emergent reintubation for residual paralysis was complicated by a monitor which appeared to be showing a flat end-tidal carbon dioxide waveform. The endotracheal tube was removed and direct laryngoscopy repeated. In fact the monitor was set to display a different waveform in that screen location. There was a faulty indication (flat line trace) that the endotracheal tube was in the esophagus. Interestingly, the same person who had set the monitor up for that other waveform also read the flat line as “no end-tidal carbon dioxide”. To do this it was necessary to have used an assumption about the state of the monitor (i.e. that the flat trace represented carbon dioxide) which was incorrect. The monitor itself contributed to the misperception because its indication of which trace was being displayed was not perceptually salient (it consisted of a small LED indicator well away from the video screen and a screen label in small type in a noisy background). These two forms of simplification most often appear together. In the example just given, the anesthesiologist always set the monitor to display end-tidal carbon dioxide during preoperative equipment checkout. This ordering of the world constitutes a means of reducing the variability in the environment. The assumption, that the screen trace was carbon dioxide, was thus made valid for the period of induction and intubation. In the case, however, the intubation was undertaken at an unusual time. Under the pressure of the emergent reintubation, the demand for cognitive efficiency was coupled with the act of intubation and led to employment of the assumption that the trace was carbon dioxide. Thus the standard procedure of reducing variability in the world supported the assumption of a default value (that the trace configuration showed carbon dioxide) when, because of unusual circumstances, it was actually incorrect. The assumption of default value is normal and useful for practitioners under most circumstances. Indeed, it is actually impossible to avoid deriving assumptions and using simplified models. All complex domains, including anesthesiology, are so semantically complex that it would be impossible for practitioners to constantly check all components of their internal mental model against the actual state of the world.

Naive critiques of domain practice often indict the assumptions of default values and the use of simplified models. In retrospect all assumptions are susceptible to flaws which may contribute to poor performance. What such criticisms fail to do, however, is acknowledge that high quality domain performance is often dependent on these same assumptions. It would be impossible to test every assumption about the state of the system at each instant. Even if it were theoretically possible to do so, requiring such tests would cripple cognitive processing. Any practitioner confronted with such a “rule” would necessarily learn the aspects of it relevant to the actual contexts seen and discard the remainder. Practitioners learn by experience which ones are likely to be true and which are more vulnerable.

In real situations, such as the case above, there may be many influences operating simultaneously in a changing environment. The practitioner is charged with sorting out the influences and effects in real time. He or she must keep track of what is working and what is not, whether the interventions are successful, what is likely to happen next, etc. Anesthesiology, like similar domains, deals with volatile and fast paced situations. It is critically important that practitioners build and maintain a coherent “situation awareness”, which makes sense of the multiple factors at work including faults, operator interventions and automatic system responses (e.g. the functioning of infusion devices). Researchers examining expertise in situ have noted that practitioners themselves coin phrases to describe this ability to maintain a coherent view of the changing situation: in commercial aviation it is sometimes referred to as flying ahead of the plane, in carrier flightdeck operations it is called having the bubble and von Clausewitz called it coup d’œil on the battlefield. Situation assessment is what allows practitioners to determine where and when decisive action can be taken.

Being able to exercise effective control over a situation demands first that the practitioner track the state of the system. This means not only determining the condition of the patient but also the external and internal influences which are acting to produce that state. Practitioners need to keep a running tab of the influences acting on the patient. To make this possible they may adopt control strategies which minimize the overlap of different influences in order to eliminate the need to separate their contributions.

When situation assessment is lost, that is, when the practitioner is no longer tracking the influences and effects with sufficient precision to permit meaningful interventions, the practitioner has “lost the bubble”. Losing the bubble can have grave consequences if the situation is precarious or changing. Most practitioners in these domains can describe personal experiences which fit the definition of losing the bubble and, many times, these are cases which resulted in near disaster. It is difficult to detect loss of situation awareness in the conference cases. A good part of the anesthesiologist’s training is oriented towards avoiding the loss of situation awareness and on reestablishing it when it is absent.

Reviewing and altering plans under pressure is difficult and may even be impossible given the demands for immediate action. But planning can be undertaken to various depths. Planning is not simply the selection of a single approach to a problem but rather the construction of a collection of approaches for a variety of different circumstances. The difference between simplistic planning (“I will do X for this case”) and extensive planning (“I will do X for this case but will be prepared to do Y under certain circumstances and Z under others”) can be crucial in event driven, high uncertainty domains like anesthesiology.

This extensive planning is cognitively effortful and demands integration of large amounts of material. The situations for which the alternative courses of action are required rarely occur (e.g. unable to intubate, unable to mask) and so there is little reinforcement for extensive contingent planning.