(Hu)man in the loop

Published 30.07.2019

As we know, technology is “hard” and human behaviour is “soft”. In the following text, I try to review a few essential points about how the interaction between the soft and hard parties is handled in the nuclear domain – or how it should be.

User interfaces – interfaces to what?

The user interface is needed to enable interaction between humans and machines, e.g. to control a nuclear reactor. The goal of this interaction is effective control of the plant by its operators, while the machine simultaneously feeds back the information that aids the operators’ decision-making. In the case of a nuclear reactor, the role of safety functions against potential reactor accidents is emphasized, obviously for very good reasons.

The terminology of user interfaces has several alternatives, and its evolution interestingly reflects changes in attitudes and thinking. The oldest of the frequently occurring abbreviations for the control room user interface is MMI, man-machine interface, simply expressing that there is some specifically engineered function between “the man” and “the machine” that allows interaction between them. However, since “man” could be understood as referring to a specific gender, and thus not supporting equality, it needed to be replaced with a more neutral expression. Therefore you may nowadays often see the term HMI, human-machine interface, which allows women too to operate “the machine”, although women still rarely work as power plant operators. The most recent term for the same thing is HSI, human-system interface. This modification can be understood from the perspective of systems engineering thinking. The operators do not actually operate single “machines”; in reality, they are part of a more complex system, with many technical systems and layers of components, machines and especially automation algorithms interacting as a complicated network.

Human factors engineering – are all people stupid?

Human Factors Engineering, HFE, is a discipline that aims to minimize the possibilities for human error. This includes many aspects, for example analysing the operators’ tasks with regard to their complexity and the time available for their successful completion. Another essential aspect is ensuring that the MMI/HMI/HSI is easy to use and understand, so that the necessary information can be found and utilized without errors. HFE also tries to ensure that human errors can be avoided in any maintenance work.

If you wonder whether all that is really necessary, and why people couldn’t just pay more attention when doing their work, you could remind yourself of a few well-known major accidents, e.g.:

  1. The Tenerife airport disaster – the deadliest aviation accident in the world so far – caused by a series of misunderstandings between air traffic control and an aircraft crew
  2. The loss of NASA’s Mars Climate Orbiter during orbit insertion at Mars, because different engineering teams used different measurement units (thruster data was delivered in imperial units while the navigation software expected metric)
  3. The Three Mile Island nuclear reactor meltdown, driven by operator errors in reading a poorly organized user interface.

The common feature of these accidents is that a human made just a simple, minor mistake in communication, operation or design, which happened to lead to a truly catastrophic outcome. These are exactly the kind of errors that HFE tries to prevent. Human errors as such may be unavoidable, but clever design can provide effective tools and means to substantially reduce them, as well as to increase systems’ tolerance against them.

Human reliability – are all people unreliable?

Human reliability (also known as human performance) is related to the field of human factors and ergonomics and refers to the scientific study of the reliability of humans. It is applied in various industrial fields and contains several types of methods and analyses that try to quantify the probability of erroneous action by a human operator (a simple illustration of such quantification follows after the list below). It is a known fact that humans tend to overestimate their ability to maintain control when doing their work. Common traps of human nature, especially when working in a complicated environment, include for example:

  • Stress – The problem with stress is that it can accumulate and overpower a person, thus becoming detrimental to performance.
  • Avoidance of Mental Strain – Humans are reluctant to engage in lengthy concentrated thinking, as it requires high levels of attention for extended periods.
  • The mental biases, or shortcuts, often used to reduce mental effort and expedite decision-making include:
    • Assumptions – A condition taken for granted or accepted as true without verification of the facts.
    • Habit – An unconscious pattern of behaviour acquired through frequent repetition.
    • Confirmation bias – The reluctance to abandon a current solution.
    • Similarity bias – The tendency to recall solutions from situations that appear similar.
    • Frequency bias – A gamble that a frequently used solution will work.
    • Availability bias – The tendency to settle on solutions or courses of action that readily come to mind.
  • Limited Working Memory – The mind’s short-term memory is the “workbench” for problem solving and decision-making.
  • Limited Attention Resources – The limited ability to concentrate on two or more activities challenges the ability to process information needed to solve problems.
  • Mind-Set – People tend to focus more on what they want to accomplish (a goal) and less on what needs to be avoided because human beings are primarily goal-oriented by nature. As such, people tend to “see” only what the mind expects, or wants, to see.
  • Difficulty Seeing One’s Own Error – Individuals, especially when working alone, are particularly susceptible to missing errors.
  • Limited Perspective – Humans cannot see all there is to see. The inability of the human mind to perceive all facts pertinent to a decision challenges problem-solving.
  • Susceptibility to Emotional/Social Factors – Anger and embarrassment adversely influence team and individual performance.
  • Fatigue – People get tired. Physical, emotional, and mental fatigue can lead to error and poor judgment.
  • Presenteeism – Some employees will come to work out of a need to belong to the workplace despite a diminished capacity to perform their jobs due to illness or injury.

And many more…
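To give a feel for what the quantification mentioned above can look like, below is a minimal, purely illustrative Python sketch in the spirit of simplified human reliability analysis: a nominal error probability for a task is scaled by multipliers representing performance shaping factors such as stress, complexity and time pressure. The baseline probability and the multiplier values are assumptions invented for this example, not taken from any particular licensed HRA method.

```python
# Illustrative (not method-accurate) human error probability estimate.
# The nominal probability and the performance shaping factor (PSF) multipliers
# below are assumptions made up for this example.

NOMINAL_HEP = 0.001  # assumed baseline probability of error for a routine task

PSF_MULTIPLIERS = {
    "stress":        {"nominal": 1, "high": 2, "extreme": 5},
    "complexity":    {"nominal": 1, "moderate": 2, "high": 5},
    "time_pressure": {"nominal": 1, "barely_adequate": 10},
}

def estimate_hep(stress="nominal", complexity="nominal", time_pressure="nominal"):
    """Return a rough human error probability for the given working conditions."""
    multiplier = (PSF_MULTIPLIERS["stress"][stress]
                  * PSF_MULTIPLIERS["complexity"][complexity]
                  * PSF_MULTIPLIERS["time_pressure"][time_pressure])
    return min(1.0, NOMINAL_HEP * multiplier)  # a probability can never exceed 1

if __name__ == "__main__":
    print(estimate_hep())                                   # routine work
    print(estimate_hep(stress="extreme", complexity="high",
                       time_pressure="barely_adequate"))    # accident-like conditions
```

Even this toy model illustrates the point of the list above: under stressful, complex and time-limited conditions the very same task can become orders of magnitude more error-prone.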

Ergonomics – but aren’t all humans different?

Today, ergonomics should be a well-known practical science that provides all kinds of different people with sufficiently comfortable conditions and a well-organized working environment.

There is a story about managing, with practical sense, a task that usually requires scientific knowledge. During the construction of a Finnish nuclear power plant several decades ago, the control panels were being installed on the walls of the control room. However, it was somewhat unclear what the right height would be so that any operator, now and in the future, could use the panels and reach all the buttons. At that time there was not enough expertise present for anyone to have solved this question beforehand, so the problem needed to be solved right there and then. So the control room designers started thinking about who might be the shortest person on the construction site, and once they had settled on the right candidate, he was immediately fetched to the control room and the appropriate height of the control panel was defined.

Nowadays control room designers know how to use anthropometric data. Anthropometrics refers to the science of human body measurements and is closely related to ergonomics. It provides ready-made data on, for example, how to position control room equipment so that 98% of the population (of a particular country) is able to use it, reach it, or see the necessary objects over any known obstacles.
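As a rough illustration of how such data is used, the sketch below derives placement limits from assumed anthropometric statistics: controls are placed no higher than a low percentile of vertical grip reach, and sightline obstacles are kept below a low percentile of standing eye height, so that practically everyone above those percentiles is accommodated. The dimension names, means and standard deviations are invented for the example; real design work uses measured population data and proper allowances for shoes, posture and clothing.

```python
# Minimal sketch of using anthropometric percentiles for control panel placement.
# The population statistics below are invented for this example.
from statistics import NormalDist

# Assumed population statistics in centimetres (mean, standard deviation).
standing_eye_height = NormalDist(mu=163.0, sigma=7.0)
vertical_grip_reach = NormalDist(mu=195.0, sigma=9.0)

# Design so that even small operators can reach and see:
# use a low percentile of the relevant body dimension as the limiting value.
p_low = 0.02   # accommodate 98% of the population from below

max_control_height = vertical_grip_reach.inv_cdf(p_low)
max_obstacle_height = standing_eye_height.inv_cdf(p_low)

print(f"Highest allowed control position: {max_control_height:.0f} cm")
print(f"Highest allowed obstacle in front of a display: {max_obstacle_height:.0f} cm")
```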

Operating procedures – who needs instructions anyway?

Emergency situations, which are very unusual but also very critical, are always challenging even for the most experienced operators. It is a well-known fact that awareness of the possibly catastrophic results of a mistake in one’s own actions when controlling the process can reduce a person’s cognitive capability. In common terms, negative emotions such as fear or nervousness can make anyone commit “easy mistakes” in cognitive tasks, make poor decisions, or even “freeze”, becoming unable to act at all. This has been seen in all kinds of emergency situations, in different fields of society, in industrial environments, and for example in traffic accidents.

To support power plant operators’ work during unusual events or accidents, there are situation-specific procedures that instruct the operators step by step: first of all to verify what has happened, then how to choose the right strategy from the possible alternatives, and finally how to carry out the right actions to turn things right again.
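In modern computer-based procedure systems, that step-by-step structure can be represented quite literally as data. The sketch below is a simplified, hypothetical illustration of the idea: each step carries a check against live process data and the action to take if the check fails, and the operator (or the procedure system) walks through the steps in order. The parameter names and limit values are invented for this example.

```python
# Hypothetical, simplified representation of a step-by-step emergency procedure.
# Parameter names and limit values are invented for this illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    description: str
    check: Callable[[dict], bool]   # condition evaluated against live process data
    action: str                     # instruction shown to the operator if the check fails

PROCEDURE = [
    Step("Verify reactor trip",
         check=lambda p: p["neutron_flux_pct"] < 5.0,
         action="Manually trip the reactor"),
    Step("Verify emergency feedwater flow",
         check=lambda p: p["efw_flow_kg_s"] > 30.0,
         action="Start emergency feedwater pumps"),
    Step("Verify primary pressure is under control",
         check=lambda p: p["primary_pressure_bar"] < 130.0,
         action="Follow the depressurization strategy"),
]

def run_procedure(process_data: dict):
    for step in PROCEDURE:
        status = "OK" if step.check(process_data) else f"ACTION: {step.action}"
        print(f"{step.description}: {status}")

run_procedure({"neutron_flux_pct": 2.1, "efw_flow_kg_s": 12.0,
               "primary_pressure_bar": 155.0})
```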

Interestingly, there are two completely opposing views on how these procedures should be understood. Both views agree that the procedures are necessary and useful, but how to use them is subject to debate. One view is that the emergency procedures shall be followed literally, no matter what happens. Any personal reasoning should be avoided, since it would not improve the results; instead it could make things worse, because the situation is highly unusual, unique for anyone getting into it, and the likelihood of error is increased by the high stress. The other view sees the procedure more as a proposal or support to the operator, who can and should use his own reasoning when applying the procedure, and deviate from it if he thinks that is the better way. This latter way enables the operator to react to circumstances that may differ from the assumptions made by the safety engineer who originally composed the procedure. But does the operator have the time, knowledge and nerves to consider all the aspects correctly?

Which view would you support?

Digitalization – do you see details or overview?

The emergence of digital user interfaces in control rooms to replace conventional control panels has not been painless. Digital screen-based information presentation offers flexibility and efficiency far beyond the conventional systems, but it has pitfalls too. One well-known risk is the keyhole effect. It means that when the panel walls and desks are replaced by computer screens, there is a real risk of losing the overall picture of the process status. It is also possible that vital information is missed because it is shown only on a display page that is not currently open. In other words, you no longer have all the measurement information visible all the time as before, but only some part of it, depending on which display pages you are watching right now. A common solution to the keyhole effect is to provide specific overview displays which show the critical plant data from all relevant process systems. These overview displays should be continuously visible. The issue of course remains: how to choose just the right information for them.
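One commonly used principle for keeping an overview display manageable is to organize it around critical safety functions rather than individual systems, with only a handful of key parameters per function. The sketch below shows the idea as a simple configuration; the functions and parameter names listed are illustrative examples, not a complete or plant-specific set.

```python
# Illustrative configuration of an always-visible overview display,
# organized by critical safety function. The parameter lists are examples only.
OVERVIEW_DISPLAY = {
    "Reactivity control": ["neutron_flux", "control_rod_position"],
    "Core cooling":       ["coolant_flow", "core_exit_temperature"],
    "Primary inventory":  ["pressurizer_level", "primary_pressure"],
    "Heat sink":          ["steam_generator_level", "feedwater_flow"],
    "Containment":        ["containment_pressure", "containment_radiation"],
}

for function, parameters in OVERVIEW_DISPLAY.items():
    print(f"{function}: {', '.join(parameters)}")
```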

Don’t worry, we have the alarm system!

Basically, alarm systems should be able to show all relevant deviations to the operators, and sometimes they manage to do so. A common issue is the alarm avalanche, which frequently happens in bigger disturbances, e.g. in a reactor trip. Then practically all the process systems send a lot of alarms as the process parameters move away from their normal operating ranges. For the operators this situation is chaotic. The alarm system suddenly shows hundreds or thousands of alarms, and using it to understand causes and consequences becomes impossible, right when it is needed most.

There are several ways to improve this. The most common means are to add priorities to the alarms, so that high-priority alarms can be immediately distinguished from the less important categories, or to build specific logic to “catch” the first causes of chosen critical events. Still, all the unimportant events need to be recorded, since they provide valuable information for event analyses or, in some cases, for equipment diagnostics. So a particular challenge for alarm systems is that they should be equally useful in the quiet times of normal operation, when almost nothing is happening, and in confusing emergency situations where everything is happening at the same time. Building such a system requires know-how, effort and deep knowledge about the plant in question and its processes.
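As a simplified illustration of these two techniques, the sketch below filters an incoming alarm flood by priority and separately captures the “first-out” alarm, i.e. the earliest alarm from a predefined set of critical events, which often points at the initiating cause. The alarm names, priorities and event sequence are invented for this example; a real alarm system also needs suppression logic, shelving and plant-specific engineering.

```python
# Simplified illustration of alarm prioritization and "first-out" capture.
# Alarm names, priorities and the event sequence are invented for this example.
from dataclasses import dataclass

@dataclass
class Alarm:
    time: float          # seconds from the start of the disturbance
    tag: str
    priority: int        # 1 = highest priority

FIRST_OUT_GROUP = {"TURBINE_TRIP", "REACTOR_TRIP", "LOSS_OF_FEEDWATER"}

def process_flood(alarms: list[Alarm]):
    # Show high-priority alarms first so they are not buried in the avalanche.
    high_priority = sorted((a for a in alarms if a.priority == 1), key=lambda a: a.time)
    # The earliest alarm from the first-out group often indicates the initiating event.
    first_out = min((a for a in alarms if a.tag in FIRST_OUT_GROUP),
                    key=lambda a: a.time, default=None)
    return high_priority, first_out

flood = [Alarm(0.2, "LOSS_OF_FEEDWATER", 1), Alarm(0.9, "REACTOR_TRIP", 1),
         Alarm(1.1, "SG_LEVEL_LOW", 2), Alarm(1.3, "COND_PUMP_TRIP", 3)]
important, first = process_flood(flood)
print("First-out:", first.tag if first else "none")
print("High priority:", [a.tag for a in important])
```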

(Hu)man in the loop

Despite the development of computational power, algorithms and even AI, the basic idea of human operators running the nuclear plant is not changing. Human brains are still considered the most reliable control equipment in nuclear reactor operation. Human operators are kept “in the loop” and seen as a guarantee that things stay right and safe.

Human factors engineering as a discipline tries to ensure the compatibility of the technical and human sides. Unfortunately, HFE is still an underestimated and poorly known discipline in the Finnish nuclear sector, with very few real experts out there. One reason for its slow development is that new plants were not built for a long time, so there has been no opportunity for cumulative growth of this knowledge. The guidance provided by international standardization has not evolved as it should have, either. There is obviously room for an up-to-date approach that would support practical work better than the American standards commonly used today.

Author