Human cognitive reliability correlation
Human Cognitive Reliability Correlation (HCR) is a technique used in the field of Human Reliability Assessment (HRA) to evaluate the probability of a human error occurring throughout the completion of a specific task. From such analyses, measures can then be taken to reduce the likelihood of errors occurring within a system and therefore improve the overall level of safety. There are three primary reasons for conducting an HRA: error identification, error quantification and error reduction. The various techniques used for such purposes can be split into two classifications: first-generation techniques and second-generation techniques. First-generation techniques work on the basis of a simple dichotomy of ‘fits/doesn’t fit’ in matching the error situation in context with related error identification and quantification, while second-generation techniques are more theory-based in their assessment and quantification of errors. HRA techniques have been used in a range of industries, including healthcare, engineering, nuclear, transportation and business; each technique has varying uses within different disciplines.
HCR is based on the premise that an operator’s likelihood of success or failure in a time-critical task depends on the cognitive process used to make the critical decisions that determine the outcome. Three Performance Shaping Factors (PSFs), namely Operator Experience, Stress Level and Quality of Operator/Plant Interface, also influence the median time taken to perform the task. Combining these factors enables “response-time” curves to be calibrated and compared to the time available to perform the task. Using these curves, the analyst can then estimate the likelihood that an operator will take the correct action, as required by a given stimulus (e.g. a pressure warning signal), within the available time window. The relationship between these normalised times and Human Error Probabilities (HEPs) is based on simulator experimental data.
HCR is a psychology/cognitive modelling approach to HRA developed by Hannaman et al. in 1984 [Hannaman, G.W., Spurgin, A.J. & Lukic, Y.D. (1984). Human cognitive reliability model for PRA analysis. Draft Report NUS-4531, EPRI Project RP2170-3. Electric Power Research Institute: Palo Alto, CA.]. The method uses Rasmussen’s classification of rule-based, skill-based and knowledge-based decision making to determine the likelihood of failing a given task [Rasmussen, J. (1983). Skills, rules, knowledge; signals, signs and symbols and other distinctions in human performance models. IEEE Transactions on Systems, Man and Cybernetics, SMC-13(3).], as well as considering the PSFs of operator experience, stress and interface quality. The database underpinning this methodology was originally developed through nuclear power-plant simulations, driven by a requirement for a method by which nuclear operating reliability could be quantified.
The HCR methodology is broken down into a sequence of steps as given below:
# The first step is for the analyst to determine the situation in need of a human reliability assessment. It is then determined whether the behaviour governing this situation is rule-based, skill-based or knowledge-based.
# From the relevant literature, the appropriate HCR mathematical model or graphical curve is then selected.
# The median response time to perform the task in question is thereafter determined. This is commonly done by expert judgement, operator interview or simulator experiment. In much of the literature, this time is referred to as T1/2 nominal.
# The median response time (T1/2) must then be adjusted to make it specific to the situational context. This is done by means of the PSF coefficients K1 (Operator Experience), K2 (Stress Level) and K3 (Quality of Operator/Plant Interface) given in the literature, using the following formula:
T1/2 = T1/2 nominal × (1 + K1)(1 + K2)(1 + K3)
Performance-improving PSFs (e.g. worker experience, low stress) take negative values, shortening the adjusted median time, whilst performance-inhibiting PSFs (e.g. a poor interface) take positive values, lengthening it.
# For the action being assessed, the time window (T) should then be calculated; this is the time within which the operator must act to correctly resolve the situation.
# To obtain the non-response probability, the time window (T) is divided by the adjusted median time T1/2, giving the normalised time value. The probability of non-response can then be found by referring to the HCR curve selected earlier. This non-response probability may then be integrated into a fuller HRA; a complete HEP can only be reached in conjunction with other methods, as non-response is not the sole source of human error.
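The procedure above can be sketched as a short calculation, here in Python. The function names are illustrative, not part of any published tool, and the default curve coefficients are the skill-based values quoted in the worked example later in this article (0.7, 0.407 and 1.2); other behaviour types take different coefficients from the HCR literature.

```python
import math

def adjusted_median_time(t_nominal, k1, k2, k3):
    """Step 4: adjust the nominal median response time for the three PSFs
    (operator experience, stress level, interface quality)."""
    return t_nominal * (1 + k1) * (1 + k2) * (1 + k3)

def non_response_probability(t_window, t_median,
                             c_gamma=0.7, c_eta=0.407, c_beta=1.2):
    """Steps 5-6: the HCR correlation for the probability that the operator
    fails to respond within the time window T. Default coefficients are the
    skill-based values from the worked example below."""
    normalised_time = t_window / t_median
    return math.exp(-(((normalised_time - c_gamma) / c_eta) ** c_beta))

# Example: well-trained crew (K1 = 0), potential emergency (K2 = 0.28),
# good interface (K3 = 0), 25 s nominal median time, 79 s time window.
t_half = adjusted_median_time(25, 0.0, 0.28, 0.0)     # 32.0 s
print(f"{non_response_probability(79, t_half):.1e}")  # 2.9e-03
```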
The following example is taken from the Human Factors in Reliability Group guide [Humphreys, P. (1995). Human Reliability Assessor’s Guide. Human Factors in Reliability Group.], in which Hannaman describes analysis of a failure to manually SCRAM in a Westinghouse PWR.
The example concerns a model of failure to manually SCRAM a Westinghouse PWR. The primary task to be carried out involves inserting control rods into the core. This can be broken down into two sub-tasks, detection and action, which are in turn based upon recognising and identifying an automatic trip failure.
Given the assumption that there is only one option in the procedures and that optional actions are disregarded in training, the likelihood that a reactor trip failure will be incorrectly diagnosed is minimal.
It is also assumed that the behaviour of the operating crew under consideration is skill-based; the reactor trip event is not part of a routine, but the behaviour adopted by the crew while the event is taking place is nevertheless well practised. Moreover, there are well-established procedures that determine how the event should be handled, and these are understood and practised to the required standard in training sessions.
The average time taken by the crew to complete the task is 25 seconds; there is no documentation as to why this is the case. The average completion times for the respective subtasks are therefore set as 10 seconds for detection of the failure and 15 seconds for taking subsequent action to remedy the situation.
The PSFs (K factors) judged to influence the situation are assessed to be in the following categories:
* operator experience: “well trained”
* stress level: “potential emergency”
* quality of interface: “good”
The K factors are accordingly assigned the following values:
* K1 = 0.0
* K2 = 0.28
* K3 = 0.0
Referring to the formula in Step 4 above, the product (1 + K1)(1 + K2)(1 + K3) is therefore equal to 1.28. The average sub-task times are accordingly adjusted from 10 and 15 seconds to 12.8 and 19.2 seconds respectively. Given that the PSFs are identical for both sub-tasks, the adjusted median response times can be summed, giving a stress-adjusted total of 32 seconds, compared with the previous total of 25 seconds.
The time window (T) to perform the task as part of the overall system is given as 79 seconds. This time is derived from a study conducted by Westinghouse, which found that the crew had approximately 79 seconds to complete the task of inserting the control rods into the reactor and shutting the reactor down in order to prevent over-pressurisation within the main operating system.
Consulting the graphical curve central to the technique, the normalised time for the task can then be established. It is determined by dividing 79 seconds by 32 seconds, giving a dimensionless result of 2.47. Identifying this point on the abscissa of the HCR curve model provides a non-response probability of 2.9 × 10⁻³; this can also be checked for validation using the formula:
PRT(79) = exp{−[((79/32) − 0.7) / 0.407]^1.2} = 2.9 × 10⁻³ per demand
where PRT(T) is the probability of non-success within the system time window T.
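Using the figures above, the arithmetic of the example can be verified directly (a Python sketch; all numbers are those given in the text):

```python
import math

k_product = (1 + 0.0) * (1 + 0.28) * (1 + 0.0)  # PSF product = 1.28
t_detect = 10 * k_product        # adjusted detection time, 12.8 s
t_action = 15 * k_product        # adjusted action time, 19.2 s
t_half = t_detect + t_action     # total adjusted median time, 32 s

t_window = 79                        # available time window, s
normalised_time = t_window / t_half  # dimensionless, ~2.47

# Skill-based HCR correlation with the coefficients quoted in the text
p_non_response = math.exp(-(((normalised_time - 0.7) / 0.407) ** 1.2))
print(round(normalised_time, 2), f"{p_non_response:.1e}")  # 2.47 2.9e-03
```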
[Figure: graphical solution for the assessment using the HCR technique]
Advantages of HCR
* The approach explicitly models the time-dependent nature of HRA
* It is a fairly quick technique to carry out and has a relative ease of use
* The three modes of decision-making (knowledge-based, skill-based and rule-based) are all modelled
Disadvantages of HCR
* The HEP produced by HCR is not complete; it calculates the probability that a system operator will fail to diagnose and process information, make a decision and act within the time available. It does not give any regard to misdiagnoses or rule violations.
* The same probability curves are used to model non-detection and slow-response failures. These are very different processes, and it is unlikely that identical curves could model their behaviour. Furthermore, it is uncertain whether such curves could be applied to situations in which detection failures or processing difficulties are the dominating factors of influence.
* The rules for judging Knowledge-based, Skill-based and Rule-based behaviour are not exhaustive. Assigning the wrong behaviour to a task can mean differences of up to two orders of magnitude in the HEP.
* The method is very sensitive to changes in the estimate of the median time. This estimate therefore needs to be very accurate, otherwise the HEP estimate will suffer as a consequence.
* Collecting all the data required by the HCR methodology is highly resource-intensive, particularly because every new situation requiring an assessment must be evaluated afresh.
* The model’s output gives no indication of how human reliability could be adjusted to allow improvement or optimisation towards required performance goals.
* Only three PSFs are included in the methodology; several other PSFs that could affect performance are left unaccounted for.
* The model is relatively insensitive to PSF changes as opposed to, for example, time parameter changes.
* As the HCR correlation was originally developed for use within the nuclear industry, it is not possible to use the methodology for applications outside this domain.
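The sensitivity to the median-time estimate noted above is easy to illustrate: with the 79-second window and the skill-based coefficients from the worked example, increasing the adjusted median time from 32 s to 40 s (a 25% change) raises the predicted non-response probability by roughly a factor of seven. This is an illustrative sketch, not part of the original analysis:

```python
import math

def hcr_non_response(t_window, t_median, c_gamma=0.7, c_eta=0.407, c_beta=1.2):
    """Skill-based HCR correlation, coefficients as in the worked example."""
    return math.exp(-(((t_window / t_median - c_gamma) / c_eta) ** c_beta))

p_32 = hcr_non_response(79, 32)  # ~2.9e-3, the value from the example
p_40 = hcr_non_response(79, 40)  # ~2.0e-2, with a 25% larger median time
print(f"{p_40 / p_32:.1f}x increase in HEP")  # 6.6x increase in HEP
```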