Wednesday, 30 March 2011 15:35

Accident Modelling


Humans play important roles in most of the processes leading up to accidents and in the majority of measures aimed at accident prevention. Therefore, it is vital that models of the accident process should provide clear guidance about the links between human actions and accidents. Only then will it be possible to carry out systematic accident investigation in order to understand these links and to make predictions about the effect of changes in the design and layout of workplaces, in the training, selection and motivation of workers and managers, and in the organization of work and management safety systems.

Early Modelling

Up until the 1960s, modelling human and organizational factors in accidents had been rather unsophisticated. These models had not differentiated human elements relevant to accidents beyond rough subdivisions such as skills, personality factors, motivational factors and fatigue. Accidents were seen as undifferentiated problems for which undifferentiated solutions were sought (as doctors two centuries ago sought to cure many then undifferentiated diseases by bleeding the patient).

Reviews of accident research literature that were published by Surry (1969) and by Hale and Hale (1972) were among the first attempts to go deeper and offer a basis for classifying accidents into types reflecting differentiated aetiologies, which were themselves linked to failures in different aspects of the man-technology-environment relationships. In both of these reviews, the authors drew upon the accumulating insights of cognitive psychology in order to develop models presenting people as information processors, responding to their environment and its hazards by trying to perceive and control the risks that are present. Accidents were considered in these models as failures of different parts of this process of control that occur when one or more of the control steps does not perform satisfactorily. The emphasis was also shifted in these models away from blaming the individual for failures or errors, and towards focusing on the mismatch between the behavioural demands of the task or system and the possibilities inherent in the way behaviour is generated and organized.

Human Behaviour

Later developments of these models by Hale and Glendon (1987) linked them to the work of Rasmussen and Reason (Reason 1990), which classified human behaviour into three levels of processing:

  • automatic, largely unconscious responses to routine situations (skill-based behaviour)
  • matching learned rules to a correct diagnosis of the prevailing situation (rule-based behaviour)
  • conscious and time-consuming problem solving in novel situations (knowledge-based behaviour).


The typical failures of control differ from one level of behaviour to another, as do the types of accidents and the appropriate safety measures used to control them. The Hale and Glendon model, updated with more recent insights, is depicted in figure 1. It is made up of a number of building blocks which will be explained successively in order to arrive at the full model.

Figure 1. Individual problem solving in the face of danger


Link to deviation models

The starting point of the Hale and Glendon model is the way in which danger evolves in any workplace or system. Danger is considered to be always present, but kept under control by a large number of accident-prevention measures linked to hardware (e.g., the design of equipment and safeguards), people (e.g., skilled operators), procedures (e.g., preventive maintenance) and organization (e.g., allocation of responsibility for critical safety tasks). Provided that all relevant dangers and potential hazards have been foreseen and the preventive measures for them have been properly designed and chosen, no damage will occur. Only if a deviation from this desired, normal state takes place can the accident process start. (These deviation models are dealt with in detail in “Accident deviation models”.)

The task of the people in the system is to assure proper functioning of the accident-prevention measures so as to avert deviations, by using the correct procedures for each eventuality, handling safety equipment with care, and undertaking the necessary checks and adjustments. People also have the task of detecting and correcting many of the deviations which may occur and of adapting the system and its preventive measures to new demands, new dangers and new insights. All these actions are modelled in the Hale and Glendon model as detection and control tasks related to a danger.

Problem solving

The Hale and Glendon model conceptualizes the role of human action in controlling danger as a problem-solving task. The steps in such a task can be described generically as in figure 2.

Figure 2. Problem-solving cycle


This task is a goal-seeking process, driven by the standards set in step one in figure 2. These are the standards of safety which workers set for themselves, or which are set by employers, manufacturers or legislators. The model has the advantage that it can be applied not only to individual workers faced with imminent or future danger, but also to groups of workers, departments or organizations aiming to control both existing danger from a process or industry and future danger from new technology or products at the design stage. Hence safety management systems can be modelled in a consistent way with human behaviour, allowing the designer or evaluator of safety management to take an appropriately focused or a wide view of the interlocking tasks of different levels of an organization (Hale et al. 1994).
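The goal-seeking character of this cycle can be rendered as a small control loop. The sketch below is purely illustrative: the numeric risk levels, the action names and the escalation step are hypothetical simplifications for exposition, not part of the Hale and Glendon model itself.

```python
# Illustrative sketch only: the step names, numeric risk scale and actions
# are hypothetical; they merely render the goal-seeking cycle
# (standard -> detection -> comparison -> action) as a control loop.

def problem_solving_cycle(perceived_risk, acceptable_risk, actions):
    """One pass through a simplified danger-control cycle.

    perceived_risk  -- the risk level the worker detects (detection step)
    acceptable_risk -- the standard set in step one of the cycle
    actions         -- known routines, mapped to the risk reduction each achieves
    """
    # Compare the detected danger against the standard.
    if perceived_risk <= acceptable_risk:
        return "no action needed", perceived_risk

    # Decide whether a known routine can bring risk within the standard.
    for name, reduction in actions.items():
        if perceived_risk - reduction <= acceptable_risk:
            # Carry out the chosen action.
            return name, perceived_risk - reduction

    # No single routine suffices: escalate to conscious problem solving.
    return "escalate", perceived_risk


# A worker whose standard tolerates risk level 2 faces a hazard at level 5.
action, residual = problem_solving_cycle(
    perceived_risk=5,
    acceptable_risk=2,
    actions={"fit guard": 4, "wear goggles": 1},
)
print(action, residual)
```

The point of the sketch is the comparison against a standard: whether the loop fires at all depends on the standard chosen in step one, which is why the model applies equally to an individual worker, a department or a design organization, each with its own standards.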

Applying these steps to individual behaviour in the face of danger we obtain figure 3. Some examples of each step can clarify the task of the individual. Some degree of danger, as stated above, is assumed to be present all the time in all situations. The question is whether an individual worker responds to that danger. This will depend partly on how insistent the danger signals are and partly on the worker’s own consciousness of danger and standards of acceptable level of risk. When a piece of machinery unexpectedly glows red hot, or a fork-lift truck approaches at high speed, or smoke starts seeping from under the door, individual workers skip immediately to considering the need for action, or even to deciding what they or someone else can do.

Figure 3. Behaviour in the face of danger


These situations of imminent danger are rare in most industries, and it is normally desirable to activate workers to control danger when it is much less imminent. For example, workers should recognize slight wear on the machine guard and report it, and realize that a certain noise level will make them deaf if they are continuously exposed to it for some years. Designers should anticipate that a novice worker is liable to use their proposed new product in dangerous ways.

To do this, all persons responsible for safety must first consider the possibility that danger is or will be present. Consideration of danger is partly a matter of personality and partly of experience. It can also be encouraged by training and guaranteed by making it an explicit part of tasks and procedures at the design and execution phases of a process, where it may be confirmed and encouraged by colleagues and superiors. Secondly, workers and supervisors must know how to anticipate and recognize the signs of danger. To ensure the appropriate quality of alertness, they must accustom themselves to recognize potential accident scenarios—that is, indications and sets of indications that could lead to loss of control and so to damage. This is partly a question of understanding webs of cause and effect, such as how a process can get out of control, how noise damages hearing or how and when a trench can collapse.

Just as important is an attitude of creative mistrust. This involves considering that tools, machines and systems can be misused, go wrong, or show properties and interactions outside their designers’ intentions. It applies “Murphy’s Law” (whatever can go wrong will go wrong) creatively, by anticipating possible failures and affording the opportunity of eliminating or controlling them. Such an attitude, together with knowledge and understanding, also helps at the next step—that is, in really believing that some sort of danger is sufficiently likely or serious to warrant action.

Labelling something as dangerous enough to need action is again partly a matter of personality; for instance, it may have to do with how pessimistic a person may be about technology. More importantly, it is very strongly influenced by the kind of experience that will prompt workers to ask themselves such questions as, “Has it gone wrong in the past?” or “Has it worked for years with the same level of risk with no accidents?” The results of research on risk perception and on attempts to influence it by risk communication or feedback on accident and incident experience are given in more detail in other articles.

Even if the need for some action is realized, workers may take no action for many reasons: they do not, for example, think it is their place to interfere with someone else’s work; they do not know what to do; they see the situation as unchangeable (“it is just part of working in this industry”); or they fear reprisal for reporting a potential problem. Beliefs and knowledge about cause and effect and about the attribution of responsibility for accidents and accident prevention are important here. For example, supervisors who consider that accidents are largely caused by careless and accident-prone workers will not see any need for action on their own part, except perhaps to eliminate those workers from their section. Effective communications to mobilize and coordinate the people who can and should take action are also vital at this step.

The remaining steps are concerned with the knowledge of what to do to control the danger, and the skills needed to take appropriate action. This knowledge is acquired by training and experience, but good design can help greatly by making it obvious how to achieve a certain result so as to avert danger or to protect oneself from it—for instance, by means of an emergency stop or shutdown, or an avoiding action. Good information resources such as operations manuals or computer support systems can help supervisors and workers to gain access to knowledge not available to them in the course of day-to-day activity. Finally, skill and practice determine whether the required response action can be carried out accurately enough and with the right timing to make it successful. A difficult paradox arises in this connection: the more alert and prepared people are, and the more reliable the hardware is, the less frequently the emergency procedures will be needed and the harder it will be to sustain the level of skill needed to carry them out when they are called upon.

Links with behaviour based on skill, rules and knowledge

The final element in the Hale and Glendon model, which turns figure 3 into figure 1, is the addition of the link to the work of Reason and Rasmussen. This work emphasized that behaviour can be evinced at three different levels of conscious control—skill-based, rule-based and knowledge-based—which implicate different aspects of human functioning and are subject to different types and degrees of disturbance or error on account of external signals or internal processing failures.

Skill-based. The skill-based level is highly reliable, but subject to lapses and slips when disturbed, or when another, similar routine captures control. This level is particularly relevant to the kind of routine behaviour that involves automatic responses to known signals indicating danger, either imminent or more remote. The responses are known and practised routines, such as keeping our fingers clear of a grinding wheel while sharpening a chisel, steering a car to keep it on the road, or ducking to avoid a flying object coming at us. The responses are so automatic that workers may not even be aware that they are actively controlling danger with them.

Rule-based. The rule-based level is concerned with choosing from a range of known routines or rules the one which is appropriate to the situation—for example, choosing which sequence to initiate in order to close down a reactor which would otherwise become overpressurized, selecting the correct safety goggles to work with acids (as opposed to those for working with dusts), or deciding, as a manager, to carry out a full safety review for a new plant rather than a short informal check. Errors here are often related to insufficient time spent matching the choice to the real situation, to relying on expectation rather than observation to understand the situation, or to being misled by outside information into making a wrong diagnosis. In the Hale and Glendon model, behaviour at this level is particularly relevant to detecting hazards and choosing correct procedures in familiar situations.

Knowledge-based. The knowledge-based level is engaged only when no pre-existing plans or procedures exist for coping with a developing situation. This is particularly true of the recognition of new hazards at the design stage, of detecting unsuspected problems during safety inspections or of coping with unforeseen emergencies. This level is predominant in the steps at the top of figure 1. It is the least predictable and least reliable mode of operation, but also the mode where no machine or computer can replace a human in detecting potential danger and in recovering from deviations.
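The three-level classification above can be summarized as a simple dispatch on how familiar the situation is. The selection criteria in the sketch below (routine signal, existence of a matching rule) are a hypothetical simplification for illustration; the level names are those of the Rasmussen/Reason classification described in the text.

```python
# Illustrative sketch only: the two boolean criteria are hypothetical
# simplifications; the level names follow the Rasmussen/Reason classification.

def processing_level(situation_is_routine, matching_rule_exists):
    """Select the level of conscious control engaged by a situation."""
    if situation_is_routine:
        # Automatic, largely unconscious response to a known danger signal.
        return "skill-based"
    if matching_rule_exists:
        # A learned rule is matched to a diagnosis of the prevailing situation.
        return "rule-based"
    # No pre-existing plan or procedure: conscious, time-consuming
    # problem solving, as at the design stage or in unforeseen emergencies.
    return "knowledge-based"


print(processing_level(True, False))    # skill-based
print(processing_level(False, True))    # rule-based
print(processing_level(False, False))   # knowledge-based
```

As the text notes, the typical failure modes differ by level—slips and lapses at the skill-based level, wrong rule selection at the rule-based level, and unreliable improvisation at the knowledge-based level—so classifying the level is the first step in choosing a safety measure.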

Putting all the elements together results in figure 1, which provides a framework for both classifying where failures occurred in human behaviour in a past accident and analysing what can be done to optimize human behaviour in controlling danger in a given situation or task in advance of any accidents.

Last modified on Friday, 19 August 2011 20:19



Accident Prevention References

Adams, JGU. 1985. Risk and Freedom: The Record of Road Safety Regulation. London: Transport Publishing Projects.

American National Standards Institute (ANSI). 1962. Method of Recording and Measuring Work Injury Experience. ANSI Z-16.2. New York: ANSI.

—. 1978. American National Standard Manual on Uniform Traffic Control Devices for Streets and Highways. ANSI D6.1. New York: ANSI.

—. 1988. Hazardous Industrial Chemicals—Precautionary Labeling. ANSI Z129.1. New York: ANSI.

—. 1993. Safety Color Code. ANSI Z535.1. New York: ANSI.

—. 1993. Environmental and Facility Safety Signs. ANSI Z535.2. New York: ANSI.

—. 1993. Criteria for Safety Symbols. ANSI Z535.3. New York: ANSI.

—. 1993. Product Safety Signs and Labels. ANSI Z535.4. New York: ANSI.

—. 1993. Accident Prevention Tags. ANSI Z535.5. New York: ANSI.

Andersson, R. 1991. The role of accidentology in occupational accident research. Arbete och Hälsa. Solna, Sweden. Thesis.

Andersson, R and E Lagerlöf. 1983. Accident data in the new Swedish information system on occupational injuries. Ergonomics 26.

Arnold, HJ. 1989. Sanctions and rewards: Organizational perspectives. In Sanctions and Rewards in the Legal System: A Multidisciplinary Approach. Toronto: University of Toronto Press.

Baker, SP, B O’Neil, MJ Ginsburg, and G Li. 1992. Injury Fact Book. New York: Oxford University Press.

Benner, L. 1975. Accident investigations—multilinear sequencing methods. J Saf Res 7.

Centers for Disease Control and Prevention (CDC). 1988. Guidelines for evaluating surveillance systems. Morb Mortal Weekly Rep 37(S-5):1–18.

Davies, JC and DP Manning. 1994a. MAIM: the concept and construction of intelligent software. Saf Sci 17:207–218.

—. 1994b. Data collected by MAIM intelligent software: The first fifty accidents. Saf Sci 17:219-226.

Department of Trade and Industry. 1987. Leisure Accident Surveillance System (LASS): Home and Leisure Accident Research 1986 Data. 11th Annual Report of the Home Accident Surveillance System. London: Department of Trade and Industry.

Ferry, TS. 1988. Modern Accident Investigation and Analysis. New York: Wiley.

Feyer, A-M and AM Williamson. 1991. An accident classification system for use in preventive strategies. Scand J Work Environ Health 17:302–311.

FMC. 1985. Product Safety Sign and Label System. Santa Clara, California: FMC Corporation.

Gielen, AC. 1992. Health education and injury control: Integrating approaches. Health Educ Q 19(2):203–218.

Goldenhar, LM and PA Schulte. 1994. Intervention research in occupational health and safety. J Occup Med 36(7):763–775.

Green, LW and MW Kreuter. 1991. Health Promotion Planning: An Educational and Environmental Approach. Mountainview, CA: Mayfield Publishing Company.

Guastello, SJ. 1991. The Comparative Effectiveness of Occupational Accident Reduction Programs. Paper presented at the International Symposium Alcohol Related Accidents and Injuries. Yverdon-les-Bains, Switzerland, Dec. 2-5.

Haddon, WJ. 1972. A logical framework for categorizing highway safety phenomena and activity. J Trauma 12:193–207.

—. 1973. Energy damage and the 10 countermeasure strategies. J Trauma 13:321–331.

—. 1980. The basic strategies for reducing damage from hazards of all kinds. Hazard Prevention September/October:8–12.

Hale, AR and AI Glendon. 1987. Individual Behaviour in the Face of Danger. Amsterdam: Elsevier.

Hale, AR and M Hale. 1972. Review of the Industrial Accident Research Literature. Research paper No. l, Committee on Safety & Health. London: HMSO.

Hale, AR, B Heming, J Carthey and B Kirwan. 1994. Extension of the Model of Behaviour in the Control of Danger. Vol. 3: Extended Model Description. Sheffield: Health and Safety Executive project HF/GNSR/28.

Hare, VC. 1967. System Analysis: A Diagnostic Approach. New York: Harcourt Brace World.

Harms-Ringdahl, L. 1993. Safety Analysis. Principles and Practice in Occupational Safety. Vol. 289. Amsterdam: Elsevier.

Heinrich, HW. 1931. Industrial Accident Prevention. New York: McGraw-Hill.

—. 1959. Industrial Accident Prevention: A Scientific Approach. New York: McGraw-Hill Book Company.

Hugentobler, MK, BA Israel, and SJ Schurman. 1992. An action research approach to workplace health: Integrating methods. Health Educ Q 19(1):55–76.

International Organization for Standardization (ISO). 1967. Symbols, Dimensions, and Layout for Safety Signs. ISO R557. Geneva: ISO.

—. 1984. Safety Signs and Colors. ISO 3864. Geneva: ISO.

—. 1991. Industrial Automation Systems—Safety of Integrated Manufacturing Systems—Basic Requirements (CD 11161). TC 184/WG 4. Geneva: ISO.

—. 1994. Quality Management and Quality Assurance Vocabulary. ISO/DIS 8402. Paris: Association française de normalisation.

Janssen, W. 1994. Seat-belt wearing and driving behavior: An instrumented-vehicle study. Accident Anal Prev 26:249–261.

Jenkins, EL, SM Kisner, D Fosbroke, LA Layne, MA Stout, DN Castillo, PM Cutlip, and R Cianfrocco. 1993. Fatal Injuries to Workers in the United States, 1980–1989: A Decade of Surveillance. Cincinnati, OH: NIOSH.

Johnston, JJ, GTH Cattledge, and JW Collins. 1994. The efficacy of training for occupational injury control. Occup Med: State Art Rev 9(2):147–158.

Kallberg, VP. 1992. The Effects of Reflector Posts on Driving Behaviour and Accidents on Two-lane Rural Roads in Finland. Report 59/1992. Helsinki: The Finnish National Road Administration Technical Development Center.

Kjellén, U. 1984. The deviation concept in occupational accident control. Part I: Definition and classification; Part II: Data collection and assessment of significance. Accident Anal Prev 16:289–323.

Kjellén, U and J Hovden. 1993. Reducing risks by deviation control—a retrospection into a research strategy. Saf Sci 16:417–438.

Kjellén, U and TJ Larsson. 1981. Investigating accidents and reducing risks—a dynamic approach. J Occup Acc 3:129–140.

Last, JM. 1988. A Dictionary of Epidemiology. New York: Oxford University Press.

Lehto, MR. 1992. Designing warning signs and warning labels: Part I—Guidelines for the practitioner. Int J Ind Erg 10:105–113.

Lehto, MR and D Clark. 1990. Warning signs and labels in the workplace. In Workspace, Equipment and Tool Design, edited by A Mital and W Karwowski. Amsterdam: Elsevier.

Lehto, MR and JM Miller. 1986. Warnings: Volume I: Fundamentals, Design, and Evaluation Methodologies. Ann Arbor, MI: Fuller Technical Publications.

Leplat, J. 1978. Accident analyses and work analyses. J Occup Acc 1:331–340.

MacKenzie, EJ, DM Steinwachs, and BS Shankar. 1989. Classifying severity of trauma based on hospital discharge diagnoses: Validation of an ICD-9CM to AIS-85 conversion table. Med Care 27:412–422.

Manning, DP. 1971. Industrial accident-type classifications—A study of the theory and practice of accident prevention based on a computer analysis of industrial injury records. M.D. Thesis, University of Liverpool.

McAfee, RB and AR Winn. 1989. The use of incentives/feedback to enhance work place safety: A critique of the literature. J Saf Res 20:7-19.

Mohr, DL and D Clemmer. 1989. Evaluation of an occupational injury intervention in the petroleum industry. Accident Anal Prev 21(3):263–271.

National Committee for Injury Prevention and Control. 1989. Injury Prevention: Meeting the Challenge. New York: Oxford University Press.

National Electronic Manufacturers Association (NEMA). 1982. Safety Labels for Padmounted Switch Gear and Transformers Sited in Public Areas. NEMA 260. Rosslyn, VA: NEMA.

Occupational Safety and Health Administration (OSHA). 1985. Specification for Accident Prevention Signs and Tags. CFR 1910.145. Washington DC: OSHA.

—. 1985. [Chemical] Hazard Communication. CFR 1910.1200. Washington DC: OSHA.

Occupational Injury Prevention Panel. 1992. Occupational injury prevention. In Centers for Disease Control. Position Papers from the Third National Injury Control Conference: Setting the National Agenda for Injury Control in the 1990s. Atlanta, GA: CDC.

Organization for Economic Cooperation and Development (OECD). 1990. Behavioural Adaptation to Changes in the Road Transport System. Paris: OECD.

Rasmussen, J. 1982. Human errors. A taxonomy for describing human malfunction in industrial installations. J Occup Acc 4:311–333.

Rasmussen, J, K Duncan and J Leplat. 1987. New Technology and Human Error. Chichester: Wiley.

Reason, JT. 1990. Human Error. Cambridge: CUP.

Rice, DP, EJ MacKenzie and associates. 1989. Cost of Injury in the United States: A Report to Congress. San Francisco: Institute for Health and Aging, University of California; and Baltimore: Injury Prevention Center, The Johns Hopkins University.

Robertson, LS. 1992. Injury Epidemiology. New York: Oxford University Press.

Saari, J. 1992. Successful implementation of occupational health and safety programs in manufacturing for the 1990s. J Hum Factors Manufac 2:55–66.

Schelp, L. 1988. The role of organizations in community participation—prevention of accidental injuries in a rural Swedish municipality. Soc Sci Med 26(11):1087–1093.

Shannon, HS. 1978. A statistical study of 2,500 consecutive reported accidents in an automobile factory. Ph.D. thesis, University of London.

Smith, GS and H Falk. 1987. Unintentional injuries. Am J Prev Medicine 5, sup.:143–163.

Smith, GS and PG Barss. 1991. Unintentional injuries in developing countries: The epidemiology of a neglected problem. Epidemiological Reviews :228–266.

Society of Automotive Engineers (SAE). 1979. Safety Signs. SAE J115. Warrendale, PA: SAE.

Steckler, AB, L Dawson, BA Israel, and E Eng. 1993. Community health development: An overview of the works of Guy W. Stewart. Health Educ Q Sup. 1: S3-S20.

Steers, RM and LW Porter.1991. Motivation and Work Behavior (5th ed). New York: McGraw-Hill.

Surry, J. 1969. Industrial Accident Research: A Human Engineering Appraisal. Canada: University of Toronto.

Tollman, S. 1991. Community-oriented primary care: Origins, evolutions, applications. Soc Sci Med 32(6):633-642.

Troup, JDG, J Davies, and DP Manning. 1988. A model for the investigation of back injuries and manual handling problems at work. J Soc Occup Med 10:107–119.

Tuominen, R and J Saari. 1982. A model for analysis of accidents and its applications. J Occup Acc 4.

Veazie, MA, DD Landen, TR Bender and HE Amandus. 1994. Epidemiologic research on the etiology of injuries at work. Ann Rev Pub Health 15:203–21.

Wagenaar, WA, PT Hudson and JT Reason. 1990. Cognitive failures and accidents. Appl Cogn Psychol 4:273–294.

Waller, JA. 1985. Injury Control: A Guide to the Causes and Prevention of Trauma. Lexington, MA: Lexington Books.

Wallerstein, N and R Baker. 1994. Labor education programs in health and safety. Occup Med State Art Rev 9(2):305-320.

Weeks, JL. 1991. Occupational health and safety regulation in the coal mining industry: Public health at the workplace. Annu Rev Publ Health 12:195–207.

Westinghouse Electric Corporation. 1981. Product Safety Label Handbook. Trafford, Pa: Westinghouse Printing Division.

Wilde, GJS. 1982. The theory of risk homeostasis: Implications for safety and health. Risk Anal 2:209–225.

—. 1988. Risk homeostasis theory and traffic accidents: Propositions, deductions and discussion of dissension in recent reactions. Ergonomics 31:441–468.

—. 1991. Economics and accidents: A commentary. J Appl Behav Sci 24:81–84.

—. 1994. Target Risk. Toronto: PDE Publications.

Williamson, AM and A-M Feyer. 1990. Behavioural epidemiology as a tool for accident research. J Occup Acc 12:207–222.

Work Environment Fund [Arbetarskyddsfonden]. 1983. Olycksfall i arbetsmiljön—Kartläggning och analys av forskningsbehov [Accidents in the work environment—survey and analysis]. Solna: Arbetarskyddsfonden.