36. Barometric Pressure Increased

 

Chapter Editor: T.J.R. Francis

 


Table of Contents

Tables

 

Working under Increased Barometric Pressure

Eric Kindwall

 

Decompression Disorders

Dees F. Gorman

 

Tables


1. Instructions for compressed-air workers
2. Decompression illness: Revised classification

37. Barometric Pressure Reduced

Chapter Editor:  Walter Dümmer


Table of Contents

Figures and Tables

Ventilatory Acclimatization to High Altitude
John T. Reeves and John V. Weil

Physiological Effects of Reduced Barometric Pressure
Kenneth I. Berger and William N. Rom

Health Considerations for Managing Work at High Altitudes
John B. West

Prevention of Occupational Hazards at High Altitudes
Walter Dümmer

38. Biological Hazards

Chapter Editor: Zuheir Ibrahim Fakhri


Table of Contents

Tables

Workplace Biohazards
Zuheir I. Fakhri

Aquatic Animals
D. Zannini

Terrestrial Venomous Animals
J.A. Rioux and B. Juminer

Clinical Features of Snakebite
David A. Warrell

Tables


1. Occupational settings with biological agents
2. Viruses, bacteria, fungi & plants in the workplace
3. Animals as a source of occupational hazards

39. Disasters, Natural and Technological

Chapter Editor: Pier Alberto Bertazzi


Table of Contents

Tables and Figures

Disasters and Major Accidents
Pier Alberto Bertazzi

     ILO Convention concerning the Prevention of Major Industrial Accidents, 1993 (No. 174)

Disaster Preparedness
Peter J. Baxter

Post-Disaster Activities
Benedetto Terracini and Ursula Ackermann-Liebrich

Weather-Related Problems
Jean French

Avalanches: Hazards and Protective Measures
Gustav Poinstingl

Transportation of Hazardous Material: Chemical and Radioactive
Donald M. Campbell

Radiation Accidents
Pierre Verger and Denis Winter

     Case Study: What does dose mean?

Occupational Health and Safety Measures in Agricultural Areas Contaminated by Radionuclides: The Chernobyl Experience
Yuri Kundiev, Leonard Dobrovolsky and V.I. Chernyuk

Case Study: The Kader Toy Factory Fire
Casey Cavanaugh Grant

Impacts of Disasters: Lessons from a Medical Perspective
José Luis Zeballos
 

 

 

 

Tables

 


 

1. Definitions of disaster types
2. 25-year average number of victims by type & region (natural trigger)
3. 25-year average number of victims by type & region (non-natural trigger)
4. 25-year average number of victims by type (natural trigger, 1969-1993)
5. 25-year average number of victims by type (non-natural trigger, 1969-1993)
6. Natural trigger from 1969 to 1993: Events over 25 years
7. Non-natural trigger from 1969 to 1993: Events over 25 years
8. Natural trigger: Number by global region & type in 1994
9. Non-natural trigger: Number by global region & type in 1994
10. Examples of industrial explosions
11. Examples of major fires
12. Examples of major toxic releases
13. Role of major hazard installations management in hazard control
14. Working methods for hazard assessment
15. EC Directive criteria for major hazard installations
16. Priority chemicals used in identifying major hazard installations
17. Weather-related occupational risks
18. Typical radionuclides, with their radioactive half-lives
19. Comparison of different nuclear accidents
20. Contamination in Ukraine, Byelorussia & Russia after Chernobyl
21. Strontium-90 contamination after the Kyshtym accident (Urals, 1957)
22. Radioactive sources that involved the general public
23. Main accidents involving industrial irradiators
24. Oak Ridge (US) radiation accident registry (worldwide, 1944-88)
25. Pattern of occupational exposure to ionizing radiation worldwide
26. Deterministic effects: thresholds for selected organs
27. Patients with acute irradiation syndrome (AIS) after Chernobyl
28. Epidemiological cancer studies of high dose external irradiation
29. Thyroid cancers in children in Belarus, Ukraine & Russia, 1981-94
30. International scale of nuclear incidents
31. Generic protective measures for general population
32. Criteria for contamination zones
33. Major disasters in Latin America & the Caribbean, 1970-93
34. Losses due to six natural disasters
35. Hospitals & hospital beds damaged/ destroyed by 3 major disasters
36. Victims in 2 hospitals collapsed by the 1985 earthquake in Mexico
37. Hospital beds lost resulting from the March 1985 Chilean earthquake
38. Risk factors for earthquake damage to hospital infrastructure

 

40. Electricity

Chapter Editor:  Dominique Folliot

 


 

Table of Contents 

Figures and Tables

Electricity—Physiological Effects
Dominique Folliot

Static Electricity
Claude Menguy

Prevention And Standards
Renzo Comini

Tables


1. Estimates of the rate of electrocution-1988
2. Basic relationships in electrostatics-Collection of equations
3. Electron affinities of selected polymers
4. Typical lower flammability limits
5. Specific charge associated with selected industrial operations
6. Examples of equipment sensitive to electrostatic discharges

41. Fire

Chapter Editor:  Casey C. Grant


 

Table of Contents 

Figures and Tables

Basic Concepts
Dougal Drysdale

Sources of Fire Hazards
Tamás Bánky

Fire Prevention Measures
Peter F. Johnson

Passive Fire Protection Measures
Yngve Anderberg

Active Fire Protection Measures
Gary Taylor

Organizing for Fire Protection
S. Dheri

Tables


1. Lower & upper flammability limits in air
2. Flashpoints & firepoints of liquid & solid fuels
3. Ignition sources
4. Comparison of concentrations of different gases required for inerting

42. Heat and Cold

Chapter Editor:  Jean-Jacques Vogt


 

Table of Contents 

Figures and Tables

Physiological Responses to the Thermal Environment
W. Larry Kenney

Effects of Heat Stress and Work in the Heat
Bodil Nielsen

Heat Disorders
Tokuo Ogawa

Prevention of Heat Stress
Sarah A. Nunneley

The Physical Basis of Work in Heat
Jacques Malchaire

Assessment of Heat Stress and Heat Stress Indices
Kenneth C. Parsons

     Case Study: Heat Indices: Formulae and Definitions

Heat Exchange through Clothing
Wouter A. Lotens

     Formulae and Definitions

Cold Environments and Cold Work
Ingvar Holmér, Per-Ola Granberg and Goran Dahlstrom

Prevention of Cold Stress in Extreme Outdoor Conditions
Jacques Bittel and Gustave Savourey

Cold Indices and Standards
Ingvar Holmér

Tables


1. Electrolyte concentration in blood plasma & sweat
2. Heat Stress Index & Allowable Exposure Times: calculations
3. Interpretation of Heat Stress Index values
4. Reference values for criteria of thermal stress & strain
5. Model using heart rate to assess heat stress
6. WBGT reference values
7. Working practices for hot environments
8. Calculation of the SWreq index & assessment method: equations
9. Description of terms used in ISO 7933 (1989b)
10. WBGT values for four work phases
11. Basic data for the analytical assessment using ISO 7933
12. Analytical assessment using ISO 7933
13. Air temperatures of various cold occupational environments
14. Duration of uncompensated cold stress & associated reactions
15. Indication of anticipated effects of mild & severe cold exposure
16. Body tissue temperature & human physical performance
17. Human responses to cooling: Indicative reactions to hypothermia
18. Health recommendations for personnel exposed to cold stress
19. Conditioning programmes for workers exposed to cold
20. Prevention & alleviation of cold stress: strategies
21. Strategies & measures related to specific factors & equipment
22. General adaptational mechanisms to cold
23. Number of days when water temperature is below 15 ºC
24. Air temperatures of various cold occupational environments
25. Schematic classification of cold work
26. Classification of levels of metabolic rate
27. Examples of basic insulation values of clothing
28. Classification of thermal resistance to cooling of handwear
29. Classification of contact thermal resistance of handwear
30. Wind Chill Index, temperature & freezing time of exposed flesh
31. Cooling power of wind on exposed flesh

43. Hours of Work

Chapter Editor:  Peter Knauth


 

Table of Contents 

Hours of Work
Peter Knauth

Tables


1. Time intervals from beginning shiftwork until three illnesses
2. Shiftwork & incidence of cardiovascular disorders

44. Indoor Air Quality

Chapter Editor:  Xavier Guardino Solá


 

Table of Contents 

Figures and Tables

Indoor Air Quality: Introduction
Xavier Guardino Solá

Nature and Sources of Indoor Chemical Contaminants
Derrick Crump

Radon
María José Berenguer

Tobacco Smoke
Dietrich Hoffmann and Ernst L. Wynder

Smoking Regulations
Xavier Guardino Solá

Measuring and Assessing Chemical Pollutants
M. Gracia Rosell Farrás

Biological Contamination
Brian Flannigan

Regulations, Recommendations, Guidelines and Standards
María José Berenguer

Tables


1. Classification of indoor organic pollutants
2. Formaldehyde emission from a variety of materials
3. Total volatile organic compound concentrations, wall/floor coverings
4. Consumer products & other sources of volatile organic compounds
5. Major types & concentrations in the urban United Kingdom
6. Field measurements of nitrogen oxides & carbon monoxide
7. Toxic & tumorigenic agents in cigarette sidestream smoke
8. Toxic & tumorigenic agents from tobacco smoke
9. Urinary cotinine in non-smokers
10. Methodology for taking samples
11. Detection methods for gases in indoor air
12. Methods used for the analysis of chemical pollutants
13. Lower detection limits for some gases
14. Types of fungus which can cause rhinitis and/or asthma
15. Micro-organisms and extrinsic allergic alveolitis
16. Micro-organisms in nonindustrial indoor air & dust
17. Standards of air quality established by the US EPA
18. WHO guidelines for non-cancer and non-odour annoyance
19. WHO guideline values based on sensory effects or annoyance
20. Reference values for radon of three organizations

47. Noise

Chapter Editor:  Alice H. Suter


 

Table of Contents 

Figures and Tables

The Nature and Effects of Noise
Alice H. Suter

Noise Measurement and Exposure Evaluation
Eduard I. Denisov and German A. Suvorov

Engineering Noise Control
Dennis P. Driscoll

Hearing Conservation Programmes
Larry H. Royster and Julia Doswell Royster

Standards and Regulations
Alice H. Suter

Tables


1. Permissible exposure limits (PEL) for noise exposure, by nation

48. Radiation: Ionizing

Chapter Editor:  Robert N. Cherry, Jr.


 

Table of Contents

Introduction
Robert N. Cherry, Jr.

Radiation Biology and Biological Effects
Arthur C. Upton

Sources of Ionizing Radiation
Robert N. Cherry, Jr.

Workplace Design for Radiation Safety
Gordon M. Lodde

Radiation Safety
Robert N. Cherry, Jr.

Planning for and Management of Radiation Accidents
Sydney W. Porter, Jr.


Motion Sickness

Motion sickness, or kinetosis, is not a pathological condition, but is a normal response to certain motion stimuli with which the individual is unfamiliar and to which he or she is, therefore, unadapted; only those without a functioning vestibular apparatus of the inner ear are truly immune.

Motions producing sickness

There are many different types of provocative motion that induce the motion sickness syndrome. Most are associated with aids to locomotion—in particular, ships, hovercraft, aircraft, automobiles and trains; less commonly, elephants and camels. The complex accelerations generated by fairground amusements, such as swings, roundabouts (merry-go-rounds), roller-coasters and so on, can be highly provocative. In addition, many astronauts/cosmonauts suffer from motion sickness (space-motion sickness) when they first make head movements in the abnormal force environment (weightlessness) of orbital flight. The motion sickness syndrome is also produced by certain moving visual stimuli, without any physical motion of the observer; the external visual world display of fixed-base simulators (simulator sickness) or a large-screen projection of scenes taken from a moving vehicle (Cinerama or IMAX sickness) are examples.

Aetiology

The essential characteristic of stimuli that induce motion sickness is that they generate discordant information from the sensory systems that provide the brain with information about the spatial orientation and motion of the body. The principal feature of this discord is a mismatch between the signals provided, principally, by the eyes and inner ear, and those that the central nervous system “expects” to receive and to be correlated.

Several categories of mismatch can be identified. Most important is the mismatch of signals from the vestibular apparatus (labyrinth) of the inner ear, in which the semicircular canals (the specialized receptors of angular accelerations) and the otolith organs (the specialized receptors of translational accelerations) do not provide concordant information. For example, when a head movement is made in a car or aircraft which is turning, both the semicircular canals and the otoliths are stimulated in an atypical manner and provide erroneous and incompatible information, information that differs substantially from that generated by the same head movement in a stable, 1-G gravity environment. Likewise, low-frequency (below 0.5 Hz) linear accelerations, such as occur aboard ship in rough seas or in an aircraft during flight through turbulent air, also generate conflicting vestibular signals and, hence, are a potent cause of motion sickness.

The mismatch of visual and vestibular information can also be an important contributory factor. The occupant of a moving vehicle who cannot see out is more likely to suffer from motion sickness than one who has a good external visual reference. The passenger below deck or in an aircraft cabin senses motion of the vehicle by vestibular cues, but he or she receives visual information only of his or her relative movement within the vehicle. The absence of an “expected” and concordant signal in a particular sensory modality is also considered to be the essential feature of visually induced motion sickness, because the visual motion cues are not accompanied by the vestibular signals that the individual “expects” to occur when subjected to the motion indicated by the visual display.

Signs and symptoms

On exposure to provocative motion, the signs and symptoms of motion sickness develop in a definite sequence, the time scale being dependent upon the intensity of the motion stimuli and the susceptibility of the individual. There are, however, considerable differences among individuals not only in susceptibility, but also in the order in which particular signs and symptoms develop, or whether they are experienced at all. Typically, the earliest symptom is epigastric discomfort (“stomach awareness”); this is followed by nausea, pallor and sweating, and is likely to be accompanied by a feeling of bodily warmth, increased salivation and eructation (belching). These symptoms commonly develop relatively slowly, but with continuing exposure to the motion, there is a rapid deterioration in well-being, the nausea increases in severity and culminates in vomiting or retching. Vomiting may bring relief, but this is likely to be short-lived unless the motion ceases.

There are other more variable features of the motion sickness syndrome. Alteration of respiratory rhythm with sighing and yawning may be an early symptom, and hyperventilation may occur, particularly in those who are anxious about the cause or consequence of their disability. Headache, tinnitus and dizziness are reported, while in those with severe malaise, apathy and depression are not uncommon, and may be of such severity that personal safety and survival are neglected. A feeling of lethargy and somnolence may be dominant following the cessation of provocative motion, and these may be the only symptoms in situations where adaptation to unfamiliar motion takes place without malaise.

Adaptation

With continued or repeated exposure to a particular provocative motion, most individuals show a decrease in the severity of symptoms; typically after three or four days of continuous exposure (as aboard ship or in a space vehicle) they have adapted to the motion and can carry out their normal duties without disability. In terms of the “mismatch” model, this adaptation or habituation represents the establishment of a new set of “expectations” in the central nervous system. However, on return to a familiar environment, these will no longer be appropriate and symptoms of motion sickness can recur (mal de débarquement) until readaptation occurs. Individuals differ considerably in the rate at which they adapt, the way they retain adaptation and the degree to which they can generalize protective adaptation from one motion environment to another. Unfortunately, a small proportion of the population (probably about 5%) do not adapt, or adapt so slowly that they continue to experience symptoms throughout the period of exposure to provocative motion.

Incidence

The incidence of sickness in a particular motion environment is governed by a number of factors, notably:

  • the physical characteristics of the motion (its intensity, frequency and direction of action)
  • the duration of exposure
  • the intrinsic susceptibility of the individual
  • the task being performed
  • other environmental factors (e.g., odour).

 

Not surprisingly, the occurrence of sickness varies widely in different motion environments. For example: nearly all the occupants of life rafts in rough seas will vomit; 60% of student aircrew members suffer from air sickness at some time during training, which in 15% is sufficiently severe to interfere with training; in contrast, less than 0.5% of passengers in civil transport aircraft are affected, although the incidence is higher in small commuter aircraft flying at low altitude in turbulent air.

Laboratory and field studies have shown that for vertical translational oscillatory motion (appropriately called heave), oscillation at a frequency of about 0.2 Hz is the most provocative (figure 1). For a given intensity (peak acceleration) of oscillation, the incidence of sickness falls quite rapidly with an increase in frequency above 0.2 Hz; motion at 1 Hz is less than one-tenth as provocative as that at 0.2 Hz. The incidence appears to fall likewise for motion at frequencies below 0.2 Hz, although the relationship between incidence and frequency is not well defined because of a lack of experimental data; certainly, a stable, zero-frequency, 1-G environment is not provocative.

Figure 1. Motion sickness incidence as a function of wave frequency and acceleration for a 2-hour exposure to vertical sinusoidal motion

Relationships established between the incidence of symptoms of motion sickness and the frequency, magnitude and duration of heave (z-axis) motion have led to the development of simple formulae that can be used to predict incidence when the physical parameters of the motion are known. The concept, embodied in British Standard 6841 (BSI 1987b) and in ISO Draft International Standard 2631-1, is that the incidence of symptoms is proportional to the Motion Sickness Dose Value (MSDVz). The MSDVz (in m/s^1.5) is defined as:

MSDVz = (a²t)^½

where a is the root-mean-square (r.m.s.) value of the frequency-weighted acceleration (in m/s²) determined by linear integration over the duration, t (in seconds), of exposure to the motion.

The frequency weighting to be applied to the stimulus acceleration is a filter having a centre frequency and attenuation characteristics similar to those depicted in figure 1. The weighting function is defined precisely in the standards.

The percentage of an unadapted adult population (P) who are likely to vomit is given by:

P = 1/3 MSDVz

Furthermore, the MSDVz may also be used to predict the level of malaise. On a four-point scale from zero (“I felt all right”) to three (“I felt absolutely dreadful”), an “illness rating” (I) is given by:

I = 0.02 MSDVz

Given the large differences among individuals in their susceptibility to motion sickness, the relationship between MSDVz and the occurrence of vomiting in laboratory experiments and in sea trials (figure 2) is acceptable. It should be noted that the formulae were developed from data acquired on exposures lasting from about 20 minutes to six hours with vomiting occurring in up to 70% of individuals (mostly seated) exposed to vertical, heave, motion.
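
As an illustration of how these formulae combine, the following minimal Python sketch computes the MSDVz from a frequency-weighted r.m.s. acceleration and an exposure duration and then applies the two predictive relationships given above. The frequency weighting itself is defined precisely in the standards and is not reproduced here; the acceleration value and exposure used in the example are hypothetical.

```python
import math

def msdv_z(a_w_rms: float, duration_s: float) -> float:
    """Motion Sickness Dose Value for heave (z-axis) motion, in m/s^1.5.

    a_w_rms    -- frequency-weighted r.m.s. acceleration (m/s^2), as defined
                  in BS 6841 / ISO 2631-1 (weighting not reproduced here)
    duration_s -- exposure duration (s)
    """
    return math.sqrt(a_w_rms ** 2 * duration_s)

def percent_vomiting(msdv: float) -> float:
    """Predicted percentage of an unadapted adult population likely to vomit."""
    return msdv / 3.0

def illness_rating(msdv: float) -> float:
    """Illness rating on the 0 ('I felt all right') to 3 ('absolutely dreadful') scale."""
    return 0.02 * msdv

# Hypothetical example: 0.5 m/s^2 weighted r.m.s. heave acceleration
# sustained over a two-hour sea crossing.
dose = msdv_z(0.5, 2 * 3600)
print(f"MSDVz            = {dose:.1f} m/s^1.5")
print(f"Vomiting (P)     = {percent_vomiting(dose):.0f} %")
print(f"Illness rating I = {illness_rating(dose):.2f}")
```

Since the underlying data come from exposures of roughly 20 minutes to six hours of vertical motion, predictions from such a sketch should not be extrapolated far outside that range.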

 

Figure 2. Relationship between incidence of vomiting and stimulus dose (MSDVz), calculated by the procedure described in the text. Data from laboratory experiments involving vertical oscillation (x) and sea trials (+)

Knowledge about the effectiveness of translational oscillation acting in other body axes and other than in a vertical direction is fragmentary. There is some evidence from laboratory experiments on small groups of subjects that translational oscillation in a horizontal plane is more provocative, by a factor of about two, than the same intensity and frequency of vertical oscillation for seated subjects, but is less provocative, also by a factor of two, when the subject is supine and the stimulus acts in the longitudinal (z) body axis. Application of formulae and weighting characteristics embodied in standards to the prediction of sickness incidence should, therefore, be made with caution and due concern for the constraints noted above.

The considerable variability between individuals in their response to provocative motion is an important feature of motion sickness. Differences in susceptibility can, in part, be related to constitutional factors. Infants much below the age of about two years are rarely affected, but with maturation, susceptibility increases rapidly to reach a peak between four and ten years. Thereafter, susceptibility falls progressively so that the elderly are less likely to be affected, but are not immune. In any age group, females are more sensitive than males, the incidence data suggesting a ratio of approximately 1.7:1. Certain dimensions of personality, such as neuroticism, introversion and perceptual style have also been shown to be correlated, albeit weakly, with susceptibility. Motion sickness can also be a conditioned response and a manifestation of phobic anxiety.

Preventive measures

Procedures which minimize the provocative stimulus or increase the tolerance are available. These may prevent sickness in a proportion of the population, but none, other than withdrawal from the motion environment, is 100% effective. In the design of a vehicle, attention to factors which raise the frequency and reduce the magnitude of the oscillations (see figure 1) experienced by occupants during normal operation is beneficial. The provision of head support and body restraint to minimize unnecessary head movements is advantageous, and is further aided if the occupant can assume a reclined or supine position. Sickness is less if the occupant can be given a view of the horizon; for those deprived of an external visual reference, closing the eyes reduces visual/vestibular conflict. Involvement in a task, particularly control of the vehicle, is also helpful. These measures can be of immediate benefit, but in the longer term the development of protective adaptation is of the greatest value. This is achieved by continued and repeated exposure to the motion environment, though it can be facilitated by ground-based exercises in which provocative stimuli are generated by making head movements whilst rotating on a spin table (desensitization therapy).

There are several drugs which increase tolerance, though all have side-effects (in particular, sedation), so that they should not be taken by those in primary control of a vehicle or when optimum performance is mandatory. For short-term (less than four hours) prophylaxis, 0.3 to 0.6 mg hyoscine hydrobromide (scopolamine) is recommended; longer acting are the antihistaminics, promethazine hydrochloride (25 mg), meclozine hydrochloride (50 mg), dimenhydrinate (50 mg) and cinnarizine (30 mg). The combination of either hyoscine or promethazine with 25 mg ephedrine sulphate increases prophylactic potency with some reduction of side-effects. Prophylaxis for up to 48 hours can be achieved using a scopolamine patch, which allows the drug to be slowly absorbed through the skin at a controlled rate. Effective concentrations of the drug in the body are not achieved until six to eight hours after application of the patch, so the need for this type of therapy must be anticipated.

Treatment

Those suffering from established motion sickness with vomiting should, when practicable, be placed in a position where the motion stimulus is minimized, and be given an anti–motion sickness drug, preferably promethazine by injection. Should vomiting be prolonged and repeated, intravenous replacement of fluid and electrolytes may be necessary.

 


Violence in the Workplace

Violence is pervasive in modern society and appears to be escalating. Entirely apart from repression, wars and terrorist activities, the media daily report in banner headlines on the mayhem inflicted by humans upon each other in “civilized” as well as more primitive communities. Whether there has been a real increase or this simply represents more thorough reporting is arguable. After all, violence has been a feature of human interaction since prehistoric ages. Nevertheless, violence has become one of the leading causes of death in modern industrial societies—in some segments of the community it is the leading cause of death—and it is increasingly being recognized as a public health problem.

Inescapably, it finds its way into the workplace. From 1980 to 1989, homicide was the third leading cause of death from injury in North American workplaces, according to data compiled by the National Traumatic Occupational Fatalities Surveillance System (NIOSH 1993a). During this period, occupational homicides accounted for 12% of deaths from injury in the workplace; only motor vehicles and machines accounted for more. By 1993, that figure had risen to 17%, a rate of 0.9 per 100,000 workers, now second only to motor vehicle deaths (Toscano and Windau 1994). For women workers, it remained the leading cause of work-related death, although the rate (0.4 deaths per 100,000) was lower than that for men (1.2 deaths per 100,000) (Jenkins 1995).

These deaths, however, represent only the “tip of the iceberg”. For example, in 1992, about 22,400 American workers were injured seriously enough in non-fatal assaults in the workplace to require days away from work to recuperate (Toscano and Windau 1994). Reliable and complete data are lacking, but it is estimated that for every death there have been many thousands—perhaps, even hundreds of thousands—of instances of violence in the workplace.

In its newsletter, Unison, the large British union of health care and governmental service workers, has labelled violence as “the most threatening risk faced by members at work. It is the risk which is most likely to lead to injury. It can bring unmanageable levels of occupational stress which damages personal esteem and threatens people’s ability to continue on the job” (Unison 1992).

This article will summarize the characteristics of violence in the workplace, the kinds of people involved, its effects on them and their employers, and the steps that may be taken to prevent or control such effects.

Definition of Violence

There is no consensus on the definition of violence. For example, Rosenberg and Mercy (1991) include in the definition both fatal and nonfatal interpersonal violence where physical force or other means is used by one person with the intent of causing harm, injury or death to another. The Panel on the Understanding and Control of Violent Behavior convened by the US National Academy of Sciences adopted the definition of violence as: behaviours by individuals that intentionally threaten, attempt or inflict physical harm on others (Reiss and Roth 1993).

These definitions focus on threatening or causing physical harm. However, they exclude instances in which verbal abuse, harassment or humiliation and other forms of psychological trauma may be the sole harm to the victim and which may be no less devastating. They also exclude sexual harassment, which may be physical but which is usually entirely non-physical. In the national survey of American workers conducted by the Northwestern National Life Insurance Company, the researchers separated violent acts into: harassment (the act of creating a hostile environment through unwelcome words, actions or physical contacts not resulting in physical harm), threats (expressions of an intent to cause physical harm), and physical attacks (aggression resulting in a physical assault with or without the use of a weapon) (Lawless, 1993).

In the UK, the Health and Safety Executive’s working definition of workplace violence is: any incident in which an employee is abused, threatened or assaulted by a member of the public in circumstances arising out of the course of his or her employment. Assailants may be patients, clients or co-workers (MSF 1993).

In this article, the term violence will be used in its broadest sense to include all forms of aggressive or abusive behaviour that may cause physical or psychological harm or discomfort to its victims, whether they be intentional targets or innocent bystanders involved only impersonally or incidentally. While workplaces may be targets of terrorist attacks or may become involved in riots and mob violence, such instances will not be discussed.

Prevalence of Violence in the Workplace

Accurate information on the prevalence of violence in the workplace is lacking. Most of the literature focuses on cases that are formally reported: homicides which get tallied in the obligatory death registries, cases that get enmeshed in the criminal justice system, or cases involving time off the job that generate workers’ compensation claims. Yet, for every one of these, there is an untold number of instances in which workers are victims of aggressive, abusive behaviour. For example, according to a survey conducted by the Bureau of Justice Statistics in the US Department of Justice, over half the victimizations sustained at work were not reported to the police. About 40% of the respondents said they did not report the incident because they considered it to be a minor or a personal matter, while another 27% said they did report it to a manager or a company security officer but, apparently, the report was not relayed to the police (Bachman 1994). In addition to the lack of a consensus on a taxonomy of violence, other reasons for under-reporting include:

  • Cultural acceptance of violence. There is in many communities a widespread tolerance for violence among or against certain groups (Rosenberg and Mercy 1991). Although frowned upon by many, violence is often rationalized and tolerated as a “normal” response to competition. Violence among minority and ethnic groups is often condoned as a righteous response to discrimination, poverty and lack of access to social or economic equity resulting in low self-esteem and low valuations of human life. As a result, the assault is seen as a consequence of living in a violent society rather than working in an unsafe workplace. Finally, there is the “on-the-job syndrome”, in which workers in certain jobs are expected to put up with verbal abuse, threats and, even, physical attacks (SEIU 1995; Unison 1992).
  • Lack of a reporting system. Only a small proportion of organizations have articulated an explicit policy on violence or have designed procedures for reporting and investigating instances of alleged violence in the workplace. Even where such a system has been installed, the trouble of obtaining, completing and filing the required report form is a deterrent to reporting all but the most outrageous incidents.
  • Fear of blame or reprisal. Workers may fear being held responsible when they have been attacked by a client or a patient. Fear of reprisal by the assailant is also a potent deterrent to reporting, especially when that person is the worker’s superior and in a position to affect his or her job status.
  • Lack of interest on the part of the employer. The employer’s lack of interest in investigating and reacting to prior incidents will certainly discourage reporting. Also, supervisors, concerned that workplace violence might reflect unfavourably on their managerial capabilities, may actually discourage or even block the filing of reports by workers in their units.

 

To determine the prevalence of violence in the workplace in the absence of reliable data, attempts have been made to extrapolate both from available statistics (e.g., death certificates, crime reports and workers’ compensation systems) and from specially designed surveys. Thus, the US National Crime Victimization Survey estimated that about 1 million American workers (out of a workforce of 110 million) are assaulted at work each year (Bachman 1994). And, a 1993 telephone survey of a national sample of 600 American full-time workers (excluding self-employed and military personnel) found that one in four said that he or she had been a victim of workplace violence during the study year: 19% were harassed, 7% were threatened, and 3% were attacked physically. The researchers reported further that 68% of the harassment victims, 43% of the threat victims and 24% of the attack victims had not reported the incident (Lawless 1993).

A similar survey of workers in the UK employed by the National Health Service revealed that, during the previous year, 0.5% had required medical treatment following an on-the-job physical assault; 11% had suffered a minor injury requiring only first aid, 4 to 6% had been threatened by persons wielding a deadly weapon, and 17% had received verbal threats. Violence was a special problem for emergency staff in ambulances and accident departments, nurses, and workers involved in the care of psychologically disturbed patients (Health Services Advisory Committee 1987). The risk of health workers being confronted by violence has been labelled a feature of everyday work in primary care and in accident/emergency departments (Shepherd 1994).

Homicide in the Workplace

Although workplace homicides are only a small proportion of all homicides, their substantial contribution to work-related deaths, at least in the United States, their unique features, and the possibility of preventive interventions by employers earn them special attention. For example, while most homicides in the community involve people who know each other, many of them close relatives, and only 13% were reported to have been associated with another felony, these proportions were reversed in the workplace, where more than three-fourths of the homicides were committed in the course of a robbery (NIOSH 1992). Further, while persons aged 65 and older in the general population have the lowest rates of being victims of homicide, this age group has the highest rates of such involvement in workplace homicides (Castillo and Jenkins 1994).

American workplaces with the highest rates of homicide are listed in table 1. Over 50% are accounted for by only two industries: retail trade and services. The latter includes taxi driving, which has nearly 40 times the average workplace homicide rate, followed by liquor/convenience stores and gas stations, prime targets for robberies, and by detective/protective services (Castillo and Jenkins 1994).

Table 1. US workplaces with the highest rates of occupational homicide, 1980-1989

Workplaces                             No. of homicides    Rate¹
Taxicab establishments                        287           26.9
Liquor stores                                 115            8.0
Gas stations                                  304            5.6
Detective/protective services                 152            5.0
Justice/public order establishments           640            3.4
Grocery stores                                806            3.2
Jewellery stores                               56            3.2
Hotels/motels                                 153            3.2
Eating/drinking places                        754            1.5

¹ Number per 100,000 workers per year.

Source: NIOSH 1993b.

 

Table 2 lists the occupations with the highest rates of workplace homicides. Again, reflecting the likelihood of involvement in attempted felonies, taxi drivers head the list, followed by law-enforcement personnel, hotel clerks and workers in various types of retail establishments. Commenting on similar data from the UK, Drever (1995) noted that most of the occupations with the highest mortality from homicides had high rates of drug dependence (scaffolders, literary and artistic occupations, painters and decorators) or alcohol abuse (cooks and kitchen porters, publicans, bartenders and caterers).

Table 2. US occupations with the highest rates of occupational homicide, 1980-1989

Occupations                            No. of homicides    Rate¹
Taxicab drivers/chauffeurs                    289           15.1
Law enforcement officers                      520            9.3
Hotel clerks                                   40            5.1
Gas station workers                           164            4.5
Security guards                               253            3.6
Stock handlers/baggers                        260            3.1
Store owners/managers                       1,065            2.8
Bartenders                                     84            2.1

¹ Number per 100,000 workers per year.

Source: NIOSH 1993b.

 

As noted above, the vast majority of work-related homicides occur during the course of a robbery or other crime committed by a person or persons usually not known to the victim. Risk factors associated with such incidents are listed in table 3.

 


Table 3. Risk factors for workplace homicides

Working alone or in small numbers
Exchange of money with the public
Working late night or early morning hours
Working in high-crime areas
Guarding valuable property or possessions
Working in community settings (e.g., taxi drivers and police)

Source: NIOSH 1993b.


 

About 4% of workplace homicides occur during confrontations with family members or acquaintances who have followed the victim into the workplace. About 21% arise out of a confrontation related to the workplace: about two-thirds of these are perpetrated by workers or former employees with a grudge against a manager or a co-worker, while angry customers or clients account for the rest (Toscano and Windau 1994). In these cases, the target may be the particular manager or worker whose actions provoked the assault or, where there is a grudge against the organization, the target may be the workplace itself, and any employees and visitors who just happen to be in it at the critical moment. Sometimes, the assailant may be emotionally disturbed, as in the case of Joseph T. Wesbecker, an employee on long-term disability leave from his employer in Louisville, Kentucky, because of mental illness, who killed eight co-workers and injured 12 others before taking his own life (Kuzmits 1990).

Causes of Violence

Current understanding of the causes and risk factors for assaultive violence is very rudimentary (Rosenberg and Mercy 1991). Clearly, it is a multifactorial problem in which each incident is shaped by the characteristics of the assailant, the characteristics of the victim(s) and the nature of the interplay between them. Reflecting such complexity, a number of theories of causation have been developed. Biological theories, for example, focus on such factors as gender (most of the assailants are male), age (involvement in violence in the community diminishes with age but, as noted above, this is not so in the workplace), and the influence of hormones such as testosterone, neurotransmitters such as serotonin, and other such biological agents. The psychological approach focuses on personality, holding that violence is engendered by deprivation of love during childhood, and childhood abuse, and is learned from role models, reinforced by rewards and punishments in early life. Sociological theories emphasize as breeders of violence such cultural and subcultural factors as poverty, discrimination and lack of economic and social equity. Finally, interactional theories converge on a sequence of actions and reactions that ultimately escalate into violence (Rosenberg and Mercy 1991).

A number of risk factors have been associated with violence. They include:

Mental illness

The vast majority of people who are violent are not mentally ill, and the vast proportion of individuals with mental illness are not violent (American Psychiatric Association 1994). However, mentally disordered individuals are sometimes frightened, irritable, suspicious, excitable, or angry, or a combination of these (Bullard 1994). The resultant behaviour poses a particular risk of violence to the physicians, nurses and staff members involved in their care in ambulances, emergency departments and both inpatient and outpatient psychiatric facilities.

Certain types of mental illness are associated with a greater propensity for violence. Persons with psychopathic personalities tend to have a low threshold for anger and frustration, which often generate violent behaviour (Marks 1992), while individuals with paranoia are suspicious and prone to attack individuals or entire organizations whom they blame when things do not go as they would wish. However, violence may be exhibited by persons with other forms of mental illness. Furthermore, some mentally ill individuals are prone to episodes of acute dementia in which they may inflict violence on themselves as well as on those trying to restrain them.

Alcohol and drug abuse

Alcohol abuse has a strong association with aggressive and violent behaviour. While drunkenness on the part of either assailants or victims, or both, often results in violence, there is disagreement as to whether alcohol is the cause of the violence or merely one of a number of factors involved in its causation (Pernanen 1993). Fagan (1993) emphasized that while alcohol affects neurobiological functions, perception and cognition, it is the immediate setting in which the drinking takes place that channels the disinhibiting responses to alcohol. This was confirmed by a study in Los Angeles County which found that violent incidents were much more frequent in some bars and relatively uncommon in others where just as much drinking was taking place, and concluded that violent behaviour was not related to the amount of alcohol being consumed but, rather, to the kinds of individuals attracted to a particular drinking establishment and the kinds of unwritten rules in effect there (Scribner, MacKinnon and Dwyer 1995).

Much the same may be said for abuse of illicit drugs. Except perhaps for crack cocaine and the amphetamines, drug use is more likely to be associated with sedation and withdrawal rather than aggressive, violent behaviour. Most of the violence associated with illegal drugs seems to be associated not with the drugs, but with the effort to obtain them or the wherewithal to purchase them, and from involvement in the illegal drug traffic.

Violence in the community

Violence in the community not only spills over into workplaces but is a particular risk factor for workers such as police and firefighters, and for postal workers and other government employees, repair and service personnel, social workers and others whose jobs take them into neighbourhoods in which violence and crime are indigenous. Important factors in the frequency of violence, particularly in the United States, are the prevalence of firearms in the hands of the general public and, especially for young people, the amount of violence depicted in films and on television.

Work-Related Factors Associated with Violence

Instances of violence may occur in any and all workplaces. There are, however, certain jobs and work-related circumstances that are particularly associated with a risk of generating or being subjected to violence. They include:

Criminal activities

Perhaps the least complex of episodes of work-related violence are those associated with criminal violence, the major cause of worksite homicides. These fall into two categories: those involved with attempts at robbery or other felonies, and those related to traffic in illicit drugs. Police, security guards and other personnel with law-enforcement responsibilities face a constant risk of attack by felons attempting to enter the workplace and those resisting detection and arrest. Those working alone and field workers whose duties take them into high-crime neighbourhoods are frequent targets of robbery attempts. Health professionals making home visits to such areas are particularly at risk because they often carry drugs and drug paraphernalia such as hypodermic syringes and needles.

Dealing with the public

Workers in government and private community service agencies, banks and other institutions serving the public are frequently confronted by attacks from individuals who have been kept waiting unduly, have been greeted with disinterest and indifference (whether real or perceived), or were thwarted in obtaining the information or services they desired because of complicated bureaucratic procedures or technicalities that made them ineligible. Clerks in retail establishments receiving items being returned, workers staffing airport ticket counters when flights are overbooked, delayed or cancelled, urban bus or trolley drivers and conductors, and others who must deal with customers or clients whose wants cannot immediately be satisfied are often targets for verbal and sometimes even physical abuse. Then, there are also those who must contend with impatient and unruly crowds, such as police officers, security guards, ticket takers and ushers at popular sporting and entertainment events.

Violent attacks on government workers, particularly those in uniform, and on government buildings and offices in which workers and visitors may be indiscriminately injured or killed, may result from resentment and anger at laws and official policies which the perpetrators will not accept.

Work stress

High levels of work stress may precipitate violent behaviour, while violence in the workplace can, in turn, be a potent stressor. The elements of work stress are well known (see chapter Psychosocial and Organizational Factors). Their common denominator is a devaluation of the individual and/or the work he or she performs, resulting in fatigue, frustration and anger directed at managers and co-workers perceived to be inconsiderate, unfair and abusive. Several recent population studies have demonstrated an association between violence and job loss, one of the most potent job-related stressors (Catalano et al. 1993; Yancey et al. 1994).

Interpersonal environment in the workplace

The interpersonal environment in the workplace may be a breeding ground for violence. Discrimination and harassment, forms of violence in themselves as defined in this article, may provoke violent retaliation. For example, MSF, the British union of workers in management, science and finance, calls attention to workplace bullying (defined as persistent offensive, abusive, intimidating, malicious or insulting behaviour, abuse of power or unfair penal sanctions), as a characteristic of the management style in some organizations (MSF 1995).

Sexual harassment has been branded a form of assault on the job (SEIU 1995). It may involve unwelcome touching or patting, physical assault, suggestive remarks or other verbal abuse, staring or leering, requests for sexual favours, compromising invitations, or a work environment made offensive by pornography. It is illegal in the United States, having been declared a form of sexual discrimination under Title VII of the Civil Rights Act of 1964 when the worker feels that his or her job status depends on tolerating the advances or if the harassment creates an intimidating, hostile or offensive workplace environment.

Although women are the usual targets, men have also been sexually harassed, albeit much less frequently. In a 1980 survey of US federal employees, 42% of female respondents and 15% of males said that they had been sexually harassed on the job, and a follow-up survey in 1987 yielded similar results (SEIU 1995). In the United States, extensive media coverage of the harassment of women who had “intruded” into jobs and workplaces traditionally filled by males, and the notoriety given to the involvement of prominent political and public figures in alleged harassment, have resulted in an increase in the number of complaints received by state and federal anti-discrimination agencies and the number of civil law suits filed.

Working in health care and social services

In addition to the attempted robberies noted above, health care staff are often targets of violence from anxious and disturbed patients, especially in emergency and outpatient departments, where long waits and impersonal procedures are not uncommon and where anxiety and anger may boil over into verbal or physical assaults. They may also be victims of assault by family members or friends of patients who had unfavourable outcomes which they rightly or wrongly attribute to denials, delays or errors in treatment. In such instances they may attack the particular health worker(s) whom they hold responsible, or the violence may be aimed randomly at any staff member(s) of the medical facility.

Effects of Violence on the Victim

The trauma caused by physical assault varies with the nature of the attack and the weapons employed. Bruises and cuts on the hands and forearms are common when the victim has tried to defend himself or herself. Since the face and head are frequent targets, bruises and fractures of the facial bones are common; these can be traumatic psychologically because the swelling and ecchymoses are so visible and may take weeks to disappear (Mezey and Shepherd 1994).

The psychological effects may be more troublesome than the physical trauma, especially when a health worker has been assaulted by a patient. The victims may experience a loss of composure and self-confidence in their professional competence accompanied by a sense of guilt at having provoked the attack or having failed to detect that it was coming. Unfocused or directed anger may persist at the apparent rejection of their well-intended professional efforts, and there may be a persistent loss of confidence in themselves as well as a lack of trust in their co-workers and supervisors that can interfere with work performance. All this may be accompanied by insomnia, nightmares, diminished or increased appetite, increased consumption of tobacco, alcohol and/or drugs, social withdrawal and absenteeism from the job (Mezey and Shepherd 1994).

Post-traumatic stress disorder (PTSD) is a specific psychological syndrome that may develop after major disasters and instances of violent assault, not only in those directly involved in the incident but also in those who have witnessed it. While usually associated with life-threatening or fatal incidents, PTSD may occur after relatively trivial attacks that are perceived as life-threatening (Foa and Rothbaum 1992). The symptoms include: re-experiencing the incident through recurrent and intrusive recollections (“flashbacks”) and nightmares, persistent feelings of arousal and anxiety including muscular tension, autonomic hyperactivity, loss of concentration, and exaggerated reactivity. There is often conscious or unconscious avoidance of circumstances that recall the incident. There may be a long period of disability but the symptoms usually respond to supportive psychotherapy. They can often be prevented by a post-incident debriefing conducted as soon as possible after the incident, followed, when needed, by short-term counselling (Foa and Rothbaum 1992).

After the Incident

Interventive measures to be taken immediately after the incident include:

Care of the victim

Appropriate first-aid and medical care should be provided as quickly as possible to all injured individuals. For possible medico-legal purposes (e.g., criminal or civil actions against the assailant) the injuries should be described in detail and, if possible, photographed.

Clean-up of the workplace

Any damage or debris in the workplace should be cleaned up, and any equipment that was involved should be checked to make sure that the safety and cleanliness of the workplace have been fully restored (SEIU 1995).

Post-incident debriefing

As soon as possible, all those involved in or witnessing the incident should participate in a post-incident debriefing or a “trauma-crisis counselling” session conducted by an appropriately qualified staff member or an outside consultant. This will not only provide emotional support and identify those for whom referral for one-on-one counselling may be advisable, but also enable the collection of details of exactly what has happened. Where necessary, the counselling may be supplemented by the formation of a peer support group (CAL/OSHA 1995).

Reporting

A standardized report form should be completed and submitted to the proper individual in the organization and, when appropriate, to the police in the community. A number of sample forms that may be adapted to the needs of a particular organization have been designed and published (Unison 1991, MSF 1993, SEIU 1995). Aggregating and analysing incident report forms will provide epidemiological information that may identify risk factors for violence in the particular workplace and point the way to suitable preventive interventions.
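
As a sketch of the kind of aggregation described above, the short Python example below tallies incident reports by location and type so that recurring risk patterns stand out. The record fields and sample data are purely illustrative assumptions, not a prescribed report format; real forms capture far more detail.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class IncidentReport:
    """Minimal illustrative incident record (hypothetical fields)."""
    date: str
    location: str        # e.g., ward, counter, parking area
    incident_type: str   # harassment, threat or physical attack
    assailant: str       # patient, client, co-worker, stranger...
    days_lost: int

# Hypothetical reports accumulated over a reporting period
reports = [
    IncidentReport("2023-01-12", "emergency department", "physical attack", "patient", 3),
    IncidentReport("2023-02-03", "reception desk", "threat", "client", 0),
    IncidentReport("2023-02-17", "emergency department", "threat", "patient", 0),
    IncidentReport("2023-03-09", "reception desk", "harassment", "client", 0),
]

# Tally incidents by (location, type) to highlight where preventive effort is needed
by_location_and_type = Counter((r.location, r.incident_type) for r in reports)
for (location, incident_type), count in by_location_and_type.most_common():
    print(f"{location:22s} {incident_type:16s} {count}")

print("Total days lost:", sum(r.days_lost for r in reports))
```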

Investigating the incident

Each reported incident of alleged violence, however trivial it may seem, should be investigated by a designated properly trained individual. (Assignment for such investigations may be made by the joint labour/management safety and health committee, where one exists.) The investigation should be aimed at identifying the cause(s) of the incident, the person(s) involved, what, if any, disciplinary measures should be invoked, and what may be done to prevent recurrences. Failure to conduct an impartial and effective investigation is a signal of management’s disinterest and a lack of concern for employees’ health and welfare.

Employer support

Victims and observers of the incident should be assured that they will not be subject to discrimination or any other form of reprisal for reporting it. This is especially important when the alleged assailant is the worker’s superior.

Depending on the regulations extant in the particular jurisdiction, the nature and extent of any injuries, and the duration of any absence from work, the employee may be eligible for workers’ compensation benefits. In such cases, the appropriate claim forms should be filed promptly.

When appropriate, a report should be filed with the local law enforcement agency. When needed, the victim may be provided with legal advice on pressing charges against the assailant, and assistance in dealing with the media.

Union Involvement

A number of unions have been playing a prominent role in dealing with workplace violence, most notably those representing workers in the health care and service industries, such as the Service Employees International Union (SEIU) in the United States, and Management, Science and Finance (MSF) and Unison in the UK. Through the development of guidelines and the publication of fact sheets, bulletins and pamphlets, they have focused on the education of workers, their representatives and their employers about the importance of violence in the workplace, how to deal with it, and how to prevent it. They have acted as advocates for members who have been victims to ensure that their complaints and allegations of violence are given appropriate consideration without threats of reprisal, and that they receive all of the benefits to which they may be entitled. Unions also advocate with employers’ and trade associations and government agencies on behalf of policies, rules and regulations intended to reduce the prevalence of violence in the workplace.

Threats of Violence

All threats of violence should be taken seriously, whether aimed at particular individuals or at the organization as a whole. First, steps must be taken to protect the targeted individual(s). Then, where possible, the assailant should be identified. If that person is not in the workforce, the local law enforcement agencies should be notified. If he or she is in the organization, it may be desirable to consult a qualified mental health professional to guide the handling of the situation and/or deal directly with the assailant.

Preventive Strategies

Preventing violence in the workplace is fundamentally the employer’s responsibility. Ideally, a formal policy and programme will have been developed and implemented before victimization occurs. This is a process that should involve not only the appropriate individuals in human resources/personnel, security, legal affairs, and employee health and safety departments, but also line managers and shop stewards or other employee representatives. A number of guides for such an exercise have been published (see table 4). They are generic and are intended to be tailored to the circumstances of a particular workplace or industry. Their common denominators include:

Table 4. Guides for programmes to prevent workplace violence

Date | Title | Source
1991 | Violence in the Workplace: NUPE Guidelines | Unison Health Care, 1 Marbledon Place, London WC1H 9AJ, UK
1993 | CAL/OSHA Guidelines for Security and Safety of Health Care and Community Service Workers | Division of Occupational Safety and Health, Department of Industrial Relations, 45 Fremont Street, San Francisco, CA 94105, USA
1993 | Prevention of Violence at Work: An MSF Guide with Model Agreement and Violence at Work Questionnaire (MSF Health and Safety Information No. 37) | MSF Health and Safety Office, Dane O’Coys Road, Bishops Stortford, Herts, CM23 2JN, UK
1995 | Assault on the Job: We Can Do Something About Workplace Violence (2nd edition) | Service Employees International Union, 1313 L Street, NW, Washington, DC 20005, USA
1995 | CAL/OSHA: Model Injury and Illness Prevention Program for Workplace Security | Division of Occupational Safety and Health, Department of Industrial Relations, 45 Fremont Street, San Francisco, CA 94105, USA
1996 | Guidelines for Preventing Workplace Violence for Health Care and Social Service Workers (OSHA 3148) | OSHA Publications Office, P.O. Box 37535, Washington, DC 20013-7535, USA

Establishing a policy

A policy explicitly outlawing discriminatory and abusive behaviour and the use of violence for dispute resolution, accompanied by specified disciplinary measures for infractions (up to and including dismissal), should be formulated and published.

Risk assessment

An inspection of the workplace, supplemented by analysis of prior incidents and/or information from employee surveys, will enable an expert to assess risk factors for violence and suggest preventive interventions. Examination of the prevailing style of management and supervision and the organization of work may disclose high levels of work stress that may precipitate violence. Study of interactions with clients, customers or patients may reveal features that may generate needless anxiety, frustration and anger, and precipitate violent reactions.

Workplace modifications to reduce crime

Guidance from police or private security experts may suggest changes in work procedures and in the layout and furnishing of the workplace that will make it a less attractive target for robbery attempts. In the United States, the Virginia Department of Criminal Justice has been using Crime Prevention Through Environmental Design (CPTED), a model approach developed by a consortium of the state’s schools of architecture. Its elements include changes in interior and exterior lighting and in landscaping, with particular attention to parking areas, stairwells and restrooms; making sales and waiting areas visible from the street; the use of drop safes or time-release safes to hold cash; and alarm systems, television monitors and other security equipment (Malcan 1993). CPTED has been successfully applied in convenience stores, banks (particularly in relation to automatic teller machines which may be accessed around the clock), schools and universities, and in the Washington, DC, Metro subway system.

In New York City, where robbery and killing of taxi drivers is relatively frequent compared to other large cities, the Taxi and Limousine Commission issued regulations that mandated the insertion of a transparent, bullet-resistant partition between the driver and passengers in the rear seat, a bullet-proof plate in the back of the driver’s seat, and an external distress signal light that could be turned on by the driver while remaining invisible to those inside the cab (NYC/TLC 1994). (There has been a spate of head and facial injuries among rear seat passengers who were not wearing seat belts and were thrown forward against the partition when the cab stopped suddenly.)

Where work involves interaction with customers or patients, employee safety may be enhanced by interposing barriers such as counters, desks or tables, transparent, shatter-proof partitions, and locked doors with shatter-proof windows (CAL/OSHA 1993). Furniture and equipment can be arranged to avoid entrapment of the employee and, where privacy is important, it should not be maintained at the expense of isolating the employee with a potentially aggressive or violent individual in a closed or secluded area.

Security systems

Every workplace should have a well-designed security system. Intrusion of strangers may be reduced by limiting entry to a designated reception area where visitors may have an identity check and receive ID badges indicating the areas to be visited. In some situations, it may be advisable to use metal detectors to identify visitors carrying concealed weapons.

Electronic alarm systems triggered by strategically located “panic buttons” can provide audible and/or visual signals that can alert co-workers to danger and summon help from a nearby security station. Such alarm systems may also be rigged to summon local police. However, they are of little use if guards and co-workers have not been trained to respond promptly and properly. Television monitors can not only provide protective surveillance but also record any incidents as they occur, and may help identify the perpetrator. Needless to say, such electronic systems are of little use unless they are maintained properly and tested at frequent intervals to ensure that they are in working order.

Two-way radios and cellular telephones can provide a measure of security for field personnel and those who are working alone. They also provide a means of reporting their location and, when necessary, summoning medical and other forms of assistance.

Work practice controls

Work practices should be reviewed periodically and modified to minimize the build-up of work stress. This involves attention to work schedules, work load, job content, and monitoring of work performance. Adequate staffing levels should be maintained in high-risk work areas both to discourage violent behaviour and to deal with it when it occurs. Adjustment of staffing levels to cope with peak flows of clients or patients will help to minimize irritating delays and crowding of work areas.

Staff training

Workers and supervisors should be trained to recognize rising tension and anger and in non-violent methods of defusing them. Training involving role-playing exercises will help employees to cope with overly aggressive or abusive individuals without being confrontational. In some situations, training employees in self-defence may be indicated, but there is the danger that this will breed a level of self-confidence that will lead them to delay or entirely neglect calling for available help.

Security guards, staff in psychiatric or penal institutions, and others likely to be involved with physically violent individuals should be trained to subdue and restrain them with minimal risk of injury to others or to themselves (SEIU 1995). However, according to Unison (1991), training can never be a substitute for good work organization and the provision of adequate security.

Employee assistance programmes

Employee assistance programmes (EAPs—also known as member assistance programmes, or MAPs, when provided by a union) can be particularly helpful in crisis situations by providing counselling and support to victims and witnesses of violent incidents, referring them to outside mental health professionals when needed, monitoring their progress and overseeing any protective arrangements intended to facilitate their return to work.

EAPs can also counsel employees who are overburdened by work-related problems, or by problems arising in the family or the community, and whose frustration and anger might culminate in violent behaviour. When they have several such clients from a particular area of the workplace, they can (without breaching the confidentiality of personal information, which is essential to their operation) guide managers towards making work modifications that will defuse the potential “powder keg” before violence erupts.

Research

Because of the seriousness and complexity of the problem and the paucity of reliable information, research is needed into the epidemiology, causation, prevention and control of violence, in society in general and in the workplace. This requires a multidisciplinary effort involving, in addition to experts in occupational safety and health, mental health professionals, social workers, architects and engineers, experts in management science, lawyers, judges and experts in the criminal justice system, authorities on public policy, and others. Urgently needed are expanded and improved systems for the collection and analysis of the relevant data, and the development of a consensus on a taxonomy of violence so that information and ideas can be transferred more easily from one discipline to another.

Conclusion

Violence is endemic in the workplace. Homicides are a major cause of work-related deaths, but their impact and cost are considerably outweighed by the prevalence of near misses, non-fatal physical assaults, threats, harassment, aggressive behaviour and abuse, much of which remains undocumented and unreported. Although most of the homicides and many of the assaults occur in conjunction with criminal activities, workplace violence is not just a criminal justice problem. Nor is it solely a problem for mental health professionals and specialists in addictions, although much of it is associated with mental illness, alcoholism and drug abuse. It requires a coordinated effort by experts in a broad variety of disciplines, led by occupational health and safety professionals, and aimed at developing, validating and implementing a coherent set of strategies for intervention and prevention, keeping in mind that the diversity in workers, jobs and industries dictates an ability to tailor them to the unique characteristics of a particular workforce and the organization that employs it.

 


Overview

New information technologies are being introduced in all industrial sectors, albeit to varying extents. In some cases, the costs of computerizing production processes may constitute an impediment to innovation, particularly in small and medium-sized companies and in developing countries. Computers make possible the rapid collection, storage, processing and dissemination of large quantities of information. Their utility is further enhanced by their integration into computer networks, which allow resources to be shared (Young 1993).

Computerization exerts significant effects on the nature of employment and on working conditions. Beginning about the mid-1980s, it was recognized that workplace computerization may lead to changes in task structure and work organization, and by extension to work requirements, career planning and stress suffered by production and management personnel. Computerization may exert positive or negative effects on occupational health and safety. In some cases, the introduction of computers has rendered work more interesting and resulted in improvements in the work environment and reductions of workload. In others, however, the result of technological innovation has been an increase in the repetitive nature and intensity of tasks, a reduction of the margin for individual initiative and the isolation of the worker. Furthermore, several companies have been reported to increase the number of work shifts in an attempt to extract the largest possible economic benefit from their financial investment (ILO 1984).

As far as we have been able to determine, as of 1994 statistics on the worldwide use of computers are available from one source only—The Computer Industry Almanac (Juliussen and Petska-Juliussen 1994). In addition to statistics on the current international distribution of computer use, this publication also reports the results of retrospective and prospective analyses. The figures reported in the latest edition indicate that the number of computers is increasing exponentially, with the increase becoming particularly marked at the beginning of the 1980s, when personal computers began to attain great popularity. Since 1987, total computer processing power, measured in millions of instructions per second (MIPS), has increased 14-fold, thanks to the development of new microprocessors (the integrated circuits that perform a computer’s arithmetical and logical operations). By the end of 1993, total computing power had reached 357 million MIPS.

Unfortunately, available statistics do not differentiate between computers used for work and personal purposes, and statistics are unavailable for some industrial sectors. These knowledge gaps are most likely due to methodological problems related to the collection of valid and reliable data. However, reports of the International Labour Organization’s tripartite sectoral committees contain relevant and comprehensive information on the nature and extent of the penetration of new technologies in various industrial sectors.

In 1986, 66 million computers were in use throughout the world. Three years later, there were more than 100 million, and by 1997, it is estimated that 275–300 million computers will be in use, with this number reaching 400 million by 2000. These predictions assume the widespread adoption of multimedia, information highway, voice recognition and virtual reality technologies. The Almanac’s authors consider that most televisions will be equipped with personal computers within ten years of publication, in order to simplify access to the information highway.

According to the Almanac, in 1993 the overall computer:population ratio in 43 countries on 5 continents was 3.1 per 100. It should, however, be noted that South Africa was the only African country reporting and Mexico the only Central American country reporting. As the statistics indicate, there is very wide international variation in the extent of computerization, with the computer:population ratio ranging from 0.07 per 100 to 28.7 per 100.

The computer:population ratio of less than 1 per 100 in developing countries reflects the generally low level of computerization prevailing there (table 1) (Juliussen and Petska-Juliussen 1994). Not only do these countries produce few computers and little software, but lack of financial resources may in some cases prevent them from importing these products. Moreover, their often rudimentary telephone and electrical utilities are frequently barriers to more widespread computer use. Finally, little linguistically and culturally appropriate software is available, and training in computer-related fields is often problematic (Young 1993).

 


Table 1. Distribution of computers in various regions of the world

Region | Computers per 100 people

NORTH AMERICA
   United States | 28.7
   Canada | 8.8

CENTRAL AMERICA
   Mexico | 1.7

SOUTH AMERICA
   Argentina | 1.3
   Brazil | 0.6
   Chile | 2.6
   Venezuela | 1.9

WESTERN EUROPE
   Austria | 9.5
   Belgium | 11.7
   Denmark | 16.8
   Finland | 16.7
   France | 12.9
   Germany | 12.8
   Greece | 2.3
   Ireland | 13.8
   Italy | 7.4
   Netherlands | 13.6
   Norway | 17.3
   Portugal | 4.4
   Spain | 7.9
   Sweden | 15
   Switzerland | 14
   United Kingdom | 16.2

EASTERN EUROPE
   Czech Republic | 2.2
   Hungary | 2.7
   Poland | 1.7
   Russian Federation | 0.78
   Ukraine | 0.2

OCEANIA
   Australia | 19.2
   New Zealand | 14.7

AFRICA
   South Africa | 1

ASIA
   China | 0.09
   India | 0.07
   Indonesia | 0.17
   Israel | 8.3
   Japan | 9.7
   Korea, Republic of | 3.7
   Philippines | 0.4
   Saudi Arabia | 2.4
   Singapore | 12.5
   Taiwan | 7.4
   Thailand | 0.9
   Turkey | 0.8

Source: Juliussen and Petska-Juliussen 1994.


 

Computerization has significantly increased in the countries of the former Soviet Union since the end of the Cold War. The Russian Federation, for example, is estimated to have increased its stock of computers from 0.3 million in 1989 to 1.2 million in 1993.

The largest concentration of computers is found in the industrialized countries, especially in North America, Australia, Scandinavia and Great Britain (Juliussen and Petska-Juliussen 1994). It was principally in these countries that the first reports of visual display unit (VDU) operators’ fears regarding health risks appeared and that the initial research aimed at determining the prevalence of health effects and identifying risk factors was undertaken. The health problems studied fall into the following categories: visual and ocular problems, musculoskeletal problems, skin problems, reproductive problems and stress.

It soon became evident that the health effects observed among VDU operators depended not only on screen characteristics and workstation layout, but also on the nature and structure of the tasks, the organization of work and the manner in which the technology was introduced (ILO 1989). Several studies have reported a higher prevalence of symptoms among female VDU operators than among male operators. According to recent studies, this difference reflects the fact that female operators typically have less control over their work than their male counterparts do, rather than true biological differences. This lack of control is thought to result in higher stress levels, which in turn produce the increased symptom prevalence observed among female VDU operators.

VDUs were first introduced on a widespread basis in the tertiary sector, where they were used essentially for office work, more specifically data entry and word processing. It is therefore not surprising that most studies of VDUs have focused on office workers. In industrialized countries, however, computerization has spread to the primary and secondary sectors. In addition, although VDUs were initially used almost exclusively by production workers, they have now spread to all organizational levels. In recent years, researchers have therefore begun to study a wider range of VDU users, in an attempt to overcome the lack of adequate scientific information on these situations.

Most computerized workstations are equipped with a VDU and a keyboard or mouse with which to transmit information and instructions to the computer. Software mediates information exchange between the operator and the computer and defines the format with which information is displayed on the screen. In order to establish the potential hazards associated with VDU use, it is first necessary to understand not only the characteristics of the VDU but also those of the other components of the work environment. In 1979, Çakir, Hart and Stewart published the first comprehensive analysis in this field.

It is useful to visualize the hardware used by VDU operators as nested components that interact with each other (IRSST 1984). These components include the terminal itself, the workstation (including work tools and furniture), the room in which the work is carried out, and the lighting. The second article in this chapter reviews the main characteristics of workstations and their lighting. Several recommendations aimed at optimizing working conditions, while taking into account individual variations and variations in tasks and work organization, are offered. Appropriate emphasis is placed on the importance of choosing equipment and furniture that allow flexible layouts. This flexibility is extremely important in light of the international competition and rapidly evolving technology that constantly drive companies to introduce innovations while simultaneously forcing them to adapt to the changes these innovations bring.

The next six articles discuss health problems studied in response to fears expressed by VDU operators. The relevant scientific literature is reviewed and the value and limitations of research results highlighted. Research in this field draws upon numerous disciplines, including epidemiology, ergonomics, medicine, engineering, psychology, physics and sociology. Given the complexity of the problems and more specifically their multifactorial nature, the necessary research has often been conducted by multidisciplinary research teams. Since the 1980s, these research efforts have been complemented by regularly organized international congresses such as Human-Computer Interaction and Work with Display Units, which provide an opportunity to disseminate research results and promote the exchange of information between researchers, VDU designers, VDU producers and VDU users.

The eighth article discusses human-computer interaction specifically. The principles and methods underlying the development and evaluation of interface tools are presented. This article will prove useful not only to production personnel but also to those interested in the criteria used to select interface tools.

Finally, the ninth article reviews international ergonomic standards as of 1995, related to the design and layout of computerized workstations. These standards have been produced in order to eliminate the hazards to which VDU operators can be exposed in the course of their work. The standards provide guidelines to companies producing VDU components, employers responsible for the purchase and layout of workstations, and employees with decision-making responsibilities. They may also prove useful as tools with which to evaluate existing workstations and identify modifications required in order to optimize operators’ working conditions.

 


Workstation Design

On workstations with visual display units

Visual displays with electronically generated images (visual display units or VDUs) represent the most characteristic element of computerized work equipment both in the workplace and in private life. A workstation may be designed to accommodate just a VDU and an input device (normally a keyboard), as a minimum; however, it can also provide room for diverse technical equipment including numerous screens, input and output devices, etc. As recently as the early 1980s, data entry was the most typical task for computer users. In many industrialized countries, however, this type of work is now performed by a relatively small number of users. More and more, journalists, managers and even executives have become “VDU users”.

Most VDU workstations are designed for sedentary work, but working in standing postures may offer some benefits for the users. Thus, there is some need for generic design guidelines applicable to simple and complex workstations used both while sitting and standing. Such guidelines will be formulated below and then applied to some typical workplaces.

Design guidelines

Workplace design and equipment selection should consider not only the needs of the actual user for a given task and the variability of users’ tasks during the relatively long life cycle of furniture (lasting 15 years or longer), but also factors related to maintenance or change of equipment. ISO Standard 9241, part 5, introduces four guiding principles to be applied to workstation design:

Guideline 1: Versatility and flexibility.

A workstation should enable its user to perform a range of tasks comfortably and efficiently. This guideline takes into account the fact that users’ tasks may vary often; thus, the chance of a universal adoption of guidelines for the workplace will be small.

Guideline 2: Fit.

The design of a workstation and its components should ensure a “fit” to be achieved for a variety of users and a range of task requirements. The concept of fit concerns the extent to which furniture and equipment can accommodate an individual user’s various needs, that is, to remain comfortable, free from visual discomfort and postural strain. If not designed for a specific user population, e.g., male European control room operators younger than 40 years of age, the workstation concept should ensure fit for the entire working population including users with special needs, e.g., handicapped persons. Most existing standards for furniture or the design of workplaces take only parts of the working population into consideration (e.g., “healthy” workers between the 5th and 95th percentile, aged between 16 and 60, as in German standard DIN 33 402), neglecting those who may need more attention.

Moreover, though some design practices are still based on the idea of an “average” user, an emphasis on individual fit is needed. With regard to workstation furniture, the fit required may be achieved by providing adjustability, designing a range of sizes, or even by custom-made equipment. Ensuring a good fit is crucial for the health and safety of the individual user, since musculoskeletal problems associated with the use of VDUs are common and significant.

Guideline 3: Postural change.

The design of the workstation should encourage movement, since static muscular load leads to fatigue and discomfort and may induce chronic musculoskeletal problems. A chair that allows easy movement of the upper half of the body, and provision of sufficient space to place and use paper documents as well as keyboards at varying positions during the day, are typical strategies for facilitating body movement while working with a VDU.

Guideline 4: Maintainability—adaptability.

The design of the workstation should take into consideration factors such as maintenance, accessibility, and the ability of the workplace to adapt to changing requirements, such as the ability to move the work equipment if a different task is to be performed. The objectives of this guideline have not received much attention in the ergonomics literature, because problems related to them are assumed to have been solved before users start to work at a workstation. In reality, however, a workstation is an ever-changing environment, and cluttered workspaces, partly or fully unsuitable for the tasks at hand, are very often not the result of their initial design process but are the outcome of later changes.

Applying the guidelines

Task analysis.

Workplace design should be preceded by a task analysis, which provides information about the primary tasks to be performed at the workstation and the equipment needed for them. In such an analysis, the priority given to information sources (e.g., paper-based documents, VDUs, input devices), the frequency of their use and possible restrictions (e.g., limited space) should be determined. The analysis should include major tasks and their relationships in space and time, visual attention areas (how many visual objects are to be used?) and the position and use of the hands (writing, typing, pointing?).

General design recommendations

Height of the work surfaces.

If fixed-height work surfaces are to be used, the minimum clearance between the floor and the surface should be greater than the sum of the popliteal height (the distance between the floor and the back of the knee) and thigh clearance height (sitting), plus allowance for footwear (25 mm for male users and 45 mm for female users). If the workstation is designed for general use, the popliteal height and thigh clearance height should be selected for the 95th percentile male population. The resulting height for the clearance under the desk surface is 690 mm for the population of Northern Europe and for North American users of European origin. For other populations, the minimum clearance needed is to be determined according to the anthropometric characteristics of the specific population.

If the legroom height is selected this way, the top of the work surfaces will be too high for a large proportion of intended users, and at least 30 per cent of them will need a footrest.

If work surfaces are adjustable in height, the required range of adjustment can be calculated from the anthropometric dimensions of female users (5th or 2.5th percentile for minimum height) and male users (95th or 97.5th percentile for maximum height). A workstation with these dimensions will in general be able to accommodate a large proportion of persons with little or no change. Such a calculation yields a range of between 600 mm and 800 mm for countries with an ethnically varied user population. Since the technical realization of this range may cause some mechanical problems, the best fit can also be achieved, for example, by combining adjustability with equipment of different sizes.
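
As a sketch of the clearance arithmetic described in the two preceding paragraphs, the short calculation below uses placeholder anthropometric values chosen to reproduce the 690 mm example; they are not figures from any particular survey and should be replaced with data for the actual user population.

    # Sketch of the clearance calculation; the anthropometric values are
    # placeholders, not survey data.
    def min_clearance_mm(popliteal_height, thigh_clearance, footwear_allowance):
        """Minimum clearance between the floor and the underside of a fixed work surface."""
        return popliteal_height + thigh_clearance + footwear_allowance

    # 95th-percentile male user (placeholder values reproducing the ~690 mm figure),
    # with the 25 mm footwear allowance for male users.
    print(min_clearance_mm(popliteal_height=490, thigh_clearance=175, footwear_allowance=25))

    # For height-adjustable surfaces, repeat the calculation for the smallest
    # (5th or 2.5th percentile female, 45 mm allowance) and the largest
    # (95th or 97.5th percentile male, 25 mm allowance) intended users;
    # the two results bound the required adjustment range.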

The minimum acceptable thickness of the work surface depends on the mechanical properties of the material. From a technical point of view, a thickness between 14 mm (durable plastic or metal) and 30 mm (wood) is achievable.

Size and form of the work surface.

The size and the form of a work surface are mainly determined by the tasks to be performed and the equipment needed for those tasks.

For data entry tasks, a rectangular surface of 800 mm by 1,200 mm provides sufficient space to place the equipment (VDU, keyboard, source documents and copy holder) properly and to rearrange the layout according to personal needs. More complex tasks may require additional space; in that case, the size of the work surface should exceed 800 mm by 1,600 mm. The depth of the surface should allow the VDU to be placed within it, which means that VDUs with cathode ray tubes may require a depth of up to 1,000 mm.

In principle, the layout displayed in figure 1 gives maximum flexibility for organizing the workspace for various tasks. However, workstations with this layout are not easy to construct. Thus, the best approximation of the ideal layout is as displayed in figure 2. This layout allows arrangements with one or two VDUs, additional input devices and so on. The minimum area of the work surface should be larger than 1.3 m2.

Figure 1. Layout of a flexible workstation that can be adapted to fit the needs of users with different tasks

VDU020F1

Figure 2. Flexible layout

VDU020F2

Arranging the workspace.

The spatial distribution of equipment in the workspace should be planned after a task analysis determining the importance and use frequency of each element has been conducted (table 1). The most frequently used visual display should be located within the central visual space, which is the shaded area of figure 3, while the most important and frequently used controls (such as the keyboard) should be located within optimum reach. In the workplace represented by the task analysis (table 1), the keyboard and the mouse are by far the most frequently handled pieces of equipment. Therefore, they should be given the highest priority within the reach area. Documents which are frequently consulted but do not need much handling should be assigned priority according to their importance (e.g., handwritten corrections). Placing them on the right-hand side of the keyboard would solve the problem, but this would create a conflict with the frequent use of the mouse which is also to be located to the right of the keyboard. Since the VDU may not need adjustment frequently, it can be placed to the right or left of the central field of vision, allowing the documents to be set on a flat document holder behind the keyboard. This is one possible, though not perfect, “optimized” solution.

Table 1. Frequency and importance of elements of equipment for a given task

VDU020T1

Figure 3. Visual workplace range

VDU020F3

Since many elements of the equipment possess dimensions comparable to corresponding parts of the human body, using various elements within one task will always be associated with some problems. It also may require some movements between parts of the workstation; hence a layout like that shown in figure 1 is important for various tasks.

In the course of the last two decades, computer power that would once have filled a ballroom has been successfully miniaturized and condensed into a simple box. However, contrary to the hopes of many practitioners that miniaturization of equipment would solve most problems associated with workplace layout, VDUs have continued to grow: in 1975, the most common screen size was 15"; in 1995 people bought 17" to 21" monitors, and no keyboard has become much smaller than those designed in 1973. Carefully performed task analyses are therefore more important than ever when designing complex workstations. Moreover, although new input devices have emerged, they have not replaced the keyboard, and they require even more space on the work surface, sometimes of substantial dimensions (e.g., graphic tablets in A3 format).

Efficient space management within the limits of a workstation, as well as within work rooms, may help in developing acceptable workstations from an ergonomic point of view, thus preventing the emergence of various health and safety problems.

Efficient space management does not mean saving space at the expense of the usability of input devices and especially vision. Using extra furniture, such as a desk return, or a special monitor-holder clamped to the desk, may appear to be a good way to save desk space; however, it may be detrimental to posture (raised arms) and vision (raising the line of vision upwards from the relaxed position). Space-saving strategies should ensure that an adequate visual distance (approximately 600 mm to 800 mm) is maintained, as well as an optimum line-of-vision, obtained from an inclination of approximately 35º from the horizontal (20º head and 15º eyes).

New furniture concepts.

Traditionally, office furniture was adapted to the needs of businesses, supposedly reflecting the hierarchy of such organizations: large desks for executives working in “ceremonial” offices at one end of the scale, and small typists’ desks for “functional” offices at the other. The basic design of office furniture did not change for decades. The situation changed substantially with the introduction of information technology, and a completely new furniture concept emerged: that of systems furniture.

Systems furniture was developed when people realized that changes in working equipment and work organization could not be matched by the limited capabilities of existing furniture to adapt to new needs. Furniture today offers a tool-box that enables the user organizations to create workspace as needed, from a minimal space for just a VDU and a keyboard up to complex workstations that can accommodate various elements of equipment and possibly also groups of users. Such furniture is designed for change and incorporates efficient and flexible cable management facilities. While the first generation of systems furniture did not do much more than add an auxiliary desk for the VDU to an existing desk, the third generation has completely broken its ties to the traditional office. This new approach offers great flexibility in designing workspaces, limited only by the available space and the abilities of organizations to use this flexibility.

Radiation

Radiation in the context of VDU applications

Radiation is the emission or transfer of radiant energy. The emission of radiant energy in the form of light as the intended purpose for the use of VDUs may be accompanied by various unwanted by-products such as heat, sound, infrared and ultraviolet radiation, radio waves or x rays, to name a few. While some forms of radiation, like visible light, may affect humans in a positive way, some emissions of energy can have negative or even destructive biological effects, especially when the intensity is high and the duration of exposure is long. Some decades ago exposure limits for different forms of radiation were introduced to protect people. However, some of these exposure limits are questioned today, and, for low frequency alternating magnetic fields, no exposure limit can be given based on levels of natural background radiation.

Radiofrequency and microwave radiation from VDUs

Electromagnetic radiation with frequencies ranging from a few kHz to 10⁹ Hz (the so-called radiofrequency, or RF, band, with wavelengths ranging from several kilometres down to 30 cm) can be emitted by VDUs; however, the total energy emitted depends on the characteristics of the circuitry. In practice, the field strength of this type of radiation is likely to be small and confined to the immediate vicinity of the source. A comparison of the strength of alternating electric fields in the range of 20 Hz to 400 kHz indicates that VDUs using cathode ray tube (CRT) technology emit, in general, higher levels than other displays.

“Microwave” radiation covers the region between 3×10⁸ Hz and 3×10¹¹ Hz (wavelengths 100 cm to 1 mm). There are no sources of microwave radiation in VDUs that emit a detectable amount of energy within this band.

Magnetic fields

Magnetic fields from a VDU originate from the same sources as alternating electric fields. Although magnetic fields are not “radiation”, alternating electric and magnetic fields cannot be separated in practice, since one induces the other. One reason why magnetic fields are discussed separately is that they are suspected to have teratogenic effects (see discussion later in this chapter).

Although the fields induced by VDUs are weaker than those induced by some other sources, such as high-voltage power lines, power plants, electrical locomotives, steel ovens and welding equipment, the total exposure produced by VDUs may be similar since people may work eight or more hours in the vicinity of a VDU but seldom near power lines or electric motors. The question of the relationship between electromagnetic fields and cancer, however, is still a matter for debate.

Optical radiation

“Optical” radiation covers visible radiation (i.e., light) with wavelengths from 380 nm (blue) to 780 nm (red), and the neighbouring bands in the electromagnetic spectrum (infrared from 3×10¹¹ Hz to 4×10¹⁴ Hz, wavelengths from 780 nm to 1 mm; ultraviolet from 8×10¹⁴ Hz to 3×10¹⁷ Hz). Visible radiation is emitted at moderate levels of intensity, comparable with that emitted by room surfaces (approximately 100 cd/m2). However, ultraviolet radiation is trapped by the glass of the tube face (CRTs) or not emitted at all (other display technologies). Levels of ultraviolet radiation, if detectable at all, stay well below occupational exposure standards, as do those of infrared radiation.

X rays

CRTs are well-known sources of x rays, while other technologies like liquid crystal displays (LCDs) do not emit any. The physical processes behind emissions of this type of radiation are well understood, and tubes and circuitry are designed to keep the emitted levels far below the occupational exposure limits, if not below detectable levels. Radiation emitted by a source can only be detected if its level exceeds the background level. In the case of x rays, as for other ionizing radiation, the background level is provided by cosmic radiation and by radiation from radioactive materials in the ground and in buildings. In normal operation, a VDU does not emit x rays exceeding the background level of radiation (50 nGy/h).

Radiation recommendations

In Sweden, the former MPR (Statens Mät och Provråd, the National Council for Metrology and Testing) organization, now SWEDAC, has worked out recommendations for evaluating VDUs. One of their main objectives was to limit any unwanted by-product to levels that can be achieved by reasonable technical means. This approach goes beyond the classical approach of limiting hazardous exposures to levels where the likelihood of an impairment of health and safety seems to be acceptably low.

At the beginning, some recommendations of MPR led to the unwanted effect of reducing the optical quality of CRT displays. However, at present, only very few products with extremely high resolution may suffer any degradation if the manufacturer attempts to comply with the MPR (now MPR-II). The recommendations include limits for static electricity, magnetic and electric alternating fields, visual parameters, etc.

Image Quality

Definitions for image quality

The term quality describes the fit of distinguishing attributes of an object for a defined purpose. Thus, the image quality of a display includes all properties of the optical representation regarding the perceptibility of symbols in general, and the legibility or readability of alphanumeric symbols. In this sense, optical terms used by tube manufacturers, like resolution or minimum spot size, describe basic quality criteria concerning the abilities of a given device for displaying thin lines or small characters. Such quality criteria are comparable with the thickness of a pencil or brush for a given task in writing or painting.

Some of the quality criteria used by ergonomists describe optical properties that are relevant for legibility, e.g., contrast, while others, like character size or stroke width, refer more to typographical features. In addition, some technology-dependent features like the flicker of images, the persistence of images, or the uniformity of contrast within a given display are also considered in ergonomics (see figure 4).

Figure 4. Criteria for image evaluation

VDU020F4

Typography is the art of composing “type”, which is not only shaping the fonts, but also selecting and setting of type. Here, the term typography is used in the first meaning.

Basic characteristics

Resolution.

Resolution is defined as the smallest discernible or measurable detail in a visual presentation. For example, the resolution of a CRT display can be expressed by the maximum number of lines that can be displayed in a given space, as usually done with the resolution of photographic films. One can also describe the minimum spot size that a device can display at a given luminance (brightness). The smaller the minimum spot, the better the device. Thus, the number of dots of minimum size (picture elements—also known as pixels) per inch (dpi) represents the quality of the device, e.g., a 72 dpi device is inferior to a 200 dpi display.

In general, the resolution of most computer displays is well below 100 dpi; some graphic displays may achieve 150 dpi, but only with limited brightness. This means that if high contrast is required, the resolution will be lower. Compared with the resolution of print, e.g., 300 dpi or 600 dpi for laser printers, the quality of VDUs is inferior. (An image at 300 dpi has 9 times more elements in the same space than one at 100 dpi.)

Addressability.

Addressability describes the number of individual points in the field that the device is capable of specifying. Addressability, which is very often confused with resolution (sometimes deliberately), is one specification given for devices: “800 x 600” means that the graphic board can address 800 points on every one of 600 horizontal lines. Since one needs at least 15 elements in the vertical direction to write numbers, letters and other characters with ascenders and descenders, such a screen can display a maximum of 40 lines of text. Today, the best available screens can address 1,600 x 1,200 points; however, most displays used in industry address 800 x 600 points or even less.

On displays of the so-called “character-oriented” devices, it is not dots (points) of the screen that are addressed but character boxes. In most such devices, there are 25 lines with 80 character positions each in the display. On these screens, each symbol occupies the same space regardless of its width. In industry the lowest number of pixels in a box is 5 wide by 7 high. This box allows both upper and lower case characters, although the descenders in “p”, “q” and “g”, and the ascenders above “Ä” or “Á” cannot be displayed. Considerably better quality is provided with the 7 x 9 box, which has been “standard” since the mid-1980s. To achieve good legibility and reasonably good character shapes, the character box size should be at least 12 x 16.
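The arithmetic linking addressability, character-box size and the amount of displayable text can be restated in a few lines. The sketch below simply reuses the figures quoted above (an 800 x 600 raster and character boxes of 5 x 7, 7 x 9 and 12 x 16 pixels) and ignores inter-character and inter-line spacing, so real devices display somewhat less.

    # How much text fits on an addressable raster, ignoring the extra spacing
    # that real devices insert between characters and between lines.
    def text_capacity(h_pixels, v_pixels, box_width, box_height):
        """Columns and lines of text that fit on the raster."""
        return h_pixels // box_width, v_pixels // box_height

    # With roughly 15 vertical pixels per text line (room for ascenders and
    # descenders), an 800 x 600 raster yields 600 // 15 = 40 lines, as noted above.
    print(600 // 15)

    for box in ((5, 7), (7, 9), (12, 16)):
        cols, lines = text_capacity(800, 600, *box)
        print(box, "->", cols, "columns x", lines, "lines")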

Flicker and refresh rate.

The images on CRTs and on some other types of VDU are not persistent images, as on paper. They only appear to be steady by taking advantage of an artefact of the eye. This, however, is not without penalty, since the screen tends to flicker if the image is not refreshed constantly. Flicker can influence both performance and comfort of the user and should always be avoided.

Flicker is the perception of brightness varying over time. The severity of flicker depends on various factors such as the characteristics of the phosphor, size and brightness of the flickering image, etc. Recent research shows that refresh rates up to 90 Hz may be needed to satisfy 99 per cent of users, while in earlier research, refresh rates well below 50 Hz were thought to be satisfactory. Depending on various features of the display, a flicker-free image may be achieved by refresh rates between 70 Hz and 90 Hz; displays with a light background (positive polarity) need a minimum of 80 Hz to be perceived as flicker-free.

Some modern devices offer an adjustable refresh rate; unfortunately, higher refresh rates are coupled with lower resolution or addressability. The ability of a device to display high “resolution” images with high refresh rates can be assessed by its video bandwidth. For displays with high quality, the maximum video bandwidth lies above 150 MHz, while some displays offer less than 40 MHz.

To achieve a flicker-free image and a high resolution with devices with lower video bandwidth, the manufacturers apply a trick that stems from commercial TV: the interlace mode. In this case, every second line on the display is refreshed with a given frequency. The result, however, is not satisfactory if static images, such as text and graphics, are displayed and the refresh rate is below 2 x 45 Hz. Unfortunately, the attempt to suppress the disturbing effect of flicker may induce some other negative effects.
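
Why high addressability and high refresh rates compete for video bandwidth can be illustrated with a rough estimate: the pixel rate of a non-interlaced display is approximately the number of addressable points multiplied by the refresh rate, increased by an allowance for blanking intervals. The 30 per cent allowance in the sketch below is an assumption for illustration only.

    # Rough estimate of the video bandwidth a non-interlaced display must sustain.
    # The 1.3 blanking-overhead factor is an assumed, illustrative value.
    def required_bandwidth_mhz(h_pixels, v_pixels, refresh_hz, blanking_overhead=1.3):
        """Approximate pixel rate in MHz for a given raster and refresh rate."""
        return h_pixels * v_pixels * refresh_hz * blanking_overhead / 1e6

    print(round(required_bandwidth_mhz(800, 600, 72)))      # about 45 MHz
    print(round(required_bandwidth_mhz(1600, 1200, 85)))    # about 212 MHz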

Jitter.

Jitter is the result of spatial instability of the image; a given picture element is not displayed at the same location on the screen after each refresh process. The perception of jitter cannot be separated from the perception of flicker.

Jitter may have its cause in the VDU itself, but it can also be induced by interaction with other equipment at the workplace, such as a printer or other VDUs or devices that generate magnetic fields.

Contrast.

Brightness contrast, the ratio of the luminance of a given object to its surroundings, represents the most important photometric feature for readability and legibility. While most standards require a minimum ratio of 3:1 (bright characters on dark background) or 1:3 (dark characters on bright background), optimum contrast is actually about 10:1, and devices of good quality achieve higher values even in bright environments.

The contrast of “active” displays is impaired when the ambient light increases, while “passive” displays (e.g., LCDs) lose contrast in dark environments. Passive displays with backlighting may offer good visibility in all the environments in which people work.
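
How reflected ambient light erodes the contrast of an emissive (CRT-type) display can be sketched with a simple additive model, in which reflected light acts as a veiling luminance added to both the characters and the background. The luminance values below are illustrative assumptions, not measurements.

    # Illustrative additive model of contrast loss under ambient light.
    def contrast_ratio(char_luminance, background_luminance, veiling_luminance=0.0):
        """Brightness contrast with reflected ambient light added to both terms (cd/m2)."""
        bright = max(char_luminance, background_luminance) + veiling_luminance
        dark = min(char_luminance, background_luminance) + veiling_luminance
        return bright / dark

    # Dark-background screen: 100 cd/m2 characters on a 5 cd/m2 background (assumed values).
    print(round(contrast_ratio(100, 5), 1))       # 20.0 : 1 in a dark room
    print(round(contrast_ratio(100, 5, 20), 1))   # 4.8 : 1 with 20 cd/m2 of reflected light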

Sharpness.

Sharpness of an image is a well-known, but still ill-defined feature. Hence, there is no agreed-upon method to measure sharpness as a relevant feature for legibility and readability.

Typographical features

Legibility and readability.

Readability refers to whether a text is understandable as a series of connected images, while legibility refers to the perception of single or grouped characters. Thus, good legibility is, in general, a precondition for readability.

Legibility of text depends on several factors: some have been investigated thoroughly, while other relevant factors like character shapes are yet to be classified. One of the reasons for this is that the human eye represents a very powerful and robust instrument, and the measures used for performance and error rates often do not help to distinguish between different fonts. Thus, to some extent, typography still remains an art rather than a science.

Fonts and readability.

A font is a family of characters, designed to yield either optimum readability on a given medium, e.g., paper, electronic display or projection display, or some desired aesthetic quality, or both. While the number of available fonts exceeds ten thousand, only a few fonts, numbered in tens, are believed to be “readable”. Since legibility and readability of a font are also affected by the experience of the reader—some “legible” fonts are believed to have become so because of decades or even centuries of use without changing their shape—the same font may be less legible on a screen than on paper, merely because its characters look “new”. This, however, is not the main reason for the poor legibility of screens.

In general, the design of screen fonts is restricted by shortcomings in the technology. Some technologies impose very narrow limits on the design of characters, e.g., LEDs or other rastered screens with limited numbers of dots per display. Even the best CRT displays can seldom compete with print (figure 5). In recent years, research has shown that reading speed and accuracy on screens are about 30 per cent lower than on paper, but whether this is due to features of the display or to other factors is not yet known.

Figure 5. Appearance of a letter at various screen resolutions and on paper (right)

VDU020F5

Characteristics with measurable effects.

The effects of some characteristics of alphanumeric representations are measurable, e.g., apparent size of the characters, height/width ratio, stroke width/size ratio, line, word and character spacing.

The apparent size of the characters, measured in minutes of arc, shows an optimum at 20' to 22'; this corresponds to about 3 mm to 3.3 mm in height under normal viewing conditions in offices. Smaller characters may lead to increased errors and visual strain, and also to more postural strain due to a restricted viewing distance. Thus, text should not be presented at an apparent size of less than 16'.
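
The conversion between apparent size in minutes of arc and physical character height follows from simple geometry: height = 2 x viewing distance x tan(angle/2). The viewing distance in the sketch below is an assumed value, chosen so that 20' and 22' reproduce the 3 mm to 3.3 mm heights quoted above.

    # Physical character height that subtends a given visual angle.
    import math

    def character_height_mm(arc_minutes, viewing_distance_mm):
        angle_rad = math.radians(arc_minutes / 60.0)
        return 2 * viewing_distance_mm * math.tan(angle_rad / 2)

    # Assumed office viewing distance of about 520 mm.
    for arc in (16, 20, 22):
        print(arc, "arc minutes ->", round(character_height_mm(arc, 520), 2), "mm")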

However, graphical representations may require text of smaller size to be displayed. To avoid errors, on the one hand, and a high visual load for the user on the other, parts of the text to be edited should be displayed in a separate window to assure good readability. Characters with an apparent size of less than 12' should not be displayed as readable text, but replaced by a rectangular grey block. Good programs allow the user to select the minimum actual size of characters that are to be displayed as alphanumerics.

The optimum height/width ratio of characters is about 1:0.8; legibility is impaired if the ratio is above 1:0.5. For legible print, and also for CRT screens, the ratio of character height to stroke width is about 10:1. However, this is only a rule of thumb; legible characters of high aesthetic value often show different stroke widths (see figure 5).

Optimal line spacing is very important for readability, and also for saving space if a given amount of information has to be displayed in a limited area. The best example of this is the daily newspaper, where an enormous amount of information is displayed on a page yet remains readable. The optimum line spacing is about 20% of the character height between the descenders of one line and the ascenders of the next; this corresponds to a distance of about 100% of the character height between the baseline of one line of text and the ascenders of the next. If the length of the line is reduced, the space between the lines may also be reduced without losing readability.

Character spacing is invariable on character-oriented screens, making them inferior in readability and aesthetic quality to displays with variable spacing. Proportional spacing, which depends on the shape and width of the characters, is preferable. However, typographical quality comparable to that of well-designed printed fonts is achievable only on a few displays and only with specific programs.

Ambient Lighting

The specific problems of VDU workstations

During the last 90 years of industrial history, theories about the lighting of our workplaces have been governed by the notion that more light will improve vision, reduce stress and fatigue, and enhance performance. “More light”, or more precisely “more sunlight”, was the slogan of the people of Hamburg, Germany, more than 60 years ago when they took to the streets to fight for better and healthier homes. In some countries, such as Denmark and Germany, workers today are entitled to some daylight at their workplaces.

The advent of information technology, with the emergence of the first VDUs in working areas, was presumably the first occasion on which workers and scientists complained about too much light in working areas. The discussion was fuelled by the easily detectable fact that most VDUs were equipped with CRTs, which have curved glass surfaces prone to veiling reflections. Such devices, sometimes called “active displays”, lose contrast when the level of ambient lighting rises. Redesigning the lighting to reduce the visual impairments caused by these effects is complicated, however, by the fact that most users also use paper-based information sources, which generally require increased levels of ambient light for good visibility.

The role of ambient light

Ambient light in the vicinity of VDU workstations serves two different purposes. First, it illuminates the workspace and working materials such as paper, telephones, etc. (the primary effect). Secondly, it illuminates the room, giving it its visible shape and giving users the impression of a light surrounding (the secondary effect). Since most lighting installations are planned according to the concept of general lighting, the same lighting sources serve both purposes. The primary effect, illuminating passive visual objects to make them visible or legible, became questionable when people started to use active screens that do not need ambient light to be visible. If the VDU is the major source of information, the remaining benefit of the room lighting is reduced to the secondary effect.

The function of VDUs, both of CRTs (active displays) and of LCDs (passive displays), is impaired by the ambient light in specific ways:

CRTs:

  • The curved glass surface reflects bright objects in the environment, and forms a kind of visual “noise”.
  • Depending on the intensity of ambient illumination, the contrast of displayed objects is reduced to a degree that readability or legibility of the objects is impaired.
  • Images on colour CRTs suffer a twofold degradation: First, the brightness contrast of all displayed objects is reduced, as on monochrome CRTs. Secondly, the colours are changed so that colour contrast is also reduced. In addition, the number of distinguishable colours is reduced.

 

LCDs (and other passive displays):

  • The reflections on LCDs cause less concern than those on CRT surfaces, since these displays have flat surfaces.
  • In contrast to active displays, LCDs (without backlight) lose contrast under low levels of ambient illumination.
  • Due to poor directional characteristics of some display technologies, visibility or legibility of displayed objects is substantially reduced if the main direction of light incidence is unfavourable.

 

The extent to which such impairments stress users or substantially reduce the visibility, readability or legibility of visual objects in real working environments varies greatly. For example, the contrast of alphanumeric characters on monochrome (CRT) displays is reduced in principle, but if the illuminance on the screen is ten times higher than in normal working environments, many screens will still show sufficient contrast for reading alphanumeric characters. On the other hand, the visibility of the colour displays of computer-aided design (CAD) systems decreases so substantially that most users prefer to dim the artificial lighting or even switch it off and, in addition, to keep daylight out of their working area.
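
The contrast loss described above can be illustrated with a simple calculation. If the screen is treated as a diffuse reflector, ambient illuminance adds a veiling luminance of roughly reflectance × illuminance / π to both characters and background, so the contrast ratio between them shrinks as illuminance rises. The reflectance and luminance values in the sketch below are illustrative assumptions, not measurements from the text.

```python
import math

def contrast_with_ambient(l_char: float, l_bg: float,
                          illuminance_lx: float, reflectance: float) -> float:
    """Luminance contrast ratio of character to background when a diffusely
    reflecting screen adds a veiling luminance of reflectance * E / pi
    (in cd/m^2) to both character and background."""
    veil = reflectance * illuminance_lx / math.pi
    return (l_char + veil) / (l_bg + veil)

# Assumed values: 100 cd/m^2 characters on a 10 cd/m^2 background,
# screen reflectance 0.2 (all purely illustrative).
for lx in (0, 300, 500, 1500):
    print(f"{lx:>5} lx -> contrast ratio {contrast_with_ambient(100, 10, lx, 0.2):.1f}")
```

With these assumed figures the contrast ratio falls from 10:1 in darkness to roughly 2:1 at 1,500 lx, which is the kind of degradation that makes users of dark-background screens want to dim the room lighting.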

Possible remedies

Changing illuminance levels.

Since 1974, numerous studies have been performed which have led to recommendations for reducing illuminance at the workplace. However, these recommendations were mostly based on studies with unsatisfactory screens. The recommended levels were between 100 lx and 1,000 lx, and generally levels well below the recommendations of existing standards for office lighting (e.g., 200 lx or 300 to 500 lx) have been discussed.

When positive screens with a luminance of approximately 100 cd/m² and some kind of efficient anti-glare treatment are used, the use of a VDU does not limit the acceptable illuminance level: users find illuminance levels of up to 1,500 lx acceptable, a value which is very rare in working areas.

If the relevant characteristics of the VDUs do not allow comfortable working under normal office lighting, as can occur when working with storage tubes, microimage readers, colour screens etc., the visual conditions can be improved substantially by introducing two-component lighting. Two-component lighting is a combination of indirect room lighting (secondary effect) and direct task lighting. Both components should be controllable by the users.

Controlling glare on screens.

Controlling glare on screens is a difficult task since almost all remedies that improve the visual conditions are likely to impair some other important characteristic of the display. Some remedies, proposed for many years, such as mesh filters, remove reflections from the displays but they also impair the legibility of the display. Low luminance luminaires cause less reflected glare on screens, but the quality of such lighting generally is judged by users to be worse than that of any other type of lighting.

For this reason, any measures (see figure 6) should be applied cautiously, and only after analysing the real cause of the annoyance or disturbance. Three possible ways of controlling glare on screens are: selecting the correct location of the screen with respect to glare sources; selecting suitable equipment or adding elements to it; and using lighting. The costs of these measures rise in the same order: it costs almost nothing to place screens in such a way as to eliminate reflected glare. However, this may not be possible in all cases; equipment-related measures are thus more expensive but may be necessary in various working environments. Glare control by lighting is often recommended by lighting specialists; it is, however, the most expensive and not the most successful way of controlling glare.

Figure 6. Strategies for controlling glare on screens

VDU020F6

The most promising measure at present is the introduction of positive screens (displays with bright background) with an additional anti-glare treatment for the glass surface. Even more successful than this will be the introduction of flat screens with a nearly matt surface and bright background; such screens, however, are not available for general use today.

Adding hoods to displays is the ergonomist's last resort for difficult work environments such as production areas, airport towers or crane operator cabins. If hoods are really needed, there are likely to be more severe lighting problems than just reflected glare on visual displays.

Changing luminaire design is mainly accomplished in two ways: first, by reducing the luminance (which corresponds to apparent brightness) of parts of the light fittings (so-called “VDU lighting”), and secondly, by introducing indirect instead of direct light. The results of current research show that indirect lighting yields substantial improvements for users, reduces visual load and is well accepted.

 


Ocular and Visual Problems

There have been a comparatively large number of studies devoted to visual discomfort in visual display unit (VDU) workers, many of which have yielded contradictory results. From one survey to another, there are discrepancies in reported prevalence of disorders ranging from practically 0 per cent to 80 per cent or more (Dainoff 1982). Such differences should not be considered too surprising because they reflect the large number of variables which can influence complaints of eye discomfort or disability.

Correct epidemiological studies of visual discomfort must take into account several population variables, such as sex, age, eye deficiencies, or use of lenses, as well as socio-economic status. The nature of the job being carried out with the VDU and the characteristics of the workstation layout and of the work organization are also important and many of these variables are interrelated.

Most often, questionnaires have been used to assess the eye discomfort of VDU operators. The prevalence of visual discomfort thus differs with the content of the questionnaires and their statistical analysis. Appropriate survey questions concern the extent of the symptoms of asthenopia suffered by VDU operators. The symptoms of this condition are well known and can include itching, redness, burning and tearing of the eyes. They are related to fatigue of the accommodative function of the eye. Sometimes these eye symptoms are accompanied by headache, with the pain located in the front portion of the head. There may also be disturbances of eye function, with symptoms such as double vision and reduced accommodative power. Visual acuity itself, however, is rarely depressed, provided the measurement is carried out at a constant pupil size.

If a survey includes general questions, such as “Do you feel well at the end of the working day?” or “Have you ever had visual problems when working with VDUs?” the prevalence of positive responses may be higher than when single symptoms related to asthenopia are evaluated.

Other symptoms may also be strongly associated with asthenopia. Pains in the neck, shoulders and arms are frequently found. There are two main reasons why these symptoms may occur together with eye symptoms. First, the muscles of the neck help to keep a steady distance between eye and screen in VDU work. Second, VDU work has two main components, the screen and the keyboard, which means that the shoulders, arms and eyes are all working at the same time and thus may be subject to similar work-related strains.

User Variables Related to Visual Comfort

Sex and Age

In the majority of surveys, women report more eye discomfort than men. In one French study, for example, 35.6% of women complained of eye discomfort, against 21.8% of men, a difference significant at the 0.05 level (Dorard 1988). In another study (Sjödren and Elfstrom 1990) it was observed that while the difference in the degree of discomfort between women (41%) and men (24%) was great, it “was more pronounced for those working 5-8 hours a day than for those working 1-4 hours a day”. Such differences are not necessarily sex-related, however, since women and men seldom share similar tasks. For example, in one computer plant studied, when women and men were both occupied in a traditional “woman's job”, both sexes displayed the same amount of visual discomfort, and when women worked in traditional “men's jobs”, they did not report more discomfort than men. In general, regardless of sex, the number of visual complaints among skilled workers who use VDUs in their jobs is much lower than the number of complaints from workers in unskilled, hectic jobs such as data entry or word processing (Rey and Bousquet 1989). Some of these data are given in table 1.

Table 1. Prevalence of ocular symptoms in 196 VDU operators according to 4 categories

| Category | Operators with symptoms (%) |
|---|---|
| Females in "female" jobs | 81 |
| Males in "female" jobs | 75 |
| Males in "male" jobs | 68 |
| Females in "male" jobs | 65 |

Source: From Dorard 1988 and Rey and Bousquet 1989.

The highest number of visual complaints usually arises in the 40-50-year-old group, probably because this is the time when the accommodative ability of the eye is changing rapidly. However, although older operators are perceived as having more visual complaints than younger workers, and presbyopia (age-related loss of accommodation) is consequently often cited as the main visual defect associated with visual discomfort at VDU workstations, it is important to consider that there is also a strong association between age and the level of skill acquired in VDU work. There is usually a higher proportion of older women among unskilled female VDU operators, and younger male workers tend more often to be employed in skilled jobs. Thus, before broad generalizations about age and VDU-associated visual problems can be made, the figures should be adjusted to take into account the comparative nature and skill level of the work being done at the VDU, for instance as shown in the sketch below.
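
One simple way of making such an adjustment is direct standardization: the complaint prevalences observed within each skill stratum are re-weighted by a common reference distribution of jobs before age groups are compared. The strata and numbers below are entirely hypothetical and serve only to show the calculation; they are not data from the studies cited.

```python
def standardized_prevalence(stratum_prevalence: dict, reference_weights: dict) -> float:
    """Directly standardized prevalence: a weighted average of the
    stratum-specific prevalences using a common reference distribution."""
    total = sum(reference_weights.values())
    return sum(stratum_prevalence[s] * w for s, w in reference_weights.items()) / total

# Hypothetical prevalences of visual complaints by job skill level.
older_operators = {"unskilled": 0.60, "skilled": 0.25}
younger_operators = {"unskilled": 0.55, "skilled": 0.22}
reference_mix = {"unskilled": 0.5, "skilled": 0.5}   # assumed common job mix

print(standardized_prevalence(older_operators, reference_mix))    # 0.425
print(standardized_prevalence(younger_operators, reference_mix))  # 0.385
```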

Eye defects and corrective lenses

In general, about half of all VDU operators display some kind of eye deficiency, and most of these people use prescription lenses of one type or another. VDU user populations often do not differ from the working population as far as eye defects and eye correction are concerned. For example, one survey (Rubino 1990) conducted among Italian VDU operators revealed that roughly 46% had normal vision and 38% were nearsighted (myopic), figures consistent with those observed among Swiss and French VDU operators (Meyer and Bousquet 1990). Estimates of the prevalence of eye defects vary according to the assessment technique used (Çakir 1981).

Most experts believe that presbyopia itself does not have a significant influence on the incidence of asthenopia (persistent tiredness of the eyes). Rather, the use of unsuitable lenses is likely to induce eye fatigue and discomfort. There is some disagreement about the effects in shortsighted young persons: Rubino observed no effect, while, according to Meyer and Bousquet (1990), myopic operators readily complain of undercorrection for the distance between eye and screen (usually 70 cm). Rubino has also proposed that people who suffer from a deficiency in eye coordination may be more likely to suffer visual complaints in VDU work.

One interesting observation that resulted from a French study involving a thorough eye examination by ophthalmologists of 275 VDU operators and 65 controls was that 32% of those examined could have their vision improved by good correction. In this study 68% had normal vision, 24% were shortsighted and 8% farsighted (Boissin et al., 1991). Thus, although industrialized countries are, in general, well equipped to provide excellent eye care, eye correction is probably either completely neglected or inappropriate for those working at a VDU. An interesting finding in this study was that more cases of conjunctivitis were found in the VDU operators (48%) than in the controls. Since conjunctivitis and poor eyesight are correlated, this implies that better eye correction is needed.

Physical and Organizational Factors Affecting Visual Comfort

It is clear that in order to assess, correct and prevent visual discomfort in VDU work an approach which takes into account the many different factors described here and elsewhere in this chapter is essential. Fatigue and eye discomfort can be the result of individual physiological difficulties in normal accommodation and convergence in the eye, from conjunctivitis, or from wearing glasses that are poorly corrected for distance. Visual discomfort can be related to the workstation itself and can also be linked to work organization factors such as monotony and time spent on the job with and without a break. Inadequate lighting, reflections on screen, flicker and too much luminance of characters can also increase the risk of eye discomfort. Figure 1 illustrates some of these points.

Figure 1. Factors that increase the risk of eye fatigue among VDU workers

VDU030F1

Many of the appropriate characteristics of workstation layout are described more fully earlier in the chapter.

The best viewing distance for visual comfort which still leaves enough space for the keyboard appears to be about 65 cm. However, according to many experts, such as Akabri and Konz (1991), ideally “it would be best to determine an individual's dark focus so workstations could be adjusted to specific individuals rather than population means”. As far as the characters themselves are concerned, a good general rule of thumb is “bigger is better”. Usually letter size increases with the size of the screen, and a compromise is always struck between the readability of the letters and the number of words and sentences that can be displayed on the screen at one time. The VDU itself should be selected according to the task requirements and should maximize user comfort.
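
A rough way of turning “bigger is better” into a number is to relate character height to the visual angle it subtends at the roughly 65 cm viewing distance mentioned above. The 20-22 arc-minute target used in the sketch is a commonly cited ergonomic rule of thumb, not a figure from this article, and should be treated as an assumption.

```python
import math

def char_height_mm(viewing_distance_mm: float, visual_angle_arcmin: float) -> float:
    """Character height that subtends a given visual angle at a given distance."""
    angle_rad = math.radians(visual_angle_arcmin / 60.0)
    return 2.0 * viewing_distance_mm * math.tan(angle_rad / 2.0)

# At the ~65 cm viewing distance discussed above, an assumed 20-22 arc-minute
# target corresponds to a character height of roughly 3.8-4.2 mm.
for arcmin in (20, 22):
    print(f"{arcmin} arcmin -> {char_height_mm(650, arcmin):.1f} mm")
```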

In addition to the design of the workstation and the VDU itself is the need to allow the eyes to rest. This is particularly important in unskilled jobs, in which the freedom of “moving around” is generally much lower than in skilled jobs. Data entry work or other activities of the same type are usually performed under time pressure, sometimes even accompanied by electronic supervision, which times operator output very precisely. In other interactive VDU jobs which involve using databases, operators are obliged to wait for a response from the computer and thus must remain at their posts.

Flicker and eye discomfort

Flicker is the change in brightness of the characters on the screen over time and is described more fully above. When characters are not refreshed frequently enough, some operators are able to perceive flicker. Younger workers may be more affected, since their flicker fusion frequency is higher than that of older people (Grandjean 1987). Flicker perception increases with brightness, which is one reason why many VDU operators do not commonly make use of the whole range of screen brightness available. In general, a VDU with a refresh rate of at least 70 Hz should “fit” the visual needs of a large proportion of VDU operators.

The sensitivity of the eyes to flicker is enhanced by increased brightness and contrast between the fluctuating area and the surrounding area. The size of the fluctuating area also affects sensitivity because the larger the area to be viewed, the larger the area of the retina that is stimulated. The angle at which the light from the fluctuating area strikes the eye and the amplitude of modulation of the fluctuating area are other important variables.

The older the VDU user, the less sensitive the eye, because older eyes are less transparent and the retina is less excitable; the same is true of people who are ill. Laboratory findings such as these help to explain observations made in the field. For example, it has been found that operators are disturbed by flicker from the screen when reading paper documents (Isensee and Bennett, as quoted in Grandjean 1987), and the combination of fluctuation from the screen and fluctuation from fluorescent lighting has been found to be particularly disturbing.

Lighting

The eye functions best when the contrast between the visual target and its background is maximum, as for example, with a black letter on white paper. Efficiency is further enhanced when the outer edge of the visual field is exposed to slightly lower levels of brightness. Unfortunately, with a VDU the situation is just the reverse of this, which is one reason that so many VDU operators try to protect their eyes against excess light.

Inappropriate contrasts in brightness and unpleasant reflections produced by fluorescent light, for example, can lead to visual complaints among VDU operators. In one study, 40% of 409 VDU workers made such complaints (Läubli et al., 1989).

In order to minimize problems with lighting, just as with viewing distances, flexibility is important. One should be able to adapt light sources to the visual sensitivity of individuals; workplaces should be designed so that individuals can adjust their own lighting.

Job characteristics

Jobs which are carried out under time pressure, especially if they are unskilled and monotonous, are often accompanied by sensations of general fatigue, which, in turn, can give rise to complaints of visual discomfort. In the authors’ laboratory, it was found that visual discomfort increased with the number of accommodative changes the eyes needed to make to carry out the task. This occurred more often in data entry or word processing than in tasks which involved dialogues with the computer. Jobs which are sedentary and provide little opportunity for moving around also provide less opportunity for muscular recovery and hence enhance the likelihood of visual discomfort.

Job organization

Eye discomfort is just one aspect of the physical and mental problems that can be associated with many jobs, as described more fully elsewhere in this chapter. It is not surprising, therefore, to find a high correlation between the level of eye discomfort and job satisfaction. Although night work is still not widely practised in offices, its effects on eye discomfort in VDU work may well be unexpected. Few data are yet available to confirm this, but on the one hand eye capacity during the night shift may be somewhat depressed, and thus more vulnerable to VDU effects, while on the other hand the lighting environment is easier to adjust without disturbance from natural light, provided that reflections from fluorescent lamps on dark windows are eliminated.

Individuals who use VDUs to work at home should ensure that they provide themselves with the appropriate equipment and lighting conditions to avoid the adverse environmental factors found in many formal workplaces.

Medical Surveillance

No single, particular hazardous agent has been identified as a visual risk. Asthenopia among VDU operators appears rather to be an acute phenomenon, although there is some belief that sustained strain of accommodation may occur. Unlike many other chronic conditions, maladjustment to VDU work is usually noticed very soon by the “patient”, who may be more likely to seek medical care than workers in other workplace situations. After such visits, spectacles are often prescribed, but unfortunately they are sometimes ill adapted to the needs of the workplace described here. It is essential that practitioners be specially trained to care for patients who work with VDUs. A special course has been created at the Swiss Federal Institute of Technology in Zurich for just this purpose.

The following factors must be taken into consideration in caring for VDU workers. In comparison with traditional office work, the distance between the eye and the visual target, the screen, is usually 50 to 70 cm and cannot be changed. Lenses should therefore be prescribed which take this steady viewing distance into account. Bifocal lenses are inappropriate because they require a painful extension of the neck in order for the user to read the screen. Multifocal lenses are better, but because they limit rapid eye movements their use can lead to more head movements, producing additional strain.

Eye correction should be as precise as possible, taking into account the slightest visual defects (e.g., astigmatism) and also the viewing distance of the VDU. Tinted glasses which reduce the illumination level in the centre of the visual field should not be prescribed. Partially tinted spectacles are not useful, since eyes at the workplace are always moving in all directions. Offering special spectacles to employees, however, should not mean that further complaints of visual discomfort from workers may be ignored since the complaints could be justified by poor ergonomic design of the workstation and equipment.

It should be said, finally, that the operators who suffer the most discomfort are those who need raised illumination levels for detail work and who, at the same time, have a higher glare sensitivity. Operators with undercorrected eyes will thus display a tendency to get closer to the screen for more light and will be in this way more exposed to flicker.

Screening and secondary prevention

The usual principles of secondary prevention in public health are applicable to the working environment. Screening therefore should be targeted towards known hazards and is most useful for diseases with long latency periods. Screening should take place prior to any evidence of preventable disease and only tests with high sensitivity, high specificity and high predictive power are useful. The results of screening examinations can be used to assess the extent of exposure both of individuals and of groups.
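
The practical meaning of “high sensitivity, high specificity and high predictive power” can be illustrated with the standard calculation below. The test characteristics and prevalence figures are hypothetical and are chosen only to show how the predictive value of a positive result collapses when the condition being screened for is rare.

```python
def positive_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """Probability that a positive screening result is a true positive."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical test: 90% sensitivity, 95% specificity.
for prevalence in (0.20, 0.01):
    ppv = positive_predictive_value(0.90, 0.95, prevalence)
    print(f"prevalence {prevalence:.0%}: PPV = {ppv:.0%}")
```

With these assumed figures the predictive value of a positive result falls from about 82% at 20% prevalence to about 15% at 1% prevalence, which is why screening is worthwhile only for well-defined hazards and sufficiently common, preventable conditions.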

Since no severe adverse effects on the eye have ever been identified in VDU work, and since no hazardous levels of radiation associated with visual problems have been detected, it has been agreed that there is no indication that work with VDUs “will cause disease or damage to the eye” (WHO 1987). The ocular fatigue and eye discomfort that have been reported in VDU operators are not the kind of health effect that generally forms the basis for medical surveillance in a secondary prevention programme.

However, pre-employment visual medical examinations of VDU operators are widespread in most member countries of the International Labour Organization, a requirement supported by trade unions and employers (ILO 1986). In many European countries (including France, the Netherlands and the United Kingdom), medical surveillance for VDU operators, including ocular tests, has also been instituted subsequent to the issuing of Directive 90/270/EEC on work with display screen equipment.

If a programme for the medical surveillance of VDU operators is to be set up, the following issues must be addressed in addition to deciding on the contents of the screening programme and the appropriate testing procedures:

  • What is the meaning of the surveillance and how should its results be interpreted?
  • Are all VDU operators in need of the surveillance?
  • Are any ocular effects which are observed appropriate for a secondary prevention programme?

 

Most routine visual screening tests available to the occupational physician have poor sensitivity and predictive power for eye discomfort associated with VDU work (Rey and Bousquet 1990). Snellen visual testing charts are particularly inappropriate for the measurement of visual acuity of VDU operators and for predicting their eye discomfort. In Snellen charts the visual targets are dark, precise letters on a clear, well illuminated background, not at all like typical VDU viewing conditions. Indeed, because of the inapplicability of other methods, a testing procedure has been developed by the authors (the C45 device) which simulates the reading and lighting conditions of a VDU workplace. Unfortunately, this remains for the time being a laboratory set-up. It is important to realise, however, that screening examinations are not a substitute for a well-designed workplace and good work organization.

Ergonomic Strategies to Reduce Visual Discomfort

Although systematic ocular screening and systematic visits to the eye specialist have not been shown to be effective in reducing visual symptoms, they have been widely incorporated into occupational health programmes for VDU workers. A more cost-effective strategy could include an intensive ergonomic analysis of both the job and the workplace. Workers with known ocular disease should try to avoid intensive VDU work as much as possible. Poorly corrected vision is another potential cause of operator complaints and should be investigated if such complaints occur. Improving the ergonomics of the workplace, which could include providing a low reading angle to avoid a decreased blinking rate and neck extension, and providing the opportunity to rest and to move about on the job, are other effective strategies. New devices with separate keyboards allow distances to be adjusted, and the VDU itself may be made movable, for example by placing it on a mobile arm. Eye strain is then reduced by permitting changes in viewing distance which match the correction of the eyes. Often the steps taken to reduce muscular pain in the arms, shoulders and back will at the same time allow the ergonomist to reduce visual strain. In addition to the design of equipment, the quality of the air can affect the eye: dry air leads to dry eyes, so appropriate humidification is needed.

In general the following physical variables should be addressed:

  • the distance between the screen and the eye
  • the reading angle, which determines the position of the head and the neck
  • the distance to walls and windows
  • the quality of paper documents (often very poor)
  • luminances of screen and surroundings (for artificial and natural lighting)
  • flicker effects
  • glare sources and reflections
  • the humidity level.

 

Among the organizational variables that should be addressed in improving visual working conditions are:

  • content of the task, responsibility level
  • time schedules, night work, duration of work
  • freedom to “move around”
  • full time or part time jobs, etc.

 


Reproductive Hazards - Experimental Data

The purpose of the experimental studies described here, which use animal models, is in part to answer the question of whether extremely low frequency (ELF) magnetic field exposures at levels similar to those around VDU workstations can be shown to affect reproductive functions in animals in a manner that can be equated with a human health risk.

The studies considered here are limited to in vivo studies (those performed on live animals) of reproduction in mammals exposed to very low frequency (VLF) magnetic fields with appropriate frequencies, excluding, therefore, studies on the biological effects in general of VLF or ELF magnetic fields. These studies on experimental animals fail to demonstrate unequivocally that magnetic fields, such as are found around VDUs, affect reproduction. Moreover, as can be seen from considering the experimental studies described in some detail below, the animal data do not shed a clear light on possible mechanisms for human reproductive effects of VDU use. These data complement the relative absence of indications of a measurable effect of VDU use on reproductive outcomes from human population studies.

Studies of Reproductive Effects of VLF Magnetic Fields in Rodents

VLF magnetic fields similar to those around VDUs have been used in five teratological studies, three with mice and two with rats. The results of these studies are summarized in table 1. Only one study (Tribukait and Cekan 1987) found an increased number of foetuses with external malformations. Stuchly et al. (1988) and Huuskonen, Juutilainen and Komulainen (1993) both reported a significant increase in the number of foetuses with skeletal abnormalities, but only when the analysis was based on the foetus as the unit of observation. The study by Wiley and Corey (1992) did not demonstrate any effect of magnetic field exposure on placental resorption or other pregnancy outcomes; placental resorptions roughly correspond to spontaneous abortions in humans. Finally, Frölén and Svedenstål (1993) performed a series of five experiments, in each of which exposure began on a different day of pregnancy. Among the first four experimental subgroups (exposure starting on day 1 to day 5), there were significant increases in the number of placental resorptions among exposed females. No such effect was seen in the experiment in which exposure started on day 7, which is illustrated in figure 1.

Table 1. Teratological studies with rats or mice exposed to 18-20 kHz sawtooth-shaped magnetic fields

| Study | Subject¹ | Frequency | Amplitude² | Exposure duration³ | Results⁴ |
|---|---|---|---|---|---|
| Tribukait and Cekan (1987) | 76 litters of mice (C3H) | 20 kHz | 1 μT, 15 μT | Exposed to day 14 of pregnancy | Significant increase in external malformations, only if the foetus is used as the unit of observation and only in the first half of the experiment; no difference as to resorption or foetal death. |
| Stuchly et al. (1988) | 20 litters of rats (SD) | 18 kHz | 5.7 μT, 23 μT, 66 μT | Exposed throughout pregnancy | Significant increase in minor skeletal malformations, only if the foetus is used as the unit of observation; some decrease in blood cell concentrations; no difference as to resorption or other types of malformations. |
| Wiley and Corey (1992) | 144 litters of mice (CD-1) | 20 kHz | 3.6 μT, 17 μT, 200 μT | Exposed throughout pregnancy | No difference as to any observed outcome (malformation, resorption, etc.). |
| Frölén and Svedenstål (1993) | In total 707 litters of mice (CBA/S) | 20 kHz | 15 μT | Beginning on various days of pregnancy in different subexperiments | Significant increase in resorptions, only if exposure starts on day 1 to day 5; no difference as to malformations. |
| Huuskonen, Juutilainen and Komulainen (1993) | 72 litters of rats (Wistar) | 20 kHz | 15 μT | Exposed to day 12 of pregnancy | Significant increase in minor skeletal malformations, only if the foetus is used as the unit of observation; no difference as to resorption or other types of malformations. |

¹ Total number of litters in the maximum exposure category.

² Peak-to-peak amplitude.

³ Exposure varied from 7 to 24 hours/day in the different experiments.

⁴ “Difference” refers to statistical comparisons between exposed and unexposed animals; “increase” refers to a comparison of the highest exposed group vs. the unexposed group.

 

Figure 1. The percentage of female mice with placental resorptions in relation to exposure

VDU040F1

The interpretations given by the researchers to their findings include the following. Stuchly and co-workers reported that the abnormalities they observed were not unusual and ascribed the result to “common noise that appears in every teratological evaluation”. Huuskonen et al., whose findings were similar to those of Stuchly et al., were less negative in their appraisal and considered their result to be more indicative of a real effect, but they too remarked in their report that the abnormalities were “subtle and would probably not impair the later development of the foetuses”. In discussing their finding of effects in the early-onset exposures but not the later ones, Frölén and Svedenstål suggest that the effects observed could relate to early effects on reproduction, before the fertilized egg is implanted in the uterus.

In addition to the reproductive outcomes, a decrease in white and red blood cells was noted in the highest exposure group in the study by Stuchly and co-workers (blood cell counts were not analysed in the other studies). The authors, while suggesting that this could indicate a mild effect of the fields, also noted that the variations in blood cell counts were “within the normal range”. The absence of histological data and of any effects on bone marrow cells made these latter findings difficult to evaluate.

Interpretation and comparison of studies 

Few of the results described here are consistent with one another. As stated by Frölén and Svedenstål, “qualitative conclusions with regard to corresponding effects in human beings and test animals may not be drawn”. Let us examine some of the reasoning that could lead to such a conclusion.

The Tribukait findings are generally not considered to be conclusive for two reasons. First, the experiment only yielded positive effects when the foetus was used as the unit of observation for statistical analysis, whereas the data themselves actually indicated a litter-specific effect. Second, there is a discrepancy in the study between the findings in the first and the second part, which implies that the positive findings may be the result of random variations and/or uncontrolled factors in the experiment.

Epidemiological studies investigating specific malformations have not observed an increase in skeletal malformations among children born of mothers working with VDUs—and thus exposed to VLF magnetic fields. For these reasons (foetus-based statistical analysis, abnormalities probably not health-related, and lack of concordance with epidemiological findings), the results—on minor skeletal malformations—are not such as to provide a firm indication of a health risk for humans.


Technical Background

Units of observation

When statistically evaluating studies on mammals, consideration must be given to at least one aspect of the (often unknown) mechanism. If the exposure affects the mother, which in turn affects the foetuses in the litter, then the status of the litter as a whole should be used as the unit of observation (the effect which is being observed and measured), since the individual outcomes among litter-mates are not independent. If, on the other hand, it is hypothesized that the exposure acts directly and independently on the individual foetuses within the litter, then the foetus can appropriately be used as the unit for statistical evaluation. The usual practice is to count the litter as the unit of observation, unless evidence is available that the effect of the exposure on one foetus is independent of the effect on the other foetuses in the litter. A sketch of the two approaches follows.
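
The statistical consequence of this choice can be sketched as follows: treating every foetus as independent inflates the effective sample size when outcomes cluster within litters, whereas a litter-level analysis compares one summary value per litter. The counts below are invented purely to contrast the two approaches and are not taken from any of the studies discussed.

```python
# Sketch comparing foetus-level and litter-level analyses on invented data.
from scipy.stats import fisher_exact, mannwhitneyu

# (affected foetuses, litter size) for hypothetical exposed and control litters.
exposed = [(3, 10), (0, 9), (4, 11), (0, 10), (3, 9)]
control = [(1, 10), (1, 11), (0, 9), (1, 10), (1, 10)]

# Foetus as the unit: pool all foetuses into a single 2 x 2 table.
def pooled(litters):
    affected = sum(a for a, n in litters)
    total = sum(n for _, n in litters)
    return [affected, total - affected]

table = [pooled(exposed), pooled(control)]
print("foetus-level Fisher exact p =", fisher_exact(table)[1])

# Litter as the unit: one proportion per litter, compared between groups.
exposed_props = [a / n for a, n in exposed]
control_props = [a / n for a, n in control]
print("litter-level Mann-Whitney p =",
      mannwhitneyu(exposed_props, control_props, alternative="two-sided")[1])
```

The foetus-level test treats 99 foetuses as independent observations, while the litter-level test compares only five values per group; with clustered outcomes the former can suggest a degree of significance that the latter does not support.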


Wiley and Corey (1992) did not observe a placental resorption effect similar to that seen by Frölén and Svedenstål. One reason put forward for this discrepancy is that different strains of mice were used, and the effect could be specific to the strain used by Frölén and Svedenstål. Apart from such a speculated strain effect, it is also noteworthy that both the females exposed to 17 μT fields and the controls in the Wiley study had resorption frequencies similar to those of the exposed females in the corresponding Frölén series, whereas most non-exposed groups in the Frölén study had much lower frequencies (see figure 1). One hypothetical explanation is that a higher stress level among the mice in the Wiley study resulted from the handling of the animals during the three-hour period without exposure. If this were the case, an effect of the magnetic field could have been “drowned” by a stress effect. While it is difficult to dismiss such a theory definitively from the data provided, it does appear somewhat far-fetched. Furthermore, a “real” effect attributable to the magnetic field would be expected to become observable above such a constant stress effect as the magnetic field exposure increased; no such trend was observed in the Wiley data.

The Wiley study reports on environmental monitoring and on rotation of cages to eliminate the effects of uncontrolled factors that might vary within the room environment itself, as magnetic fields can, while the Frölén study does not. Thus, control of “other factors” is at least better documented in the Wiley study, and uncontrolled factors that were not randomized could conceivably offer some explanation. It is also interesting to note that the lack of effect observed in the day 7 series of the Frölén study appears to be due not to a decrease in the exposed groups, but to an increase in the control group. Variations in the control group are therefore probably important to consider when comparing the disparate results of the two studies.

Studies of Reproductive Effects of ELF Magnetic Fields in Rodents

Several studies have been performed, mostly on rodents, with 50-80 Hz fields. Details of six of these studies are shown in table 2. While other studies of ELF fields have been carried out, their results have not appeared in the published scientific literature and are generally available only as conference abstracts. In general the findings are of “random effects”, “no differences observed” and so on. One study, however, found a reduced number of external abnormalities in CD-1 mice exposed to a 20 mT, 50 Hz field, although the authors suggested that this might reflect a selection problem. A few studies have been reported on species other than rodents (rhesus monkeys and cows), again apparently without observations of adverse exposure effects.

Table 2. Teratological studies with rats or mice exposed to 15-60 Hz sinusoidal or square pulsed magnetic fields

| Study | Subject¹ | Frequency | Amplitude | Waveform | Exposure duration | Results |
|---|---|---|---|---|---|---|
| Rivas and Rius (1985) | 25 Swiss mice | 50 Hz | 83 μT, 2.3 mT | Pulsed, 5 ms pulse duration | Before and during pregnancy and offspring growth; total 120 days | No significant differences at birth in any measured parameter; decreased male body weight when adult. |
| Zecca et al. (1985) | 10 SD rats | 50 Hz | 5.8 mT | | Day 6-15 of pregnancy, 3 h/day | No significant differences. |
| Tribukait and Cekan (1987) | 35 C3H mice | 50 Hz | 1 μT, 15 μT (peak) | Square wave-forms, 0.5 ms duration | Day 0-14 of pregnancy, 24 h/day | No significant differences. |
| Salzinger and Freimark (1990) | 41 offspring of SD rats; only male pups used | 60 Hz | 100 μT (rms); also electric field exposure | Uniform, circularly polarized | Day 0-22 of pregnancy and 8 days after birth, 20 h/day | Lower increase in operant response during training commencing at 90 days of age. |
| McGivern and Sokol (1990) | 11 offspring of SD rats; only male pups used | 15 Hz | 800 μT (peak) | Square wave-forms, 0.3 ms duration | Day 15-20 of pregnancy, 2 × 15 min/day | Territorial scent-marking behaviour reduced at 120 days of age; some organ weights increased. |
| Huuskonen et al. (1993) | 72 Wistar rats | 50 Hz | 12.6 μT (rms) | Sinusoidal | Day 0-12 of pregnancy, 24 h/day | More foetuses per litter; minor skeletal malformations. |

¹ Number of animals (mothers) in the highest exposure category, unless otherwise noted.

 

As can be seen from table 2, a wide range of results was obtained. These studies are more difficult to summarize because there are so many variations in exposure regimens, in the endpoints under study and in other factors. The foetus (or the surviving, “culled” pup) was the unit used in most studies. Overall, it is clear that these studies do not show any gross teratogenic effect of magnetic field exposure during pregnancy. As remarked above, “minor skeletal anomalies” do not appear to be of importance when evaluating human risks. The behavioural study results of Salzinger and Freimark (1990) and McGivern and Sokol (1990) are intriguing, but they do not form a basis for indications of human health risks at a VDU workstation, either from the standpoint of procedures (use of the foetus as the unit and, for McGivern, a different frequency) or of effects.

Summary of specific studies

Behavioural retardation 3-4 months after birth was observed in the offspring of exposed females by Salzinger and Freimark and by McGivern and Sokol. These studies appear to have used the individual offspring as the statistical unit, which may be questionable if the stipulated effect is due to an effect on the mother. The Salzinger study also exposed the pups during the first 8 days after birth, so that it involved more than reproductive hazards. A limited number of litters was used in both studies. Furthermore, these studies cannot be considered to confirm each other's findings, since the exposures varied greatly between them, as can be seen in table 2.

Apart from a behavioural change in the exposed animals, the McGivern study noted an increased weight of some male sex organs: the prostate, the seminal vesicles and the epididymis (all parts of the male reproductive system). The authors speculate whether this could be linked to stimulation of some enzyme levels in the prostate, since magnetic field effects on some enzymes present in the prostate have been observed at 60 Hz.

Huuskonen and co-workers (1993) noted an increase in the number of foetuses per litter (10.4 foetuses/litter in the 50 Hz exposed group vs. 9 foetuses/litter in the control group). The authors, who had not observed similar trends in other studies, downplayed the importance of this finding by noting that it “may be incidental rather than an actual effect of the magnetic field”. In 1985 Rivas and Rius reported a different finding, with a slightly lower number of live births per litter among exposed versus non-exposed groups; the difference was not statistically significant. They carried out the other aspects of their analyses on both a “per foetus” and a “per litter” basis. The noted increase in minor skeletal malformations was seen only in the analysis using the foetus as the unit of observation.

Recommendations and Summary

Despite the relative lack of positive, consistent data demonstrating either human or animal reproductive effects, attempts at replications of the results of some studies are still warranted. These studies should attempt to reduce the variations in exposures, methods of analysis and strains of animals used.

In general, the experimental studies performed with 20 kHz magnetic fields have provided somewhat varied results. If the litter-based analysis procedure and strict statistical hypothesis testing are adhered to, no effects have been shown in rats (although similar non-significant findings were made in both rat studies). In mice the results have been varied, and no single coherent interpretation of them appears possible at present. For 50 Hz magnetic fields the situation is somewhat different: epidemiological studies relevant to this frequency are scarce, and one such study did indicate a possible risk of miscarriage, whereas the experimental animal studies have not produced results with similar outcomes. Overall, the results do not establish an effect of extremely low frequency magnetic fields from VDUs on the outcome of pregnancies. The totality of results thus fails to suggest an effect of VLF or ELF magnetic fields from VDUs on reproduction.

 


Reproductive Effects - Human Evidence

The safety of visual display units (VDUs) in terms of reproductive outcomes has been questioned since the widespread introduction of VDUs in the work environment during the 1970s. Concern for adverse pregnancy outcomes was first raised as a result of numerous reports of apparent clusters of spontaneous abortion or congenital malformations among pregnant VDU operators (Blackwell and Chang 1988). While these reported clusters were determined to be no more than what could be expected by chance, given the widespread use of VDUs in the modern workplace (Bergqvist 1986), epidemiologic studies were undertaken to explore this question further.

From the published studies reviewed here, a safe conclusion would be that, in general, working with VDUs does not appear to be associated with an excess risk of adverse pregnancy outcomes. However, this generalized conclusion applies to VDUs as they are typically found and used in offices by female workers. If, for some technical reason, there existed a small proportion of VDUs which did induce a strong magnetic field, then this general conclusion of safety could not be applied to that special situation, since it is unlikely that the published studies would have had the statistical power to detect such an effect. In order to make generalizable statements of safety, it is essential that future studies of the risk of adverse pregnancy outcomes associated with VDUs use more refined exposure measures.

The most frequently studied reproductive outcomes have been:

  • Spontaneous abortion (10 studies): usually defined as a hospitalized unintentional cessation of pregnancy occurring before 20 weeks of gestation.
  • Congenital malformation (8 studies): many different types were assessed, but in general, they were diagnosed at birth.
  • Other outcomes (8 studies) such as low birthweight (under 2,500 g), very low birthweight (under 1,500 g), and fecundability (time to pregnancy from cessation of birth control use) have also been assessed. See table 1.

 

Table 1. VDU use as a factor in adverse pregnancy outcomes

| Study | Outcome | Design | Cases | Controls | Exposure assessment | OR/RR (95% CI) | Conclusion |
|---|---|---|---|---|---|---|---|
| Kurppa et al. (1986) | Congenital malformation | Case-control | 1,475 | 1,475 (same age, same delivery date) | Job titles, face-to-face interviews | 235 cases, 255 controls; 0.9 (0.6-1.2) | No evidence of increased risk among women who reported exposure to VDUs or among women whose job titles indicated possible exposure. |
| Ericson and Källén (1986) | Spontaneous abortion; infant death; malformation; very low birthweight | Case-case | 412; 22; 62; 26 | 1,032 (similar age, same registry) | Job titles | 1.2 (0.6-2.3) (pooled outcomes) | The effect of VDU use was not statistically significant. |
| Westerholm and Ericson (1986) | Stillbirth; low birthweight; prenatal mortality; malformations | Cohort | 7; 13; 43 | 4,117 | Job titles | 1.1 (0.8-1.4); NR (NS); NR (NS); 1.9 (0.9-3.8) | No excesses were found for any of the studied outcomes. |
| Bjerkedal and Egenaes (1986) | Stillbirth; first-week death; prenatal death; low birthweight; very low birthweight; preterm birth; multiple birth; malformations | Cohort | 17; 8; 25; 46; 10; 97; 16; 71 | 1,820 | Employment records | NR (NS) for all outcomes | No indication that the introduction of VDUs in the centre had led to any increase in the rate of adverse pregnancy outcomes. |
| Goldhaber, Polen and Hiatt (1988) | Spontaneous abortion; malformations | Case-control | 460; 137 | 1,123 (20% of all normal births, same region and time) | Postal questionnaire | 1.8 (1.2-2.8); 1.4 (0.7-2.9) | Statistically increased risk of spontaneous abortion with VDU exposure; no excess risk of congenital malformations associated with VDU exposure. |
| McDonald et al. (1988) | Spontaneous abortion; stillbirth; malformations; low birthweight | Cohort | 776; 25; 158; 228 | | Face-to-face interviews | 1.19 (1.09-1.38) current / 0.97 previous; 0.82 current / 0.71 previous; 0.94 current / 1.12 (0.89-1.43) previous; 1.10 | No increase in risk was found among women exposed to VDUs. |
| Nurminen and Kurppa (1988) | Threatened abortion; gestation 40 weeks; low birthweight; placental weight; hypertension | Cohort | 239; 96; 57; NR; NR | | Face-to-face interviews | 0.9; VDU 30.5% vs. non-VDU 43.8%; VDU 25.4% vs. non-VDU 23.6%; other comparisons NR | The crude and adjusted rate ratios did not show statistically significant effects of working with VDUs. |
| Bryant and Love (1989) | Spontaneous abortion | Case-control | 344 | 647 (same hospital, age, last menstrual period, parity) | Face-to-face interviews | 1.14 (p = 0.47) vs. prenatal controls; 0.80 (p = 0.2) vs. postnatal controls | VDU use was similar between the cases and both the prenatal and the postnatal controls. |
| Windham et al. (1990) | Spontaneous abortion; low birthweight; intra-uterine growth retardation | Case-control | 626; 64; 68 | 1,308 (same age, same last menstrual period) | Telephone interviews | 1.2 (0.88-1.6); 1.4 (0.75-2.5); 1.6 (0.92-2.9) | Crude odds ratios for spontaneous abortion were 1.2 (95% CI 0.88-1.6) for VDU use of less than 20 hours per week and 1.3 (95% CI 0.87-1.5) for at least 20 hours per week; risks for low birthweight and intra-uterine growth retardation were not significantly elevated. |
| Brandt and Nielsen (1990) | Congenital malformation | Case-control | 421 | 1,365 (9.2% of all pregnancies, same registry) | Postal questionnaire | 0.96 (0.76-1.20) | Use of VDUs during pregnancy was not associated with a risk of congenital malformations. |
| Nielsen and Brandt (1990) | Spontaneous abortion | Case-control | 1,371 | 1,699 (9.2% of all pregnancies, same registry) | Postal questionnaire | 0.94 (0.77-1.14) | No statistically significant risk of spontaneous abortion with VDU exposure. |
| Tikkanen and Heinonen (1991) | Cardiovascular malformations | Case-control | 573 | 1,055 (same time, hospital delivery) | Face-to-face interviews | Cases 6.0%, controls 5.0% | No statistically significant association between VDU use and cardiovascular malformations. |
| Schnorr et al. (1991) | Spontaneous abortion | Cohort | 136 | 746 | Company records; magnetic field measurements | 0.93 (0.63-1.38) | No excess risk for women who used VDUs during the first trimester and no apparent exposure-response relation with hours of VDU use per week. |
| Brandt and Nielsen (1992) | Time to pregnancy | Cohort | 188 (313 months) | | Postal questionnaire | 1.61 (1.09-2.38) | For a time to pregnancy of greater than 13 months, there was an increased relative risk for the group with at least 21 hours of weekly VDU use. |
| Nielsen and Brandt (1992) | Low birthweight; preterm birth; small for gestational age; infant mortality | Cohort | 434; 443; 749; 160 | | Postal questionnaire | 0.88 (0.67-1.66); 1.11 (0.87-1.47); 0.99 (0.62-1.94); NR (NS) | No increase in risk was found among women exposed to VDUs. |
| Roman et al. (1992) | Spontaneous abortion | Case-control | 150 | 297 (nulliparous, hospital) | Face-to-face interviews | 0.9 (0.6-1.4) | No relation to time spent using VDUs. |
| Lindbohm et al. (1992) | Spontaneous abortion | Case-control | 191 | 394 (medical registers) | Employment records; field measurements | 1.1 (0.7-1.6); 3.4 (1.4-8.6) | Comparing workers exposed to high magnetic field strengths with those with undetectable levels, the ratio was 3.4 (95% CI 1.4-8.6). |

OR = odds ratio. RR = relative risk. CI = confidence interval. NR = value not reported. NS = not statistically significant.

Discussion 

Evaluations of reported clusters of adverse pregnancy outcomes and VDU use have concluded that there was a high probability that these clusters occurred by chance (Bergqvist 1986). In addition, the results of the few epidemiologic studies which have assessed the relation between VDU use and adverse pregnancy outcomes have, on the whole, not shown a statistically significant increased risk.

In this review, out of ten studies of spontaneous abortion, only two found a statistically significant increased risk for VDU exposure (Goldhaber, Polen and Hiatt 1988; Lindbohm et al. 1992). None of the eight studies on congenital malformations showed an excess risk associated with VDU exposure. Of the eight studies which looked at other adverse pregnancy outcomes, one has found a statistically significant association between waiting time to pregnancy and VDU use (Brandt and Nielsen 1992).
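
For readers unfamiliar with the OR/CI notation used in table 1, the sketch below shows the usual way a crude odds ratio and its approximate 95% confidence interval (Woolf's logit method) are obtained from exposure counts; the counts themselves are invented and are not taken from any of the studies reviewed.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Crude odds ratio and approximate 95% CI (Woolf's logit method) for a
    2 x 2 table: a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Invented counts for illustration only.
or_, lo, hi = odds_ratio_ci(a=120, b=340, c=250, d=870)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An interval that excludes 1.0 corresponds to a statistically significant association in the sense used in the discussion above; the invented counts here give an OR of about 1.23 with a CI of roughly 0.96 to 1.58, which would not be counted as significant.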

Although there are no major differences between the three studies with positive findings and those with negative ones, improvements in exposure assessment may have increased the chances of finding a significant risk. Though not exclusive to the positive studies, these three studies attempted to divide the workers into different levels of exposure. If there is a factor inherent in VDU use which predisposes a woman to adverse pregnancy outcomes, the dose received by the worker may influence the outcome. In addition, the results of the studies by Lindbohm et al. (1992) and Schnorr et al. (1991) suggest that only a small proportion of VDUs may be responsible for increasing the risk of spontaneous abortion among users. If this is the case, failure to identify these VDUs will introduce a bias that could lead to underestimation of the risk of spontaneous abortion among VDU users.

Other factors associated with work on VDUs, such as stress and ergonomic constraints, have been suggested as possible risk factors for adverse pregnancy outcomes (McDonald et al. 1988; Brandt and Nielsen 1992). The failure of many studies to control for these possible confounders may have led to unreliable results.

While it may be biologically plausible that exposure to high levels of extremely low frequency magnetic fields through some VDUs carries an increased risk for adverse pregnancy outcomes (Bergqvist 1986), only two studies have attempted to measure these (Schnorr et al. 1991; Lindbohm et al. 1992). Extremely low frequency magnetic fields are present in any environment where electricity is used. A contribution of these fields to adverse pregnancy outcomes could only be detected if there was a variation, in time or in space, of these fields. While VDUs contribute to the overall levels of magnetic fields in the workplace, only a small percentage of the VDUs are thought to have a strong influence on the magnetic fields measured in the working environment (Lindbohm et al. 1992). Only a fraction of the women working with VDUs are thought to be exposed to levels of magnetic radiation above that which is normally encountered in the working environment (Lindbohm et al. 1992). The lack of precision in exposure assessment encountered in counting all VDU users as “exposed” weakens the ability of a study to detect the influence of magnetic fields from VDUs on adverse pregnancy outcomes.

In some studies, women who are not gainfully employed have represented a large proportion of the comparison groups for women exposed to VDUs. In such comparisons, certain selective processes may have affected the results (Infante-Rivard et al. 1993); for instance, women with severe diseases are selected out of the workforce, leaving in the workforce healthier women, who are more likely to have favourable reproductive outcomes. On the other hand, an “unhealthy pregnant worker effect” is also possible, since women who have children may stop work, whereas those without children and who experience pregnancy loss may continue working. A suggested strategy for estimating the magnitude of this bias is to carry out separate analyses with and without women who are not gainfully employed.

 


Musculoskeletal Disorders

Introduction

VDU operators commonly report musculoskeletal problems in the neck, shoulders and upper limbs. These problems are not unique to VDU operators and are also reported by other workers performing tasks which are repetitive or which involve holding the body in a fixed posture (static load). Tasks which involve force are also commonly associated with musculoskeletal problems, but such tasks are not generally an important health and safety consideration for VDU operators.

Among clerical workers, whose jobs are generally sedentary and not commonly associated with physical stress, the introduction into workplaces of VDUs caused work-related musculoskeletal problems to gain in recognition and prominence. Indeed, an epidemic-like increase in reporting of problems in Australia in the mid 1980s and, to a lesser extent, in the United States and the United Kingdom in the early 1990s, has led to a debate about whether or not the symptoms have a physiological basis and whether or not they are work-related.

Those who dispute that musculoskeletal problems associated with VDU (and other) work have a physiological basis generally put forward one of four alternative views: that workers are malingering; that workers are unconsciously motivated by various possible secondary gains, such as workers' compensation payments or the psychological benefits of being sick (compensation neurosis); that workers are converting unresolved psychological conflict or emotional disturbance into physical symptoms (conversion disorders); or that normal fatigue is being blown out of proportion by a social process which labels such fatigue as a problem (social iatrogenesis). Rigorous examination of the evidence for these alternative explanations shows that they are not as well supported as explanations which posit a physiological basis for these disorders (Bammer and Martin 1988). Despite the growing evidence that there is a physiological basis for musculoskeletal complaints, the exact nature of the complaints is not well understood (Quintner and Elvey 1990; Cohen et al. 1992; Fry 1992; Helme, LeVasseur and Gibson 1992).

Symptom Prevalence

A large number of studies have documented the prevalence of musculoskeletal problems among VDU operators and these have been predominantly conducted in western industrialized countries. There is also growing interest in these problems in the rapidly industrializing nations of Asia and Latin America. There is considerable inter-country variation in how musculoskeletal disorders are described and in the types of studies carried out. Most studies have relied on symptoms reported by workers, rather than on the results of medical examinations. The studies can be usefully divided into three groups: those which have examined what can be called composite problems, those which have looked at specific disorders and those which have concentrated on problems in a single area or small group of areas.

Composite problems

Composite problems are a mixture of problems, which can include pain, loss of strength and sensory disturbance, in various parts of the upper body. They are treated as a single entity, which in Australia and the United Kingdom is referred to as repetitive strain injuries (RSI), in the United States as cumulative trauma disorders (CTD) and in Japan as occupational cervicobrachial disorders (OCD). A 1990 review (Bammer 1990) of problems among office workers (75% of the studies were of office workers who used VDUs) found that 70 studies had examined composite problems and that 25 of these had found them to occur in between 10 and 29% of the workers studied. At the extremes, three studies found no problems, while three found that 80% of workers suffered from musculoskeletal complaints. Half of the studies also reported on severe or frequent problems, with 19 finding a prevalence of between 10 and 19%. One study found no problems and one found problems in 59% of workers. The highest prevalences were found in Australia and Japan.

Specific disorders

Specific disorders cover relatively well-defined problems such as epicondylitis and carpal tunnel syndrome. Specific disorders have been studied less often and have been found to occur less frequently. Of 43 studies, 20 found them to occur in between 0.2 and 4% of workers. Five studies found no evidence of specific disorders and one found them in between 40 and 49% of workers.

Particular body parts

Other studies focus on particular areas of the body, such as the neck or the wrists. Neck problems are the most common and have been examined in 72 studies, with 15 finding them to occur in between 40 and 49% of workers. Three studies found them to occur in between 5 and 9% of workers and one found them in more than 80% of workers. Just under half the studies examined severe problems and they were commonly found in frequencies that ranged between 5% and 39%. Such high levels of neck problems have been found internationally, including Australia, Finland, France, Germany, Japan, Norway, Singapore, Sweden, Switzerland, the United Kingdom and the United States. In contrast, only 18 studies examined wrist problems, and seven found them to occur in between 10% and 19% of workers. One found them to occur in between 0.5 and 4% of workers and one in between 40% and 49%.

Causes

It is generally agreed that the introduction of VDUs is often associated with increased repetitive movements and increased static load through increased keystroke rates and (compared with typewriting) reduction in non-keying tasks such as changing paper, waiting for the carriage return and use of correction tape or fluid. The need to watch a screen can also lead to increased static load, and poor placement of the screen, keyboard or function keys can lead to postures which may contribute to problems. There is also evidence that the introduction of VDUs can be associated with reductions in staff numbers and increased workloads. It can also lead to changes in the psychosocial aspects of work, including social and power relationships, workers’ responsibilities, career prospects and mental workload. In some workplaces such changes have been in directions which are beneficial to workers.

In other workplaces they have led to reduced worker control over the job, lack of social support on the job, “de-skilling”, lack of career opportunities, role ambiguity, mental stress and electronic monitoring (see review by Bammer 1987b and also WHO 1989 for a report on a World Health Organization meeting). The association between some of these psychosocial changes and musculoskeletal problems is outlined below. It also seems that the introduction of VDUs helped stimulate a social movement in Australia which led to the recognition and prominence of these problems (Bammer and Martin 1992).

Causes can therefore be examined at individual, workplace and social levels. At the individual level, the possible causes of these disorders can be divided into three categories: factors not related to work, biomechanical factors and work organization factors (see table 1). Various approaches have been used to study causes but the overall results are similar to those obtained in empirical field studies which have used multivariate analyses (Bammer 1990). The results of these studies are summarized in table 1 and table 2. More recent studies also support these general findings.

Table 1. Summary of empirical fieldwork studies which have used multivariate analyses to study the causes of musculoskeletal problems among office workers

Reference | No./% VDU users | Non-work | Biomechanical | Work organisation
Blignault (1985) | 146/90% | ο | ο |
South Australian Health Commission Epidemiology Branch (1984) | 456/81% | | |
Ryan, Mullerworth and Pimble (1984) | 52/100% | | |
Ryan and Bampton (1988) | 143 | | |
Ellinger et al. (1982) | 280 | | |
Pot, Padmos and Bowers (1987) | 222/100% | not studied | |
Sauter et al. (1983b) | 251/74% | ο | |
Stellman et al. (1987a) | 1,032/42% | not studied | |

ο = non-factor; ● = factor.

Source: Adapted from Bammer 1990.

 

Table 2. Summary of studies showing involvement of factors thought to cause musculoskeletal problems among office workers

Factors examined. Non-work: age; biological predisposition; neuroticism. Biomechanical: joint angles; furniture/equipment (objective assessment); furniture/equipment (subjective assessment); visual work conditions; personal visual factors; years in job. Work organization: pressure; autonomy; peer cohesion; variety; keyboarding.

Country | No./% VDU users1 | Findings (see key below)
Australia | 146/90% | Ø Ø Ø Ø Ο Ø
Australia | 456/81% | Ο Ø Ο Ο
Australia | 52 and 143/100% | Ο Ο Ο
Germany | 280 | Ο Ο Ø Ο Ο ● Ο
Netherlands | 222/100% | Ø Ø Ο (Ø) Ο
United States | 251/74% | Ø Ø Ο (Ø)
United States | 1,032/42% | Ø Ο

Ο = positive association, statistically significant. ● = negative association, statistically significant. ❚ = statistically significant association. Ø = no statistically significant association. (Ø) = no variability in the factor in this study. ▲ = the youngest and the oldest had more symptoms.

An empty box implies that the factor was not included in the study.

1 Matches references in table 1.

Source: Adapted from Bammer 1990.

 

Factors not related to work

There is very little evidence that factors not related to work are important causes of these disorders, although there is some evidence that people with a previous injury to the relevant area or with problems in another part of the body may be more likely to develop problems. There is no clear evidence for involvement of age and the one study which examined neuroticism found that it was not related.

Biomechanical factors

There is some evidence that working with certain joints of the body at extreme angles is associated with musculoskeletal problems. The effects of other biomechanical factors are less clear-cut, with some studies finding them to be important and others not. These factors are: assessment of the adequacy of the furniture and/or equipment by the investigators; assessment of the adequacy of the furniture and/or equipment by the workers; visual factors in the workplace, such as glare; personal visual factors, such as the use of spectacles; and years on the job or as an office worker (table 2).

Organizational factors

A number of factors related to work organization are clearly associated with musculoskeletal problems and are discussed more fully elsewhere in this chapter. The factors include: high work pressure; low autonomy (i.e., low levels of control over work); low peer cohesion (i.e., low levels of support from other workers), which may mean that other workers cannot or do not help out in times of pressure; and low task variety.

The only factor studied for which the results were mixed was hours spent using a keyboard (table 2). Overall, it can be seen that the causes of musculoskeletal problems at the individual level are multifactorial. Work-related factors, particularly work organization but also biomechanical factors, have a clear role. The specific factors of importance may vary from workplace to workplace and from person to person, depending on individual circumstances. For example, the large-scale introduction of wrist rests into a workplace where high pressure and low task variety are the hallmarks is unlikely to be a successful strategy. Alternatively, a worker with satisfactory delineation and variety of tasks may still develop problems if the VDU screen is placed at an awkward angle.
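As an illustration of the kind of multivariate analysis summarized in tables 1 and 2, the sketch below fits a logistic model in which non-work, biomechanical and work-organization factors are entered together, so that each factor’s association with symptoms is adjusted for the others. The data are simulated and the variable names are illustrative; they are not taken from the original studies.

```python
# Sketch of a multivariate (logistic) analysis of musculoskeletal symptoms.
# Simulated data; variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age":                  rng.integers(20, 60, n),  # non-work factor
    "awkward_joint_angles": rng.integers(0, 2, n),    # biomechanical factor
    "work_pressure":        rng.integers(1, 6, n),    # work-organization factors
    "autonomy":             rng.integers(1, 6, n),
    "task_variety":         rng.integers(1, 6, n),
})
# Simulate an outcome driven mainly by pressure, low autonomy and posture.
logit = (-2.0 + 0.5 * df.work_pressure - 0.4 * df.autonomy
         + 0.8 * df.awkward_joint_angles - 0.2 * df.task_variety)
df["symptoms"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df.drop(columns="symptoms"))
fitted = sm.Logit(df["symptoms"], X).fit(disp=False)
print(np.exp(fitted.params).round(2))   # adjusted odds ratios per factor
print(fitted.pvalues.round(3))          # which factors remain significant
```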

The Australian experience, where there was a decline in prevalence of reporting of musculoskeletal problems in the late 1980s, is instructive in indicating how the causes of these problems can be dealt with. Although this has not been documented or researched in detail, it is likely that a number of factors were associated with the decline in prevalence. One is the widespread introduction into workplaces of “ergonomically” designed furniture and equipment. There were also improved work practices including multiskilling and restructuring to reduce pressure and increase autonomy and variety. These often occurred in conjunction with the implementation of equal employment opportunity and industrial democracy strategies. There was also widespread implementation of prevention and early intervention strategies. Less positively, some workplaces seem to have increased their reliance on casual contract workers for repetitive keyboard work. This means that any problems would not be linked to the employer, but would be solely the worker’s responsibility.

In addition, the intensity of the controversy surrounding these problems led to their stigmatization, so that many workers have become more reluctant to report and claim compensation when they develop symptoms. This was further exacerbated when workers lost cases brought against employers in well-publicized legal proceedings. A decrease in research funding, cessation in publication of incidence and prevalence statistics and of research papers about these disorders, as well as greatly reduced media attention to the problem all helped shape a perception that the problem had gone away.

Conclusion

Work-related musculoskeletal problems are a significant problem throughout the world. They represent enormous costs at the individual and social levels. There are no internationally accepted criteria for these disorders and there is a need for an international system of classification. There needs to be an emphasis on prevention and early intervention and this needs to be multifactorial. Ergonomics should be taught at all levels from elementary school to university and there need to be guidelines and laws based on minimum requirements. Implementation requires commitment from employers and active participation from employees (Hagberg et al. 1993).

Despite the many recorded cases of people with severe and chronic problems, there is little available evidence of successful treatments. There is also little evidence of how rehabilitation back into the workforce of workers with these disorders can be most successfully undertaken. This highlights that prevention and early intervention strategies are paramount to the control of work-related musculoskeletal problems.

 


Skin Problems

The first reports of skin complaints among people working with or near VDUs came from Norway as early as 1981. A few cases have also been reported from the United Kingdom, the United States and Japan. Sweden, however, has provided many case reports, and public discussion of the health effects of exposure to VDUs intensified when one case of skin disease in a VDU worker was accepted as an occupational disease by the Swedish National Insurance Board in late 1985. The acceptance of this case for compensation coincided with a marked increase in the number of cases of skin disease suspected to be related to work with VDUs. At the Department of Occupational Dermatology at Karolinska Hospital, Stockholm, the caseload increased from seven cases referred between 1979 and 1985 to 100 new referrals between November 1985 and May 1986.

Despite the relatively large number of people who sought medical treatment for what they believed to be VDU-related skin problems, no conclusive evidence is available which shows that the VDUs themselves lead to the development of occupational skin disease. The occurrence of skin disease in VDU-exposed people appears to be coincidental or possibly related to other workplace factors. Evidence for this conclusion is strengthened by the observation that the increased incidence of skin complaints made by Swedish VDU workers has not been observed in other countries, where the mass media debate on the issue has not been as intense. Further, scientific data collected from provocation studies, in which patients have been purposely exposed to VDU-related electromagnetic fields to determine whether a skin effect could be induced, have not produced any meaningful data demonstrating a possible mechanism for development of skin problems which could be related to the fields surrounding a VDU.


Case Studies: Skin Problems and VDUs

Sweden: 450 patients were referred and examined for skin problems which they attributed to work at VDUs. Only common facial dermatoses were found, and no patient had a specific dermatosis that could be related to work with VDUs. While most patients felt that they had pronounced symptoms, their visible skin lesions were, in fact, mild according to standard medical definitions, and most of the patients reported improvement without drug therapy even though they continued to work with VDUs. Many of the patients were suffering from identifiable contact allergies, which explained their skin symptoms. Epidemiological studies comparing the VDU-work patients to a non-exposed control population with a similar skin status showed no relationship between skin status and VDU work. Finally, a provocation study did not yield any relation between the patients’ symptoms and electrostatic or magnetic fields from the VDUs (Wahlberg and Lidén 1988; Berg 1988; Lidén 1990; Berg, Hedblad and Erhardt 1990; Swanbeck and Bleeker 1989).

In contrast to a few early inconclusive epidemiological studies (Murray et al. 1981; Frank 1983; Lidén and Wahlberg 1985), a large-scale epidemiological study (Berg, Lidén and Axelson 1990; Berg 1989) of 3,745 randomly selected office employees, of whom 809 were medically examined, showed that while the VDU-exposed employees reported significantly more skin problems than a non-exposed control population of office employees, on examination they were not actually found to have more visible signs or more skin disease.

Wales (UK): A questionnaire study found no difference between reports of skin problems in VDU workers and a control population (Carmichael and Roberts 1992).

Singapore: A control population of teachers reported significantly more skin complaints than did the VDU users (Koh et al. 1991).


It is, however, possible that work-related stress could be an important factor in explaining VDU-associated skin complaints. For example, follow-up studies in the office environment of a subgroup of the VDU-exposed office employees being studied for skin problems showed that significantly more people in the group with skin symptoms experienced extreme occupational stress than did people without skin symptoms. A correlation between skin symptoms and levels of the stress-sensitive hormones testosterone, prolactin and thyroxin was observed during work, but not during days off. Thus, one possible explanation for VDU-associated facial skin sensations could be the effects of thyroxin, which causes the blood vessels to dilate (Berg et al. 1992).

 


Psychosocial Aspects of VDU Work

Introduction

Computers provide efficiency, competitive advantages and the ability to carry out work processes that would not be possible without their use. Areas such as manufacturing process control, inventory management, records management, complex systems control and office automation have all benefited from automation. Computerization requires substantial infrastructure support in order to function properly. In addition to the architectural and electrical changes needed to accommodate the machines themselves, the introduction of computerization requires changes in employee knowledge and skills and the application of new methods of managing work. The demands placed on jobs which use computers can be very different from those of traditional jobs. Often computerized jobs are more sedentary and may require more thinking and mental attention to tasks, while requiring less physical energy expenditure. Production demands can be high, with constant work pressure and little room for decision-making.

The economic advantages of computers at work have overshadowed associated potential health, safety and social problems for workers, such as job loss, cumulative trauma disorders and increased mental stress. The transition from more traditional forms of work to computerization has been difficult in many workplaces, and has resulted in significant psychosocial and sociotechnical problems for the workforce.

Psychosocial Problems Specific to VDUs

Research studies (for example, Bradley 1983 and 1989; Bikson 1987; Westlander 1989; Westlander and Aberg 1992; Johansson and Aronsson 1984; Stellman et al. 1987b; Smith et al. 1981 and 1992a) have documented how the introduction of computers into the workplace has brought substantial changes in the process of work, in social relationships, in management style and in the nature and content of job tasks. In the 1980s, the implementation of the technological changeover to computerization was most often a “top-down” process in which employees had no input into the decisions regarding the new technology or the new work structures. As a result, many industrial relations, physical and mental health problems arose.

Experts disagree on the success of the changes that are occurring in offices, with some arguing that computer technology improves the quality of work and enhances productivity (Strassmann 1985), while others liken computers to earlier forms of technology, such as assembly-line production, which also worsened working conditions and increased job stress (Moshowitz 1986; Zuboff 1988). We believe that visual display unit (VDU) technology does affect work in various ways, but technology is only one element of a larger work system that includes the individual, tasks, environment and organizational factors.

Conceptualizing Computerized Job Design

Many working conditions jointly influence the VDU user. The authors have proposed a comprehensive job design model which illustrates the various facets of working conditions which can interact and accumulate to produce stress (Smith and Carayon-Sainfort 1989). Figure 1 illustrates this conceptual model for the various elements of a work system that can exert loads on workers and may result in stress. At the centre of this model is the individual with his/her unique physical characteristics, perceptions, personality and behaviour. The individual uses technologies to perform specific job tasks. The nature of the technologies, to a large extent, determines performance and the skills and knowledge needed by the worker to use the technology effectively. The requirements of the task also affect the required skill and knowledge levels needed. Both the tasks and technologies affect the job content and the mental and physical demands. The model also shows that the tasks and technologies are placed within the context of a work setting that comprises the physical and the social environment. The overall environment itself can affect comfort, psychological moods and attitudes. Finally, the organizational structure of work defines the nature and level of individual involvement, worker interactions, and levels of control. Supervision and standards of performance are all affected by the nature of the organization.

Figure 1. Model of working conditions and their impact on the individual

VDU080F1

This model helps to explain relationships between job requirements, psychological and physical loads and resulting health strains. It represents a systems concept in which any one element can influence any other element, and in which all elements interact to determine the way in which work is accomplished and the effectiveness of the work in achieving individual and organizational needs and goals. The application of the model to the VDU workplace is described below.
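Purely as an illustration of this systems idea, and not as a validated assessment instrument, the five elements of the model can be sketched as a simple data structure in which each element contributes a load and none is examined in isolation.

```python
# Toy rendering of the work-system model in figure 1 (illustrative only; the
# element names and the simple "load" values are not a validated scoring method).
from dataclasses import dataclass, field

@dataclass
class WorkSystem:
    individual: dict = field(default_factory=dict)    # e.g., skills, health status
    tasks: dict = field(default_factory=dict)         # e.g., workload, complexity
    technology: dict = field(default_factory=dict)    # e.g., VDU reliability
    environment: dict = field(default_factory=dict)   # e.g., noise, glare, air quality
    organization: dict = field(default_factory=dict)  # e.g., control, support, monitoring

    def stress_loads(self) -> dict:
        """Collect the load each element places on the worker.

        In the model no single element determines strain; it is the interaction
        of all five, so the whole profile matters, not any one score.
        """
        return {name: spec.get("load", 0) for name, spec in vars(self).items()}

job = WorkSystem(
    tasks={"load": 3, "note": "high keystroke rate, little decision-making"},
    environment={"load": 1, "note": "screen glare"},
    organization={"load": 2, "note": "electronic monitoring, low autonomy"},
)
print(job.stress_loads())
```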

 

 

Environment

Physical environmental factors have been implicated as job stressors in the office and elsewhere. General air quality and housekeeping contribute, for example, to sick building syndrome and other stress responses (Stellman et al. 1985; Hedge, Erickson and Rubin 1992). Noise is a well-known environmental stressor which can cause increases in arousal, blood pressure and negative psychological mood (Cohen and Weinstein 1981). Environmental conditions that produce sensory disruption and make it more difficult to carry out tasks, thereby increasing worker stress and emotional irritation, are further examples (Smith et al. 1981; Sauter et al. 1983b).

Task 

With the introduction of computer technology, expectations regarding performance increase. Additional pressure on workers is created because they are expected to perform at a higher level all the time. Excessive workload and work pressure are significant stressors for computer users (Smith et al. 1981; Piotrkowski, Cohen and Coray 1992; Sainfort 1990). New types of work demands are appearing with the increasing use of computers. For instance, cognitive demands are likely to be sources of increased stress for VDU users (Frese 1987). These are all facets of job demands.


Electronic Monitoring of Employee Performance

The use of electronic methods to monitor employee work performance has increased substantially with the widespread use of personal computers which make such monitoring quick and easy. Monitoring provides information which can be used by employers to better manage technological and human resources. With electronic monitoring it is possible to pinpoint bottlenecks, production delays and below average (or below standard) performance of employees in real time. New electronic communication technologies have the capability of tracking the performance of individual elements of a communication system and of pinpointing individual worker inputs. Such work elements as data entry into computer terminals, telephone conversations, and electronic mail messages can all be examined through the use of electronic surveillance.

Electronic monitoring increases management control over the workforce, and may lead to organisational management approaches that are stressful. This raises important issues about the accuracy of the monitoring system and how well it represents worker contributions to the employer’s success, the invasion of worker privacy, worker versus technology control over job tasks, and the implications of management styles that use monitored information to direct worker behaviour on the job (Smith and Amick 1989; Amick and Smith 1992; Carayon 1993b). Monitoring can bring about increased production, but it may also produce job stress, absences from work, turnover in the workforce and sabotage. When electronic monitoring is combined with incentive systems for increased production, work-related stress can also be increased (OTA 1987; Smith et al. 1992a). In addition, such electronic performance monitoring raises issues of worker privacy (ILO 1991) and several countries have banned the use of individual performance monitoring.

A basic requirement of electronic monitoring is that work tasks be broken up into activities that can easily be quantified and measured. This usually results in a job design approach that reduces the content of the tasks by removing complexity and thinking, which are replaced by repetitive action. The underlying philosophy is similar to a basic principle of “Scientific Management” (Taylor 1911) that calls for work “simplification.”

In one company, for example, a telephone monitoring capability was included with a new telephone system for customer service operators. The monitoring system distributed incoming telephone calls from customers, timed the calls and allowed supervisors to eavesdrop on employee telephone conversations. This system was instituted under the guise of a work-flow scheduling tool for determining the peak periods for telephone calls, so that management could decide when extra operators would be needed. Instead of using the monitoring system solely for that purpose, management also used the data to establish work performance standards (seconds per transaction) and to bring disciplinary action against employees with “below average performance.” This electronic monitoring system introduced pressure to perform above average because of fear of reprimand. Research has shown that such work pressure is not conducive to good performance but rather can bring about adverse health consequences (Cooper and Marshall 1976; Smith 1987). In fact, the monitoring system described was found to have increased employee stress and lowered the quality of production (Smith et al. 1992a).
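The arithmetic behind such a system is trivially easy to implement, which is part of what makes this style of monitoring so pervasive. The sketch below, with hypothetical operators, call durations and performance standard, shows the sort of “seconds per transaction” report described above.

```python
# Hypothetical "seconds per transaction" report of the kind described above.
from statistics import mean

call_durations_s = {                  # seconds per handled call, per operator
    "operator_a": [95, 110, 102, 88, 120],
    "operator_b": [140, 155, 150, 132, 160],
    "operator_c": [105, 98, 115, 101, 109],
}
STANDARD_S = 120                      # management-set performance standard

for operator, durations in call_durations_s.items():
    average = mean(durations)
    verdict = "below standard" if average > STANDARD_S else "meets standard"
    print(f"{operator}: {average:.0f} s/call ({verdict})")
# The ease of producing such reports is precisely what enables the pressure
# and discipline described above; the figures say nothing about call quality.
```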

Electronic monitoring can influence worker self-image and feelings of self-worth. In some cases, monitoring could enhance feelings of self-worth if the worker gets positive feedback. The fact that management has taken an interest in the worker as a valuable resource is another possible positive outcome. However, both effects may be perceived differently by workers, particularly if poor performance leads to punishment or reprimand. Fear of negative evaluation can produce anxiety and may damage self-esteem and self-image. Indeed, electronic monitoring can create known adverse working conditions: paced work, lack of worker involvement, reduced task variety and task clarity, reduced peer social support, reduced supervisory support, fear of job loss, routine work activities and lack of control over tasks (Amick and Smith 1992; Carayon 1993).

Michael J. Smith


Positive aspects also exist since computers are able to do many of the simple, repetitive tasks that were previously done manually, which can reduce the repetitiveness of the job, increase the content of the job and make it more meaningful. This is not universally true, however, since many new computer jobs, such as data entry, are still repetitive and boring. Computers can also provide performance feedback that is not available with other technologies (Kalimo and Leppanen 1985), which can reduce ambiguity.

Some aspects of computerized work have been linked to decreased control, which has been identified as a major source of stress for clerical computer users. Uncertainty regarding the duration of computer-related problems, such as breakdown and slowdown, can be a source of stress (Johansson and Aronsson 1984; Carayon-Sainfort 1992). Computer-related problems can be particularly stressful if workers, such as airline reservation clerks, are highly dependent on the technology to perform their job.

Technology

The technology being used by the worker often defines his or her ability to accomplish tasks and the extent of physiological and psychological load. If the technology produces either too much or too little workload, increased stress and adverse physical health outcomes can occur (Smith et al. 1981; Johansson and Aronsson 1984; Ostberg and Nilsson 1985). Technology is changing at a rapid pace, forcing workers to adjust their skills and knowledge continuously to keep up. In addition, today’s skills can quickly become obsolete. Technological obsolescence may be due to job de-skilling and impoverished job content or to inadequate skills and training. Workers who do not have the time or resources to keep up with the technology may feel threatened by the technology and may worry about losing their job. Thus, workers’ fears of having inadequate skills to use the new technology are one of the main adverse influences of technology, which training, of course, can help to offset. Another effect of the introduction of technology is the fear of job loss due to increased efficiency of technology (Ostberg and Nilsson 1985; Smith, Carayon and Miezio 1987).

Intensive, repetitive, long sessions at the VDU can also contribute to increased ergonomic stress and strain (Stammerjohn, Smith and Cohen 1981; Sauter et al. 1983b; Smith et al. 1992b) and can create visual or musculoskeletal discomfort and disorders, as described elsewhere in the chapter.

Organizational factors

The organizational context of work can influence worker stress and health. When technology requires new skills, the way in which workers are introduced to the new technology and the organizational support they receive, such as appropriate training and time to acclimatize, has been related to the levels of stress and emotional disturbances experienced (Smith, Carayon and Miezio 1987). The opportunity for growth and promotion in a job (career development) is also related to stress (Smith et al. 1981). Job future uncertainty is a major source of stress for computer users (Sauter et al. 1983b; Carayon 1993a) and the possibility of job loss also creates stress (Smith et al. 1981; Kasl 1978).

Work scheduling practices such as shift work and overtime have been shown to have negative mental and physical health consequences (Monk and Tepas 1985; Breslow and Buell 1960). Shift work is increasingly used by companies that want or need to keep computers running continuously. Overtime is often needed to ensure that workers keep up with the workload, especially when work remains incomplete as a result of delays due to computer breakdown or malfunction.

Computers provide management with the capability to continuously monitor employee performance electronically, which has the potential to create stressful working conditions, such as by increasing work pressure (see the box “Electronic Monitoring”). Negative employee-supervisor relationships and feelings of lack of control can increase in electronically supervised workplaces.

The introduction of VDU technology has affected social relationships at work. Social isolation has been identified as a major source of stress for computer users (Lindström 1991; Yang and Carayon 1993) since the increased time spent working on computers reduces the time that workers have to socialize and receive or give social support. The need for supportive supervisors and co-workers has been well documented (House 1981). Social support can moderate the impact of other stressors on worker stress. Thus, support from colleagues, supervisor or computer staff becomes important for the worker who is experiencing computer-related problems but the computer work environment may, ironically, reduce the level of such social support available.

The individual

A number of personal factors such as personality, physical health status, skills and abilities, physical conditioning, prior experiences and learning, motives, goals and needs determine the physical and psychological effects just described (Levi 1972).

Improving the Psychosocial Characteristics of VDU Work

The first step in making VDU work less stressful is to identify work organization and job design features that can promote psychosocial problems, so that they can be modified. It is important to bear in mind that VDU problems which can lead to job stress are seldom the result of single aspects of the organization or of job design; rather, they arise from a combination of many aspects of improper work design. Thus, solutions for reducing or eliminating job stress must be comprehensive and deal with many improper work design factors simultaneously. Solutions that focus on only one or two factors will not succeed (see figure 2).

Figure 2. Keys to reducing isolation and stress

VDU080F2

Improvements in job design should start with the work organization providing a supportive environment for employees. Such an environment enhances employee motivation to work and feelings of security, and it reduces feelings of stress (House 1981). A policy statement that defines the importance of employees within an organization and is explicit on how the organization will provide a supportive environment is a good first step. One very effective means for providing support to employees is to provide supervisors and managers with specific training in methods for being supportive. Supportive supervisors can serve as buffers that “protect” employees from unnecessary organizational or technological stresses.

 

The content of job tasks has long been recognized as important for employee motivation and productivity (Herzberg 1974; Hackman and Oldham 1976). More recently the relationship between job content and job stress reactions has been elucidated (Cooper and Marshall 1976; Smith 1987). Three main aspects of job content that are of specific relevance to VDU work are task complexity, employee skills and career opportunities. In some respects, these are all related to the concept of developing the motivational climate for employee job satisfaction and psychological growth, which deals with the improvement of employees’ intellectual capabilities and skills, increased ego enhancement or self-image and increased social group recognition of individual achievement.

The primary means for enhancing job content is to increase the skill level for performing job tasks, which typically means enlarging the scope of job tasks, as well as enriching the elements of each specific task (Herzberg 1974). Enlarging the number of tasks increases the repertoire of skills needed for successful task performance, and also increases the number of employee decisions made while defining task sequences and activities. An increase in the skill level of the job content promotes employee self-image of personal worth and of value to the organization. It also enhances the positive image of the individual in his or her social work group within the organization.

Increasing the complexity of the tasks, which means increasing the amount of thinking and decision-making involved, is a logical next step that can be achieved by combining simple tasks into sets of related activities that have to be coordinated, or by adding mental tasks that require additional knowledge and computational skills. Specifically, when computerized technology is introduced, new tasks in general will have requirements that exceed the current knowledge and skills of the employees who are to perform them. Thus there is a need to train employees in the new aspects of the tasks so that they will have the skills to perform the tasks adequately. Such training has more than one benefit, since it not only may improve employee knowledge and skills, and thus enhance performance, but may also enhance employee self-esteem and confidence. Providing training also shows the employee that the employer is willing to invest in his or her skill enhancement, and thus promotes confidence in employment stability and job future.

The amount of control that an employee has over the job has a powerful psychosocial influence (Karasek et al. 1981; Sauter, Cooper and Hurrell 1989). Important aspects of control can be defined by the answers to the questions, “What, how and when?” The nature of the tasks to be undertaken, the need for coordination among employees, the methods to be used to carry out the tasks and the scheduling of the tasks can all be defined by answers to these questions. Control can be designed into jobs at the levels of the task, the work unit and the organization (Sainfort 1991; Gardell 1971). At the task level, the employee can be given autonomy in the methods and procedures used in completing the task.

At the work-unit level, groups of employees can self-manage several interrelated tasks and the group itself can decide on who will perform particular tasks, the scheduling of tasks, coordination of tasks and production standards to meet organizational goals. At the organization level, employees can participate in structured activities that provide input to management about employee opinions or quality improvement suggestions. When the levels of control available are limited, it is better to introduce autonomy at the task level and then work up the organizational structure, insofar as possible (Gardell 1971).

One natural result of computer automation appears to be an increased workload, since the purpose of the automation is to enhance the quantity and quality of work output. Many organizations believe that such an increase is necessary in order to pay for the investment in the automation. However, establishing the appropriate workload is problematic. Scientific methods have been developed by industrial engineers for determining appropriate work methods and workloads (the performance requirements of jobs). Such methods have been used successfully in manufacturing industries for decades, but have had little application in office settings, even after office computerization. The use of scientific means, such as those described by Kanawaty (1979) and Salvendy (1992), to establish workloads for VDU operators should be a high priority for every organization, since such methods set reasonable production standards or work output requirements, protect employees from excessive workloads and help to ensure the quality of products.
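As a rough sketch of the work-measurement arithmetic underlying such methods, in the spirit of the ILO work-study approach described by Kanawaty (1979) and with invented figures, a production standard can be derived from an observed time, a performance rating and an allowance for rest and contingencies.

```python
# Illustrative work-measurement arithmetic (invented figures; not a standard
# prescribed by the sources cited).
def standard_time(observed_time_min: float,
                  observed_rating: float,
                  standard_rating: float = 100.0,
                  allowance_fraction: float = 0.15) -> float:
    """Return the standard time in minutes for one work element."""
    basic_time = observed_time_min * observed_rating / standard_rating
    return basic_time * (1.0 + allowance_fraction)

# Example: keying one batch of forms observed at 4.0 min, operator rated at 110,
# with a 15% allowance for rest and incidental work.
st = standard_time(4.0, 110)
batches_per_day = (7.5 * 60) / st     # minutes available in a 7.5-hour day
print(f"Standard time per batch: {st:.2f} min")
print(f"Implied daily output standard: {batches_per_day:.0f} batches")
```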

The demand that is associated with the high levels of concentration required for computerized tasks can diminish the amount of social interaction during work, leading to social isolation of employees. To counter this effect, opportunities for socialization for employees not engaged in computerized tasks, and for employees who are on rest breaks, should be provided. Non-computerized tasks which do not require extensive concentration could be organized in such a way that employees can work in close proximity to one another and thus have the opportunity to talk among themselves. Such socialization provides social support, which is known to be an essential modifying factor in reducing adverse mental health effects and physical disorders such as cardiovascular diseases (House 1981). Socialization naturally also reduces social isolation and thus promotes improved mental health.

Since poor ergonomic conditions can also lead to psychosocial problems for VDU users, proper ergonomic conditions are an essential element of complete job design. This is covered in some detail in other articles in this chapter and elsewhere in the Encyclopaedia.

Finding Balance

Since there are no “perfect” jobs or “perfect” workplaces free from all psychosocial and ergonomic stressors, we must often compromise when making improvements at the workplace. Redesigning processes generally involves “trade-offs” between excellent working conditions and the need to have acceptable productivity. This requires us to think about how to achieve the best “balance” between positive benefits for employee health and productivity. Unfortunately, since so many factors can produce adverse psychosocial conditions that lead to stress, and since these factors are interrelated, modifications in one factor may not be beneficial if concomitant changes are not made in other related factors. In general, two aspects of balance should be addressed: the balance of the total system and compensatory balance.

System balance is based on the idea that a workplace or process or job is more than the sum of the individual components of the system. The interplay among the various components produces results that are greater (or less) than the sum of the individual parts and determines the potential for the system to produce positive results. Thus, job improvements must take account of and accommodate the entire work system. If an organization concentrates solely on the technological component of the system, there will be an imbalance because personal and psychosocial factors will have been neglected. The model given in figure 1 of the work system can be used to identify and understand the relationships between job demands, job design factors, and stress which must be balanced.

Since it is seldom possible to eliminate all psychosocial factors that cause stress, either because of financial considerations, or because it is impossible to change inherent aspects of job tasks, compensatory balance techniques are employed. Compensatory balance seeks to reduce psychological stress by changing aspects of work that can be altered in a positive direction to compensate for those aspects that cannot be changed. Five elements of the work system—physical loads, work cycles, job content, control, and socialization—function in concert to provide the resources for achieving individual and organizational goals through compensatory balance. While we have described some of the potential negative attributes of these elements in terms of job stress, each also has positive aspects that can counteract the negative influences. For instance, inadequate skill to use new technology can be offset by employee training. Low job content that creates repetition and boredom can be balanced by an organizational supervisory structure that promotes employee involvement and control over tasks, and job enlargement that introduces task variety. The social conditions of VDU work could be improved by balancing the loads that are potentially stressful and by considering all of the work elements and their potential for promoting or reducing stress. The organizational structure itself could be adapted to accommodate enriched jobs in order to provide support to the individual. Increased staffing levels, increasing the levels of shared responsibilities or increasing the financial resources put toward worker well-being are other possible solutions.

 


" DISCLAIMER: The ILO does not take responsibility for content presented on this web portal that is presented in any language other than English, which is the language used for the initial production and peer-review of original content. Certain statistics have not been updated since the production of the 4th edition of the Encyclopaedia (1998)."

Contents