

11. Sensory Systems

Chapter Editor: Heikki Savolainen

Table of Contents

Tables and Figures

The Ear
Marcel-André Boillat   

Chemically-Induced Hearing Disorders
Peter Jacobsen

Physically-Induced Hearing Disorders
Peter L. Pelmear

Equilibrium
Lucy Yardley

Vision and Work
Paule Rey and Jean-Jacques Meyer

Taste
April E. Mott and Norman Mann

Smell
April E. Mott

Cutaneous Receptors
Robert Dykes and Daniel McBain



1. Typical calculation of functional loss from an audiogram
2. Visual requirements for different activities
3. Recommended illuminance values for the lighting design
4. Visual requirements for a driving licence in France
5. Agents/processes reported to alter the taste system
6. Agents/processes associated with olfactory abnormalities







The Ear


The ear is the sensory organ responsible for hearing and the maintenance of equilibrium, via the detection of body position and of head movement. It is composed of three parts: the outer, middle, and inner ear; the outer ear lies outside the skull, while the other two parts are embedded in the temporal bone (figure 1).

Figure 1. Diagram of the ear.


The outer ear consists of the auricle, a cartilaginous skin-covered structure, and the external auditory canal, an irregularly-shaped cylinder approximately 25 mm long which is lined by glands secreting wax.

The middle ear consists of the tympanic cavity, an air-filled cavity whose outer wall is formed by the tympanic membrane (eardrum). It communicates with the nasopharynx through the Eustachian tube, which maintains pressure equilibrium on either side of the tympanic membrane. This communication explains, for instance, how swallowing equalizes pressure and restores hearing acuity lost through a rapid change in barometric pressure (e.g., in landing airplanes or fast elevators). The tympanic cavity also contains the ossicles—the malleus, incus and stapes—which are controlled by the stapedius and tensor tympani muscles. The tympanic membrane is linked to the inner ear by the ossicles, specifically by the mobile footplate of the stapes, which lies against the oval window.

The inner ear contains the sensory apparatus per se. It consists of a bony shell (the bony labyrinth) within which is found the membranous labyrinth—a series of cavities forming a closed system filled with endolymph, a potassium-rich liquid. The membranous labyrinth is separated from the bony labyrinth by the perilymph, a sodium-rich liquid.

The bony labyrinth itself is composed of two parts. The anterior portion is known as the cochlea and is the actual organ of hearing. It has a spiral shape reminiscent of a snail shell, and is pointed in the anterior direction. The posterior portion of the bony labyrinth contains the vestibule and the semicircular canals, and is responsible for equilibrium. The neurosensory structures involved in hearing and equilibrium are located in the membranous labyrinth: the organ of Corti is located in the cochlear canal, while the maculae of the utricle and the saccule and the ampullae of the semicircular canals are located in the posterior section.

Hearing organs

The cochlear canal is a spiral triangular tube, comprising two and one-half turns, which separates the scala vestibuli from the scala tympani. One edge rests on the bony spiral lamina, a process of the cochlea’s central column, while the other is attached to the outer bony wall of the cochlea by the spiral ligament.

The scala vestibuli and tympani end in the oval window (the foot of the stapes) and round window, respectively. The two chambers communicate through the helicotrema, the tip of the cochlea. The basilar membrane forms the inferior surface of the cochlear canal, and supports the organ of Corti, responsible for the transduction of acoustic stimuli. All auditory information is transduced by only 15,000 hair cells (organ of Corti), of which the so-called inner hair cells, numbering 3,500, are critically important, since they form synapses with approximately 90% of the 30,000 primary auditory neurons (figure 2). The inner and outer hair cells are separated from each other by an abundant layer of support cells. The cilia of the hair cells traverse an extraordinarily thin membrane and are embedded in the tectorial membrane, whose free edge lies above the cells. The superior surface of the cochlear canal is formed by Reissner’s membrane.

Figure 2. Cross-section of one loop of the cochlea. Diameter: approximately 1.5 mm.


The bodies of the cochlear sensory cells resting on the basilar membrane are surrounded by nerve terminals, and their approximately 30,000 axons form the cochlear nerve. The cochlear nerve crosses the inner ear canal and extends to the central structures of the brain stem, the oldest part of the brain. The auditory fibres end their tortuous path in the temporal lobe, the part of the cerebral cortex responsible for the perception of acoustic stimuli.






Organs of Equilibrium

The sensory cells are located in the ampullae of the semicircular canals and the maculae of the utricle and saccule, and are stimulated by pressure transmitted through the endolymph as a result of head or body movements. The cells connect with bipolar cells whose peripheral processes form two tracts, one from the anterior and external semicircular canals, the other from the posterior semicircular canal. These two tracts enter the inner ear canal and unite to form the vestibular nerve, which extends to the vestibular nuclei in the brainstem. Fibres from the vestibular nuclei, in turn, extend to cerebellar centres controlling eye movements, and to the spinal cord.

The union of the vestibular and cochlear nerves forms the 8th cranial nerve, also known as the vestibulocochlear nerve.

Physiology of Hearing

Sound conduction through air

The ear is composed of a sound conductor (the outer and middle ear) and a sound receptor (the inner ear).

Sound waves passing through the external auditory canal strike the tympanic membrane, causing it to vibrate. This vibration is transmitted to the stapes through the malleus and incus. The surface area of the tympanic membrane is almost 16 times that of the foot of the stapes (55 mm² versus 3.5 mm²), and this, in combination with the lever mechanism of the ossicles, results in a 22-fold amplification of the sound pressure. Due to the middle ear’s resonant frequency, the transmission ratio is optimal between 1,000 and 2,000 Hz. As the foot of the stapes moves, it causes waves to form in the liquid within the vestibular canal. Since the liquid is incompressible, each inward movement of the foot of the stapes causes an equivalent outward movement of the round window, towards the middle ear.
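The middle-ear amplification described above can be checked with a quick calculation. The sketch below is illustrative only; the lever ratio is inferred from the 22-fold total quoted in the text, not stated there:

```python
import math

# Area ratio of tympanic membrane to stapes footplate (values from the text)
tympanic_area_mm2 = 55.0
stapes_area_mm2 = 3.5
area_ratio = tympanic_area_mm2 / stapes_area_mm2   # ~15.7-fold pressure gain

# The ossicular lever supplies the remaining gain to reach the quoted 22-fold total
total_gain = 22.0
lever_ratio = total_gain / area_ratio              # ~1.4 (inferred, not from the text)

# Express the total amplification in decibels: 20 * log10(pressure ratio)
gain_db = 20 * math.log10(total_gain)              # ~26.8 dB

print(round(area_ratio, 1), round(lever_ratio, 2), round(gain_db, 1))
```

This shows why the middle ear matters: without its roughly 27 dB of impedance matching, most airborne acoustic energy would simply be reflected at the air–liquid boundary of the inner ear.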

When exposed to high sound levels, the stapes muscle contracts, protecting the inner ear (the attenuation reflex). In addition to this function, the muscles of the middle ear also extend the dynamic range of the ear, improve sound localization, reduce resonance in the middle ear, and control air pressure in the middle ear and liquid pressure in the inner ear.

Between 250 and 4,000 Hz, the threshold of the attenuation reflex is approximately 80 decibels (dB) above the hearing threshold, and increases by approximately 0.6 dB/dB as the stimulation intensity increases. Its latency is 150 ms at threshold, and 24-35 ms in the presence of intense stimuli. At frequencies below the natural resonance of the middle ear, contraction of the middle ear muscles attenuates sound transmission by approximately 10 dB. Because of its latency, the attenuation reflex provides adequate protection from noise generated at rates above two to three per second, but not from discrete impulse noise.

The speed with which sound waves propagate through the ear depends on the elasticity of the basilar membrane. The elasticity increases, and the wave velocity thus decreases, from the base of the cochlea to the tip. The transfer of vibration energy to Reissner’s membrane and the basilar membrane is frequency-dependent. At high frequencies, the wave amplitude is greatest at the base, while for lower frequencies, it is greatest at the tip. Thus, the point of greatest mechanical excitation in the cochlea is frequency-dependent. This phenomenon underlies the ability to detect frequency differences. Movement of the basilar membrane induces shear forces in the stereocilia of the hair cells and triggers a series of mechanical, electrical and biochemical events responsible for mechanical-sensory transduction and initial acoustic signal processing. The shear forces on the stereocilia cause ionic channels in the cell membranes to open, modifying the permeability of the membranes and allowing the entry of potassium ions into the cells. This influx of potassium ions results in depolarization and the generation of an action potential.

Neurotransmitters liberated at the synaptic junction of the inner hair cells as a result of depolarization trigger neuronal impulses which travel down the afferent fibres of the auditory nerve toward higher centres. The intensity of auditory stimulation depends on the number of action potentials per unit time and the number of cells stimulated, while the perceived frequency of the sound depends on the specific nerve fibre populations activated. There is a specific spatial mapping between the frequency of the sound stimulus and the section of the cerebral cortex stimulated.

The inner hair cells are mechanoreceptors which transform signals generated in response to acoustic vibration into electric messages sent to the central nervous system. They are not, however, responsible for the ear’s threshold sensitivity and its extraordinary frequency selectivity.

The outer hair cells, on the other hand, send no auditory signals to the brain. Rather, their function is to selectively amplify mechano-acoustic vibration at near-threshold levels by a factor of approximately 100 (i.e., 40 dB), and so facilitate stimulation of inner hair cells. This amplification is believed to function through micromechanical coupling involving the tectorial membrane. The outer hair cells can produce more energy than they receive from external stimuli and, by contracting actively at very high frequencies, can function as cochlear amplifiers.

In the inner ear, interference between outer and inner hair cells creates a feedback loop which permits control of auditory reception, particularly of threshold sensitivity and frequency selectivity. Efferent cochlear fibres may thus help reduce cochlear damage caused by exposure to intense acoustic stimuli. Outer hair cells may also undergo reflex contraction in the presence of intense stimuli. The attenuation reflex of the middle ear, active primarily at low frequencies, and the reflex contraction in the inner ear, active at high frequencies, are thus complementary.

Bone conduction of sound

Sound waves may also be transmitted through the skull. Two mechanisms are possible:

In the first, compression waves impacting the skull cause the incompressible perilymph to deform the round or oval window. As the two windows have differing elasticities, movement of the endolymph results in movement of the basilar membrane.

The second mechanism is based on the fact that movement of the ossicles induces movement in the scala vestibuli only. In this mechanism, movement of the basilar membrane results from the translational movement produced by the inertia of the ossicular chain.

Bone conduction is normally 30-50 dB lower than air conduction—as is readily apparent when both ears are blocked. This is only true, however, for air-mediated stimuli, direct bone stimulation being attenuated to a different degree.

Sensitivity range

Mechanical vibration induces potential changes in the cells of the inner ear, conduction pathways and higher centres. Only frequencies of 16 Hz–25,000 Hz and sound pressures (these can be expressed in pascals, Pa) of 20 μPa to 20 Pa can be perceived. The range of sound pressures which can be perceived is remarkable—a 1-million-fold range! The detection thresholds of sound pressure are frequency-dependent, lowest at 1,000-6,000 Hz and increasing at both higher and lower frequencies.

For practical purposes, the sound pressure level is expressed in decibels (dB), a logarithmic measurement scale corresponding to perceived sound intensity relative to the auditory threshold. Thus, 20 μPa is equivalent to 0 dB. As the sound pressure increases tenfold, the decibel level increases by 20 dB, in accordance with the following formula:

Lx = 20 log10 (Px/P0)

where:

Lx = sound pressure level in dB

Px = sound pressure in pascals

P0 = reference sound pressure (2×10⁻⁵ Pa, the auditory threshold)
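The formula can be sketched directly in code; `spl_db` is a hypothetical helper name, not a standard function:

```python
import math

REF_PRESSURE_PA = 2e-5  # auditory threshold, 20 micropascals

def spl_db(pressure_pa: float) -> float:
    """Sound pressure level in dB relative to the 20 uPa reference."""
    return 20 * math.log10(pressure_pa / REF_PRESSURE_PA)

print(spl_db(2e-5))  # threshold of hearing: 0 dB
print(spl_db(20.0))  # upper limit of the perceivable range: 120 dB
```

Note that a tenfold increase in sound pressure (e.g., from 0.02 Pa to 0.2 Pa) raises the level by exactly 20 dB, as stated in the text.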

The frequency-discrimination threshold, that is the minimal detectable difference in frequency, is 1.5 Hz up to 500 Hz, and 0.3% of the stimulus frequency at higher frequencies. At sound pressures near the auditory threshold, the sound-pressure-discrimination threshold is approximately 20%, although differences of as little as 2% may be detected at high sound pressures.

If two sounds differ in frequency by a sufficiently small amount, only one tone is heard. Its perceived frequency lies midway between the two source tones, and its sound pressure level fluctuates (beats). If two acoustic stimuli have similar frequencies but differing intensities, a masking effect occurs. If the difference in sound pressure is large enough, masking will be complete, with only the loudest sound perceived.

Localization of acoustic stimuli depends on the detection of the time lag between the arrival of the stimulus at each ear, and, as such, requires intact bilateral hearing. The smallest detectable time lag is 3 × 10⁻⁵ seconds. Localization is facilitated by the head’s screening effect, which results in differences in stimulus intensity at each ear.
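A rough check of what this time resolution implies, assuming a speed of sound in air of about 343 m/s (a value not given in the text):

```python
SPEED_OF_SOUND_M_S = 343.0   # in air at ~20 °C (assumed)
MIN_DETECTABLE_LAG_S = 3e-5  # smallest detectable interaural time lag (from the text)

# Path-length difference corresponding to the smallest detectable time lag
path_difference_m = SPEED_OF_SOUND_M_S * MIN_DETECTABLE_LAG_S
print(round(path_difference_m * 100, 1))  # ~1.0 cm
```

In other words, the binaural system can resolve arrival-path differences of about a centimetre, far smaller than the width of the head.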

The remarkable ability of human beings to resolve acoustic stimuli is a result of frequency decomposition by the inner ear and frequency analysis by the brain. These are the mechanisms that allow individual sound sources such as individual musical instruments to be detected and identified in the complex acoustic signals that make up the music of a full symphony orchestra.


Ciliary damage

The ciliary motion induced by intense acoustic stimuli may exceed the mechanical resistance of the cilia and cause mechanical destruction of hair cells. As these cells are limited in number and incapable of regeneration, any cell loss is permanent, and if exposure to the harmful sound stimulus continues, progressive. In general, the ultimate effect of ciliary damage is the development of a hearing deficit.

Outer hair cells are the cells most sensitive to sound and to toxic agents such as anoxia, ototoxic medications and chemicals (e.g., quinine derivatives, streptomycin and some other antibiotics, some anti-tumour preparations), and are thus the first to be lost. Only passive hydromechanical phenomena remain operative in outer hair cells which are damaged or have damaged stereocilia. Under these conditions, only gross analysis of acoustic vibration is possible. In very rough terms, cilia destruction in outer hair cells results in a 40 dB increase in hearing threshold.

Cellular damage

Exposure to noise, especially if it is repetitive or prolonged, may also affect the metabolism of cells of the organ of Corti, and afferent synapses located beneath the inner hair cells. Reported extraciliary effects include modification of cell ultrastructure (reticulum, mitochondria, lysosomes) and, postsynaptically, swelling of afferent dendrites. Dendritic swelling is probably due to the toxic accumulation of neurotransmitters as a result of excessive activity by inner hair cells. Nevertheless, the extent of stereociliary damage appears to determine whether hearing loss is temporary or permanent.

Noise-induced Hearing Loss

Noise is a serious hazard to hearing in today’s increasingly complex industrial societies. For example, noise exposure accounts for approximately one-third of the 28 million cases of hearing loss in the United States, and NIOSH (the National Institute for Occupational Safety and Health) reports that 14% of American workers are exposed to potentially dangerous sound levels, that is levels exceeding 90 dB. Noise exposure is the most widespread harmful occupational exposure and is the second leading cause, after age-related effects, of hearing loss. Finally, the contribution of non-occupational noise exposure must not be forgotten, such as home workshops, over-amplified music especially with use of earphones, use of firearms, etc.

Acute noise-induced damage. The immediate effects of exposure to high-intensity sound stimuli (for example, explosions) include elevation of the hearing threshold, rupture of the eardrum, and traumatic damage to the middle and inner ears (dislocation of ossicles, cochlear injury or fistulas).

Temporary threshold shift. Noise exposure results in a decrease in the sensitivity of auditory sensory cells which is proportional to the duration and intensity of exposure. In its early stages, this increase in auditory threshold, known as auditory fatigue or temporary threshold shift (TTS), is entirely reversible but persists for some time after the cessation of exposure.

Studies of the recovery of auditory sensitivity have identified several types of auditory fatigue. Short-term fatigue dissipates in less than two minutes and results in a maximum threshold shift at the exposure frequency. Long-term fatigue is characterized by recovery in more than two minutes but less than 16 hours, an arbitrary limit derived from studies of industrial noise exposure. In general, auditory fatigue is a function of stimulus intensity, duration, frequency, and continuity. Thus, for a given dose of noise, obtained by integration of intensity and duration, intermittent exposure patterns are less harmful than continuous ones.

The severity of the TTS increases by approximately 6 dB for every doubling of stimulus intensity. Above a specific exposure intensity (the critical level), this rate increases, particularly if exposure is to impulse noise. The TTS increases asymptotically with exposure duration; the asymptote itself increases with stimulus intensity. Due to the characteristics of the outer and middle ears’ transfer function, low frequencies are tolerated the best.

Studies on exposure to pure tones indicate that as the stimulus intensity increases, the frequency at which the TTS is the greatest progressively shifts towards frequencies above that of the stimulus. Subjects exposed to a pure tone of 2,000 Hz develop TTS which is maximal at approximately 3,000 Hz (a shift of a semi-octave). The noise’s effect on the outer hair cells is believed to be responsible for this phenomenon.

The worker who shows TTS recovers to baseline hearing values within hours of removal from noise. However, repeated noise exposure results in progressively less recovery and, ultimately, in permanent hearing loss.

Permanent threshold shift. Exposure to high-intensity sound stimuli over several years may lead to permanent hearing loss. This is referred to as permanent threshold shift (PTS). Anatomically, PTS is characterized by degeneration of the hair cells, starting with slight histological modifications but eventually culminating in complete cell destruction. Hearing loss is most likely to involve the frequencies to which the ear is most sensitive, as it is at these frequencies that the transmission of acoustic energy from the external environment to the inner ear is optimal. This explains why hearing loss at 4,000 Hz is the first sign of occupationally induced hearing loss (figure 3). Interaction has been observed between stimulus intensity and duration, and international standards assume the degree of hearing loss to be a function of the total acoustic energy received by the ear (dose of noise).

Figure 3. Audiogram showing bilateral noise-induced hearing loss.


The development of noise-induced hearing loss shows individual susceptibility. Various potentially important variables have been examined to explain this susceptibility, such as age, gender, race, cardiovascular disease, smoking, etc. The data were inconclusive.

An interesting question is whether the amount of TTS could be used to predict the risk of PTS. As noted above, there is a progressive shift of the TTS to frequencies above the stimulation frequency. On the other hand, most of the ciliary damage occurring at high stimulus intensities involves cells that are sensitive to the stimulus frequency. Should exposure persist, the difference between the frequency at which the PTS is maximal and the stimulation frequency progressively decreases. Ciliary damage and cell loss consequently occur in the cells most sensitive to the stimulus frequencies. It thus appears that TTS and PTS involve different mechanisms, and that it is therefore impossible to predict an individual’s PTS on the basis of the observed TTS.

Individuals with PTS are usually asymptomatic initially. As the hearing loss progresses, they begin to have difficulty following conversations in noisy settings such as parties or restaurants. The progression, which usually affects the ability to perceive high-pitched sounds first, is usually painless and relatively slow.

Examination of individuals suffering from hearing loss

Clinical examination

In addition to the history of the date when the hearing loss was first detected (if any) and how it has evolved, including any asymmetry of hearing, the medical questionnaire should elicit information on the patient’s age, family history, use of ototoxic medications or exposure to other ototoxic chemicals, the presence of tinnitus (i.e., buzzing, whistling or ringing sounds in one or both ears), dizziness or any problems with balance, and any history of ear infections with pain or discharge from the outer ear canal. Of critical importance is a detailed life-long history of exposures to high sound levels (note that, to the layperson, not all sounds are “noise”) on the job, in previous jobs and off-the-job. A history of episodes of TTS would confirm prior toxic exposures to noise.

Physical examination should include evaluation of the function of the other cranial nerves, tests of balance, and ophthalmoscopy to detect any evidence of increased intracranial pressure. Visual examination of the external auditory canal will detect any impacted cerumen and, after it has been cautiously removed (no sharp objects!), any evidence of scarring or perforation of the tympanic membrane. Hearing loss can be determined very crudely by testing the patient’s ability to repeat words and phrases spoken softly or whispered by the examiner when positioned behind and out of the sight of the patient. The Weber test (placing a vibrating tuning fork in the centre of the forehead to determine whether the sound is “heard” in either or both ears) and the Rinné test (placing a vibrating tuning fork on the mastoid process until the patient can no longer hear the sound, then quickly placing the fork near the ear canal; normally the sound can be heard longer through air than through bone) will allow classification of the hearing loss as transmission or neurosensory.

The audiogram is the standard test to detect and evaluate hearing loss (see below). Specialized studies to complement the audiogram may be necessary in some patients. These include: tympanometry, word discrimination tests, evaluation of the attenuation reflex, electrophysiological studies (electrocochleogram, auditory evoked potentials) and radiological studies (routine skull X rays complemented by CAT scan or MRI).


Audiometry

This crucial component of the medical evaluation uses a device known as an audiometer to determine the auditory threshold of individuals to pure tones of 250-8,000 Hz and sound levels between –10 dB (the hearing threshold of intact ears) and 110 dB (maximal damage). To eliminate the effects of TTS, patients should not have been exposed to noise during the previous 16 hours. Air conduction is measured by earphones placed on the ears, while bone conduction is measured by placing a vibrator in contact with the skull behind the ear. Each ear’s hearing is measured separately, and test results are reported on a graph known as an audiogram (figure 3). The threshold of intelligibility, that is, the sound intensity at which speech becomes intelligible, is determined by a complementary test method known as vocal audiometry, based on the ability to understand words composed of two syllables of equal intensity (for instance, shepherd, dinner, stunning).

Comparison of air and bone conduction allows classification of hearing losses as transmission loss (involving the external auditory canal or middle ear) or neurosensory loss (involving the inner ear or auditory nerve) (figures 3 and 4). The audiogram observed in cases of noise-induced hearing loss is characterized by an onset of hearing loss at 4,000 Hz, visible as a dip in the audiogram (figure 3). As exposure to excessive noise levels continues, neighbouring frequencies are progressively affected and the dip broadens, encroaching, at approximately 3,000 Hz, on frequencies essential for the comprehension of conversation. Noise-induced hearing loss is usually bilateral and shows a similar pattern in both ears: that is, the difference between the two ears does not exceed 15 dB at 500, 1,000 and 2,000 Hz, and 30 dB at 3,000, 4,000 and 6,000 Hz. Asymmetric damage may, however, be present in cases of non-uniform exposure, for example, with marksmen, in whom hearing loss is greater on the side opposite the trigger finger (the left side, in a right-handed person). In hearing loss unrelated to noise exposure, the audiogram does not exhibit the characteristic 4,000 Hz dip (figure 4).

Figure 4. Examples of right-ear audiograms. The circles represent air-conduction hearing loss, the “<” symbols bone conduction.


There are two types of audiometric examinations: screening and diagnostic. Screening audiometry is used for the rapid examination of groups of individuals in the workplace, in schools or elsewhere in the community to identify those who appear to have some hearing loss. Often, electronic audiometers that permit self-testing are used and, as a rule, screening audiograms are obtained in a quiet area but not necessarily in a sound-proof, vibration-free chamber. The latter is considered to be a prerequisite for diagnostic audiometry which is intended to measure hearing loss with reproducible precision and accuracy. The diagnostic examination is properly performed by a trained audiologist (in some circumstances, formal certification of the competence of the audiologist is required). The accuracy of both types of audiometry depends on periodic testing and recalibration of the equipment being used.

In many jurisdictions, individuals with job-related, noise-induced hearing loss are eligible for workers’ compensation benefits. Accordingly, many employers are including audiometry in their preplacement medical examinations to detect any existing hearing loss that may be the responsibility of a previous employer or represent a non-occupational exposure.

Hearing thresholds progressively increase with age, with higher frequencies being more affected (figure 3). The characteristic 4,000 Hz dip observed in noise-induced hearing loss is not seen with this type of hearing loss.

Calculation of hearing loss

In the United States the most widely accepted formula for calculating functional limitation related to hearing loss is the one proposed in 1979 by the American Academy of Otolaryngology (AAO) and adopted by the American Medical Association. It is based on the average of the values obtained at 500, 1,000, 2,000 and 3,000 Hz (table 1), with the lower limit for functional limitation set at 25 dB.

Table 1. Typical calculation of functional loss from an audiogram

Frequency (Hz)  500    1,000  2,000  3,000  4,000  6,000  8,000
Right ear (dB)  25     35     35     45     50     60     45
Left ear (dB)   25     35     40     50     60     70     50


Unilateral loss
Percentage unilateral loss = [(average at 500, 1,000, 2,000 and 3,000 Hz) – 25 dB (lower limit)] × 1.5
Right ear: ([25 + 35 + 35 + 45]/4 – 25) × 1.5 = 15 (per cent)
Left ear: ([25 + 35 + 40 + 50]/4 – 25) × 1.5 = 18.8 (per cent)


Bilateral loss
Percentage of bilateral loss = [(percentage of unilateral loss of the better ear × 5) + (percentage of unilateral loss of the worse ear)]/6
Example: [(15 × 5) + 18.8]/6 = 15.6 (per cent)

Source: Rees and Duckert 1994.
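The AAO calculation illustrated in table 1 can be sketched as a short function (the function names are illustrative, not part of any standard library):

```python
def unilateral_loss_pct(thresholds_db):
    """Percentage unilateral loss from thresholds at 500, 1,000, 2,000 and 3,000 Hz."""
    avg = sum(thresholds_db) / len(thresholds_db)
    # 25 dB lower limit ("low fence"); 1.5 per cent per dB above it
    return max(0.0, (avg - 25.0) * 1.5)

def bilateral_loss_pct(better_pct, worse_pct):
    """Bilateral loss: the better ear is weighted 5:1 against the worse ear."""
    return (better_pct * 5 + worse_pct) / 6

right = unilateral_loss_pct([25, 35, 35, 45])  # 15.0 per cent
left = unilateral_loss_pct([25, 35, 40, 50])   # 18.75 per cent
print(round(bilateral_loss_pct(right, left), 1))  # 15.6 per cent
```

Running this with the audiogram values from table 1 reproduces the worked example: 15 per cent for the right ear, 18.8 per cent for the left, and 15.6 per cent bilateral loss.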


Presbycusis

Presbycusis, or age-related hearing loss, generally begins at about age 40 and progresses gradually with increasing age. It is usually bilateral. The characteristic 4,000 Hz dip observed in noise-induced hearing loss is not seen with presbycusis. However, it is possible to have the effects of ageing superimposed on noise-related hearing loss.


Treatment

The first essential of treatment is avoidance of any further exposure to potentially toxic levels of noise (see “Prevention” below). It is generally believed that no more subsequent hearing loss occurs after removal from noise exposure than would be expected from the normal ageing process.

While conduction losses, for example, those related to acute traumatic noise-induced damage, are amenable to medical treatment or surgery, chronic noise-induced hearing loss cannot be corrected by treatment. The use of a hearing aid is the sole “remedy” possible, and is only indicated when hearing loss affects the frequencies critical for speech comprehension (500 to 3,000 Hz). Other types of support, for example lip-reading and sound amplifiers (on telephones, for example), may, however, be possible.


Prevention

Because noise-induced hearing loss is permanent, it is essential to apply any measure likely to reduce exposure. This includes reduction at the source (quieter machines and equipment, or encasing them in sound-proof enclosures) and the use of individual protective devices such as ear plugs and/or ear muffs. If reliance is placed on the latter, it is imperative to verify that their manufacturers’ claims for effectiveness are valid and that exposed workers are using them properly at all times.

The designation of 85 dB(A) as the highest permissible occupational exposure limit was intended to protect the greatest number of people. But, since there is significant interpersonal variation, strenuous efforts to keep exposures well below that level are indicated. Periodic audiometry should be instituted as part of the medical surveillance programme to detect as early as possible any effects that may indicate noise toxicity.




Chemically-Induced Hearing Disorders

Hearing impairment due to the cochlear toxicity of several drugs is well documented (Rybak 1993). Until the last decade, however, little attention was paid to the audiological effects of industrial chemicals. Recent research on chemically-induced hearing disorders has focused on solvents, heavy metals and chemicals inducing anoxia.

Solvents. In studies with rodents, a permanent decrease in auditory sensitivity to high-frequency tones has been demonstrated following weeks of high-level exposure to toluene. Histopathological and auditory brainstem response studies have indicated a major effect on the cochlea, with damage to the outer hair cells. Similar effects have been found following exposure to styrene, xylenes or trichloroethylene. Carbon disulphide and n-hexane may also affect auditory functions, although their major effect seems to be on more central pathways (Johnson and Nylén 1995).

Several human cases of damage to the auditory system, together with severe neurological abnormalities, have been reported following solvent sniffing. In case series of persons with occupational exposure to solvent mixtures, to n-hexane or to carbon disulphide, both cochlear and central effects on auditory functions have been reported. Noise exposure was prevalent in these groups, but the effect on hearing has been considered greater than would be expected from the noise alone.

Only a few controlled studies have so far addressed hearing impairment in humans exposed to solvents in the absence of significant noise exposure. In a Danish study, a statistically significant elevated risk of self-reported hearing impairment (relative risk 1.4; 95% CI: 1.1-1.9) was found after exposure to solvents for five years or more. In a group exposed to both solvents and noise, no additional effect from solvent exposure was found. Good agreement between reported hearing problems and audiometric criteria for hearing impairment was found in a subsample of the study population (Jacobsen et al. 1993).
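For readers unfamiliar with the epidemiological figures quoted here, a relative risk with a confidence interval such as 1.4 (95% CI: 1.1-1.9) can be derived from group counts as sketched below. The counts are hypothetical, chosen only to reproduce a ratio of 1.4; they are not the Danish study's data.

```python
import math

def relative_risk(a, n1, b, n0):
    """Relative risk of an outcome given exposure, with an approximate
    95% CI on the log scale (Wald method). a of n1 exposed subjects and
    b of n0 unexposed subjects are cases."""
    rr = (a / n1) / (b / n0)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(+1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts (not from the study) that yield RR = 1.4:
rr, lo, hi = relative_risk(70, 500, 100, 1000)
print(f"RR = {rr:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```

A lower confidence bound above 1.0, as here, is what makes the elevated risk "statistically significant" at the 5% level.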

In a Dutch study of styrene-exposed workers a dose-dependent difference in hearing thresholds was found by audiometry (Muijser et al. 1988).

In a study from Brazil, the audiological effects of exposure to noise, to toluene combined with noise, and to mixed solvents were examined in workers in the printing and paint-manufacturing industries. Compared to an unexposed control group, significantly elevated risks of audiometric high-frequency hearing loss were found for all three exposure groups. For noise and mixed-solvent exposures the relative risks were 4 and 5 respectively. In the group with combined toluene and noise exposure a relative risk of 11 was found, suggesting interaction between the two exposures (Morata et al. 1993).

Metals. The effect of lead on hearing has been studied in surveys of children and teenagers from the United States. A significant dose-response association between blood lead and hearing thresholds at frequencies from 0.5 to 4 kHz was found after controlling for several potential confounders. The effect of lead was present across the entire range of exposure and could be detected at blood lead levels below 10 μg/100ml. In children without clinical signs of lead toxicity a linear relationship between blood lead and latencies of waves III and V in brainstem auditory potentials (BAEP) has been found, indicating a site of action central to the cochlear nucleus (Otto et al. 1985).

Hearing loss is described as a common part of the clinical picture in acute and chronic methyl-mercury poisoning. Both cochlear and postcochlear lesions have been involved (Oyanagi et al. 1989). Inorganic mercury may also affect the auditory system, probably through damage to cochlear structures.

Exposure to inorganic arsenic has been implicated in hearing disorders in children. A high frequency of severe hearing loss (>30 dB) has been observed in children fed powdered milk contaminated with inorganic arsenic V. In a study from Czechoslovakia, environmental exposure to arsenic from a coal-burning power plant was associated with audiometric hearing loss in ten-year-old children. In animal experiments, inorganic arsenic compounds have produced extensive cochlear damage (WHO 1981).

In acute trimethyltin poisoning, hearing loss and tinnitus have been early symptoms. Audiometry has shown pancochlear hearing loss between 15 and 30 dB at presentation. It is not clear whether the abnormalities have been reversible (Besser et al. 1987). In animal experiments, trimethyltin and triethyltin compounds have produced partly reversible cochlear damage (Clerisi et al. 1991).

Asphyxiants. In reports on acute human poisoning by carbon monoxide or hydrogen sulphide, hearing disorders have often been noted along with central nervous system disease (Ryback 1992).

In experiments with rodents, exposure to carbon monoxide had a synergistic effect with noise on auditory thresholds and cochlear structures. No effect was observed after exposure to carbon monoxide alone (Fechter et al. 1988).


Experimental studies have documented that several solvents can produce hearing disorders under certain exposure circumstances. Studies in humans have indicated that the effect may be present following exposures that are common in the occupational environment. Synergistic effects between noise and chemicals have been observed in some human and experimental animal studies. Some heavy metals may affect hearing, most of them only at exposure levels that produce overt systemic toxicity. For lead, minor effects on hearing thresholds have been observed at exposures far below occupational exposure limits. A specific ototoxic effect of asphyxiants has not been documented to date, although carbon monoxide may enhance the audiological effect of noise.



Thursday, 03 March 2011 19:34

Physically-Induced Hearing Disorders

By virtue of its position within the skull, the auditory system is generally well protected against injuries from external physical forces. There are, however, a number of physical workplace hazards that may affect it. They include:

Barotrauma. Sudden variation in barometric pressure (due to rapid underwater descent or ascent, or sudden aircraft descent) associated with malfunction of the Eustachian tube (failure to equalize pressure) may lead to rupture of the tympanic membrane with pain and haemorrhage into the middle and external ears. In less severe cases stretching of the membrane will cause mild to severe pain. There will be a temporary impairment of hearing (conductive loss), but generally the trauma has a benign course with complete functional recovery.

Vibration. Simultaneous exposure to vibration and noise (continuous or impact) does not increase the risk or severity of sensorineural hearing loss; however, the rate of onset appears to be increased in workers with hand-arm vibration syndrome (HAVS). The cochlear circulation is presumed to be affected by reflex sympathetic spasm, when such workers have bouts of vasospasm (Raynaud’s phenomenon) in their fingers or toes.

Infrasound and ultrasound. The acoustic energy from both of these sources is normally inaudible to humans. Common sources of ultrasound (for example, jet engines, high-speed dental drills, and ultrasonic cleaners and mixers) all emit audible sound as well, so the effects of the ultrasound itself on exposed subjects are not easily discernible. Ultrasound is presumed to be harmless below 120 dB and is therefore unlikely to cause NIHL. Likewise, low-frequency noise is relatively safe, but at high intensity (119-144 dB) hearing loss may occur.

“Welder’s ear”. Hot sparks may penetrate the external auditory canal to the level of the tympanic membrane, burning it. This causes acute ear pain and sometimes facial nerve paralysis. With minor burns, the condition requires no treatment, while in more severe cases, surgical repair of the membrane may be necessary. The risk may be avoided by correct positioning of the welder’s helmet or by wearing ear plugs.



Thursday, 03 March 2011 19:40


Balance System Function


Perception and control of the orientation and motion of the body in space are achieved by a system that involves simultaneous input from three sources: vision, the vestibular organ in the inner ear, and sensors in the muscles, joints and skin that provide somatosensory or “proprioceptive” information about movement of the body and physical contact with the environment (figure 1). The combined input is integrated in the central nervous system, which generates appropriate actions to restore and maintain balance, coordination and well-being. Failure of compensation in any part of the system may produce unease, dizziness and unsteadiness, and can lead to falls.

Figure 1.  An outline of the principal elements of the balance system


The vestibular system directly registers the orientation and movement of the head. The vestibular labyrinth is a tiny bony structure located in the inner ear, and comprises the semicircular canals, filled with fluid (endolymph), and the otoliths (figure 2). The three semicircular canals are positioned at right angles to one another, so that acceleration can be detected in each of the three possible planes of angular motion. During head turns, the relative movement of the endolymph within the canals (caused by inertia) results in deflection of the cilia projecting from the sensory cells, inducing a change in the neural signal from these cells (figure 2). The otoliths contain heavy crystals (otoconia) which respond to changes in the position of the head relative to the force of gravity and to linear acceleration or deceleration, again bending the cilia and so altering the signal from the sensory cells to which they are attached.
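The geometric idea that three mutually perpendicular canals suffice to sense any head rotation can be sketched numerically: each canal responds (approximately) to the component of head angular velocity about its own axis, and three orthogonal axes recover the full rotation vector. The axes below are idealized unit vectors, not the true anatomical canal orientations.

```python
# Idealized sketch: three mutually orthogonal semicircular canals,
# each sensing the component of head angular velocity about its axis.
# Axes are idealized unit vectors, not anatomical orientations.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

canal_axes = {
    "horizontal": (0.0, 0.0, 1.0),   # senses yaw (idealized)
    "anterior":   (1.0, 0.0, 0.0),   # idealized placement
    "posterior":  (0.0, 1.0, 0.0),   # idealized placement
}

head_rotation = (0.2, -0.5, 1.0)  # angular velocity in rad/s (made-up values)

# Each canal reports one orthogonal component of the rotation:
signals = {name: dot(axis, head_rotation) for name, axis in canal_axes.items()}
print(signals)
```

Because the axes are orthogonal, the three signals together uniquely determine the rotation, which is why damage to a single canal degrades sensing in one plane of motion.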




Figure 2. Schematic diagram of the vestibular labyrinth.



Figure 3. Schematic representation of the biomechanical effects of a ninety-degree (forward) inclination of the head.



The central interconnections within the balance system are extremely complex; information from the vestibular organs in both ears is combined with information derived from vision and the somatosensory system at various levels within the brainstem, cerebellum and cortex (Luxon 1984).


This integrated information provides the basis not only for the conscious perception of orientation and self-motion, but also for the preconscious control of eye movements and posture, by means of what are known as the vestibulo-ocular and vestibulospinal reflexes. The purpose of the vestibulo-ocular reflex is to maintain a stable point of visual fixation during head movement by automatically compensating for the head movement with an equivalent eye movement in the opposite direction (Howard 1982). The vestibulospinal reflexes contribute to postural stability and balance (Pompeiano and Allum 1988).
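The compensation performed by the vestibulo-ocular reflex can be summarized in one line: eye velocity is head velocity scaled by a gain of (ideally) one and reversed in sign, so the gaze direction stays fixed. A minimal sketch, with an assumed ideal gain:

```python
# Minimal sketch of the vestibulo-ocular reflex described above:
# the eyes counter-rotate at (ideally) the same speed as the head,
# keeping gaze constant. A gain of 1.0 is the ideal case.

def vor_eye_velocity(head_velocity, gain=1.0):
    """Compensatory eye velocity (opposite sign to head velocity)."""
    return -gain * head_velocity

head_v = 30.0                  # deg/s head turn to the right (example value)
eye_v = vor_eye_velocity(head_v)
gaze_drift = head_v + eye_v    # residual image motion on the retina
print(eye_v, gaze_drift)       # -30.0 0.0
```

A gain below 1.0, as can occur with vestibular damage, leaves a non-zero residual and hence a blurred or slipping visual image during head movement.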

Balance System Dysfunction

In normal circumstances, the input from the vestibular, visual and somatosensory systems is congruent, but if an apparent mismatch occurs between the different sensory inputs to the balance system, the result is a subjective sensation of dizziness, disorientation, or illusory sense of movement. If the dizziness is prolonged or severe it will be accompanied by secondary symptoms such as nausea, cold sweating, pallor, fatigue, and even vomiting. Disruption of reflex control of eye movements and posture may result in a blurred or flickering visual image, a tendency to veer to one side when walking, or staggering and falling. The medical term for the disorientation caused by balance system dysfunction is “vertigo,” which can be caused by a disorder of any of the sensory systems contributing to balance or by faulty central integration. Only 1 or 2% of the population consult their doctor each year on account of vertigo, but the incidence of dizziness and imbalance rises steeply with age. “Motion sickness” is a form of disorientation induced by artificial environmental conditions with which our balance system has not been equipped by evolution to cope, such as passive transport by car or boat (Crampton 1990).

Vestibular causes of vertigo

The most common causes of vestibular dysfunction are infection (vestibular labyrinthitis or neuronitis) and benign paroxysmal positional vertigo (BPPV), which is triggered principally by lying on one side. Recurrent attacks of severe vertigo accompanied by loss of hearing and noises (tinnitus) in one ear are typical of the syndrome known as Menière’s disease. Vestibular damage can also result from disorders of the middle ear (including bacterial disease, trauma and cholesteatoma), from ototoxic drugs (which should be used only in medical emergencies), and from head injury.

Non-vestibular peripheral causes of vertigo

Disorders of the neck, which may alter the somatosensory information relating to head movement or interfere with the blood-supply to the vestibular system, are believed by many clinicians to be a cause of vertigo. Common aetiologies include whiplash injury and arthritis. Sometimes unsteadiness is related to a loss of feeling in the feet and legs, which may be caused by diabetes, alcohol abuse, vitamin deficiency, damage to the spinal cord, or a number of other disorders. Occasionally the origin of feelings of giddiness or illusory movement of the environment can be traced to some distortion of the visual input. An abnormal visual input may be caused by weakness of the eye muscles, or may be experienced when adjusting to powerful lenses or to bifocal glasses.

Central causes of vertigo

Although most cases of vertigo are attributable to peripheral (mainly vestibular) pathology, symptoms of disorientation can be caused by damage to the brainstem, cerebellum or cortex. Vertigo due to central dysfunction is almost always accompanied by some other symptom of central neurological disorder, such as sensations of pain, tingling or numbness in the face or limbs, difficulty speaking or swallowing, headache, visual disturbances, and loss of motor control or loss of consciousness. The more common central causes of vertigo include disorders of the blood supply to the brain (ranging from migraine to strokes), epilepsy, multiple sclerosis, alcoholism, and occasionally tumours. Temporary dizziness and imbalance is a potential side-effect of a vast array of drugs, including widely-used analgesics, contraceptives, and drugs used in the control of cardiovascular disease, diabetes and Parkinson’s disease, and in particular the centrally-acting drugs such as stimulants, sedatives, anti-convulsants, anti-depressants and tranquillizers (Ballantyne and Ajodhia 1984).

Diagnosis and treatment

All cases of vertigo require medical attention in order to ensure that the (relatively uncommon) dangerous conditions which can cause vertigo are detected and appropriate treatment is given. Medication can be given to relieve symptoms of acute vertigo in the short term, and in rare cases surgery may be required. However, if the vertigo is caused by a vestibular disorder the symptoms will generally subside over time as the central integrators adapt to the altered pattern of vestibular input—in the same way that sailors continuously exposed to the motion of waves gradually acquire their “sea legs”. For this to occur, it is essential to continue to make vigorous movements which stimulate the balance system, even though these will at first cause dizziness and discomfort. Since the symptoms of vertigo are frightening and embarrassing, sufferers may need physiotherapy and psychological support to combat the natural tendency to restrict their activities (Beyts 1987; Yardley 1994).

Vertigo in the Workplace

Risk factors

Dizziness and disorientation, which may become chronic, are common symptoms in workers exposed to organic solvents; furthermore, long-term exposure can result in objective signs of balance system dysfunction (e.g., abnormal vestibulo-ocular reflex control) even in people who experience no subjective dizziness (Gyntelberg et al. 1986; Möller et al. 1990). Changes in pressure encountered when flying or diving can cause damage to the vestibular organ which results in sudden vertigo and hearing loss requiring immediate treatment (Head 1984). There is some evidence that noise-induced hearing loss can be accompanied by damage to the vestibular organs (van Dijk 1986). People who work for long periods at computer screens sometimes complain of dizziness; the cause remains unclear, although it may be related to the combination of a stiff neck and moving visual input.

Occupational difficulties

Unexpected attacks of vertigo, such as occur in Menière’s disease, can cause problems for people whose work involves heights, driving, handling dangerous machinery, or responsibility for the safety of others. An increased susceptibility to motion sickness is a common effect of balance system dysfunction and may interfere with travel.


Equilibrium is maintained by a complex multisensory system, and so disorientation and imbalance can result from a wide variety of aetiologies, in particular any condition which affects the vestibular system or the central integration of perceptual information for orientation. In the absence of central neurological damage the plasticity of the balance system will normally enable the individual to adapt to peripheral causes of disorientation, whether these are disorders of the inner ear which alter vestibular function, or environments which provoke motion sickness. However, attacks of dizziness are often unpredictable, alarming and disabling, and rehabilitation may be necessary to restore confidence and assist the balance function.



Thursday, 03 March 2011 19:52

Vision and Work

Anatomy of the Eye

The eye is a sphere (Graham et al. 1965; Adler 1992), approximately 20 mm in diameter, that is set in the bony orbit, with the six extrinsic (ocular) muscles that move the eye attached to the sclera, its external wall (figure 1). In front, the sclera is replaced by the cornea, which is transparent. Behind the cornea, in the anterior chamber, is the iris, which regulates the diameter of the pupil, the opening through which the optic axis passes. The back of the anterior chamber is formed by the biconvex crystalline lens, whose curvature is determined by the ciliary muscles, attached at the front to the sclera and behind to the choroidal membrane, which lines the posterior chamber. The posterior chamber is filled with the vitreous humour, a clear, gelatinous liquid. The choroid, the inner surface of the posterior chamber, is black to prevent interference with visual acuity caused by internal light reflections.

Figure 1.  Schematic representation of the eye.

The eyelids help to maintain a film of tears, produced by the lacrymal glands, which protects the anterior surface of the eye. Blinking facilitates the spread of tears and their emptying into the lacrymal canal, which empties into the nasal cavity. The frequency of blinking, which is used as a test in ergonomics, varies greatly depending on the activity being undertaken (for example, it is slower during reading) and also on the lighting conditions (the rate of blinking is lowered by an increase of illumination).

The anterior chamber contains two muscles: the sphincter of the iris, which contracts the pupil, and the dilator, which widens it. When a bright light is directed toward a normal eye, the pupil contracts (pupillary reflex). It also contracts when viewing a nearby object.

The retina has several inner layers of nerve cells and an outer layer containing two types of photoreceptor cells, the rods and cones. Thus, light passes through the nerve cells to the rods and cones where, in a manner not yet understood, it generates impulses in the nerve cells which pass along the optic nerve to the brain. The cones, numbering four to five million, are responsible for the perception of bright images and of colour. They are concentrated in the inner portion of the retina, most densely at the fovea, a small depression at the centre of the retina where there are no rods and where vision is most acute. With the help of spectrophotometry, three types of cones have been identified, whose absorption peaks lie in the yellow, green and blue zones, accounting for the sense of colour. The 80 to 100 million rods become more and more numerous toward the periphery of the retina and are sensitive to dim light (night vision). They also play a major role in black-and-white vision and in the detection of motion.

The nerve fibres, along with the blood vessels which nourish the retina, traverse the choroid, the middle of the three layers forming the wall of the posterior chamber, and leave the eye as the optic nerve at a point somewhat off-centre, which, because there are no photoreceptors there, is known as the “blind spot.”

The retinal vessels, the only arteries and veins that can be viewed directly, can be visualized by directing a light through the pupil and using an ophthalmoscope to focus on their image (the images can also be photographed). Such retinoscopic examinations, part of the routine medical examination, are important in evaluating the vascular components of such diseases as arteriosclerosis, hypertension and diabetes, which may cause retinal haemorrhages and/or exudates that may cause defects in the field of vision.

Properties of the Eye that Are Important for Work

Mechanism of accommodation

In the emmetropic (normal) eye, as light rays pass through the cornea, the pupil and the lens, they are focused on the retina, producing an inverted image which is reversed by the visual centres in the brain.

When a distant object is viewed, the lens is flattened. When viewing nearby objects, the lens accommodates (i.e., increases its power): contraction of the ciliary muscles allows it to take on a more oval, convex shape. At the same time, the iris constricts the pupil, which improves the quality of the image by reducing the spherical and chromatic aberrations of the system and by increasing the depth of field.

In binocular vision, accommodation is necessarily accompanied by proportional convergence of both eyes.

The visual field and the field of fixation

The visual field (the space covered by the eyes at rest) is limited by anatomical obstacles in the horizontal plane (more reduced on the side towards the nose) and in the vertical plane (limited by the upper edge of the orbit). In binocular vision, the horizontal field is about 180 degrees and the vertical field 120 to 130 degrees. In daytime vision, most visual functions are weakened at the periphery of the visual field; by contrast, perception of movement is improved there. In night vision there is a considerable loss of acuity at the centre of the visual field, where, as noted above, the rods are less numerous.

The field of fixation extends beyond the visual field thanks to the mobility of the eyes, head and body; in work activities it is the field of fixation that matters. The causes of reduction of the visual field, whether anatomical or physiological, are very numerous: narrowing of the pupil; opacity of the lens; pathological conditions of the retina, visual pathways or visual centres; the brightness of the target to be perceived; the frames of spectacles for correction or protection; the movement and speed of the target to be perceived; and others.

Visual acuity

“Visual acuity (VA) is the capacity to discriminate the fine details of objects in the field of view. It is specified in terms of the minimum dimension of some critical aspects of a test object that a subject can correctly identify” (Riggs, in Graham et al. 1965). In other words, good visual acuity is the ability to distinguish fine details; visual acuity defines the limit of spatial discrimination.

The retinal size of an object depends not only on its physical size but also on its distance from the eye; it is therefore expressed in terms of the visual angle (usually in minutes of arc). Visual acuity is the reciprocal of this angle.
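This relation can be sketched directly: the visual angle of a detail of a given size at a given distance follows from simple trigonometry, and decimal acuity is the reciprocal of the smallest resolved angle in minutes of arc. The 1.75 mm and 6 m figures below are our illustrative choice (a detail subtending about 1 arcmin), not values from the text.

```python
import math

def visual_angle_arcmin(size, distance):
    """Angle subtended by an object of the given size at the given
    distance (both in the same units), in minutes of arc."""
    return math.degrees(2 * math.atan(size / (2 * distance))) * 60

# Illustrative values: a 1.75 mm detail viewed at 6 m subtends ~1 arcmin.
angle = visual_angle_arcmin(0.00175, 6.0)
print(round(angle, 2))      # ~1.0 arcmin

# Decimal acuity is the reciprocal of the resolved angle in arcmin:
print(round(1 / angle, 2))  # ~1.0 (i.e., "normal" acuity)
```

Doubling the viewing distance halves the angle, which is why acuity must always be stated together with the test distance.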

Riggs (1965) describes several types of “acuity task”. In clinical and occupational practice, the recognition task, in which the subject is required to name the test object and locate some details of it, is the most commonly applied. For convenience, in ophthalmology, visual acuity is measured relative to a value called “normal” using charts presenting a series of objects of different sizes; they have to be viewed at a standard distance.

In clinical practice, Snellen charts are the most widely used tests of distant visual acuity. A series of test objects is used in which the size and broad shape of the characters are designed so that their critical details subtend an angle of 1 minute of arc at a standard distance, which varies from country to country (in the United States, 20 feet between the chart and the tested individual; in most European countries, 6 metres). The normal Snellen score is thus 20/20. Larger test objects, which subtend an angle of 1 minute of arc at greater distances, are also provided.

The visual acuity of an individual is given by the relation VA = D′/D, where D′ is the standard viewing distance and D the distance at which the smallest test object correctly identified by the individual subtends an angle of 1 minute of arc. For example, a person’s VA is 20/30 if, at a viewing distance of 20 feet, he or she can just identify an object which subtends an angle of 1 minute of arc at 30 feet.
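The formula VA = D′/D can be checked with a few lines of code; the function name is ours, and the values reproduce the article's 20/30 example.

```python
def snellen_va(test_distance, reference_distance):
    """VA = D'/D: D' is the standard viewing distance, D the distance at
    which the smallest correctly identified object subtends 1 minute of
    arc. Returned as a decimal acuity value."""
    return test_distance / reference_distance

# The article's example: tested at 20 ft, smallest identified object
# subtends 1 arcmin at 30 ft, giving Snellen 20/30:
print(round(snellen_va(20, 30), 2))  # 0.67
print(snellen_va(6, 6))              # 1.0 (normal acuity, 6/6 on metric charts)
```

Note that 20/20 and 6/6 are the same decimal acuity (1.0); only the test distance convention differs between countries.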

In optometric practice, the objects are often letters of the alphabet (or familiar shapes, for illiterates or children). However, when the test is repeated, charts should present unlearnable characters, for which recognition involves no educational or cultural factors. This is one reason why it is nowadays internationally recommended to use Landolt rings, at least in scientific studies. Landolt rings are circles with a gap, the directional position of which has to be identified by the subject.

Except in ageing people and in individuals with accommodative defects (presbyopia), far and near visual acuity parallel each other. Most jobs require both good far vision (without accommodation) and good near vision. Snellen charts of different kinds are also available for near vision (figures 2 and 3). The Snellen chart shown in figure 3 should be held at 16 inches (40 cm) from the eye; in Europe, similar charts exist for a reading distance of 30 cm (the appropriate distance for reading a newspaper).

Figure 2. Example of a Snellen chart: Landolt rings (acuity in decimal values; reading distance not specified).


Figure 3. Example of a Snellen chart: Sloan letters for measuring near vision (40 cm) (acuity in decimal values and in distance equivalents).


With the widespread use of visual display units (VDUs), however, there is increased interest in occupational health in testing operators at a longer distance (60 to 70 cm, according to Krueger (1992)) in order to correct VDU operators properly.

Vision testers and visual screening

For occupational practice, several types of visual testers with similar features are available on the market: the Orthorater, Visiotest, Ergovision, Titmus Optimal C Tester, C45 Glare Tester, Mesoptometer, Nyctometer and so on.

They are small and independent of the lighting of the testing room, having their own internal lighting. They provide several tests, such as far and near binocular and monocular visual acuity (most of the time with unlearnable characters), as well as depth perception, rough colour discrimination, muscular balance and so on. Near visual acuity can be measured, sometimes at both short and intermediate distances of the test object. The most recent of these devices make extensive use of electronics to provide automatically recorded scores for the different tests. Moreover, these instruments can be handled by non-medical personnel after some training.

Vision testers are designed for the purpose of pre-recruitment screening of workers, or sometimes later testing, taking into account the visual requirements of their workplace. Table 1 indicates the level of visual acuity needed to fulfil unskilled to highly skilled activities, when using one particular testing device (Fox, in Verriest and Hermans 1976).


Table 1. Visual requirements for different activities when using Titmus Optimal C Tester, with correction


Category 1: Office work

Far visual acuity 20/30 in each eye (20/25 for binocular vision)

Near VA 20/25 in each eye (20/20 for binocular vision)

Category 2: Inspection and other activities in fine mechanics

Far VA 20/35 in each eye (20/30 for binocular vision)

Near VA 20/25 in each eye (20/20 for binocular vision)

Category 3: Operators of mobile machinery

Far VA 20/25 in each eye (20/20 for binocular vision)

Near VA 20/35 in each eye (20/30 for binocular vision)

Category 4: Machine tool operations

Far and near VA 20/30 in each eye (20/25 for binocular vision)

Category 5: Unskilled workers

Far VA 20/30 in each eye (20/25 for binocular vision)

Near VA 20/35 in each eye (20/30 for binocular vision)

Category 6: Foremen

Far VA 20/30 in each eye (20/25 for binocular vision)

Near VA 20/25 in each eye (20/20 for binocular vision)

Source: According to Fox in Verriest and Hermans 1975.



Manufacturers recommend that employees be tested while wearing their corrective glasses. Fox (1965), however, stresses that such a procedure may lead to wrong results: workers may, for example, be tested with glasses whose prescription is out of date at the time of the measurement, or lenses may be worn out by exposure to dust or other noxious agents. It is also very often the case that people come to the testing room with the wrong glasses. Fox (1976) suggests, therefore, that if “the corrected vision is not improved to 20/20 level for distance and near, referral should be made to an ophthalmologist for a proper evaluation and refraction for the current need of the employee on his job”. Other deficiencies of vision testers are referred to later in this article.

Factors influencing visual acuity

VA meets its first limitation in the structure of the retina. In daytime vision it may exceed 10/10 at the fovea and declines rapidly a few degrees away from the centre of the retina. In night vision, acuity is very poor or nil at the centre but may reach one tenth at the periphery, because of the distribution of cones and rods (figure 4).

Figure 4. Density of cones and rods in the retina as compared with the relative visual acuity in the corresponding visual field.


The diameter of the pupil affects visual performance in a complex manner. When dilated, the pupil allows more light to enter the eye and stimulate the retina, and blur due to diffraction of the light is minimized. A narrower pupil, however, reduces the negative effects of the lens aberrations mentioned above. In general, a pupil diameter of 3 to 6 mm favours clear vision.

Thanks to the process of adaptation, the human being can see as well by moonlight as by full sunshine, even though the difference in illumination is of the order of 1 to 10,000,000. Visual sensitivity covers so wide a range that luminous intensity is plotted on a logarithmic scale.

On entering a dark room we are at first completely blind; then the objects around us become perceptible. As the light level is increased, we pass from rod-dominated vision to cone-dominated vision. The accompanying change in sensitivity is known as the Purkinje shift. The dark-adapted retina is mainly sensitive to low luminosity, but is characterized by the absence of colour vision and poor spatial resolution (low VA); the light-adapted retina is not very sensitive to low luminosity (objects have to be well illuminated in order to be perceived), but is characterized by a high degree of spatial and temporal resolution and by colour vision. After the desensitization induced by intense light stimulation, the eye recovers its sensitivity according to a typical progression: at first a rapid change involving cones and daylight or photopic adaptation, followed by a slower phase involving rods and night or scotopic adaptation; the intermediate zone involves dim light or mesopic adaptation.
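The two-branch recovery described here (a fast cone phase followed by a slower rod phase) is often modelled as a sum of two decaying exponentials. The sketch below uses that standard form; the amplitudes and time constants are purely illustrative, not measured values.

```python
import math

# Toy model of the dark-adaptation curve: threshold elevation as a
# fast cone branch plus a slow rod branch. Amplitudes (log units) and
# time constants (minutes) below are illustrative assumptions.

def threshold_log_units(t_minutes, cone_amp=2.0, cone_tau=1.5,
                        rod_amp=4.0, rod_tau=12.0):
    """Elevation of the detection threshold (in log units) t minutes
    after intense light exposure; lower values mean greater sensitivity."""
    return (cone_amp * math.exp(-t_minutes / cone_tau)
            + rod_amp * math.exp(-t_minutes / rod_tau))

for t in (0, 2, 10, 30):
    print(t, "min:", round(threshold_log_units(t), 2), "log units")
```

In this form the cone branch dominates the first few minutes and the rod branch governs the slow tail, mirroring the photopic-then-scotopic progression described in the text.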

In the work environment, night adaptation is hardly relevant except for activities in a dark room and for night driving (although the reflection on the road from headlights always brings some light). Simple daylight adaptation is the most common in industrial or office activities, provided either by natural or by artificial lighting. However, nowadays with emphasis on VDU work, many workers like to operate in dim light.

In occupational practice, the behaviour of groups of people is particularly important (in comparison with individual evaluation) when selecting the most appropriate design of workplaces. The results of a study of 780 office workers in Geneva (Meyer et al. 1990) show the shift in the percentage distribution of acuity levels when lighting conditions are changed. Once adapted to daylight, most of the tested workers (with eye correction) reach quite a high visual acuity. As soon as the surrounding illumination level is reduced, mean VA decreases, and the results also become more spread out, with some people performing very poorly; this tendency is aggravated when dim light is accompanied by a disturbing glare source (figure 5). In other words, it is very hard to predict a subject’s behaviour in dim light from his or her score in optimal daylight conditions.

Figure 5. Percentage distribution of tested office workers’ visual acuity.


Glare. When the eyes are directed from a dark area to a lighted area and back again, or when the subject looks for a moment at a lamp or window (luminance varying from 1,000 to 12,000 cd/m2), changes in adaptation concern a limited area of the visual field (local adaptation). Recovery time after disabling glare may last several seconds, depending on the illumination level and contrast (Meyer et al. 1986) (figure 6).

Figure 6. Response time before and after exposure to glare for perceiving the gap of a Landolt ring: Adaptation to dim light.


Afterimages. Local disadaptation is usually accompanied by the persistence of the image of a bright spot, coloured or not, which produces a veil or masking effect (the afterimage). Afterimages have been studied very extensively to better understand certain visual phenomena (Brown in Graham et al. 1965). After visual stimulation has ceased, the effect remains for some time; this persistence explains, for example, why a flickering light may be perceived as continuous when its frequency is high enough (see below), and why car lights at night are seen as streaks of light. Afterimages are produced in the dark after viewing an illuminated spot; they are also produced by coloured areas, leaving coloured images. This is why VDU operators may experience sharp afterimages after looking at the screen for a prolonged time and then moving their eyes towards another area in the room.

Afterimages are very complicated. For example, one experiment on afterimages found that a blue spot appears white during the first seconds of observation, then pink after 30 seconds, and then bright red after a minute or two. Another experiment showed that an orange-red field appeared momentarily pink, then within 10 to 15 seconds passed through orange and yellow to a bright green appearance which remained throughout the whole observation. When the point of fixation moves, usually the afterimage moves too (Brown in Graham et al. 1965). Such effects could be very disturbing to someone working with a VDU.

Diffused light emitted by glare sources also has the effect of reducing the object/background contrast (veiling effect) and thus reducing visual acuity (disability glare). Ergophthalmologists also describe discomfort glare, which does not reduce visual acuity but causes uncomfortable or even painful sensation (IESNA 1993).

The level of illumination at the workplace must be adapted to the level required by the task. If all that is required is to perceive shapes in an environment of stable luminosity, weak illumination may be adequate; but as soon as it is a question of seeing fine details that require increased acuity, or if the work involves colour discrimination, retinal illumination must be markedly increased.

Table 2 gives recommended illuminance values for the lighting design of a few workstations in different industries (IESNA 1993).

Table 2. Recommended illuminance values for the lighting design of a few workstations

Cleaning and pressing industry
Dry and wet cleaning and steaming 500-1,000 lux or 50-100 footcandles
Inspection and spotting 2,000-5,000 lux or 200-500 footcandles
Repair and alteration 1,000-2,000 lux or 100-200 footcandles
Dairy products, fluid milk industry
Bottle storage 200-500 lux or 20-50 footcandles
Bottle washers 200-500 lux or 20-50 footcandles
Filling, inspection 500-1,000 lux or 50-100 footcandles
Laboratories 500-1,000 lux or 50-100 footcandles
Electrical equipment, manufacturing
Impregnating 200-500 lux or 20-50 footcandles
Insulating coil winding 500-1,000 lux or 50-100 footcandles
Electricity-generating stations
Air-conditioning equipment, air preheater 50-100 lux or 5-10 footcandles
Auxiliaries, pumps, tanks, compressors 100-200 lux or 10-20 footcandles
Clothing industry
Examining (perching) 10,000-20,000 lux or 1,000-2,000 footcandles
Cutting 2,000-5,000 lux or 200-500 footcandles
Pressing 1,000-2,000 lux or 100-200 footcandles
Sewing 2,000-5,000 lux or 200-500 footcandles
Piling up and marking 500-1,000 lux or 50-100 footcandles
Sponging, decating, winding 200-500 lux or 20-50 footcandles
Banks
General 100-200 lux or 10-20 footcandles
Writing area 200-500 lux or 20-50 footcandles
Tellers’ stations 500-1,000 lux or 50-100 footcandles
Dairy farms
Haymow area 20-50 lux or 2-5 footcandles
Washing area 500-1,000 lux or 50-100 footcandles
Feeding area 100-200 lux or 10-20 footcandles
Foundries
Core-making: fine 1,000-2,000 lux or 100-200 footcandles
Core-making: medium 500-1,000 lux or 50-100 footcandles
Moulding: medium 1,000-2,000 lux or 100-200 footcandles
Moulding: large 500-1,000 lux or 50-100 footcandles
Inspection: fine 1,000-2,000 lux or 100-200 footcandles
Inspection: medium 500-1,000 lux or 50-100 footcandles

Source: IESNA 1993.
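The table gives each range in both lux and footcandles. The exact relation is 1 footcandle (one lumen per square foot) ≈ 10.764 lux; the table rounds this to a factor of 10 for convenience. A minimal conversion sketch (the helper names are my own):

```python
# Illuminance unit conversion: 1 footcandle = 1 lumen/ft^2 ~= 10.764 lux.
# The table above uses the rounded factor of 10.
LUX_PER_FOOTCANDLE = 10.764

def footcandles_to_lux(fc: float) -> float:
    """Convert an illuminance from footcandles to lux."""
    return fc * LUX_PER_FOOTCANDLE

def lux_to_footcandles(lux: float) -> float:
    """Convert an illuminance from lux to footcandles."""
    return lux / LUX_PER_FOOTCANDLE

# 50 fc converts exactly to about 538 lux; the table rounds it to 500 lux.
print(round(footcandles_to_lux(50)))
```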


Brightness contrast and spatial distribution of luminances at the workplace. From the point of view of ergonomics, the ratio between luminances of the test object, its immediate background and the surrounding area has been widely studied, and recommendations on this subject are available for different requirements of the task (see Verriest and Hermans 1975; Grandjean 1987).

The object-background contrast is commonly defined by the formula (Lf – Lo)/Lf, where Lo is the luminance of the object and Lf the luminance of the background. For a dark object on a lighter background, it thus varies from 0 to 1.
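With the symbols above, the calculation can be sketched as follows (the luminance values are illustrative assumptions, chosen so that black print on white paper gives a contrast of about 0.9):

```python
def object_background_contrast(l_object: float, l_background: float) -> float:
    """Contrast C = (Lf - Lo)/Lf for object luminance Lo and background
    luminance Lf; for a dark object on a lighter background, 0 <= C <= 1."""
    return (l_background - l_object) / l_background

# Assumed luminances: black ink ~8 cd/m2 on white paper ~80 cd/m2
# gives a contrast of 0.9 (90%).
print(object_background_contrast(8.0, 80.0))  # -> 0.9
```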

As shown by figure 7, visual acuity increases with the level of illumination (as previously said) and with the object-background contrast (Adrian 1993). This effect is particularly marked in young people. A large light background and a dark object thus provide the best efficiency. However, in real life, contrast will never reach unity. For example, when a black letter is printed on a white sheet of paper, the object-background contrast reaches a value of only around 90%.

Figure 7. Relationship between visual acuity of a dark object perceived on a background receiving increasing illumination for four contrast values.


In the most favourable situation—that is, in positive presentation (dark letters on a light background)—acuity and contrast are linked, so that visibility can be improved by acting on either factor—for example, by increasing the size of the letters or their darkness, as in Fortuin’s table (in Verriest and Hermans 1975). When video display units appeared on the market, letters or symbols were presented on the screen as light spots on a dark background. Later, new screens were developed which displayed dark letters on a light background. Many studies were conducted to verify whether this presentation improved vision. The results of most experiments show without any doubt that visual acuity is enhanced when reading dark letters on a light background; a dark screen, of course, favours reflections from glare sources.

The functional visual field is defined by the relationship between the luminosity of the surfaces actually perceived by the eye at the workstation and those of the surrounding areas. Care must be taken not to create too great differences of luminosity in the visual field; according to the size of the surfaces involved, changes in general or local adaptation occur which cause discomfort in the execution of the task. Moreover, it is recognized that in order to achieve good performance, the contrasts in the field must be such that the task area is more illuminated than its immediate surroundings, and that the far areas are darker.

Time of presentation of the object. The capacity to detect an object depends directly on the quantity of light entering the eye, which is linked to the luminous intensity of the object, its surface qualities and the time during which it is presented (as in tests of tachistoscopic presentation). A reduction in acuity occurs when the duration of presentation is less than 100 to 500 ms.

Movements of the eye or of the target. Loss of performance occurs particularly when the eye jerks; nevertheless, total stability of the image is not required in order to attain maximum resolution. But it has been shown that vibrations such as those of construction site machines or tractors can adversely affect visual acuity.

Diplopia. Visual acuity is higher in binocular than in monocular vision. Binocular vision requires optical axes that both meet at the object being looked at, so that the image falls on corresponding areas of the retina in each eye. This is made possible by the activity of the external (extraocular) muscles. If their coordination fails, more or less transitory double images may appear, for example in excessive visual fatigue, and may cause annoying sensations (Grandjean 1987).

In short, the discriminating power of the eye depends on the type of object to be perceived and the luminous environment in which it is measured; in the medical consulting room, conditions are optimal: high object-background contrast, direct daylight adaptation, characters with sharp edges, presentation of the object without a time limit, and a certain redundancy of signals (e.g., several letters of the same size on a Snellen chart). Moreover, visual acuity determined for diagnostic purposes is a maximal, one-time measurement made in the absence of accommodative fatigue. Clinical acuity is thus a poor reference for the visual performance attained on the job. What is more, good clinical acuity does not necessarily mean the absence of discomfort at work, where conditions of individual visual comfort are rarely attained. At most workplaces, as stressed by Krueger (1992), objects to be perceived are blurred and of low contrast, background luminances are unequally scattered with many glare sources producing veiling and local adaptation effects, and so on. According to our own calculations, clinical results do not carry much predictive value of the amount and nature of visual fatigue encountered, for example, in VDU work. A more realistic laboratory set-up, in which conditions of measurement were closer to task requirements, did somewhat better (Rey and Bousquet 1990; Meyer et al. 1990).

Krueger (1992) is right when claiming that ophthalmological examination is not really appropriate in occupational health and ergonomics, that new testing procedures should be developed or extended, and that existing laboratory set-ups should be made available to the occupational practitioner.

Relief Vision, Stereoscopic Vision

Binocular vision allows a single image to be obtained by means of synthesis of the images received by the two eyes. Analogies between these images give rise to the active cooperation that constitutes the essential mechanism of the sense of depth and relief. Binocular vision has the additional property of enlarging the field, improving visual performance generally, relieving fatigue and increasing resistance to glare and dazzle.

When fusion of the images from the two eyes is insufficient, ocular fatigue may appear earlier.

Without achieving the efficiency of binocular vision in appreciating the relief of relatively near objects, the sensation of relief and the perception of depth are nevertheless possible with monocular vision by means of phenomena that do not require binocular disparity. We know that the size of objects does not change; that is why apparent size plays a part in our appreciation of distance; thus retinal images of small size will give the impression of distant objects, and vice versa (apparent size). Near objects tend to hide more distant objects (this is called interposition). The brighter one of two objects, or the one with a more saturated colour, seems to be nearer. The surroundings also play a part: more distant objects are lost in mist. Two parallel lines seem to meet at infinity (this is the perspective effect). Finally, if two targets are moving at the same speed, the one whose speed of retinal displacement is slower will appear farther from the eye.

In fact, monocular vision does not constitute a major obstacle in the majority of work situations. The subject needs to get accustomed to the narrowing of the visual field and also to the rather exceptional possibility that the image of the object may fall on the blind spot. (In binocular vision the same image never falls on the blind spot of both eyes at the same time.) It should also be noted that good binocular vision is not necessarily accompanied by relief (stereoscopic) vision, since this also depends on complex nervous system processes.

For all these reasons, regulations for the need of stereoscopic vision at work should be abandoned and replaced by a thorough examination of individuals by an eye doctor. Such regulations or recommendations exist nevertheless and stereoscopic vision is supposed to be necessary for such tasks as crane driving, jewellery work and cutting-out work. However, we should keep in mind that new technologies may modify deeply the content of the task; for example, modern computerized machine-tools are probably less demanding in stereoscopic vision than previously believed.

As far as driving is concerned, regulations are not necessarily similar from country to country. In table 3 (overleaf), French requirements for driving either light or heavy vehicles are mentioned. The American Medical Association guidelines are the appropriate reference for American readers. Fox (1973) mentions that, for the US Department of Transportation in 1972, drivers of commercial motor vehicles should have a distant VA of at least 20/40, with or without corrective glasses; a field of vision of at least 70 degrees is needed in each eye. Ability to recognize the colours of the traffic lights was also required at that time, but today in most countries traffic lights can be distinguished not only by colour but also by shape.

Table 3. Visual requirements for a driving licence in France

Visual acuity (with eyeglasses)
For light vehicles At least 6/10th for both eyes with at least 2/10th in the worse eye
For heavy vehicles VA with both eyes of 10/10th with at least 6/10th in the worse eye
Visual field
For light vehicles No licence if peripheral reduction in candidates with one eye or with the second eye having a visual acuity of less than 2/10th
For heavy vehicles Complete integrity of both visual fields (no peripheral reduction, no scotoma)
Nystagmus (spontaneous eye movements)
For light vehicles No licence if binocular visual acuity of less than 8/10th
For heavy vehicles No defects of night vision are acceptable


Eye Movements

Several types of eye movements are described whose objective is to allow the eye to take advantage of all the information contained in the images. The system of fixation allows us to maintain the object in place at the level of the foveolar receptors, where it can be examined in the retinal region with the highest power of resolution. Nevertheless, the eyes are constantly subject to micromovements (tremor). Saccades (particularly studied during reading) are intentionally induced rapid movements whose aim is to displace the gaze from one detail of the motionless object to another; the brain does not perceive the resulting displacement of the image across the retina as movement. Unanticipated retinal image motion, by contrast, does give rise to an illusion of movement, which is met in pathological conditions of the central nervous system or the vestibular organ. Pursuit movements are partially voluntary when they involve the tracking of relatively small objects, but become rather irrepressible when very large objects are concerned. Several mechanisms for suppressing images (including during saccades) allow the retina to prepare to receive new information.

Illusions of movement (autokinetic movements) of a luminous point or a motionless object, such as the apparent movement of a bridge over a watercourse, are explained by retinal persistence and by conditions of vision that are not integrated into our central system of reference. The after-effect may be merely a simple error of interpretation of a luminous message (sometimes harmful in the working environment) or may result in serious neurovegetative disturbances. The illusions caused by static figures are well known. Movements in reading are discussed elsewhere in this chapter.

Flicker Fusion and de Lange Curve

When the eye is exposed to a succession of short stimuli, it first experiences flicker and then, as the frequency increases, has the impression of stable luminosity: this is the critical fusion frequency. If the stimulating light fluctuates in a sinusoidal manner, the subject may experience fusion at all frequencies below the critical frequency provided the depth of modulation of this light is sufficiently reduced. All these thresholds can then be joined in a curve, first described by de Lange, which can be altered by changing the nature of the stimulation: the curve is depressed when the luminance of the flickering area is reduced or when the contrast between the flickering spot and its surroundings decreases; similar changes of the curve can be observed in retinal pathologies or in the after-effects of cranial trauma (Meyer et al. 1971) (figure 8).

Figure 8. Flicker-fusion curves connecting the frequency of intermittent luminous stimulation and its amplitude of modulation at threshold (de Lange’s curves), average and standard deviation, in 43 patients suffering from cranial trauma and 57 controls (dotted line).


Therefore one must be cautious when claiming to interpret a fall in critical flicker fusion in terms of work-induced visual fatigue.
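The sinusoidal stimulation underlying the de Lange curve is conventionally characterized by its modulation depth, m = (Lmax – Lmin)/(Lmax + Lmin). A minimal sketch under that standard (Michelson) definition, with helper names and example values of my own:

```python
import math

def modulation_depth(l_max: float, l_min: float) -> float:
    """Michelson modulation m = (Lmax - Lmin)/(Lmax + Lmin), from 0
    (steady light) to 1 (full on-off flicker)."""
    return (l_max - l_min) / (l_max + l_min)

def flicker_luminance(l_mean: float, m: float, freq_hz: float, t_s: float) -> float:
    """Instantaneous luminance of a field flickering sinusoidally around
    its mean: L(t) = Lmean * (1 + m * sin(2*pi*f*t))."""
    return l_mean * (1.0 + m * math.sin(2.0 * math.pi * freq_hz * t_s))

# A field of mean luminance 100 cd/m2 modulated at m = 0.2 swings
# between 80 and 120 cd/m2:
print(modulation_depth(120.0, 80.0))  # -> 0.2
```

Plotting the threshold value of m against flicker frequency gives the de Lange curve discussed above.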

Occupational practice should make better use of flickering light to detect slight retinal damage or dysfunction (e.g., an enhancement of the curve can be observed in slight intoxication, followed by a drop when the intoxication becomes greater); this testing procedure, which does not alter retinal adaptation and does not require eye correction, is also very useful for following functional recovery during and after treatment (Meyer et al. 1983) (figure 9).

Figure 9. De Lange’s curve in a young man absorbing ethambutol; the effect of treatment can be deduced by comparing the flicker sensitivity of the subject before and after treatment.


Colour Vision

The sensation of colour is connected with the activity of the cones and therefore exists only in the case of daylight (photopic range of light) or mesopic (middle range of light) adaptation. In order for the system of colour analysis to function satisfactorily, the luminance of the perceived objects must be at least 10 cd/m2. Generally speaking, three colour sources, the so-called primary colours—red, green and blue—suffice to reproduce a whole spectrum of colour sensations. In addition, a phenomenon of induction of colour contrast is observed between two colours which mutually reinforce each other: the green-red pair and the yellow-blue pair.

The two theories of colour sensation, the trichromatic (Young–Helmholtz) theory and the opponent-colour (Hering) theory, are not exclusive; the first appears to apply at the level of the cones and the second at more central levels of the visual system.

To understand the perception of coloured objects against a luminous background, other concepts need to be used. The same colour may in fact be produced by different types of radiation. To reproduce a given colour faithfully, it is therefore necessary to know the spectral composition of the light sources and the spectral reflectance of the pigments. The colour rendering index used by lighting specialists allows the selection of fluorescent tubes appropriate to the requirements. Our eyes have developed the faculty of detecting very slight changes in the tonality of a surface obtained by changing its spectral distribution; the spectral colours (the eye can distinguish more than 200) recreated by mixtures of monochromatic light represent only a small proportion of the possible colour sensations.

The importance of anomalies of colour vision in the work environment should thus not be exaggerated, except in activities where colours must be correctly identified, such as inspecting the appearance of products or work as a decorator. Moreover, even in electricians’ work, size, shape or other markers may replace colour.

Anomalies of colour vision may be congenital or acquired (degenerations). In abnormal trichromats, the change may affect the basic red sensation (Dalton type), the green, or the blue (the rarest anomaly). In dichromats, the system of three basic colours is reduced to two. In deuteranopia, it is the basic green that is lacking. In protanopia, it is the basic red that disappears; although less frequent, this anomaly deserves attention in the work environment because it is accompanied by a loss of luminosity in the range of reds; in particular, the use of red signs should be avoided, especially if they are not well lit. It should also be noted that these colour vision defects can be found in various degrees in the so-called normal subject; hence the need for caution in using too many colours. It should also be kept in mind that only broad colour defects are detectable with vision testers.

Refractive Errors

The near point (Weymouth 1966) is the shortest distance at which an object can be brought into sharp focus; the farthest such distance is the far point. For the normal (emmetropic) eye, the far point is situated at infinity. For the myopic eye, the image of a distant object is formed in front of the retina, and the far point lies at a finite distance in front of the eye; this excess of refractive power is corrected by means of concave lenses. For the hyperopic (hypermetropic) eye, the far point is situated behind the retina; this lack of refractive power is corrected by means of convex lenses (figure 10). In light hyperopia, the defect is spontaneously compensated by accommodation and may go unnoticed by the individual. In myopes who are not wearing their spectacles, the loss of accommodation can be compensated for by the fact that the far point is nearer.

Figure 10. Schematic representation of refractive errors and their correction.
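These corrections are conventionally expressed in diopters, the reciprocal of a distance in metres. The relations below are standard optics rather than something stated in the text, and the helper names and example values are my own:

```python
def myopia_correction_diopters(far_point_m: float) -> float:
    """Power (in negative diopters) of the concave lens that moves a
    myopic eye's far point back to infinity: P = -1 / far point (m)."""
    return -1.0 / far_point_m

def accommodation_amplitude(near_point_m: float,
                            far_point_m: float = float("inf")) -> float:
    """Amplitude of accommodation in diopters: A = 1/near - 1/far
    (distances in metres; 1/inf = 0 for an emmetropic far point)."""
    return 1.0 / near_point_m - 1.0 / far_point_m

# An emmetropic eye with its near point at 10 cm (typical at age 10)
# has about 10 D of accommodation:
print(accommodation_amplitude(0.10))

# A myopic eye with its far point at 50 cm needs roughly a -2 D lens:
print(myopia_correction_diopters(0.50))
```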


In the ideal eye, the surface of the cornea should be perfectly spherical; however, our eyes show differences in curvature in different axes (this is called astigmatism); refraction is stronger when the curvature is more accentuated, and the result is that rays emerging from a luminous point do not form a precise image on the retina. These defects, when pronounced, are corrected by means of cylindrical lenses (see lowest diagram in figure 10, overleaf); in irregular astigmatism, contact lenses are recommended. Astigmatism becomes particularly troublesome during night driving or in work on a screen, that is, in conditions where light signals stand out on a dark background or when using a binocular microscope.

Contact lenses should not be used at workstations where air is too dry or in case of dusts and so on (Verriest and Hermans 1975).

In presbyopia, which is due to loss of elasticity of the lens with age, it is the amplitude of accommodation that is reduced—that is, the distance between the far and near points; the near point (about 10 cm at the age of 10 years) moves further away with age. The correction is made by means of unifocal or multifocal convergent lenses; multifocal lenses correct for progressively nearer distances of the object (usually down to 30 cm), taking into account that nearer objects are generally perceived in the lower part of the visual field, while the upper part of the spectacles is reserved for distance vision. New lenses, different from the usual type, are now proposed for work at VDUs. The so-called progressive lenses blur the limits between the correction zones. Progressive lenses require a longer period of adaptation than other types of lenses, because their field of sharp vision is narrow (see Krueger 1992).

When the visual task requires alternating far and near vision, bifocal, trifocal or even progressive lenses are recommended. However, it should be kept in mind that the use of multifocal lenses can create important modifications in the posture of an operator. For example, VDU operators whose presbyopia is corrected by means of bifocal lenses tend to extend the neck and may suffer cervical and shoulder pain. Spectacle manufacturers therefore propose progressive lenses of different kinds. Another remedy is the ergonomic improvement of VDU workplaces, to avoid placing the screen too high.

Detecting refractive errors (which are very common in the working population) is not independent of the type of measurement. Snellen charts fixed on a wall do not necessarily give the same results as the various kinds of apparatus in which the image of the object is projected onto a near background. In fact, in a vision tester (see above), it is difficult for the subject to relax accommodation, particularly as the axis of vision is lower; this is known as “instrumental myopia”.

Effects of Age

With age, as already explained, the lens loses its elasticity, with the result that the near point moves farther away and the power of accommodation is reduced. Although the loss of accommodation with age can be compensated for by means of spectacles, presbyopia is a real public health problem. Kauffman (in Adler 1992) estimates its cost, in terms of means of correction and loss of productivity, to be of the order of tens of billions of dollars annually for the United States alone. In developing countries we have seen workers obliged to give up work (in particular the making of silk saris) because they are unable to buy spectacles. Moreover, when protective glasses need to be used, it is very expensive to offer both correction and protection. It should be remembered that the amplitude of accommodation declines even in the second decade of life (and perhaps even earlier) and that it disappears completely by the age of 50 to 55 years (Meyer et al. 1990) (figure 11).

Figure 11. Near point measured with the rule of Clement and Clark, percentage distribution of 367 office workers aged 18-35 years (below) and 414 office workers aged 36-65 years (above).


Other phenomena due to age also play a part: the sinking of the eye into the orbit, which occurs in very old age and varies more or less according to individuals, reduces the size of the visual field (because of the eyelid). Dilation of the pupil is at its maximum in adolescence and then declines; in older people, the pupil dilates less and the reaction of the pupil to light slows down. Loss of transparency of the media of the eye reduces visual acuity (some media have a tendency to become yellow, which modifies colour vision) (see Verriest and Hermans 1976). Enlargement of the blind spot results in the reduction of the functional visual field.

With age and illness, changes are observed in the retinal vessels, with consequent functional loss. Even the movements of the eye are modified; there is a slowing down and reduction in amplitude of the exploratory movements.

Older workers are at a double disadvantage in conditions of weak contrast and weak luminosity of the environment; first, they need more light to see an object, but at the same time they benefit less from increased luminosity because they are dazzled more quickly by glare sources. This handicap is due to changes in the transparent media which allow less light to pass and increase its diffusion (the veil effect described above). Their visual discomfort is aggravated by too sudden changes between strongly and weakly lighted areas (slowed pupil reaction, more difficult local adaptation). All these defects have a particular impact in VDU work, and it is very difficult, indeed, to provide good illumination of workplaces for both young and older operators; it can be observed, for example, that older operators will reduce by all possible means the luminosity of the surrounding light, although dim light tends to decrease their visual acuity.



Risks to the Eye at Work

These risks may be expressed in different ways (Rey and Meyer 1981; Rey 1991): by the nature of the causal agent (physical agent, chemical agents, etc.), by the route of penetration (cornea, sclera, etc.), by the nature of the lesions (burns, bruises, etc.), by the seriousness of the condition (limited to the outer layers, affecting the retina, etc.) and by the circumstances of the accident (as for any physical injury); these descriptive elements are useful in devising preventive measures. Only the eye lesions and circumstances most frequently encountered in the insurance statistics are mentioned here. Let us stress that workers’ compensation can be claimed for most eye injuries.

Eye conditions caused by foreign bodies

These conditions are seen particularly among turners, polishers, foundry workers, boilermakers, masons and quarrymen. The foreign bodies may be inert substances such as sand, irritant metals such as iron or lead, or animal or vegetable organic materials (dusts). This is why, in addition to the eye lesions, complications such as infections and intoxications may occur if the amount of substance introduced into the organism is sufficiently large. Lesions produced by foreign bodies will of course be more or less disabling, depending on whether they remain in the outer layers of the eye or penetrate deeply into the bulb; treatment will thus be quite different and sometimes requires the immediate transfer of the victim to the eye clinic.

Burns of the eye

Burns are caused by various agents: flashes or flames (during a gas explosion); molten metal (the seriousness of the lesion depends on the melting point, with metals melting at higher temperature causing more serious damage); and chemical burns due, for example, to strong acids and bases. Burns due to boiling water, electrical burns and many others also occur.

Injuries due to compressed air

These are very common. Two phenomena play a part: the force of the jet itself (and the foreign bodies accelerated by the air flow); and the shape of the jet, a less concentrated jet being less harmful.

Eye conditions caused by radiation

Ultraviolet (UV) radiation

The source of the rays may be the sun or certain lamps. The degree of penetration into the eye (and consequently the danger of the exposure) depends on the wavelength. Three zones have been defined by the International Lighting Commission: UVC (280 to 100 nm) rays are absorbed at the level of the cornea and conjunctiva; UVB (315 to 280 nm) are more penetrating and reach the anterior segment of the eye; UVA (400 to 315 nm) penetrate still further.

For welders the characteristic effects of exposure have been described, such as acute keratoconjunctivitis, chronic photo-ophthalmia with decreased vision, and so on. The welder is subjected to a considerable amount of visible light, and it is essential that the eyes be protected with adequate filters. Snowblindness, a very painful condition for workers in mountains, needs to be avoided by wearing appropriate sunglasses.

Infrared radiation

Infrared rays are situated between the visible rays and the shortest radio waves. They begin, according to the International Lighting Commission, at 750 nm. Their penetration into the eye depends on their wavelength; the shortest infrared rays can reach the lens and even the retina. Their effect on the eye is due to their heating (calorific) action. The characteristic condition is found in those who blow glass facing the furnace. Other workers, such as blast-furnace workers, suffer from thermal irradiation with various clinical effects (such as keratoconjunctivitis or membranous thickening of the conjunctiva).

LASER (Light amplification by stimulated emission of radiation)

The wavelength of the emission depends on the type of laser—visible light, ultraviolet and infrared radiation. It is principally the quantity of energy projected that determines the level of the danger incurred.

Ultraviolet rays cause inflammatory lesions; infrared rays can cause caloric lesions; but the greatest risk is destruction of retinal tissue by the beam itself, with loss of vision in the affected area.

Radiation from cathode screens

The emissions from the cathode-ray screens commonly used in offices (x rays, ultraviolet, infrared and radiofrequency radiation) all lie below the international standards. There is no evidence of any relationship between video display terminal work and the onset of cataract (Rubino 1990).

Harmful substances

Certain solvents, such as the esters and aldehydes (formaldehyde being very widely used), are irritating to the eyes. The inorganic acids, whose corrosive action is well known, cause tissue destruction and chemical burns on contact. The organic acids are also dangerous, and alcohols are irritants. Caustic soda, an extremely strong base, is a powerful corrosive that attacks the eyes and the skin. The list of harmful substances also includes certain plastic materials (Grant 1979) as well as allergenic dusts and other substances such as exotic woods, feathers and so on.

Finally, infectious occupational diseases can be accompanied by effects on the eyes.

Protective glasses

The wearing of individual protection (glasses and masks) can itself obstruct vision: visual acuity is reduced when the glasses lose transparency after being struck by projected foreign bodies, and the sidepieces of glasses create obstacles in the visual field. Workplace hygiene therefore also relies on other means, such as extraction of dust and dangerous particles from the air by general ventilation.

The occupational physician is frequently called upon to advise on the quality of glasses adapted to the risk; national and international directives will guide this choice. Moreover, better goggles are now available, which include improvements in efficacy, comfort and even aesthetics.

In the United States, for example, reference can be made to ANSI standards (particularly ANSI Z87.1-1979) that have the force of law under the federal Occupational Safety and Health Act (Fox 1973). ISO Standard No. 4007-1977 refers also to protective devices. In France, recommendations and protective material are available from the INRS in Nancy. In Switzerland, the national insurance company CNA provides rules and procedures for extraction of foreign bodies at the workplace. For serious damage, it is preferable to send the injured worker to the eye doctor or the eye clinic.

Finally, people with pre-existing eye pathologies may be more at risk than others; a full discussion of this controversial problem goes beyond the scope of this article. As stated above, their eye doctor should be aware of the dangers they may encounter at the workplace and monitor them carefully.


At the workplace, most information and signals are visual in nature, although acoustic signals may play a role; nor should we forget the importance of tactile signals in manual work as well as in office work (for example, the feedback from a keyboard).

Our knowledge of the eye and vision comes mostly from two sources: medical and scientific. For the purpose of diagnosis of eye defects and diseases, techniques have been developed which measure visual functions; these procedures may not be the most effective for occupational testing purposes. Conditions of medical examination are indeed very far from those which are encountered at the workplace; for example, to determine visual acuity the eye doctor will make use of charts or instruments where contrast between test object and background is the highest possible, where the edges of test objects are sharp, where no disturbing glare sources are perceptible and so on. In real life, lighting conditions are often poor and visual performance is under stress for several hours.

This emphasizes the need to utilize laboratory apparatus and instrumentation which display a higher predictive power for visual strain and fatigue at the workplace.

Many of the scientific experiments reported in textbooks were performed for a better theoretical understanding of the visual system, which is very complex. References in this article have been limited to that knowledge which is immediately useful in occupational health.

While pathological conditions may impede some people in fulfilling the visual requirements of a job, it seems safer and fairer (apart from highly demanding jobs with their own regulations, such as aviation) to give the eye doctor the power of decision rather than to refer to general rules; and this is how most countries operate. Guidelines are available for more information.

On the other hand, the eye is exposed at the workplace to various noxious agents, both physical and chemical; the hazards for the eye in industry are briefly enumerated above. On current scientific knowledge, no risk of developing cataract is to be expected from work at a VDU.





The three chemosensory systems, smell, taste and the common chemical sense, require direct stimulation by chemicals for sensory perception. Their role is to monitor constantly both harmful and beneficial inhaled and ingested chemical substances. Irritating or tingling properties are detected by the common chemical sense. The taste system perceives only sweet, salty, sour, bitter and possibly metallic and monosodium glutamate (umami) tastes. The totality of the oral sensory experience is termed “flavour”: the interaction of smell, taste, irritation, texture and temperature. Because most flavour is derived from the smell, or aroma, of food and beverages, damage to the smell system is often reported as a problem with “taste”. Verifiable taste deficits are more likely to be present if specific losses of sweet, sour, salty and bitter sensations are described.

Chemosensory complaints are frequent in occupational settings and may result from a normal sensory system perceiving environmental chemicals. Conversely, they may also indicate an injured system: requisite contact with chemical substances renders these sensory systems uniquely vulnerable to damage (see table 1). In the occupational setting, these systems can also be damaged by trauma to the head as well as by agents other than chemicals (e.g., radiation). Taste disorders are either temporary or permanent: complete or partial taste loss (ageusia or hypogeusia), heightened taste (hypergeusia) and distorted or phantom tastes (dysgeusia) (Deems, Doty and Settle 1991; Mott, Grushka and Sessle 1993).

Table 1. Agents/processes reported to alter the taste system

| Agent/process | Taste disturbance | References |
|---|---|---|
| | Metallic taste | Siblerud 1990; see text |
| Dental restorations/appliances | Metallic taste | See text |
| Diving (dry saturation) | Sweet, bitter; salt, sour | See text |
| Diving and welding | Metallic taste | See text |
| | | See text |
| | Sweet dysgeusia | Schweisfurth and Schottes 1993 |
| | Hypogeusia, “glue” dysgeusia | Hotz et al. 1992 |
| Lead poisoning | Sweet/metallic dysgeusia | Kachru et al. 1989 |
| Metals and metal fumes (also, some specific metals listed in this table) | | See text; Shusterman and Sheedy 1992 |
| | Metallic taste | Pfeiffer and Schwickerath 1991 |
| | Bitter/metallic dysgeusia | |
| | Increased DT & RT | |
| | Metallic taste | Bedwal et al. 1993 |
| | “Funny taste”, H | |
| Sulphuric acid mists | “Bad taste” | Petersen and Gormsen 1991 |
| Underwater welding | Metallic taste | See text |
| | Metallic taste | Nemery 1990 |

DT = detection threshold, RT = recognition threshold, * = Mott & Leopold 1991, + = Schiffman & Nagle 1992.
Specific taste disturbances are as stated in the articles referenced.

The taste system is sustained by regenerative capability and redundant innervation. Because of this, clinically notable taste disorders are less common than olfactory disorders. Taste distortions are more common than significant taste loss and, when present, are more likely to have secondary adverse effects such as anxiety and depression. Taste loss or distortion can interfere with occupational performance where keen taste acuity is required, such as culinary arts and blending of wines and spirits.

Anatomy and Physiology

Taste receptor cells, found throughout the oral cavity, the pharynx, the larynx and the oesophagus, are modified epithelial cells located within the taste buds. While on the tongue taste buds are grouped in superficial structures called papillae, extralingual taste buds are distributed within the epithelium. The superficial placement of taste cells makes them susceptible to injury. Damaging agents usually come in contact with the mouth through ingestion, although mouth breathing associated with nasal obstruction or other conditions (e.g., exercise, asthma) allows oral contact with airborne agents. The taste receptor cell’s average ten-day life span permits rapid recovery if superficial damage to receptor cells has occurred. Also, taste is innervated by four pairs of peripheral nerves: the front of the tongue by the chorda tympani branch of the seventh cranial nerve (CN VII); the posterior of the tongue and the pharynx by the glossopharyngeal nerve (CN IX); the soft palate by the greater superficial petrosal branch of CN VII; and the larynx/oesophagus by the vagus (CN X). Last, taste central pathways, although not completely mapped in humans (Ogawa 1994), appear more divergent than olfactory central pathways.

The first step in taste perception involves interaction between chemicals and taste receptor cells. The four taste qualities, sweet, sour, salty and bitter, enlist different mechanisms at the level of the receptor (Kinnamon and Getchell 1991), ultimately generating action potentials in taste neurons (transduction).

Tastants diffuse through salivary secretions and also mucus secreted around taste cells to interact with the surface of taste cells. Saliva ensures that tastants are carried to the buds, and provides an optimal ionic environment for perception (Spielman 1990). Alterations in taste can be demonstrated with changes in the inorganic constituents of saliva. Most taste stimuli are water soluble and diffuse easily; others require soluble carrier proteins for transport to the receptor. Salivary output and composition, therefore, play an essential role in taste function.

Salt taste is stimulated by cations such as Na+, K+ or NH4+. Most salty stimuli are transduced when ions travel through a specific type of sodium channel (Gilbertson 1993), although other mechanisms may also be involved. Changes in the composition of taste pore mucus or the taste cell’s environment could alter salt taste. Also, structural changes in nearby receptor proteins could modify receptor membrane function. Sour taste corresponds to acidity. Blockade of specific sodium channels by hydrogen ions elicits sour taste. As with salt taste, however, other mechanisms are thought to exist. Many chemical compounds are perceived as bitter, including cations, amino acids, peptides and larger compounds. Detection of bitter stimuli appears to involve more diverse mechanisms that include transport proteins, cation channels, G proteins and second messenger mediated pathways (Margolskee 1993). Salivary proteins may be essential in transporting lipophilic bitter stimuli to the receptor membranes. Sweet stimuli bind to specific receptors linked to G protein-activated second-messenger systems. There is also some evidence in mammals that sweet stimuli can gate ion channels directly (Gilbertson 1993).

Taste Disorders

General Concepts

The anatomic diversity and redundancy of the taste system are sufficiently protective to prevent total, permanent taste loss. Loss of a few peripheral taste fields, for example, would not be expected to affect whole-mouth taste ability (Mott, Grushka and Sessle 1993). The taste system may be far more vulnerable to taste distortion or phantom tastes: dysgeusias appear to be more common in occupational exposures than taste losses per se. Although taste is thought to be more robust than smell with respect to the ageing process, losses in taste perception with ageing have been documented.

Temporary taste losses can occur when the oral mucosa has been irritated. Theoretically, this can result in inflammation of the taste cells, closure of taste pores or altered function at the surface of taste cells. Inflammation can alter blood flow to the tongue, thereby affecting taste. Salivary flow can also be compromised: irritants can cause swelling and obstruct salivary ducts, and toxicants absorbed and excreted through the salivary glands could damage ductal tissue during excretion. Either of these processes could cause long-term oral dryness with resultant taste effects. Exposure to toxicants could alter the turnover rate of taste cells, modify the taste channels at the surface of the taste cell, or change the internal or external chemical environments of the cells. Many substances are known to be neurotoxic and could injure peripheral taste nerves directly or damage higher taste pathways in the brain.


Pesticides

Pesticide use is widespread, and contamination occurs as residues in meat, vegetables, milk, rain and drinking water. Although workers exposed during the manufacture or use of pesticides are at greatest risk, the general population is also exposed. Important pesticides include the organochlorine, organophosphate and carbamate compounds. Organochlorine compounds are highly stable and therefore persist in the environment for lengthy periods; direct toxic effects on central neurons have been demonstrated. Organophosphate pesticides are more widely used because they are not as persistent, but they are more toxic; inhibition of acetylcholinesterase can cause neurological and behavioural abnormalities. Carbamate pesticide toxicity is similar to that of the organophosphorus compounds, and carbamates are often used when the latter fail.

Pesticide exposures have been associated with persistent bitter or metallic tastes (Schiffman and Nagle 1992), unspecified dysgeusia (Ciesielski et al. 1994) and, less commonly, with taste loss. Pesticides can reach taste receptors via air, water and food, and can be absorbed through the skin, gastrointestinal tract, conjunctiva and respiratory tract. Because many pesticides are lipid soluble, they easily penetrate lipid membranes within the body. Interference with taste can occur peripherally irrespective of the route of initial exposure; in mice, binding of certain insecticides to the tongue has been seen after injection of pesticide material into the bloodstream. Alterations in taste bud morphology after pesticide exposure have been demonstrated. Degenerative changes in the sensory nerve terminations have also been noted and may account for reports of abnormalities of neural transmission. Metallic dysgeusia may be a sensory paraesthesia caused by the impact of pesticides on taste buds and their afferent nerve endings.

There is some evidence, however, that pesticides can interfere with neurotransmitters and thereby disrupt transmission of taste information more centrally (El-Etri et al. 1992). Workers exposed to organophosphate pesticides can demonstrate neurological abnormalities on electroencephalography and neuropsychological testing independent of cholinesterase depression in the bloodstream, and it is thought that these pesticides have a neurotoxic effect on the brain independent of their effect upon cholinesterase. Although increased salivary flow has been reported in association with pesticide exposure, it is unclear what effect this might have on taste.

Metals and metal fume fever

Alterations of taste have occurred after exposure to certain metals and metallic compounds including mercury, copper, selenium, tellurium, cyanide, vanadium, cadmium, chromium and antimony. Metallic taste has also been noted by workers exposed to the fumes of zinc or copper oxide, from the ingestion of copper salt in poisoning cases, or from exposure to emissions resulting from the use of torches for cutting of brass piping.

Exposure to freshly formed fumes of metal oxides can result in a syndrome known as metal fume fever (Gordon and Fine 1993). Although zinc oxide is most frequently cited, this disorder has also been reported after exposure to oxides of other metals, including copper, aluminium, cadmium, lead, iron, magnesium, manganese, nickel, selenium, silver, antimony and tin. The syndrome was first noted in brass foundry workers, but is now most common in welding of galvanized steel or during galvanization of steel. Within hours after exposure, throat irritation and a sweet or metallic dysgeusia may herald more generalized symptoms of fever, shaking chills, and myalgia. Other symptoms, such as cough or headache, may also occur. The syndrome is notable for both rapid resolution (within 48 hours) and development of tolerance upon repeated exposures to the metal oxide. A number of possible mechanisms have been suggested, including immune system reactions and a direct toxic effect on respiratory tissue, but it is now thought that lung exposure to metal fumes results in release of specific mediators into the blood stream, called cytokines, that cause the physical symptoms and findings (Blanc et al. 1993). A more severe, potentially fatal, variant of metal fume fever occurs after exposure to zinc chloride aerosol in military screening smoke bombs (Blount 1990). Polymer fume fever is similar to metal fume fever in presentation, with the exception of the absence of metallic taste complaints (Shusterman 1992).

In lead poisoning cases, sweet metallic tastes are often described. In one report, silver jewellery workers with confirmed lead toxicity exhibited taste alterations (Kachru et al. 1989). The workers were exposed to lead fumes by heating jewellers’ silver waste in workshops which had poor exhaust systems. The vapours condensed on the skin and hair of the workers and also contaminated their clothing, food and drinking water.

Underwater welding

Divers describe oral discomfort, loosening of dental fillings and metallic taste during electrical welding and cutting underwater. In a study by Örtendahl, Dahlen and Röckert (1985), 55% of 118 divers working under water with electrical equipment described metallic taste. Divers without this occupational history did not describe metallic taste. Forty divers were recruited into two groups for further evaluation; the group with underwater welding and cutting experience had significantly more evidence of dental amalgam breakdown. Initially, it was theorized that intraoral electrical currents erode dental amalgam, releasing metal ions which have direct effects on taste cells. Subsequent data, however, demonstrated intraoral electrical activity of insufficient magnitude to erode dental amalgam, but of sufficient magnitude to directly stimulate taste cells and cause metallic taste (Örtendahl 1987; Frank and Smith 1991). Divers may be vulnerable to taste changes without welding exposure; differential effects on taste quality perception have been documented, with decreased sensitivity to sweet and bitter and increased sensitivity to salty and sour tastants (O’Reilly et al. 1977).

Dental restorations and oral galvanism

In a large prospective, longitudinal study of dental restorations and appliances, approximately 5% of subjects reported a metallic taste at any given time (Participants of SCP Nos. 147/242 & Morris 1990). Frequency of metallic taste was higher with a history of teeth grinding; with fixed partial dentures than with crowns; and with an increased number of fixed partial dentures. Interactions between dental amalgams and the oral environment are complex (Marek 1992) and could affect taste through a variety of mechanisms. Metals that bind to proteins can acquire antigenicity (Nemery 1990) and might cause allergic reactions with subsequent taste alterations. Soluble metal ions and debris are released and may interact with the soft tissues of the oral cavity. Metallic taste has been reported to correlate with nickel solubility in saliva from dental appliances (Pfeiffer and Schwickerath 1991). Metallic taste was reported by 16% of subjects with dental fillings and by none of the subjects without fillings (Siblerud 1990). In a related study of subjects who had amalgam removed, metallic taste improved or abated in 94% (Siblerud 1990).

Oral galvanism, a controversial diagnosis (Council on Dental Materials report 1987), describes the generation of oral currents from either corrosion of dental amalgam restorations or electrochemical differences between dissimilar intraoral metals. Patients considered to have oral galvanism appear to have a high frequency of dysgeusia (63%), described as metallic, battery, unpleasant or salty tastes (Johansson, Stenman and Bergman 1984). Theoretically, taste cells could be directly stimulated by intraoral electric currents and generate dysgeusia. Subjects with symptoms of oral burning, battery taste, metallic taste and/or oral galvanism were determined to have lower electrogustometric thresholds (i.e., more sensitive taste) on taste testing than control subjects (Axéll, Nilner and Nilsson 1983). Whether galvanic currents related to dental materials are causative is debatable, however. A brief tin-foil taste shortly after restorative work is thought to be possible, but more permanent effects are probably unlikely (Council on Dental Materials 1987). Yontchev, Carlsson and Hedegård (1987) found similar frequencies of metallic taste or oral burning in subjects with these symptoms whether or not there was contact between dental restorations. Alternative explanations for taste complaints in patients with restorations or appliances are sensitivity to mercury, cobalt, chromium, nickel or other metals (Council on Dental Materials 1987), other intraoral processes (e.g., periodontal disease), xerostomia, mucosal abnormalities, medical illnesses and medication side effects.

Drugs and medications

Many drugs and medications have been linked to taste alterations (Frank, Hettinger and Mott 1992; Mott, Grushka and Sessle 1993; Della Fera, Mott and Frank 1995; Smith and Burtner 1994), and they are mentioned here because of possible occupational exposures during their manufacture. Broad classes reported to affect taste include antibiotic, anticonvulsant, antilipidaemic, antineoplastic, psychiatric, antiparkinsonian, antithyroid, antiarthritic, cardiovascular and dental hygiene drugs.

The presumed site of action of drugs on the taste system varies. Often the drug itself is tasted directly during oral administration, or the drug or its metabolites are tasted after being excreted in saliva. Many drugs, for example anticholinergics or some antidepressants, cause oral dryness and affect taste through inadequate presentation of the tastant to the taste cells via saliva. Some drugs may affect taste cells directly. Because taste cells have a high turnover rate, they are especially vulnerable to drugs that interrupt protein synthesis, such as antineoplastic drugs. It has also been thought that there may be an effect on impulse transmission through the taste nerves or in the ganglion cells, or a change in the processing of the stimuli in higher taste centres. Metallic dysgeusia has been reported with lithium, possibly through transformations in receptor ion channels. Antithyroid drugs and angiotensin-converting enzyme inhibitors (e.g., captopril and enalapril) are well-known causes of taste alterations, possibly because of the presence of a sulphydryl (-SH) group (Mott, Grushka and Sessle 1993). Other drugs with -SH groups (e.g., methimazole, penicillamine) also cause taste abnormalities. Drugs that affect neurotransmitters could potentially alter taste perception.

Mechanisms of taste alterations vary, however, even within a class of drug. For example, taste alterations after treatment with tetracycline may be caused by oral mycosis. Alternatively, an increased blood urea nitrogen, associated with the catabolic effect of tetracycline, can sometimes result in a metallic or ammonia-like taste.

Side effects of metronidazole include alteration of taste, nausea and a distinctive distortion of the taste of carbonated and alcoholic beverages. Peripheral neuropathy and paraesthesias can also sometimes occur. It is thought that the drug and its metabolites may have a direct effect upon taste receptor function and also on the sensory cell.

Radiation exposure

Radiation treatment can cause taste dysfunction through (1) taste cell changes, (2) damage to taste nerves, (3) salivary gland dysfunction, and (4) opportunistic oral infection (Della Fera et al. 1995). There have been no studies of occupational radiation effects on the taste system.

Head trauma

Head trauma occurs in the occupational setting and can cause alterations in the taste system. Although perhaps only 0.5% of head trauma patients describe taste loss, the frequency of dysgeusia may be much higher (Mott, Grushka and Sessle 1993). Taste loss, when it occurs, is likely quality-specific or localized and may not even be subjectively apparent. The prognosis of subjectively noted taste loss appears better than that for olfactory loss.

Non-occupational causes

Other causes of taste abnormalities must be considered in the differential diagnosis, including congenital/genetic, endocrine/metabolic or gastrointestinal disorders; hepatic disease; iatrogenic effects; infection; local oral conditions; cancer; neurological disorders; psychiatric disorders; renal disease; and dry mouth/Sjögren’s syndrome (Deems, Doty and Settle 1991; Mott and Leopold 1991; Mott, Grushka and Sessle 1993).

Taste testing

Psychophysics is the measurement of a response to an applied sensory stimulus. “Threshold” tasks, tests that determine the minimum concentration that can be reliably perceived, are less useful in taste than in olfaction because of the wider variability of taste thresholds in the general population. Separate thresholds can be obtained for detection of tastants and for recognition of tastant quality. Suprathreshold tests assess the ability of the system to function at levels above threshold and may provide more information about “real world” taste experience. Discrimination tasks, telling the difference between substances, can reveal subtle changes in sensory ability. Identification tasks may yield different results from threshold tasks in the same individual; for example, a person with central nervous system injury may be able to detect and rank tastants but may not be able to identify them. Taste testing can assess whole-mouth taste through swishing of tastants throughout the oral cavity, or can test specific taste areas with targeted droplets of tastants or focally applied filter paper soaked with tastants.


The taste system is one of three chemosensory systems, together with olfaction and the common chemical sense, committed to monitoring harmful and beneficial inhaled and ingested substances. Taste cells are rapidly replaced, are innervated by pairs of four peripheral nerves, and appear to have divergent central pathways in the brain. The taste system is responsible for the appreciation of four basic taste qualities (sweet, sour, salty, and bitter) and, debatably, metallic and umami (monosodium glutamate) tastes. Clinically significant taste losses are rare, probably because of the redundancy and diversity of innervation. Distorted or abnormal tastes, however, are more common and can be more distressing. Toxic agents unable to destroy the taste system, or to halt transduction or transmission of taste information, nevertheless have ample opportunities to impede the perception of normal taste qualities. Irregularities or obstacles can occur through one or more of the following: suboptimal tastant transport, altered salivary composition, taste cell inflammation, blockage of taste cell ion pathways, alterations in the taste cell membrane or receptor proteins, and peripheral or central neurotoxicity. Alternatively, the taste system may be intact and functioning normally, but be subjected to disagreeable sensory stimulation through small intraoral galvanic currents or the perception of intraoral medications, drugs, pesticides or metal ions.





Three sensory systems are uniquely constructed to monitor contact with environmental substances: olfaction (smell), taste (sweet, salty, sour and bitter perception) and the common chemical sense (detection of irritation or pungency). Because they require stimulation by chemicals, they are termed “chemosensory” systems. Olfactory disorders are either temporary or permanent: complete or partial smell loss (anosmia or hyposmia) and parosmias, whether perverted smells (dysosmia) or phantom smells (phantosmia) (Mott and Leopold 1991; Mott, Grushka and Sessle 1993). After chemical exposures, some individuals describe a heightened sensitivity to chemical stimuli (hyperosmia). Flavour is the sensory experience generated by the interaction of the smell, taste and irritating components of food and beverages, together with texture and temperature. Because most flavour is derived from the smell, or aroma, of ingestants, damage to the smell system is often reported as a problem with “taste”.

Chemosensory complaints are frequent in occupational settings and may result from a normal sensory system perceiving environmental chemicals. Conversely, they may also indicate an injured system: the requisite contact with chemical substances renders these sensory systems uniquely vulnerable to damage. In the occupational setting, these systems can also be damaged by trauma to the head and by agents other than chemicals (e.g., radiation). Pollutant-related environmental odours can exacerbate underlying medical conditions (e.g., asthma, rhinitis), precipitate the development of odour aversions, or cause a stress-related type of illness. Malodours have been demonstrated to decrease performance of complex tasks (Shusterman 1992).

Early identification of workers with olfactory loss is essential. Certain occupations, such as the culinary arts, wine making and the perfume industry, require a good sense of smell as a prerequisite. Many other occupations require normal olfaction for either good job performance or self-protection. For example, parents or day care workers generally rely on smell to determine children’s hygiene needs. Firefighters need to detect chemicals and smoke. Any worker with ongoing exposure to chemicals is at increased risk if olfactory ability is poor.

Olfaction provides an early warning system to many harmful environmental substances. Once this ability is lost, workers may not be aware of dangerous exposures until the concentration of the agent is high enough to be irritating, damaging to respiratory tissues or lethal. Prompt detection can prevent further olfactory damage through treatment of inflammation and reduction of subsequent exposure. Lastly, if loss is permanent and severe, it may be considered a disability requiring new job training and/or compensation.

Anatomy and Physiology


The primary olfactory receptors are located in patches of tissue, termed the olfactory neuroepithelium, at the most superior portion of the nasal cavities (Mott and Leopold 1991). Unlike other sensory systems, the receptor is the nerve. One portion of an olfactory receptor cell extends to the surface of the nasal lining, and the other end connects directly, via a long axon, to one of the two olfactory bulbs in the brain. From here, the information travels to many other areas of the brain. Odorants are volatile chemicals that must contact the olfactory receptor for smell perception to occur. Odorant molecules are trapped by, and then diffuse through, mucus to attach to cilia at the ends of the olfactory receptor cells. It is not yet known how we are able to detect more than ten thousand odorants, discriminate among as many as 5,000 and judge varying odorant intensities. Recently, a multigene family was discovered that codes for odorant receptors on primary olfactory nerves (Ressler, Sullivan and Buck 1994). This has allowed investigation of how odours are detected and how the smell system is organized. Each neuron may respond broadly to high concentrations of a variety of odorants, but will respond to only one or a few odorants at low concentrations. Once stimulated, surface receptor proteins activate intracellular processes that translate the sensory information into an electrical signal (transduction). It is not known what terminates the sensory signal despite continued odorant exposure. Soluble odorant-binding proteins have been found, but their role is undetermined. Proteins that metabolize odorants may be involved, or carrier proteins may transport odorants either away from the olfactory cilia or toward a catalytic site within the olfactory cells.

The portions of the olfactory receptors connecting directly to the brain are fine nerve filaments that travel through a plate of bone. The location and delicate structure of these filaments render them vulnerable to shear injury from blows to the head. Also, because the olfactory receptor is a nerve that physically contacts odorants and connects directly to the brain, substances entering the olfactory cells can travel along the axon into the brain. Given continued exposure to agents damaging to the olfactory receptor cells, olfactory ability might be lost early in the lifespan were it not for a critical attribute: olfactory receptor nerves are capable of regeneration and may be replaced, provided the tissue has not been completely destroyed. If the damage to the system is more centrally located, however, the nerves cannot be restored.

Common chemical sense

The common chemical sense is initiated by stimulation of multiple free nerve endings of the fifth (trigeminal) cranial nerve in the mucosa. It perceives the irritating properties of inhaled substances and triggers reflexes designed to limit exposure to dangerous agents: sneezing, mucus secretion, reduction of breathing rate or even breath-holding. Strong warning cues compel removal from the irritation as soon as possible. Although the pungency of substances varies, generally the odour of a substance is detected before irritation becomes apparent (Ruth 1986). Once irritation is detected, however, small increases in concentration enhance irritation more than odorant appreciation. Pungency may be evoked through either physical or chemical interactions with receptors (Cometto-Muñiz and Cain 1991). The warning properties of gases or vapours tend to correlate with their water solubilities (Shusterman 1992). Anosmics appear to require higher concentrations of pungent chemicals for detection (Cometto-Muñiz and Cain 1994), but thresholds of detection are not elevated as one ages (Stevens and Cain 1986).

Tolerance and adaptation

Perception of chemicals can be altered by previous encounters. Tolerance develops when exposure reduces the response to subsequent exposures. Adaptation occurs when a constant or rapidly repeated stimulus elicits a diminishing response. For example, short-term solvent exposure markedly, but temporarily, reduces solvent detection ability (Gagnon, Mergler and Lapare 1994). Adaptation can also develop during prolonged exposure at low concentrations or, with some chemicals, rapidly when extremely high concentrations are present. The latter can lead to rapid and reversible olfactory “paralysis”. Nasal pungency typically shows less adaptation and development of tolerance than olfactory sensations. Mixtures of chemicals can also alter perceived intensities. Generally, when odorants are mixed, perceived odorant intensity is less than would be expected from adding the two intensities together (hypoadditivity). Nasal pungency, however, generally shows additivity with exposure to multiple chemicals, and summation of irritation over time (Cometto-Muñiz and Cain 1994). With odorants and irritants in the same mixture, the odour is always perceived as less intense. Because of tolerance, adaptation and hypoadditivity, one must be careful to avoid relying on these sensory systems to gauge the concentration of chemicals in the environment.

Olfactory Disorders

General concepts

Olfaction is disrupted when odorants cannot reach the olfactory receptors, or when olfactory tissue is damaged. Swelling within the nose from rhinitis, sinusitis or polyps can preclude odorant access. Damage can occur with: inflammation in the nasal cavities; destruction of the olfactory neuroepithelium by various agents; trauma to the head; and transmittal of agents via the olfactory nerves to the brain, with subsequent injury to the smell portion of the central nervous system. Occupational settings contain varying amounts of potentially damaging agents and conditions (Amoore 1986; Cometto-Muñiz and Cain 1991; Shusterman 1992; Schiffman and Nagle 1992). Recently published data from 712,000 National Geographic Smell Survey respondents suggest that factory work impairs smell: male and female factory workers both reported poorer senses of smell and demonstrated decreased olfaction on testing (Corwin, Loury and Gilbert 1995). In particular, factory workers reported chemical exposures and head trauma more frequently than workers in other occupational settings.

When an occupational olfactory disorder is suspected, identification of the offending agent can be difficult. Current knowledge is largely derived from small series and case reports. Importantly, few studies mention examination of the nose and sinuses, and most rely on the patient’s history for olfactory status rather than on testing of the olfactory system. An additional complicating factor is the high prevalence of non-occupationally related olfactory disturbances in the general population, mostly due to viral infections, allergies, nasal polyps, sinusitis or head trauma. Some of these, however, are also more common in the work environment and will be discussed in detail here.

Rhinitis, sinusitis and polyposis

Individuals with olfactory disturbance must first be assessed for rhinitis, nasal polyps and sinusitis. It is estimated that 20% of the United States population, for example, has upper airway allergies. Environmental exposures can be unrelated, cause inflammation or exacerbate an underlying disorder. Rhinitis is associated with olfactory loss in occupational settings (Welch, Birchall and Stafford 1995). Some chemicals, such as isocyanates, acid anhydrides, platinum salts and reactive dyes (Coleman, Holliday and Dearman 1994), and metals (Nemery 1990) can be allergenic. There is also considerable evidence that chemicals and particles increase sensitivity to nonchemical allergens (Rusznak, Devalia and Davies 1994). Toxic agents alter the permeability of the nasal mucosa and allow greater penetration of allergens and enhanced symptoms, making it difficult to discriminate between rhinitis due to allergies and that due to exposure to toxic or particulate substances. If inflammation and/or obstruction in the nose or sinuses is demonstrated, return of normal olfactory function is possible with treatment. Options include topical corticosteroid sprays, systemic antihistamines and decongestants, antibiotics and polypectomy/sinus surgery. If inflammation or obstruction is not present or treatment does not secure improvement in olfactory function, olfactory tissue may have sustained permanent damage. Irrespective of cause, the individual must be protected from future contact with the offending substance or further injury to the olfactory system could occur.

Head trauma

Head trauma can alter olfaction through (1) nasal injury with scarring of the olfactory neuroepithelium, (2) nasal injury with mechanical obstruction to odours, (3) shearing of the olfactory filaments, and (4) bruising or destruction of the part of the brain responsible for smell sensations (Mott and Leopold 1991). Although trauma is a risk in many occupational settings (Corwin, Loury and Gilbert 1995), exposure to certain chemicals can increase this risk.

Smell loss occurs in 5% to 30% of head trauma patients and may ensue without any other nervous system abnormalities. Nasal obstruction to odorants may be surgically correctable, unless significant intranasal scarring has occurred. Otherwise, no treatment is available for smell disorders resulting from head trauma, although spontaneous improvement is possible. Rapid initial improvement may occur as swelling subsides in the area of injury. If olfactory filaments have been sheared, regrowth and gradual improvement of smell may also occur. Although this occurs in animals within 60 days, improvements in humans have been reported as long as seven years after injury. Parosmias developing as the patient recovers from injury may indicate regrowth of olfactory tissue and herald return of some normal smell function. Parosmias occurring at the time of injury or shortly thereafter are more likely due to brain tissue damage. Damage to the brain will not repair itself and improvement in smell ability would not be expected. Injury to the frontal lobe, the portion of the brain integral to emotion and thinking, may be more frequent in head trauma patients with smell loss. The resultant changes in socialization or thinking patterns may be subtle, though harmful to family and career. Formal neuropsychiatric testing and treatment may, therefore, be indicated in some patients.

Environmental agents

Environmental agents can gain access to the olfactory system through either the bloodstream or inspired air and have been reported to cause smell loss, parosmia and hyperosmia. Responsible agents include metallic compounds, metal dusts, nonmetallic inorganic compounds, organic compounds, wood dusts and substances present in various occupational environments, such as metallurgical and manufacturing processes (Amoore 1986; Schiffman and Nagle 1992) (table 1). Injury can occur after both acute and chronic exposures and can be either reversible or irreversible, depending on the interaction between host susceptibility and the damaging agent. Important substance attributes include bioactivity, concentration, irritant capacity, length of exposure, rate of clearance and potential synergism with other chemicals. Host susceptibility varies with genetic background and age. There are gender differences in olfaction, hormonal modulation of odorant metabolism and differences in specific anosmias. Tobacco use, allergies, asthma, nutritional status, pre-existing disease (e.g., Sjögren’s syndrome), physical exertion at time of exposure, nasal airflow patterns and possibly psychosocial factors influence individual differences (Brooks 1994). Resistance of the peripheral tissue to injury and presence of functioning olfactory nerves can alter susceptibility. For example, acute, severe exposure could decimate the olfactory neuroepithelium, effectively preventing spread of the toxin centrally. Conversely, long-term, low-level exposure might allow preservation of functioning peripheral tissue and slow but steady transit of damaging substances into the brain. Cadmium, for example, has a half-life of 15 to 30 years in humans, and its effects might not be apparent until years after exposure (Hastings 1990).
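
The practical consequence of such a long biological half-life can be made concrete with a little first-order elimination arithmetic. The sketch below is illustrative only: it assumes simple exponential clearance and uses the 15- to 30-year half-life range cited above for cadmium.

```python
import math

def fraction_remaining(years: float, half_life_years: float) -> float:
    """First-order elimination: fraction of the original body burden left."""
    return 0.5 ** (years / half_life_years)

# With the 15- to 30-year half-life cited for cadmium, a substantial
# share of an early-career exposure is still present decades later.
for half_life in (15.0, 30.0):
    left = fraction_remaining(20.0, half_life)
    print(f"half-life {half_life:.0f} y: {left:.0%} remains after 20 years")
```

Even at the shorter end of the cited range, roughly two-fifths of a burden acquired at age 25 would still be present at age 45, consistent with effects appearing only years after exposure.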

Table 1. Agents/processes associated with olfactory abnormalities


Smell disturbance


Acetates, butyl and ethyl
Acetic acid
Acid chloride
Acids (organic and inorganic)
Acrylate, methacrylate vapours
Aluminium fumes
Ashes (incinerator)
Asphalt (oxidized)

H or A
H, P
Low normal
Decreased odour ID
Low normal

1, 2

Benzoic acid
Blasting powder
Butyl acetate
Butylene glycol

Below average


Cadmium compounds, dust, oxides

Carbon disulphide
Carbon monoxide
Carbon tetrachloride
Chalk dust
Chestnut wood dust
Chlorovinylarsine chlorides
Chromium (salts and plating)
Chromate salts
Chromic acid
Chromium fumes
Cigarette smoking
Coal (coal-bunker)
Coal tar fumes
Copper (and sulphuric acid)
Copper arsenite
Copper fumes
Cotton, knitting factory
Creosote fumes
Cutting oils (machining)


Low normal
Olfactory disorder
Decreased ID
H or A
Olfactory disturbance
Abnormal UPSIT
Below average

1; Bar-Sela et al. 1992; Rose, Heywood and Costanzo 1992
2; 4
Savov 1991




Ethyl acetate

Ethyl ether

Ethylene oxide

Decreased smell

Gosselin, Smith and Hodge 1984

Flour, flour mill
Fluorine compounds

H or A
Below average

1, 2; Chia et al. 1992


H or A


Halogen compounds
Hard woods
Aromatic hydrocarbon solvent combinations (e.g., toluene, xylene, ethyl
Hydrogen chloride
Hydrogen cyanide
Hydrogen fluoride
Hydrogen selenide
Hydrogen sulphide

Decreased UPSIT, H

H or A

5; Hotz et al. 1992

5; Guidotti 1994

Iron carbonyl






Magnet production
Manganese fumes
N-Methylformimino-methyl ester

Low normal

2; Naus 1968

Nickel dust, hydroxide, plating and refining
Nickel hydroxide
Nickel plating
Nickel refining (electrolytic)
Nitric acid
Nitro compounds
Nitrogen dioxide

Low normal

1; 4; Bar-Sela et al. 1992

Oil of peppermint
Osmium tetroxide

Garlic odour; H or A
Temporary H

3; 5

Paint (lead)
Paint (solvent based)

Paper, packing factory
Pavinol (sewing)
Pepper and creosol mixture
Perfumes (concentrated)
Phosphorous oxychloride

Low normal
H or A

Possible H
Low normal
H or A

H or A
H or A
Low normal

Wieslander, Norbäck and Edling 1994

Rubber vulcanization



Selenium compounds (volatile)
Selenium dioxide
Silicon dioxide
Silver nitrate
Silver plating

Steel production
Sulphur compounds
Sulphur dioxide
Sulphuric acid

Below normal
H, P, Low normal

Low normal

1; Ahlström, Berglund and Berglund 1986; Schwartz et al. 1991; Bolla et al. 1995
1; Petersen and Gormsen 1991

Tin fumes

Parosmia, H or A

2; 4

Vanadium fumes




Low normal


Zinc (fumes, chromate) and production

Low normal


H = hyposmia; A = anosmia; P = parosmia; ID = odour identification ability

1 = Mott and Leopold 1991. 2 = Amoore 1986. 3 = Schiffman and Nagle 1992. 4 = Naus 1985. 5 = Callender et al. 1993.

Specific smell disturbances are as stated in the articles referenced.


Nasal passages are ventilated by 10,000 to 20,000 litres of air per day, containing varying amounts of potentially harmful agents. The upper airways almost totally absorb or clear highly reactive or soluble gases, and particles larger than 2 μm (Evans and Hastings 1992). Fortunately, a number of mechanisms exist to protect against tissue damage. Nasal tissues are enriched with blood vessels, nerves, specialized cells with cilia capable of synchronous movement, and mucus-producing glands. Defensive functions include filtration and clearing of particles, scrubbing of water-soluble gases, and early identification of harmful agents through olfaction and mucosal detection of irritants that can initiate an alarm and prompt removal of the individual from further exposure (Witek 1993). Low levels of chemicals are absorbed by the mucus layer, swept away by functioning cilia (mucociliary clearance) and swallowed. Chemicals can bind to proteins or be rapidly metabolized to less damaging products. Many metabolizing enzymes reside in the nasal mucosa and olfactory tissues (Bonnefoi, Monticello and Morgan 1991; Schiffman and Nagle 1992; Evans et al. 1995). Olfactory neuroepithelium, for example, contains cytochrome P-450 enzymes which play a major role in the detoxification of foreign substances (Gresham, Molgaard and Smith 1993). This system may protect the primary olfactory cells and also detoxify substances that would otherwise enter the central nervous system through olfactory nerves. There is also some evidence that intact olfactory neuroepithelium can prevent invasion by some organisms (e.g., cryptococcus; see Lima and Vital 1994). At the level of the olfactory bulb, there may also be protective mechanisms preventing transport of toxic substances centrally. For example, it has been recently shown that the olfactory bulb contains metallothioneins, proteins which have a protective effect against toxins (Choudhuri et al. 1995).
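
The ventilation figure above implies a simple upper bound on the daily intake of an airborne agent: concentration multiplied by the volume of air inspired. The sketch below is a rough illustration of that arithmetic; the airborne concentration used is hypothetical, and complete absorption of the agent is assumed.

```python
def daily_inhaled_dose_mg(concentration_mg_m3: float, litres_per_day: float) -> float:
    """Upper-bound daily intake, assuming complete absorption of the agent."""
    cubic_metres = litres_per_day / 1000.0  # 1 m^3 = 1,000 L
    return concentration_mg_m3 * cubic_metres

# Using the 10,000- to 20,000-L/day ventilation range cited above, and a
# hypothetical airborne concentration of 0.05 mg/m^3:
low = daily_inhaled_dose_mg(0.05, 10_000)   # ≈ 0.5 mg/day
high = daily_inhaled_dose_mg(0.05, 20_000)  # ≈ 1.0 mg/day
```

Even a modest airborne concentration therefore translates into milligram-scale daily contact with nasal and olfactory tissues, which is why the protective mechanisms described above matter.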

Exceeding protective capacities can precipitate a worsening cycle of injury. For example, loss of olfactory ability halts early warning of the hazard and allows continued exposure. Increase in nasal blood flow and blood vessel permeability causes swelling and odorant obstruction. Cilial function, necessary for both mucociliary clearance and normal smell, may be impaired. Change in clearance will increase contact time between injurious agents and nasal mucosa. Intranasal mucus abnormalities alter absorption of odorants or irritant molecules. Overpowering the ability to metabolize toxins allows tissue damage, increased absorption of toxins, and possibly enhanced systemic toxicity. Damaged epithelial tissue is more vulnerable to subsequent exposures. There are also more direct effects on olfactory receptors. Toxins can alter the turnover rate of olfactory receptor cells (normally 30 to 60 days), injure receptor cell membrane lipids, or change the internal or external environment of the receptor cells. Although regeneration can occur, damaged olfactory tissue can exhibit permanent changes of atrophy or replacement of olfactory tissue with nonsensory tissue.

The olfactory nerves provide a direct connection to the central nervous system and may serve as a route of entry for a variety of exogenous substances, including viruses, solvents and some metals (Evans and Hastings 1992). This mechanism may contribute to some of the olfactory-related dementias (Monteagudo, Cassidy and Folb 1989; Bonnefoi, Monticello and Morgan 1991) through, for example, transmittal of aluminium centrally. Cadmium applied intranasally, but not intraperitoneally or intratracheally, can be detected in the ipsilateral olfactory bulb (Evans and Hastings 1992). There is further evidence that substances may be preferentially taken up by olfactory tissue irrespective of the site of initial exposure (e.g., systemic versus inhalation). Mercury, for example, has been found in high concentrations in the olfactory brain region in subjects with dental amalgams (Siblerud 1990). On electroencephalography, the olfactory bulb demonstrates sensitivity to many atmospheric pollutants, such as acetone, benzene, ammonia, formaldehyde and ozone (Bokina et al. 1976). Because of central nervous system effects of some hydrocarbon solvents, exposed individuals might not readily recognize and distance themselves from the danger, thereby prolonging exposure. Recently, Callender and colleagues (1993) obtained a 94% frequency of abnormal SPECT scans, which assess regional cerebral blood flow, in subjects with neurotoxin exposures and a high frequency of olfactory identification disorders. The location of abnormalities on SPECT scanning was consistent with distribution of toxin through olfactory pathways.

The site of injury within the olfactory system differs with various agents (Cometto-Muñiz and Cain 1991). For example, ethyl acrylate and nitroethane selectively damage olfactory tissue while the respiratory tissue within the nose is preserved (Miller et al. 1985). Formaldehyde alters the consistency, and sulphuric acid the pH, of nasal mucus. Many gases, cadmium salts, dimethylamine and cigarette smoke alter ciliary function. Diethyl ether causes leakage of some molecules from the junctions between cells (Schiffman and Nagle 1992). Solvents such as toluene, styrene and xylene change the olfactory cilia; they also appear to be transmitted into the brain by the olfactory receptor (Hotz et al. 1992). Hydrogen sulphide is not only irritating to mucosa, but highly neurotoxic, effectively depriving cells of oxygen, and inducing rapid olfactory nerve paralysis (Guidotti 1994). Nickel directly damages cell membranes and also interferes with protective enzymes (Evans et al. 1995). Dissolved copper is thought to directly interfere with different stages of transduction at the olfactory receptor level (Winberg et al. 1992). Mercuric chloride selectively distributes to olfactory tissue, and may interfere with neuronal function through alteration of neurotransmitter levels (Lakshmana, Desiraju and Raju 1993). After injection into the bloodstream, pesticides are taken up by nasal mucosa (Brittebo, Hogman and Brandt 1987), and can cause nasal congestion. The garlic odour noted with organophosphorus pesticides, however, is not due to damaged tissue but to detection of butylmercaptan.

Although smoking can inflame the lining of the nose and reduce smell ability, it may also confer protection from other damaging agents. Chemicals within the smoke may induce microsomal cytochrome P450 enzyme systems (Gresham, Molgaard and Smith 1993), which would accelerate metabolism of toxic chemicals before they can injure the olfactory neuroepithelium. Conversely, some drugs, for example tricyclic antidepressants and antimalarial drugs, can inhibit cytochrome P450.

Olfactory loss after exposure to wood and fibre board dusts (Innocenti et al. 1985; Holmström, Rosén and Wilhelmsson 1991; Mott and Leopold 1991) may be due to diverse mechanisms. Allergic and nonallergic rhinitis can result in obstruction to odorants or inflammation. Mucosal changes can be severe, dysplasia has been documented (Boysen and Solberg 1982) and adenocarcinoma may result, especially in the area of the ethmoid sinuses near the olfactory neuroepithelium. Carcinoma associated with hard woods may be related to a high tannin content (Innocenti et al. 1985). Inability to effectively clear nasal mucus has been reported and may be related to an increased frequency of colds (Andersen, Andersen and Solgaard 1977); resultant viral infection could further damage the olfactory system. Olfactory loss may also be due to chemicals associated with woodworking, including varnishes and stains. Medium-density fibre board contains formaldehyde, a known respiratory tissue irritant that impairs mucociliary clearance, causes olfactory loss, and is associated with a high incidence of oral, nasal and pharyngeal cancer (Council on Scientific Affairs 1989), all of which could contribute to an understanding of formaldehyde-induced olfactory losses.

Radiation therapy has been reported to cause olfactory abnormalities (Mott and Leopold 1991), but little information is available about occupational exposures. Rapidly regenerating tissue, such as olfactory receptor cells, would be expected to be vulnerable. Mice exposed to radiation in a spaceflight demonstrated smell tissue abnormalities, while the rest of the nasal lining remained normal (Schiffman and Nagle 1992).

After chemical exposures, some individuals describe a heightened sensitivity to odorants. “Multiple chemical sensitivities” or “environmental illness” are labels used to describe disorders typified by “hypersensitivity” to diverse environmental chemicals, often in low concentrations (Cullen 1987; Miller 1992; Bell 1994). Thus far, however, lower thresholds to odorants have not been demonstrated.

Non-occupational causes of olfactory problems

Ageing and smoking decrease olfactory ability. Upper respiratory viral damage, idiopathic (i.e., unknown) causes, head trauma, and diseases of the nose and sinuses appear to be the four leading causes of smell problems in the United States (Mott and Leopold 1991) and must be considered as part of the differential diagnosis in any individual presenting with possible environmental exposures. Congenital inability to detect certain substances is common. For example, 40 to 50% of the population cannot detect androsterone, a steroid found in sweat.

Testing of chemosensation

Psychophysics is the measurement of a response to an applied sensory stimulus. “Threshold” tests, which determine the minimum concentration that can be reliably perceived, are frequently used. Separate thresholds can be obtained for detection of odorants and for identification of odorants. Suprathreshold tests assess the ability of the system to function at levels above threshold and also provide useful information. Discrimination tasks, telling the difference between substances, can reveal subtle changes in sensory ability. Identification tasks may yield different results from threshold tasks in the same individual. For example, a person with central nervous system injury may be able to detect odorants at usual threshold levels, but may not be able to identify common odorants.
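
Detection thresholds of this kind are often estimated with adaptive up-down ("staircase") procedures, in which the stimulus level is lowered after each detection and raised after each miss, and the threshold is taken as the average of the reversal points. The sketch below is a toy illustration with a simulated observer, not a clinical protocol; the step size, noise level and starting point are all assumptions.

```python
import random

def staircase_threshold(true_threshold: float, steps: int = 60, seed: int = 1) -> float:
    """Simple 1-up/1-down staircase converging on a simulated detection threshold.

    The 'observer' detects an odorant whenever its (log) concentration
    exceeds a noisy internal threshold -- a toy stand-in for a real subject.
    """
    rng = random.Random(seed)
    level = true_threshold + 2.0      # start well above threshold (log units)
    step = 0.25
    reversals = []
    last_direction = None
    for _ in range(steps):
        detected = level + rng.gauss(0.0, 0.1) > true_threshold
        direction = -1 if detected else +1   # detected -> lower the level
        if last_direction is not None and direction != last_direction:
            reversals.append(level)          # record each turning point
        last_direction = direction
        level += direction * step
    # Average the reversal points to estimate the threshold.
    return sum(reversals) / len(reversals)
```

A 1-up/1-down rule converges on the level detected about half the time; clinical olfactory testing uses the same principle, though with more conservative rules and standardized odorant dilution series.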


The nasal passages are ventilated by 10,000 to 20,000 litres of air per day, which may be contaminated by possibly hazardous materials in varying degrees. The olfactory system is especially vulnerable to damage because of requisite direct contact with volatile chemicals for odorant perception. Olfactory loss, tolerance and adaptation prevent recognition of the proximity of dangerous chemicals and may contribute to local injury or systemic toxicity. Early identification of olfactory disorders can prompt protective strategies, ensure appropriate treatment and prevent further damage. Occupational smell disorders can manifest themselves as temporary or permanent anosmia or hyposmia, as well as distorted smell perception. Identifiable causes to be considered in the occupational setting include rhinitis, sinusitis, head trauma, radiation exposure and tissue injury from metallic compounds, metal dusts, nonmetallic inorganic compounds, organic compounds, wood dusts, and substances present in metallurgical and manufacturing processes. Substances differ in their site of interference with the olfactory system. Powerful mechanisms for trapping, removing and detoxifying foreign nasal substances serve to protect olfactory function and also prevent spread of damaging agents into the brain from the olfactory system. Exceeding protective capacities can precipitate a worsening cycle of injury, ultimately leading to greater severity of impairment and extension of sites of injury, and converting temporary reversible effects into permanent damage.



Monday, 07 March 2011 15:46

Cutaneous Receptors

Cutaneous sensitivity shares the main elements of all the basic senses. Properties of the external world, such as colour, sound, or vibration, are received by specialized nerve cell endings called sensory receptors, which convert external data into nervous impulses. These signals are then conveyed to the central nervous system, where they become the basis for interpreting the world around us.

It is useful to recognize three essential points about these processes. First, energy, and changes in energy levels, can be perceived only by a sense organ capable of detecting the specific type of energy in question. (This is why microwaves, x rays, and ultraviolet light are all dangerous; we are not equipped to detect them, so that even at lethal levels they are not perceived.) Second, our perceptions are necessarily imperfect shadows of reality, as our central nervous system is limited to reconstructing an incomplete image from the signals conveyed by its sensory receptors. Third, our sensory systems provide us with more accurate information about changes in our environment than about static conditions. We are well-equipped with sensory receptors sensitive to flickering lights, for example, or to the tiny fluctuations of temperature provoked by a slight breeze; we are less well-equipped to receive information about a steady temperature, say, or a constant pressure on the skin.

Traditionally the skin senses are divided into two categories: cutaneous and deep. While deep sensitivity relies on receptors located in muscle, tendons, joints, and the periosteum (membrane surrounding the bones), cutaneous sensitivity, with which we are concerned here, deals with information received by receptors in the skin: specifically, the various classes of cutaneous receptors that are located in or near the junction of the dermis and the epidermis.

All sensory nerves linking cutaneous receptors to the central nervous system have roughly the same structure. The cell’s large body resides in a cluster of other nerve cell bodies, called a ganglion, located near the spinal cord and connected to it by a narrow branch of the cell’s trunk, called its axon. Most nerve cells, or neurons, that originate at the spinal cord send axons to bones, muscle, joints, or, in the case of cutaneous sensitivity, to the skin. Just like an insulated wire, each axon is covered along its course and at its endings with protective layers of cells known as Schwann cells. These Schwann cells produce a substance known as myelin, which coats the axon like a sheath. At intervals along the way are tiny breaks in the myelin, known as nodes of Ranvier. Finally, at the end of the axon are found the components that specialize in receiving and retransmitting information about the external environment: the sensory receptors (Mountcastle 1974).

The different classes of cutaneous receptors, like all sensory receptors, are defined in two ways: by their anatomical structures, and by the type of electrical signals they send along their nerve fibres. Distinctly structured receptors are usually named after their discoverers. The relatively few classes of sensory receptors found in the skin can be divided into three main categories: mechanoreceptors, thermal receptors, and nociceptors.

All of these receptors can convey information about a particular stimulus only after they have first encoded it in a type of electrochemical neural language. These neural codes use varying frequencies and patterns of nerve impulses that scientists have only just begun to decipher. Indeed, an important branch of neurophysiological research is devoted entirely to the study of sensory receptors and the ways in which they translate energy states in the environment into neural codes. Once the codes are generated, they are conveyed centrally along afferent fibres, the nerve cells that serve receptors by conveying the signals to the central nervous system.

The messages produced by receptors can be subdivided on the basis of the response given to a continuous, unvarying stimulation: slowly adapting receptors send electrochemical impulses to the central nervous system for the duration of a constant stimulus, whereas rapidly adapting receptors gradually reduce their discharges in the presence of a steady stimulus until they reach a low baseline level or cease entirely, thereupon ceasing to inform the central nervous system about the continuing presence of the stimulus.
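
The contrast between the two response types can be caricatured with a toy firing-rate model: sustained output for a slowly adapting receptor, exponentially decaying output for a rapidly adapting one. The proportionality factor and time constant below are illustrative assumptions, not physiological measurements.

```python
import math

def firing_rate(t: float, stimulus: float, adapting: bool, tau: float = 0.1) -> float:
    """Toy firing rate (impulses/s) during a constant stimulus begun at t = 0.

    A rapidly adapting receptor's rate decays exponentially toward zero;
    a slowly adapting receptor keeps firing in proportion to intensity.
    """
    base = 100.0 * stimulus               # rate proportional to stimulus intensity
    if adapting:
        return base * math.exp(-t / tau)  # decays with time constant tau
    return base                           # sustained for the stimulus duration

# Half a second into a steady stimulus, the rapidly adapting receptor has
# fallen essentially silent while the slowly adapting one still signals.
slow = firing_rate(0.5, 1.0, adapting=False)
fast = firing_rate(0.5, 1.0, adapting=True)
```

The same proportionality between firing frequency and stimulus intensity reappears below in the descriptions of the Merkel cell and Ruffini receptor responses.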

The distinctly different sensations of pain, warmth, cold, pressure, and vibration are thus produced by activity in distinct classes of sensory receptors and their associated nerve fibres. The terms “flutter” and “vibration,” for example, are used to distinguish two slightly different vibratory sensations encoded by two different classes of vibration-sensitive receptors (Mountcastle et al. 1967). The three important categories of pain sensation known as pricking pain, burning pain, and aching pain have each been associated with a distinct class of nociceptive afferent fibre. This is not to say, however, that a specific sensation necessarily involves only one class of receptor; more than one receptor class may contribute to a given sensation, and, in fact, sensations may differ depending on the relative contribution of different receptor classes (Sinclair 1981).

The preceding summary is based on the specificity hypothesis of cutaneous sensory function, first formulated by a German physician named Von Frey in 1906. Although at least two other theories of equal or perhaps greater popularity have been proposed during the past century, Von Frey’s hypothesis has now been strongly supported by factual evidence.

Receptors that Respond to Constant Skin Pressure

In the hand, relatively large myelinated fibres (5 to 15 μm in diameter) emerge from a subcutaneous nerve network called the subpapillary nerve plexus and end in a spray of nerve terminals at the junction of the dermis and the epidermis (figure 1). In hairy skin, these nerve endings culminate in visible surface structures known as touch domes; in glabrous, or hairless, skin, the nerve endings are found at the base of skin ridges (such as those forming the fingerprints). There, in the touch dome, each nerve fibre tip, or neurite, is enclosed by a specialized epithelial cell known as a Merkel cell (see figures 2 and 3).

Figure 1. A schematic illustration of a cross-section of the skin


Figure 2. The touch dome on each raised region of skin contains 30 to 70 Merkel cells.


Figure 3. At a higher magnification available with the electron microscope, the Merkel cell, a specialized epithelial cell, is seen to be attached to the basement membrane that separates the epidermis from the dermis.


The Merkel cell neurite complex transduces mechanical energy into nerve impulses. While little is known about the cell’s role or about its mechanism of transduction, it has been identified as a slowly adapting receptor. This means that pressure on a touch dome containing Merkel cells causes the receptors to produce nerve impulses for the duration of the stimulus. These impulses rise in frequency in proportion to the intensity of the stimulus, thereby informing the brain of the duration and magnitude of pressure on the skin.

Like the Merkel cell, a second slowly adapting receptor also serves the skin by signalling the magnitude and duration of steady skin pressures. Visible only through a microscope, this receptor, known as the Ruffini receptor, consists of a group of neurites emerging from a myelinated fibre and encapsulated by connective tissue cells. Within the capsule structure are fibres that apparently transmit local skin distortions to the neurites, which in turn produce the messages sent along the neural highway to the central nervous system. Pressure on the skin causes a sustained discharge of nerve impulses; as with the Merkel cell, the frequency of nerve impulses is proportional to the intensity of the stimulus.

Despite their similarities, there is one outstanding difference between Merkel cells and Ruffini receptors. Whereas sensation results when Ruffini receptors are stimulated, stimulation of touch domes housing Merkel cells produces no conscious sensation; the touch dome is thus a mystery receptor, for its actual role in neural function remains unknown. Ruffini receptors, then, are believed to be the only receptors capable of providing the neural signals necessary for the sensory experience of pressure, or constant touch. In addition, it has been shown that the slowly adapting Ruffini receptors account for the ability of humans to rate cutaneous pressure on a scale of intensity.

Receptors that Respond to Vibration and Skin Movement

In contrast with slowly adapting mechanoreceptors, rapidly adapting receptors remain silent during sustained skin indentation. They are, however, well-suited to signal vibration and skin movement. Two general categories are noted: those in hairy skin, which are associated with individual hairs; and those which form corpuscular endings in glabrous, or hairless, skin.
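The coding difference described above can be illustrated with a minimal toy model. A slowly adapting receptor fires for the whole duration of a sustained indentation, at a rate proportional to stimulus intensity, whereas a rapidly adapting receptor fires only while the stimulus changes. This is a sketch only; the gain constants are illustrative assumptions, not physiological measurements.

```python
# Toy model contrasting slowly vs. rapidly adapting mechanoreceptors.
# All constants are illustrative assumptions, not measured physiology.

def slowly_adapting_rate(pressure: float) -> float:
    """Firing rate (impulses/s) sustained throughout the stimulus,
    rising in proportion to intensity (Merkel/Ruffini-like)."""
    gain = 10.0  # assumed impulses/s per unit pressure
    return gain * max(pressure, 0.0)

def rapidly_adapting_rate(pressure_change: float) -> float:
    """Firing rate driven only by change in the stimulus; silent
    during constant indentation (hair-follicle/corpuscle-like)."""
    gain = 25.0  # assumed impulses/s per unit pressure change
    return gain * abs(pressure_change)

# A step of constant pressure: onset, plateau, offset.
stimulus = [0.0, 2.0, 2.0, 2.0, 0.0]
for t in range(1, len(stimulus)):
    p, dp = stimulus[t], stimulus[t] - stimulus[t - 1]
    print(t, slowly_adapting_rate(p), rapidly_adapting_rate(dp))
```

Running the sketch shows the slowly adapting receptor responding during the plateau while the rapidly adapting receptor responds only at the onset and offset of the step.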

Receptors serving hairs

A typical hair is enveloped by a network of nerve terminals branching from five to nine large myelinated axons (figure 4). In primates, these terminals fall into three categories: lanceolate endings, spindle-like terminals, and papillary endings. All three are rapidly adapting, such that a steady deflection of the hair causes nerve impulses only while movement occurs. Thus, these receptors are exquisitely sensitive to moving or vibratory stimuli, but provide little or no information about pressure, or constant touch.

Figure 4. The shafts of hairs are a platform for nerve terminals that detect movements.


Lanceolate endings arise from a heavily myelinated fibre that forms a network around the hair. The terminal neurites lose their usual coverage of Schwann cells and work their way among the cells at the base of the hair.

Spindle-like terminals are formed by axon terminals surrounded by Schwann cells. The terminals ascend to the sloping hair shaft and end in a semicircular cluster just below a sebaceous, or oil-producing, gland. Papillary endings differ from spindle-like terminals because instead of ending on the hair shaft, they terminate as free nerve endings around the orifice of the hair.

There are, presumably, functional differences among the receptor types found on hairs. This can be inferred in part from structural differences in the way the nerves end on the hair shaft and in part from differences in the diameter of axons, as axons of different diameters connect to different central relay regions. Still, the functions of receptors in hairy skin remain an area for study.







Receptors in glabrous skin

The correlation of a receptor’s anatomical structure with the neural signals it generates is most pronounced in large and easily manipulable receptors with corpuscular, or encapsulated, endings. Particularly well understood are the pacinian and Meissner corpuscles, which, like the nerve endings in hairs discussed above, convey sensations of vibration.

The pacinian corpuscle is large enough to be seen with the naked eye, making it easy to link the receptor with a specific neural response. Located in the dermis, usually around tendons or joints, it is an onion-like structure, measuring 0.5 × 1.0 mm. It is served by one of the body’s largest afferent fibres, having a diameter of 8 to 13 μm and conducting at 50 to 80 metres per second. Its anatomy has been studied in detail by both light and electron microscopy.

The principal component of the corpuscle is an outer core formed of cellular material enclosing fluid-filled spaces. The outer core itself is then surrounded by a capsule that is penetrated by a central canal and a capillary network. Passing through the canal is a single myelinated nerve fibre 7 to 11 μm in diameter, which becomes a long, nonmyelinated nerve terminal that probes deep into the centre of the corpuscle. The terminal axon is elliptical, with branch-like processes.

The pacinian corpuscle is a rapidly adapting receptor. When subjected to sustained pressure, it thus produces an impulse only at the beginning and the end of the stimulus. It responds to high-frequency vibrations (80 to 400 Hz) and is most sensitive to vibrations around 250 Hz. Often, these receptors respond to vibrations transmitted along bones and tendons, and because of their extreme sensitivity, they may be activated by as little as a puff of air on the hand (Martin 1985).

In addition to the pacinian corpuscle, there is another rapidly adapting receptor in glabrous skin. Most researchers believe it to be the Meissner corpuscle, located in the dermal papillae of the skin. Responsive to low-frequency vibrations of 2 to 40 Hz, this receptor consists of the terminal branches of a medium-sized myelinated nerve fibre enveloped in one or several layers of what appear to be modified Schwann cells, called laminar cells. The receptor’s neurites and laminar cells may connect to a basal cell in the epidermis (figure 5).

Figure 5. The Meissner corpuscle is a loosely encapsulated sensory receptor in the dermal papillae of glabrous skin.


If the Meissner corpuscle is selectively inactivated by the injection of a local anaesthetic through the skin, the sense of flutter or low-frequency vibration is lost. This suggests that it functionally complements the high frequency capacity of the pacinian corpuscles. Together, these two receptors provide neural signals sufficient to account for human sensibility to a full range of vibrations (Mountcastle et al. 1967).
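The complementary frequency bands quoted above can be summarized in a small sketch that maps a vibration frequency to the receptor class best placed to signal it. The band edges (2 to 40 Hz for the Meissner corpuscle, 80 to 400 Hz for the pacinian corpuscle) come from the text; the function itself is an illustration, not an established classification rule.

```python
# Sketch mapping a vibration frequency (Hz) to the receptor class that,
# per the ranges quoted in the text, is best placed to signal it.

def likely_receptor(freq_hz: float) -> str:
    """Return the receptor class associated with a vibration frequency."""
    if 2 <= freq_hz <= 40:
        return "Meissner corpuscle (flutter)"
    if 80 <= freq_hz <= 400:
        return "pacinian corpuscle (vibration)"
    return "outside the quoted flutter/vibration bands"

print(likely_receptor(30))   # within the Meissner (flutter) band
print(likely_receptor(250))  # near peak pacinian sensitivity
```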









Cutaneous Receptors Associated with Free Nerve Endings

The dermis contains many myelinated and unmyelinated fibres whose endings have yet to be identified. A large number merely pass through on their way to skin, muscles, or periosteum, while others (both myelinated and unmyelinated) appear to terminate in the dermis. With a few exceptions, such as the pacinian corpuscle, most fibres in the dermis appear to end in poorly defined ways or simply as free nerve endings.

While more anatomical study is needed to differentiate these ill-defined endings, physiological research has clearly shown that these fibres encode a variety of environmental events. For example, free nerve endings found at the junction between the dermis and epidermis are responsible for encoding the environmental stimuli that will be interpreted as cold, warmth, heat, pain, itch, and tickle. It is not yet known which of these different classes of small fibres convey particular sensations.

The apparent anatomical similarity of these free nerve endings is probably due to the limitations of our investigative techniques, since structural differences among free nerve endings are slowly coming to light. For example, in glabrous skin, two different terminal modes of free nerve endings have been distinguished: a thick, short pattern and a long, thin one. Studies of human hairy skin have demonstrated histochemically recognizable nerve endings that terminate at the dermal-epidermal junction: the penicillate and papillary endings. The former arise from unmyelinated fibres and form a network of endings; in contrast, the latter arise from myelinated fibres and end around the hair orifices, as mentioned earlier. Presumably, these structural disparities correspond to functional differences.

Although it is not yet possible to assign specific functions to individual structural entities, physiological experiments make it clear that functionally different categories of free nerve endings exist. One small myelinated fibre in humans has been found to respond to cold, while another, unmyelinated, fibre serving free nerve endings responds to warmth. How one class of free nerve endings can respond selectively to a drop in temperature while another signals warmth when skin temperature rises remains unknown. Studies show that activation of one small fibre with a free ending may be responsible for itching or tickling sensations, and two classes of small fibres are believed to be specifically sensitive to noxious mechanical and to noxious chemical or thermal stimuli, providing the neural basis for pricking and burning pain (Keele 1964).

The definitive correlation between anatomy and physiological response awaits the development of more advanced techniques. This is one of the major stumbling blocks in the management of disorders such as causalgia, paraesthesia, and hyperpathia, which continue to present a dilemma to the physician.

Peripheral Nerve Injury

Neural function can be divided into two categories: sensory and motor. Peripheral nerve injury, usually resulting from the crushing or severing of a nerve, can impair either function or both, depending on the types of fibres in the damaged nerve. Certain aspects of motor loss tend to be misinterpreted or overlooked, as these signals do not go to muscles but rather affect autonomic vascular control, temperature regulation, the nature and thickness of the epidermis, and the condition of cutaneous mechano-receptors. The loss of motor innervation will not be discussed here, nor will the loss of innervation affecting senses other than those responsible for cutaneous sensation.

The loss of sensory innervation to the skin creates a vulnerability to further injury, as it leaves an anaesthetic surface that is incapable of signalling potentially harmful stimuli. Once injured, anaesthetized skin surfaces are slow to heal, perhaps in part on account of the lack of autonomic innervation that normally regulates such key factors as temperature regulation and cellular nutrition.

Over a period of several weeks, denervated cutaneous sensory receptors begin to atrophy, a process which is easy to observe in large encapsulated receptors such as pacinian and Meissner corpuscles. If regeneration of the axons can occur, recovery of function may follow, but the quality of the recovered function will depend upon the nature of the original injury and upon the duration of denervation (McKinnon and Dellon 1988).

Recovery following a nerve crush is more rapid, much more complete and more functional than is recovery after a nerve is severed. Two factors explain the favourable prognosis for a nerve crush. First, more axons may again achieve contact with the skin than after a transection; second, the connections are guided back to their original site by Schwann cells and linings known as basement membranes, both of which remain intact in a crushed nerve, whereas after a nerve transection the nerves often travel to incorrect regions of the skin surface by following the wrong Schwann cell paths. The latter situation results in distorted spatial information being sent to the somatosensory cortex of the brain. In both cases, however, regenerating axons appear capable of finding their way back to the same class of sensory receptors that they previously served.

The reinnervation of a cutaneous receptor is a gradual process. As the growing axon reaches the skin surface, receptive fields are smaller than normal, while the threshold is higher. These receptive points expand with time and gradually coalesce into larger fields. Sensitivity to mechanical stimuli becomes greater and often approaches the sensitivity of normal sensory receptors of that class. Studies using the stimuli of constant touch, moving touch, and vibration have shown that the sensory modalities attributed to different types of receptors return to anaesthetic areas at different rates.

Viewed under a microscope, denervated glabrous skin is seen to be thinner than normal, having flattened epidermal ridges and fewer layers of cells. This confirms that nerves have a trophic, or nutritional, influence on skin. Soon after innervation returns, the dermal ridges become better developed, the epidermis becomes thicker, and axons can be found penetrating the basement membrane. As the axon comes back to the Meissner corpuscle, the corpuscle begins to increase in size, and the previously flattened, atrophic structure returns to its original form. If the denervation has been of long duration, a new corpuscle may form adjacent to the original atrophic skeleton, which remains denervated (Dellon 1981).

As can be seen, an understanding of the consequences of peripheral nerve injury requires knowledge of normal function as well as the degrees of functional recovery. While this information is available for certain nerve cells, others require further investigation, leaving a number of murky areas in our grasp of the role of cutaneous nerves in health and disease.


