The terms dermatitis and eczema are interchangeable and refer to a particular type of inflammatory reaction of the skin which may be triggered by internal or external factors. Occupational contact dermatitis is an exogenous eczema caused by the interaction of the skin with chemical, biological or physical agents found in the work environment.
Contact dermatitis accounts for 90% of all occupational dermatoses and in 80% of the cases, it will impair a worker’s most important tool, the hands (Adams 1988). Direct contact with the offending agent is the usual mode of production of the dermatitis, but other mechanisms may be involved. Particulate matter such as dust or smoke, or vapours from volatile substances, may give rise to airborne contact dermatitis. Some substances will be transferred from the fingers to distant sites on the body to produce ectopic contact dermatitis. Finally, a photocontact dermatitis will be induced when a contactant has become activated by exposure to ultraviolet light.
Contact dermatitis is divided into two broad categories based on different mechanisms of production. Table 1 lists the salient features of irritant contact dermatitis and of allergic contact dermatitis.
Table 1. Types of contact dermatitis
Features | Irritant contact dermatitis | Allergic contact dermatitis
Mechanism of production | Direct cytotoxic effect | Delayed-type cellular immunity
Potential victims | Everyone | A minority of individuals
Onset | Progressive, after repeated or prolonged exposure | Rapid, within 12–48 hours in sensitized individuals
Signs | Subacute to chronic eczema with erythema, desquamation and fissures | Acute to subacute eczema with erythema, oedema, bullae and vesicles
Symptoms | Pain and burning sensation | Pruritus
Concentration of contactant | High | Low
Investigation | History and examination | History and examination
Irritant Contact Dermatitis
Irritant contact dermatitis is caused by a direct cytotoxic action of the offending agent. Participation of the immune system is secondary to cutaneous damage and results in visible skin inflammation. It represents the most common type of contact dermatitis and accounts for 80% of all cases.
Irritants are mostly chemicals, which are classified as immediate or cumulative irritants. Corrosive substances, such as strong acids and alkalis are examples of the former in that they produce skin damage within minutes or hours of exposure. They are usually well identified, so that contact with them is most often accidental. By contrast, cumulative irritants are more insidious and often are not recognized by the worker as deleterious because damage occurs after days, weeks or months of repeated exposure. As shown in table 2 (overleaf) such irritants include solvents, petroleum distillates, dilute acids and alkalis, soaps and detergents, resins and plastics, disinfectants and even water (Gellin 1972).
Table 2. Common irritants
Acids and alkalis
Soaps and detergents
Solvents
Aliphatic: Petroleum distillates (kerosene, gasoline, naphtha)
Aromatic: Benzene, toluene, xylene
Halogenated: Trichloroethylene, chloroform, methylene chloride
Miscellaneous: Turpentine, ketones, esters, alcohols, glycols, water
Plastics
Epoxy, phenolic, acrylic monomers
Amine catalysts
Styrene, benzoyl peroxide
Metals
Arsenic
Chrome
Irritant contact dermatitis, which appears after years of trouble-free handling of a substance, may be due to loss of tolerance, when the epidermal barrier ultimately fails after repeated subclinical insults. More rarely, thickening of the epidermis and other adaptive mechanisms can induce a greater tolerance to some irritants, a phenomenon called hardening.
In summary, irritant contact dermatitis will occur in a majority of individuals if they are exposed to adequate concentrations of the offending agent for a sufficient length of time.
Allergic Contact Dermatitis
A cell-mediated, delayed allergic reaction, similar to that seen in graft rejection, is responsible for 20% of all cases of contact dermatitis. This type of reaction, which occurs in a minority of subjects, requires active participation of the immune system and very low concentrations of the causative agent. Many allergens are also irritants, but the threshold for irritancy is usually much higher than that required for sensitization. The sequence of events which culminates in visible lesions is divided into two phases.
The sensitization (induction or afferent) phase
Allergens are heterogeneous, organic or non-organic chemicals, capable of penetrating the epidermal barrier because they are lipophilic (attracted to the fat in the skin) and of small molecular weight, usually less than 500 daltons (table 3). Allergens are incomplete antigens, or haptens; that is, they must bind to epidermal proteins to become complete antigens.
Langerhans cells are antigen-presenting dendritic cells which account for less than 5% of all epidermal cells. They trap cutaneous antigens, internalize and process them before re-expressing them on their outer surface, bound to proteins of the major histocompatibility complex. Within hours of contact, Langerhans cells leave the epidermis and migrate via the lymphatics towards draining lymph nodes. Lymphokines such as interleukin-1 (IL-1) and tumour necrosis factor alpha (TNF-α) secreted by keratinocytes are instrumental in the maturation and migration of Langerhans cells.
Table 3. Common skin allergens
Metals
Nickel
Chrome
Cobalt
Mercury
Rubber additives
Mercaptobenzothiazole
Thiurams
Carbamates
Thioureas
Dyes
Paraphenylene diamine
Photographic colour developers
Disperse textile dyes
Plants
Urushiol (Toxicodendron)
Sesquiterpene lactones (Compositae)
Primin (Primula obconica)
Tulipalin A (Tulipa, Alstroemeria)
Plastics
Epoxy monomer
Acrylic monomer
Phenolic resins
Amine catalysts
Biocides
Formaldehyde
Kathon CG
Thimerosal
In the paracortical area of regional lymph nodes, Langerhans cells make contact with naive CD4+ helper T cells and present them with their antigenic load. Interaction between Langerhans cells and helper T cells involves recognition of the antigen by T-cell receptors, as well as the interlocking of various adhesion molecules and other surface glycoproteins. Successful antigen recognition results in a clonal expansion of memory T cells, which spill into the bloodstream and the entire skin. This phase requires 5 to 21 days, during which no lesion occurs.
The elicitation (efferent) phase
Upon re-exposure to the allergen, sensitized T cells become activated and secrete potent lymphokines such as IL-1, IL-2 and interferon gamma (IFN-γ). These in turn induce blast transformation of T cells, generation of cytotoxic as well as suppressor T cells, recruitment and activation of macrophages and other effector cells and production of other mediators of inflammation such as TNF-α and adhesion molecules. Within 8 to 48 hours, this cascade of events results in vasodilatation and reddening (erythema), dermal and epidermal swelling (oedema), blister formation (vesiculation) and oozing. If left untreated, this reaction may last between two and six weeks.
Dampening of the immune response occurs with shedding or degradation of the antigen, destruction of Langerhans cells, increased production of CD8+ suppressor T cells and production by keratinocytes of IL-10 which inhibits the proliferation of helper/cytotoxic T cells.
Clinical Presentation
Morphology. Contact dermatitis may be acute, subacute or chronic. In the acute phase, lesions appear rapidly and present initially as erythematous, oedematous and pruritic urticarial plaques. The oedema may be considerable, especially where the skin is loose, such as the eyelids or the genital area. Within hours, these plaques become clustered with small vesicles which may enlarge or coalesce to form bullae. When they rupture, they ooze an amber-coloured, sticky fluid.
Oedema and blistering are less prominent in subacute dermatitis, which is characterized by erythema, vesiculation, peeling of skin (desquamation), moderate oozing and formation of yellowish crusts.
In the chronic stage, vesiculation and oozing are replaced by increased desquamation, thickening of the epidermis, which becomes greyish and furrowed (lichenification) and painful, deep fissures over areas of movement or trauma. Long-lasting lymphoedema may result after years of persistent dermatitis.
Distribution. The peculiar pattern and distribution of a dermatitis will often allow the clinician to suspect its exogenous origin and sometimes identify its causative agent. For example, linear or serpiginous streaks of erythema and vesicles on uncovered skin are virtually diagnostic of a plant contact dermatitis, while an allergic reaction due to rubber gloves will be worse on the back of the hands and around the wrists.
Repeated contact with water and cleansers is responsible for the classic “housewives’ dermatitis”, characterized by erythema, desquamation and fissures of the tips and backs of the fingers and involvement of the skin between the fingers (interdigital webs). By contrast, dermatitis caused by friction from tools, or by contact with solid objects tends to be localized on the palm and underside (volar) area of the fingers.
Irritant contact dermatitis due to fibreglass particles will involve the face, hands and forearms and will be accentuated in flexures, around the neck and waist, where movement and friction from clothes will force the spicules into the skin. Involvement of the face, upper eyelids, ears and submental area suggests an airborne dermatitis. A photocontact dermatitis will spare sun-protected areas such as the upper eyelids, the submental and retroauricular areas.
Extension to distant sites. Irritant dermatitis remains localized to the area of contact. Allergic contact dermatitis, especially if acute and severe, is notorious for its tendency to disseminate away from the site of initial exposure. Two mechanisms may explain this phenomenon. The first, autoeczematization, also known as id-reaction or the excited skin syndrome, refers to a state of hypersensitivity of the entire skin in response to a persistent or severe localized dermatitis. Systemic contact dermatitis occurs when a patient topically sensitized to an allergen is re-exposed to the same agent by oral or parenteral route. In both cases, a widespread dermatitis will ensue, which may easily be mistaken for an eczema of endogenous origin.
Predisposing factors
The occurrence of an occupational dermatitis is influenced by the nature of the contactant, its concentration and the duration of contact. The fact that under similar conditions of exposure only a minority of workers will develop a dermatitis is proof of the importance of other personal and environmental predisposing factors (table 4).
Table 4. Predisposing factors for occupational dermatitis
Age | Younger workers are often inexperienced or careless and are more likely to develop occupational dermatitis than older workers
Skin type | Orientals and Blacks are generally more resistant to irritation than Whites
Pre-existing disease | Atopy predisposes to irritant contact dermatitis; psoriasis or lichen planus may worsen because of the Koebner phenomenon
Temperature and humidity | High humidity reduces the effectiveness of the epidermal barrier; low humidity and cold cause chapping and desiccation of the epidermis
Working conditions | A dirty worksite is more often contaminated with toxic or allergenic chemicals; obsolete equipment and lack of protective measures increase the risk of occupational dermatitis; repetitive movements and friction may cause irritation and calluses
Age. Younger workers are more likely to develop occupational dermatitis. They are often less experienced than their older colleagues or may be more careless about safety measures. Older workers may have become hardened to mild irritants, may have learned to avoid contact with hazardous substances, or may represent a self-selected group that remained free of problems while those who developed them left the job.
Skin type. Most Black or Oriental skin appears to be more resistant to the effects of contact irritants than the skin of most Caucasians.
Pre-existing disease. Allergy-prone workers (having a background of atopy manifested by eczema, asthma or allergic rhinitis) are more likely to develop irritant contact dermatitis. Psoriasis and lichen planus may be aggravated by friction or repetitive trauma, a phenomenon called koebnerization. When such lesions are limited to the palms, they may be difficult to distinguish from chronic irritant contact dermatitis.
Temperature and humidity. Under conditions of extreme heat, workers often neglect to wear gloves or other appropriate protective gear. High humidity reduces the effectiveness of the epidermal barrier, while dry and cold conditions promote chapping and fissures.
Working conditions. The incidence of contact dermatitis is higher in worksites which are dirty, contaminated with various chemicals, have obsolete equipment, or lack protective measures and hygiene facilities. Some workers are at higher risk because their tasks are manual and they are exposed to strong irritants or allergens (e.g., hairdressers, printers, dental technicians).
Diagnosis
A diagnosis of occupational contact dermatitis can usually be made after a careful history and a thorough physical examination.
History. A questionnaire that includes the name and address of the employer, the worker’s job title and a description of functions should be completed. The worker should provide a list of all the chemicals handled and supply information about them, such as is found on the Material Safety Data Sheets. The date of onset and location of the dermatitis should be noted. It is important to document the effects of vacation, sick leave, sun exposure and treatment on the course of the disease. The examining physician should also obtain information about the worker’s hobbies, personal habits, history of pre-existing skin disease, general medical background and current medication.
Physical examination. The involved areas must be carefully examined. Note should be taken of the severity and stage of the dermatitis, of its precise distribution and of its degree of interference with function. A complete skin examination must be performed, looking for tell-tale stigmata of psoriasis, atopic dermatitis, lichen planus, tinea, etc., which may signify that the dermatitis is not of occupational origin.
Complementary investigation
The information obtained from history and physical examination is usually sufficient to suspect the occupational nature of a dermatitis. However, additional tests are required in most cases to confirm the diagnosis and to identify the offending agent.
Patch testing. Patch testing is the technique of choice for the identification of cutaneous allergens and it should be routinely performed in all cases of occupational dermatitis (Rietschel et al. 1995). More than 300 substances are now commercially available. The standard series, which groups the most common allergens, can be supplemented with additional series aimed at specific categories of workers such as hairdressers, dental technicians, gardeners, printers, etc. Table 5 lists the various irritants and sensitizers encountered in some of these occupations.
Table 5. Examples of skin irritants and sensitizers with occupations where contact can occur
Occupation | Irritants | Sensitizers
Construction | Turpentine, thinner, | Chromates, epoxy and phenolic
Dental | Detergents, disinfectants | Rubber, epoxy and acrylic monomer, amine catalysts, local anaesthetics, mercury, gold, nickel, eugenol, formaldehyde, glutaraldehyde
Farmers, florists, | Fertilizers, disinfectants, | Plants, woods, fungicides, insecticides
Food handlers, | Soaps and detergents, | Vegetables, spices, garlic, rubber, benzoyl peroxide
Hairdressers, | Shampoos, bleach, peroxide, | Paraphenylenediamine in hair dye, glycerylmonothioglycolate in permanents, ammonium persulphate in bleach, surfactants in shampoos, nickel, perfume, essential oils, preservatives in cosmetics
Medical | Disinfectants, alcohol, soaps | Rubber, colophony, formaldehyde, glutaraldehyde, disinfectants, antibiotics, local anaesthetics, phenothiazines, benzodiazepines
Metal workers, | Soaps and detergents, cutting | Nickel, cobalt, chrome, biocides in cutting oils, hydrazine and colophony in welding flux, epoxy resins and amine catalysts, rubber
Printers and | Solvents, acetic acid, ink, | Nickel, cobalt, chrome, rubber, colophony, formaldehyde, paraphenylene diamine and azo dyes, hydroquinone, epoxy and acrylic monomer, amine catalysts, B&W and colour developers
Textile workers | Solvents, bleaches, natural | Formaldehyde resins, azo- and anthraquinone dyes, rubber, biocides
The allergens are mixed in a suitable vehicle, usually petroleum jelly, at a concentration which was found by trial and error over the years to be non-irritant but high enough to reveal allergic sensitization. More recently, prepackaged, ready-to-apply allergens embedded in adhesive strips have been introduced, but so far only the 24 allergens of the standard series are available. Other substances must be bought in individual syringes.
At the time of testing, the patient must be in a quiescent phase of dermatitis and not be taking systemic corticosteroids. A small amount of each allergen is applied to shallow aluminium or plastic chambers mounted on porous, hypoallergenic adhesive tape. These rows of chambers are affixed to an area free of dermatitis on the patient’s back and left in place for 24 or, more commonly, 48 hours. A first reading is done when the strips are removed, followed by a second and sometimes a third reading after four and seven days respectively. Reactions are graded as follows:
Nil: no reaction
?: doubtful reaction, mild macular erythema
+: weak reaction, mild papular erythema
++: strong reaction, erythema, oedema, vesicles
+++: extreme reaction, bullous or ulcerative
IR: irritant reaction, glazed erythema or erosion resembling a burn
When a photocontact dermatitis (one that requires exposure to ultraviolet light, UV-A) is suspected, a variant of patch testing, called photopatch testing, is performed. Allergens are applied in duplicate to the back. After 24 or 48 hours, one set of allergens is exposed to 5 joules of UV-A and the patches are put back in place for another 24 to 48 hours. Equal reactions on both sides signify allergic contact dermatitis, positive reactions on the UV-exposed side only are diagnostic of photocontact allergy, while reactions on both sides that are stronger on the UV-exposed side indicate combined contact and photocontact dermatitis.
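The UV-A dose used in photopatch testing is normally expressed per unit area of irradiated skin, so the exposure time follows directly from the irradiance of the lamp. The sketch below illustrates this arithmetic only; the assumption that the 5 joule dose quoted above is per square centimetre, and the lamp irradiance used, are hypothetical values, not recommendations.

```python
# Illustrative arithmetic for a photopatch UV-A exposure, assuming the dose is
# expressed in J/cm2 and the lamp irradiance is known (both values hypothetical).

target_dose_j_per_cm2 = 5.0          # dose cited in the text, assumed to be per cm2
lamp_irradiance_mw_per_cm2 = 10.0    # hypothetical UV-A lamp output

irradiance_w_per_cm2 = lamp_irradiance_mw_per_cm2 / 1000.0
exposure_seconds = target_dose_j_per_cm2 / irradiance_w_per_cm2  # dose = irradiance x time

print(f"Exposure time: {exposure_seconds:.0f} s (about {exposure_seconds / 60:.1f} min)")
```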
The technique of patch testing is easy to perform. The tricky part is the interpretation of the results, which is best left to the experienced dermatologist. As a general rule, irritant reactions tend to be mild, they burn more than they itch, they are usually present when the patches are removed and they fade rapidly. By contrast, allergic reactions are pruritic, they reach a peak at four to seven days and may persist for weeks. Once a positive reaction has been identified, its relevance must be assessed: is it pertinent to the current dermatitis, or does it reveal past sensitization? Is the patient exposed to that particular substance, or is he allergic to a different but structurally-related compound with which it cross-reacts?
The number of potential allergens far exceeds the 300 or so commercially available substances for patch testing. It is therefore often necessary to test patients with the actual substances that they work with. While most plants can be tested “as is,” chemicals must be precisely identified and buffered if their acidity level (pH) falls outside the range of 4 to 8. They must be diluted to the appropriate concentration and mixed in a suitable vehicle according to current scientific practice (de Groot 1994). Testing a group of 10 to 20 control subjects will ensure that irritant concentrations are detected and rejected.
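The preparation steps described above, checking that the pH lies between 4 and 8 and diluting the workplace substance to a non-irritant test concentration, amount to simple bookkeeping. The following sketch illustrates them under stated assumptions; the sample pH and concentrations are hypothetical, and the actual test concentration and vehicle must always be taken from published references such as de Groot (1994).

```python
# Minimal sketch of the pre-test checks described in the text. The pH window
# mirrors the 4-8 range quoted above; the sample values are hypothetical.

def needs_buffering(ph: float, low: float = 4.0, high: float = 8.0) -> bool:
    """Return True if the substance should be buffered before patch testing."""
    return ph < low or ph > high

def dilution_factor(stock_percent: float, target_percent: float) -> float:
    """How many-fold the stock preparation must be diluted in the vehicle."""
    if target_percent <= 0 or target_percent > stock_percent:
        raise ValueError("target must be positive and not exceed the stock concentration")
    return stock_percent / target_percent

# Hypothetical workplace sample: a 10% aqueous preparation at pH 9.5,
# to be tested at 1% after buffering.
sample_ph, stock, target = 9.5, 10.0, 1.0
print("Buffer before testing:", needs_buffering(sample_ph))
print("Dilute", dilution_factor(stock, target), "fold in the vehicle")
```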
Patch testing is usually a safe procedure. Strong positive reactions may occasionally cause exacerbation of the dermatitis under investigation. On rare occasions, active sensitization may occur, especially when patients are tested with their own products. Severe reactions may leave hypo- or hyperpigmented marks, scars or keloids.
Skin biopsy. The histological hallmark of all types of eczema is epidermal intercellular oedema (spongiosis) which stretches the bridges between keratinocytes to the point of rupture, causing intraepidermal vesiculation. Spongiosis is present even in the most chronic dermatitis, when no macroscopic vesicle can be seen. An inflammatory infiltrate of lymphohistiocytic cells is present in the upper dermis and migrates into the epidermis (exocytosis). Because a skin biopsy cannot distinguish between the various types of dermatitis, this procedure is rarely performed, except in rare cases where the clinical diagnosis is unclear and in order to rule out other conditions such as psoriasis or lichen planus.
Other procedures. It may at times be necessary to perform bacterial, viral or fungal cultures, as well as potassium hydroxide microscopic preparations in search of fungi or ectoparasites. Where the equipment is available, irritant contact dermatitis can be assessed and quantified by various physical methods, such as colorimetry, evaporimetry, laser-Doppler velocimetry, ultrasonography and the measurement of electrical impedance, conductance and capacitance (Adams 1990).
Workplace. On occasion, the cause of an occupational dermatitis is uncovered only after a careful observation of a particular worksite. Such a visit allows the physician to see how a task is performed and how it might be modified to eliminate the risk of occupational dermatitis. Such visits should always be arranged with the health officer or supervisor of the plant. The information that it generates will be useful to both the worker and the employer. In many localities, workers have the right to request such visits and many work sites have active health and safety committees which do provide valuable information.
Treatment
Local treatment of an acute, vesicular dermatitis will consist of thin, wet dressings soaked in lukewarm saline, Burow’s solution or tap water, left in place for 15 to 30 minutes, three to four times a day. These compresses are followed by the application of a strong topical corticosteroid. As the dermatitis improves and dries up, the wet dressings are spaced and stopped and the strength of the corticosteroid is decreased according to the part of the body being treated.
If the dermatitis is severe or widespread, it is best treated with a course of oral prednisone, 0.5 to 1.0 mg/kg/day for two to three weeks. Systemic first-generation antihistamines are given as needed to provide sedation and relief from pruritus.
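As a worked example of the weight-based range quoted above, the sketch below computes the daily dose window for a hypothetical 70 kg worker. It is illustrative arithmetic only, not prescribing guidance.

```python
# Worked example of the 0.5-1.0 mg/kg/day oral prednisone range cited above.
# The body weight is a hypothetical example.

body_weight_kg = 70.0
low_dose_mg = 0.5 * body_weight_kg    # 35 mg/day
high_dose_mg = 1.0 * body_weight_kg   # 70 mg/day

print(f"Daily prednisone range for a {body_weight_kg:.0f} kg worker: "
      f"{low_dose_mg:.0f}-{high_dose_mg:.0f} mg, for two to three weeks")
```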
Subacute dermatitis usually responds to mid-strength corticosteroid creams applied two to three times a day, often combined with protective measures such as the use of cotton liners under vinyl or rubber gloves when contact with irritants or allergens cannot be avoided.
Chronic dermatitis will require the use of corticosteroid ointments, coupled with the frequent application of emollients, the greasier the better. Persistent dermatitis may need to be treated with psoralen and ultraviolet-A (PUVA) phototherapy, or with systemic immunosuppressors such as azathioprine (Guin 1995).
In all cases, strict avoidance of causative substances is a must. It is easier for the worker to stay away from offending agents if he or she is given written information which specifies their names, synonyms, sources of exposure and cross-reaction patterns. This printout should be clear, concise and written in terms that the patient can easily understand.
Worker’s compensation
It is often necessary to withdraw a patient from work. The physician should specify as precisely as possible the estimated length of the disability period, keeping in mind that full restoration of the epidermal barrier takes four to five weeks after the dermatitis is clinically cured. The legal forms that will allow the disabled worker to receive adequate compensation should be diligently filled out. Finally, the extent of permanent impairment or the presence of functional limitations must be determined, since these may render a patient unfit to return to his former work and make him a candidate for rehabilitation.
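Because the epidermal barrier takes about four to five weeks to recover after clinical cure, the earliest date for unprotected re-exposure can be estimated with simple date arithmetic, as in the sketch below; the cure date is a hypothetical example.

```python
# Simple date arithmetic based on the four-to-five-week barrier recovery
# period mentioned above. The clinical cure date is hypothetical.

from datetime import date, timedelta

clinical_cure = date(2024, 3, 1)              # hypothetical date of clinical cure
earliest = clinical_cure + timedelta(weeks=4)
safer = clinical_cure + timedelta(weeks=5)

print(f"Epidermal barrier likely restored between {earliest} and {safer}")
```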
Malignant melanoma is rarer than non-melanocytic skin cancer. Apart from exposure to solar radiation, no other environmental factors show a consistent association with malignant melanoma of the skin. Associations with occupation, diet and hormonal factors are not firmly established (Koh et al. 1993).
Malignant melanoma is an aggressive skin cancer (ICD-9: 172.0 to 172.9; ICD-10: C43). It arises from pigment-producing cells of the skin, usually in an existing naevus. The tumour is usually a few millimetres to several centimetres thick and brown or black in colour; it has grown in size, changed colour and may bleed or ulcerate (Balch et al. 1993).
Indicators of poor prognosis of malignant melanoma of the skin include nodular subtype, tumour thickness, multiple primary tumours, metastases, ulceration, bleeding, long tumour duration, body site and, for some tumour sites, male sex. A history of malignant melanoma of the skin increases the risk for a secondary melanoma. Five-year post-diagnosis survival rates in high incidence areas are 80 to 85%, but in low incidence areas the survival is poorer (Ellwood and Koh 1994; Stidham et al. 1994).
There are four histologic types of malignant melanoma of the skin. Superficial spreading melanomas (SSM) represent 60 to 70% of all melanomas in Whites and less in non-Whites. SSMs tend to progress slowly and are more common in women than in men. Nodular melanomas (NM) account for 15 to 30% of malignant melanomas of the skin. They are invasive, grow rapidly and are more frequent in men. Four to 10% of malignant melanomas of the skin are lentigo maligna melanomas (LMM), or Hutchinson’s melanotic freckles. LMMs grow slowly, occur frequently on the face of older persons and rarely metastasize. Acral lentiginous melanomas (ALM) represent 35 to 60% of all malignant melanomas of the skin in non-Whites and 2 to 8% in Whites. They occur frequently on the sole of the foot (Bijan 1993).
For the treatment of malignant melanomas of the skin, surgery, radiation therapy, chemotherapy and biologic therapy (interferon alpha or interleukin-2) may be applied singly or in combination.
During the 1980s, the reported age-standardized annual incidence rates of malignant melanoma of the skin varied from 0.1 per 100,000 in males in Khon Kaen, Thailand to around 30.9 per 100,000 in males and 28.5 per 100,000 in females in Queensland, Australia (IARC 1992b). Malignant melanomas of the skin represent less than 1% of all cancers in most populations. An annual increase of about 5% in melanoma incidence was observed in most white populations from the early 1960s to about 1972. Melanoma mortality has increased in recent decades in most populations, but less rapidly than incidence, probably because of earlier diagnosis and greater awareness of the disease (IARC 1985b, 1992b). More recent data show different rates of change, some of them suggesting even downward trends.
Malignant melanomas of the skin are among the ten most frequent cancers in incidence statistics in Australia, Europe and North America, representing a lifetime risk of 1 to 5%. White-skinned populations are more susceptible than non-White populations. Melanoma risk in white-skinned populations increases with proximity to the equator.
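The lifetime-risk range of 1 to 5% can be roughly reconciled with the annual incidence figures quoted above by a crude cumulative-risk approximation that ignores age structure and competing causes of death; the 75-year horizon in the sketch below is an assumption for illustration.

```python
# Back-of-envelope link between annual incidence (per 100,000) and lifetime
# risk: cumulative risk over a lifespan is roughly 1 - exp(-annual_rate * years),
# ignoring age-specific variation and competing mortality.

import math

def approx_lifetime_risk(annual_rate_per_100k: float, years: float = 75.0) -> float:
    rate = annual_rate_per_100k / 100_000
    return 1.0 - math.exp(-rate * years)

for rate in (0.1, 28.5, 30.9):   # rates cited above for Khon Kaen and Queensland
    print(f"{rate:5.1f}/100,000/yr -> ~{approx_lifetime_risk(rate) * 100:.2f}% lifetime risk")
```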
The gender distribution of melanomas of the skin varies widely between populations (IARC 1992a). Women have lower incidence rates than men in most populations. There are gender differences in patterns of body distribution of the lesions: trunk and face dominate in men, extremities in women.
Malignant melanomas of the skin are more common in higher than in lower socio-economic groups (IARC 1992b).
Familial melanomas are uncommon, but have been well documented, with between 4% and 10% of patients describing a history of melanoma among their first-degree relatives.
Solar UV-B irradiation is probably the major cause of the widespread increase in the incidence of melanomas of the skin (IARC 1993). It is not clear whether depletion of the stratospheric ozone layer and the consequent increase in UV irradiance has caused the increase in the incidence of malignant melanoma (IARC 1993, Kricker et al. 1993). The effect of UV irradiation depends on host characteristics such as skin phototype I or II and blue eyes. A role for UV radiation emanating from fluorescent lamps is suspected, but not conclusively established (Beral et al. 1982).
It has been estimated that reduction in recreational sun exposure and the use of sunscreens could reduce the incidence of malignant melanomas in high-risk populations by 40% (IARC 1990). Among outdoor workers, the application of sunscreens protective against UV-B (protection factor of at least 15) and against UV-A, together with the use of appropriate clothing, are practical protective measures. Although a risk from outdoor occupations is plausible, given the increased exposure to solar radiation, results of studies on regular outdoor occupational exposure are inconsistent. This is probably explained by the epidemiological findings suggesting that it is not regular exposure but rather intermittent high doses of solar radiation that are associated with excess melanoma risk (IARC 1992b).
Therapeutic immunosuppression may result in increased risk of malignant melanoma of the skin. An increased risk with the use of oral contraceptives has been reported, but oral contraceptive use seems unlikely to increase the risk of malignant melanoma of the skin (Hannaford et al. 1991). Melanomas can be produced by oestrogen in hamsters, but there is no evidence of such an effect in humans.
In White adults, the majority of primary intraocular malignant tumours are melanomas, usually arising from uveal melanocytes. The estimated rates for these cancers do not show the geographic variations and increasing time trends observed for melanomas of the skin. The incidence and mortality of ocular melanomas are very low in Black and Asiatic populations (IARC 1990, Sahel et al. 1993). The causes of ocular melanoma are unknown (Higginson et al. 1992).
In epidemiological studies, excess risk for malignant melanoma has been observed in administrators and managers, airline pilots, chemical processing workers, clerks, electrical power workers, miners, physical scientists, policemen and guards, refinery and gasoline-exposed workers, salesmen and warehouse clerks. Excess melanoma risks have been reported in industries such as cellulose fibre production, chemical products, the clothing industry, electrical and electronics products, the metal industry, non-metallic mineral products, the petrochemical industry, the printing industry and telecommunications. Many of these findings are, however, isolated and have not been replicated in other studies. A series of meta-analyses of cancer risks in farmers (Blair et al. 1992; Nelemans et al. 1993) indicated a slight but significant excess (aggregated risk ratio of 1.15) of malignant melanoma of the skin in 11 epidemiological studies.
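The aggregated risk ratio of 1.15 cited above is the kind of figure produced by inverse-variance (fixed-effect) pooling of individual study estimates. The sketch below shows the general method only; the study values are hypothetical and are not the data of Blair et al. (1992) or Nelemans et al. (1993).

```python
# Generic fixed-effect (inverse-variance) pooling of risk ratios, as used in
# meta-analyses. The three studies below are hypothetical illustrations.

import math

studies = [
    # (risk ratio, 95% CI lower, 95% CI upper) - hypothetical values
    (1.10, 0.90, 1.34),
    (1.25, 1.00, 1.56),
    (1.05, 0.85, 1.30),
]

weighted_sum, weight_total = 0.0, 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
    w = 1.0 / se ** 2                                # inverse-variance weight
    weighted_sum += w * log_rr
    weight_total += w

pooled_rr = math.exp(weighted_sum / weight_total)
print(f"Pooled risk ratio: {pooled_rr:.2f}")
```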
In a multi-site case-control study of occupational cancer in Montreal, Canada (Siemiatycki et al. 1991), the following occupational exposures were associated with a significant excess of malignant melanoma of the skin: chlorine, propane engine emissions, plastics pyrolysis products, fabric dust, wool fibres, acrylic fibres, synthetic adhesives, “other” paints, varnishes, chlorinated alkenes, trichloroethylene and bleaches. Based on the significant associations in the same study, the population attributable risk of malignant melanoma of the skin due to occupational exposures was estimated at 11.1%.
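A population attributable risk such as the 11.1% quoted above expresses the fraction of all cases in the population that would be avoided if the exposures were removed. Levin's formula, shown below with hypothetical numbers, is one common way of computing it; the Montreal study's own estimation method may have differed.

```python
# Levin's formula for the population attributable risk. The exposure prevalence
# and relative risk below are hypothetical illustrations.

def population_attributable_risk(prevalence: float, relative_risk: float) -> float:
    """Fraction of all cases attributable to the exposure in the whole population."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g., an exposure present in 10% of the population carrying a twofold risk
print(f"{population_attributable_risk(0.10, 2.0):.1%} of cases attributable")
```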
There are three histological types of non-melanocytic skin cancers (NMSC) (ICD-9: 173; ICD-10: C44): basal cell carcinoma, squamous cell carcinoma and rare soft tissue sarcomas involving the skin, subcutaneous tissue, sweat glands, sebaceous glands and hair follicles.
Basal cell carcinoma is the most common NMSC in white populations, representing 75 to 80% of them. It develops usually on the face, grows slowly and has little tendency to metastasize.
Squamous cell cancers account for 20 to 25% of reported NMSCs. They can occur on any part of the body, but especially on the hands and legs and can metastasize. In darkly pigmented populations squamous cell cancers are the most common NMSC.
Multiple primary NMSCs are common. The bulk of the NMSCs occur on the head and neck, in contrast with most of the melanomas which occur on the trunk and limbs. The localization of NMSCs reflects clothing patterns.
NMSCs are treated by various methods of excision, radiation and topical chemotherapy. They respond well to treatment and over 95% are cured by excision (IARC 1990).
The incidence of NMSCs is hard to estimate because of gross underreporting and because many cancer registries do not record these tumours. The number of new cases in the US was estimated at 900,000 to 1,200,000 in 1994, a frequency comparable to the total number of all non-cutaneous cancers (Miller and Weinstock 1994). The reported incidences vary widely and are increasing in a number of populations, e.g., in Switzerland and the US. The highest annual rates have been reported for Tasmania (167/100,000 in men and 89/100,000 in women) and the lowest for Asia and Africa (overall 1/100,000 in men and 5/100,000 in women). NMSC is the most common cancer in Caucasians and is about ten times as common in White as in non-White populations. The lethality is very low (Higginson et al. 1992).
Susceptibility to skin cancer is inversely related to the degree of melanin pigmentation, which is thought to protect by buffering against the carcinogenic action of solar ultraviolet (UV) radiation. Non-melanoma risk in white-skinned populations increases with the proximity to the equator.
In 1992, the International Agency for Research on Cancer (IARC 1992b) evaluated the carcinogenicity of solar radiation and concluded that there is sufficient evidence in humans for the carcinogenicity of solar radiation and that solar radiation causes cutaneous malignant melanoma and NMSC.
Reduction of exposure to sunlight would probably reduce the incidence of NMSCs. In Whites, 90 to 95% of NMSCs are attributable to solar radiation (IARC 1990).
NMSCs may develop in areas of chronic inflammation, irritation and scars from burns. Traumas and chronic ulcers of the skin are important risk factors for squamous cell skin cancers, particularly in Africa.
Radiation therapy, chemotherapy with nitrogen mustard, immunosuppressive therapy, psoralen treatment combined with UV-A radiation and coal tar preparations applied to skin lesions have been associated with an increased risk of NMSC. Environmental exposure to trivalent arsenic and arsenical compounds has been confirmed to be associated with an excess of skin cancer in humans (IARC 1987). Arsenicism can give rise to palmar or plantar arsenical keratoses, epidermoid carcinoma and superficial basal cell carcinoma.
Hereditary conditions such as lack of enzymes required to repair the DNA damaged by UV radiation may increase the risk of NMSC. Xeroderma pigmentosum represents such a hereditary condition.
A historical example of an occupational skin cancer is the scrotal cancer that Sir Percival Pott described in chimney sweeps in 1775. The cause of these cancers was soot. In the early 1900s, scrotal cancers were observed in mulespinners in cotton textile factories, where they were exposed to shale oil used as a lubricant for cotton spindles. The scrotal cancers in both chimney sweeps and mulespinners were later associated with polycyclic aromatic hydrocarbons (PAHs), many of which are animal carcinogens, particularly some 3-, 4- and 5-ring PAHs such as benzo(a)pyrene and dibenz(a,h)anthracene (IARC 1983, 1984a, 1984b, 1985a). In addition to mixtures that already contain carcinogenic PAHs, carcinogenic compounds may be formed by cracking when organic compounds are heated.
Further occupations with which PAH-related excesses of NMSC have been associated include: aluminium reduction workers, coal gasification workers, coke oven workers, glass blowers, locomotive engineers, road pavers and highway maintenance workers, shale oil workers, tool fitters and tool setters (see table 1). Coal tars, coal-based pitches, other coal-derived products, anthracene oil, creosote oil, cutting oils and lubricating oils are some of the materials and mixtures that contain carcinogenic PAHs.
Table 1. Occupations at risk
Carcinogenic | Industry or hazard | Process or group at risk
Pitch, tar or | Aluminium reduction | Pot room worker
Soot | Chimney sweeps |
Lubricating and | Glass blowing |
Arsenic | Oil refinery | Still cleaners
Ionizing radiation | Radiologists |
Ultraviolet radiation | Outdoor workers | Farmers, fishermen, vineyard and
Additional job titles that have been associated with increased NMSC risk include jute processors, outdoor workers, pharmacy technicians, sawmill workers, shale oil workers, sheep-dip workers, fishermen, tool setters, vineyard workers and watermen. The excess for watermen (who are primarily involved in traditional fishing tasks) was noticed in Maryland, USA and was confined to squamous cell cancers. Solar radiation probably explains the excess risks of fishermen, outdoor workers, vineyard workers and watermen. Fishermen may also be exposed to oils, tar and inorganic arsenic from consumed fish, which may contribute to the observed excess; in one Swedish study this excess was threefold compared with the county-specific rates (Hagmar et al. 1992). The excess in sheep-dip workers may be explained by arsenical compounds, which induce skin cancers through ingestion rather than through skin contact. While farmers have a slightly increased risk of melanoma, they do not appear to have an increased risk of NMSC, based on epidemiological observations in Denmark, Sweden and the USA (Blair et al. 1992).
Ionizing radiation has caused skin cancer in early radiologists and in workers who handled radium. In both situations, the exposures were long-lasting and massive. Occupational accidents involving skin lesions, or long-term cutaneous irritation, may increase the risk of NMSC.
Prevention (of Non-Melanocytic Occupational Skin Cancer)
The use of appropriate clothing and a sunscreen having a protective UV-B factor of 15 or greater will help protect outdoor workers exposed to ultraviolet radiation. Further, the replacement of carcinogenic materials (such as feed stocks) by non-carcinogenic alternatives is another obvious protective measure which may, however, not always be possible. The degree of exposure to carcinogenic materials can be reduced by the use of protective shields on equipment, protective clothing and hygienic measures.
Of overriding importance is the education of the workforce about the nature of the hazard and the reasons for and value of the protective measures.
Finally, skin cancers usually take many years to develop, and many of them pass through several premalignant stages, such as arsenical keratoses and actinic keratoses, before achieving their full malignant potential. These early stages are readily detectable by visual inspection. For this reason, skin cancers offer the real possibility that regular screening could reduce mortality among those known to have been exposed to any skin carcinogen.
The growth of industry, agriculture, mining and manufacturing has been paralleled by the development of occupational diseases of the skin. The earliest reported harmful effects were ulcerations of the skin from metal salts in mining. As populations and cultures have expanded the uses of new materials, new skills and new processes have emerged. Such technological advances brought changes to the work environment and during each period some aspect of the technical change has impaired workers’ health. Occupational diseases, in general and skin diseases, in particular, have long been an unplanned by-product of industrial achievement.
Fifty years ago in the United States, for example, occupational diseases of the skin accounted for no less than 65-70% of all reported occupational diseases. Recently, statistics collected by the United States Department of Labor indicate a drop in frequency to approximately 34%. This decreased number of cases is said to have resulted from increased automation, from enclosure of industrial processes and from better education of management, supervisors and workers in the prevention of occupational diseases in general. Without doubt such preventive measures have benefited the workforce in many larger plants where good preventive services may be available, but many people are still employed in conditions which are conducive to occupational diseases. Unfortunately, there is no accurate assessment of the number of cases, causal factors, time lost or actual cost of occupational skin disease in most countries.
General terms, such as industrial or occupational dermatitis or professional eczema, are used for occupational skin diseases but names related both to cause and effect are also commonly used. Cement dermatitis, chrome holes, chloracne, fibreglass itch, oil bumps and rubber rash are some examples. Because of the variety of skin changes induced by agents or conditions at work, these diseases are appropriately called occupational dermatoses—a term which includes any abnormality resulting directly from, or aggravated by, the work environment. The skin can also serve as an avenue of entry for certain toxicants which cause chemical poisoning via percutaneous absorption.
Cutaneous Defence
From experience we know that the skin can react to a large number of mechanical, physical, biological and chemical agents, acting alone or in combination. Despite this vulnerability, occupational dermatitis is not an inevitable accompaniment of work. The majority of the workforce manages to remain free of disabling occupational skin problems, due in part to the inherent protection provided by the skin’s design and function, and in part due to the daily use of personal protective measures directed towards minimizing skin contact with known skin hazards at the worksite. Hopefully, the absence of disease in the majority of workers may also be due to jobs which have been designed to minimize exposure to conditions hazardous to the skin.
The skin
Human skin, except for palms and soles, is quite thin and of variable thickness. It has two layers: the epidermis (outer) and dermis (inner). Collagen and elastic components in the dermis allow it to function as a flexible barrier. The skin provides a unique shield which protects within limits against mechanical forces, or penetration by various chemical agents. The skin limits water loss from the body and guards against the effects of natural and artificial light, heat and cold. Intact skin and its secretions provide a fairly effective defence zone against micro-organisms, providing mechanical or chemical injury does not impair this defence. Figure 1 provides an illustration of the skin and description of its physiological functions.
Figure 1. Schematic representation of the skin.
The outer epidermal layer of dead cells (keratin) provides a shield against elements in the outside world. These cells, if exposed to frictional pressures, can form a protective callus and can thicken after ultraviolet exposure. Keratin cells are normally arranged in 15 or 16 shingle-like layers and provide a barrier, though limited, against water, water-soluble materials and mild acids. They are less able to act as a defence against repeated or prolonged contact with even low concentrations of organic or inorganic alkaline compounds. Alkaline materials soften but do not totally dissolve the keratin cells. The softening disturbs their inner structure enough to weaken cellular cohesiveness. The integrity of the keratin layer is allied to its water content which, in turn, influences its pliability. Lowered temperatures and humidity, dehydrating chemicals such as acids, alkali, strong cleaners and solvents, cause water loss from the keratin layer, which, in turn, causes the cells to curl and crack. This weakens its ability to serve as a barrier and compromises its defence against water loss from the body and entry of various agents from outside.
Cutaneous defence systems are effective only within limits. Anything which breaches one or more of the links endangers the entire defence chain. For example, percutaneous absorption is enhanced when the continuity of the skin has been altered by physical or chemical injury or by mechanical abrasion of the keratin layer. Toxic materials can be absorbed not only through the skin, but also through the hair follicles, sweat orifices and ducts. These latter routes are not as important as transepidermal absorption. A number of chemicals used in industry and in farming have caused systemic toxicity by absorption through the skin. Some well established examples are mercury, tetraethyl lead, aromatic and amino nitro compounds and certain organophosphate and chlorinated hydrocarbon pesticides. It should be noted that for many substances, systemic toxicity generally arises through inhalation, but percutaneous absorption is possible and should not be overlooked.
A remarkable feature of cutaneous defence is the ability of the skin to continually replace the basal cells which provide the epidermis with its own built-in replication and repair system.
The skin’s ability to act as a heat exchanger is essential to life. Sweat gland function, vascular dilation and constriction under nervous control are vital to regulating body heat, as is evaporation of surface water on skin. Constriction of the blood vessels protects against cold exposures by preserving central body heat. Multiple nerve endings within the skin act as sensors for heat, cold and other excitants by relaying the presence of the stimulant to the nervous system which responds to the provoking agent.
A major deterrent against injury from ultraviolet radiation, a potentially harmful component of sunlight and of some forms of artificial light, is the pigment (melanin) manufactured by the melanocytes located in the basal cell layer of the epidermis. Melanin granules are picked up by the epidermal cells and serve to add protection against the rays of natural or artificial light which penetrate the skin. Additional protection, though less in degree, is furnished by the keratin cell layer, which thickens following ultraviolet exposure. (As discussed below, for those whose worksites are outdoors it is essential to protect exposed skin with a sunscreen agent protective against both UV-A and UV-B (rating of 15 or greater), together with appropriate clothing to provide a high level of shielding against sunlight injury.)
Types of Occupational Skin Diseases
Occupational dermatoses vary both in their appearance (morphology) and severity. The effect of an occupational exposure may range from the slightest erythema (reddening) or discoloration of the skin to a far more complex change, such as a malignancy. Despite the wide range of substances that are known to cause skin effects, in practice it is difficult to associate a specific lesion with exposure to a specific material. However, certain chemical groups are associated with characteristic reaction patterns. The nature of the lesions and their location may provide a strong clue as to causality.
A number of chemicals with or without direct toxic effect on the skin can also cause systemic intoxication following absorption through the skin. In order to act as a systemic toxin, the agent must pass through the keratin and the epidermal cell layers, then through the epidermal-dermal junction. At this point it has ready access to the bloodstream and the lymphatic system and can now be carried to vulnerable target organs.
Acute contact dermatitis (irritant or allergic)
Acute contact eczematous dermatitis can be caused by hundreds of irritant and sensitizing chemicals, plants and photoreactive agents. Most occupational allergic dermatoses can be classified as acute eczematous contact dermatitis. Clinical signs are heat, redness, swelling, vesiculation and oozing. Symptoms include itch, burning and general discomfort. The back of the hands, the inner wrists and the forearms are the usual sites of attack, but acute contact dermatitis can occur anywhere on the skin. If the dermatosis occurs on the forehead, the eyelids, the ears, the face or the neck, it is logical to suspect that a dust or a vapour may be involved in the reaction. When there is a generalized contact dermatitis, not restricted to one or a few specific sites, it is usually caused by a more extensive exposure, such as the wearing of contaminated clothing, or by autosensitization from a pre-existing dermatitis. Severe blistering or destruction of tissue generally indicates the action of an absolute or strong irritant. The exposure history, which is taken as part of the medical control of occupational dermatitis, may reveal the suspected causative agent. An accompanying article in this chapter provides more details on contact dermatitis.
Sub-acute contact dermatitis
Through a cumulative effect, repeated contact with weak and moderate irritants can cause a subacute form of contact dermatitis characterized by dry, red plaques. If the exposure continues, the dermatitis will become chronic.
Chronic eczematous contact dermatitis
When a dermatitis recurs over an extended period of time it is called chronic eczematous contact dermatitis. The hands, fingers, wrists and forearms are the sites most often affected by chronic eczematous lesions, characterized by dry, thickened and scaly skin. Cracking and fissuring of the fingers and the palms may be present. Chronic nail dystrophy is also commonly found. Frequently, the lesions will begin to ooze (sometimes called “weeping”) because of re-exposure to the responsible agent or by imprudent treatment and care. Many materials not responsible for the original dermatosis will sustain this chronic recurrent skin problem.
Photosensitivity dermatitis (phototoxic or photoallergic)
Most photoreactions on the skin are phototoxic. Natural and artificial light sources, alone or in combination with various chemicals, plants or drugs, can induce a phototoxic or photoallergic response. A phototoxic reaction is generally limited to light-exposed areas, while a photoallergic reaction can frequently develop on non-exposed body surfaces as well. Some examples of photoreactive chemicals are coal tar distillation products, such as creosote, pitch and anthracene. Members of the plant family Umbelliferae are well-known photoreactors; they include cow parsnip, celery, wild carrot, fennel and dill. The reactive agents in these plants are psoralens.
Folliculitis and acneform dermatoses, including chloracne
Workers with dirty jobs often develop lesions involving the follicular openings. Comedones (blackheads) may be the only obvious effect of the exposure, but often a secondary infection of the follicle may ensue. Poor personal hygiene and ineffective cleansing habits can add to the problem. Follicular lesions generally occur on the forearms and less often on the thighs and buttocks, but they can occur anywhere except on the palms and soles.
Follicular and acneform lesions are caused by overexposure to insoluble cutting fluids, to various tar products, paraffin and certain aromatic chlorinated hydrocarbons. The acne caused by any of the above agents can be extensive. Chloracne is the most serious form, not only because it can lead to disfigurement (hyperpigmentation and scarring) but also because of the potential liver damage, including porphyria cutanea tarda, and other systemic effects that the chemicals can cause. Chloronaphthalenes, chlorodiphenyls, chlorotriphenyls, hexachlorodibenzo-p-dioxin, tetrachloroazoxybenzene and tetrachlorodibenzodioxin (TCDD) are among the chloracne-causing chemicals. The blackheads and cystic lesions of chloracne often appear first on the sides of the forehead and the eyelids. If exposure continues, lesions may occur over widespread areas of the body, except for the palms and soles.
Sweat-induced reactions
Many types of work involve exposure to heat. Where there is too much heat and sweating, followed by too little evaporation of the sweat from the skin, prickly heat can develop. When the affected area is chafed by skin rubbing against skin, a secondary bacterial or fungal infection may frequently occur. This happens particularly in the underarm area, under the breast, in the groin and between the buttocks.
Pigment change
Occupationally induced changes in skin colour can be caused by dyes, heavy metals, explosives, certain chlorinated hydrocarbons, tars and sunlight. The change in skin colour may be the result of a chemical reaction within the keratin, as for example, when the keratin is stained by metaphenylenediamine or methylene blue or trinitrotoluene. Sometimes permanent discoloration may occur more deeply in the skin as with argyria or traumatic tattoo. Increased pigmentation induced by chlorinated hydrocarbons, tar compounds, heavy metals and petroleum oils generally results from melanin stimulation and overproduction. Hypopigmentation or depigmentation at selected sites can be caused by a previous burn, contact dermatitis, contact with certain hydroquinone compounds or other antioxidant agents used in selected adhesives and sanitizing products. Among the latter are tertiary amyl phenol, tertiary butyl catechol and tertiary butyl phenol.
New growths
Neoplastic lesions of occupational origin may be malignant or benign (cancerous or non-cancerous). Melanoma and non-melanocytic skin cancer are discussed in two other articles in this chapter. Traumatic cysts, fibromata, asbestos warts, petroleum and tar warts and keratoacanthomas are typical benign new growths. Keratoacanthomas can be associated with excessive exposure to sunlight and have also been ascribed to contact with petroleum, pitch and tar.
Ulcerative changes
Chromic acid, concentrated potassium dichromate, arsenic trioxide, calcium oxide, calcium nitrate and calcium carbide are documented ulcerogenic chemicals. Favourite attack sites are the fingers, hands, folds and palmar creases. Several of these agents also cause perforation of the nasal septum.
Chemical or thermal burns, blunt injury or infections resulting from bacteria and fungi may result in ulcerous excavations on the affected part.
Granulomas
Granulomas can arise from many occupational sources if the appropriate circumstances are present. Granulomas can be caused by occupational exposures to bacteria, fungi, viruses or parasites. Inanimate substances, such as bone fragments, wood splinters, cinders, coral and gravel, and minerals such as beryllium, silica and zirconium, can also cause granulomas after skin embedment.
Other conditions
Occupational contact dermatitis accounts for at least 80% of all cases of occupational skin diseases. However, a number of other changes that affect the skin, hair and nails are not included in the foregoing classification. Hair loss caused by burns, mechanical trauma or certain chemical exposures is one example. A facial flush that follows the combination of drinking alcohol and inhaling certain chemicals, such as trichloroethylene and disulfiram, is another. Acroosteolysis, a type of bony disturbance of the digits, together with vascular changes of the hands and forearm (with or without Raynaud’s syndrome), has been reported among polyvinyl chloride polymerization tank cleaners. Nail changes are covered in a separate article in this chapter.
Physiopathology or Mechanisms of Occupational Skin Diseases
The mechanisms by which primary irritants act are understood only in part. For instance, vesicant or blister gases (nitrogen mustard, bromomethane, Lewisite, etc.) interfere with certain enzymes and thereby block selective phases in the metabolism of carbohydrates, fats and proteins. Why and how the blister results is not clearly understood, but observations of how chemicals react outside the body yield some ideas about possible biological mechanisms.
In brief, because alkali reacts with acid or lipid or protein, it has been presumed that it also reacts with skin lipid and protein. In so doing, surface lipids are changed and keratin structure becomes disturbed. Organic and inorganic solvents dissolve fats and oils and have the same effect on cutaneous lipids. Additionally, however, it appears that solvents abstract some substance or change the skin in such a way that the keratin layer dehydrates and the skin’s defence is no longer intact. Continued insult results in an inflammatory reaction eventuating in contact dermatitis.
Certain chemicals readily combine with the water within skin or on the surface of the skin, and cause a vigorous chemical reaction. Calcium compounds, such as calcium oxide and calcium chloride, produce their irritant effect in this way.
Substances such as coal tar pitch, creosote, crude petroleum and certain aromatic chlorinated hydrocarbons, in combination with sunlight exposure, stimulate the pigment-producing cells to overproduce melanin, leading to hyperpigmentation. Acute dermatitis also may give rise to hyperpigmentation after healing. Conversely, burns, mechanical trauma, chronic contact dermatitis, and contact with monobenzyl ether of hydroquinone or certain phenolics can induce hypo- or de-pigmented skin.
Arsenic trioxide, coal tar pitch, sunlight and ionizing radiation, among other agents, can damage the skin cells so that abnormal cell growth results in cancerous change of the exposed skin.
Unlike primary irritation, allergic sensitization is the result of a specifically acquired alteration in the capacity to react, brought about by T-cell activation. For several years it has been agreed that contact allergic eczematous dermatitis accounts for about 20% of all the occupational dermatoses. This figure is probably too conservative in view of the continued introduction of new chemicals, many of which have been shown to cause allergic contact dermatitis.
Causes of Occupational Skin Diseases
Materials or conditions known to cause occupational skin disease are almost unlimited in number. They are conventionally divided into mechanical, physical, biological and chemical categories, and the agents within each category continue to grow in number each year.
Mechanical
Friction, pressure or other forms of more forceful trauma may induce changes ranging from callus and blisters to myositis, tenosynovitis, osseous injury, nerve damage, laceration, shearing of tissue or abrasion. Lacerations, abrasions, tissue disruption and blisters additionally pave the way for secondary infection by bacteria or, less often, fungi. Almost everyone is exposed each day to one or more forms of mechanical trauma, which may be mild or moderate in degree. However, those who use pneumatic riveters, chippers, drills and hammers are at greater risk of suffering neurovascular, soft tissue, fibrous or bone injury to the hands and forearms because of the repetitive trauma from the tool. The use of vibration-producing tools which operate in a certain frequency range can induce painful spasms in the fingers of the tool-holding hand. Transfer to other work, where possible, generally provides relief. Modern equipment is designed to reduce vibration and thus obviate the problem.
Physical agents
Heat, cold, electricity, sunlight, artificial ultraviolet radiation, laser radiation and high-energy sources such as x rays, radium and other radioactive substances are potentially injurious to skin and to the entire body. High temperature and humidity at work or in a tropical work environment can impair the sweat mechanism and cause systemic effects known as sweat retention syndrome. Milder exposure to heat may induce prickly heat, intertrigo (chafing), skin maceration and supervening bacterial or fungal infection, particularly in overweight and diabetic individuals.
Thermal burns are frequently experienced by electric furnace operators, lead burners, welders, laboratory chemists, pipe-line workers, road repairmen, roofers and tar plant workers contacting liquid tar. Prolonged exposure to cold water or lowered temperatures causes mild to severe injury ranging from erythema to blistering, ulceration and gangrene. Frostbite affecting the nose, ears, fingers and toes of construction workers, firemen, postal workers, military personnel and other outdoor workers is a common form of cold injury.
Electrical exposure resulting from contact with short circuits, bare wires or defective electrical apparatus causes burns of the skin and destruction of deeper tissue.
Few workers are without exposure to sunlight, and some individuals with repeated exposure incur severe actinic damage to the skin. Modern industry also has many sources of potentially injurious artificial ultraviolet wavelengths, such as welding, metal burning, molten-metal pouring, glass blowing, electric furnace tending, plasma torch burning and laser beam operations. Apart from the natural capacity of ultraviolet rays in natural or artificial light to injure skin, coal tar and several of its by-products (including certain dyes), selected photoreactive components of plants and fruits, and a number of topical and parenteral medications contain harmful chemicals which are activated by certain wavelengths of ultraviolet rays. Such photoreaction effects may operate by either phototoxic or photoallergic mechanisms.
High-intensity electromagnetic energy associated with laser beams is well able to injure human tissue, notably the eye. Skin damage is less of a risk but can occur.
Biological
Occupational exposures to bacteria, fungi, viruses or parasites may cause primary or secondary infections of the skin. Prior to the advent of modern antibiotic therapy, bacterial and fungal infections were more commonly encountered and were associated with disabling illness and even death. While bacterial infections can occur in any kind of work setting, certain workers, such as animal breeders and handlers, farmers, fishermen, food processors and hide handlers, have greater exposure potential. Similarly, fungal (yeast) infections are common among bakers, bartenders, cannery workers, cooks, dishwashers, child-care workers and food processors. Dermatoses due to parasitic infections are not common, but when they do occur they are seen most often among agricultural and livestock workers, grain handlers and harvesters, longshoremen and silo workers.
Cutaneous viral infections caused by work are few in number, yet some, such as milker’s nodules among dairy workers, herpes simplex among medical and dental personnel and sheep pox among livestock handlers, continue to be reported.
Chemicals
Organic and inorganic chemicals are the major source of hazards to the skin. Hundreds of new agents enter the work environment each year and many of these will cause cutaneous injury by acting as primary skin irritants or allergic sensitizers. It has been estimated that 75% of the occupational dermatitis cases are caused by primary irritant chemicals. However, in clinics where the diagnostic patch test is commonly used, the frequency of occupational allergic contact dermatitis is increased. By definition, a primary irritant is a chemical substance which will injure every person’s skin if sufficient exposure takes place. Irritants can be rapidly destructive (strong or absolute) as would occur with concentrated acids, alkalis, metallic salts, certain solvents and some gases. Such toxic effects can be observed within a few minutes, depending upon the concentration of the contactant and the length of contact which occurs. Conversely, dilute acids and alkalis, including alkaline dusts, various solvents and soluble cutting fluids, among other agents, may require several days of repeated contact to produce observable effects. These materials are termed “marginal or weak irritants”.
Plants and woods
Plants and woods are often classified as a separate cause of skin disease, but they can also be correctly included in the chemical grouping. Many plants cause mechanical and chemical irritation and allergic sensitization, while others have gained attention because of their photoreactive capacity. The family Anacardiaceae, which includes poison ivy, poison oak, poison sumac, cashew-nut shell oil and the Indian marking nut, is a well-known cause of occupational dermatitis due to its active ingredients (polyhydric phenols). Poison ivy, oak and sumac are common causes of allergic contact dermatitis. Other plants associated with occupational and non-occupational contact dermatitis include castor bean, chrysanthemum, hops, jute, oleander, pineapple, primrose, ragweed, hyacinth and tulip bulbs. Fruits and vegetables, including asparagus, carrots, celery, chicory, citrus fruits, garlic and onions, have been reported as causing contact dermatitis in harvesters, food packing and food preparation workers.
Several varieties of wood have been named as causes of occupational dermatoses among lumberers, sawyers, carpenters and other wood craftspeople. However, the frequency of skin disease is much less than is experienced from contact with poisonous plants. It is likely that some of the chemicals used for preserving the wood cause more dermatitic reactions than the oleoresins contained in wood. Among the preservative chemicals used to protect against insects, fungi and deterioration from soil and moisture are chlorinated diphenyls, chlorinated naphthalenes, copper naphthenate, creosote, fluorides, organic mercurials, tar and certain arsenical compounds, all known causes of occupational skin diseases.
Non-Occupational Factors in Occupational Skin Disease
Considering the numerous direct causes of occupational skin disease cited above, it can be readily understood that practically any job has obvious and often hidden hazards. Indirect or predisposing factors may also merit attention. A predisposition can be inherited and related to skin colour and type or it may represent a skin defect acquired from other exposures. Whatever the reason, some workers have lower tolerance to materials or conditions in the work environment. In large industrial plants, medical and hygiene programmes can provide the opportunity for placement of such employees in work situations that will not further impair their health. In small plants, however, predisposing or indirect causal factors may not be given proper medical attention.
Pre-existing skin conditions
Several non-occupational diseases affecting the skin can be worsened by various occupational influences.
Acne. Adolescent acne in employees is generally made worse by machine tool, garage and tar exposures. Insoluble oils, various tar fractions, greases and chloracnegenic chemicals are definite hazards to these people.
Chronic eczemas. Detecting the cause of chronic eczema affecting the hands and sometimes distant sites can be elusive. Allergic dermatitis, pompholyx, atopic eczema, pustular psoriasis and fungal infections are some examples. Whatever the condition, any number of irritant chemicals, including plastics, solvents, cutting fluids, industrial cleansers and prolonged moisture, can worsen the eruption. Employees who must continue to work will do so with much discomfort and probably lowered efficiency.
Dermatomycosis. Fungal infections can be worsened at work. When fingernails become involved it may be difficult to assess the role of chemicals or trauma in the nail involvement. Chronic tinea of the feet is subject to periodic worsening, particularly when heavy footgear is required.
Hyperhidrosis. Excessive sweating of the palms and soles can soften the skin (maceration), particularly when impervious gloves or protective footgear are required. This will increase a person’s vulnerability to the effects of other exposures.
Miscellaneous conditions. Employees with polymorphous light eruption, chronic discoid lupus erythematosus, porphyria or vitiligo are definitely at greater risk, particularly if there is simultaneous exposure to natural or artificial ultraviolet radiation.
Skin type and pigmentation
Redheads and blue-eyed blondes, particularly those of Celtic origin, have less tolerance to sunlight than people of darker skin type. Such skin is also less able to tolerate exposures to photoreactive chemicals and plants and is suspected of being more susceptible to the action of primary irritant chemicals, including solvents. In general, black skin has a superior tolerance to sunlight and photoreactive chemicals and is less prone to the induction of cutaneous cancer. However, darker skin tends to respond to mechanical, physical or chemical trauma by displaying post-inflammatory pigmentation. It is also more prone to develop keloids following trauma.
Certain skin types, such as hairy, oily, swarthy skins, are more likely to incur folliculitis and acne. Employees with dry skin and those with ichthyoses are at a disadvantage if they must work in low humidity environments or with chemical agents which dehydrate skin. For those workers who sweat profusely, a need to wear impervious protective gear will add to their discomfort. Similarly, overweight individuals usually experience prickly heat during the warm months in hot working environments or in tropical climates. While sweat can be helpful in cooling the skin, it can also hydrolyze certain chemicals that will act as skin irritants.
Diagnosing Occupational Skin Diseases
Cause and effect of occupational skin disease can best be ascertained through a detailed history, which should cover the past and present health and work status of the employee. Family history, particularly of allergies, and personal illness in childhood and the past are important. The title of the job, the nature of the work, the materials handled and how long the job has been done should be noted. It is important to know when and where on the skin the rash appeared, the behaviour of the rash away from work, whether other employees were affected, what was used to cleanse and protect the skin, and what has been used for treatment (both self-medication and prescribed medication); whether the employee has had dry skin, chronic hand eczema, psoriasis or other skin problems; what drugs, if any, have been used for any particular disease; and finally, which materials have been used in home hobbies such as gardening, woodworking or painting.
The following elements are important parts of the clinical diagnosis:
An occupationally induced acute contact eczematous dermatitis tends to improve upon cessation of contact, and modern therapeutic agents can hasten recovery. However, if a worker returns to work and to the same conditions, without proper preventive measures undertaken by the employer and without the necessary precautions being explained to and understood by the worker, the dermatosis is likely to recur soon after re-exposure.
Chronic eczematous dermatoses, acneform lesions and pigmentary changes are less responsive to treatment even when contact is eliminated. Ulcerations usually improve with elimination of the source. With granulomatous and tumour lesions, eliminating contact with the offending agent may prevent future lesions but will not dramatically change already existing disease.
When a patient with a suspected occupational dermatosis has not improved within two months after no longer having contact with the suspected agent, other reasons for the persistence of the disease should be explored. However, dermatoses caused by metals such as nickel or chrome have a notoriously prolonged course, partly because of their ubiquitous nature. Even removal from work cannot eliminate the workplace as the source of the disease. If these and other potential allergens have been eliminated as causal, it is reasonable to conclude that the dermatitis is either non-occupational or is being perpetuated by non-occupational contacts, such as maintenance and repair of automobiles and boats, tile-setting glues, garden plants, or even medical therapy, prescribed or otherwise.
Cutaneous sensitivity shares the main elements of all the basic senses. Properties of the external world, such as colour, sound, or vibration, are received by specialized nerve cell endings called sensory receptors, which convert external data into nervous impulses. These signals are then conveyed to the central nervous system, where they become the basis for interpreting the world around us.
It is useful to recognize three essential points about these processes. First, energy, and changes in energy levels, can be perceived only by a sense organ capable of detecting the specific type of energy in question. (This is why microwaves, x rays, and ultraviolet light are all dangerous; we are not equipped to detect them, so that even at lethal levels they are not perceived.) Second, our perceptions are necessarily imperfect shadows of reality, as our central nervous system is limited to reconstructing an incomplete image from the signals conveyed by its sensory receptors. Third, our sensory systems provide us with more accurate information about changes in our environment than about static conditions. We are well-equipped with sensory receptors sensitive to flickering lights, for example, or to the tiny fluctuations of temperature provoked by a slight breeze; we are less well-equipped to receive information about a steady temperature, say, or a constant pressure on the skin.
Traditionally the skin senses are divided into two categories: cutaneous and deep. While deep sensitivity relies on receptors located in muscle, tendons, joints, and the periosteum (membrane surrounding the bones), cutaneous sensitivity, with which we are concerned here, deals with information received by receptors in the skin: specifically, the various classes of cutaneous receptors that are located in or near the junction of the dermis and the epidermis.
All sensory nerves linking cutaneous receptors to the central nervous system have roughly the same structure. The cell’s large body resides in a cluster of other nerve cell bodies, called a ganglion, located near the spinal cord and connected to it by a narrow branch of the cell’s trunk, called its axon. Most nerve cells, or neurons, that originate at the spinal cord send axons to bones, muscle, joints, or, in the case of cutaneous sensitivity, to the skin. Just like an insulated wire, each axon is covered along its course and at its endings with protective layers of cells known as Schwann cells. These Schwann cells produce a substance known as myelin, which coats the axon like a sheath. At intervals along the way are tiny breaks in the myelin, known as nodes of Ranvier. Finally, at the end of the axon are found the components that specialize in receiving and retransmitting information about the external environment: the sensory receptors (Mountcastle 1974).
The different classes of cutaneous receptors, like all sensory receptors, are defined in two ways: by their anatomical structures, and by the type of electrical signals they send along their nerve fibres. Distinctly structured receptors are usually named after their discoverers. The relatively few classes of sensory receptors found in the skin can be divided into three main categories: mechanoreceptors, thermal receptors, and nociceptors.
All of these receptors can convey information about a particular stimulus only after they have first encoded it in a type of electrochemical neural language. These neural codes use varying frequencies and patterns of nerve impulses that scientists have only just begun to decipher. Indeed, an important branch of neurophysiological research is devoted entirely to the study of sensory receptors and the ways in which they translate energy states in the environment into neural codes. Once the codes are generated, they are conveyed centrally along afferent fibres, the nerve cells that serve receptors by conveying the signals to the central nervous system.
The messages produced by receptors can be subdivided on the basis of the response given to a continuous, unvarying stimulation: slowly adapting receptors send electrochemical impulses to the central nervous system for the duration of a constant stimulus, whereas rapidly adapting receptors gradually reduce their discharges in the presence of a steady stimulus until they reach a low baseline level or cease entirely, thereupon ceasing to inform the central nervous system about the continuing presence of the stimulus.
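As a rough illustration of this distinction (not drawn from the source literature), the hypothetical sketch below contrasts an idealized slowly adapting receptor, which keeps discharging throughout a constant stimulus, with a rapidly adapting one, whose discharge decays toward silence; the peak rate and decay constant are arbitrary illustrative values.

```python
# Toy model (illustrative only): discharge of a slowly adapting versus a
# rapidly adapting receptor during a constant one-second stimulus. The peak
# rate and decay constant are arbitrary choices, not physiological values.
import math

def firing_rate(t, adapting="slow", peak_rate=100.0, tau=0.05):
    """Idealized firing rate (impulses/s) at time t seconds after stimulus onset."""
    if adapting == "slow":
        return peak_rate                   # sustained discharge for the whole stimulus
    return peak_rate * math.exp(-t / tau)  # discharge decays rapidly toward silence

for t in (0.0, 0.05, 0.2, 0.5, 1.0):
    print(f"t = {t:4.2f} s   slowly adapting: {firing_rate(t, 'slow'):6.1f}/s"
          f"   rapidly adapting: {firing_rate(t, 'fast'):6.1f}/s")
```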
The distinctly different sensations of pain, warmth, cold, pressure, and vibration are thus produced by activity in distinct classes of sensory receptors and their associated nerve fibres. The terms “flutter” and “vibration,” for example, are used to distinguish two slightly different vibratory sensations encoded by two different classes of vibration-sensitive receptors (Mountcastle et al. 1967). The three important categories of pain sensation known as pricking pain, burning pain, and aching pain have each been associated with a distinct class of nociceptive afferent fibre. This is not to say, however, that a specific sensation necessarily involves only one class of receptor; more than one receptor class may contribute to a given sensation, and, in fact, sensations may differ depending on the relative contribution of different receptor classes (Sinclair 1981).
The preceding summary is based on the specificity hypothesis of cutaneous sensory function, first formulated by a German physician named Von Frey in 1906. Although at least two other theories of equal or perhaps greater popularity have been proposed during the past century, Von Frey’s hypothesis has now been strongly supported by factual evidence.
Receptors that Respond to Constant Skin Pressure
In the hand, relatively large myelinated fibres (5 to 15 μm in diameter) emerge from a subcutaneous nerve network called the subpapillary nerve plexus and end in a spray of nerve terminals at the junction of the dermis and the epidermis (figure 1). In hairy skin, these nerve endings culminate in visible surface structures known as touch domes; in glabrous, or hairless, skin, the nerve endings are found at the base of skin ridges (such as those forming the fingerprints). There, in the touch dome, each nerve fibre tip, or neurite, is enclosed by a specialized epithelial cell known as a Merkel cell (see figures 2 and 3).
Figure 1. A schematic illustration of a cross-section of the skin
Figure 2. The touch dome on each raised region of skin contains 30 to 70 Merkel cells.
Figure 3. At a higher magnification available with the electron microscope, the Merkel cell, a specialized epithelial cell, is seen to be attached to the basement membrane that separates the epidermis from the dermis.
The Merkel cell neurite complex transduces mechanical energy into nerve impulses. While little is known about the cell’s role or about its mechanism of transduction, it has been identified as a slowly adapting receptor. This means that pressure on a touch dome containing Merkel cells causes the receptors to produce nerve impulses for the duration of the stimulus. These impulses rise in frequency in proportion to the intensity of the stimulus, thereby informing the brain of the duration and magnitude of pressure on the skin.
Like the Merkel cell, a second slowly adapting receptor also serves the skin by signalling the magnitude and duration of steady skin pressures. Visible only through a microscope, this receptor, known as the Ruffini receptor, consists of a group of neurites emerging from a myelinated fibre and encapsulated by connective tissue cells. Within the capsule structure are fibres that apparently transmit local skin distortions to the neurites, which in turn produce the messages sent along the neural highway to the central nervous system. Pressure on the skin causes a sustained discharge of nerve impulses; as with the Merkel cell, the frequency of nerve impulses is proportional to the intensity of the stimulus.
Despite their similarities, there is one outstanding difference between Merkel cells and Ruffini receptors. Whereas sensation results when Ruffini receptors are stimulated, stimulation of touch domes housing Merkel cells produces no conscious sensation; the touch dome is thus a mystery receptor, for its actual role in neural function remains unknown. Ruffini receptors, then, are believed to be the only receptors capable of providing the neural signals necessary for the sensory experience of pressure, or constant touch. In addition, it has been shown that the slowly adapting Ruffini receptors account for the ability of humans to rate cutaneous pressure on a scale of intensity.
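For illustration only, the following sketch models the intensity coding just described: a hypothetical slowly adapting receptor whose impulse frequency grows in proportion to pressure above a threshold, up to a ceiling. The threshold, gain and ceiling values are assumptions, not measured physiological constants.

```python
# Minimal sketch of rate coding by a slowly adapting pressure receptor:
# impulse frequency rises in proportion to stimulus intensity above a
# threshold. Threshold, gain and ceiling are illustrative assumptions.
def impulses_per_second(pressure, threshold=1.0, gain=20.0, ceiling=200.0):
    """Map a skin-pressure intensity (arbitrary units) to a firing rate."""
    if pressure <= threshold:
        return 0.0
    return min(ceiling, gain * (pressure - threshold))

for p in (0.5, 2.0, 5.0, 12.0):
    print(f"pressure {p:5.1f} -> {impulses_per_second(p):6.1f} impulses/s")
```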
Receptors that Respond to Vibration and Skin Movement
In contrast with slowly adapting mechanoreceptors, rapidly adapting receptors remain silent during sustained skin indentation. They are, however, well-suited to signal vibration and skin movement. Two general categories are noted: those in hairy skin, which are associated with individual hairs; and those which form corpuscular endings in glabrous, or hairless, skin.
Receptors serving hairs
A typical hair is enveloped by a network of nerve terminals branching from five to nine large myelinated axons (figure 4). In primates, these terminals fall into three categories: lanceolate endings, spindle-like terminals, and papillary endings. All three are rapidly adapting, such that a steady deflection of the hair causes nerve impulses only while movement occurs. Thus, these receptors are exquisitely sensitive to moving or vibratory stimuli, but provide little or no information about pressure, or constant touch.
Figure 4. The shafts of hairs are a platform for nerve terminals that detect movements.
Lanceolate endings arise from a heavily myelinated fibre that forms a network around the hair. The terminal neurites lose their usual coverage of Schwann cells and work their way among the cells at the base of the hair.
Spindle-like terminals are formed by axon terminals surrounded by Schwann cells. The terminals ascend to the sloping hair shaft and end in a semicircular cluster just below a sebaceous, or oil-producing, gland. Papillary endings differ from spindle-like terminals because instead of ending on the hair shaft, they terminate as free nerve endings around the orifice of the hair.
There are, presumably, functional differences among the receptor types found on hairs. This can be inferred in part from structural differences in the way the nerves end on the hair shaft and in part from differences in the diameter of axons, as axons of different diameters connect to different central relay regions. Still, the functions of receptors in hairy skin remain an area for study.
Receptors in glabrous skin
The correlation of a receptor’s anatomical structure with the neural signals it generates is most pronounced in large and easily manipulable receptors with corpuscular, or encapsulated, endings. Particularly well understood are the pacinian and Meissner corpuscles, which, like the nerve endings in hairs discussed above, convey sensations of vibration.
The pacinian corpuscle is large enough to be seen with the naked eye, making it easy to link the receptor with a specific neural response. Located in the dermis, usually around tendons or joints, it is an onion-like structure, measuring 0.5 × 1.0 mm. It is served by one of the body’s largest afferent fibres, having a diameter of 8 to 13 μm and conducting at 50 to 80 metres per second. Its anatomy, well-studied by both light and electron microscopy, is well known.
The principal component of the corpuscle is an outer core formed of cellular material enclosing fluid-filled spaces. The outer core itself is then surrounded by a capsule that is penetrated by a central canal and a capillary network. Passing through the canal is a single myelinated nerve fibre 7 to 11 μm in diameter, which becomes a long, nonmyelinated nerve terminal that probes deep into the centre of the corpuscle. The terminal axon is elliptical, with branch-like processes.
The pacinian corpuscle is a rapidly adapting receptor. When subjected to sustained pressure, it thus produces an impulse only at the beginning and the end of the stimulus. It responds to high-frequency vibrations (80 to 400 Hz) and is most sensitive to vibrations around 250 Hz. Often, these receptors respond to vibrations transmitted along bones and tendons, and because of their extreme sensitivity, they may be activated by as little as a puff of air on the hand (Martin 1985).
In addition to the pacinian corpuscle, there is another rapidly adapting receptor in glabrous skin. Most researchers believe it to be the Meissner corpuscle, located in the dermal papillae of the skin. Responsive to low-frequency vibrations of 2 to 40 Hz, this receptor consists of the terminal branches of a medium-sized myelinated nerve fibre enveloped in one or several layers of what appear to be modified Schwann cells, called laminar cells. The receptor’s neurites and laminar cells may connect to a basal cell in the epidermis (figure 5).
Figure 5. The Meissner corpuscle is a loosely encapsulated sensory receptor in the dermal papillae of glabrous skin.
If the Meissner corpuscle is selectively inactivated by the injection of a local anaesthetic through the skin, the sense of flutter or low-frequency vibration is lost. This suggests that it functionally complements the high frequency capacity of the pacinian corpuscles. Together, these two receptors provide neural signals sufficient to account for human sensibility to a full range of vibrations (Mountcastle et al. 1967).
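The division of labour between these two vibration receptors can be summarized in a small sketch, assuming the frequency bands quoted above (roughly 2 to 40 Hz for the Meissner corpuscle and 80 to 400 Hz for the pacinian corpuscle); treating the bands as sharp cut-offs is a simplification for illustration.

```python
# Illustrative sketch of the frequency ranges quoted in the text: Meissner
# corpuscles cover low-frequency flutter (about 2-40 Hz) and pacinian
# corpuscles cover high-frequency vibration (about 80-400 Hz, most sensitive
# near 250 Hz). Sharp band edges are a simplification.
def responding_receptors(freq_hz):
    receptors = []
    if 2 <= freq_hz <= 40:
        receptors.append("Meissner corpuscle (flutter)")
    if 80 <= freq_hz <= 400:
        receptors.append("pacinian corpuscle (vibration)")
    return receptors or ["neither class strongly engaged"]

for f in (5, 30, 60, 250, 500):
    print(f"{f:3d} Hz -> {', '.join(responding_receptors(f))}")
```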
Cutaneous Receptors Associated with Free Nerve Endings
Many still unidentifiable myelinated and unmyelinated fibres are found in the dermis. A large number are only passing through, on their way to skin, muscles, or periosteum, while others (both myelinated and unmyelinated) appear to end in the dermis. With a few exceptions, such as the pacinian corpuscle, most fibres in the dermis appear to end in poorly defined ways or simply as free nerve endings.
While more anatomical study is needed to differentiate these ill-defined endings, physiological research has clearly shown that these fibres encode a variety of environmental events. For example, free nerve endings found at the junction between the dermis and epidermis are responsible for encoding the environmental stimuli that will be interpreted as cold, warmth, heat, pain, itch, and tickle. It is not yet known which of these different classes of small fibres convey particular sensations.
The apparent anatomical similarity of these free nerve endings is probably due to the limitations of our investigative techniques, since structural differences among free nerve endings are slowly coming to light. For example, in glabrous skin, two different terminal modes of free nerve endings have been distinguished: a thick, short pattern and a long, thin one. Studies of human hairy skin have demonstrated histochemically recognizable nerve endings that terminate at the dermal-epidermal junction: the penicillate and papillary endings. The former arise from unmyelinated fibres and form a network of endings; in contrast, the latter arise from myelinated fibres and end around the hair orifices, as mentioned earlier. Presumably, these structural disparities correspond to functional differences.
Although it is not yet possible to assign specific functions to individual structural entities, it is clear from physiological experiments that there exist functionally different categories of free nerve endings. One small myelinated fibre has been found to respond to cold in humans. Another unmyelinated fibre serving free nerve endings responds to warmth. How one class of free nerve endings can respond selectively to a drop in temperature, while an increase of skin temperature can provoke another class to signal warmth is unknown. Studies show that activation of one small fibre with a free ending may be responsible for itching or tickling sensations, while there are believed to be two classes of small fibres specifically sensitive to noxious mechanical and noxious chemical or thermal stimuli, providing the neural basis for pricking and burning pain (Keele 1964).
The definitive correlation between anatomy and physiological response awaits the development of more advanced techniques. This is one of the major stumbling blocks in the management of disorders such as causalgia, paraesthesia, and hyperpathia, which continue to present a dilemma to the physician.
Peripheral Nerve Injury
Neural function can be divided into two categories: sensory and motor. Peripheral nerve injury, usually resulting from the crushing or severing of a nerve, can impair either function or both, depending on the types of fibres in the damaged nerve. Certain aspects of motor loss tend to be misinterpreted or overlooked, as these signals do not go to muscles but rather affect autonomic vascular control, temperature regulation, the nature and thickness of the epidermis, and the condition of cutaneous mechano-receptors. The loss of motor innervation will not be discussed here, nor will the loss of innervation affecting senses other than those responsible for cutaneous sensation.
The loss of sensory innervation to the skin creates a vulnerability to further injury, as it leaves an anaesthetic surface that is incapable of signalling potentially harmful stimuli. Once injured, anaesthetized skin surfaces are slow to heal, perhaps in part on account of the lack of autonomic innervation that normally regulates such key factors as temperature regulation and cellular nutrition.
Over a period of several weeks, denervated cutaneous sensory receptors begin to atrophy, a process which is easy to observe in large encapsulated receptors such as pacinian and Meissner corpuscles. If regeneration of the axons can occur, recovery of function may follow, but the quality of the recovered function will depend upon the nature of the original injury and upon the duration of denervation (McKinnon and Dellon 1988).
Recovery following a nerve crush is more rapid, much more complete and more functional than is recovery after a nerve is severed. Two factors explain the favourable prognosis for a nerve crush. First, more axons may again achieve contact with the skin than after a transection; second, the connections are guided back to their original site by Schwann cells and linings known as basement membranes, both of which remain intact in a crushed nerve, whereas after a nerve transection the nerves often travel to incorrect regions of the skin surface by following the wrong Schwann cell paths. The latter situation results in distorted spatial information being sent to the somatosensory cortex of the brain. In both cases, however, regenerating axons appear capable of finding their way back to the same class of sensory receptors that they previously served.
The reinnervation of a cutaneous receptor is a gradual process. As the growing axon reaches the skin surface, receptive fields are smaller than normal, while the threshold is higher. These receptive points expand with time and gradually coalesce into larger fields. Sensitivity to mechanical stimuli becomes greater and often approaches the sensitivity of normal sensory receptors of that class. Studies using the stimuli of constant touch, moving touch, and vibration have shown that the sensory modalities attributed to different types of receptors return to anaesthetic areas at different rates.
Viewed under a microscope, denervated glabrous skin is seen to be thinner than normal, having flattened epidermal ridges and fewer layers of cells. This confirms that nerves have a trophic, or nutritional, influence on skin. Soon after innervation returns, the dermal ridges become better developed, the epidermis becomes thicker, and axons can be found penetrating the basement membrane. As the axon comes back to the Meissner corpuscle, the corpuscle begins to increase in size, and the previously flattened, atrophic structure returns to its original form. If the denervation has been of long duration, a new corpuscle may form adjacent to the original atrophic skeleton, which remains denervated (Dellon 1981).
As can be seen, an understanding of the consequences of peripheral nerve injury requires knowledge of normal function as well as the degrees of functional recovery. While this information is available for certain nerve cells, others require further investigation, leaving a number of murky areas in our grasp of the role of cutaneous nerves in health and disease.
Three sensory systems are uniquely constructed to monitor contact with environmental substances: olfaction (smell), taste (sweet, salty, sour and bitter perception), and the common chemical sense (detection of irritation or pungency). Because they require stimulation by chemicals, they are termed “chemosensory” systems. Olfactory disorders may be temporary or permanent and include complete or partial smell loss (anosmia or hyposmia) and parosmias (distorted smells, or dysosmia, and phantom smells, or phantosmia) (Mott and Leopold 1991; Mott, Grushka and Sessle 1993). After chemical exposures, some individuals describe a heightened sensitivity to chemical stimuli (hyperosmia). Flavour is the sensory experience generated by the interaction of the smell, taste and irritating components of food and beverages, as well as texture and temperature. Because most flavour is derived from the smell, or aroma, of ingestants, damage to the smell system is often reported as a problem with “taste”.
Chemosensory complaints are frequent in occupational settings and may result from a normal sensory system’s perceiving environmental chemicals. Conversely, they may also indicate an injured system: the requisite contact with chemical substances renders these sensory systems uniquely vulnerable to damage. In the occupational setting, these systems can also be damaged by trauma to the head and by agents other than chemicals (e.g., radiation). Pollutant-related environmental odours can exacerbate underlying medical conditions (e.g., asthma, rhinitis), precipitate the development of odour aversions, or cause a stress-related type of illness. Malodours have been demonstrated to decrease complex task performance (Shusterman 1992).
Early identification of workers with olfactory loss is essential. Certain occupations, such as the culinary arts, wine making and the perfume industry, require a good sense of smell as a prerequisite. Many other occupations require normal olfaction for either good job performance or self-protection. For example, parents or day care workers generally rely on smell to determine children’s hygiene needs. Firefighters need to detect chemicals and smoke. Any worker with ongoing exposure to chemicals is at increased risk if olfactory ability is poor.
Olfaction provides an early warning system to many harmful environmental substances. Once this ability is lost, workers may not be aware of dangerous exposures until the concentration of the agent is high enough to be irritating, damaging to respiratory tissues or lethal. Prompt detection can prevent further olfactory damage through treatment of inflammation and reduction of subsequent exposure. Lastly, if loss is permanent and severe, it may be considered a disability requiring new job training and/or compensation.
Anatomy and Physiology
Olfaction
The primary olfactory receptors are located in patches of tissue, termed olfactory neuroepithelium, at the most superior portion of the nasal cavities (Mott and Leopold 1991). Unlike other sensory systems, the receptor is the nerve. One portion of an olfactory receptor cell extends to the surface of the nasal lining, and the other end connects directly via a long axon to one of two olfactory bulbs in the brain. From here, the information travels to many other areas of the brain. Odorants are volatile chemicals that must contact the olfactory receptor for smell perception to occur. Odorant molecules are trapped by and then diffuse through mucus to attach to cilia at the ends of the olfactory receptor cells. It is not yet known how we are able to detect more than ten thousand odorants, discriminate among as many as 5,000 of them, and judge varying odorant intensities. Recently, a multigene family was discovered that codes for odorant receptors on primary olfactory nerves (Ressler, Sullivan and Buck 1994). This has allowed investigation of how odours are detected and how the smell system is organized. Each neuron may respond broadly to high concentrations of a variety of odorants, but will respond to only one or a few odorants at low concentrations. Once stimulated, surface receptor proteins activate intracellular processes that translate sensory information into an electrical signal (transduction). It is not known what terminates the sensory signal despite continued odorant exposure. Soluble odorant-binding proteins have been found, but their role is undetermined. Proteins that metabolize odorants may be involved, or carrier proteins may transport odorants either away from the olfactory cilia or toward a catalytic site within the olfactory cells.
The portions of the olfactory receptors connecting directly to the brain are fine nerve filaments that travel through a plate of bone. The location and delicate structure of these filaments render them vulnerable to shear injury from blows to the head. Also, because the olfactory receptor is a nerve, physically contacts odorants and connects directly to the brain, substances entering the olfactory cells can travel along the axon into the brain. Because of continued exposure to agents damaging to the olfactory receptor cells, olfactory ability might be lost early in the lifespan were it not for a critical attribute: olfactory receptor nerves are capable of regeneration and may be replaced, provided the tissue has not been completely destroyed. If the damage to the system is more centrally located, however, the nerves cannot be restored.
Common chemical sense
The common chemical sense is initiated by stimulation of multiple free nerve endings of the fifth (trigeminal) cranial nerve in the mucosa. It perceives the irritating properties of inhaled substances and triggers reflexes designed to limit exposure to dangerous agents: sneezing, mucus secretion, reduction of breathing rate or even breath-holding. Strong warning cues compel removal from the irritation as soon as possible. Although the pungency of substances varies, generally the odour of a substance is detected before irritation becomes apparent (Ruth 1986). Once irritation is detected, however, small increases in concentration enhance irritation more than odorant appreciation. Pungency may be evoked through either physical or chemical interactions with receptors (Cometto-Muñiz and Cain 1991). The warning properties of gases or vapours tend to correlate with their water solubilities (Shusterman 1992). Anosmics appear to require higher concentrations of pungent chemicals for detection (Cometto-Muñiz and Cain 1994), but thresholds of detection are not elevated as one ages (Stevens and Cain 1986).
Tolerance and adaptation
Perception of chemicals can be altered by previous encounters. Tolerance develops when exposure reduces the response to subsequent exposures. Adaptation occurs when a constant or rapidly repeated stimulus elicits a diminishing response. For example, short-term solvent exposure markedly, but temporarily, reduces solvent detection ability (Gagnon, Mergler and Lapare 1994). Adaptation can also occur with prolonged exposure at low concentrations or, with some chemicals, rapidly when extremely high concentrations are present. The latter can lead to rapid and reversible olfactory “paralysis”. Nasal pungency typically shows less adaptation and development of tolerance than olfactory sensations. Mixtures of chemicals can also alter perceived intensities. Generally, when odorants are mixed, perceived odorant intensity is less than would be expected from adding the two intensities together (hypoadditivity). Nasal pungency, however, generally shows additivity with exposure to multiple chemicals, and summation of irritation over time (Cometto-Muñiz and Cain 1994). With odorants and irritants in the same mixture, the odour is always perceived as less intense. Because of tolerance, adaptation and hypoadditivity, one must be careful to avoid relying on these sensory systems to gauge the concentration of chemicals in the environment.
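The mixture effects described above can be made concrete with a toy calculation, assuming a fixed hypoadditivity factor for odour and simple additivity for pungency; the factor of 0.7 is an arbitrary illustrative value, not a measured constant.

```python
# Toy calculation of the mixture effects described above: perceived odour
# intensity of a two-component mixture is usually less than the sum of the
# component intensities (hypoadditivity), whereas nasal pungency tends to
# add. The 0.7 factor is an arbitrary illustrative value.
def mixed_odour_intensity(i1, i2, hypoadditivity=0.7):
    return hypoadditivity * (i1 + i2)

def mixed_pungency(p1, p2):
    return p1 + p2  # approximately additive

print("odour:    4 + 3 perceived as", mixed_odour_intensity(4, 3))  # 4.9, less than 7
print("pungency: 4 + 3 perceived as", mixed_pungency(4, 3))         # 7
```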
Olfactory Disorders
General concepts
Olfaction is disrupted when odorants cannot reach olfactory receptors, or when olfactory tissue is damaged. Swelling within the nose from rhinitis, sinusitis or polyps can preclude odorant access. Damage can occur through inflammation in the nasal cavities; destruction of the olfactory neuroepithelium by various agents; trauma to the head; and transmittal of agents via the olfactory nerves to the brain, with subsequent injury to the smell portion of the central nervous system. Occupational settings contain varying amounts of potentially damaging agents and conditions (Amoore 1986; Cometto-Muñiz and Cain 1991; Shusterman 1992; Schiffman and Nagle 1992). Recently published data from 712,000 National Geographic Smell Survey respondents suggest that factory work impairs smell; male and female factory workers reported poorer senses of smell and demonstrated decreased olfaction on testing (Corwin, Loury and Gilbert 1995). In particular, chemical exposures and head trauma were reported more frequently by factory workers than by workers in other occupational settings.
When an occupational olfactory disorder is suspected, identification of the offending agent can be difficult. Current knowledge is largely derived from small series and case reports. Importantly, few studies mention examination of the nose and sinuses, and most rely on the patient’s history for olfactory status rather than on testing of the olfactory system. An additional complicating factor is the high prevalence of non-occupationally related olfactory disturbances in the general population, mostly due to viral infections, allergies, nasal polyps, sinusitis or head trauma. Some of these, however, are also more common in the work environment and will be discussed in detail here.
Rhinitis, sinusitis and polyposis
Individuals with olfactory disturbance must first be assessed for rhinitis, nasal polyps and sinusitis. It is estimated that 20% of the United States population, for example, has upper airway allergies. Environmental exposures can be unrelated, cause inflammation or exacerbate an underlying disorder. Rhinitis is associated with olfactory loss in occupational settings (Welch, Birchall and Stafford 1995). Some chemicals, such as isocyanates, acid anhydrides, platinum salts and reactive dyes (Coleman, Holliday and Dearman 1994), and metals (Nemery 1990) can be allergenic. There is also considerable evidence that chemicals and particles increase sensitivity to nonchemical allergens (Rusznak, Devalia and Davies 1994). Toxic agents alter the permeability of the nasal mucosa and allow greater penetration of allergens and enhanced symptoms, making it difficult to discriminate between rhinitis due to allergies and that due to exposure to toxic or particulate substances. If inflammation and/or obstruction in the nose or sinuses is demonstrated, return of normal olfactory function is possible with treatment. Options include topical corticosteroid sprays, systemic antihistamines and decongestants, antibiotics and polypectomy/sinus surgery. If inflammation or obstruction is not present or treatment does not secure improvement in olfactory function, olfactory tissue may have sustained permanent damage. Irrespective of cause, the individual must be protected from future contact with the offending substance or further injury to the olfactory system could occur.
Head trauma
Head trauma can alter olfaction through (1) nasal injury with scarring of the olfactory neuroepithelium, (2) nasal injury with mechanical obstruction to odours, (3) shearing of the olfactory filaments, and (4) bruising or destruction of the part of the brain responsible for smell sensations (Mott and Leopold 1991). Although trauma is a risk in many occupational settings (Corwin, Loury and Gilbert 1995), exposure to certain chemicals can increase this risk.
Smell loss occurs in 5% to 30% of head trauma patients and may ensue without any other nervous system abnormalities. Nasal obstruction to odorants may be surgically correctable, unless significant intranasal scarring has occurred. Otherwise, no treatment is available for smell disorders resulting from head trauma, although spontaneous improvement is possible. Rapid initial improvement may occur as swelling subsides in the area of injury. If olfactory filaments have been sheared, regrowth and gradual improvement of smell may also occur. Although this occurs in animals within 60 days, improvements in humans have been reported as long as seven years after injury. Parosmias developing as the patient recovers from injury may indicate regrowth of olfactory tissue and herald return of some normal smell function. Parosmias occurring at the time of injury or shortly thereafter are more likely due to brain tissue damage. Damage to the brain will not repair itself and improvement in smell ability would not be expected. Injury to the frontal lobe, the portion of the brain integral to emotion and thinking, may be more frequent in head trauma patients with smell loss. The resultant changes in socialization or thinking patterns may be subtle, though harmful to family and career. Formal neuropsychiatric testing and treatment may, therefore, be indicated in some patients.
Environmental agents
Environmental agents can gain access to the olfactory system through either the bloodstream or inspired air, and have been reported to cause smell loss, parosmia and hyperosmia. Responsible agents include metallic compounds, metal dusts, nonmetallic inorganic compounds, organic compounds, wood dusts and substances present in various occupational environments, such as metallurgical and manufacturing processes (Amoore 1986; Schiffman and Nagle 1992) (table 1). Injury can occur after both acute and chronic exposures and can be either reversible or irreversible, depending on the interaction between host susceptibility and the damaging agent. Important substance attributes include bioactivity, concentration, irritant capacity, length of exposure, rate of clearance and potential synergism with other chemicals. Host susceptibility varies with genetic background and age. There are gender differences in olfaction, hormonal modulation of odorant metabolism and differences in specific anosmias. Tobacco use, allergies, asthma, nutritional status, pre-existing disease (e.g., Sjogren’s syndrome), physical exertion at time of exposure, nasal airflow patterns and possibly psychosocial factors influence individual differences (Brooks 1994). Resistance of the peripheral tissue to injury and the presence of functioning olfactory nerves can alter susceptibility. For example, acute, severe exposure could destroy the olfactory neuroepithelium, effectively preventing spread of the toxin centrally. Conversely, long-term, low-level exposure might allow preservation of functioning peripheral tissue and slow but steady transit of damaging substances into the brain. Cadmium, for example, has a half-life of 15 to 30 years in humans, and its effects might not be apparent until years after exposure (Hastings 1990).
Table 1. Agents/processes associated with olfactory abnormalities
Agent | Smell disturbance | Reference
Acetaldehyde | H | 2
Benzaldehyde | H | 2
Cadmium compounds, dust, oxides | H/A | 1; Bar-Sela et al. 1992; Rose, Heywood and Costanzo 1992
Dichromates | H | 2
Ethyl acetate, ethyl ether, ethylene oxide | H/A | 1
Flax | H | 2
Grain | H or A | 4
Halogen compounds | H | 2
Iodoform | H | 2
Lead | H | 4
Magnet production | H | 2
Nickel dust, hydroxide, plating and refining | H/A | 1; 4; Bar-Sela et al. 1992
Oil of peppermint | H/A | 1
Paint (lead) | Low normal, H or A | 2
Rubber vulcanization | H | 2
Selenium compounds (volatile) | H | 2
Tanning | H | 2
Vanadium fumes | H | 2
Wastewater | Low normal | 2
Zinc (fumes, chromate) and production | Low normal | 2

H = hyposmia; A = anosmia; P = parosmia; ID = odour identification ability
1 = Mott and Leopold 1991; 2 = Amoore 1986; 3 = Schiffman and Nagle 1992; 4 = Naus 1985; 5 = Callender et al. 1993.
Specific smell disturbances are as stated in the articles referenced.
Nasal passages are ventilated by 10,000 to 20,000 litres of air per day, containing varying amounts of potentially harmful agents. The upper airways almost totally absorb or clear highly reactive or soluble gases and particles larger than 2 mm (Evans and Hastings 1992). Fortunately, a number of mechanisms exist to protect against tissue damage. Nasal tissues are enriched with blood vessels, nerves, specialized cells with cilia capable of synchronous movement, and mucus-producing glands. Defensive functions include filtration and clearing of particles, scrubbing of water-soluble gases, and early identification of harmful agents through olfaction and mucosal detection of irritants that can initiate an alarm and prompt removal of the individual from further exposure (Witek 1993). Low levels of chemicals are absorbed by the mucus layer, swept away by functioning cilia (mucociliary clearance) and swallowed. Chemicals can bind to proteins or be rapidly metabolized to less damaging products. Many metabolizing enzymes reside in the nasal mucosa and olfactory tissues (Bonnefoi, Monticello and Morgan 1991; Schiffman and Nagle 1992; Evans et al. 1995). The olfactory neuroepithelium, for example, contains cytochrome P-450 enzymes, which play a major role in the detoxification of foreign substances (Gresham, Molgaard and Smith 1993). This system may protect the primary olfactory cells and also detoxify substances that would otherwise enter the central nervous system through olfactory nerves. There is also some evidence that intact olfactory neuroepithelium can prevent invasion by some organisms (e.g., cryptococcus; see Lima and Vital 1994). At the level of the olfactory bulb, there may also be protective mechanisms preventing transport of toxic substances centrally. For example, it has recently been shown that the olfactory bulb contains metallothioneins, proteins which have a protective effect against toxins (Choudhuri et al. 1995).
Exceeding protective capacities can precipitate a worsening cycle of injury. For example, loss of olfactory ability halts early warning of the hazard and allows continued exposure. Increase in nasal blood flow and blood vessel permeability causes swelling and odorant obstruction. Cilial function, necessary for both mucociliary clearance and normal smell, may be impaired. Change in clearance will increase contact time between injurious agents and nasal mucosa. Intranasal mucus abnormalities alter absorption of odorants or irritant molecules. Overpowering the ability to metabolize toxins allows tissue damage, increased absorption of toxins, and possibly enhanced systemic toxicity. Damaged epithelial tissue is more vulnerable to subsequent exposures. There are also more direct effects on olfactory receptors. Toxins can alter the turnover rate of olfactory receptor cells (normally 30 to 60 days), injure receptor cell membrane lipids, or change the internal or external environment of the receptor cells. Although regeneration can occur, damaged olfactory tissue can exhibit permanent changes of atrophy or replacement of olfactory tissue with nonsensory tissue.
The olfactory nerves provide a direct connection to the central nervous system and may serve as a route of entry for a variety of exogenous substances, including viruses, solvents and some metals (Evans and Hastings 1992). This mechanism may contribute to some of the olfactory-related dementias (Monteagudo, Cassidy and Folb 1989; Bonnefoi, Monticello and Morgan 1991) through, for example, transmittal of aluminium centrally. Intranasally, but not intraperitoneally or intratracheally, applied cadmium can be detected in the ipsilateral olfactory bulb (Evans and Hastings 1992). There is further evidence that substances may be preferentially taken up by olfactory tissue irrespective of the site of initial exposure (e.g., systemic versus inhalation). Mercury, for example, has been found in high concentrations in the olfactory brain region in subjects with dental amalgams (Siblerud 1990). On electroencephalography, the olfactory bulb demonstrates sensitivity to many atmospheric pollutants, such as acetone, benzene, ammonia, formaldehyde and ozone (Bokina et al. 1976). Because of the central nervous system effects of some hydrocarbon solvents, exposed individuals might not readily recognize and distance themselves from the danger, thereby prolonging exposure. Recently, Callender and colleagues (1993) found a 94% frequency of abnormal SPECT scans, which assess regional cerebral blood flow, in subjects with neurotoxin exposures and a high frequency of olfactory identification disorders. The location of abnormalities on SPECT scanning was consistent with distribution of the toxin through olfactory pathways.
The site of injury within the olfactory system differs with various agents (Cometto-Muñiz and Cain 1991). For example, ethyl acrylate and nitroethane selectively damage olfactory tissue while the respiratory tissue within the nose is preserved (Miller et al. 1985). Formaldehyde alters the consistency, and sulphuric acid the pH, of nasal mucus. Many gases, cadmium salts, dimethylamine and cigarette smoke alter ciliary function. Diethyl ether causes leakage of some molecules from the junctions between cells (Schiffman and Nagle 1992). Solvents such as toluene, styrene and xylene change the olfactory cilia; they also appear to be transmitted into the brain by the olfactory receptor (Hotz et al. 1992). Hydrogen sulphide is not only irritating to mucosa but highly neurotoxic, effectively depriving cells of oxygen and inducing rapid olfactory nerve paralysis (Guidotti 1994). Nickel directly damages cell membranes and also interferes with protective enzymes (Evans et al. 1995). Dissolved copper is thought to interfere directly with different stages of transduction at the olfactory receptor level (Winberg et al. 1992). Mercuric chloride selectively distributes to olfactory tissue and may interfere with neuronal function through alteration of neurotransmitter levels (Lakshmana, Desiraju and Raju 1993). After injection into the bloodstream, pesticides are taken up by nasal mucosa (Brittebo, Hogman and Brandt 1987) and can cause nasal congestion. The garlic odour noted with organophosphorus pesticides, however, is not due to damaged tissue but to detection of butylmercaptan.
Although smoking can inflame the lining of the nose and reduce smell ability, it may also confer protection from other damaging agents. Chemicals within the smoke may induce microsomal cytochrome P450 enzyme systems (Gresham, Molgaard and Smith 1993), which would accelerate metabolism of toxic chemicals before they can injure the olfactory neuroepithelium. Conversely, some drugs, for example tricyclic antidepressants and antimalarial drugs, can inhibit cytochrome P450.
Olfactory loss after exposure to wood and fibre board dusts (Innocenti et al. 1985; Holmström, Rosén and Wilhelmsson 1991; Mott and Leopold 1991) may be due to diverse mechanisms. Allergic and nonallergic rhinitis can result in obstruction to odorants or inflammation. Mucosal changes can be severe: dysplasia has been documented (Boysen and Solberg 1982), and adenocarcinoma may result, especially in the area of the ethmoid sinuses near the olfactory neuroepithelium. Carcinoma associated with hard woods may be related to a high tannin content (Innocenti et al. 1985). Inability to effectively clear nasal mucus has been reported and may be related to an increased frequency of colds (Andersen, Andersen and Solgaard 1977); resultant viral infection could further damage the olfactory system. Olfactory loss may also be due to chemicals associated with woodworking, including varnishes and stains. Medium-density fibre board contains formaldehyde, a known respiratory tissue irritant that impairs mucociliary clearance, causes olfactory loss and is associated with a high incidence of oral, nasal and pharyngeal cancer (Council on Scientific Affairs 1989), all of which may help explain formaldehyde-induced olfactory losses.
Radiation therapy has been reported to cause olfactory abnormalities (Mott and Leopold 1991), but little information is available about occupational exposures. Rapidly regenerating tissue, such as olfactory receptor cells, would be expected to be vulnerable. Mice exposed to radiation in a spaceflight demonstrated smell tissue abnormalities, while the rest of the nasal lining remained normal (Schiffman and Nagle 1992).
After chemical exposures, some individuals describe a heightened sensitivity to odorants. “Multiple chemical sensitivities” or “environmental illness” are labels used to describe disorders typified by “hypersensitivity” to diverse environmental chemicals, often in low concentrations (Cullen 1987; Miller 1992; Bell 1994). Thus far, however, lower thresholds to odorants have not been demonstrated.
Non-occupational causes of olfactory problems
Ageing and smoking decrease olfactory ability. Upper respiratory viral damage, idiopathic (“unknown”) causes, head trauma, and diseases of the nose and sinuses appear to be the four leading causes of smell problems in the United States (Mott and Leopold 1991) and must be considered as part of the differential diagnosis in any individual presenting with possible environmental exposures. Congenital inabilities to detect certain substances are common. For example, 40 to 50% of the population cannot detect androsterone, a steroid found in sweat.
Testing of chemosensation
Psychophysics is the measurement of a response to an applied sensory stimulus. “Threshold” tests, tests that determine the minimum concentration that can be reliably perceived, are frequently used. Separate thresholds can be obtained for detection of odorants and identification of odorants. Suprathreshold tests assess ability of the system to function at levels above threshold and also provide useful information. Discrimination tasks, telling the difference between substances, can elicit subtle changes in sensory ability. Identification tasks may yield different results than threshold tasks in the same individual. For example, a person with central nervous system injury may be able to detect odorants at usual threshold levels, but may not be able to identify common odorants.
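To make the idea of a threshold test concrete, the following is a minimal sketch (in Python) of a two-down/one-up staircase procedure of the kind commonly used in psychophysics. The simulated observer's psychometric function and all parameter values are hypothetical; this illustrates the general method only, not the clinical protocols cited in this article.

    import math
    import random

    def simulated_response(log_conc, threshold=-3.0, slope=4.0):
        """Hypothetical observer: detection probability rises with log concentration."""
        p = 1.0 / (1.0 + math.exp(-slope * (log_conc - threshold)))
        return random.random() < p

    def staircase_threshold(start=-1.0, step=0.25, reversals_needed=8):
        log_conc, correct_run, direction = start, 0, 0
        reversals = []
        while len(reversals) < reversals_needed:
            if simulated_response(log_conc):
                correct_run += 1
                if correct_run == 2:              # two detections in a row: dilute
                    correct_run = 0
                    if direction == +1:
                        reversals.append(log_conc)
                    direction = -1
                    log_conc -= step
            else:                                 # miss: increase concentration
                correct_run = 0
                if direction == -1:
                    reversals.append(log_conc)
                direction = +1
                log_conc += step
        return sum(reversals[-6:]) / len(reversals[-6:])   # mean of the last reversals

    random.seed(1)
    print("Estimated log10 detection threshold:", round(staircase_threshold(), 2))

The staircase converges on the concentration detected reliably (about 71% of the time for a two-down/one-up rule), which is then reported as the detection threshold.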
Summary
The nasal passages are ventilated by 10,000 to 20,000 litres of air per day, which may be contaminated by possibly hazardous materials in varying degrees. The olfactory system is especially vulnerable to damage because of requisite direct contact with volatile chemicals for odorant perception. Olfactory loss, tolerance and adaptation prevent recognition of the proximity of dangerous chemicals and may contribute to local injury or systemic toxicity. Early identification of olfactory disorders can prompt protective strategies, ensure appropriate treatment and prevent further damage. Occupational smell disorders can manifest themselves as temporary or permanent anosmia or hyposmia, as well as distorted smell perception. Identifiable causes to be considered in the occupational setting include rhinitis, sinusitis, head trauma, radiation exposure and tissue injury from metallic compounds, metal dusts, nonmetallic inorganic compounds, organic compounds, wood dusts, and substances present in metallurgical and manufacturing processes. Substances differ in their site of interference with the olfactory system. Powerful mechanisms for trapping, removing and detoxifying foreign nasal substances serve to protect olfactory function and also prevent spread of damaging agents into the brain from the olfactory system. Exceeding protective capacities can precipitate a worsening cycle of injury, ultimately leading to greater severity of impairment and extension of sites of injury, and converting temporary reversible effects into permanent damage.
The documentation of occupational diseases in a country like Taiwan is a challenge for the occupational physician. In the absence of a system of material safety data sheets (MSDS), workers were usually not aware of the chemicals with which they worked. Since many occupational diseases have long latencies and show no specific symptoms or signs until they are clinically evident, the occupational origin is often very difficult to recognize and identify.
To better control occupational diseases, we have accessed databases which provide a relatively complete list of industrial chemicals and a set of specific signs and/or symptoms. Combined with the epidemiological approach of conjectures and refutations (i.e., considering and ruling out all possible alternative explanations), we have documented more than ten kinds of occupational diseases and an outbreak of botulism. We recommend that a similar approach be applied to any other country in a similar situation, and that a system involving an identification sheet (e.g., MSDS) for each chemical be advocated and implemented as one means to enable prompt recognition and hence the prevention of occupational diseases.
Hepatitis in a Colour Printing Factory
Three workers from a colour printing factory were admitted to community hospitals in 1985 with manifestations of acute hepatitis. One of the three had superimposed acute renal failure. Since viral hepatitis has a high prevalence in Taiwan, a viral origin had to be considered among the most likely aetiologies. Alcohol and drug use, as well as organic solvents in the workplace, also had to be included. Because there was no MSDS system in Taiwan, neither the employees nor the employer was aware of all the chemicals used in the factory (Wang 1991).
We had to compile a list of hepatotoxic and nephrotoxic agents from several toxicological databases. Then, we deduced all possible inferences from the above hypotheses. For example, if hepatitis A virus (HAV) were the aetiology, we should observe antibodies (HAV-IgM) among the affected workers; if hepatitis B virus were the aetiology, we should observe more hepatitis B surface antigens (HBsAg) carriers among the affected workers as compared with non-affected workers; if alcohol were the main aetiology, we should observe more alcohol abusers or chronic alcoholics among affected workers; if any toxic solvent (e.g., chloroform) were the aetiology, we should find it at the workplace.
We performed a comprehensive medical evaluation for each worker. Both the viral aetiology and the alcohol hypothesis were easily refuted, because neither was supported by the evidence.
Instead, 17 of 25 workers from the plant had abnormal liver function tests, and a significant association was found between the presence of abnormal liver function and a history of recently having worked inside any of three rooms in which an interconnecting air-conditioning system had been installed to cool the printing machines. The association remained after stratification by the carrier status of hepatitis B. It was later determined that the incident occurred following inadvertent use of a “cleaning agent” (which was carbon tetrachloride) to clean a pump in the printing machine. Moreover, a simulation test of the pump-cleaning operation revealed ambient air levels of carbon tetrachloride of 115 to 495 ppm, which could produce hepatic damage. In a further refutational attempt, by eliminating the carbon tetrachloride in the workplace, we found that no more new cases occurred, and all affected workers improved after removal from the workplace for 20 days. Therefore, we concluded that the outbreak was from the use of carbon tetrachloride.
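The stratified analysis described above can be illustrated with a short sketch. The 2x2 counts below are invented for illustration only; the marginal figure of 17 of 25 workers with abnormal liver function follows the text, but the cross-classification by room exposure and hepatitis B carrier status is hypothetical. The pooled Mantel-Haenszel odds ratio is computed directly from its defining formula.

    # Each stratum: (a, b, c, d) = (exposed cases, exposed non-cases,
    #                               unexposed cases, unexposed non-cases)
    strata = {
        "HBsAg positive": (5, 1, 1, 2),    # hypothetical counts
        "HBsAg negative": (10, 2, 1, 3),   # hypothetical counts
    }

    def odds_ratio(a, b, c, d):
        return (a * d) / (b * c)

    # Stratum-specific odds ratios
    for name, (a, b, c, d) in strata.items():
        print(f"{name}: OR = {odds_ratio(a, b, c, d):.1f}")

    # Mantel-Haenszel pooled odds ratio: sum(a*d/n) / sum(b*c/n)
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata.values())
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata.values())
    print(f"Mantel-Haenszel pooled OR = {num / den:.1f}")

Elevated odds ratios within both carrier strata, summarized by the pooled estimate, are the pattern that supports an association independent of hepatitis B status.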
Neurological Symptoms in a Colour Printing Factory
In September 1986, an apprentice in a colour printing factory in Chang-Hwa suddenly developed acute bilateral weakness and respiratory paralysis. The victim’s father alleged on the telephone that there were several other workers with similar symptoms. Since colour printing shops were once documented to have occupational diseases resulting from organic solvent exposures, we went to the worksite to determine the aetiology with an hypothesis of possible solvent intoxication in mind (Wang 1991).
Our common practice, however, was to consider all alternative conjectures, including other medical problems such as impaired function of the upper motor neurones, the lower motor neurones or the neuromuscular junction. Again, we deduced outcome statements from these hypotheses. For example, if any solvent reported to produce polyneuropathy (e.g., n-hexane, methyl butyl ketone, acrylamide) were the cause, it would also impair nerve conduction velocity (NCV); if the cause were another medical problem involving the upper motor neurones, there would be signs of impaired consciousness and/or involuntary movement.
Field observations disclosed that all affected workers had clear consciousness throughout the clinical course. An NCV study of three affected workers showed intact lower motor neurones. There was no involuntary movement, no history of medication or bites prior to the appearance of symptoms, and the neostigmine test was negative. A significant association was found between illness and eating breakfast in the factory cafeteria on September 26 or 27; seven of seven affected workers, versus seven of 32 unaffected workers, ate breakfast in the factory on these two days. Further testing detected type A botulinum toxin in canned peanuts manufactured by an unlicensed company, and culture of the specimen yielded a full growth of Clostridium botulinum. A final refutational trial was the removal of such products from the commercial market, after which no new cases occurred. This investigation documented the first cases of botulism from a commercial food product in Taiwan.
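The breakfast association reported above can be expressed as a 2x2 table and tested with Fisher's exact test, which is appropriate for such small counts. The counts come directly from the text (seven of seven affected versus seven of 32 unaffected workers ate breakfast in the factory); the code is a sketch using the scipy library.

    from scipy.stats import fisher_exact

    table = [
        [7, 0],    # affected workers:   ate breakfast, did not
        [7, 25],   # unaffected workers: ate breakfast, did not
    ]

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"Fisher's exact test: p = {p_value:.5f}")   # a very small p-value

The very small p-value indicates that such a complete concentration of cases among cafeteria users is unlikely to be due to chance, consistent with a common-source foodborne outbreak.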
Premalignant Skin Lesions among Paraquat Manufacturers
In June 1983, two workers from a paraquat manufacturing factory visited a dermatology clinic complaining of numerous bilateral hyperpigmented macules with hyperkeratotic changes on the sun-exposed parts of their hands, neck and face. Some skin specimens also showed Bowenoid changes. Since malignant and premalignant skin lesions had been reported among bipyridyl manufacturing workers, an occupational cause was strongly suspected. However, we also had to consider other alternative causes (or hypotheses) of skin cancer, such as exposure to ionizing radiation, coal tar, pitch, soot or other polycyclic aromatic hydrocarbons (PAHs). To rule out all of these conjectures, we conducted a study in 1985, visiting all 28 factories that had ever engaged in paraquat manufacturing or packaging and examining the manufacturing processes as well as the workers (Wang et al. 1987; Wang 1993).
We examined 228 workers; none of them had ever been exposed to the aforementioned skin carcinogens except sunlight and 4,4'-bipyridine and its isomers. After excluding workers with multiple exposures, we found that one of seven administrators and two of 82 paraquat packaging workers had developed hyperpigmented skin lesions, compared with three of three workers involved only in bipyridine crystallization and centrifugation. Moreover, all 17 workers with hyperkeratotic or Bowen's lesions had a history of direct exposure to bipyridyl and its isomers. The longer the exposure to bipyridyls, the more likely the development of skin lesions, and this trend could not be explained by sunlight or age, as demonstrated by stratification and logistic regression analysis. Hence, the skin lesions were tentatively attributed to a combination of bipyridyl exposure and sunlight. As a further refutational attempt, we followed up to see whether any new case occurred after all processes involving bipyridyl exposure had been enclosed; no new case was found.
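The adjustment mentioned above can be sketched with a logistic regression in which lesion status is modelled as a function of exposure duration while controlling for age. The data below are simulated purely to demonstrate the method; they are not the survey data from the 28 factories, and the variable names are illustrative.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    age = rng.uniform(20, 60, n)               # years (hypothetical)
    exposure_years = rng.uniform(0, 10, n)     # years of bipyridyl work (hypothetical)

    # Hypothetical data-generating process: risk rises with exposure, not with age
    logit = -4.0 + 0.5 * exposure_years + 0.0 * age
    lesion = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([exposure_years, age]))
    model = sm.Logit(lesion, X).fit(disp=0)
    print(model.summary(xname=["const", "exposure_years", "age"]))

A positive, statistically significant coefficient for exposure duration, together with a negligible coefficient for age, corresponds to the kind of trend reported in the study.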
Discussion and Conclusions
The above three examples have illustrated the importance of adopting a refutational approach and a database of occupational diseases. The former makes us always consider alternative hypotheses in the same manner as the initial intuitional hypothesis, while the latter provides a detailed list of chemical agents which can guide us toward the true aetiology. One possible limitation of this approach is that we can consider only those alternative explanations which we can imagine. If our list of alternatives is incomplete, we may be left with no answer or a wrong answer. Therefore, a comprehensive database of occupational disease is crucial to the success of this strategy.
Previously, we constructed our own database in a laborious manner. However, the recently published OSH-ROM databases, which contain the NIOSHTIC database of more than 160,000 abstracts, may be among the most comprehensive for such a purpose, as discussed elsewhere in the Encyclopaedia. Furthermore, if a new occupational disease occurs, we might search such a database and attempt to rule out all known aetiological agents. If every known agent can be refuted, we may try to identify or define the new agent (or occupational setting) as specifically as possible so that the problem can first be mitigated, and then test further hypotheses. The case of premalignant skin lesions among paraquat manufacturers is a good example of this kind.
The preceding articles of this chapter have shown the need for a careful evaluation of the study design in order to draw credible inferences from epidemiological observations. Although it has been claimed that inferences in observational epidemiology are weak because of the non-experimental nature of the discipline, there is no built-in superiority of randomized controlled trials or other types of experimental design over well-planned observation (Cornfield 1954). However, to draw sound inferences implies a thorough analysis of the study design in order to identify potential sources of bias and confounding. Both false positive and false negative results can originate from different types of bias.
In this article, some of the guidelines that have been proposed to assess the causal nature of epidemiological observations are discussed. In addition, although good science is a premise for ethically correct epidemiological research, there are additional issues that are relevant to ethical concerns. Therefore, we have devoted some discussion to the analysis of ethical problems that may arise in doing epidemiological studies.
Causality Assessment
Several authors have discussed causality assessment in epidemiology (Hill 1965; Buck 1975; Ahlbom 1984; Maclure 1985; Miettinen 1985; Rothman 1986; Weed 1986; Schlesselman 1987; Maclure 1988; Weed 1988; Karhausen 1995). One of the main points of discussion is whether epidemiology uses or should use the same criteria for the ascertainment of cause-effect relationships as used in other sciences.
Causes should not be confused with mechanisms. For example, asbestos is a cause of mesothelioma, whereas oncogene mutation is a putative mechanism. On the basis of the existing evidence, it is likely that (a) different external exposures can act at the same mechanistic stages and (b) usually there is not a fixed and necessary sequence of mechanistic steps in the development of disease. For example, carcinogenesis is interpreted as a sequence of stochastic (probabilistic) transitions, from gene mutation to cell proliferation to gene mutation again, that eventually leads to cancer. In addition, carcinogenesis is a multifactorial process—that is, different external exposures are able to affect it and none of them is necessary in a susceptible person. This model is likely to apply to several diseases in addition to cancer.
Such a multifactorial and probabilistic nature of most exposure-disease relationships implies that disentangling the role played by one specific exposure is problematic. In addition, the observational nature of epidemiology prevents us from conducting experiments that could clarify aetiologic relationships through a wilful alteration of the course of the events. The observation of a statistical association between exposure and disease does not mean that the association is causal. For example, most epidemiologists have interpreted the association between exposure to diesel exhaust and bladder cancer as a causal one, but others have claimed that workers exposed to diesel exhaust (mostly truck and taxi drivers) are more often cigarette smokers than are non-exposed individuals. The observed association, according to this claim, thus would be “confounded” by a well-known risk factor like smoking.
Given the probabilistic-multifactorial nature of most exposure-disease associations, epidemiologists have developed guidelines to recognize relationships that are likely to be causal. These are the guidelines originally proposed by Sir Bradford Hill for chronic diseases (1965): strength of the association, dose-response relationship, temporality, consistency, biological plausibility, coherence and specificity.
These criteria should be considered only as general guidelines or practical tools; in fact, scientific causal assessment is an iterative process centred around measurement of the exposure-disease relationship. However, Hill’s criteria often are used as a concise and practical description of causal inference procedures in epidemiology.
Let us consider the example of the relationship between exposure to vinyl chloride and liver angiosarcoma, applying Hill’s criteria.
The usual expression of the results of an epidemiological study is a measure of the degree of association between exposure and disease (Hill’s first criterion). A relative risk (RR) that is greater than unity means that there is a statistical association between exposure and disease. For instance, if the incidence rate of liver angiosarcoma is usually 1 in 10 million, but it is 1 in 100,000 among those exposed to vinyl chloride, then the RR is 100 (that is, people who work with vinyl chloride have a 100 times increased risk of developing angiosarcoma compared to people who do not work with vinyl chloride).
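The arithmetic of this example is simply the ratio of the two incidence rates, as the short calculation below shows.

    # Relative risk for the vinyl chloride example in the text
    incidence_unexposed = 1 / 10_000_000   # usual incidence of liver angiosarcoma
    incidence_exposed = 1 / 100_000        # incidence among exposed workers

    relative_risk = incidence_exposed / incidence_unexposed
    print(relative_risk)   # 100.0, i.e., a 100-fold increase in risk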
It is more likely that an association is causal when the risk increases with increasing levels of exposure (dose-response effect, Hill’s second criterion) and when the temporal relationship between exposure and disease makes sense on biological grounds (the exposure precedes the effect and the length of this “induction” period is compatible with a biological model of disease; Hill’s third criterion). In addition, an association is more likely to be causal when similar results are obtained by others who have been able to replicate the findings in different circumstances (“consistency”, Hill’s fourth criterion).
A scientific analysis of the results requires an evaluation of biological plausibility (Hill’s fifth criterion). This can be achieved in different ways. For example, a simple criterion is to evaluate whether the alleged “cause” is able to reach the target organ (e.g., inhaled substances that do not reach the lung cannot circulate in the body). Also, supporting evidence from animal studies is helpful: the observation of liver angiosarcomas in animals treated with vinyl chloride strongly reinforces the association observed in man.
Internal coherence of the observations (for example, the RR is similarly increased in both genders) is an important scientific criterion (Hill’s sixth criterion). Causality is more likely when the relationship is very specific—that is, involves rare causes and/or rare diseases, or a specific histologic type/subgroup of patients (Hill’s seventh criterion).
“Enumerative induction” (the simple enumeration of instances of association between exposure and disease) is insufficient to describe completely the inductive steps in causal reasoning. Usually, enumerative induction produces a complex and still confused picture, because different causal chains or, more frequently, a genuine causal relationship and other irrelevant exposures are entangled. Alternative explanations have to be eliminated through “eliminative induction”, by showing that an association is likely to be causal because it is not “confounded” with others. A simple definition of such a confounding alternative explanation is “an extraneous factor whose effect is mixed with the effect of the exposure of interest, thus distorting the risk estimate for the exposure of interest” (Rothman 1986).
The role of induction is expanding knowledge, whereas deduction’s role is “transmitting truth” (Giere 1979). Deductive reasoning scrutinizes the study design and identifies associations which are not empirically true, but just logically true. Such associations are not a matter of fact, but logical necessities. For example, a selection bias occurs when the exposed group is selected among ill people (as when we start a cohort study recruiting as “exposed” to vinyl chloride a cluster of liver angiosarcoma cases) or when the unexposed group is selected among healthy people. In both instances the association which is found between exposure and disease is necessarily (logically) but not empirically true (Vineis 1991).
To conclude, even when one considers its observational (non-experimental) nature, epidemiology does not use inferential procedures that differ substantially from the tradition of other scientific disciplines (Hume 1978; Schaffner 1993).
Ethical Issues in Epidemiological Research
Because of the subtleties involved in inferring causation, special care has to be exercised by epidemiologists in interpreting their studies. Indeed, several concerns of an ethical nature flow from this.
Ethical issues in epidemiological research have become a subject of intense discussion (Schulte 1989; Soskolne 1993; Beauchamp et al. 1991). The reason is evident: epidemiologists, in particular occupational and environmental epidemiologists, often study issues having significant economic, social and health policy implications. Both negative and positive results concerning the association between specific chemical exposures and disease can affect the lives of thousands of people, influence economic decisions and therefore seriously condition political choices. Thus, the epidemiologist may be under pressure, and be tempted or even encouraged by others to alter—marginally or substantially—the interpretation of the results of his or her investigations.
Among the several relevant issues, transparency of data collection, coding, computerization and analysis is central as a defence against allegations of bias on the part of the researcher. Also crucial, and potentially in conflict with such transparency, is the right of the subjects enrolled in epidemiological research to be protected from the release of personal information (confidentiality issues).
From the point of view of misconduct that can arise especially in the context of causal inference, questions that should be addressed by ethics guidelines are:
Other crucial issues, in the case of occupational and environmental epidemiology, relate to the involvement of the workers in preliminary phases of studies, and to the release of the results of a study to the subjects who have been enrolled and are directly affected (Schulte 1989). Unfortunately, it is not common practice that workers enrolled in epidemiological studies are involved in collaborative discussions about the purposes of the study, its interpretation and the potential uses of the findings (which may be both advantageous and detrimental to the worker).
Partial answers to these questions have been provided by recent guidelines (Beauchamp et al. 1991; CIOMS 1991). However, in each country, professional associations of occupational epidemiologists should engage in a thorough discussion about ethical issues and, possibly, adopt a set of ethics guidelines appropriate to the local context while recognizing internationally accepted normative standards of practice.
The three chemosensory systems, smell, taste, and the common chemical sense, require direct stimulation by chemicals for sensory perception. Their role is to monitor constantly both harmful and beneficial inhaled and ingested chemical substances. Irritating or tingling properties are detected by the common chemical sense. The taste system perceives only sweet, salty, sour, bitter and possibly metallic and monosodium glutamate (umami) tastes. The totality of the oral sensory experience is termed “flavour,” the interaction of smell, taste, irritation, texture and temperature. Because most flavour is derived from the smell, or aroma, of food and beverages, damage to the smell system is often reported as a problem with “taste”. Verifiable taste deficits are more likely present if specific losses to sweet, sour, salty and bitter sensations are described.
Chemosensory complaints are frequent in occupational settings and may result from a normal sensory system perceiving environmental chemicals. Conversely, they may also indicate an injured system: requisite contact with chemical substances renders these sensory systems uniquely vulnerable to damage (see table 1). In the occupational setting, these systems can also be damaged by trauma to the head as well as by agents other than chemicals (e.g., radiation). Taste disorders are either temporary or permanent: complete or partial taste loss (ageusia or hypogeusia), heightened taste (hypergeusia) and distorted or phantom tastes (dysgeusia) (Deems, Doty and Settle 1991; Mott, Grushka and Sessle 1993).
Table 1. Agents/processes reported to alter the taste system
Agent/process | Taste disturbance | Reference
Amalgam | Metallic taste | Siblerud 1990; see text
Dental restorations/appliances | Metallic taste | See text
Diving (dry saturation) | Sweet, bitter; salt, sour | See text
Diving and welding | Metallic taste | See text
Drugs/medications | Varies | See text
Hydrazine | Sweet dysgeusia | Schweisfurth and Schottes 1993
Hydrocarbons | Hypogeusia, “glue” dysgeusia | Hotz et al. 1992
Lead poisoning | Sweet/metallic dysgeusia | Kachru et al. 1989
Metals and metal fumes | Sweet/metallic | See text; Shusterman and Sheedy 1992
Nickel | Metallic taste | Pfeiffer and Schwickerath 1991
Pesticides | Bitter/metallic dysgeusia | Schiffman and Nagle 1992
Radiation | Increased DT and RT | Mott and Leopold 1991
Selenium | Metallic taste | Bedwal et al. 1993
Solvents | “Funny taste”, H | Schiffman and Nagle 1992
Sulphuric acid mists | “Bad taste” | Petersen and Gormsen 1991
Underwater welding | Metallic taste | See text
Vanadium | Metallic taste | Nemery 1990
DT = detection threshold; RT = recognition threshold.
Specific taste disturbances are as stated in the articles referenced.
The taste system is sustained by regenerative capability and redundant innervation. Because of this, clinically notable taste disorders are less common than olfactory disorders. Taste distortions are more common than significant taste loss and, when present, are more likely to have secondary adverse effects such as anxiety and depression. Taste loss or distortion can interfere with occupational performance where keen taste acuity is required, such as culinary arts and blending of wines and spirits.
Anatomy and Physiology
Taste receptor cells, found throughout the oral cavity, the pharynx, the larynx and the oesophagus, are modified epithelial cells located within the taste buds. While on the tongue taste buds are grouped in superficial structures called papillae, extralingual taste buds are distributed within the epithelium. The superficial placement of taste cells makes them susceptible to injury. Damaging agents usually come in contact with the mouth through ingestion, although mouth breathing associated with nasal obstruction or other conditions (e.g., exercise, asthma) allows oral contact with airborne agents. The taste receptor cell’s average ten-day life span permits rapid recovery if superficial damage to receptor cells has occurred. Also, taste is innervated by four pairs of peripheral nerves: the front of the tongue by the chorda tympani branch of the seventh cranial nerve (CN VII); the posterior of the tongue and the pharynx by the glossopharyngeal nerve (CN IX); the soft palate by the greater superficial petrosal branch of CN VII; and the larynx/oesophagus by the vagus (CN X). Last, taste central pathways, although not completely mapped in humans (Ogawa 1994), appear more divergent than olfactory central pathways.
The first step in taste perception involves interaction between chemicals and taste receptor cells. The four taste qualities, sweet, sour, salty and bitter, enlist different mechanisms at the level of the receptor (Kinnamon and Getchell 1991), ultimately generating action potentials in taste neurons (transduction).
Tastants diffuse through salivary secretions and also mucus secreted around taste cells to interact with the surface of taste cells. Saliva ensures that tastants are carried to the buds, and provides an optimal ionic environment for perception (Spielman 1990). Alterations in taste can be demonstrated with changes in the inorganic constituents of saliva. Most taste stimuli are water soluble and diffuse easily; others require soluble carrier proteins for transport to the receptor. Salivary output and composition, therefore, play an essential role in taste function.
Salt taste is stimulated by cations such as Na+, K+ or NH4+. Most salty stimuli are transduced when ions travel through a specific type of sodium channel (Gilbertson 1993), although other mechanisms may also be involved. Changes in the composition of taste pore mucus or the taste cell’s environment could alter salt taste. Also, structural changes in nearby receptor proteins could modify receptor membrane function. Sour taste corresponds to acidity. Blockade of specific sodium channels by hydrogen ions elicits sour taste. As with salt taste, however, other mechanisms are thought to exist. Many chemical compounds are perceived as bitter, including cations, amino acids, peptides and larger compounds. Detection of bitter stimuli appears to involve more diverse mechanisms that include transport proteins, cation channels, G proteins and second messenger mediated pathways (Margolskee 1993). Salivary proteins may be essential in transporting lipophilic bitter stimuli to the receptor membranes. Sweet stimuli bind to specific receptors linked to G protein-activated second-messenger systems. There is also some evidence in mammals that sweet stimuli can gate ion channels directly (Gilbertson 1993).
Taste Disorders
General Concepts
The anatomic diversity and redundancy of the taste system are sufficiently protective to prevent total, permanent taste loss. Loss of a few peripheral taste fields, for example, would not be expected to affect whole mouth taste ability (Mott, Grushka and Sessle 1993). The taste system may be far more vulnerable to taste distortion or phantom tastes. For example, dysgeusias appear to be more common in occupational exposures than taste losses per se. Although taste is thought to be more robust than smell with respect to the ageing process, losses in taste perception with ageing have been documented.
Temporary taste losses can occur when the oral mucosa has been irritated. Theoretically, this can result in inflammation of the taste cells, closure of taste pores or altered function at the surface of taste cells. Inflammation can alter blood flow to the tongue, thereby affecting taste. Salivary flow can also be compromised. Irritants can cause swelling and obstruct salivary ducts. Toxicants absorbed and excreted through the salivary glands could damage ductal tissue during excretion. Either of these processes could cause long-term oral dryness with resultant taste effects. Exposure to toxicants could alter the turnover rate of taste cells, modify the taste channels at the surface of the taste cell, or change the internal or external chemical environments of the cells. Many substances are known to be neurotoxic and could injure peripheral taste nerves directly, or damage higher taste pathways in the brain.
Pesticides
Pesticide use is widespread and contamination occurs as residues in meat, vegetables, milk, rain and drinking water. Although workers exposed during the manufacture or use of pesticides are at greatest risk, the general population is also exposed. Important pesticides include organochlorine, organophosphate and carbamate compounds. Organochlorine compounds are highly stable and therefore persist in the environment for lengthy periods. Direct toxic effects on central neurons have been demonstrated. Organophosphate pesticides are more widely used because they are not as persistent, but they are more toxic; inhibition of acetylcholinesterase can cause neurological and behavioural abnormalities. The toxicity of carbamate pesticides is similar to that of the organophosphorus compounds, and they are often used when the latter fail. Pesticide exposures have been associated with persistent bitter or metallic tastes (Schiffman and Nagle 1992), unspecified dysgeusia (Ciesielski et al. 1994) and, less commonly, with taste loss. Pesticides can reach taste receptors via air, water and food and can be absorbed through the skin, gastrointestinal tract, conjunctiva and respiratory tract. Because many pesticides are lipid soluble, they can easily penetrate lipid membranes within the body. Interference with taste can occur peripherally irrespective of the route of initial exposure; in mice, binding to the tongue has been seen with certain insecticides after injection of pesticide material into the bloodstream. Alterations in taste bud morphology after pesticide exposure have been demonstrated. Degenerative changes in the sensory nerve terminations have also been noted and may account for reports of abnormalities of neural transmission. Metallic dysgeusia may be a sensory paresthesia caused by the impact of pesticides on taste buds and their afferent nerve endings. There is some evidence, however, that pesticides can interfere with neurotransmitters and therefore disrupt transmission of taste information more centrally (El-Etri et al. 1992). Workers exposed to organophosphate pesticides can demonstrate neurological abnormalities on electroencephalography and neuropsychological testing independent of cholinesterase depression in the blood stream. It is thought that these pesticides have a neurotoxic effect on the brain independent of their effect upon cholinesterase. Although increased salivary flow has been reported to be associated with pesticide exposure, it is unclear what effect this might have on taste.
Metals and metal fume fever
Alterations of taste have occurred after exposure to certain metals and metallic compounds, including mercury, copper, selenium, tellurium, cyanide, vanadium, cadmium, chromium and antimony. Metallic taste has also been noted in workers exposed to the fumes of zinc or copper oxide, after ingestion of copper salts in poisoning cases, and after exposure to emissions resulting from the use of torches for cutting brass piping.
Exposure to freshly formed fumes of metal oxides can result in a syndrome known as metal fume fever (Gordon and Fine 1993). Although zinc oxide is most frequently cited, this disorder has also been reported after exposure to oxides of other metals, including copper, aluminium, cadmium, lead, iron, magnesium, manganese, nickel, selenium, silver, antimony and tin. The syndrome was first noted in brass foundry workers, but is now most common in the welding of galvanized steel or during galvanization of steel. Within hours after exposure, throat irritation and a sweet or metallic dysgeusia may herald more generalized symptoms of fever, shaking chills and myalgia. Other symptoms, such as cough or headache, may also occur. The syndrome is notable for both rapid resolution (within 48 hours) and the development of tolerance upon repeated exposures to the metal oxide. A number of possible mechanisms have been suggested, including immune system reactions and a direct toxic effect on respiratory tissue, but it is now thought that lung exposure to metal fumes results in the release of specific mediators, called cytokines, into the bloodstream, and that these cause the physical symptoms and findings (Blanc et al. 1993). A more severe, potentially fatal, variant of metal fume fever occurs after exposure to zinc chloride aerosol in military screening smoke bombs (Blount 1990). Polymer fume fever is similar to metal fume fever in presentation, with the exception of the absence of metallic taste complaints (Shusterman 1992).
In lead poisoning cases, sweet metallic tastes are often described. In one report, silver jewellery workers with confirmed lead toxicity exhibited taste alterations (Kachru et al. 1989). The workers were exposed to lead fumes by heating jewellers’ silver waste in workshops which had poor exhaust systems. The vapours condensed on the skin and hair of the workers and also contaminated their clothing, food and drinking water.
Underwater welding
Divers describe oral discomfort, loosening of dental fillings and metallic taste during electrical welding and cutting underwater. In a study by Örtendahl, Dahlen and Röckert (1985), 55% of 118 divers working under water with electrical equipment described metallic taste. Divers without this occupational history did not describe metallic taste. Forty divers were recruited into two groups for further evaluation; the group with underwater welding and cutting experience had significantly more evidence of dental amalgam breakdown. Initially, it was theorized that intraoral electrical currents erode dental amalgam, releasing metal ions which have direct effects on taste cells. Subsequent data, however, demonstrated intraoral electrical activity of insufficient magnitude to erode dental amalgam, but of sufficient magnitude to directly stimulate taste cells and cause metallic taste (Örtendahl 1987; Frank and Smith 1991). Divers may be vulnerable to taste changes without welding exposure; differential effects on taste quality perception have been documented, with decreased sensitivity to sweet and bitter and increased sensitivity to salty and sour tastants (O’Reilly et al. 1977).
Dental restorations and oral galvanism
In a large prospective, longitudinal study of dental restorations and appliances, approximately 5% of subjects reported a metallic taste at any given time (Participants of SCP Nos. 147/242 & Morris 1990). Frequency of metallic taste was higher with a history of teeth grinding; with fixed partial dentures than with crowns; and with an increased number of fixed partial dentures. Interactions between dental amalgams and the oral environment are complex (Marek 1992) and could affect taste through a variety of mechanisms. Metals that bind to proteins can acquire antigenicity (Nemery 1990) and might cause allergic reactions with subsequent taste alterations. Soluble metal ions and debris are released and may interact with soft tissues in the oral cavity. Metallic taste has been reported to correlate with nickel solubility in saliva from dental appliances (Pfeiffer and Schwickerath 1991). Metallic taste was reported by 16% of subjects with dental fillings and by none of the subjects without fillings (Siblerud 1990). In a related study of subjects who had amalgam removed, metallic taste improved or abated in 94% (Siblerud 1990).
Oral galvanism, a controversial diagnosis (Council on Dental Materials report 1987), describes the generation of oral currents from either corrosion of dental amalgam restorations or electrochemical differences between dissimilar intraoral metals. Patients considered to have oral galvanism appear to have a high frequency of dysgeusia (63%) described as metallic, battery, unpleasant or salty tastes (Johansson, Stenman and Bergman 1984). Theoretically, taste cells could be directly stimulated by intraoral electric currents and generate dysgeusia. Subjects with symptoms of oral burning, battery taste, metallic taste and/or oral galvanism were determined to have lower electrogustometric thresholds (i.e. more sensitive taste) on taste testing than control subjects (Axéll, Nilner and Nilsson 1983). Whether galvanic currents related to dental materials are causative is debatable, however. A brief tin-foil taste shortly after restorative work is thought to be possible, but more permanent effects are probably unlikely (Council on Dental Materials 1987). Yontchev, Carlsson and Hedegård (1987) found similar frequencies of metallic taste or oral burning in subjects with these symptoms whether or not there was contact between dental restorations. Alternative explanations for taste complaints in patients with restorations or appliances are sensitivity to mercury, cobalt, chrome, nickel or other metals (Council on Dental Materials 1987), other intraoral processes (e.g., periodontal disease), xerostomia, mucosal abnormalities, medical illnesses, and medication side effects.
Drugs and medications
Many drugs and medications have been linked to taste alterations (Frank, Hettinger and Mott 1992; Mott, Grushka and Sessle 1993; Della Fera, Mott and Frank 1995; Smith and Burtner 1994) and are mentioned here because of possible occupational exposures during the manufacture of these drugs. Antibiotics, anticonvulsants, antilipidemics, antineoplastics, psychiatric, antiparkinsonism, antithyroid, arthritis, cardiovascular, and dental hygiene drugs are broad classes reported to affect taste.
The presumed site of action of drugs on the taste system varies. Often the drug is tasted directly during oral administration, or the drug or its metabolites are tasted after being excreted in saliva. Many drugs, for example anticholinergics or some antidepressants, cause oral dryness and affect taste through inadequate presentation of the tastant to the taste cells via saliva. Some drugs may affect taste cells directly. Because taste cells have a high turnover rate, they are especially vulnerable to drugs that interrupt protein synthesis, such as antineoplastic drugs. It has also been thought that there may be an effect on impulse transmission through the taste nerves or in the ganglion cells, or a change in the processing of the stimuli in higher taste centres. Metallic dysgeusia has been reported with lithium, possibly through transformations in receptor ion channels. Antithyroid drugs and angiotensin converting enzyme inhibitors (e.g., captopril and enalapril) are well-known causes of taste alterations, possibly because of the presence of a sulphydryl (-SH) group (Mott, Grushka and Sessle 1993). Other drugs with -SH groups (e.g., methimazole, penicillamine) also cause taste abnormalities. Drugs that affect neurotransmitters could potentially alter taste perception.
Mechanisms of taste alterations vary, however, even within a class of drug. For example, taste alterations after treatment with tetracycline may be caused by oral mycosis. Alternatively, an increased blood urea nitrogen, associated with the catabolic effect of tetracycline, can sometimes result in a metallic or ammonia-like taste.
Side effects of metronidazole include alteration of taste, nausea and a distinctive distortion of the taste of carbonated and alcoholic beverages. Peripheral neuropathy and paraesthesias can also sometimes occur. It is thought that the drug and its metabolites may have a direct effect upon taste receptor function and also on the sensory cell.
Radiation exposure
Radiation treatment can cause taste dysfunction through (1) taste cell changes, (2) damage to taste nerves, (3) salivary gland dysfunction, and (4) opportunistic oral infection (Della Fera et al. 1995). There have been no studies of occupational radiation effects on the taste system.
Head trauma
Head trauma occurs in the occupational setting and can cause alterations in the taste system. Although perhaps only 0.5% of head trauma patients describe taste loss, the frequency of dysgeusia may be much higher (Mott, Grushka and Sessle 1993). Taste loss, when it occurs, is likely quality-specific or localized and may not even be subjectively apparent. The prognosis of subjectively noted taste loss appears better than that for olfactory loss.
Non-occupational causes
Other causes of taste abnormalities must be considered in the differential diagnosis, including congenital/genetic, endocrine/metabolic, or gastrointestinal disorders; hepatic disease; iatrogenic effects; infection; local oral conditions; cancer; neurological disorders; psychiatric disorders; renal disease; and dry mouth/Sjogren’s syndrome (Deems, Doty and Settle 1991; Mott and Leopold 1991; Mott, Grushka and Sessle 1993).
Taste testing
Psychophysics is the measurement of a response to an applied sensory stimulus. “Threshold” tasks, tests that determine the minimum concentration that can be reliably perceived, are less useful in taste than olfaction because of wider variability in the former in the general population. Separate thresholds can be obtained for detection of tastants and recognition of tastant quality. Suprathreshold tests assess the ability of the system to function at levels above threshold and may provide more information about “real world” taste experience. Discrimination tasks, telling the difference between substances, can elicit subtle changes in sensory ability. Identification tasks may yield different results than threshold tasks in the same individual. For example, a person with central nervous system injury may be able to detect and rank tastants, but may not be able to identify them. Taste testing can assess whole mouth taste through swishing of tastants throughout the oral cavity, or can test specific taste areas with targeted droplets of tastants or focally applied filter paper soaked with tastants.
Summary
The taste system is one of three chemosensory systems, together with olfaction and the common chemical sense, committed to monitoring harmful and beneficial inhaled and ingested substances. Taste cells are rapidly replaced, are innervated by four pairs of peripheral nerves and appear to have divergent central pathways in the brain. The taste system is responsible for the appreciation of four basic taste qualities (sweet, sour, salty and bitter) and, debatably, metallic and umami (monosodium glutamate) tastes. Clinically significant taste losses are rare, probably because of the redundancy and diversity of innervation. Distorted or abnormal tastes, however, are more common and can be more distressing. Even toxic agents that do not destroy the taste system, or halt transduction or transmission of taste information, have ample opportunities to impede the perception of normal taste qualities. Irregularities or obstacles can occur through one or more of the following: suboptimal tastant transport, altered salivary composition, taste cell inflammation, blockage of taste cell ion pathways, alterations in the taste cell membrane or receptor proteins, and peripheral or central neurotoxicity. Alternatively, the taste system may be intact and functioning normally, but be subjected to disagreeable sensory stimulation through small intraoral galvanic currents or the perception of intraoral medications, drugs, pesticides or metal ions.
Anatomy of the Eye
The eye is a sphere (Graham et al. 1965; Adler 1992), approximately 20 mm in diameter, that is set in the bony orbit with the six extrinsic (ocular) muscles that move the eye attached to the sclera, its external wall (figure 1). In front, the sclera is replaced by the cornea, which is transparent. Behind the cornea, in the anterior chamber, is the iris, which regulates the diameter of the pupil, the space through which the optic axis passes. The back of the anterior chamber is formed by the biconvex crystalline lens, whose curvature is determined by the ciliary muscles attached at the front to the sclera and behind to the choroidal membrane, which lines the posterior chamber. The posterior chamber is filled with the vitreous humour, a clear, gelatinous liquid. The choroid, the inner surface of the posterior chamber, is black to prevent interference with visual acuity by internal light reflections.
Figure 1. Schematic representation of the eye.
The eyelids help to maintain a film of tears, produced by the lacrymal glands, which protects the anterior surface of the eye. Blinking facilitates the spread of tears and their emptying into the lacrymal canal, which empties into the nasal cavity. The frequency of blinking, which is used as a test in ergonomics, varies greatly depending on the activity being undertaken (for example, it is slower during reading) and also on the lighting conditions (the rate of blinking is lowered by an increase in illumination).
The anterior chamber contains two muscles: the sphincter of the iris, which contracts the pupil, and the dilator, which widens it. When a bright light is directed toward a normal eye, the pupil contracts (pupillary reflex). It also contracts when viewing a nearby object.
The retina has several inner layers of nerve cells and an outer layer containing two types of photoreceptor cells, the rods and cones. Thus, light passes through the nerve cells to the rods and cones where, in a manner not yet understood, it generates impulses in the nerve cells which pass along the optic nerve to the brain. The cones, numbering four to five million, are responsible for the perception of bright images and colour. They are concentrated in the inner portion of the retina, most densely at the fovea, a small depression at the centre of the retina where there are no rods and where vision is most acute. With the help of spectrophotometry, three types of cones have been identified, with absorption peaks in the yellow, green and blue zones, accounting for the sense of colour. The 80 to 100 million rods become more and more numerous toward the periphery of the retina and are sensitive to dim light (night vision). They also play a major role in black-and-white vision and in the detection of motion.
The nerve fibres, along with the blood vessels which nourish the retina, traverse the choroid, the middle of the three layers forming the wall of the posterior chamber, and leave the eye as the optic nerve at a point somewhat off-centre, which, because there are no photoreceptors there, is known as the “blind spot.”
The retinal vessels, the only arteries and veins that can be viewed directly, can be visualized by directing a light through the pupil and using an ophthalmoscope to focus on their image (the images can also be photographed). Such ophthalmoscopic examinations, part of the routine medical examination, are important in evaluating the vascular components of such diseases as arteriosclerosis, hypertension and diabetes, which may cause retinal haemorrhages and/or exudates that can produce defects in the field of vision.
Properties of the Eye that Are Important for Work
Mechanism of accommodation
In the emmetropic (normal) eye, as light rays pass through the cornea, the pupil and the lens, they are focused on the retina, producing an inverted image which is reversed by the visual centres in the brain.
When a distant object is viewed, the lens is flattened. When viewing nearby objects, the lens accommodates (i.e., increases its power): contraction of the ciliary muscles allows the lens to assume a more oval, convex shape. At the same time, the iris constricts the pupil, which improves the quality of the image by reducing the spherical and chromatic aberrations of the system and increasing the depth of field.
In binocular vision, accommodation is necessarily accompanied by proportional convergence of both eyes.
The visual field and the field of fixation
The visual field (the space covered by the eyes at rest) is limited by anatomical obstacles in the horizontal plane (more reduced on the side towards the nose) and in the vertical plane (limited by the upper edge of the orbit). In binocular vision, the horizontal field is about 180 degrees and the vertical field 120 to 130 degrees. In daytime vision, most visual functions are weakened at the periphery of the visual field; by contrast, perception of movement is improved. In night vision there is a considerable loss of acuity at the centre of the visual field where, as noted above, rods are absent.
The field of fixation extends beyond the visual field thanks to the mobility of the eyes, head and body; in work activities it is the field of fixation that matters. The causes of reduction of the visual field, whether anatomical or physiological, are very numerous: narrowing of the pupil; opacity of the lens; pathological conditions of the retina, visual pathways or visual centres; the brightness of the target to be perceived; the frames of spectacles for correction or protection; the movement and speed of the target to be perceived; and others.
Visual acuity
“Visual acuity (VA) is the capacity to discriminate the fine details of objects in the field of view. It is specified in terms of the minimum dimension of some critical aspects of a test object that a subject can correctly identify” (Riggs, in Graham et al. 1965). A good visual acuity is the ability to distinguish fine details. Visual acuity defines the limit of spatial discrimination.
The retinal size of an object depends not only on its physical size but also on its distance from the eye; it is therefore expressed in terms of the visual angle (usually in minutes of arc). Visual acuity is the reciprocal of this angle.
Riggs (1965) describes several types of “acuity task”. In clinical and occupational practice, the recognition task, in which the subject is required to name the test object and locate some details of it, is the most commonly applied. For convenience, in ophthalmology, visual acuity is measured relative to a value called “normal” using charts presenting a series of objects of different sizes; they have to be viewed at a standard distance.
In clinical practice Snellen charts are the most widely used tests for distant visual acuity; a series of test objects are used in which the size and broad shape of characters are designed to subtend an angle of 1 minute at a standard distance which varies from country to country (in the United States, 20 feet between the chart and the tested individual; in most European countries, 6 metres). The normal Snellen score is thus 20/20. Larger test objects which form an angle of 1 minute of arc at greater distances are also provided.
The visual acuity of an individual is given by the relation VA = D′/D, where D′ is the standard viewing distance and D the distance at which the smallest test object correctly identified by the individual subtends an angle of 1 minute of arc. For example, a person’s VA is 20/30 if, at a viewing distance of 20 ft, he or she can just identify an object which subtends an angle of 1 minute at 30 feet.
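The relation VA = D′/D can be captured in a one-line helper; the function name is illustrative only, and the example reproduces the 20/30 case just described.

    def visual_acuity(standard_distance, reference_distance):
        """Decimal visual acuity from Snellen-type distances, e.g. 20/30 -> 0.67."""
        return standard_distance / reference_distance

    va = visual_acuity(20, 30)   # object identified at 20 ft subtends 1 minute of arc at 30 ft
    print(f"Snellen 20/30 = {va:.2f} in decimal notation")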
In optometric practice, the objects are often letters of the alphabet (or familiar shapes, for illiterates or children). However, when the test is repeated, charts should present unlearnable characters for which the recognition of differences involves no educational or cultural features. This is one reason why it is nowadays internationally recommended to use Landolt rings, at least in scientific studies. Landolt rings are circles with a gap, the directional position of which has to be identified by the subject.
Except in ageing people or in those individuals with accommodative defects (presbyopia), the far and the near visual acuity parallel each other. Most jobs require both a good far (without accommodation) and a good near vision. Snellen charts of different kinds are also available for near vision (figures 2 and 3). This particular Snellen chart should be held at 16 inches from the eye (40 cm); in Europe, similar charts exist for a reading distance of 30 cm (the appropriate distance for reading a newspaper).
Figure 2. Example of a Snellen chart: Landolt rings (acuity in decimal values (reading distance not specified)).
Figure 3. Example of a Snellen chart: Sloan letters for measuring near vision (40 cm) (acuity in decimal values and in distance equivalents).
With the widespread use of visual display units (VDUs), however, there is increased interest in occupational health in testing operators at a longer distance (60 to 70 cm, according to Krueger (1992)), in order to correct VDU operators properly.
Vision testers and visual screening
For occupational practice, several types of visual testers are available on the market which have similar features; they are named Orthorater, Visiotest, Ergovision, Titmus Optimal C Tester, C45 Glare Tester, Mesoptometer, Nyctometer and so on.
They are small; they are independent of the lighting of the testing room, having their own internal lighting; they provide several tests, such as far and near binocular and monocular visual acuity (most of the time with unlearnable characters), but also depth perception, rough colour discrimination, muscular balance and so on. Near visual acuity can be measured, sometimes at short and intermediate distances of the test object. The most recent of these devices makes extensive use of electronics to provide automatically written scores for different tests. Moreover, these instruments can be handled by non-medical personnel after some training.
Vision testers are designed for the purpose of pre-recruitment screening of workers, or sometimes later testing, taking into account the visual requirements of their workplace. Table 1 indicates the level of visual acuity needed to fulfil unskilled to highly skilled activities, when using one particular testing device (Fox, in Verriest and Hermans 1976).
Table 1. Visual requirements for different activities when using Titmus Optimal C Tester, with correction
Category 1: Office work
Far visual acuity 20/30 in each eye (20/25 for binocular vision)
Near VA 20/25 in each eye (20/20 for binocular vision)
Category 2: Inspection and other activities in fine mechanics
Far VA 20/35 in each eye (20/30 for binocular vision)
Near VA 20/25 in each eye (20/20 for binocular vision)
Category 3: Operators of mobile machinery
Far VA 20/25 in each eye (20/20 for binocular vision)
Near VA 20/35 in each eye (20/30 for binocular vision)
Category 4: Machine tool operations
Far and near VA 20/30 in each eye (20/25 for binocular vision)
Category 5: Unskilled workers
Far VA 20/30 in each eye (20/25 for binocular vision)
Near VA 20/35 in each eye (20/30 for binocular vision)
Category 6: Foremen
Far VA 20/30 in each eye (20/25 for binocular vision)
Near VA 20/25 in each eye (20/20 for binocular vision)
Source: According to Fox in Verriest and Hermans 1975.
Manufacturers recommend that employees be tested while wearing their corrective glasses. Fox (1965), however, stresses that such a procedure may lead to wrong results: workers may be tested with glasses that are too old relative to the time of the present measurement, or lenses may be worn by exposure to dust or other noxious agents. It is also very often the case that people come to the testing room with the wrong glasses. Fox (1976) suggests therefore that, if “the corrected vision is not improved to 20/20 level for distance and near, referral should be made to an ophthalmologist for a proper evaluation and refraction for the current need of the employee on his job”. Other deficiencies of vision testers are referred to later in this article.
Factors influencing visual acuity
VA meets its first limitation in the structure of the retina. In daytime vision, it may exceed 10/10ths at the fovea and may rapidly decline as one moves a few degrees away from the centre of the retina. In night vision, acuity is very poor or nil at the centre but may reach one tenth at the periphery, because of the distribution of cones and rods (figure 4).
Figure 4. Density of cones and rods in the retina as compared with the relative visual acuity in the corresponding visual field.
The diameter of the pupil acts on visual performance in a complex manner. When dilated, the pupil allows more light to enter into the eye and stimulate the retina; the blur due to the diffraction of the light is minimized. A narrower pupil, however, reduces the negative effects of the aberrations of the lens mentioned above. In general, a pupil diameter of 3 to 6 mm favours clear vision.
Thanks to the process of adaptation it is possible for the human being to see as well by moonlight as by full sunshine, even though there is a difference in illumination of 1 to 10,000,000. Visual sensitivity is so wide that luminous intensity is plotted on a logarithmic scale.
On entering a dark room we are at first completely blind; then the objects around us become perceptible. As the light level is increased, we pass from rod-dominated vision to cone-dominated vision. The accompanying change in spectral sensitivity is known as the Purkinje shift. The dark-adapted retina is mainly sensitive to low luminosity, but is characterized by the absence of colour vision and poor spatial resolution (low VA); the light-adapted retina is not very sensitive to low luminosity (objects have to be well illuminated in order to be perceived), but is characterized by a high degree of spatial and temporal resolution and by colour vision. After the desensitization induced by intense light stimulation, the eye recovers its sensitivity according to a typical progression: at first a rapid change involving cones and daylight or photopic adaptation, followed by a slower phase involving rods and night or scotopic adaptation; the intermediate zone involves dim light or mesopic adaptation.
In the work environment, night adaptation is hardly relevant except for activities in a dark room and for night driving (although the reflection on the road from headlights always brings some light). Simple daylight adaptation is the most common in industrial or office activities, provided either by natural or by artificial lighting. However, nowadays with emphasis on VDU work, many workers like to operate in dim light.
In occupational practice, the behaviour of groups of people is particularly important (in comparison with individual evaluation) when selecting the most appropriate design of workplaces. The results of a study of 780 office workers in Geneva (Meyer et al. 1990) show the shift in percentage distribution of acuity levels when lighting conditions are changed. It may be seen that, once adapted to daylight, most of the tested workers (with eye correction) reach quite a high visual acuity; as soon as the surrounding illumination level is reduced, the mean VA decreases and the results also become more spread out, with some people showing very poor performance; this tendency is aggravated when the dim light is accompanied by a disturbing glare source (figure 5). In other words, it is very hard to predict the behaviour of a subject in dim light from his or her score in optimal daylight conditions.
Figure 5. Percentage distribution of tested office workers’ visual acuity.
Glare. When the eyes are directed from a dark area to a lighted area and back again, or when the subject looks for a moment at a lamp or window (luminance varying from 1,000 to 12,000 cd/m2), changes in adaptation concern a limited area of the visual field (local adaptation). Recovery time after disabling glare may last several seconds, depending on illumination level and contrast (Meyer et al. 1986) (figure 6).
Figure 6. Response time before and after exposure to glare for perceiving the gap of a Landolt ring: Adaptation to dim light.
Afterimages. Local disadaptation is usually accompanied by the continued image of a bright spot, coloured or not, which produces a veil or masking effect (this is the consecutive image). Afterimages have been studied very extensively to better understand certain visual phenomena (Brown in Graham et al. 1965). After visual stimulation has ceased, the effect remains for some time; this persistence explains, for example, why perception of continuous light may be present when facing a flickering light (see below): if the frequency of flicker is high enough, or when watching car lights moving at night, we see a continuous line of light. Such afterimages are produced in the dark after viewing a lighted spot; they are also produced by coloured areas, leaving coloured images. This is the reason why VDU operators may experience sharp afterimages after looking at the screen for a prolonged time and then moving their eyes towards another area in the room.
Afterimages are very complicated. For example, one experiment on afterimages found that a blue spot appears white during the first seconds of observation, then pink after 30 seconds, and then bright red after a minute or two. Another experiment showed that an orange-red field appeared momentarily pink, then within 10 to 15 seconds passed through orange and yellow to a bright green appearance which remained throughout the whole observation. When the point of fixation moves, usually the afterimage moves too (Brown in Graham et al. 1965). Such effects could be very disturbing to someone working with a VDU.
Diffused light emitted by glare sources also has the effect of reducing the object/background contrast (veiling effect) and thus reducing visual acuity (disability glare). Ergophthalmologists also describe discomfort glare, which does not reduce visual acuity but causes uncomfortable or even painful sensation (IESNA 1993).
The level of illumination at the workplace must be adapted to the level required by the task. If all that is required is to perceive shapes in an environment of stable luminosity, weak illumination may be adequate; but as soon as it is a question of seeing fine details that require increased acuity, or if the work involves colour discrimination, retinal illumination must be markedly increased.
Table 2 gives recommended illuminance values for the lighting design of a few workstations in different industries (IESNA 1993).
Table 2. Recommended illuminance values for the lighting design of a few workstations
Cleaning and pressing industry | |
Dry and wet cleaning and steaming | 500-1,000 lux or 50-100 footcandles |
Inspection and spotting | 2,000-5,000 lux or 200-500 footcandles |
Repair and alteration | 1,000-2,000 lux or 100-200 footcandles |
Dairy products, fluid milk industry | |
Bottle storage | 200-500 lux or 20-50 footcandles |
Bottle washers | 200-500 lux or 20-50 footcandles |
Filling, inspection | 500-1,000 lux or 50-100 footcandles |
Laboratories | 500-1,000 lux or 50-100 footcandles |
Electrical equipment, manufacturing | |
Impregnating | 200-500 lux or 20-50 footcandles |
Insulating coil winding | 500-1,000 lux or 50-100 footcandles |
Electricity-generating stations | |
Air-conditioning equipment, air preheater | 50-100 lux or 5-10 footcandles |
Auxiliaries, pumps, tanks, compressors | 100-200 lux or 10-20 footcandles |
Clothing industry | |
Examining (perching) | 10,000-20,000 lux or 1,000-2,000 footcandles |
Cutting | 2,000-5,000 lux or 200-500 footcandles |
Pressing | 1,000-2,000 lux or 100-200 footcandles |
Sewing | 2,000-5,000 lux or 200-500 footcandles |
Piling up and marking | 500-1,000 lux or 50-100 footcandles |
Sponging, decating, winding | 200-500 lux or 20-50 footcandles |
Banks | |
General | 100-200 lux or 10-20 footcandles |
Writing area | 200-500 lux or 20-50 footcandles |
Tellers’ stations | 500-1,000 lux or 50-100 footcandles |
Dairy farms | |
Haymow area | 20-50 lux or 2-5 footcandles |
Washing area | 500-1,000 lux or 50-100 footcandles |
Feeding area | 100-200 lux or 10-20 footcandles |
Foundries | |
Core-making: fine | 1,000-2,000 lux or 100-200 footcandles |
Core-making: medium | 500-1,000 lux or 50-100 footcandles |
Moulding: medium | 1,000-2,000 lux or 100-200 footcandles |
Moulding: large | 500-1,000 lux or 50-100 footcandles |
Inspection: fine | 1,000-2,000 lux or 100-200 footcandles |
Inspection: medium | 500-1,000 lux or 50-100 footcandles |
Source: IESNA 1993.
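The dual units in table 2 can be cross-checked with the standard conversion (not stated in the table itself) of one footcandle to about 10.76 lux, which the table rounds to a factor of 10; a minimal sketch:

```python
# Standard unit conversion used as a rough check of table 2's dual values:
# 1 footcandle = 1 lumen/ft^2 ≈ 10.764 lux (lumen/m^2); the table rounds this to 10.

LUX_PER_FOOTCANDLE = 10.764

def footcandles_to_lux(fc: float) -> float:
    return fc * LUX_PER_FOOTCANDLE

def lux_to_footcandles(lux: float) -> float:
    return lux / LUX_PER_FOOTCANDLE

print(footcandles_to_lux(200))   # ≈ 2153 lux (the "Cutting" row lists 2,000)
print(lux_to_footcandles(5000))  # ≈ 464 footcandles (the table lists 500)
```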
Brightness contrast and spatial distribution of luminances at the workplace. From the point of view of ergonomics, the ratio between luminances of the test object, its immediate background and the surrounding area has been widely studied, and recommendations on this subject are available for different requirements of the task (see Verriest and Hermans 1975; Grandjean 1987).
The object-background contrast is conventionally defined by the formula (Lf – Lo)/Lf, where Lo is the luminance of the object and Lf the luminance of the background; for a darker object on a lighter background it thus varies from 0 to 1.
As shown by figure 7, visual acuity increases with the level of illumination (as previously said) and with the increase of object-background contrast (Adrian 1993). This effect is particularly marked in young people. A large light background and a dark object thus provide the best efficiency. However, in real life, contrast will never reach unity. For example, when a black letter is printed on a white sheet of paper, the object-background contrast reaches a value of only around 90%.
Figure 7. Relationship between visual acuity of a dark object perceived on a background receiving increasing illumination for four contrast values.
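A minimal sketch of the contrast formula given above; the luminance figures are assumed, illustrative values for black print on white paper, chosen only to reproduce the roughly 90% contrast quoted in the text:

```python
# Sketch of the object-background contrast C = (Lf - Lo) / Lf defined above.
# The luminances below are assumed, illustrative values (in cd/m2).

def object_background_contrast(l_object: float, l_background: float) -> float:
    """Contrast of a darker object on a lighter background (0 = invisible, 1 = ideal)."""
    return (l_background - l_object) / l_background

# Black letter on white paper: if the paper returns about 100 cd/m2 and the ink
# about 10 cd/m2, the contrast is roughly 0.9, i.e., the ~90% quoted in the text.
print(object_background_contrast(10.0, 100.0))  # 0.9
```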
In the most favourable situation—that is, in positive presentation (dark letters on a light background)—acuity and contrast are linked, so that visibility can be improved by affecting either one or the other factor—for example, increasing the size of letters or their darkness, as in Fortuin’s table (in Verriest and Hermans 1975). When video display units appeared on the market, letters or symbols were presented on the screen as light spots on a dark background. Later on, new screens were developed which displayed dark letters on a light background. Many studies were conducted in order to verify whether this presentation improved vision. The results of most experiments stress without any doubt that visual acuity is enhanced when reading dark letters on a light background; of course a dark screen favours reflections of glare sources.
The functional visual field is defined by the relationship between the luminosity of the surfaces actually perceived by the eye at the workstation and those of the surrounding areas. Care must be taken not to create too great differences of luminosity in the visual field; according to the size of the surfaces involved, changes in general or local adaptation occur which cause discomfort in the execution of the task. Moreover, it is recognized that in order to achieve good performance, the contrasts in the field must be such that the task area is more illuminated than its immediate surroundings, and that the far areas are darker.
Time of presentation of the object. The capacity to detect an object depends directly on the quantity of light entering the eye, and this is linked with the luminous intensity of the object, its surface qualities and the time during which it appears (this is known from tests of tachistoscopic presentation). A reduction in acuity occurs when the duration of presentation is less than 100 to 500 ms.
Movements of the eye or of the target. Loss of performance occurs particularly when the eye jerks; nevertheless, total stability of the image is not required in order to attain maximum resolution. But it has been shown that vibrations such as those of construction site machines or tractors can adversely affect visual acuity.
Diplopia. Visual acuity is higher in binocular than in monocular vision. Binocular vision requires optical axes that both meet at the object being looked at, so that the image falls on corresponding areas of the retina in each eye. This is made possible by the activity of the external eye muscles. If the coordination of these muscles fails, more or less transitory double images may appear, as in excessive visual fatigue, and may cause annoying sensations (Grandjean 1987).
In short, the discriminating power of the eye depends on the type of object to be perceived and the luminous environment in which it is measured; in the medical consulting room, conditions are optimal: high object-background contrast, direct daylight adaptation, characters with sharp edges, presentation of the object without a time limit, and a certain redundancy of signals (e.g., several letters of the same size on a Snellen chart). Moreover, visual acuity determined for diagnostic purposes is a maximal, one-off measurement obtained in the absence of accommodative fatigue. Clinical acuity is thus a poor reference for the visual performance attained on the job. What is more, good clinical acuity does not necessarily mean the absence of discomfort at work, where conditions of individual visual comfort are rarely attained. At most workplaces, as stressed by Krueger (1992), objects to be perceived are blurred and of low contrast, background luminances are unequally scattered with many glare sources producing veiling and local adaptation effects, and so on. According to our own calculations, clinical results do not carry much predictive value for the amount and nature of visual fatigue encountered, for example, in VDU work. A more realistic laboratory set-up in which conditions of measurement were closer to task requirements did somewhat better (Rey and Bousquet 1990; Meyer et al. 1990).
Krueger (1992) is right when claiming that ophthalmological examination is not really appropriate in occupational health and ergonomics, that new testing procedures should be developed or extended, and that existing laboratory set-ups should be made available to the occupational practitioner.
Relief Vision, Stereoscopic Vision
Binocular vision allows a single image to be obtained by means of synthesis of the images received by the two eyes. Analogies between these images give rise to the active cooperation that constitutes the essential mechanism of the sense of depth and relief. Binocular vision has the additional property of enlarging the field, improving visual performance generally, relieving fatigue and increasing resistance to glare and dazzle.
When fusion of the two images is insufficient, ocular fatigue may appear earlier.
Without achieving the efficiency of binocular vision in appreciating the relief of relatively near objects, the sensation of relief and the perception of depth are nevertheless possible with monocular vision by means of phenomena that do not require binocular disparity. We know that the size of objects does not change; that is why apparent size plays a part in our appreciation of distance; thus retinal images of small size will give the impression of distant objects, and vice versa (apparent size). Near objects tend to hide more distant objects (this is called interposition). The brighter one of two objects, or the one with a more saturated colour, seems to be nearer. The surroundings also play a part: more distant objects are lost in mist. Two parallel lines seem to meet at infinity (this is the perspective effect). Finally, if two targets are moving at the same speed, the one whose speed of retinal displacement is slower will appear farther from the eye.
In fact, monocular vision does not constitute a major obstacle in the majority of work situations. The subject needs to get accustomed to the narrowing of the visual field and also to the rather exceptional possibility that the image of the object may fall on the blind spot. (In binocular vision the same image never falls on the blind spot of both eyes at the same time.) It should also be noted that good binocular vision is not necessarily accompanied by relief (stereoscopic) vision, since this also depends on complex nervous system processes.
For all these reasons, regulations requiring stereoscopic vision at work should be abandoned and replaced by a thorough examination of individuals by an eye doctor. Such regulations or recommendations exist nevertheless, and stereoscopic vision is supposed to be necessary for such tasks as crane driving, jewellery work and cutting-out work. However, we should keep in mind that new technologies may deeply modify the content of the task; for example, modern computerized machine tools are probably less demanding of stereoscopic vision than previously believed.
As far as driving is concerned, regulations are not necessarily similar from country to country. In table 3 (overleaf), French requirements for driving either light or heavy vehicles are mentioned. The American Medical Association guidelines are the appropriate reference for American readers. Fox (1973) mentions that, for the US Department of Transportation in 1972, drivers of commercial motor vehicles should have a distant VA of at least 20/40, with or without corrective glasses; a field of vision of at least 70 degrees is needed in each eye. Ability to recognize the colours of the traffic lights was also required at that time, but today in most countries traffic lights can be distinguished not only by colour but also by shape.
Table 3. Visual requirements for a driving licence in France
Visual acuity (with eyeglasses) | |
For light vehicles | At least 6/10th for both eyes with at least 2/10th in the worse eye |
For heavy vehicles | VA with both eyes of 10/10th with at least 6/10th in the worse eye |
Visual field | |
For light vehicles | No licence if peripheral reduction in candidates with one eye or with the second eye having a visual acuity of less than 2/10th |
For heavy vehicles | Complete integrity of both visual fields (no peripheral reduction, no scotoma) |
Nystagmus (spontaneous eye movements) | |
For light vehicles | No licence if binocular visual acuity of less than 8/10th |
For heavy vehicles | No defects of night vision are acceptable |
Eye Movements
Several types of eye movements are described whose objective is to allow the eye to take advantage of all the information contained in the images. The system of fixation allows us to maintain the object in place at the level of the foveolar receptors, where it can be examined in the retinal region with the highest power of resolution. Nevertheless, the eyes are constantly subject to micromovements (tremor). Saccades (particularly studied during reading) are intentionally induced rapid movements the aim of which is to displace the gaze from one detail to another of the motionless object; because these movements are anticipated by the brain, the resulting sweep of the image across the retina is not perceived as movement of the scene. When eye movements are not anticipated, as in pathological conditions of the central nervous system or of the vestibular organ, such an illusion of movement does occur. Search movements are partially voluntary when they involve the tracking of relatively small objects, but become rather irrepressible when very large objects are concerned. Several mechanisms for suppressing images (including during saccades) allow the retina to prepare to receive new information.
Illusions of movements (autokinetic movements) of a luminous point or a motionless object, such as the movement of a bridge over a watercourse, are explained by retinal persistence and conditions of vision that are not integrated in our central system of reference. The consecutive effect may be merely a simple error of interpretation of a luminous message (sometimes harmful in the working environment) or result in serious neurovegetative disturbances. The illusions caused by static figures are well known. Movements in reading are discussed elsewhere in this chapter.
Flicker Fusion and de Lange Curve
When the eye is exposed to a succession of short stimuli, it first experiences flicker and then, with an increase in frequency, has the impression of stable luminosity: this is the critical fusion frequency. If the stimulating light fluctuates in a sinusoidal manner, the subject may experience fusion for all frequencies below the critical frequency insofar as the level of modulation of this light is reduced. All these thresholds can then be joined by a curve, first described by de Lange, which can be altered by changing the nature of the stimulation: the curve is depressed when the luminance of the flickering area is reduced or when the contrast between the flickering spot and its surroundings decreases; similar changes of the curve can be observed in retinal pathologies or in the after-effects of cranial trauma (Meyer et al. 1971) (figure 8).
Figure 8. Flicker-fusion curves connecting the frequency of intermittent luminous stimulation and its amplitude of modulation at threshold (de Lange’s curves), average and standard deviation, in 43 patients suffering from cranial trauma and 57 controls (dotted line).
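The stimulus described here can be written as a sinusoidally modulated luminance, L(t) = L0 (1 + m sin 2πft), where L0 is the mean luminance, m the modulation depth and f the flicker frequency; the de Lange curve plots, for each f, the threshold value of m at which flicker is just perceived. The sketch below, with purely illustrative parameter values, simply generates such a stimulus:

```python
# Illustrative only: the sinusoidally modulated luminance used in de Lange-type
# measurements, L(t) = L0 * (1 + m * sin(2*pi*f*t)). Parameter values are assumed.
import math

def modulated_luminance(t_s: float, mean_cd_m2: float, modulation: float, freq_hz: float) -> float:
    return mean_cd_m2 * (1.0 + modulation * math.sin(2.0 * math.pi * freq_hz * t_s))

# Hypothetical stimulus: mean luminance 50 cd/m2, 20% modulation depth, 10 Hz flicker;
# the resulting waveform swings between 40 and 60 cd/m2.
samples = [round(modulated_luminance(ms / 1000.0, 50.0, 0.2, 10.0), 1) for ms in range(0, 100, 10)]
print(samples)
```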
Therefore one must be cautious when claiming to interpret a fall in critical flicker fusion in terms of work-induced visual fatigue.
Occupational practice should make better use of flickering light to detect small retinal damage or dysfunction (e.g., an enhancement of the curve can be observed in slight intoxication, followed by a drop when intoxication becomes greater); this testing procedure, which does not alter retinal adaptation and which does not require eye correction, is also very useful for the follow-up of functional recovery during and after a treatment (Meyer et al. 1983) (figure 9).
Figure 9. De Lange’s curve in a young man absorbing ethambutol; the effect of treatment can be deduced from comparing the flicker sensitivity of the subject before and after treatment.
Colour Vision
The sensation of colour is connected with the activity of the cones and therefore exists only in the case of daylight (photopic range of light) or mesopic (middle range of light) adaptation. In order for the system of colour analysis to function satisfactorily, the luminance of the perceived objects must be at least 10 cd/m2. Generally speaking, three colour sources, the so-called primary colours (red, green and blue), suffice to reproduce a whole spectrum of colour sensations. In addition, a phenomenon is observed of induction of colour contrast between two colours which mutually reinforce each other: the green-red pair and the yellow-blue pair.
The two theories of colour sensation, the trichromatic theory and the opponent-colour (Hering) theory, are not mutually exclusive; the first appears to apply at the level of the cones and the second at more central levels of the visual system.
To understand the perception of coloured objects against a luminous background, other concepts need to be used. The same colour may in fact be produced by different types of radiation. To reproduce a given colour faithfully, it is therefore necessary to know the spectral composition of the light sources and the spectrum of the reflectance of the pigments. The colour rendering index used by lighting specialists allows the selection of fluorescent tubes appropriate to the requirements. Our eyes have developed the faculty of detecting very slight changes in the tonality of a surface obtained by changing its spectral distribution; the spectral colours (the eye can distinguish more than 200) recreated by mixtures of monochromatic light represent only a small proportion of the possible colour sensations.
The importance of anomalies of colour vision in the work environment should thus not be exaggerated, except in activities where colours must be correctly identified, such as inspecting the appearance of products or the work of decorators and similar trades. Moreover, even in electricians’ work, size and shape or other markers may replace colour.
Anomalies of colour vision may be congenital or acquired (degenerations). In abnormal trichromates, the change may affect the basic red sensation (Dalton type), or the green or the blue (the rarest anomaly). In dichromates, the system of three basic colours is reduced to two. In deuteranopia, it is the basic green that is lacking. In protanopia, it is the disappearance of the basic red; although less frequent, this anomaly, as it is accompanied by a loss of luminosity in the range of reds, deserves attention in the work environment, in particular by avoiding the deployment of red notices especially if they are not very well lighted. It should also be noted that these colour vision defects can be found in various degrees in the so-called normal subject; hence the need for caution in using too many colours. It should be kept in mind also that only broad colour defects are detectable with vision testers.
Refractive Errors
The near point (Weymouth 1966) is the shortest distance at which an object can be brought into sharp focus; the farthest such distance is the far point. For the normal (emmetropic) eye, the far point is situated at infinity. For the myopic eye, images of distant objects are focused in front of the retina, and the far point lies at a finite distance in front of the eye; this excess of refractive power is corrected by means of concave lenses. For the hyperopic (hypermetropic) eye, images of distant objects would be focused behind the retina, and the far point is virtual, situated behind the eye; this lack of refractive power is corrected by means of convex lenses (figure 10). In a case of mild hyperopia, the defect is spontaneously compensated by accommodation and may be ignored by the individual. In myopes who are not wearing their spectacles the loss of accommodation can be compensated for by the fact that the far point is nearer.
Figure 10. Schematic representation of refractive errors and their correction.
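The principle of correction sketched in figure 10 can be illustrated with elementary geometric optics (a general textbook relation, not taken from this article): the distance correction places the focal point of the lens at the eye’s far point, so its power in dioptres is the reciprocal of the far-point distance in metres, with a negative sign (diverging lens) for myopia. The far-point value below is hypothetical, and the spectacle-to-eye distance is neglected:

```python
# Elementary geometric optics (textbook relation, not from the article): the
# corrective lens for myopia places its focal point at the eye's far point, so
# its power is -1 / far_point (dioptres, far point in metres). The vertex
# (spectacle-to-eye) distance is neglected in this simple sketch.

def myopia_correction_dioptres(far_point_m: float) -> float:
    return -1.0 / far_point_m

# Hypothetical example: a myope whose far point lies 0.5 m in front of the eye
# needs roughly a -2.0 D concave lens for distance vision.
print(myopia_correction_dioptres(0.5))  # -2.0
```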
In the ideal eye, the surface of the cornea should be perfectly spherical; however, our eyes show differences in curvature in different axes (this is called astigmatism); refraction is stronger when the curvature is more accentuated, and the result is that rays emerging from a luminous point do not form a precise image on the retina. These defects, when pronounced, are corrected by means of cylindrical lenses (see lowest diagram in figure 10, overleaf); in irregular astigmatism, contact lenses are recommended. Astigmatism becomes particularly troublesome during night driving or in work on a screen, that is, in conditions where light signals stand out on a dark background or when using a binocular microscope.
Contact lenses should not be used at workstations where the air is too dry or dusty, and so on (Verriest and Hermans 1975).
In presbyopia, which is due to loss of elasticity of the lens with age, it is the amplitude of accommodation that is reduced, that is, the distance between the far and near points; the near point (about 10 cm at the age of 10 years) moves farther away as one gets older. The correction is made by means of unifocal or multifocal convergent lenses; the latter correct for ever nearer distances of the object (usually down to 30 cm), taking into account that nearer objects are generally perceived in the lower part of the visual field, while the upper part of the spectacles is reserved for distance vision. New lenses, different from the usual type, are now proposed for work at VDUs. These lenses, known as progressive lenses, blur the boundaries between the correction zones. Progressive lenses require more getting used to than other types of lenses, because their field of clear vision is narrow (see Krueger 1992).
When the visual task requires alternate far and near vision, bifocal, trifocal or even progressive lenses are recommended. However, it should be kept in mind that the use of multifocal lenses can create important modifications of the posture of an operator. For example, VDU operators whose presbyopia is corrected by means of bifocal lenses tend to extend the neck and may suffer cervical and shoulder pain. Spectacle manufacturers therefore propose progressive lenses of different kinds. Another approach is the ergonomic improvement of VDU workplaces, to avoid placing the screen too high.
Demonstrating refractive errors (which are very common in the working population) is not independent of the type of measurement. Snellen charts fixed on a wall will not necessarily give the same results as various kinds of apparatus in which the image of the object is projected on a near background. In fact, in a vision tester (see above), it is difficult for the subject to relax the accommodation, particularly as the axis of vision is lower; this is known as “instrumental myopia”.
Effects of Age
With age, as already explained, the lens loses its elasticity, with the result that the near point moves farther away and the power of accommodation is reduced. Although the loss of accommodation with age can be compensated for by means of spectacles, presbyopia is a real public health problem. Kauffman (in Adler 1992) estimates its cost, in terms of means of correction and loss of productivity, to be of the order of tens of billions of dollars annually for the United States alone. In developing countries we have seen workers obliged to give up work (in particular the making of silk saris) because they are unable to buy spectacles. Moreover, when protective glasses need to be used, it is very expensive to offer both correction and protection. It should be remembered that the amplitude of accommodation begins to decline as early as the second decade of life (and perhaps even earlier) and that it disappears completely by the age of 50 to 55 years (Meyer et al. 1990) (figure 11).
Figure 11. Near point measured with the rule of Clement and Clark, percentage distribution of 367 office workers aged 18-35 years (below) and 414 office workers aged 36-65 years (above).
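The relation between the near point and the amplitude of accommodation measured in figure 11 can be made explicit with standard optometric arithmetic (a general relation, not a calculation taken from the cited study): the amplitude in dioptres is the difference between the reciprocals of the near-point and far-point distances in metres, and for an emmetrope the far point is at infinity. A brief sketch with hypothetical values:

```python
# Standard optometric relation (not from the cited study): amplitude of
# accommodation (dioptres) = 1/near_point - 1/far_point, distances in metres;
# for an emmetropic eye the far point is at infinity and the second term is zero.

def accommodation_amplitude_d(near_point_m: float, far_point_m: float = float("inf")) -> float:
    return 1.0 / near_point_m - 1.0 / far_point_m

# Hypothetical examples: a near point of 0.10 m (about the value quoted in the text
# for a 10-year-old) gives 10 D; a presbyopic near point of 0.50 m gives only 2 D.
print(accommodation_amplitude_d(0.10))  # 10.0
print(accommodation_amplitude_d(0.50))  # 2.0
```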
Other phenomena due to age also play a part: the sinking of the eye into the orbit, which occurs in very old age and varies more or less according to individuals, reduces the size of the visual field (because of the eyelid). Dilation of the pupil is at its maximum in adolescence and then declines; in older people, the pupil dilates less and the reaction of the pupil to light slows down. Loss of transparency of the media of the eye reduces visual acuity (some media have a tendency to become yellow, which modifies colour vision) (see Verriest and Hermans 1976). Enlargement of the blind spot results in the reduction of the functional visual field.
With age and illness, changes are observed in the retinal vessels, with consequent functional loss. Even the movements of the eye are modified; there is a slowing down and reduction in amplitude of the exploratory movements.
Older workers are at a double disadvantage in conditions of weak contrast and weak luminosity of the environment: on the one hand, they need more light to see an object; on the other hand, they benefit less from increased luminosity because they are dazzled more quickly by glare sources. This handicap is due to changes in the transparent media, which allow less light to pass and increase its diffusion (the veil effect described above). Their visual discomfort is aggravated by too sudden changes between strongly and weakly lighted areas (slowed pupil reaction, more difficult local adaptation). All these defects have a particular impact in VDU work, and it is very difficult, indeed, to provide good illumination of workplaces for both young and older operators; it can be observed, for example, that older operators will reduce by all possible means the luminosity of the surrounding light, although dim light tends to decrease their visual acuity.
Risks to the Eye at Work
These risks may be classified in different ways (Rey and Meyer 1981; Rey 1991): by the nature of the causal agent (physical agents, chemical agents, etc.), by the route of penetration (cornea, sclera, etc.), by the nature of the lesions (burns, bruises, etc.), by the seriousness of the condition (limited to the outer layers, affecting the retina, etc.) and by the circumstances of the accident (as for any physical injury); these descriptive elements are useful in devising preventive measures. Only the eye lesions and circumstances most frequently encountered in insurance statistics are mentioned here. Let us stress that workers’ compensation can be claimed for most eye injuries.
Eye conditions caused by foreign bodies
These conditions are seen particularly among turners, polishers, foundry workers, boilermakers, masons and quarrymen. The foreign bodies may be inert substances such as sand, irritant metals such as iron or lead, or animal or vegetable organic materials (dusts). This is why, in addition to the eye lesions, complications such as infections and intoxications may occur if the amount of substance introduced into the organism is sufficiently large. Lesions produced by foreign bodies will of course be more or less disabling, depending on whether they remain in the outer layers of the eye or penetrate deeply into the bulb; treatment will thus be quite different and sometimes requires the immediate transfer of the victim to the eye clinic.
Burns of the eye
Burns are caused by various agents: flashes or flames (during a gas explosion); molten metal (the seriousness of the lesion depends on the melting point, with metals melting at higher temperature causing more serious damage); and chemical burns due, for example, to strong acids and bases. Burns due to boiling water, electrical burns and many others also occur.
Injuries due to compressed air
These are very common. Two phenomena play a part: the force of the jet itself (and the foreign bodies accelerated by the air flow); and the shape of the jet, a less concentrated jet being less harmful.
Eye conditions caused by radiation
Ultraviolet (UV) radiation
The source of the rays may be the sun or certain lamps. The degree of penetration into the eye (and consequently the danger of the exposure) depends on the wavelength. Three zones have been defined by the International Lighting Commission: UVC (280 to 100 nm) rays are absorbed at the level of the cornea and conjunctiva; UVB (315 to 280 nm) are more penetrating and reach the anterior segment of the eye; UVA (400 to 315 nm) penetrate still further.
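For quick reference, the three zones just listed can be expressed as a small classification function; the band limits are those quoted above, and the example wavelength is only illustrative:

```python
# Classification of ultraviolet wavelengths into the bands quoted above
# (UVC 100-280 nm, UVB 280-315 nm, UVA 315-400 nm); limits follow the text.

def uv_band(wavelength_nm: float) -> str:
    if 100 <= wavelength_nm < 280:
        return "UVC: absorbed at the cornea and conjunctiva"
    if 280 <= wavelength_nm < 315:
        return "UVB: reaches the anterior segment of the eye"
    if 315 <= wavelength_nm <= 400:
        return "UVA: penetrates still further"
    return "outside the ultraviolet range"

print(uv_band(254))  # an illustrative UVC wavelength
```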
For welders the characteristic effects of exposure have been described, such as acute keratoconjunctivitis, chronic photo-ophthalmia with decreased vision, and so on. The welder is subjected to a considerable amount of visible light, and it is essential that the eyes be protected with adequate filters. Snowblindness, a very painful condition for workers in mountains, needs to be avoided by wearing appropriate sunglasses.
Infrared radiation
Infrared rays are situated between the visible rays and the shortest radio-electric waves. They begin, according to the International Lighting Commission, at 750 nm. Their penetration into the eye depends on their wavelength; the longest infrared rays can reach the lens and even the retina. Their effect on the eye is due to the heat they deposit (their calorigenic effect). The characteristic condition is found in those who blow glass facing the furnace. Other workers, such as blast furnace workers, suffer from thermal irradiation with various clinical effects (such as keratoconjunctivitis or membranous thickening of the conjunctiva).
LASER (Light amplification by stimulated emission of radiation)
The wavelength of the emission depends on the type of laser—visible light, ultraviolet and infrared radiation. It is principally the quantity of energy projected that determines the level of the danger incurred.
Ultraviolet rays cause inflammatory lesions; infrared rays can cause caloric lesions; but the greatest risk is destruction of retinal tissue by the beam itself, with loss of vision in the affected area.
Radiation from cathode screens
The emissions coming from the cathode-ray screens commonly used in offices (x rays, ultraviolet, infrared and radio-frequency radiation) are all well below the limits set by international standards. There is no evidence of any relationship between video terminal work and the onset of cataract (Rubino 1990).
Harmful substances
Certain solvents, such as the esters and aldehydes (formaldehyde being very widely used), are irritating to the eyes. The inorganic acids, whose corrosive action is well known, cause tissue destruction and chemical burns by contact. The organic acids are also dangerous. Alcohols are irritants. Caustic soda, an extremely strong base, is a powerful corrosive that attacks the eyes and the skin. Also included in the list of harmful substances are certain plastic materials (Grant 1979) as well as allergenic dusts or other substances such as exotic woods, feathers and so on.
Finally, infectious occupational diseases can be accompanied by effects on the eyes.
Protective glasses
Since the wearing of individual protection (glasses and masks) can itself obstruct vision (reduced visual acuity owing to loss of transparency of the lenses caused by the projection of foreign bodies, and obstacles in the visual field such as the sidepieces of spectacles), workplace hygiene also tends towards other means, such as the extraction of dust and dangerous particles from the air through general ventilation.
The occupational physician is frequently called upon to advise on the quality of glasses adapted to the risk; national and international directives will guide this choice. Moreover, better goggles are now available, which include improvements in efficacy, comfort and even aesthetics.
In the United States, for example, reference can be made to ANSI standards (particularly ANSI Z87.1-1979) that have the force of law under the federal Occupational Safety and Health Act (Fox 1973). ISO Standard No. 4007-1977 refers also to protective devices. In France, recommendations and protective material are available from the INRS in Nancy. In Switzerland, the national insurance company CNA provides rules and procedures for extraction of foreign bodies at the workplace. For serious damage, it is preferable to send the injured worker to the eye doctor or the eye clinic.
Finally, people with eye pathologies may be more at risk than others; to discuss such a controversial problem goes beyond the scope of this article. As previously said, their eye doctor should be aware of the dangers that they may encounter at their workplace and survey them carefully.
Conclusion
At the workplace, most information and signals are visual in nature, although acoustic signals may play a role; nor should we forget the importance of tactile signals in manual work, as well as in office work (for example, the speed of a keyboard).
Our knowledge of the eye and vision comes mostly from two sources: medical and scientific. For the purpose of diagnosis of eye defects and diseases, techniques have been developed which measure visual functions; these procedures may not be the most effective for occupational testing purposes. Conditions of medical examination are indeed very far from those which are encountered at the workplace; for example, to determine visual acuity the eye doctor will make use of charts or instruments where contrast between test object and background is the highest possible, where the edges of test objects are sharp, where no disturbing glare sources are perceptible and so on. In real life, lighting conditions are often poor and visual performance is under stress for several hours.
This emphasizes the need to utilize laboratory apparatus and instrumentation which display a higher predictive power for visual strain and fatigue at the workplace.
Many of the scientific experiments reported in textbooks were performed for a better theoretical understanding of the visual system, which is very complex. References in this article have been limited to that knowledge which is immediately useful in occupational health.
While pathological conditions may impede some people in fulfilling the visual requirements of a job, it seems safer and fairer—apart from highly demanding jobs with their own regulations (aviation, for example)—to give the eye doctor the power of decision, rather than refer to general rules; and it is in this way that most countries operate. Guidelines are available for more information.
On the other hand, hazards exist for the eye when exposed at the workplace to various noxious agents, whether physical or chemical. Hazards for the eye in industry are briefly enumerated. From scientific knowledge, no danger of developing cataracts may be expected from working on a VDU.
" DISCLAIMER: The ILO does not take responsibility for content presented on this web portal that is presented in any language other than English, which is the language used for the initial production and peer-review of original content. Certain statistics have not been updated since the production of the 4th edition of the Encyclopaedia (1998)."