Millions of people wear contact lenses safely. However, contact lens wear is not without serious complications; contact lens-related corneal ulcers pose a risk of significant vision loss. A corneal ulcer is a localized area of tissue erosion associated with inflammatory cells in the cornea. Corneal ulcers can be either infectious or noninfectious. When infectious, the disease is clinically referred to as 'microbial keratitis' (MK). Infectious corneal ulcers are more serious than sterile ulcers, and usually cause significant scarring of the cornea. Corneal scarring opacifies the normally clear corneal tissue; this can lead to visual impairment or blindness when the resulting opacity covers the pupil.
In the USA, contact lenses are regulated by the FDA. Rigid, plastic, corneal contact lenses made from polymethylmethacrylate began to achieve a significant level of use in the 1950s. However, it was not until the 1970s, after soft lenses were approved for marketing, that contact lenses became a popular alternative to spectacles for the correction of refractive error. The FDA regulations required significant clinical testing of soft contact lenses prior to marketing approval in order to provide a reasonable assurance of safety and effectiveness. In 1981, the FDA approved the first cosmetic extended wear lenses ('extended wear' means that the lenses could be worn overnight, during sleep; in this context, 'cosmetic' means not restricted to the special use of correcting vision after cataract surgery). In the 1980s, extended wear use for periods of up to 30 continuous days began to increase in the contact lens-wearing population.
In the 1970s and early 1980s, reports of corneal ulcers in contact lens wearers started appearing in the literature [15,16]. These cases seemed to be associated with soft lenses - particularly extended wear soft lenses. Spurred by the FDA's growing public health concerns, the industry-supported Contact Lens Institute sponsored the first population-based studies to assess the level and significance of the problem. The results were published in 1989 in two landmark papers by Oliver Schein, Eugene Poggio, and their co-workers [17,18]. Their work represented the first significant epidemiologic research that determined the incidence of corneal ulcers and the relative risk of different modes of wear.
To estimate the incidence of ulcers in different modes of contact lens wear, investigators had to determine: (a) the number of contact lens ulcers in a given population; and (b) the number of patients in that population using each type of contact lens . Investigators contacted all ophthalmologists in five New England states to collect the total number of lens-related, new ulcer cases over a 4 month period. They determined the proportion of the New England population using each type of contact lens by a telephone survey of about 4000 randomly-selected households. The investigators combined the survey data on contact lens usage with census population data to estimate the number of people in the area who wore each type of contact lens. Thus, annual incidence was calculated using the following formula:
annual incidence = (number of new ulcer cases over 4 months × 3) / (population × proportion of people wearing that lens type)
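As a concrete sketch of the annualized incidence calculation described above, the following Python snippet applies the formula to invented figures; the case count, population size, and wearing proportion below are hypothetical and are not taken from the New England study:

```python
# Sketch of the annualized-incidence calculation described in the text.
# All input numbers are hypothetical, for illustration only.

def annual_incidence_per_10k(cases_in_period, months, population, proportion_wearing):
    """Annualized incidence per 10,000 patient-years of lens wear."""
    wearers = population * proportion_wearing          # estimated wearers of this lens type
    annual_cases = cases_in_period * (12 / months)     # scale the survey period to a full year
    return annual_cases / wearers * 10_000

# e.g. 35 ulcer cases found in a 4 month survey of a population of 12 million,
# of whom 2% wear the lens type in question (hypothetical figures)
rate = annual_incidence_per_10k(35, 4, 12_000_000, 0.02)
print(f"{rate:.1f} cases per 10,000 patient-years")
```

The scaling factor of 3 (i.e. 12/4) converts the 4 month case count into an annual figure, which is then divided by the estimated number of wearers.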
Poggio et al. reported that the annualized incidence of MK was 20.9 cases per 10 000 patient-years in users of extended wear soft lenses, and 4.1 per 10 000 patient-years in users of daily wear soft lenses (designed for daytime use only). From these rates, they estimated that annually among the 13 million nationwide contact lens wearers in the late 1980s, there were about 8000 cases of MK in extended wear lens users and about 4000 in daily wear users. Not all patients wore their lenses in the mode appropriate for their lens type. Some slept in 'daily wear lenses' and some never slept in 'extended wear lenses'. The authors estimated MK rates based upon actual lens wear. The incidence for daily wear lenses worn strictly on a daily wear schedule (never worn overnight) was about 2-3 per 10 000 patient-years; the incidence for extended wear lenses worn overnight was about 22-32 per 10 000 patient-years.
In the companion study, Schein et al. reported on a case-control study of soft lens patients conducted at six university ophthalmology centers. They compared 86 patients with MK to two different control populations: one hospital-based (n = 61) and one population-based (n = 410). Investigators found that extended wear soft lenses were associated with approximately four times the risk of MK as daily wear soft lenses. As discussed above, some patients followed wearing schedules inconsistent with lens type. Patients who used extended wear lenses in an overnight wear mode showed 10-15 times the risk of those who wore daily wear lenses on a strictly daily wear basis. Increasing length of extended wear was directly related to increased risk. The risk for patients who wore extended wear lenses 2-7 days before removal was about 7-10 times the risk for those who wore daily wear lenses on a strictly daily wear basis. By combining this with the Poggio incidence data, we can conclude that for this '2-7 day' continuous wear schedule, the incidence was on the order of about 20 per 10 000 patient-years. Somewhat unexpectedly, investigators found that wearers who smoked were at higher risk than non-smokers.
Based upon Schein's data relating length of extended wear to increased risk of MK, the FDA recommended limiting continuous wear to a maximum of 7 days. In 1989, the FDA asked manufacturers to voluntarily reduce the maximum indicated time for extended wear to 7 days, and sent letters to eyecare practitioners advising them of the situation. In the USA, cosmetic contact lenses were not marketed for continuous wear longer than 7 days for the rest of the twentieth century.
Following up on the work of Schein and Poggio, investigators in a number of other countries have conducted similar studies over the past 15 years. An important 1991 study in the UK suggested that contact lens wear had become the primary cause of MK. This case-control study assessed the relative risk (RR) of several predisposing factors associated with MK (compared to 'no predisposing factor'). These relative risks for the most important factors were as follows:
This study also confirmed earlier findings that extended wear was significantly riskier than daily wear; that extended wear for > 6 days was riskier than for shorter time periods; that contact lens hygiene probably was a significant, but limited, factor in risk; and that rigid lens wear was safer than soft lens wear. Analysis showed that more than half of all cases of MK were associated with contact lens wear.
In a study published in 1994, investigators surveyed all cases of 'contact lens-induced keratitis' (epithelial defects with an underlying infiltrate or ulcer) in Sweden over a 3 month period. The maximum length of continuous wear was 14 nights. This study found substantially lower annualized incidences than the US study: about 2/10 000 for daily wear soft lenses, and about 10-13/10 000 for extended wear soft lenses.
Cheng et al. reported the incidence of contact lens corneal ulcers in Denmark in the late 1990s to be very similar to the rates in the USA reported in the earlier Poggio study. This survey of all practicing Danish ophthalmologists over a 3 month period found an annualized incidence of 3.5/10 000 in daily wear soft lenses and 20.0/10 000 in extended wear soft lenses (up to 2 weeks of continuous wear). This study also reported on the morbidity of the ulcerative disease. Of the 92 eyes with MK, 5% ended therapy with vision of 20/70 or worse, and 13% with 20/30 or worse.
A study in Hong Kong in the late 1990s reported the incidence of MK to be about 3/10 000 for daily wear soft lenses, and about 9/10 000 for extended wear soft lenses (maximum of 6 nights of continuous wear).
Although the incidence of significant problems was low in most studies, it must be kept in mind that there are currently an estimated 36 million contact lens wearers in the USA. Therefore, it is likely that there are thousands of lens-related cases of MK each year. Additionally, many patients wear contact lenses for a significant portion of their lives. A patient who uses extended wear lenses from age 20 to age 39 will have a cumulative risk of MK of approximately 0.04 (20 years × 0.0020 ulcer risk/year), if one assumes a constant risk using the Poggio estimate. For this hypothetical patient, a 1 in 25 chance of getting ulcerative keratitis would represent a significant risk.
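The cumulative risk arithmetic can be checked directly. This sketch uses the 0.0020/year extended wear figure cited in the text and compares the simple linear approximation with a compounded calculation, under the stated assumption of a constant, independent annual risk:

```python
# Cumulative risk over many years of wear, assuming a constant annual risk.
# The 0.0020/year figure is the extended wear estimate cited in the text.

annual_risk = 0.0020
years = 20

# Simple approximation used in the text: risk accumulates linearly
linear = annual_risk * years                 # 0.04, i.e. about 1 in 25

# Compounded ("at least one ulcer") calculation under independence
compounded = 1 - (1 - annual_risk) ** years  # slightly below 0.04

print(f"linear: {linear:.3f}, compounded: {compounded:.3f}")
```

For a rare annual event, the linear and compounded figures barely differ, which is why the simple multiplication in the text is a reasonable approximation.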
An early illustration of how epidemiologic research influenced public policy was provided by the investigation of Acanthamoeba contamination of contact lens solutions. The majority of contact lens-related MK cases are bacterial. However, Acanthamoeba, a free-living amoeba commonly found in fresh water and soil, causes a relatively small number of cases, which are difficult to treat. Acanthamoeba keratitis cases related to contact lens wear were initially seen in the early to mid-1980s [25,26]. In the 1980s, many patients made their own contact lens saline solution from distilled water and salt tablets (for use in heat disinfection units). This was considerably less expensive than purchasing commercial contact lens saline solution. In an early case-control study of 27 cases and 81 matched controls conducted by the Centers for Disease Control, investigators identified the use of home-made nonsterile saline solution as a highly significant risk factor. Based largely upon this study, in 1987 the FDA sent eye care practitioners a safety alert concerning home-made saline, and requested stronger warnings on salt tablet containers. Since then, patient use of home-made saline has become almost nonexistent. This decline was also related to the gradually decreasing use of home heat disinfection as chemical disinfection products came to dominate the market (note that, except for the early Poggio study, all of the MK incidence rates found in the previously cited studies were determined after the risks of home-made saline were well known).
Microbial keratitis in contact lens wear is thought to be related to factors such as corneal hypoxia, contamination of contact lenses and solutions, changes in the tear film, and microtrauma . Hypoxia (lack of oxygen flow to the cornea under the contact lens) has generally been considered to be the most important factor. With the development of new silicone-containing contact lens materials in the late 1990s, many clinicians believed that safer extended wear of contact lenses might be at hand. The silicone-containing lenses could pass many times more oxygen to the cornea than lenses made from conventional materials. For the first time since the 1980s, manufacturers requested approval from the FDA of up to 30 days of continuous wear for some of these 'hyper-permeable' contact lenses.
The FDA realized that conventional premarket studies of 400-800 patients would be insufficient to determine the risk of MK in any new 30 day continuous wear lenses. In order to both safeguard public health and provide a 'least burdensome' approach for the introduction of new technology, the FDA offered manufacturers two options. They could either have an unconventionally large preapproval study, or have a more conventionally sized preapproval study, with approval contingent on the requirement to run a large postapproval, population-based study to determine the incidence of ulcerative disease. To date, all manufacturers have chosen the latter option, using significant infiltrative keratitis (including non-infectious and infectious cases) as a surrogate endpoint in premarket studies for these new lenses.
An FDA advisory panel met in November of 2000 to discuss the nature of any postapproval studies. Two different epidemiologic approaches were suggested for these studies. The first was to prospectively follow a large cohort of 30 day lens wearers for at least 1 year to accumulate enough 'patient-years' of exposure to make a reasonable estimate of the disease incidence. The second was to do a 'case-control' study to assess the relative risk of the new hyperpermeable lenses (worn up to 30 days continuously) compared to the previously approved conventional extended wear lenses (worn up to 7 days). The advisory panel recommended that the new lenses should not have a rate of MK substantially higher than that in the currently approved 7 day wear lenses (generally thought to be on the order of 20 per 10 000 patient-years).
In a case-control study of this type, investigators would collect a sample of 'cases' of extended-wear-related MK and a sample of extended wear contact lens patients without MK as controls. For each group, investigators would determine the number of patients in 7 day wear and the number in 30 day wear. From these numbers, investigators could estimate the relative risk of MK in the two modes of wear.
It may not be immediately apparent how the relative risk can be estimated from this type of study. Consider the situation in which the entire population of extended wear patients is available. They could be placed in a table of the following type:
                         MK    No MK
7 day extended wear       a      b
30 day extended wear      c      d
Here, a, b, c, and d represent the number of patients in each category. The proportion of 7 day wearers with MK is a/(a + b); likewise, the proportion of 30 day wearers with MK is c/(c + d). The relative risk of 30 day to 7 day wear is:

RR = [c/(c + d)] / [a/(a + b)]
This cannot be directly estimated from the case and control sample data. However, consider the 'odds' that a 7 day wearer will get MK, as opposed to not getting MK. These 'odds' are a/b. Similarly, the 'odds' of a 30 day wearer getting MK are c/d. The 'odds ratio' of 30 day wear to 7 day wear is:

OR = (c/d) / (a/b)
Notice that this ratio is mathematically equivalent to:

OR = (c/a) / (d/b)
The numerator of the last expression (c/a) can be estimated from a representative sample of extended wear MK cases. Similarly, the denominator (d/b) can be estimated from a representative sample of extended wear patients who do not have MK. Thus, the sample ratio (c/a) / (d/b) provides an estimate of the population 'odds ratio'. Note that when the incidence of a disease is very low (as is the case with MK), the 'odds' of getting the disease, e.g. c/d, are approximately equal to the risk of getting it, e.g. c/(c + d), since c << d. Thus, in this case, the 'odds ratio' calculated from the case-control data can provide a good estimate of the 'relative risk'.
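A quick numerical check illustrates why this approximation works for a rare disease such as MK. The 2 × 2 cell counts below are hypothetical, chosen so that the true relative risk is exactly 4:

```python
# Numerical check that the odds ratio approximates the relative risk when
# the disease is rare. Cell counts a, b, c, d are hypothetical.

a, b = 20, 9980   # 7 day wearers: with MK, without MK
c, d = 80, 9920   # 30 day wearers: with MK, without MK

relative_risk = (c / (c + d)) / (a / (a + b))   # true RR from the full population
odds_ratio = (c / d) / (a / b)                  # OR, estimable from case-control data

print(f"RR = {relative_risk:.3f}, OR = {odds_ratio:.3f}")
```

With incidences of 20 and 80 per 10 000, the odds ratio differs from the relative risk by well under 1%.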
The case-control study design has a number of points in its favor, and is often the method of choice for studying diseases of low incidence. Aside from permitting an assessment of the relative risk of different modes of wear, it also permits the estimation of the risk presented by other co-factors (actual length of continuous wear, hygiene, patient compliance, age, etc.), uses patients in an uncontrolled 'real world' environment, and can be conducted with a small sample size at relatively low cost. Disadvantages include the following:
• Estimates can be prone to several types of selection bias. In particular, if both the exposure factor and the outcome are causes for referral, a type of bias known as Berkson's bias results.
• The actual incidence of the disease cannot be ascertained.
• If the number of patients in any one of the cells is very small, the confidence interval for the relative risk will be very large.
The reason for this last factor can be clarified by considering the equation for the variance of the log of the 'odds ratio':

Var[ln(OR)] = 1/a + 1/b + 1/c + 1/d
As can be seen, if any one of a, b, c, or d is very small, the variance will be quite large. With regard to 30 day lenses, the FDA believed that it would take quite a few years (at least 3-5) for the new lenses to establish a substantial market share and thereby produce enough '30 day ulcers' to make the 'c' number in the above table large enough to produce a reasonable variance.
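This behavior is easy to demonstrate with Woolf's standard formula for the variance of the log odds ratio, Var[ln(OR)] = 1/a + 1/b + 1/c + 1/d. The cell counts below are hypothetical; the second call shrinks the 'c' cell to mimic having observed only a few 30 day ulcers:

```python
import math

# Woolf's variance for the log odds ratio, showing why a small cell
# count produces a wide confidence interval. Counts are hypothetical.

def or_with_ci(a, b, c, d, z=1.96):
    """Return (odds ratio, lower 95% limit, upper 95% limit)."""
    or_ = (c / d) / (a / b)
    var = 1 / a + 1 / b + 1 / c + 1 / d      # Var[ln(OR)], Woolf's formula
    half = z * math.sqrt(var)
    return or_, math.exp(math.log(or_) - half), math.exp(math.log(or_) + half)

# Moderate 'c' cell vs. a very small 'c' cell (few 30 day ulcers observed)
print(or_with_ci(20, 9980, 80, 9920))   # fairly tight interval around the OR
print(or_with_ci(20, 9980, 4, 496))     # small c: much wider interval
```

Both calls yield roughly the same odds ratio, but the interval from the small-cell table is several times wider, which is exactly the FDA's concern about waiting for enough 30 day cases to accumulate.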
As opposed to the case-control design, prospectively following a large cohort has a number of advantages. It can:
• Directly assess the actual risk of the disease.
• Achieve results in a timely manner, perhaps within 2-3 years of initiation.
However, a prospective study might be subject to some of the same problems that can occur in a controlled premarket study: possible self-selection of more responsible, motivated patients or practitioners, and a relatively controlled follow-up environment. It also requires a larger patient sample with resulting increased cost.
After considering all the factors, the FDA recommended the prospective cohort study design, although it is still open to other approaches. Probably the most important consideration was the importance of getting the information in a timely manner, as a case-control design would likely necessitate a longer delay after approval. In the recently initiated postapproval studies, each protocol involved approximately 100 monitoring sites and was designed to collect data on 4500-5000 patient-years of 30 day wear. Data were to be collected from a variety of clinical settings, such as private optometry and ophthalmology practices, and commercial optical chains. Endpoints were defined as cases of presumed microbial keratitis and cases of reduction in visual acuity.
Any study of contact lens-related microbial keratitis has the problem of distinguishing noninfectious (sterile) ulcers from infectious ones (MK). Infectious ulcers are more likely than sterile ulcers to be central, large, and associated with significant inflammation within the eye (anterior chamber reaction) and severe pain. The ulcers that give a less severe clinical impression are often referred to as 'contact lens peripheral ulcers' (although lesions given this designation are not always 'peripheral') and are often considered to be sterile. Presumed infectious ulcers often do not grow significant bacteria when cultured, and 'contact lens peripheral ulcers' are often culture-positive. Since the clinical features of the two types of keratitis overlap, both are usually treated vigorously with topical antibiotic drops. It has been argued that some of the population-based studies may have included significant numbers of non-infectious ulcers, thereby artificially inflating the estimates of the incidence of MK. The FDA is attempting to ensure that each postapproval study makes the distinction between infectious and sterile ulcers in as consistent and objective a manner as possible.
The first postapproval study for a 30 day continuous wear contact lens, a prospective study conducted by CIBA Vision, was published in December 2005. The study used 131 centers to recruit 6245 participants, who ultimately accumulated 5561 patient-years of exposure (4292 patient-years for typical continuous wear of at least 3 weeks). Questionnaires were used at 3 and 12 months to ascertain typical time for continuous wear, and to determine the occurrence of red or painful eyes. For patients who reported having had red or painful eyes, medical records were obtained. The study used a special Endpoint Adjudication Committee of recognized experts in order to categorize infiltrative cases. For all cases with a corneal infiltrate, this committee reviewed the medical records in detail and made a determination concerning likelihood of etiology, based upon predetermined guidelines. Ten patients in the study were classified as having MK; two of these experienced vision loss and eight did not. Thus, the incidence of patients with presumed MK for the study (all patients, regardless of wearing schedule) was 10 per 5561 patient-years, or 18.0 per 10 000 patient-years (95% CI = 8.5-33.1). Similarly, the incidence of vision loss related to MK was 3.6 per 10 000 patient-years (95% CI = 0.4-12.9). There were a large number (56) of incidents of indeterminate etiology. It is possible that a few of the 'indeterminate' cases were infectious, but the Endpoint Adjudication Committee made the infectious/sterile distinction using criteria that could be applied reproducibly (e.g. lesion size and anterior chamber reaction).
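The headline point estimates can be reproduced directly from the reported counts (the confidence intervals require an exact Poisson method and are not recomputed here):

```python
# Reproducing the point estimates reported for the CIBA Vision
# postapproval study from its published counts.

patient_years = 5561
mk_cases = 10
vision_loss_cases = 2

mk_rate = mk_cases / patient_years * 10_000            # per 10,000 patient-years
vision_loss_rate = vision_loss_cases / patient_years * 10_000

print(f"MK: {mk_rate:.1f}, vision loss: {vision_loss_rate:.1f} per 10,000 patient-years")
```

These reproduce the published 18.0 and 3.6 per 10 000 patient-year figures.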
We note that the estimated incidence of MK is highly dependent upon the definition used for the clinical condition. Because the definitions of MK used in a number of earlier studies (e.g. [17,18]) were somewhat more restrictive than the one used in the CIBA postapproval study, it is difficult to compare the different study rates directly. It is encouraging that the prospectively determined rate of vision loss in this postapproval study was low. However, it should also be noted that on the order of 1% of patients in any given year may be given antibiotic treatment for infiltrative keratitis.
In summation, epidemiologic methods have proved to be of great value in the investigation of contact lens-related corneal ulcers. Past studies have had a significant influence upon public policy in the USA and the FDA continues to use population-based methods to help better evaluate the relevant risks. Future studies will further clarify the public health issues related to contact lens wear.