Why Do So Many Humans Need Glasses?

When I was very young, I was given an assignment in school to write a report on the Peregrine Falcon. One interesting fact about this bird is that it’s quite fast: when it spots prey (sometimes from over a mile away), it can enter a high-altitude dive, reaching speeds in excess of 200 mph, and snatch its prey out of midair (if you’re interested in watching a video of such a hunt, you can check one out here). The Peregrine would be much less capable of achieving these tasks – both locating and capturing prey – if its vision were not particularly acute: failures of eyesight can result in not spotting the prey in the first place, or in failing to capture it if distances and movements aren’t properly tracked. For this reason I suspect (though am not positive) that you’ll find very few Peregrines with bad vision: their survival depends very heavily on seeing well. These birds would probably not be in need of corrective lenses, like the glasses and contacts that humans regularly rely upon in modern environments. This raises a rather interesting question: why do so many humans wear glasses?

And why does this human wear so many glasses?

What I’m referring to in this case is not the general degradation of vision with age. As organisms age, all of their biological systems should be expected to break down and fail with increasing regularity, and eyes are no exception. Crucially, these systems should all be expected to break down at more or less the same time. This is because there’s little point in a body investing loads of metabolic resources into maintaining a completely healthy heart that will last for 100 years if the liver is going to shut down at 60. The whole body will die if the liver does, healthy heart (or eyes) included, so it would be adaptive to allocate those developmental resources differently. The mystery posed by frequently-poor human eyesight is appreciably different, as poor vision can develop early in life, often before puberty. When you observe apparently maladaptive development early in life like that, it requires another type of explanation.

So what might explain why human visual acuity appears so lackluster early in life (to the tune of over 20% of teenagers using corrective lenses)? There are a number of possible explanations we might entertain. The first of these is that visual acuity hasn’t been terribly important to human populations for some time, meaning that having poor eyesight did not have an appreciable impact on people’s ability to survive and reproduce. This strikes me as a rather implausible hypothesis on the face of it, not only because vision seems rather important for navigating the world, but also because it ought to predict that poor vision would be something of a species universal. While 20% of young people using corrective lenses is a lot, it is still a minority; eyes (and the associated brain regions dedicated to vision) are costly organs to grow and maintain, and if they truly weren’t that important to have around, we might expect everyone to need glasses to see better, not just pockets of the population. Humans don’t seem to resemble the troglobites that have lost their vision after living in caves away from sunlight for many generations.

Another possibility is that visual acuity has been important – it’s adaptive to have good vision – but people’s eyes sometimes fail to develop properly because of developmental insults, like infectious organisms. While this isn’t implausible in principle – infectious agents have been known to disrupt development and result in blindness, deafness, and even death on the extreme end – the sheer number of people who need corrective lenses seems a bit high to be caused by some kind of infection. Further, the number of younger children and adults who need glasses appears to have been rising over time, which might seem strange as medical knowledge and technologies have been steadily improving. If the need for glasses were caused by some kind of infectious agent, we would need to have remained unaware of its existence and to have never accidentally treated it with antibiotics or other such medications. Further, we might expect glasses to be associated with other signs of developmental stress, like bodily asymmetries, low IQ, or other such outcomes: if your immune system didn’t fight off the bugs that harmed your eyes, it might not be good enough to fight off other development-disrupting infections. However, there seems to be a positive correlation between myopia and intelligence, which would be strange under a disease hypothesis.

The negative correlation with fashion sense begs for explanation, too

A third possible explanation is that visual acuity is indeed important for humans, but our technologies have been relaxing the selection pressures that were keeping it sharp. In other words, once humans invented glasses and gave those who cannot see well a crutch to overcome the issue, any reproductive disadvantage associated with poor vision was effectively removed. It’s an interesting hypothesis, and it predicts that people’s eyesight in a population should begin to get worse following the invention and/or proliferation of corrective lenses, but not beforehand. So, if glasses were invented in Italy around 1300, that should have led to the Italian population’s eyesight growing worse, followed by the eyesight of other cultures as glasses spread to them. I don’t know much about the history of vision across time in different cultures, but something tells me that pattern wouldn’t show up if it could be assessed. In no small part, that intuition is driven by the relatively brief window of historical time between when glasses were invented – and subsequently refined, produced in sufficient numbers, and distributed globally – and today. A window of only about 700 years for all of that to happen and then reduce selection pressures on vision isn’t a lot of time. Further, there seems to be evidence that myopia can develop rather rapidly in a population, sometimes in as little as a generation:

One of the clearest signs came from a 1969 study of Inuit people on the northern tip of Alaska whose lifestyle was changing. Of adults who had grown up in isolated communities, only 2 of 131 had myopic eyes. But more than half of their children and grandchildren had the condition.

That’s much too fast for a relaxation of selection pressures to be responsible for the change.
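
To put a rough number on that intuition, here is a minimal back-of-the-envelope sketch in Python. The model and the fitness values are entirely hypothetical: it imagines a single heritable "myopia" variant enjoying an implausibly large 10% reproductive advantage, which is far more than a mere relaxation of selection (relaxed selection implies roughly zero fitness difference, and so roughly zero expected change per generation).

    # Back-of-envelope sketch: how quickly can selection alone shift a trait's
    # frequency? Simple one-locus haploid model with a *hypothetical* 10%
    # fitness advantage (s = 0.10) for the trait.
    p = 2 / 131   # starting prevalence, from the 1969 Inuit study (~1.5%)
    s = 0.10      # hypothetical selection coefficient, not a measured value

    generations = 0
    while p < 0.50:
        # Standard haploid selection update: the favored type's relative
        # representation grows by a factor of (1 + s) each generation.
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        generations += 1

    print(f"Generations to exceed 50% prevalence: {generations}")  # prints 44

Even under that generous assumption, the shift takes on the order of forty generations – roughly a millennium – whereas the Inuit data show it happening in one or two. Whatever changed, it wasn’t the gene pool.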

This brings us to the final hypothesis I wanted to cover today: an evolutionary mismatch hypothesis. In the event that modern environments differ in some key ways from the typical environments humans have faced ancestrally, it is possible that people will develop along an atypical path. In this case, the body is (metaphorically) expecting certain inputs during its development, and if those inputs aren’t received, things can go poorly. For instance, it has been suggested that people develop allergies, in part, as a result of improved hygiene: our immune systems are expecting a certain level of pathogen threat which, when not present, can result in our immune systems attacking inappropriate targets, like pollen.

There does seem to be some promising evidence on this front for understanding human vision issues. A paper by Rose et al. (2008) reports on myopia in two samples of similarly-aged Chinese children: 628 children living in Singapore and 124 living in Sydney. Of those living in Singapore, 29% appeared to display myopia, relative to only 3% of those living in Sydney. These dramatic differences in rates of myopia are all the stranger when you consider that the rates of myopia in their parents were quite comparable. For the Sydney/Singapore samples, respectively, 32/29% of the children had no parent with myopia, 43/43% had one parent with myopia, and 25/28% had two parents with myopia. If myopia were simply the result of inherited genetic mutations, its frequency shouldn’t differ between countries as much as it does, which counts against hypotheses one and three from above.
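
For the curious, the size of that gap is easy to sanity-check. Here is a quick sketch using scipy, with the cell counts reconstructed from the percentages reported above (so treat them as approximations rather than the paper’s exact figures):

    from scipy.stats import chi2_contingency

    # Approximate counts reconstructed from the reported rates:
    # Singapore: ~29% myopic of 628 children; Sydney: ~3% myopic of 124.
    table = [
        [182, 446],  # Singapore: [myopic, not myopic]
        [4,   120],  # Sydney:    [myopic, not myopic]
    ]

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, p = {p_value:.2g}")  # p is vanishingly small

The child rates differ about as decisively as rates can differ, while the parental rates are nearly identical across the two samples – exactly the pattern you’d expect if environment, rather than inheritance, is doing the heavy lifting.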

When the authors examined the behavioral correlates of myopia between countries, several were statistically – but not practically – significant, including the number of books read and hours spent on computers or watching TV. The only appreciable behavioral difference between the two samples was the number of hours the children tended to spend outdoors: in Sydney, the children spent an average of about 14 hours a week outside, compared to a mere 3 hours in Singapore. It might be the case, then, that the human eye requires exposure to certain kinds of stimulation provided by outdoor activities in order to develop properly, and that some novel aspects of modern culture (like spending lots of time indoors in school when children are young) reduce such exposure. This might also explain the aforementioned IQ correlation: smarter children may be sent to school earlier. If that were true, we should expect that providing children with more time outdoors when they are young would be preventative against myopia, which it actually seems to be.
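
If the "statistically – but not practically – significant" phrasing seems odd, the culprit is sample size: with several hundred children, even trivially small correlations clear the conventional p < .05 bar. A quick illustration in Python (the r = 0.1 below is an example value of mine, not a figure from the paper):

    import math
    from scipy import stats

    # With n in the hundreds, even a tiny correlation is "statistically significant".
    n, r = 628, 0.10  # hypothetical small correlation in a Singapore-sized sample
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)     # t-statistic for a correlation
    p_value = 2 * stats.t.sf(abs(t), df=n - 2)         # two-tailed p-value
    print(f"r = {r}, t = {t:.2f}, p = {p_value:.3f}")  # p ~ 0.012: "significant"

A correlation of 0.1 explains about 1% of the variance – statistically detectable, practically negligible – which is why only the large difference in outdoor time stands out as a candidate cause.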

Natural light and no Wifi? Maybe I’ll just go blind instead…

It should always strike people as strange when key adaptive mechanisms appear to develop, early in life, along an atypical path that ultimately makes them worse at performing their function. An understanding of which types of biological explanations can account for these early maladaptive outcomes goes a long way in helping you figure out where to begin your search and what patterns of data to look out for.

References: Rose, K., Morgan, I., Smith, W., Burlutsky, G., Mitchell, P., & Saw, S. (2008). Myopia, lifestyle, and schooling in students of Chinese ethnicity in Singapore and Sydney. Archives of Ophthalmology, 126, 527-530.
