Edouard Manet’s “The Beach at Boulogne” (1868) depicts fully clothed men, women and children sporting parasols while at the beach.

Changes in clothing styles, attitudes toward tanned skin, leisure activities and other cultural factors are linked to the rising rates of melanoma in developed countries, including the United States, according to a fascinating study published Monday in the American Journal of Public Health.

Understanding those trends may help public health officials develop more effective strategies for reversing our society’s continued and dangerous worship of the sun, the study also suggests.

The last 100 years have seen an alarming increase in the incidence of melanoma, the deadliest form of skin cancer. Between the 1930s and 1960s, the melanoma rate in the U.S. shot up more than 300 percent in men and 400 percent in women, followed by a jump of another 244 percent in men and 167 percent in women between the 1960s and 1990s.

And it has continued to rise in the 21st century. Between 2000 and 2009, the number of people diagnosed with melanoma climbed an average of almost 2 percent per year, according to the National Cancer Institute.

Of course, some of this increase is the result of earlier diagnosis and better reporting of the disease. But those factors don’t explain all of it. The new study attempts to figure out what else might have been involved.

For the study, researchers at New York University analyzed various socioeconomic changes from the early 1900s until now, including trends in fashion, medicine and leisure-time activities. They then used that data to estimate how those changes might have affected people’s exposure to ultraviolet (UV) radiation, the leading cause of melanoma.

Revealing more and more skin

The study is divided into four historical periods. Here’s a very brief summary of the findings for each period:

Pre-1900s through 1910.  During this period, pale, porcelain-like skin was in vogue. A stigma was attached to tanned skin, primarily because it was associated with manual labor. People walked about with parasols (if they could afford them) and lots of layered clothing. Swimwear exposed only 23 percent of men’s and 18 percent of women’s total skin surface. Other kinds of sportswear covered all but 9 percent of both men’s and women’s skin. To prevent sunburn and freckling, people wore sunscreens that contained either white petroleum or almond oil combined with a heavy powder composed of magnesium, zinc oxide or bismuth. But few people had much time for swimming and other outdoor leisure activities, the study’s authors point out, for late 19th-century reformers declared that “work was more important than play and warned the public about social dangers that could result from idle time.”

Then, around the turn of the 20th century, scientists began promoting sunlight as a way of inhibiting the growth of certain microorganisms and of treating diseases, most notably rickets and painful skin lesions caused by a condition called lupus vulgaris. By 1903, “heliotherapy” (sunlight therapy) was being prescribed for patients with tuberculosis, and by 1910, the medical journal Lancet declared, “The face browned by the sun is regarded as an index of health.”

1910 to late 1930s. During this period, UV phototherapy was fully embraced by the medical community, so much so that by 1939, the editors of the “Ladies’ Home Journal” were telling mothers that “the sunbath is just as important as the water bath” for their children. Companies began making and selling sunlamps that people could use in their homes. Some dermatologists expressed concerns that all this UV worship was causing “sunlight cancer” as well as premature wrinkling of the skin, but their warnings were largely ignored. By the 1920s, having a tan had become a fashion statement and a symbol of wealth and leisure. For example, the upper-class characters in F. Scott Fitzgerald’s book “The Beautiful and Damned,” which is set in the years 1913 and 1914, discuss how to achieve a tan.

Leisure time also changed dramatically during this period. Men saw the length of their work week fall from 62 hours in 1880 to 42.5 hours by 1940, and many jobs now came with paid vacations. Outdoor activities became a popular way of filling those newly acquired leisure hours, and to meet the demand, communities across the country built parks, tennis courts, baseball fields and swimming pools. Clothing changed as well, with swimwear now exposing 47 percent of the skin surface of both men and women. Sportswear was equally revealing.

1940s through 1970s. This period saw more travel to warm destinations and greater access to outdoor activities. Sales of boating and camping gear soared as people took to the great outdoors, and amusement parks became favorite family destinations. (Disneyland opened in 1955.) Participation in organized sports — baseball, football and basketball — increased rapidly, partly, say the authors of this study, as a means of promoting “the American way of life” during the Cold War.

Clothing became more and more revealing. Interestingly, many of the changes — such as the T-shirt as acceptable outerwear for men and the two-piece bathing suit for women — were a direct result of fabric rationing during World War II. In 1946, a French designer introduced the bikini, which became widely accepted in the U.S. by the 1960s. Swimwear now exposed 80 percent of women’s bodies (92 percent for those who wore bikinis) and 89 percent of men’s. (Men could now go swimming bare-chested.)

During this period, however, the scientific evidence about tanning’s link to skin cancer grew to the point where it was hard for both the medical community and the public to ignore. People began to (mistakenly) believe that wearing one of the many new sunscreens that were being rapidly developed and marketed would protect them from the danger. But, as already noted, the incidence of melanoma — and deaths from the disease — kept climbing.

1980s to present. Earlier trends regarding sun exposure continued. The one new trend that has developed in these more recent decades is the proliferation of indoor tanning beds. In 1981, 10 new tanning centers opened each week in the U.S., and by 1988, there were more than 18,000 such centers across the country. Currently, some 28 million Americans tan indoors each year. A disproportionate number of them are teenage girls or young women, a factor that may explain why melanoma rates among women aged 15 to 39 increased 3.6 percent per year between 2000 and 2009.

A historical framework

By the start of the 21st century, surveys indicated that well over half of Americans believed that people look healthier with a tan — a sentiment that “stands in stark contrast to the negative social context of tanned skin that defined the early 1900s,” write the NYU researchers.

Although this study can’t establish a direct causal link, it does, say its authors, provide a “historical framework” for how changing socioeconomic factors have contributed to the disease.

That framework, they add, may lead to “public health and educational measures that may ultimately help reverse melanoma incidence trends.”

A new law in Minnesota, which went into effect Aug. 1, bans indoor commercial tanning for people under 18. Several other states have adopted similar laws.

Public education efforts have already shown evidence of working in Australia, which is the only developed country that has seen a drop (albeit a small one) in its skin cancer rates — a drop directly attributed to the aggressive “slip, slop, slap” skin-cancer-prevention campaign launched there in 1980.

You can read the NYU study on the American Journal of Public Health’s website.
