Beach Bummer

New evidence suggests that sunscreens don’t prevent skin cancer and may even promote some forms of it. The manufacturers know it. Some researchers know it. Why don’t consumers?

Anyone who’s ever heard a smoke alarm go off knows how horrid its whine can be. Why keep it around? Because a little unpleasantness can prevent injury or death in a fire. But in the last few decades, millions of people who cherish smoke detectors may have disabled one of nature’s equally protective, if annoying, alarms. They’ve rubbed on sunscreen, never thinking that sunburns, like smoke alarms, might prevent a greater harm.

Ironically, sunscreen devotees have turned off their dermatological smoke detectors in the name of preventive medicine. Sunburn, experts say, is a key risk factor for malignant melanoma, the potentially fatal skin cancer that’s become a headline-grabbing epidemic since 1980. Forget what used to be called a “healthy tan.” Today, experts insist, we’re paying for decades of naive, post-World War II, beach-blanket sun worship with an unprecedented melanoma rate. But there’s hope, they tell us. If you can’t avoid the midday sun, pour on the sunscreen.

Unfortunately, the public health authorities who urge routine, liberal use of sunscreen (especially on children) fail to mention that sunscreens have never been shown to prevent melanoma. The medical research community knows this. The Food and Drug Administration knows it. And sunscreen makers know it. Yet, as a result of scientific myopia, bureaucratic inertia, and the almighty bottom line, they’ve essentially told us to use sunscreen and not to worry.

But two San Diego epidemiologists, Cedric and Frank Garland, are worried. Best known for their work linking sunshine with the prevention of breast and colon cancer, the Garland brothers (with research associate Edward Gorham) have compiled a body of evidence suggesting that sunscreens dupe the public into believing they’re covered by state-of-the-art melanoma protection, when, in fact, they may be highly vulnerable to the disease. Even worse, the Garlands’ research suggests that sunscreen use just might promote melanoma.

Unfortunately for the public health, the Garlands refuse to discuss their theory for fear of professional ostracism. After they presented their case against sunscreens at a 1990 epidemiological meeting in Los Angeles, both the New York Times and the Washington Post ran articles explaining their theory. Epidemiologists accused the Garlands of grandstanding for speaking to the press before publishing their analysis in a scientific journal. Stung by this criticism (which could threaten funding of their other work), the brothers have since avoided journalists. But anyone who examines the Garlands’ claims might feel, well, burned by sunscreens.

Meanwhile, most dermatologists, epidemiologists, and sunscreen makers continue to suggest that sunscreens prevent melanoma. With a $380 million market at stake, the sunscreen industry in particular has an interest in keeping the Garlands’ argument out of the public eye. Perhaps, as the industry claims, sunscreens prevent melanoma; perhaps they promote it. No one knows for certain, but worse, almost no one is trying to find out. So before you rub on another drop of sunscreen, consider the evidence. Because sunscreen makers are watching out for everything except your health.

 

Malignant melanoma’s dark, mole-derived tumors are the fastest-rising cancer under the sun. From 1975 to 1992, the number of melanoma cases reported annually in the U.S. tripled–a faster increase than that of any other cancer. Since the 1950s, melanoma rates have also risen dramatically among fair-skinned Australians, Brits, Canadians, and Scandinavians (it is extremely unusual for dark-skinned people to get skin cancer).

Melanoma now strikes thirty-two thousand Americans each year and kills sixty-eight hundred. But before 1950 it was quite rare. Two other skin cancers, basal and squamous cell skin tumors, were dermatologists’ major concern. These slow-spreading cancers usually occur in white men over forty-five who work outdoors or live near the equator. They are by far the nation’s most prevalent cancers, with 600,000 new diagnoses each year, but they rarely prove fatal, with successful treatment in 99 percent of cases.

Around the turn of the century, doctors linked risk of basal and squamous cell tumors with lifetime sun exposure–the more sun, the more risk. They also discovered a far rarer–and more fatal–skin cancer, later dubbed malignant melanoma, which they believed had nothing to do with sunlight because it usually appeared in people who spent little time in the sun. Victims of the fatal cancer had only two things in common–fair skin and red or blonde hair. Doctors concluded that the cancer was a consequence of being fair-skinned and light-haired.

But by the late 1960s, numerous studies showed a connection between melanoma and the ultraviolet radiation in sunlight, demonstrating, for example, that whites who live near the equator have higher melanoma rates than those in temperate climes.

The fact that outdoor workers rarely develop melanoma was apparently explained when researchers shifted their attention from sunlight to sunburn. They hypothesized that deeply tanned skin protects against melanoma, even though it increases the risk of basal and squamous cell cancers. Indoor office workers have brief, intense exposures to the sun–the kind that causes sunburn, which in turn could lead to melanoma.

During the 1970s, scientists used the sunburn theory to explain the dramatic rise in the melanoma rate in the second half of this century. After World War II, they argued, record numbers of Americans became white-collar workers, which limited their sun exposure and, as a result, increased their risk of weekend sunburns. In addition, sunbathing became a national pastime, and women’s swimsuits became more revealing.

Melanoma has been associated with teenage sunburns, but the median age at diagnosis is in the forties, so researchers concluded that melanoma, like many cancers, takes decades to develop. Estimating a twenty-five- to thirty-year lag time, proponents of the sunburn theory claim that postwar sunbathing resulted in the melanoma epidemic of the 1980s.

 

There is no animal model for melanoma (as there is for squamous cell skin cancer, which mice can contract), so it is impossible to conduct laboratory experiments to discover exactly what causes the disease. And, although the sunburn theory is an advance over the “fate of the fair-skinned” theory, it fails to explain a few things.

For instance, sunburn was a common medical problem long before people started wearing bikinis. Turn-of-the-century medical texts dealt with it as a fact of life, and folk medicine abounds with remedies. Yet melanoma was extremely rare before 1950.

Many of the social changes sunburn-theory supporters attribute to the 1950s actually occurred about thirty years earlier. Sunbathing first became popular during the 1920s, thanks to fashion designer Coco Chanel, who launched a tanning chic after returning from a yachting vacation with a golden tan. Assuming a lag time of twenty-five to thirty years, the melanoma rate should have risen considerably starting in the mid-1940s. It didn’t.

Furthermore, several studies suggest that melanoma actually may have a short lag time. Sunspots, which cause complex effects in the upper atmosphere, appear cyclically on the sun’s surface about every eleven years. A study of sunspot activity between 1935 and 1975 showed that every sunspot cycle was followed a few years later by a small but significant increase in the melanoma rate.

A Scottish study corroborated the idea of a short lag time by finding a “highly significant” correlation between melanoma diagnoses and severe sunburns occurring just five years earlier. Melanoma diagnoses in fifteen- to twenty-four-year-olds have increased noticeably since 1973. And studies in Sweden, Hawaii, and the continental U.S. have shown consistent seasonal patterns in melanoma diagnoses, another hallmark of biological events with short lag times.

But if melanoma has a lag time of only a few years, then the explosive increases of the 1960s, 1970s, and 1980s can’t be blamed on changes in beach attire of the 1920s or the 1940s. The factor that accounts for these changes must have appeared in the middle or late 1950s and become gradually more significant as time has progressed.

No one knows what this factor is. But Cedric and Frank Garland are afraid that it may be sunscreen use.

 

Most sunscreens block only the UVB rays that cause sunburn–about 5 percent of ultraviolet radiation. The other 95 percent of the UV spectrum, UVA, has long been thought to play a minor role in sunburn, so sunscreens block only a small portion of it (see sidebar). But studies have shown that UVA may play an important role in skin cancer. UVA radiation penetrates more deeply into the skin than UVB, down to the melanocytes, the cells that turn cancerous in melanoma.

Scientists have yet to identify exactly what corrupts healthy melanocytes, largely because there is no animal model for melanoma. But mice develop nonmelanoma skin cancers under UV light.

Proponents of the sunburn theory are quick to point to a Danish study in which sunscreen was shown to delay (but not completely prevent) the development of squamous cell tumors in mice exposed to artificial sunlight. The higher the sunscreen’s sun protection factor, the longer it took the mice to develop tumors. To date, this is the closest scientists have come to establishing the preventive value of sunscreens.

However, another study at the same lab should give sunscreen advocates pause. In this experiment, mice exposed to artificial sunlight developed a small number of squamous cell tumors. But mice exposed to artificial sunlight followed by additional UVA developed more than twice as many tumors.

Not only does this study suggest that UVA may play a role in skin cancer, it also points to the particular danger of sunlight followed by UVA alone–a cycle similar to that which occurs when people use sunscreen. They hit the beach, playground, or ballfield and remove some clothing, exposing themselves to full-spectrum sunlight. Then they apply sunscreen, blocking UVB, but continuing their exposure to UVA. As the sunscreen wears off, they’re again exposed to full sun. After reapplying sunscreen, they get additional UVA–and possibly cancer.

Of course, mice are not human beings, and squamous cell cancers are not melanoma, so either study (or both) may mean nothing. But melanoma experts trumpet the implications of the first study, that sunscreens help prevent skin cancer, while ignoring those of the second, that sunscreen use fosters a cancer-promoting pattern of UV exposure.

The Garlands have more disturbing news about sunscreen: by impairing the body’s production of vitamin D, it may also remove a defense against cancer. According to studies, vitamin D has a hormone-like effect that interferes with the growth of several tumors, including those associated with melanoma and colon and breast cancers. Although we get small amounts of the vitamin from milk and cold-water fish, most of our bodies’ supply is produced when skin is exposed to UVB. By blocking UVB, sunscreens interfere with vitamin D synthesis. A recent study shows that habitual sunscreen users have unusually low vitamin D levels–sometimes low enough that researchers call them “deficient.”

A sunscreen-melanoma link might also illuminate a peculiar fact unexplained by the sunburn theory: melanoma risk rises with income. Although both professionals and clerical workers work indoors, the former have a significantly higher melanoma rate. Because health consciousness is generally an upper-income phenomenon, sunscreens presumably appeal to the more affluent.

Even if sunscreen is one day shown to protect against melanoma, the Garlands worry that it may give users a dangerously false sense of security. No one knows how large doses of UVA might affect the body. Historically, whites would have been dangerously sunburned long before they received the levels of UVA radiation that they may now get in one sunscreen-wearing day at the beach. Whatever UVA’s role in causing melanoma, the Garlands strongly recommend that you not make yourself a guinea pig.

 

Remember those old Coppertone ads with the puppy pulling down the little girl’s bathing suit? Suntan lotions, introduced in the mid-1950s for cosmetic purposes, were the first commercial use of sunscreens. As sales increased throughout the 1950s and 1960s, so did the melanoma rate.

During the 1970s and 1980s, suntan lotions were repositioned as sunscreens, which, experts said, prevented skin cancer by preventing sunburn. Sales in 1991 were $380 million, more than twice as much as a decade earlier. But as experts persuaded more and more Americans to use sunscreens, melanoma became an epidemic, with new diagnoses roughly paralleling sunscreen sales.

This epidemic has been a godsend for sunscreen makers. According to the journal Drug and Cosmetic Industry, “Every indicator that skin cancer is on the rise, every utterance by a dermatologist . . . seems to reinforce the need for consumers to use more of these products. The missionary work required to double the market [by 1995] has already been done, and not just by the industry.”

Sunscreen makers frankly admit that their products have never been shown to prevent human skin cancers. “The studies show that sunscreens prevent squamous cell cancers in animals,” says Patricia Agin, sunscreen product manager for Schering-Plough, whose brands, including Coppertone, account for one-third of the market. “I think they do the same in humans. But . . . we don’t know for certain. Because there’s no animal model for [melanoma], we don’t know if sunscreens prevent it. I see no reason to think that they wouldn’t, but we have no proof that they do.”

Jack Surrette, marketing vice-president of Tanning Research Laboratories (makers of Hawaiian Tropic sunscreens), goes further. “To some extent, when you protect only for UVB, it would seem to run a risk for potential skin cancer,” he says. “UVA is a more damaging ray. We may be hurting ourselves by protecting ourselves too well on the UVB side.”

Unfortunately, sunscreen labels do not reflect Agin’s or Surrette’s understanding of the research. They echo the claims of dermatologists and cancer-education organizations: “Regular use may prevent skin cancer.” Of course, when a label says “skin cancer,” sunscreen makers insist that it means nonfatal squamous cell skin cancer. But if the label doesn’t distinguish between melanoma and other skin cancers, then how can consumers be expected to?

Further confusing consumers, most sunscreens carry a seal of approval from the nonprofit Skin Cancer Foundation, which claims to alert people to safe products. More than 130 different suncare products have earned the right to display this seal–for a price. In addition to submitting their products for testing and review, corporations also dole out ten thousand dollars to use the seal. John Epstein, who sits on the Skin Cancer Foundation’s four-member seal review committee, says he can’t recall anyone ever being denied the seal, but companies that file inadequate paperwork must resubmit their requests. (A foundation spokesperson claimed that some companies have withdrawn rather than resubmit their applications.)

The foundation, which boasts celebrity backers such as Tom Selleck, Lauren Bacall, Dick Cavett, Paul Newman, and Joanne Woodward, earns about one-fourth of its $1.7 million budget from corporate donations. Recently the foundation sent sixty thousand elementary schools posters that urge students: “Always use sunscreen when you go outdoors, no matter the season or the color of your skin.”

The Skin Cancer Foundation isn’t the only case of corporate funding blurring the boundary between public health and the bottom line. The Skin Phototrauma Foundation (acronym: SPF) was founded by Ortho Pharmaceuticals (whose parent company, Johnson & Johnson, makes Sundown sunscreens), Procter & Gamble (Bain de Soleil), and Mary Kay Cosmetics (Sun Essentials). Even the American Academy of Dermatology, a medical association that counts 98 percent of dermatologists in the United States and Canada among its members, uses corporate donations to fund its public-education efforts.

So who should consumers turn to for an untainted view of sunscreens? The likely choice would be the Food and Drug Administration. Sometime this year, the FDA plans to release new labeling regulations for the first time since 1978. One proposal would require all sunscreen labels to carry “sun alerts,” warning consumers that preventing sunburn may not protect them against wrinkling and skin cancer. But even after new regulations are released, the public and sunscreen makers will have eighteen months to comment before the FDA issues its final regulations, probably in 1995.

 

When the Garland brothers first presented their case against sunscreens at the 1990 meeting, a few epidemiologists expressed guarded interest. Dr. Leonard Kurland of the Mayo Clinic called their analysis “intriguing and worth exploring further.” But most supporters of the sunburn theory considered the brothers’ argument ludicrous.

The Garlands openly admit that the case against sunscreens is not airtight. The controversial assertion that melanoma has a brief lag time needs corroboration. The study showing that UVA promotes squamous cell tumors in sun-exposed mice may not be applicable to human beings. And a recent study showed no correlation between vitamin D levels and melanoma risk.

But the best theory is the one that answers the most questions, and the sunburn/long-lag-time theory looks shaky. It ignores the studies showing a brief lag time. It doesn’t address why sunburn rarely caused melanoma before the 1950s. It sheds no light on why melanoma risk is linked to income. And it fails to explain why increases in the melanoma rate have so closely paralleled the rise in sunscreen use.

Despite these shortcomings, the sunburn theory continues to be the dominant scientific theory, and it often takes decades to overturn a dominant theory. It took almost twenty years for researchers to accept a connection between sunburn and melanoma, and no one was out there saying, “Get burned. It’s good for you.” Today leading scientists are saying, “Use sunscreens. They’re good for you.” If sunscreens provide a false sense of security, or worse, promote melanoma, convincing the scientific establishment could take well into the next century.

Medical journalist Michael Castleman writes for national magazines and has authored six books, most recently An Aspirin A Day (forthcoming from Hyperion). Kerry Lauerman of Mother Jones contributed additional reporting to this story. To have a copy of this article and the thirty scientific papers on which it is based mailed to you or your dermatologist, send $8 to Mother Jones, 1663 Mission Street, 2d floor, San Francisco, CA 94103.
