Laxatives cause cancer. Caffeine prevents pregnancy. Pizza prevents prostate cancer. Vitamins prevent cancer. Vitamins cause cancer. It seems like we've been bombarded by such claims recently. One report says something's bad for us; another says no, it's really good. What's the truth? Do you need to be a scientist to figure it out? What, and who, should we believe?
A Grain of Salt
Despite their urgent tone, many of these claims actually mean little by themselves. And they usually mean even less to any specific individual. Any single study gives only part of a larger picture that is rarely reported. A study may report that electric power lines, chlorinated tap water or some pesticide increases cancer risk. But often the study is only one of a number of studies, the great majority of which may be contradictory.
Over the past 15 years, many studies have claimed to link electric power lines with brain cancer. But there also have been many that didn't make that link. Even as study techniques have improved, the split in these results has remained, and no study has yet indicated how electric power lines could possibly cause brain cancer. Even where no health risk actually exists, just by chance alone, some studies will falsely report that a risk exists.
In addition to the shortcomings of these studies, there's also a "publication bias," the recognized tendency for medical journals to publish studies that identify risks rather than studies that don't. A 1994 survey by the Academy of Emergency Medicine of the most widely read medical research journals found that 80 percent of published research consists of positive studies.
Scientific knowledge develops incrementally over time and isn't a one-study endeavor. Be wary of any one study. It's likely to mean less than it claims.
One in a Million
Another problem with these ominous studies is that their relevance to the individual is often oversold. Much research relies on epidemiology, the study of disease in actual human populations. But epidemiology's strengths are also its Achilles' heel. Epidemiology is about observed rates of disease in particular small populations, not future risks to all beings.
Although studies may show that, in the past, one in three Americans got cancer, it's not necessarily true that every American has a one-in-three risk of getting cancer. Many individuals will be at greater risk of cancer and many will be at lower risk. And most of the time it's impossible to identify with any accuracy who's at greater risk.
Many researchers like to claim that something "causes" disease. But claims of causation are rarely true. Consider smoking, probably the most heavily studied and highly publicized health risk.
Studies have reported that populations of heavy smokers have gotten lung cancer at a rate 20 times greater than nonsmokers. As a result of such studies, researchers have concluded that smoking "causes" lung cancer. But is the cause-and-effect relationship that simple?
Based on American Cancer Society statistics, one of every 10,000 nonsmokers gets lung cancer. If the rate of lung cancer among heavy smokers is 20 times greater, then out of every 10,000 heavy smokers, 20 will get lung cancer. This means that only 0.2 percent of heavy smokers (20 out of 10,000), less than 1 percent, get lung cancer.
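The arithmetic behind these figures is easy to check. A minimal sketch, using only the numbers as stated in the text (a baseline rate of 1 in 10,000 and a relative risk of 20):

```python
# Illustrative check of the article's smoking arithmetic (figures as stated in the text).
baseline_rate = 1 / 10_000        # lung cancer rate among nonsmokers, per the text
relative_risk = 20                # heavy smokers' rate relative to nonsmokers

smoker_rate = baseline_rate * relative_risk   # 20 out of every 10,000 heavy smokers
print(f"Heavy smokers who get lung cancer: {smoker_rate:.2%}")  # prints 0.20%
```

A twentyfold relative risk thus still translates to a small absolute risk, which is the article's point.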
So, does smoking "cause" lung cancer in the same way that, say, a gunshot causes death? Or is it more likely that some unknown combination of additional factors, including genetics and lifestyle, also plays a role in who gets lung cancer? Genetic predisposition to cancer was recently highlighted in a study published in the Journal of the American Medical Association on smoking and breast cancer.
In reality, cause-and-effect relationships are complex and not well understood. Beware of attempts to simplify them.
Loose Connections
Unbeknownst to the public, studies often depend upon unproven and often undisclosed assumptions. For example, the National Cancer Institute recently concluded that radon gas in homes causes 12,000 lung cancer deaths annually.
Although studies have shown that high levels of radon gas in underground uranium mines increased cancer rates among miners, the typical home isn't an underground uranium mine. How does NCI conclude that 12,000 die annually from radon in the home?
NCI assumed that because high levels of radon increased cancer rates among miners, then any radon exposure increases cancer risk. This assumption, never scientifically validated, is then applied to a mathematical model that calculates the 12,000 figure.
Not only do many studies of radon in homes contradict this assumption, some even suggest that lower lung cancer rates occur in geographic areas with relatively higher levels of radon.
A favorite attention-getting technique of researchers is to estimate "body counts," such as the claims that 300,000 die each year from obesity, 120,000 from inactivity or 60,000 from air pollution. However, body counts depend on the assumption that death rates observed in small study populations are predictive of what happens to the entire population in the future.
Also, body counts are strictly hypothetical. The figures are arrived at using statistics, not by counting bodies. They're not audited to see if they're accurate. In contrast, the 150,000 annual deaths from trauma-induced injuries are countable.
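As a rough illustration of how such a statistical body count is typically produced, an excess death rate from a study population is multiplied out over the whole population. The figures below are hypothetical, chosen only to show the mechanics, not the actual inputs behind any published estimate:

```python
# Hypothetical sketch of how a statistical "body count" is derived.
# All numbers below are assumptions for illustration only.
population = 250_000_000          # assumed national population
exposed_fraction = 0.30           # assumed share of the population with the risk factor
baseline_death_rate = 0.004       # assumed annual death rate among the unexposed
relative_risk = 1.3               # assumed risk ratio from a study population

excess_rate = baseline_death_rate * (relative_risk - 1)   # extra deaths per exposed person
body_count = population * exposed_fraction * excess_rate
print(f"Estimated excess deaths: {body_count:,.0f}")       # prints 90,000
```

No one counts 90,000 bodies; the number exists only as the product of those assumptions, which is why such estimates can't be audited against reality.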
Too Many Variables
Finally, beware of researchers who try to split a hair with an ax. Epidemiology is useful only in studying high disease rates or rare diseases, and preferably both. For example, heavy smokers are reported to have a relatively high rate of the relatively rare disease of lung cancer.
Population data on small risks or common diseases typically have too much "noise" for epidemiologists to filter out. Simply put, epidemiology cannot eliminate all the variability among humans.
Also, researchers often don't really know how much of a risk factor the individuals in a studied population have been exposed to and are forced to guess at exposure levels.
Let's say a study reports that living near a hazardous waste incinerator increases cancer risk to a very high one in 1,000. For practical reasons, the researchers will likely not have measured the extent to which the studied individuals have been exposed to emissions from the incinerator.
On the other hand, one in three Americans develops cancer as a function of being alive. And any population, by chance alone, may have a higher risk than that. How sure can anyone be that an additional one-in-1,000 risk can be traced to incinerator emissions, particularly if actual exposures, let alone their toxicity, aren't known with reasonable certainty?
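The signal-to-noise problem can be made concrete with the figures given in the text: a one-in-1,000 added risk sitting on top of a one-in-three baseline.

```python
# Comparing the claimed added risk to the baseline cancer risk (figures from the text).
baseline_risk = 1 / 3             # lifetime cancer risk for Americans, per the text
added_risk = 1 / 1_000            # claimed extra risk from living near the incinerator

print(f"Added risk as a share of baseline: {added_risk / baseline_risk:.2%}")  # prints 0.30%
```

A 0.3 percent bump on the baseline is far smaller than the ordinary chance variation between populations, which is why epidemiology struggles to detect it.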
A good rule of thumb for epidemiologic studies is that reported increases in risk on the order of 100 percent or less are too small to detect with certainty.
Keep in mind the recent words of Dr. Charles Hennekens of the Harvard School of Public Health: "Epidemiology is a crude and inexact science. We tend to overstate findings either because we want attention or more grant money."
Clearly, there is more to medical research news than just the headlines.
Steven Milloy is president of the Environmental Policy Analysis Network in Washington, D.C., and the publisher of The Junk Science Home Page (http://www.junkscience.com).
Material presented on this home page constitutes opinion of the author.
Copyright © 1997 Steven J. Milloy. All rights reserved. Site developed and hosted by WestLake Solutions, Inc.