Creating "consensus”
Once politicians, the press, and
the public had decided dietary fat policy, the science was left to catch up. In
the early 1970s, when NIH opted to forgo a $1 billion trial that might be definitive
and instead fund a half-dozen studies at one-third the cost, everyone hoped
these smaller trials would be sufficiently persuasive to conclude that low-fat
diets prolong lives. The results were published between 1980 and 1984. Four of
these trials--comparing heart disease rates and diet within Honolulu, Puerto
Rico, Chicago, and Framingham--showed no evidence that men who ate less fat lived
longer or had fewer heart attacks. A fifth trial, the Multiple Risk Factor
Intervention Trial (MRFIT), cost $115 million and tried to amplify the subtle influences
of diet on health by persuading subjects to avoid fat while simultaneously
quitting smoking and taking medication for high blood pressure. That trial
suggested, if anything, that eating less fat might shorten life. In each study,
however, the investigators concluded that methodological flaws had led to the
negative results. They did not, at least publicly, consider their results
reason to lessen their belief in the evils of fat.
The sixth study was the $140
million Lipid Research Clinics (LRC) Coronary Primary Prevention Trial, led by
NHLBI administrator Basil Rifkind and biochemist Daniel Steinberg of the
University of California, San Diego. The LRC trial was a drug trial, not a diet
trial, but the NHLBI heralded its outcome as the end of the dietary fat debate.
In January 1984, LRC investigators reported that a medication called cholestyramine
reduced cholesterol levels in men with abnormally high cholesterol levels and
modestly reduced heart disease rates in the process. (The probability of
suffering a heart attack during the seven-plus years of the study was reduced
from 8.6% in the placebo group to 7.0%; the probability of dying from a heart attack
dropped from 2.0% to 1.6%.) The investigators then concluded, without benefit
of dietary data, that cholestyramine's benefits could be extended to diet as
well. And although the trial tested only middle-aged men with cholesterol
levels higher than those of 95% of the population, they concluded that those
benefits "could and should be extended to other age groups and women and
... other more modest elevations of cholesterol levels."
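For reference, the percentages quoted above can be restated as absolute and relative risk reductions. The short calculation below is an illustration added here; the framing is not how the LRC investigators presented their result (Python):

    # Restating the LRC figures quoted above as absolute and relative risk
    # reductions; the numbers come from the text, the framing is illustrative.
    placebo_mi, treated_mi = 0.086, 0.070        # heart attack over ~7 years
    placebo_fatal, treated_fatal = 0.020, 0.016  # fatal heart attack

    arr_mi = placebo_mi - treated_mi             # 0.016 -> 1.6 percentage points
    rrr_mi = arr_mi / placebo_mi                 # ~0.19 -> roughly a 19% relative drop
    arr_fatal = placebo_fatal - treated_fatal    # 0.004 -> 0.4 percentage points
    rrr_fatal = arr_fatal / placebo_fatal        # 0.20 -> a 20% relative drop

    print(f"Heart attacks: {arr_mi:.1%} absolute, {rrr_mi:.0%} relative reduction")
    print(f"Fatal heart attacks: {arr_fatal:.1%} absolute, {rrr_fatal:.0%} relative reduction")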
Why go so far? Rifkind says their
logic was simple: For 20 years, he and his colleagues had argued that lowering cholesterol
levels prevented heart attacks. They had spent enormous sums trying to prove
it. They felt they could never actually demonstrate that low-fat diets prolonged
lives--that would be too expensive, and MRFIT had failed--but now they had established
a fundamental link in the causal chain, from lower cholesterol levels to cardiovascular
health. With that, they could take the leap of faith from cholesterol-lowering drugs
and health to cholesterol-lowering diet and health. And after all their effort,
they were eager--not to mention urged by Congress--to render helpful advice.
"There comes a point when, if you don't make a deision, the consequences
can be great as well," says Rifkind. "If you just allow Americans to
keep on consuming 40% of calories from fat, there's an outcome to that as
well."
With the LRC results in press, the
NHLBI launched what Levy called "a massive public health campaign."
The media obligingly went along. Time, for instance, reported the LRC findings under
the headline "Sorry, It's True. Cholesterol really is a killer." The
article about a drug trial began: "No whole milk. No butter. No fatty
meats ..." Time followed up 3 months later with a cover story: "And
Cholesterol and Now the Bad News. ..." The cover photo was a frowning
face: a breakfast plate with two fried eggs as the eyes and a bacon strip for
the mouth. Rifkind was quoted saying that their results "strongly indicate
that the more you lower
cholesterol and fat in your diet,
the more you reduce your risk of heart disease," a statement
that still lacked direct scientific
support.
The following December, NIH effectively ended the debate with
a "Consensus Conference."
The idea of such a conference is that an expert panel,
ideally unbiased, listens to 2 days of
testimony and arrives at a conclusion with which everyone
agrees. In this case, Rifkind
chaired the planning committee, which chose his LRC
co-investigator Steinberg to lead the
expert panel. The 20 speakers did include a handful of
skeptics--Ahrens, for
instance, and cardiologist Michael Oliver of Imperial College
in London--who argued that it
was unscientific to equate the effects of a drug with the
effects of a diet. Steinberg's panel
members, however, as Oliver later complained in The Lancet,
"were selected to include only
experts who would, predictably, say that all levels of blood
cholesterol in the United States are
too high and should be lowered. And, of course, this is
exactly what was said." Indeed, the
conference report, written by Steinberg and his panel,
revealed no evidence of discord. There
was "no doubt," it concluded, that low-fat diets
"will afford significant protection against
coronary heart disease" to every American over 2 years
old. The Consensus Conference
officially gave the appearance of unanimity where none
existed. After all, if there had been a
true consensus, as Steinberg himself told Science, "you
wouldn't have had to have a consensus
conference."
The test of time
To the outside observer, the challenge in making sense of any
such long-running scientific
controversy is to establish whether the skeptics are simply
on the wrong side of the new
paradigm, or whether their skepticism is well founded. In
other words, is the science at issue
based on sound scientific thinking and unambiguous data, or
is it what Sir Francis Bacon, for
instance, would have called "wishful science,"
based on fancies, opinions, and the exclusion
of contrary evidence? Bacon offered one viable suggestion for
differentiating the two: the test
of time. Good science is rooted in reality, so it grows and
develops and the evidence gets
increasingly more compelling, whereas wishful science
flourishes most under its first authors
before "going downhill."
Such is the case, for instance,
with the proposition that dietary fat causes cancer, which was an integral part
of dietary fat anxiety in the late 1970s. By 1982, the evidence supporting this
idea was thought to be so undeniable that a landmark NAS report on nutrition
and cancer equated
those researchers who remained
skeptical with "certain interested parties [who] formerly
argued that the association between lung cancer and smoking
was not causational." Fifteen
years and hundreds of millions of research dollars later, a
similarly massive expert report by
the World Cancer Research Fund and the American Institute for
Cancer Research could find
neither "convincing" nor even "probable"
reason to believe that dietary fat caused cancer.
The hypothesis that low-fat diets are the requisite route to
weight loss has taken a similar
downward path. This was the ultimate fallback position in all
low-fat recommendations: Fat
has nine calories per gram compared to four calories for
carbohydrates and protein, and so
cutting fat from the diet surely would cut pounds. "This
is held almost to be a religious truth,"
says Harvard's Willett. Considerable data, however, now suggest
otherwise. The results of
well-controlled clinical trials are consistent: People on
low-fat diets initially lose a couple of
kilograms, as they would on any diet, and then the weight
tends to return. After 1 to 2 years,
little has been achieved. Consider, for instance, the 50,000
women enrolled in the ongoing
$100 million Women's Health Initiative (WHI). Half of these
women have been extensively
counseled to consume only 20% of their calories from fat.
After 3 years on this near-draconian
regime, say WHI sources, the women had lost, on average, a
kilogram each.
The link between dietary fat and heart disease is more
complicated, because the hypothesis
has diverged into two distinct propositions: first, that
lowering cholesterol prevents heart
disease; second, that eating less fat not only lowers
cholesterol and prevents heart disease but
prolongs life. Since 1984, the evidence that
cholesterol-lowering drugs are beneficial--
proposition number one--has indeed blossomed, at least for
those at high risk of heart attack.
These drugs reduce serum cholesterol levels dramatically, and
they prevent heart attacks,
perhaps by other means as well. Their market has now reached
$4 billion a year in the United
States alone, and every new trial seems to confirm their benefits.
The evidence supporting the second proposition, that eating
less fat makes for a healthier and
longer life, however, has remained stubbornly ambiguous. If
anything, it has only become less
compelling over time. Indeed, since Ancel Keys started
advocating low-fat diets almost 50
years ago, the science of fat and cholesterol has evolved
from a simple story into a very
complicated one. The catch has been that few involved in this
business were prepared to deal
with a complicated story. Researchers initially preferred to
believe it was simple--that a single
unwholesome nutrient, in effect, could be isolated from the
diverse richness of human diets;
public health administrators required a simple story to give
to Congress and the public; and
the press needed a simple story--at least on any particular
day--to give to editors and readers in
30 column inches. But as contrarian data continued to
accumulate, the complications became
increasingly more difficult to ignore or exclude, and the
press began waffling or adding
caveats. The scientists then got the blame for not sticking
to the original simple story, which
had, regrettably, never existed.
More fats, fewer answers
The original simple story in the 1950s was that high
cholesterol levels increase heart disease
risk. The seminal Framingham Heart Study, for instance, which
revealed the association
between cholesterol and heart disease, originally measured
only total serum cholesterol. But
cholesterol shuttles through the blood in an array of packages.
Low-density lipoprotein
particles (LDL, the "bad" cholesterol) deliver fat
and cholesterol from the liver to tissues that
need it, including the arterial cells, where it can lead to
atherosclerotic plaques. High-density
lipoproteins (HDLs, the "good" cholesterol) return
cholesterol to the liver. The higher the
HDL, the lower the heart disease risk. Then there are
triglycerides, which contain fatty acids,
and very low density lipoproteins (VLDLs), which transport
triglycerides.
All of these particles have some effect on heart disease
risk, while the fats, carbohydrates, and
protein in the diet have varying effects on all these
particles. The 1950s story was that
saturated fats increase total cholesterol, polyunsaturated
fats decrease it, and monounsaturated
fats are neutral. By the late 1970s--when researchers
accepted the benefits of HDL--they
realized that monounsaturated fats are not neutral. Rather,
they raise HDL, at least compared
to carbohydrates, and lower LDL. This makes them an ideal
nutrient as far as cholesterol goes.
Furthermore, saturated fats cannot be quite so evil because,
while they elevate LDL, which is
bad, they also elevate HDL, which is good. And some saturated
fats--stearic acid, in
particular, the fat in chocolate--are at worst neutral.
Stearic acid raises HDL levels but does
little or nothing to LDL. And then there are trans fatty
acids, which raise LDL, just like
saturated fat, but also lower HDL. Today, none of this is
controversial, although it has yet to
be reflected in any Food Guide Pyramid.
To understand where this complexity can lead in a simple
example, consider a steak--to be
precise, a porterhouse, select cut, with a half-centimeter
layer of fat, the nutritional
constituents of which can be found in the Nutrient Database
for Standard Reference at the
USDA Web site. After broiling, this porterhouse reduces to a
serving of almost equal parts fat
and protein. Fifty-one percent of the fat is monounsaturated,
of which virtually all (90%) is
oleic acid, the same healthy fat that's in olive oil.
Saturated fat constitutes 45% of the total fat,
but a third of that is stearic acid, which is, at the very
least, harmless. The remaining 4% of the
fat is polyunsaturated, which also improves cholesterol
levels. In sum, well over half--and
perhaps as much as 70%--of the fat content of a porterhouse
will improve cholesterol levels
compared to what they would be if bread, potatoes, or pasta
were consumed instead. The
remaining 30% will raise LDL but will also raise HDL. All of
this suggests that eating a
porterhouse steak rather than carbohydrates might actually
improve heart disease risk,
although no nutritional authority who hasn't written a
high-fat diet book will say this publicly.
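The tally behind the "well over half--and perhaps as much as 70%" figure can be reconstructed from the percentages quoted above. The grouping below follows the article's reasoning (counting stearic acid as harmless) and is illustrative only (Python):

    # Fat breakdown of the broiled porterhouse as described in the text.
    mono = 0.51            # monounsaturated, ~90% of it oleic acid
    sat = 0.45             # saturated
    poly = 0.04            # polyunsaturated
    stearic = sat / 3      # roughly a third of the saturated fat is stearic acid

    improves = mono + poly                # ~55% clearly improves cholesterol levels
    improves_upper = improves + stearic   # ~70% if stearic acid is counted as harmless
    mixed = sat - stearic                 # ~30% raises LDL but also raises HDL

    print(f"{improves:.0%} to {improves_upper:.0%} of the fat improves cholesterol levels; "
          f"the remaining {mixed:.0%} is mixed")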
As for the scientific studies, in the years since the 1984
consensus conference, the one thing
they have not done is pile up evidence in support of the
low-fat-for-all approach to the public
good. If anything, they have added weight to Ahrens's fears
that there may be a downside to
populationwide low-fat recommendations. In 1986, for
instance, just 1 year after NIH
launched the National Cholesterol Education Program, also
advising low-fat diets for
everyone over 2 years old, epidemiologist David Jacobs of the
University of Minnesota, Twin
Cities, visited Japan. There he learned that Japanese
physicians were advising patients to raise
their cholesterol levels, because low cholesterol levels were
linked to hemorrhagic stroke. At
the time, Japanese men were dying from stroke almost as
frequently as American men were
succumbing to heart disease. Back in Minnesota, Jacobs looked
for this low-cholesterol-stroke
relationship in the MRFIT data and found it there, too. And
the relationship transcended
stroke: Men with very low cholesterol levels seemed prone to premature
death; below 160
milligrams per deciliter (mg/dl), the lower the cholesterol
level, the shorter the life.
Jacobs reported his results to NHLBI, which in 1990 hosted a
conference to discuss the issue,
bringing together researchers from 19 studies around the
world. The data were consistent:
When investigators tracked all deaths, instead of just heart
disease deaths, the cholesterol
curves were U-shaped for men and flat for women. In other
words, men with cholesterol
levels above 240 mg/dl tended to die prematurely from heart
disease. But below 160 mg/dl,
the men tended to die prematurely from cancer, respiratory
and digestive diseases, and trauma.
As for women, if anything, the higher their cholesterol, the
longer they lived (see graph on p.
2540).
These mortality data can be interpreted in two ways. One,
preferred by low-fat advocates, is
that they cannot be meaningful. Rifkind, for instance, told
Science that the excess deaths at
low cholesterol levels must be due to preexisting conditions.
In other words, chronic illness
leads to low cholesterol levels, not vice versa. He pointed
to the 1990 conference report as the
definitive document on the issue and as support for his
argument, although the report states
unequivocally that this interpretation is not supported by
the data.
The other interpretation is that what a low-fat diet does to
serum cholesterol levels, and what
that in turn does to arteries, may
be only one component of the diet's effect on health. In other
words, while low-fat diets might help
prevent heart disease, they might also raise
susceptibility to other conditions.
This is what always worried Ahrens. It's also one reason
why the American College of
Physicians, for instance, now suggests that cholesterol reduction
is certainly worthwhile for those
at high, short-term risk of dying of coronary heart disease but
of "much smaller or ...
uncertain" benefit for everyone else.11
This interpretation--that the
connection between diet and health far transcends cholesterol--is
also supported by the single most
dramatic diet-heart trial ever conducted: the Lyon Diet Heart
Study, led by Michel de Lorgeril of
the French National Institute of Health and Medical
Research (INSERM) and published in
Circulation in February 1999. The investigators
randomized 605 heart attack
survivors, all on cholesterol-lowering drugs, into two groups.
They counseled one to eat an AHA
"prudent diet," very similar to that recommended for all
Americans. They counseled the other
to eat a Mediterranean-type diet, with more bread,
cereals, legumes, beans,
vegetables, fruits, and fish and less meat. Total fat and types of fat
differed markedly in the two diets,
but the HDL, LDL, and total cholesterol levels in the two
groups remained virtually
identical. Nonetheless, over 4 years of follow-up, the
Mediterranean-diet group had only
14 cardiac deaths and nonfatal heart attacks compared to
44 for the "Western-type"
diet group. The likely explanation, wrote de Lorgeril and his
colleagues, is that the
"protective effects [of the Mediterranean diet] were not related to serum
concentrations of total, LDL or HDL
cholesterol."
Many researchers find the Lyon data
so perplexing that they're left questioning the
methodology of the trial.
Nonetheless, says NIH's Harlan, the data "are very provocative. They
do bring up the issue of whether if
we look only at cholesterol levels we aren't going to miss
something very important." De
Lorgeril believes the diet's protective effect comes primarily
from omega-3 fatty acids, found in
seed oils, meat, cereals, green leafy vegetables, and fish,
and from antioxidant compounds,
including vitamins, trace elements, and flavonoids. He told
Science that most researchers and
journalists in the field are prisoners of the "cholesterol
paradigm." Although dietary
fat and serum cholesterol "are obviously connected," he says,
"the connection is not a
robust one" when it comes to heart disease.
Dietary trade-offs
One inescapable reality is that
death is a trade-off, and so is diet. "You have to eat something," says
epidemiologist Hugh Tunstall Pedoe of the University of Dundee, U.K.,
spokesperson for
the 21-nation Monitoring
Cardiovascular Disease Project run by the World Health
Organization. "If you eat more
of one thing, you eat a lot less of something else. So for every
theory saying this disease is
caused by an excess in x, you can produce an alternative theory
saying it's a deficiency in
y." It would be simple if, say, saturated fats could be cut from the
diet and the calories with it, but
that's not the case. Despite all expectations to the contrary,
people tend to consume the same
number of calories no matter what diet they try. If they eat
less total fat, for instance, they
will eat more carbohydrates and probably less protein, because
most protein comes in foods like
meat that also have considerable amounts of fat.
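As a rough illustration of that plus-minus arithmetic, the sketch below shows what swapping carbohydrate calories for fat calories implies if total calorie intake stays roughly constant. The 30-gram figure is a made-up example, not a value from any study cited here (Python):

    # Toy example of the calorie trade-off: if total calories stay constant,
    # each gram of fat removed is replaced by about 2.25 grams of carbohydrate
    # (9 kcal/g vs. 4 kcal/g, as noted earlier in the article).
    FAT_KCAL_PER_G, CARB_KCAL_PER_G = 9, 4

    fat_removed_g = 30                                # hypothetical daily reduction
    calories_freed = fat_removed_g * FAT_KCAL_PER_G   # 270 kcal
    carbs_added_g = calories_freed / CARB_KCAL_PER_G  # 67.5 g of carbohydrate

    print(f"Cutting {fat_removed_g} g of fat frees {calories_freed} kcal, "
          f"replaced by about {carbs_added_g:.0f} g of carbohydrate")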
This plus-minus problem suggests a
different interpretation for virtually every diet study ever
done, including, for instance, the
kind of metabolic-ward studies that originally demonstrated
the ability of saturated fats to
raise cholesterol. If researchers reduce the amount of saturated
fat in the test diet, they have to
make up the calories elsewhere. Do they add polyunsaturated
fats, for instance, or add
carbohydrates? A single carbohydrate or mixed carbohydrates? Do
they add green leafy vegetables, or
do they add pasta? And so it goes. "The sky's the limit,"
says nutritionist Alice
Lichtenstein of Tufts University in Boston. "There are a million
perturbations."
These trade-offs also confound the
kind of epidemiological studies that demonized saturated
fat from the 1950s onward. In
particular, individuals who eat copious amounts of meat and
dairy products, and plenty of
saturated fats in the process, tend not to eat copious amounts of
vegetables and fruits. The same
holds for entire populations. The eastern Finns, for instance,
whose lofty heart disease rates
convinced Ancel Keys and a generation of researchers of the
evils of fat, live within 500
kilometers of the Arctic Circle and rarely see fresh produce or a
green vegetable. The Scots,
infamous for eating perhaps the least wholesome diet in the
developed world, are in a similar
fix. Basil Rifkind recalls being laughed at once on this point
when he lectured to Scottish
physicians on healthy diets: "One said, 'You talk about increasing
fruits and vegetable consumption,
but in the area I work in there's not a single grocery store.' "
In both cases, researchers joke
that the only green leafy vegetable these populations consume
regularly is tobacco. As for the
purported benefits of the widely hailed Mediterranean diet, is
it the fish, the olive oil, or the
fresh vegetables? After all, says Harvard epidemiologist
Dimitrios Trichopoulos, a native of
Greece, the olive oil is used either to cook vegetables or
as dressing over salads. "The
quantity of vegetables consumed is almost a pound [half a
kilogram] a day," he says,
"and you cannot eat it without olive oil. And we eat a lot of
legumes, and we cannot eat legumes
without olive oil."
Indeed, recent data on heart
disease trends in Europe suggest that a likely explanation for the
differences between countries and
over time is the availability of fresh produce year-round
rather than differences in fat
intake. While the press often plays up the French paradox--the
French have little heart disease
despite seemingly high saturated fat consumption--the real
paradox is throughout Southern
Europe, where heart disease death rates have steadily dropped
while animal fat consumption has
steadily risen, says University of Cambridge epidemiologist
John Powles, who studies national
disease trends. The same trend appears in Japan. "We have
this idea that it's the Arcadian
past, the life in the village, the utopia that we've lost," Powles
says; "that the really
protective Mediterranean diet is what people ate in the 1950s." But that
notion isn't supported by the data:
As these Mediterranean nations became more affluent, says
Powles, they began to eat
proportionally more meat and with it more animal fat. Their heart
disease rates, however, continued
to improve compared to populations that consumed as much
animal fat but had less access to
fresh vegetables throughout the year. To Powles, the antifat
movement was founded on the Puritan
notion that "something bad had to have an evil cause,
and you got a heart attack because
you did something wrong, which was eating too much of a
bad thing, rather than not having
enough of a good thing."
The other salient trade-off in the
plus-minus problem of human diets is carbohydrates. When
the federal government began
pushing low-fat diets, the scientists and administrators, and
virtually everyone else involved,
hoped that Americans would replace fat calories with fruits
and vegetables and legumes, but it
didn't happen. If nothing else, economics worked against it.
The food industry has little
incentive to advertise nonproprietary items: broccoli, for instance.
Instead, says NYU's Nestle, the
great bulk of the $30-billion-plus spent yearly on food
advertising goes to selling
carbohydrates in the guise of fast food, sodas, snacks, and candy
bars. And carbohydrates are all too
often what Americans eat.
Carbohydrates are what Harvard's
Willett calls the flip side of the calorie trade-off problem.
Because it is exceedingly difficult
to add pure protein to a diet in any quantity, a low-fat diet
is, by definition, a
high-carbohydrate diet--just as a low-fat cookie or low-fat yogurt is, by
definition, high in carbohydrates.
Numerous studies now suggest that high-carbohydrate diets
can raise triglyceride levels,
create small, dense LDL particles, and reduce HDL--a
combination, along with a condition
known as "insulin resistance," that Stanford
endocrinologist Gerald Reaven has
labeled "syndrome X." Thirty percent of adult males and13
10% to 15% of postmenopausal women
have this particular syndrome X profile, which is
associated with a several-fold
increase in heart disease risk, says Reaven, even among those
patients whose LDL levels appear
otherwise normal. Reaven and Ron Krauss, who studies
fats and lipids at Lawrence
Berkeley National Laboratory in California, have shown that when
men eat high-carbohydrate diets
their cholesterol profiles may shift from normal to syndrome
X. In other words, the more
carbohydrates replace saturated fats, the more likely the end result
will be syndrome X and an increased
heart disease risk. "The problem is so clear right now it's
almost a joke," says Reaven.
How this balances out is the unknown. "It's a bitch of a
question," says Marc
Hellerstein, a nutritional biochemist at the University of California,
Berkeley, "maybe the great
public health nutrition question of our era."
The other worrisome aspect of the
carbohydrate trade-off is the possibility that, for some
individuals, at least, it might
actually be easier to gain weight on low-fat/high-carbohydrate
regimens than on higher fat diets.
One of the many factors that influence hunger is the
glycemic index, which measures how
fast carbohydrates are broken down into simple sugars
and moved into the bloodstream.
Foods with the highest glycemic index are simple sugars and
processed grain products like pasta
and white rice, which cause a rapid rise in blood sugar
after a meal. Fruits, vegetables,
legumes, and even unprocessed starches--pasta al dente, for
instance--cause a much slower rise
in blood sugar. Researchers have hypothesized that eating
high-glycemic index foods increases
hunger later because insulin overreacts to the spike in
blood sugar. "The high insulin
levels cause the nutrients from the meal to get absorbed and
very avidly stored away, and once
they are, the body can't access them," says David Ludwig,
director of the obesity clinic at
Children's Hospital Boston. "The body appears to run out of
fuel." A few hours after
eating, hunger returns.
If the theory is correct, calories
from the kind of processed carbohydrates that have become
the staple of the American diet are
not the same as calories from fat, protein, or complex
carbohydrates when it comes to
controlling weight. "They may cause a hormonal change that
stimulates hunger and leads to
overeating," says Ludwig, "especially in environments where
food is abundant. ..."
In 1979, 2 years after McGovern's
committee released its Dietary Goals, Ahrens wrote to The Lancet describing
what he had learned over 30 years of studying fat and cholesterol
metabolism: "It is absolutely
certain that no one can reliably predict whether a change in
dietary regimens will have any
effect whatsoever on the incidence of new events of [coronary
heart disease], nor in whom."
Today, many nutrition researchers, acknowledging the
complexity of the situation, find
themselves siding with Ahrens. Krauss, for instance, who
chairs the AHA Dietary Guidelines
Committee, now calls it "scientifically naïve" to expect
that a single dietary regime can be
beneficial for everybody: "The 'goodness' or 'badness' of
anything as complex as dietary fat
and its subtypes will ultimately depend on the context of
the individual."
Given the proven success and low
cost of cholesterol-lowering drugs, most physicians now
prescribe drug treatment for
patients at high risk of heart disease. The drugs reduce LDL
cholesterol levels by as much as
30%. Diet rarely drops LDL by more than 10%, which is
effectively trivial for healthy
individuals, although it may be worth the effort for those at high
risk of heart disease whose
cholesterol levels respond well to it.
The logic underlying populationwide
recommendations such as the latest USDA Dietary
Guidelines is that limiting
saturated fat intake--even if it does little or nothing to extend the
lives of healthy individuals and
even if not all saturated fats are equally bad--might still delay
tens of thousands of deaths each
year throughout the entire country. Limiting total fat
consumption is considered
reasonable advice because it's simple and easy to understand, and it
may limit calorie intake. Whether
it's scientifically justifiable may simply not be relevant.
"When you don't have any real
good answers in this business," says Krauss, "you have to
accept a few not so good ones as
the next best thing."
"Science", Vol 291, 30
March 2001
What If Americans Ate Less Saturated Fat?
Gary Taubes
Eat less saturated fat, live longer. For 30 years, this has
stood as one cornerstone of nutritional
advice given to Americans (see main text). But how much
longer? Between 1987 and 1992,
three independent research groups used computer models to
work out the answer. All three
analyses agreed, but their conclusions have been buried in
the literature, rarely if ever cited.
All three models estimated how much longer people might
expect to live, on average, if only
10% of their calories came from saturated fat as recommended.
In the process their total fat
intake would drop to the recommended 30% of calories. All
three models assumed that LDL
cholesterol--the "bad cholesterol"--levels would
drop accordingly and that this diet would
have no adverse effects, although that was optimistic at the
time and has become considerably
more so since then. All three combined national vital
statistics data with cholesterol risk factor
data from the Framingham Heart Study.
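None of the three models is reproduced here, but the flavor of the calculation can be sketched with a toy period life table: apply a modest cut in coronary mortality to age-specific death rates and see how little average life expectancy moves. Every rate and share below is invented for illustration; nothing is taken from Taylor, Browner, the McGill group, or Framingham (Python):

    # Toy life-table sketch of how a small cut in coronary mortality translates
    # into a gain in average life expectancy. All rates here are hypothetical.

    def life_expectancy(qx):
        """Life expectancy at birth from a list of annual death probabilities."""
        alive, years = 1.0, 0.0
        for q in qx:
            years += alive * (1 - q / 2)  # those who die are credited ~half a year
            alive *= (1 - q)
        return years

    ages = range(0, 101)
    # Invented all-cause annual death probabilities that rise with age.
    baseline = [min(1.0, 0.0005 * 1.09 ** age) for age in ages]

    # Suppose coronary disease accounts for 30% of deaths after age 40 and the
    # recommended diet eliminates 10% of that coronary mortality.
    CHD_SHARE, CHD_REDUCTION = 0.30, 0.10
    lowered = [q * (1 - CHD_SHARE * CHD_REDUCTION) if age >= 40 else q
               for age, q in zip(ages, baseline)]

    gain_years = life_expectancy(lowered) - life_expectancy(baseline)
    print(f"Average gain in life expectancy: about {gain_years * 12:.1f} months")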
The first study came out of Harvard Medical School and was
published in the Annals of
Internal Medicine in April 1987. Led by William Taylor, it
concluded that individuals with a
high risk of heart disease--smokers, for instance, with high
blood pressure--could expect to
gain, on average, one extra year by shunning saturated fat.
Healthy nonsmokers, however,
might add 3 days to 3 months. "Although there are
undoubtedly persons who would choose to
participate in a lifelong regimen of dietary change to
achieve results of this magnitude, we
suspect that some might not," wrote Taylor and his
colleagues.
The following year, the U.S. Surgeon General's Office funded
a study at the University of
California, San Francisco, with the expectation that its
results would counterbalance those of
the Harvard analysis. Led by epidemiologist Warren Browner,
this study concluded that
cutting fat consumption in America would delay 42,000 deaths
each year, but the net increase
in life expectancy would average out to only 3 to 4 months.
The key word was "delay," for
death, like diet, is a trade-off: Everyone has to die of
something. "Deaths are not prevented,
they are merely delayed," Browner later wrote. "The
'saved' people mainly die of the same
things everyone else dies of; they do so a little later in
life." To be precise, a woman who15
might otherwise die at 65 could expect to live two extra
weeks after a lifetime of avoiding
saturated fat. If she lived to be 90, she could expect 10
additional weeks. The third study, from
researchers at McGill University in Montreal, came to
virtually identical conclusions.
Browner reported his results to the Surgeon General's Office,
then submitted a paper to The
Journal of the American Medical Association (JAMA).
Meanwhile, the Surgeon General's
Office--his source of funding--contacted JAMA and tried to
prevent publication, claiming that
the analysis was deeply flawed. JAMA reviewers disagreed and
published his article, entitled
"What If Americans Ate Less Fat?" in June 1991. As
for Browner, he was left protecting his
work from his own funding agents. "Shooting the
messenger," he wrote to the Surgeon
General's Office, "or creating a smoke screen--does not
change those estimates."
"Science", Vol 291, 30
March 2001
The Epidemic That Wasn't?
Gary Taubes
For half a century, nutritionists have pointed to soaring
death rates as the genesis of their
research into dietary fat and heart disease and as reason to
advise Americans to eat less fat
(see main text). "We had an epidemic of heart disease
after World War II," obesity expert
Jules Hirsch of Rockefeller University in New York City said
just 3 months ago in The New
York Times. "The rates were growing higher and higher,
and people became suddenly aware of
that, and that diet was a factor."
To proponents of the antifat message, this heart disease
epidemic has always been an
indisputable reality. Yet, to the statisticians at the
mortality branch of the National Center for
Health Statistics (NCHS), the source of all the relevant
statistics, the epidemic was illusory. In
their view, heart disease deaths have been steadily declining
since the late 1940s.
According to Harry Rosenberg, director of the NCHS mortality
branch since 1977, the key
factor in the apparent epidemic, paradoxically, was a
healthier American population. By the
1950s, premature deaths from infectious diseases and
nutritional deficiencies had been all but
eliminated, which left more Americans living long enough to
die of chronic diseases such as
heart disease. In other words, the actual risk of dying from
a heart attack at any particular age
remained unchanged: Rather, the rising number of 50-year-olds
dropping dead of heart attacks
was primarily due to the rising number of 50-year-olds.
The secondary factor was an increase from 1948 to 1968 in the
probability that a death would
be classified on a death certificate as arteriosclerotic
disease or coronary heart disease. This
increase, however, was an artifact of new diagnostic
technologies--the wider use of
electrocardiograms, for instance--and the changing
terminology of death certificates. In 1949,
the International Classification of Diseases (ICD) added a
new category, "arteriosclerotic heart
disease," under the more general rubric "diseases
of the heart." The result, as a 1958 report to16
the American Heart Association noted, was dramatic: "In
one year, 1948 to 1949, the effect of
this revision was to raise coronary disease death rates by
about 20% for white males and about
35% for white females." In 1965, the ICD added a
category for coronary heart disease, which
added yet more deaths and capped off the apparent epidemic.
To Rosenberg and others at NCHS, the most likely explanation
for the postwar upsurge in
coronary heart disease deaths is that physicians slowly
caught on to the new terminology and
changed the wording on death certificates. "There is
absolutely no evidence that there was an
epidemic," says Rosenberg.