Paradigm shifts, correlations and intervention trials

I have read that paradigms change from the outside, and do so slowly.

I got into the field I am in now (therapeutic nutrition) over 40 years ago. Back then, the conventional wisdom of the medical community was that vitamin supplements created nothing more than expensive urine, except in cases of frank vitamin deficiency. If you weren’t on a starvation diet, you would get everything you needed from your diet and would not need a vitamin supplement. And in fact, since the advent of food fortification in the Forties and Fifties, deficiency diseases such as scurvy (vitamin C deficiency), beri-beri (vitamin B1 deficiency) and so forth have become extremely rare, at least here in the US. But we believed that vitamin supplements were needed anyway, for several reasons. First, our food supply had shifted from what our parents and grandparents ate (primarily home-grown or fresh produce, locally produced and minimally processed) to highly processed food whose nutritional value was largely compromised. Second, changes in our environment (air and water pollution) and in lifestyle habits (cigarette smoking, dependence on pharmaceutical intervention for every ill, etc.) had all increased the need for particular nutrients just to stay healthy. Third, there was a concept pioneered by Roger Williams and others called “biochemical individuality.” Williams’ book showed that there is tremendous variation in human biochemistry, and since activation of important enzymes depends on the availability of their nutrient cofactors, there is also tremendous variation in vitamin and mineral requirements. This meant that I might require hundreds of times more vitamin B6 (for example) to activate my enzymes than the person next to me, making supplementation of that vitamin necessary for me to function normally. It would be a “functional vitamin deficiency,” as opposed to a traditionally understood deficiency. And lastly, we believed that vitamins had functions and effects that went beyond simply preventing or treating deficiencies.
For example, the level of vitamin C required to prevent scurvy is about 30 or 40 milligrams per day (about what would be found in an orange). But based on work done by Linus Pauling and others, doses hundreds or even thousands of times higher were claimed to be effective in preventing or treating colds and flu, and even serious diseases such as cancer.

In the Seventies this was the realm of “alternative,” “preventive” or “holistic” medicine. There were a few studies that supported these beliefs back then, but those of us in the therapeutic nutritional supplementation field were confident that much more support would emerge over time.

Fast forward to today. Terms like alternative, holistic or preventive have given way to “functional” or “integrative.” To me this signals a less polarized view of medical options than the old picture of “allopaths” on one side, “homeopaths” on the other, and never the twain shall meet. Instead we’re incorporating those aspects of all disciplines that work and make sense, and this now includes nutritional supplements, at much higher levels than what is required to prevent deficiency disease. TV shows like Dr. Oz’s have popularized what used to be avant-garde or considered unscientific, and now the clinicians who tell you that supplements give you expensive urine are in the minority. I read recently that something like 80% of gastroenterologists who were polled said that they recommended or provided probiotics to their patients, and most studies show a significant deficiency of vitamin D across much of the United States during winter months. There is still controversy, but it’s more arguing over nuance, as opposed to calling one another frauds. The paradigm is definitely shifting.

Part of this is due, I believe, to the incredible explosion of studies showing that the “traditional” approach to health care is well-suited for crisis intervention, where powerful pharmacological agents or surgical intervention are required, but not so well-suited for the chronic diseases we see today, such as type 2 diabetes, obesity, cardiovascular disease and so forth. Most of these studies are correlational studies, where large populations are evaluated for the presence or absence of disease, and then different variables are examined to see what associations might emerge. Almost without fail, people who have the chronic diseases I mentioned above eat a diet high in refined sugar and fat and low in nutrient value and fiber. To say it differently, the lifestyle choices people make correlate directly with health. Eat a diet high in fiber and micronutrients and you reduce your risk of disease; eat a “western-style” diet (high in refined sugars and fat, low in micronutrients and fiber) and your risk of chronic disease goes up dramatically. And if you can’t (or don’t want to) consume a more micronutrient-rich diet, take a supplement and you’re good to go.

Maybe.

While there’s a strong correlation between making better lifestyle choices and a reduced risk of disease, it may or may not have anything to do with the micronutrient content of the diet (with its corollary of vitamin supplementation). There could be other factors contributing to the undeniable observation that we can benefit from healthier lifestyle choices; it could even be a simple coincidence (unlikely but not impossible). For many people (I count myself as one of them), the correlational studies are enough. I take supplements, try to exercise and select a healthier diet because I think the benefits (potential and proven) far outweigh the downside. And it’s tempting to extrapolate that to the population at large: people should eat a healthier diet, exercise and take supplements, and their health will improve. But for others, that is not enough. They say, “Correlation does not prove causation; I need a definitive cause-and-effect relationship to be established before I’ll believe.”

And there’s a justification for this conservatism, at least from a policy standpoint. Suppose, for example, we could clearly prove that taking vitamin D in the winter months reduces the number of colds in a given population by 40%. Furthermore, let’s say we have established that the vitamin D costs about $0.10 per person per day, while the colds cost businesses and schools an average of $1.00 per person per day in lost productivity. And one last consideration (safety): the amount of vitamin D necessary to do that has never been shown to be harmful to anyone. While this stretches the science and economics a bit, it’s possible that everything I’ve said could be true. With the facts as I’ve laid them out, it would obviously make sense to get everyone their ten cents’ worth of vitamin D every day. But how would that daily dime for every person get paid for? It adds up quickly: for a family of four that comes to nearly $150 per year, and a small town of 10,000 would be shelling out about $365,000 every year. Try to get taxes raised to pay for that!
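The back-of-envelope arithmetic above is easy to check with a few lines of Python. This is just a sketch of the hypothetical scenario from the paragraph; the ten-cent daily cost and the population sizes are the made-up illustrative numbers, not real data.

```python
# Check the hypothetical vitamin D cost figures from the paragraph above.
# All inputs are the illustrative numbers from the text, not real data.

DAILY_COST = 0.10      # assumed cost per person per day, in dollars
DAYS_PER_YEAR = 365

def annual_cost(people: int) -> float:
    """Yearly supplement cost, in dollars, for a group of `people`."""
    return people * DAILY_COST * DAYS_PER_YEAR

print(f"Family of four:  ${annual_cost(4):,.0f} per year")
print(f"Town of 10,000: ${annual_cost(10_000):,.0f} per year")
```

Running this gives roughly $146 for a family of four and $365,000 for the town, which is where the figures in the paragraph come from.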

Thus, many people (especially policy-making government types) push back pretty hard against the notion that low levels of micronutrients (which could be fixed with supplements) are the cause of our health woes; they want it proven beyond doubt before they buy in (figuratively as well as literally).

How do you prove cause and effect? Intervention trials. This is where a group of people at high risk for a specific disease is selected; part of the group is given an intervention (nutritional supplements, for example) and the rest is not. Both groups are followed over time, and if the intervention group stays healthy while the control group gets sick, you have proof. Of course the trial has to be replicated a number of times to make sure it’s not a coincidence, but you’re on your way.
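As a minimal illustration of how the two groups in such a trial get compared, here is a sketch in Python of a two-proportion z-test on illness counts. The counts are invented for the example, and a real trial would involve far more careful statistics (randomization, blinding, pre-registered endpoints); this only shows the basic idea of asking whether the difference between the groups is likely to be a coincidence.

```python
import math

def two_proportion_z_test(sick_a, n_a, sick_b, n_b):
    """Compare illness rates between two groups of a trial.

    Returns (z, p): the z statistic for the difference in illness
    proportions and a two-sided p-value. A small p-value suggests the
    difference is unlikely to be a coincidence.
    """
    p_a, p_b = sick_a / n_a, sick_b / n_b
    pooled = (sick_a + sick_b) / (n_a + n_b)   # pooled illness rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical trial: 1,000 people per arm; 60 got sick in the
# intervention group versus 100 in the control group.
z, p = two_proportion_z_test(60, 1000, 100, 1000)
```

With these made-up counts the difference comes out highly significant; replicating the trial, as the paragraph notes, is what guards against the result being a fluke anyway.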

But there’s a problem. A puzzling thing I’ve observed over the last several years is that the majority of large intervention studies of nutritional supplements have not shown benefit. For example, one study (the “Women’s Health Study”) looked at the effect of vitamin E on heart disease in a large group of women (following nearly 40,000 participants for more than 10 years) and found no benefit from the supplement. Several of these studies even appear to show that taking supplements is a bad idea and could lead to harm: another study (the “SELECT” trial) concluded that fish oil consumption increased the risk of a particularly virulent form of prostate cancer in men. This is of course not universal, as some studies have shown significant improvement, but there hasn’t been the flood of positive studies we had expected. As you might expect, there are bones to pick with the way these and other similar studies are designed, conducted and interpreted; nonetheless, it is surprising to me that there are not more positive studies.

It got me thinking that there might be a problem with the way this question is being approached.

About BigBill

Stats: Married male boomer. Hobbies: Hiking, woodworking, reading, philosophy, good conversation.
This entry was posted in Nutrition and eating.
