I have read that paradigms change from the outside, and do so slowly.
I got into the field I am in now (therapeutic nutrition) over 40 years ago. Back then, the conventional wisdom of the medical community was that vitamin supplements created nothing more than expensive urine, except in cases of frank vitamin deficiency. If you weren’t on a starvation diet, you would get everything you needed from your diet and would not need to take a vitamin supplement. And in actual fact, since the advent of food fortification in the Forties and Fifties, deficiency diseases such as scurvy (vitamin C deficiency), beri-beri (vitamin B1 deficiency) and so forth are extremely rare, at least here in the US.
But we believed that vitamin supplements were needed anyway, for several reasons. First, our food supply had shifted from what our parents and grandparents ate (primarily home-grown or fresh produce, locally produced and minimally processed) to highly processed food whose nutritional value was largely compromised. Second, changes in our environment (air and water pollution) and in lifestyle habits (cigarette smoking, dependence on pharmaceutical intervention for every ill, etc.) had all contributed to an increased need for particular nutrients just to stay healthy. Third, there was a concept pioneered by Roger Williams and others called “biochemical individuality.” Williams’ book showed that there is tremendous variation in the biochemistry of humans, and since the activation of important enzymes depends on the availability of their nutrient cofactors, there is also tremendous variation in vitamin and mineral requirements. This meant that I might require hundreds of times more vitamin B6 (for example) to activate my enzymes than the person next to me, making supplementation of that vitamin necessary for me to function normally. It would be a “functional vitamin deficiency” as opposed to a traditionally understood deficiency. And lastly, we believed that vitamins had functions and effects that went beyond simply preventing or treating deficiencies. For example, the level of vitamin C required to prevent scurvy is about 30 or 40 milligrams per day (about what would be found in an orange). But based on work done by Linus Pauling and others, doses hundreds or even thousands of times higher were shown to be effective in preventing or treating colds and flu and even serious diseases such as cancer.
In the Seventies this was the realm of “alternative,” “preventive” or “holistic” medicine. There were a few studies that supported these beliefs back then, but those of us in the therapeutic nutritional supplementation field were confident that much more support would emerge over time.
Fast forward to today. Terms like alternative, holistic or preventive have given way to “functional” or “integrative.” To me this signals a less polarized view of medical options, one in which it’s no longer “allopaths” on one side and “homeopaths” on the other, and never the twain shall meet. Instead we’re incorporating those aspects of all disciplines that work and make sense, and this now includes nutritional supplements, at much higher levels than what is required to prevent deficiency disease. TV shows like Dr. Oz have popularized what used to be avant-garde or considered unscientific, and now the clinicians who tell you that supplements give you expensive urine are in the minority. I read recently that something like 80% of gastroenterologists who were polled said that they recommended or provided probiotics to their patients, and most studies show widespread vitamin D deficiency across much of the United States during the winter months. There is still controversy, but it’s more arguing over nuance than calling one another frauds. The paradigm is definitely shifting.
Part of this is due, I believe, to the incredible explosion of studies showing that the “traditional” approach to health care is well-suited for crisis intervention, where powerful pharmacological agents or surgical intervention are required, but not so well-suited for the chronic diseases we see today, such as type 2 diabetes, obesity, cardiovascular disease and so forth. Most of these studies are correlational studies, where large populations are evaluated for the presence or absence of disease, and then different variables are examined to see what associations might emerge. Almost without fail, people who have the chronic diseases I mentioned above eat a diet high in refined sugar and fat and low in nutrient value and fiber. To say it differently, the lifestyle choices people make correlate directly with their health. Eat a diet high in fiber and micronutrients and you reduce your risk of disease; eat a “western-style” diet (high in refined sugars and fat, low in micronutrients and fiber) and your risk of chronic disease goes up dramatically. And if you can’t (or don’t want to) consume a more micronutrient-rich diet, take a supplement and you’re good to go.
Maybe.
While there’s a strong correlation between making better lifestyle choices and reducing the risk of disease, it may or may not have anything to do with the micronutrient content of the diet (with its corollary of vitamin supplementation). There could be other factors contributing to the undeniable observation that we benefit from healthier lifestyle choices; it could even be a simple coincidence (unlikely, but not impossible). For many people (I count myself as one of them), the correlational studies are enough. I take supplements, try to exercise and select a healthier diet because I think the benefits (potential and proven) far outweigh the downside. And it’s tempting to say that’s enough to extrapolate to the population at large: people should eat a healthier diet, exercise and take supplements, and their health will improve. But for others, that is not enough. They say, “Correlation does not prove causation; I need a definitive cause-and-effect relationship to be established before I’ll believe it.”
And there’s a justification for this conservatism, at least from a policy standpoint. Suppose, for example, we were able to clearly prove that taking vitamin D in the winter months reduces the number of colds in a given population by 40%. Furthermore, let’s say we have established that the cost of the vitamin D works out to about $0.10 a day per person, while the cost to businesses and schools in lost productivity amounts to an average of $1.00 per day per person. And one last consideration (safety): the amount of vitamin D necessary to do that has never been shown to be harmful to anyone. While this is stretching the science and economics a bit, it’s possible that everything I’ve said could be true. With the facts as I’ve laid them out, it would obviously make sense to get everyone their ten cents’ worth of vitamin D every day. But how would that dime a day for every person get paid for? (It adds up quickly; for a family of four it would be nearly $150 per year, and a small town of 10,000 would be shelling out about $365,000 every year. Try to get taxes raised to pay for that!)
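For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch in Python. It uses only the hypothetical ten-cents-a-day figure from the thought experiment above; the constants are assumptions for illustration, not real price data.

```python
# Back-of-the-envelope check of the vitamin D thought experiment above.
# DAILY_SUPPLEMENT_COST is the hypothetical $0.10/day figure, not a real price.

DAILY_SUPPLEMENT_COST = 0.10   # dollars per person per day (assumed)
DAYS_PER_YEAR = 365

def annual_cost(people: int) -> float:
    """Yearly cost of supplying a group of this size with the supplement."""
    return people * DAILY_SUPPLEMENT_COST * DAYS_PER_YEAR

print(f"One person:     ${annual_cost(1):,.2f} per year")       # $36.50
print(f"Family of four: ${annual_cost(4):,.2f} per year")       # $146.00
print(f"Town of 10,000: ${annual_cost(10_000):,.2f} per year")  # $365,000.00
```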
Thus, many people (especially policy-making government types) push back pretty hard against the notion that low levels of micronutrients (which could be fixed with supplements) are the cause of our health woes; they want it proven beyond doubt before they buy in (figuratively as well as literally).
How do you prove cause and effect? Intervention trials. This is where a group of people at high risk for a specific disease is selected; part of the group is randomly assigned to receive an intervention (nutritional supplements, for example) and the rest are not. Random assignment matters because it spreads every other difference (diet, habits, genetics) evenly across the two groups, so any difference in outcome can be pinned on the intervention itself. Both groups are followed over time, and if the intervention group stays healthy while the control group gets sick, you have your proof. Of course the trial has to be repeated a number of times to make sure it’s not a coincidence, but you’re on your way.
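To make that logic concrete, here is a toy simulation in Python. Every number in it is invented for illustration (the group size, baseline risks and supposed effect are assumptions, not data from any actual trial); it simply shows the three moving parts: randomize, follow both groups, repeat.

```python
import random

# Toy sketch of the intervention-trial logic: recruit a high-risk group,
# randomly assign half to the intervention, follow everyone, compare illness
# rates, then repeat the whole trial. All numbers are invented.

random.seed(1)

def run_trial(n=10_000, effect=0.6):
    # Each participant carries their own baseline risk of getting sick.
    people = [random.uniform(0.10, 0.30) for _ in range(n)]
    random.shuffle(people)                          # random assignment
    intervention, control = people[: n // 2], people[n // 2 :]

    # Follow both groups; the intervention (hypothetically) multiplies risk by `effect`.
    sick_treated = sum(random.random() < risk * effect for risk in intervention)
    sick_control = sum(random.random() < risk for risk in control)
    return sick_treated / len(intervention), sick_control / len(control)

# Repeat the trial several times to make sure the difference isn't a fluke.
for i in range(1, 6):
    treated, untreated = run_trial()
    print(f"Trial {i}: intervention {treated:.1%} sick vs. control {untreated:.1%} sick")
```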
But there’s a problem. A puzzling thing I’ve observed over the last several years is that the majority of large intervention studies of nutritional supplements have not shown benefit. For example, one study (called the “Women’s Health Study”) looked at the effect of vitamin E in reducing heart disease in a large group of women (following nearly 40,000 participants for more than 10 years) and found no benefit from the supplement. In addition, several of these studies even appear to show that taking supplements is a bad idea and could lead to harm. One such report, drawing on data from the “SELECT” trial, concluded that fish oil (omega-3) consumption was associated with an increased risk of a particularly aggressive form of prostate cancer in men. This is of course not universal, as some studies have shown significant improvement, but there hasn’t been the flood of positive studies we had expected. As you might expect, there are bones to pick with the way these and other similar studies were designed, conducted and interpreted; nonetheless it is surprising to me that there are not more positive studies.
It got me thinking that there might be a problem with the way this question is being approached.
Electric trains and delayed gratification
When I was a kid (I think maybe 11), my brother Jim and I decided we wanted to have an electric train set.
We looked at the different sizes (called “gauges”); HO gauge is sized between the two other popular gauges, and we decided it would be best for us: small enough to build a decent layout in a fairly small space, yet still big enough to look right and to scale. Since our intent was to create a permanent layout with a town, switch yard and countryside, we wanted something that had lots of available buildings, train cars and other scale accessories. HO gauge had (and still has) probably the greatest selection of accessories, so it fit our needs perfectly.
Once we had sorted out exactly what we wanted, we went through the various catalogs and selected the source; we knew we had to start small because of our budget, but we wanted a good basic train set: enough track to make a large oval, a couple of switches and, of course, the train itself. All told the cost was $28.00, which was a princely sum to two kids.
We never seriously considered hitting Dad up for the money. Our family was not poor (Dad was a dentist in a tiny little Illinois town), but we weren’t wealthy either, and we just didn’t have the dynamics of lots of gift-giving, nor of our parents paying for all of our stuff. They covered the basics, of course, and they’d buy toys for us now and then, so we never felt like we were deprived in any way, but both Mom and Dad had lived through the Depression and were frugal. Like other kids we had chores and a small allowance, but most of the time we were taught to look to our own resources and “save up” if we wanted something. So Jim and I put together a plan to save up the $28.00 we needed to pay for our train set. I recall I was old enough to mow lawns, so Jim and I pooled our earnings and set aside our money all one spring and summer. This was obviously long before Excel, and we didn’t know anything about accounting spreadsheets, so we tracked our progress by hand on a sheet that showed each entry: how much we had, how much was left to go before we had enough, and projected dates of completion.
By fall we had enough money to order our train set.
I’ll never forget how exciting it was for us when we finally ordered the train from the Murray catalog. When it arrived, we set it up on the floor of our living room and played with that train set for hours. This was the first thing that we had specifically worked toward and planned for; I think it’s safe to say that I’ve never bought anything since then that was more gratifying. In retrospect, I know Dad could easily have funded us, but he knew we would appreciate something far more if we had worked hard for it.
I also think a big part of what made it so fun was the anticipation: the delayed gratification. We worked literally for months to get the money to buy our train. Today, it’s rare that you hear of anyone paying cash. Credit cards have become so common that when someone says they don’t have any, the automatic assumption is that they must have some kind of problem; maybe they’ve had trouble with cards or declared bankruptcy, so they can’t get anyone to extend credit. I think a big part of that is the message that delaying gratification is a bad thing; if you want something, why not go get it right away and start enjoying whatever benefit you’re supposed to derive? Don’t worry about paying for it; you can use your trusty MasterCard and spread the pain over months or years! Unfortunately, you find yourself still paying for something long after its useful life has ended. Combine the unwillingness (inability?) to delay gratification with planned obsolescence and you’ve got a marketer’s dream come true. And a society that’s never content, and never out of debt.
I think of the feeling Jim and I had when that train set finally arrived at our house, and it strikes me that maybe we’ve missed a valuable lesson there somewhere.