Vitamins: Is Nature’s Magic Enough?

When I was a medical intern, I watched my supervising resident perform an immediate and visible cure, and in that moment I understood the appeal of vitamins to our pill-loving culture. We were laboring over an old gentleman brought to the emergency room from Boston Common – a park that was home to many people whose diets came largely from brown-bagged liquor bottles. Our patient was agitated and confused. Try as we might, we could not get his eyes to move in any direction. My resident disappeared and returned with a tiny syringe filled with Vitamin B1, also known as thiamine. He injected the liquid into the patient’s vein and, as if he’d waved a wand, our patient’s eye movements returned and he calmed down. Here was a miracle drug, and it was something nature made for us.

Vitamin deficiency

The magic of our patient’s recovery was a clear example of the function of vitamins. In minute amounts, they act as facilitators of chemical reactions necessary for energy production and cellular maintenance of all kinds. Our patient had a textbook case of vitamin deficiency, the result of a very bad diet, failure to absorb vitamins from the stomach and small intestine, or both. Alcoholism is the most common setting, but vitamin deficiencies also occur with other severe gastrointestinal problems and in the malnutrition associated with famine or devastating illnesses like cancer and AIDS. Sometimes medical treatment itself is the perpetrator, in the form of anticancer drugs or bypass surgery for morbid obesity.

Vital nutrients

For thousands of years, people have understood that certain foods contain substances vital to human life. The ancient Egyptians recognized that night blindness was cured by eating liver. In the 1700s, seagoing men found that lime juice prevented scurvy – the aches, skin rashes and loss of teeth from painful gum disease that occurred when men attempted to live for months without fresh food. When the nature of food’s magic yielded to chemical analysis, scientists found complex molecules with many active forms that acted as co-factors or triggers in energy-producing chemical reactions in all cells of the body. They were also involved in cell maintenance and reproduction.

Naming the magic

Chemists named the indispensable compounds vitamins (vita: root word for life; amine: a chemical group containing nitrogen, which early studies suggested all vitamins contained) and tagged them with letters as well as chemical names (see list below). Vitamins F through K eventually became part of the large Vitamin B complex group, and some vitamins were downgraded to “vital nutrients.” Synthetic vitamins appeared on store shelves, joining age-old remedies like cod liver oil, yeast and wheat germ. But even in our times, the best source of vitamins remains the whole foods in which nature embeds them, along with other factors that we may not yet recognize as important.

Water soluble vitamins

The B vitamins and Vitamin C dissolve in water. They aren’t stored in the body and can be lost or inactivated by cooking. These water-soluble vitamins find their way to their target cells, get used, get recycled a bit, and then leave the body in the urine. They need to be eaten on a daily basis. You cannot overdose on B vitamins in food, but very high doses of B-vitamin pills can damage the nerves.

Fat soluble vitamins

Fat-soluble vitamins (A, D, E and K) accumulate in liver and fat tissue, ready to be used when necessary, but damaging if too much is stored. Some Arctic explorers died of brain swelling from consuming polar bear liver, which is very high in Vitamin A. Too many carrots (a source of carotenes, or pre-Vitamin A) cause yellow skin. Too much Vitamin D raises blood calcium levels, producing weakness, lethargy and kidney stones. Vitamin K can interfere with Coumadin, a medicine used to prevent blood clotting, so patients taking it are cautioned to eat only small amounts of Vitamin K-rich greens like kale and collards.

If you are not alcoholic or malnourished from serious illness, if you live in a Western country where vitamin fortification (enrichment) of common foods is routine, if you eat well-balanced meals drawing fresh food from plant and animal sources, if you are meeting your energy needs and not trying to lose weight by restricting calories, and if you get enough sun exposure, you do not need any vitamin pills. Vitamins are best absorbed from real food.

Vitamin supplements?

In our current eating culture, however, a couple of vitamins do warrant concern. Folate (Vitamin B9) consumption, vital to cell replacement, is inadequate when fruits and vegetables are not chosen or are hard to come by. Vitamin D deficiency, which became rare when fortification of milk began, is again on the rise, producing rickets (malformed bones) in children, weakened bones in adults, and weakened immune systems in all age groups. Cholesterol phobia makes people avoid good Vitamin D sources like whole milk and egg yolks. Sun exposure of head and arms for just 15 minutes two or three times a week makes enough Vitamin D in skin to meet our needs, but effective sunscreens and lack of outdoor activity have put serious dents in sun exposure.

What about Vitamin C, the wonder vitamin? Most plants and animals make it. We do not. Linus Pauling, the Nobel Prize-winning chemist, speculated that our intake should be much higher than the small amount required to prevent scurvy. Apes, which have also lost the ability to make Vitamin C, consume 10-20 times as much as we do. Goats, which make Vitamin C in huge quantities, make even more when stressed. Does Vitamin C help prevent colds, strengthen our connective tissue, and get used up faster in times of physical stress? Maybe. We just don’t know. But in the meantime, large doses, up to several thousand milligrams per day, appear to do no harm. (Smokers do need extra C.)

Take advice with a grain of salt

What are we to think of all the articles we see extolling the virtues of this vitamin or that in preventing this disease or that? Be wary of these words: suggests, indicates, may be, could prevent. If any of the putative effects were as clear as our emergency room patient’s revival, or the salvaging of sailors’ gums and teeth, or the cure of the Egyptians’ night blindness, we would not be using tentative words. Keep your focus on a fresh food diet that excludes no food group, and on the physical activity that enables you to eat enough food to get everything you need without getting fat. Take Vitamin C if you want to, and add a multivitamin from a reputable company if you are dieting or restricting your diet in any way, or don’t like vegetables and fruit.





Major Vitamins and Some Food Sources

Vitamin (chemical name) | RDA (male, age 19-70) | Animal sources | Plant sources
Vitamin A (retinol, retinoids and carotenoids) | 900 µg | Beef and chicken liver*, whole milk, eggs, cheese | Carrots, spinach, yellow vegetables and fruits
Vitamin B1 (thiamine) | 1.2 mg | Pork*, lean meats, fish | Brewer’s yeast*, wheat germ*, whole grains, enriched grains, legumes, nuts
Vitamin B2 (riboflavin) | 1.3 mg | Eggs, lean meats, milk | Brewer’s yeast*, cereals, nuts, leafy greens
Vitamin B3 (niacin) | 16.0 mg | Lean meats, poultry, fish, eggs | Beets, Brewer’s yeast*, peanuts, other nuts, sunflower seeds, green leafy vegetables, coffee, tea
Vitamin B5 (pantothenic acid) | 5.0 mg | Calf’s liver*, eggs, yogurt | Brewer’s yeast*, whole grains, sunflower seeds, mushrooms, squash, cauliflower, broccoli
Vitamin B6 (pyridoxine) | 1.3-1.7 mg | Liver, egg yolks, poultry, fish | Wheat germ, whole grains, peanuts, walnuts, bananas, avocados
Vitamin B7 (biotin) | 30.0 µg | Egg yolk, liver | Brewer’s yeast, wheat bran, cauliflower, avocado
Vitamin B9 (folic acid) | 400 µg | Beef liver*, egg yolk | Fortified cereals*, leafy green vegetables, citrus fruits
Vitamin B12 (cyanocobalamin) | 2.4 µg | Meat, eggs, dairy products, shellfish, salmon | Fortified plant milks and cereals only; no natural plant sources
Vitamin C (ascorbic acid) | 90.0 mg | None listed | Citrus fruits*, tomatoes, berries, green and red peppers, broccoli, spinach
Vitamin D (ergocalciferol and cholecalciferol) | 5.0-10 µg | Dairy products, salmon, tuna | Fortified cereals
Vitamin E (tocopherols and tocotrienols) | 15.0 mg | None listed | Wheat germ oil*, almonds*, hazelnuts, sunflower seeds and oil, safflower oil
Vitamin K (naphthoquinone) | 120 µg | None listed | Broccoli*, kale*, Swiss chard*, soybean oil*, canola oil, olive oil

*excellent source

Vitamin D: A Developing Story

Ten years ago, vitamin D was probably not on your list of things to worry about. Now you probably know someone who is vitamin D-deficient and taking a vitamin pill every day – or at least during the winter months when the sun is low in the sky. It is difficult to find a magazine or newspaper that hasn’t published stories warning about deficiencies in the “sun vitamin.” What happened? Do we have a new problem, or have we just learned more about an old one?

The magic ingredient in cod liver oil

The old part of the Vitamin D story is about a childhood disease called rickets and an adult version of rickets called osteomalacia. Both afflictions became common when urbanization crowded people into the sooty cities of northern Europe in the 1700s. Affected children had bowed legs, malformed chests and teeth, weak muscles and easily fractured bones. In adults, whose bones had stopped growing, the symptoms were bone pain, fractures, and muscle weakness. Though folklore from coastal cities had long described cod liver oil as a remedy for these problems and for other rheumatic complaints, it wasn’t until the early 20th century that scientists discovered that the magic ingredient in cod liver oil was one of the newly described vitamins – special compounds the body can’t make but needs to get in small amounts from specific foods. Vitamin D was the fourth one named, after Vitamins A, B, and C.

Further research demonstrated that Vitamin D is present in many animal fats, and is necessary for the transport of calcium from the intestine into the blood. The rampant rickets and osteomalacia of the early industrial revolution years seemed accounted for by poor diet, but the concentration of these problems in northern climates prompted more questions.  The answers started a new chapter in the story of Vitamin D—its relationship to the sun and its reclassification as a hormone rather than a vitamin.

Vitamin D is actually a hormone, not a vitamin

Vitamin D, as demonstrated by an elegant series of experiments in the 1920s, can be made by the body, in the skin – as long as the skin is exposed to sufficient sunlight. By the time of this discovery, though, the vitamin label was too well established to be removed. What started out being known as the bone vitamin became the sunshine vitamin, and further research into its biochemistry placed the “vitamin” firmly in the camp of the hormones, which are made in the body’s glands and act on many different and distant parts of the body to signal changes in cellular function. Chemically, Vitamin D most resembles steroid hormones such as testosterone, cortisol and estradiol.

The discovery of Vitamin D receptors

Vitamin D research took its next leap when hormones were discovered to have receptors in the tissues where they were active. Sure enough, Vitamin D had receptors too, in virtually all tissues. By the 1990s researchers were busy trying to find out why. They observed that vitamin D suppressed the growth of cancer cells – at least in the laboratory. Statistical studies showed lower cardiac death rates in people with the highest vitamin D levels. The bone vitamin suddenly had many possible functions.

In the last decade thousands of studies have attempted to relate hosts of medical problems to vitamin D deficiency, including autism, depression, dementia and other neurodegenerative diseases, many varieties of musculoskeletal pain and arthritis, and autoimmune diseases like multiple sclerosis. So far, most of the research implies that vitamin D exerts its effects in a variety of tissues over the long term, altering the way genes are expressed rather than acting rapidly and directly as it does in intestinal transport of calcium.

Deciding who is deficient in D

Nevertheless, the race is on to see whether or not vitamin D might help many of the ailments that plague us. The first step is deciding who is deficient. Measurements of vitamin D, a general term applied to a number of different forms of the vitamin, were not standardized until 2006. There is still sometimes contentious debate about which form of the vitamin to measure and what constitutes a normal level. The general consensus is that 25-hydroxyvitamin D (25-hydroxycholecalciferol) is the best measure of the body’s stores of vitamin D. The range of normal values comes from studies of healthy Hawaiian surfers, whose levels rarely fall below 30 nanograms (ng)/ml or rise above 60 ng/ml. Different laboratories sometimes cite different values, but generally a value in the 20-30 ng/ml range or lower indicates deficiency.
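For readers who like to see the cutoffs in one place, the ranges quoted above can be sketched as a simple classifier. The thresholds are the ones this article cites (surfers at 30-60 ng/ml; roughly 20-30 ng/ml or lower reads as deficient); laboratories differ, and this is an illustration, not clinical guidance.

```python
# Rough classification of a vitamin D blood level in ng/ml, using the
# ranges quoted in the article. Illustrative only; labs use varying cutoffs.

def vitamin_d_status(ng_per_ml):
    if ng_per_ml < 20:
        return "deficient"
    elif ng_per_ml < 30:
        return "borderline (many labs call this deficient)"
    elif ng_per_ml <= 60:
        return "within the healthy-surfer range"
    else:
        return "above the typical range"

for level in (15, 25, 45, 70):
    print(f"{level} ng/ml: {vitamin_d_status(level)}")
```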

Requirements change with age

Requirements for vitamin D vary and change over a lifetime. As people age, their skin produces less. Darker-skinned people make less vitamin D. The recommended dose of vitamin D supplements is 200 IU/day (5 micrograms), 400 IU after age fifty, and 600 IU after age 70. Research enthusiasts suggest more. Sun exposure is by far the most efficient route to adequate vitamin D. Twenty minutes of face and arm exposure produces as much as 10,000 IU of vitamin D, which is stored efficiently for weeks. Most supplements are made from the skin of animals or derived from plant chemicals exposed to UV light. (Plant-derived vitamin D is known as vitamin D2.)
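The dose figures above mix two units, IU and micrograms; they are related by a fixed conversion (1 microgram = 40 IU, consistent with the text’s “200 IU/day (5 micrograms)”). A minimal sketch of the arithmetic:

```python
# Vitamin D unit conversion. Assumption: the standard 1 microgram = 40 IU,
# which matches the text's "200 IU/day (5 micrograms)".

IU_PER_MICROGRAM = 40

def iu_to_micrograms(iu):
    """Convert an International Unit dose of vitamin D to micrograms."""
    return iu / IU_PER_MICROGRAM

def micrograms_to_iu(micrograms):
    """Convert a microgram dose of vitamin D to International Units."""
    return micrograms * IU_PER_MICROGRAM

# The age-banded recommendations quoted in the text, expressed both ways:
for ages, iu in [("19-50", 200), ("51-70", 400), ("over 70", 600)]:
    print(f"age {ages}: {iu} IU = {iu_to_micrograms(iu):g} micrograms")
```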

Rickets makes a comeback

Are we more in need of vitamin D now than previously? In the past, we looked for vitamin D deficiency only in obvious cases of bone disease and kidney failure (the kidney converts vitamin D3 to its most active form). But now, with the best of intentions, we may have created another version of the sunlight-deprived, diet-restricted cities where rickets once thrived. We assiduously shun fat, meat, dairy products and eggs to avoid cardiac disease. We apply sunscreen liberally to avoid skin cancer. Rickets is on the rise, there are more pediatric bone fractures than there were a few decades ago, and general arthritic complaints abound. And we now suspect vitamin D may be required for basic cell functions in all organs.

What to do?

A little unprotected sun exposure every few days, and judicious intake of eggs, milk and fatty fish – even a little cod liver oil now and then – are reasonable tactics to increase the body’s Vitamin D supply. Or you could ask your doctor to check a vitamin D blood level and consider taking a supplement if the level is low, especially if you spend the winter above the latitudes of Boston and the California/Oregon border.


Food Sources of Vitamin D
(From NIH Office of Dietary Supplements)

Food and serving | IU | %RDA
Cod liver oil*, 1 tablespoon | 1,360 | 340
Salmon (sockeye), cooked, 3 ounces | 794 | 199
Mackerel, cooked, 3 ounces | 388 | 97
Tuna fish, canned in water, drained, 3 ounces | 154 | 39
Milk, vitamin D-fortified, 1 cup | 115-124 | 29-31
Yogurt, fortified, 6 ounces | 80 | 20
Margarine, fortified, 1 tablespoon | 60 | 15
Sardines, canned in oil, drained, 2 sardines | 46 | 12
Beef liver, 3.5 ounces | 46 | 12
Fortified ready-to-eat cereal, 0.75-1 cup | 40 | 10
Egg, 1 whole (vitamin D is in yolk) | 25 | 6

*The problem with cod liver oil as a source is that Vitamin A tags along and Vitamin A can be toxic in high doses, producing brain swelling. Check the source information and composition.
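The %RDA column in the table above follows directly from the IU values. A minimal sketch, assuming the table’s implied reference intake of 400 IU (1,360 IU of cod liver oil works out to 340%); small differences from the printed column are rounding:

```python
# Recompute the %RDA column of the food-sources table from its IU values.
# Assumption: a 400 IU reference intake (1,360 IU -> 340%).

REFERENCE_IU = 400

foods_iu = {
    "Cod liver oil, 1 tablespoon": 1360,
    "Salmon (sockeye), cooked, 3 ounces": 794,
    "Milk, vitamin D-fortified, 1 cup": 120,
    "Egg, 1 whole": 25,
}

for food, iu in foods_iu.items():
    percent_rda = 100 * iu / REFERENCE_IU
    print(f"{food}: {iu} IU is about {percent_rda:.0f}% of the RDA")
```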

Holding the Line: Stop Gaining First

One of the most remarkable failures of modern medicine is its inability to combat obesity and its associated ills. Obesity is not a new human condition, nor will it ever completely disappear. But since the 1970s, something has changed in the environment and culture to make the condition epidemic, despite sophisticated medical research, a multi-billion dollar diet industry, and constant media attention.  The most effective solution remains not gaining excess weight in the first place, but that is no longer an option for over 60% of the population, many of whom are veteran dieters.

The body wants to keep the fat

Diets depend on adherence to a long-term plan for eating that fails to meet the body’s need for energy.   In response to this semi-starvation, the body mounts a defense. Hair and fingernails grow more slowly. Heat generation declines and the dieter feels cold and is less inclined to move around. Cells throughout the body ramp down their energy needs.  Within a few days, even sleeping burns fewer calories.  Caloric requirements remain suppressed long after the target weight is achieved.  Upward weight creep begins as soon as vigilance about food intake and exercise declines, and happens at a lower calorie intake than in the pre-diet days.  So begins the yo-yo dieting cycle, unless the dieter just gives up.

Stop the upward creep first

Giving up the attempt to starve away the pounds will eventually bring the metabolic rate back up, but only as the pounds re-accumulate. At this point a tactic other than a repeat diet attempt may be in order. The most reliable way to achieve weight loss that lasts is to burn slightly more energy than is consumed each day over a long period of time – a sneak attack rather than a frontal assault. Such long-term daily commitment requires habit formation, and habit formation requires patient repetition of actions over long periods of time. Holding weight stable – simply trying not to gain any more for at least 6-12 months – is the first preparation for mounting a sneak attack.

Going on defense

In contrast to the coordinated offense of a diet plan, not gaining any more weight requires defensive tactics.  Mindfulness – thinking before eating – is the primary tool.  Each day presents dozens of choices that might contribute to weight gain – or not. The only concern is reacting to choices presented.  Reacting correctly to just a few of them every day adds up over time.  At the end of 6-12 months of no weight gain, you are better off than at the end of another diet cycle that winds up on the upside of the starting weight.  You’ll have the habits of a person who maintains stable weight, and you will be ready to lose weight slowly and permanently by undershooting energy requirements just a little each day – but not enough to put your body into energy conservation mode.

Learn from the people who succeed

People who maintain stable weight often have sensible guidelines for themselves. A common behavior is refusing to buy larger clothing sizes. Another is choosing clothes with zippers, buttons and belts. If clothing becomes uncomfortable, they cut back on sweets and alcohol and pay more attention to activity level. A weekly weight check keeps others on track. These people know better than to obsess about daily weight fluctuations, but a 3-5 pound gain in a week gets their attention. While a common mindfulness tactic is putting off eating to sort out true hunger from urges of emotional origin, people who maintain stable weight also do not go long periods without eating. The body begins to downshift into a lower energy gear if no food appears to break a fast of more than six hours.

Choices, choices, choices

Easily digestible carbohydrates in the modern diet, especially those combined with fats, make good targets for people seeking stable weight. Carbohydrates trigger surges in insulin.  Insulin blocks fat usage for energy needs, and hunger recurs much sooner after a high carbohydrate snack or meal than after one containing more protein and fat.  Choose to keep insulin levels down: eggs instead of cereal; one slice of bread on a sandwich instead of two; one M&M instead of a handful; nuts instead of M&Ms; half the normal spaghetti serving – or eat just the meat sauce; drink water instead of juices or soft drinks, even diet ones. (The taste of artificial sweeteners also triggers a burst of insulin, even though they have no caloric value.) Put off eating something that you really don’t need – distract yourself with an activity or task. Practice self-control in other areas of life. Self-control is a “transferable skill” and any practice helps build it.

Activity choices abound. Park far away from your destination. Walk if the trip is less than a mile (get a pull cart for groceries if you are lucky enough to live near the store). Skip the elevators. Make dates for walking instead of eating. Keep your hands and mind busy (mental activity takes energy too). Sit on an exercise ball instead of a desk chair. If you have a wireless printer, put it far away from the computer – on another floor if possible. Mow your own lawn. Shovel your own snow. Buy a pedometer and watch the steps add up. Engage in some strengthening activities to build energy-hungry muscle tissue.

Stay in the present

Dieting to lose weight is always focused on the future. Weight maintenance is a present-moment task. There will never be a better time than now to go on the defense and begin to stop gaining weight. Now is the only time you have in which to take action – all the rest of time is either a memory or an imaginary future.

The Problem with Sugar: Insulin

This article is about insulin, not diabetes. Diabetic or not, you need to know about insulin. My epiphany about the importance of this hormone occurred when one of my children brought Micah, a friend with Type 1 diabetes, home for dinner. We had a healthy “Mediterranean” dinner – pasta tossed with olive oil, chicken, fresh tomatoes, and cilantro, with accompanying salad and French bread. And birthday cake.  Fresh from life in a college dorm, the young friend ate with gusto – at least two helpings of everything. We all did.  Later, I found him groggy and in need of two to three times his normal insulin dose. The epiphany was this: all the non-diabetics at the table that night required a lot of insulin to cover hefty carbohydrate intakes, but we were blissfully unaware of the consequences of over-indulgence. We did not have to fill syringes with extra insulin. Our pancreases did the work behind the scenes.

Awash in Insulin

Why was this realization an epiphany? Because we live in an age of excess, consuming large amounts of refined carbohydrates and frequently eating more than our energy requirements demand.  We are awash in insulin of our own making and need to understand this hormone’s central role in metabolism. More and more research links insulin to the chronic diseases of civilization: high blood pressure, heart disease, obesity, and Type II diabetes (the variety in which insulin is too plentiful and doesn’t work properly, as opposed to Type I, in which the pancreas fails to produce insulin).

How insulin works

Insulin is the hormone that moves sugar from the blood into all the body’s cells. Blood sugar comes from carbohydrates in food and from glycogen made by liver and muscles as a way to store a twelve-hour supply of sugar. When glycogen stores run out and little food is coming in, as in starvation or very low calorie diets, we make sugar, first from our muscle proteins and then from our fat.  The main goal of all metabolism is to keep blood sugar in a tight range – just right for the brain’s needs, because sugar is the only fuel the brain uses under normal circumstances. (It will resort to using ketone bodies, formed from fat in the liver during prolonged fasting or total carbohydrate restriction, but if sugar is available it is the preferred fuel).

Incoming dietary sugar elicits a burst of insulin from the pancreas. Insulin’s job is to ferry the needed sugar to cells and to squirrel away extra sugar as glycogen and fat. Insulin is a lipogenic, or fat-producing, hormone. Every time we overindulge, insulin goes into high gear to produce fat. It also raises triglyceride levels and lowers high-density lipoproteins – exactly the changes in blood lipids that are associated with heart disease.

When insulin fails to work

Insulin is also mysterious. For unknown reasons, many people – one in every three of us – have a tendency to become “resistant” to insulin’s effects. Their pancreases put out more and more insulin to handle routine blood sugar levels. No one knows what makes the insulin inefficient, though fat accumulation in muscle cells may be part of the problem. This stage of “insulin resistance” goes unnoticed for years because there are no symptoms. Blood insulin levels are expensive to measure and difficult to standardize, so they are not part of any routine preventive screening.

Insulin promotes fat storage

High levels of insulin make fat storage and weight gain easier. Weight gain, particularly around the middle, promotes insulin resistance, and the pancreas responds with yet more insulin. A vicious cycle is underway. Insulin resistance can become so pronounced that blood sugar escapes control and spills into the urine. At that point, insulin resistance has become Type II diabetes, treated with medicines that help insulin work and, ultimately, with shots of yet more insulin. Before this happens, and even afterwards, weight loss and exercise can reverse insulin resistance, leading medical researchers to believe that insulin resistance has something to do with abnormal energy processing in muscle cells. They’ve found that the muscles of some lean, healthy relatives of Type II diabetics show insulin resistance long before there is any fat in the muscle or any abnormality in blood insulin levels.


In our sedentary age of super-sized, sugar-laced, low fiber meals, we produce far more insulin than our ancestors did. In addition, the genetic make-up of many people, particularly Hispanics, Native Americans and some African-Americans makes their insulin less effective.  We don’t measure insulin levels routinely. Instead, we concentrate on easily-measured cholesterol and fret about fat in the diet. At the same time we are in the middle of an epidemic of insulin resistance and on the verge of an epidemic of Type II diabetes, which is no longer just a disease of middle and older age. For the first time in history, type II diabetes is appearing regularly in children, teens and twenty year olds.

The average American fast food diet sets people on the road to obesity, insulin resistance and Type II diabetes. Lack of exercise keeps them there. In a world of easily available food that requires little or no work, the only defense against overeating is mental. Education and self-discipline are the weapons. Insulin-requiring Type I diabetics like Micah know how much insulin has to be paid out for a big meal. The rest of us have to visualize that syringe full of extra insulin and imagine tucking away excess calories as fat. We have to see ourselves requiring more and more insulin as time goes on and becoming unable to produce enough to meet the needs of an insulin-resistant body. It’s enough to make that second helping seem less desirable and regular exercise more attractive.

Keeping insulin levels under control:

  1. Avoid weight gain.
  2. Lose any extra weight.
  3. Exercise 30 minutes per day.
  4. Eat regular, small, balanced meals, and 25-30 grams of fiber per day.
  5. Avoid the “white stuff”: flour, sugar, white rice.
  6. If you are overweight and/or have relatives with diabetes, do all of the above, and ask your doctor whether a glucose tolerance test is warranted.







A Sweet Decision: Artificial Sweetener or Sugar?

“I would feel more optimistic about a bright future for man if he spent less time proving that he can outwit Nature and more time tasting her sweetness and respecting her seniority.”

– E. B. White

Little packets of faux sugar sit beside all convenience store coffee pots. Grocery store shelves are lined with lo-cal, no-cal, and no-sugar foods. Authorities assure us that these staples of modern life are safe. Nevertheless, unease persists. Should millions of people, including children, be engaged in an attempt to “outwit Nature”? In deciding whether or not to participate in this vast modern experiment, there are two questions to answer:

1. Are artificial sweeteners necessary for me?

The first question has an easy answer. Artificial sweeteners are not necessary for anyone at any time. But for someone struggling with weight problems or diabetes, artificial sweeteners can add some “better living through chemistry.” Bear in mind, though, that the only studies showing any positive effects on weight loss from the addition of artificial sweeteners are those involving serious attempts at long-term dieting – the kind that involves lifestyle change. Casual, habitual users of sweeteners typically weigh more and gain more than non-users. In addition, frequent consumption of sweetened foods and beverages aggravates the sugar addiction that drives so many poor food choices. Artificial sweeteners also contribute to elevated insulin levels. As soon as the tongue perceives sweetness, a quick burst of insulin begins the body’s preparation for an influx of sugar (the “cephalic insulin response”). When no real sugar appears, insulin falls back quickly, stimulating hunger. And if food accompanies the diet drink, the insulin helps turn any excess calories into fat.

2. What is the likely harm if I choose to use them?

The question of potential harm is difficult to answer. Wading through the contradictory literature on safety studies of non-nutritive sweeteners is a confusing trek that exposes the influences of politics, power, money and fear on science. FDA approval of food additives, or designation of them as “GRAS” – generally recognized as safe – does not make safety questions disappear. Saccharin (Sweet’N Low), for instance, is known to produce bladder cancer in rats, but human population studies show only “a trend” toward more bladder cancer when more than six packets a day are used.

Widespread use of any substance is very hard to tie to small changes in physiology or upticks in disease processes for which there are no clear, single causes. For instance, one of the worries about aspartame (NutraSweet, Equal) was its ability to cause brain tumors in rats. There was a rise in the human brain tumor rate that coincided with the introduction of aspartame in the early 1980s, but the increase may well have reflected better diagnosis due to the introduction of the CAT scan. A more recent increase in brain tumors of high malignancy prompted some scientists in 1996 to call for a reevaluation of aspartame’s role, but other opinions prevailed.

Safety testing

Safety testing of individual sweeteners in bacteria and laboratory animals involves huge doses over months to years.  But only when the products reach the market does the most important test begin – long term consumption under varying circumstances by large numbers of people who have not been prescreened for other problems.  To make sweeteners more palatable, manufacturers often combine them in foods, exposing the consumer to chemical mixes never tested in the lab.  Anyone using artificial sweeteners regularly is a volunteer in long term safety experimentation, so wisdom dictates having at least a rudimentary understanding of the most common ones.


Saccharin, a petroleum derivative, is one of the oldest sweeteners. Its long time on the market has given it an aura of safety, but because it has been used only sparingly in soft drinks, it is less widely consumed than aspartame. A persistent group of scientists still rings the warning bell about saccharin’s carcinogenic potential and about its unstudied effects on fetuses and children. Even a weak carcinogen, they say, is a concern over a lifetime of use.


Aspartame is dogged by the most complaints, including legitimate ones like headache, mood disorders and skin rashes, and unproven ones like links to Alzheimer’s disease and brain tumors. Rare individuals with an inherited condition called phenylketonuria cannot tolerate phenylalanine, one of the amino acids from which aspartame is made. In 2002, a new version of aspartame without that amino acid (Neotame) was approved but is not yet widely used.


Sucralose (Splenda) has the shortest track record. Better taste, heat stability that enables it to be used in cooking, and masterful marketing as “made from sugar” and “not absorbed” gave Splenda 60% of the sweetener market by 2006. Eleven percent of prepared foods on grocery shelves are now sucralose-sweetened. The additive does start out as sugar. Chemical alteration replaces three parts of the sugar molecule with chlorine atoms, making a “chlorocarbon” that is structurally most similar to insecticides – but still called “natural.” On average, about 15% of Splenda is absorbed into the body. (The legal definition of “unabsorbed” applies if at least 80% of the product passes through the intestine unchanged.) Test rats wound up with enlarged kidneys and livers, but so far, the large pool of human subjects seems to be tolerating the sweetener. Splenda is also not quite free of calories. While the chlorocarbon compound at the heart of the sweetener has no calories, the added bulk needed to stabilize it is a mixture of carbohydrates – which contain about 2 calories per teaspoon, or 96 calories per cup.
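The “not quite free of calories” point is simple arithmetic. A small sketch (assuming the standard 48 teaspoons per US cup, and the roughly 2 calories per teaspoon contributed by the carbohydrate bulking agents):

```python
# Illustrative arithmetic: calories from the carbohydrate bulk in a
# granular sucralose sweetener. Figures are the approximate ones cited
# in the text, not label-exact values.
TSP_PER_CUP = 48        # standard: 48 teaspoons per US cup
CAL_PER_TSP = 2         # ~2 cal/tsp from the bulking carbohydrates

def calories_per_cup(cal_per_tsp: float = CAL_PER_TSP,
                     tsp_per_cup: int = TSP_PER_CUP) -> float:
    """Total calories in one cup of the bulked sweetener."""
    return cal_per_tsp * tsp_per_cup

print(calories_per_cup())  # 96
```

Because each teaspoon-sized serving stays under the 5-calorie regulatory threshold, the product can legally be labeled calorie-free even though a cup of it carries roughly 96 calories.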


Acesulfame-K (Sunette) is a bitter-tasting sugar substitute seldom used alone. It has undergone safety evaluation multiple times since the 1980s and is considered by some to have a poor test record.

Before you make your decision, consider one more thing. Eating real, whole, fresh food rather than artificially flavored, processed versions revives dormant taste buds. Smaller amounts are more satisfying, allowing room for a few extra calories from naturally sweet sources.

Addendum: What about the “natural” sugar substitutes?

Stevia, a no-calorie sweetener chemically extracted from plant leaves, had been in exile in health food stores since an anonymous complaint to the FDA in the early 1990s. Some say this was political exile, since Stevia requires no patent. The beverage industry subsequently developed Stevia-flavored products, and the FDA changed its stance in December 2008. There is already a long history of Stevia use in Japan and China, but expect to see it combined with other sweeteners to improve its vaguely licorice-like flavor.

Nectresse is the Splenda manufacturer’s entry into the natural market. It is derived from monk fruit and is about 300 times sweeter than sugar, but must be combined with molasses and a sugar alcohol to make it work. It interferes with sugar absorption, and the sugar alcohol can ferment in the gut, causing gas. And yes, these plant-derived substances have some calories – the FDA allows up to 5 calories per half-teaspoon serving in its definition of “no-cal.”

The Sweet Tooth: Pathway to a Broken Heart?

For the last half century or more we have believed the dietary cholesterol theory of heart disease, a hypothesis (an idea to be tested by experiment) that found favor with researchers, grant makers, doctors and drug makers. What if this theory is wrong? What if cholesterol in artery walls has less to do with dietary fat than with the way the body processes carbohydrates? What if refined sugars and grains are the dietary culprits? Could insulin, the master hormone at the center of all energy processing, be a better marker than cholesterol for heart disease?

What is blood sugar?

The first thing to understand about sugar is that blood sugar is not the same thing as the sugar in your pantry, or the sugar in soft drinks, or the sugar in fresh fruit. Blood sugar is a simple molecule called glucose – a product of plants’ ability to convert the energy of the sun into starches, long chains of glucose linked together. When you eat a starch, the digestion process breaks the chains down into simple glucose molecules, which circulate in your blood. Glucose is used by every cell in the body for energy, and is also made into glycogen for storage in liver and muscle. The sugar in your pantry is sucrose, extracted from plants – specifically cane grasses and beets – by a refining process that concentrates and crystallizes it. Each sucrose molecule is a combination of one glucose molecule with one of fructose, a chemically different plant sugar.

The taste for sweetness is innate and possibly addictive. Before the advent of refined sugar, indulging the sweet tooth was difficult. The only edible sources were berries and fruits and small amounts of honey guarded by nasty bees – all confined by climate and geography. Sugar made its way into the human diet slowly, spreading from the East to the West as the secret of this “liquid gold” made its way along routes of commerce.

Sugar and the diseases of civilization

With time and commerce, consumption of sugar and refined grains skyrocketed. The diseases of civilization – diabetes, heart disease and obesity – followed refined sugar, flour and rice around the world, appearing wherever old dietary staples were replaced by these “white” foods. By the 1920s, Americans averaged 110-120 pounds of sugar per person per year. We inched up to 124 pounds by the late 1970s. Then came the Japanese chemical innovation that made high-fructose corn syrup (HFCS) a dietary staple. By 2000, HFCS had bumped sugar consumption up to 150 pounds per person per year, largely in the form of sweetened drinks.
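Annual per-capita figures are easier to grasp on a daily scale. A small conversion sketch using the article’s milestones (453.6 grams per pound is the standard conversion):

```python
# Convert annual per-capita sugar consumption (pounds/year) into a
# daily figure (grams/day).
GRAMS_PER_POUND = 453.6
DAYS_PER_YEAR = 365

def pounds_per_year_to_grams_per_day(pounds: float) -> float:
    """Annual per-capita pounds -> average grams per day."""
    return pounds * GRAMS_PER_POUND / DAYS_PER_YEAR

# The article's milestones: 1920s, late 1970s, and 2000
for era, lbs in [("1920s", 110), ("late 1970s", 124), ("2000", 150)]:
    print(f"{era}: {pounds_per_year_to_grams_per_day(lbs):.0f} g/day")
```

At 150 pounds per year, the average works out to roughly 186 grams of sugar a day – on the order of 45 teaspoons.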

High fructose corn syrup 

HFCS differs from sucrose because the ratio of fructose to glucose in corn syrup is 10% higher than in table sugar – 55:45 instead of 50:50. Some scientists believe that it is the remarkable increase in fructose consumption in modern times that correlates with the appearance of the metabolic syndrome – abdominal obesity, high fasting blood sugar, high triglycerides, abnormal lipoprotein levels and high blood pressure. If so, a 10% increase in fructose combined with a recent, large jump in overall sugar consumption may spell real trouble.
How can fructose cause trouble? Isn’t it the primary sugar of fruits? Yes, but eating an apple, with its small amount of fructose combined with absorption-slowing fiber, hardly nudges blood sugar up – a far cry from the blood sugar spike after 20 ounces of an HFCS-sweetened beverage. Drink a cola, and about 60% of the glucose in the HFCS goes directly into the blood for immediate use and 40% into the liver for storage as glycogen. The fructose all goes to the liver for conversion into fat, which is released into the blood as triglycerides. The higher the fructose in the diet, the higher the triglycerides in the blood. Fructose is a “lipogenic,” or fat-producing, sugar, and long term consumption also raises LDL, or bad, cholesterol.
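To put the 55:45 ratio in concrete terms, here is a small sketch splitting the sugars in a 20-ounce soft drink. The 65-gram total is an illustrative figure for a typical HFCS-sweetened soda, not a number from the article:

```python
# Split the sugars in an HFCS-55 sweetened drink into fructose and
# glucose, per the 55:45 ratio described in the text.
FRUCTOSE_FRACTION = 0.55   # HFCS-55 vs. 50:50 in table sugar (sucrose)

def hfcs55_split(total_sugar_g: float) -> tuple[float, float]:
    """Return (fructose grams, glucose grams) for a given sugar load."""
    fructose = total_sugar_g * FRUCTOSE_FRACTION
    glucose = total_sugar_g - fructose
    return fructose, glucose

# Illustrative 20-oz soda carrying ~65 g of sugars (assumed figure)
fructose_g, glucose_g = hfcs55_split(65.0)
print(f"fructose: {fructose_g:.1f} g, glucose: {glucose_g:.1f} g")
```

On those assumptions, every bottle delivers about 36 grams of fructose straight to the liver for conversion into fat, alongside the glucose load that drives the insulin response.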

The problems with too much sugar

Once sugar consumption exceeds the small amounts nature provides without refining techniques, trouble begins. The different ways the body processes fructose and glucose combine to produce very efficient fat production. A rise in blood glucose prompts the pancreas to put out insulin to help ferry glucose into cells for energy use or storage. Insulin, like fructose, is “lipogenic” because it helps move fats into storage depots in three areas – the liver, fat tissue, and the walls of arteries. And as triglycerides are formed from fructose, insulin busies itself shuttling them around the liver and out into the blood. As cells become less responsive to insulin’s signal, the pancreas produces even more insulin to take care of the glucose – the phenomenon known as insulin resistance, part of the metabolic syndrome associated with heart disease.

Is it the cholesterol or the sugar?

The theory that cholesterol in dietary fat is the direct cause of cholesterol deposits in arteries requires a leap over the metabolic pathways that process simple sugars and are intimately involved in fat formation and storage – and over the fact that many people with low cholesterol levels have heart disease. Over the last half century, many researchers and doctors made the leap because they believed the theory. Just as important to widespread acceptance, though, were less scientific influences like the cheap availability of a test for blood cholesterol, the difficulty and expense of measuring insulin, and the dominance of researchers devoted to the dietary cholesterol theory over those who questioned it.

Medical history books contain an embarrassing array of once-unassailable theories and practices that have fallen by the wayside. Despite a modern sense of scientific invincibility, current medical ideas are not immune from error. Sugar and refined carbohydrates are not yet the poster children for the scourge of heart disease, but they may be a far better target than cholesterol. If the dietary fat theory gives way to the sugar theory, the massive push to lower cholesterol by diet and drugs may go into the books as one of those once-unassailable ideas that eventually fell.

Iodine: An Unfinished Story

In these days of high tech medicine it is easy to forget that some of the most effective and efficient health interventions are simple and cheap. One example is the addition of iodine to salt, an idea which began in the early 1900s with experimental trials in schoolchildren living in what was then known as the “goiter belt” of the USA. In that region surrounding the Great Lakes, many children developed enlarged thyroid glands called goiters.  A goiter is a sign of iodine deficiency.  So successful were the trials of iodine-supplemented diets that, by the 1930s, 90% of residents of the Great Lakes region used iodized salt and goiter rates in the region had plummeted.  Now, 70% of the world’s population uses iodized salt.

Iodine as an essential element

When iodine is in short supply, thyroid glands grow large in an attempt to harvest as much of the vital element as possible from the blood. Iodine is necessary for making thyroid hormone, and thyroid hormone is crucial for normal development and metabolism. Pregnant women who have low iodine levels and insufficient thyroid hormone often miscarry, or produce babies who are deaf, mentally retarded and stunted in growth. In children and adults, iodine and thyroid hormone deficiencies cause fatigue, weight gain, lowered IQ, mental apathy and numerous metabolic abnormalities. Regular intake of iodine is a simple preventive measure for a host of serious problems.

Unequal distribution

Iodine exists in an inorganic form in soil and water and makes its way into the plants and animals that we consume by combining with larger carbon-containing molecules. In its inorganic form, iodine is a water-soluble salt which washes out of soil easily, especially in areas where the land is rocky and exposed. Where soil is iodine deficient, so are crops, unless supplemented with iodine-containing fertilizers. In contrast to its variable presence in soil, iodine is much more uniformly distributed in salt water seas. Algae, kelp and other seawater plants, as well as saltwater fish and shellfish, are the most reliable natural sources of dietary iodine, while iodine concentrations in land-based plants depend on the amount of iodine in the soil that supports them. Terrestrial animals supply iodine in proportion to the iodine in their food sources. Egg yolks are a good iodine source because, like people, chickens develop goiters, and chicken feed is supplemented with iodine. Dairy products are also good sources. Cattle feed was originally supplemented with iodine to prevent hoof rot, and because of the supplemented feed, iodine is secreted in the milk the cows produce.

Iodine and breast tissue

Milk contains iodine because mammary gland tissue, like thyroid gland tissue, accumulates iodine. The fact that iodine is found in human breast tissue, where it has no known function, has prompted studies of the element’s relationship to breast health.  Japanese women have low rates of breast cancer and fibrocystic breast disease compared to American women, and their regular iodine consumption via seaweed is high, perhaps 25x higher than the recommended daily iodine consumption in the US. Studies on the treatment of fibrocystic breast disease with iodine supplements have been promising but so far a direct relationship between breast disease and iodine consumption has not been proven.

Iodine supplementation?

Even if high dietary iodine content has something to do with low breast cancer rates among Japanese women, translating this information to attempts to prevent breast cancer is not a straightforward task. While it is clear that iodine supplementation prevents goiter, hypothyroidism and cognitive impairment, it is also clear that increasing iodine intake is not risk free, particularly in people who are accustomed to low levels of dietary iodine.  The thyroid gland, when faced with insufficient iodine in the blood, becomes a ruthless scavenger, extracting every last iodine molecule it can find. When iodine levels in the blood suddenly increase because of supplementary iodine intake, some thyroid glands will actually grow in size, pump out excessive thyroid hormone and even develop cancerous nodules. It may be that Japanese women can tolerate high amounts of iodine because it has never been in short supply for them. Caution and careful follow-up are always advisable when supplementing the diet with iodine in the form of tablets, drops or multivitamins.

Dietary iodine in the age of dietary angst

Obtaining enough iodine through the diet should be possible in almost all circumstances, especially because of the wisdom of public health policies regarding iodine. Nevertheless, some eating trends in the closing decades of the 20th century have again raised public health concerns about iodine intake. Assessments of body iodine content are made by measuring urinary iodine levels, since the body extracts as much iodine as it needs and excretes the rest in urine. But individual measurements are so variable that averages of all people tested are used to estimate the iodine status in a given geographic area. Between 1971 and 2001, American iodine intake dropped dramatically, then leveled off at half of the 1971 levels.

What happened over the last few decades? Americans began getting much more of their salt in the form of the un-iodized salt in processed foods. Many people began avoiding salt altogether, some quite unnecessarily. Sea salt appeared on grocery store shelves as part of the natural and organic food trends. It is also possible that the 1971 levels of iodine consumption were artificially high. Studies in the 1970s showed that iodine-containing sanitizers were raising iodine levels in cows’ milk; practices changed and milk iodine levels returned to normal. Between the 1960s and 1980s, iodine was used in dough making, and bread supplied 25% of the iodine consumed during that period.* Perhaps, then, the baseline measurements of the early 1970s were inflated, and intakes in 2001 and since are adequate, at least to prevent goiters from developing. But the fact that Japanese people ingest far higher levels of iodine from whole food sources without ill effect suggests that we can tolerate more. Stay tuned.


*Note: Iodine in the Nuclear Age

In the wake of the atmospheric nuclear testing period, the government mandated the use of iodine-containing oxidizing agents for dough conditioning in commercial baking. The iodine in the bread competed with radioactive iodine isotopes for uptake into the thyroid gland. Saturating the thyroid gland with normal iodine is standard practice when radioactive iodine in the atmosphere is a threat, as it was after the Chernobyl disaster. Taken within 8 hours after, or 48 hours prior to, a nuclear disaster, iodine can prevent accumulation of radioactive iodine in the thyroid gland and thus prevent radiation damage to the gland. The pills to be taken in the event of a nuclear catastrophe are simply potassium iodide.
