Iron Is the New Cholesterol


Cheerios are the best-selling breakfast cereal in America. The multi-grain version contains 18 milligrams of iron per serving, according to the label. Like almost any refined food made with wheat flour, it is fortified with iron. As it happens, there’s not a ton of oversight in the fortification process. One study measured the actual iron content of 29 breakfast cereals, and found that 21 contained 20 percent more iron than the label value, and 8 contained 50 percent more.1 One contained nearly 200 percent of the label value.

If your bowl of cereal actually contains 20 percent more iron than advertised, that’s about 22 mg per serving. A safe assumption is that people tend to consume at least two serving sizes at a time.1 That gets us to 44 mg. The recommended daily allowance of iron is 8 mg for men and 18 mg for pre-menopausal women. The tolerable upper intake—the maximum daily intake thought to be safe by the National Institutes of Health—is 45 mg for adults.
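The arithmetic is simple enough to check in a few lines. A quick sketch in Python; the 18 mg label value, the 1.2 factor, the two-serving assumption, and the 45 mg limit are from the text, while the rounding is mine:

```python
# Back-of-the-envelope check of the cereal numbers above.
LABEL_MG = 18            # iron per serving, per the multi-grain label
ACTUAL_FACTOR = 1.2      # cereal holding 20 percent more iron than labeled
SERVINGS = 2             # the article's assumption of two servings per bowl
NIH_UPPER_LIMIT_MG = 45  # NIH tolerable upper intake for adults

per_serving = LABEL_MG * ACTUAL_FACTOR   # ~21.6 mg; the article rounds to 22
per_bowl = per_serving * SERVINGS        # ~43.2 mg
print(f"{per_bowl:.0f} mg of the {NIH_UPPER_LIMIT_MG} mg upper limit")
```

One bowl lands within a couple of milligrams of the upper limit, before anything else is eaten that day.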


A single bowl of what is supposed to be a pretty healthy whole-grain breakfast, in other words, could bring an average person awfully close to the maximum daily iron intake regarded as safe.

And that’s just breakfast.

At the same time that our iron consumption has grown to the borders of safety, we are beginning to understand that elevated iron levels are associated with everything from cancer to heart disease. Christina Ellervik, a research scientist at Boston Children’s Hospital who studies the connection between iron and diabetes, puts it this way: “Where we are with iron now is like where we were with cholesterol 40 years ago.”

The story of energy metabolism—the basic engine of life at the cellular level—is one of electrons flowing much like water flows from mountains to the sea. Our cells can make use of this flow by regulating how these electrons travel, and by harvesting energy from them as they do so. The whole set-up is really not so unlike a hydroelectric dam.

The sea toward which these electrons flow is oxygen, and for most of life on earth, iron is the river. (Octopuses are strange outliers here—they use copper instead of iron, which makes their blood greenish-blue rather than red). Oxygen is hungry for electrons, making it an ideal destination. The proteins that facilitate the delivery contain tiny cores of iron, which manage the handling of the electrons as they are shuttled toward oxygen.

This is why iron and oxygen are both essential for life. There is a dark side to this cellular idyll, though.

Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells.

Normal energy metabolism in cells produces low levels of toxic byproducts. One of these byproducts is a derivative of oxygen called superoxide. Luckily, cells contain several enzymes that clean up most of this leaked superoxide almost immediately. They do so by converting it into another intermediary called hydrogen peroxide, which you might have in your medicine cabinet for treating nicks and scrapes. The hydrogen peroxide is then detoxified into water and oxygen.

Things can go awry if either superoxide or hydrogen peroxide happen to meet some iron on the way to detoxification. What then happens is a set of chemical reactions (described by Haber-Weiss chemistry and Fenton chemistry) that produce a potent and reactive oxygen derivative known as the hydroxyl radical. This radical—also called a free radical—wreaks havoc on biological molecules everywhere. As the chemists Barry Halliwell and John Gutteridge—who wrote the book on iron biochemistry—put it, “the reactivity of the hydroxyl radicals is so great that, if they are formed in living systems, they will react immediately with whatever biological molecule is in their vicinity, producing secondary radicals of variable reactivity.”2
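For the chemically inclined, the textbook forms of these reactions make the hazard concrete. Superoxide dismutase and catalase handle the cleanup described above; the trouble starts when ferrous iron hands an electron to hydrogen peroxide instead:

```
Detoxification:   2 •O₂⁻ + 2 H⁺  →  H₂O₂ + O₂       (superoxide dismutase)
                  2 H₂O₂         →  2 H₂O + O₂      (catalase)
Fenton reaction:  Fe²⁺ + H₂O₂    →  Fe³⁺ + OH⁻ + •OH
Haber-Weiss net:  •O₂⁻ + H₂O₂    →  O₂ + OH⁻ + •OH  (iron-catalyzed)
```

The net Haber-Weiss cycle regenerates the iron catalyst, so even a small pool of free iron can keep producing hydroxyl radicals as long as superoxide and peroxide are around.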

Such is the Faustian bargain that has been struck by life on this planet. Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells. As the neuroscientist J.R. Connor has said, “life was designed to exist at the very interface between iron sufficiency and deficiency.”3

Hemoglobin, ferritin, and transferrin

At the end of the 20th century, the metabolism of iron in the human body was still a bit of a mystery. Scientists knew of only two ways that the body could excrete iron—bleeding, and the routine sloughing of skin and gastrointestinal cells. But these processes amount to only a few milligrams per day. That meant that the body must have some way to tightly regulate iron absorption from the diet. In 2000 a major breakthrough was announced—a protein was found that functioned as the master regulator for iron. The system, as so many biological systems are, is perfectly elegant. When iron levels are sufficient, the protein, called hepcidin, is secreted into the blood by the liver. It then signals to gastrointestinal cells to decrease their absorption of iron, and for other cells around the body to sequester their iron into ferritin, a protein that stores iron. When iron levels are low, blood levels of hepcidin fall, and intestinal cells begin absorbing iron again. Hepcidin has since become recognized as the principal governor of iron homeostasis in the human body.
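The shape of that feedback loop can be caricatured in a few lines of code. This is a deliberately crude toy model, not physiology: the setpoint, rates, and units are all invented for illustration, and only the loop's structure follows the description above.

```python
# Toy model of the hepcidin feedback loop. All numbers are invented;
# only the shape of the loop follows the biology described above.

def hepcidin(body_iron: float, setpoint: float = 100.0) -> float:
    """The liver secretes more hepcidin as iron stores rise past a setpoint."""
    return max(0.0, body_iron - setpoint)

def absorbed(dietary_iron: float, hepcidin_level: float) -> float:
    """Gut absorption falls as circulating hepcidin rises."""
    return dietary_iron / (1.0 + 0.5 * hepcidin_level)

iron = 80.0                                   # start iron-deficient: hepcidin is low
for _ in range(30):                           # simulate 30 days
    iron += absorbed(10.0, hepcidin(iron))    # absorb from the diet
    iron -= 1.0                               # few mg/day lost to bleeding, sloughed cells

# iron rises quickly at first, then plateaus as hepcidin throttles absorption
```

When stores are low, hepcidin is absent and absorption runs freely; as stores climb past the setpoint, rising hepcidin chokes absorption down toward the small daily losses. Break the hepcidin signal, as the HFE mutations discussed below do, and the throttle never engages.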

But if hepcidin so masterfully regulates absorption of iron from the diet to match the body’s needs, is it possible for anyone to absorb too much iron?

In 1996, a team of scientists announced that they had discovered the gene responsible for hereditary hemochromatosis, a disorder causing the body to absorb too much iron. They called it HFE. Subsequent work revealed that the product of the HFE gene was instrumental in regulating hepcidin. People with a heritable mutation in this gene effectively have a gross handicap in the entire regulatory apparatus that hepcidin coordinates.

This, then, leaves open the possibility that some of us could in fact take in more iron than the body is able to handle. But how common are these mutations? Common enough to matter for even a minority of people reading these words?


Surprisingly, the answer is yes. The prevalence of hereditary hemochromatosis, in which two defective copies of the HFE gene are present and there are clinical signs of iron overload, is actually pretty high—as many as 1 in 200 in the United States. And perhaps 1 in 40 may have two defective HFE genes without overt hemochromatosis.4 That’s more than 8 million Americans who could have a significant short-circuit in their ability to regulate iron absorption and metabolism.

What if you have only one defective HFE gene, and one perfectly normal gene? This is called heterozygosity. We would expect to find more people in this situation than the homozygotes, or those with two bad copies of the gene. And in fact we do. Current estimates suggest that more than 30 percent of the U.S. population could be heterozygotes with one dysfunctional HFE gene.4 That’s pretty close to 100 million people.
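Those headcounts are easy to reproduce. A quick sketch; the prevalence rates come from the studies cited above, while the 330 million U.S. population is my round figure:

```python
US_POPULATION = 330_000_000   # assumed round figure, not from the article

homozygotes = US_POPULATION * (1 / 40)   # two defective HFE copies, no overt disease
heterozygotes = US_POPULATION * 0.30     # one defective HFE copy

print(f"~{homozygotes / 1e6:.0f} million possible homozygotes")     # ~8 million
print(f"~{heterozygotes / 1e6:.0f} million possible heterozygotes")  # ~99 million
```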

Does this matter? Or is one good gene enough? There isn’t much research, but so far the evidence suggests that some heterozygotes do have impaired iron metabolism. Studies have shown that HFE heterozygotes tend to have modest elevations of ferritin as well as of transferrin saturation (transferrin is the protein that chaperones iron through the blood), which would indicate elevated levels of iron.5,6 And a study published in 2001 concluded that HFE heterozygotes may have up to a fourfold increased risk of developing iron overload.4

A host of research articles have supported an association between iron and cancer.

Perhaps more concerning is that these heterozygotes have also been shown to be at increased risk for several chronic diseases, like heart disease and stroke. One study found that heterozygotes who smoked had a 3.5 times greater risk of cardiovascular disease than controls, while another found that heterozygosity alone significantly increased the risk of heart attack and stroke.7,8 A third study found that heterozygosity increased nearly sixfold the risk of cardiomyopathy, which can lead to heart failure.9

The connection between excessive iron and cardiovascular disease may extend beyond HFE heterozygotes. A recent meta-analysis identified 55 studies of this connection that were rigorous enough to meet their inclusion criteria. Out of 55 studies, 27 supported a positive relationship between iron and cardiovascular disease (more iron equals more disease), 20 found no significant relationship, and 8 found a negative relationship (more iron equals less disease).10

A few highlights: a Scandinavian study compared men who suffered a heart attack to men who didn’t, and found that elevated ferritin levels conferred a two- to threefold increase in heart attack risk. Another found that having a high ferritin level made a heart attack five times more likely than having a normal level. A larger study of 2,000 Finnish men found that an elevated ferritin level increased the risk of heart attack twofold, and that every 1 percent increase in ferritin level conferred a further 4 percent increase in that risk. The only other risk factor found to be stronger than ferritin in this study was smoking.

Ferritin isn’t a perfect marker of iron status, though, because it can also be affected by anything that causes inflammation. To address this problem a team of Canadian researchers directly compared blood iron levels to heart attack risk, and found that higher levels conferred a twofold increased risk in men and a fivefold increased risk in women.

If cardiovascular disease is one point in iron’s web of disease, diabetes may be another. The first hint of a relationship between iron and diabetes came in the late 1980s, when researchers discovered that patients receiving regular blood transfusions (which contain quite a bit of iron) were at significantly increased risk of diabetes. In hemochromatosis, there had been no way to know if the associated disturbance in glucose metabolism was due to the accumulation of iron itself, or to the underlying genetic defect. This new link between frequent transfusions and diabetes was indirect evidence that the iron itself may be the cause.

The next step was to mine existing data for associations between markers of iron status and diabetes. The first study to do so came out of Finland in 1997: Among 1,000 randomly selected Scandinavian men, ferritin emerged as a strong predictor of dysfunctional glucose metabolism, second only to body mass index as a risk factor.11 In 1999, researchers found that an elevated ferritin level increased the odds of having diabetes fivefold in men and nearly fourfold in women—similar in magnitude to the association between obesity and diabetes.12 Five years later, another study found that elevated ferritin roughly doubled the risk for metabolic syndrome, a condition that often leads to diabetes, hypertension, liver disease, and cardiovascular disease.13

Christina Ellervik’s first contribution to the field came in 2011, with a study investigating the association between increased transferrin saturation—a measure of how much iron is loaded onto the transferrin protein, which moves iron through the blood—and diabetes risk.14 Ellervik found that within a sample of nearly 35,000 Danes, transferrin saturation greater than 50 percent conferred a two- to threefold increased risk of diabetes. She also identified an increase in mortality rates with transferrin saturation greater than 50 percent.
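Transferrin saturation has a standard clinical definition: serum iron divided by total iron-binding capacity (TIBC), expressed as a percentage. A minimal sketch, with illustrative lab values that are not from Ellervik's study:

```python
def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """Percent of transferrin's iron-binding capacity occupied by iron.

    Standard clinical formula: serum iron / TIBC * 100.
    Both inputs must share units (here, micrograms per deciliter).
    """
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Illustrative values only: serum iron of 180 µg/dL against a TIBC of
# 300 µg/dL gives 60% saturation, above the 50% threshold Ellervik studied.
print(transferrin_saturation(180, 300))  # 60.0
```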

In 2015, she led another study that found that, among a sample of 6,000 people, those whose ferritin levels were in the highest 20 percent had 4 times greater odds of diabetes than those with ferritin levels in the lowest 20 percent.15 Blood glucose levels, blood insulin levels, and insulin sensitivity all were raised with higher ferritin levels.

“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials.”

There’s a problem here, though. All of these studies show associations. They show that two things tend to happen together. But they don’t tell us anything about causality. To learn something about causality, you need an intervention. In the case of iron, you’d need to lower the iron and then watch what happens. Fortunately, there’s a very easy and very safe intervention to lower iron levels that’s performed millions of times every year—phlebotomy, also known as blood donation.

One of the first studies to use phlebotomy to examine the relationship between iron and diabetes was published in 1998.16 The authors found that among both healthy and diabetic subjects, phlebotomy improved insulin sensitivity and glucose metabolism. A 2005 study found that regular blood donors exhibited lower iron stores and significantly greater insulin sensitivity than non-donors.17 In 2012, researchers phlebotomized pre-diabetic volunteers until their ferritin levels dropped significantly, and found a marked subsequent improvement in their insulin sensitivity.18 In that same year, a different group of scientists studied the effect of phlebotomy on several elements of metabolic syndrome, including glucose metabolism. They found that a single phlebotomy session was associated with improvement in blood pressure, fasting glucose, hemoglobin A1C (a marker for average glucose levels), and blood cholesterol six weeks later.19

Many caveats apply to this evidence—the line between correlation and causation remains unclear, some of the studies used relatively small sample sizes, and phlebotomy may cause other changes in addition to lowering iron. But taken together, the data lends weight to the idea that iron plays a significant role in the tortuous pathophysiology of diabetes.

As more published data began to suggest a relationship between iron, cardiovascular disease, and diabetes, researchers started casting broader nets.

Next up was cancer.

It had been known since the late 1950s that injecting large doses of iron into lab animals could cause malignant tumors, but it wasn’t until the 1980s that scientists began looking for associations between iron and cancer in humans. In 1985, Ernest Graf and John Eaton proposed that differences in colon cancer rates among countries could be accounted for by variation in the fiber content of local diets, which can in turn affect iron absorption.20

The following year, Richard Stevens found that elevated ferritin was associated with triple the risk of death from cancer among a group of 20,000 men in Taiwan.21 Two years later Stevens showed that American men who developed cancer had higher transferrin saturation and serum iron than men who didn’t.22 In 1990, a large study of Swedish blood donors found that they were 20 percent less likely to get cancer than non-donor controls.23 Four years later, a group of Finnish researchers found that elevated transferrin saturation among 40,000 Scandinavians conferred a threefold increased risk for colorectal cancer, and a 1.5-fold increased risk for lung cancer.24

A host of research articles have been published since Graf and Eaton’s first paper, and most have supported an association between iron and cancer—particularly colorectal cancer. In 2001, a review of 33 publications investigating the link between iron and colorectal cancer found that more than 75 percent of them supported the relationship.25 A 2004 study found an increased risk of death from cancer with rising serum iron and transferrin saturation. People with the highest levels were twice as likely to die from cancer as those with the lowest levels.26 And in 2008, another study confirmed that Swedish blood donors had about a 30 percent decrease in cancer risk.27


There are a few other lines of evidence that support the association between iron and cancer. People with an HFE mutation have an increased risk of developing colon and blood cancers.28 Conversely, people diagnosed with breast, blood, and colorectal cancers are more than twice as likely to be HFE heterozygotes than are healthy controls.29

There are also a handful of interventional trials investigating the relationship between iron and cancer. The first was published in 2007 by a group of Japanese scientists who had previously found that iron reduction via phlebotomy essentially normalized markers of liver injury in patients with hepatitis C. Hepatocellular carcinoma (HCC) is a feared consequence of hepatitis C and cirrhosis, and they hypothesized that phlebotomy might also reduce the risk of developing this cancer. The results were remarkable—at five years only 5.7 percent of patients in the phlebotomy group had developed HCC compared to 17.5 percent of controls. At 10 years the results were even more striking, with 8.6 percent of phlebotomized patients developing HCC compared to an astonishing 39 percent of controls.30

The second study to investigate the effects of phlebotomy on cancer risk was published the following year by Leo Zacharski, a colorful emeritus professor at Dartmouth. In a multi-center, randomized study originally designed to look at the effects of phlebotomy on vascular disease, patients allocated to the iron-reduction group were about 35 percent less likely to develop cancer after 4.5 years than controls. And among all patients who did develop cancer, those in the phlebotomy group were about 60 percent less likely to have died from it at the end of the follow-up period.31

The brain is a hungry organ. Though it accounts for only 2 to 3 percent of body mass, it consumes 20 percent of the body’s oxygen. With a metabolism that hot, it’s inevitable that the brain will also produce more free radicals as it churns through all that oxygen. Surprisingly, the brain appears to have less antioxidant capacity than other tissues in the body, which could make it more susceptible to oxidative stress.32 The balance between normal cellular energy metabolism and damage from reactive oxygen species may be even more delicate in the brain than elsewhere in the body. This, in turn, points to a sensitivity to iron.

It’s been known since the 1920s that neurodegenerative disease—illnesses like Alzheimer’s and Parkinson’s—is associated with increased iron deposition in the brain. In 1924, a towering Parisian neurologist named Jean Lhermitte was among the first to show that certain regions of the brain become congested with abnormal amounts of iron in advanced Parkinson’s disease.33 Thirty years later, in 1953, a physician named Louis Goodman demonstrated that the brains of patients with Alzheimer’s disease had markedly abnormal levels of iron deposited in the same regions as the famed plaques and tangles that define the illness.34 Goodman’s work was largely forgotten for several decades, until a 1992 paper resurrected and confirmed his findings and kindled new interest. Two years later an exciting new technology called MRI was deployed to probe the association between iron and disease in living patients, confirming earlier autopsy findings that Alzheimer brains demonstrated significant aberrations in tissue iron.35

Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries.

By the mid 1990s, there was compelling evidence that Alzheimer’s and Parkinson’s disease involved some dysregulation of iron metabolism in the brain, but no one knew whether the relationship was cause or consequence of the disease process. Hints began trickling in at around the same time the MRI findings were being published. A 1993 paper reported that iron promoted aggregation of amyloid-β, the major constituent of Alzheimer’s plaques.36 In 1997, researchers found that the aberrant iron associated with Alzheimer’s plaques was highly reactive and able to freely generate toxic oxygen radicals.37 By 2010, it had been shown that oxidative damage was one of the earliest detectable changes associated with Alzheimer’s, and that reactive iron was present in the earliest stages of the disease.38,39 And in 2015, a seven-year longitudinal study showed that cerebrospinal fluid ferritin levels were a strong predictor of cognitive decline and development of Alzheimer’s dementia.40

Perhaps most surprising was the discovery in 1999 that the precursor to amyloid-β was under direct control of cellular iron levels—the more iron around, the more amyloid was produced.41 This raised the tantalizing possibility that amyloid plaques might actually represent an adaptive response rather than a cause, an idea that has been indirectly supported by the spectacular failure of essentially all efforts to directly target amyloid protein as treatment for the disease.

Together, these findings suggest that abnormal iron metabolism in the brain could be a causative factor in Alzheimer’s and other neurodegenerative diseases. If that’s true, then we might expect that people who are genetically predisposed to aberrant iron metabolism would be at higher risk of dementing diseases than others. And so they are.

In the early 2000s, it was discovered that patients with familial Alzheimer’s were more likely than healthy controls to carry one of the HFE mutations.42 Another study found that these genotypes were associated with earlier onset of the disease, and that the effect was even more powerful in people who carried an HFE mutation as well as an ApoE4 allele, the primary genetic risk factor for Alzheimer’s disease.43 A 2004 study showed that the co-occurrence of an HFE mutation with a known variant in the transferrin gene conferred a fivefold increased risk of Alzheimer’s.44 Two years later a team of Portuguese scientists found that the HFE variants were associated with increased risk of Parkinson’s as well.45

What about interventional trials? For neurodegenerative disease, there has been exactly one. In 1991, a team of Canadian scientists published the results of a two-year randomized trial of the iron chelator desferrioxamine in 48 patients with Alzheimer’s disease.46 Chelators are a class of medication that bind metal cations like iron, sequester them, and facilitate their excretion from the body. Patients were randomly allocated to receive desferrioxamine, placebo, or no treatment. The results were impressive—at two years, iron reduction had cut the rate of cognitive decline in half.

The study was published in The Lancet, one of the world’s most prestigious medical journals, but seems to have been forgotten in the 20-odd year interim. Not a single interventional study testing the role of iron in Alzheimer’s disease has been published since.

If so many studies seem to show a consistent association between iron levels and chronic disease, why isn’t more work being done to clarify the risk?

“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials,” Dartmouth’s Zacharski said to me. “If people would just take up the gauntlet and do well-designed, insightful studies of the iron hypothesis, we would have a much firmer understanding of this. Just imagine if it turns out to be verified!”

His perspective on why more trials haven’t been done is fascinating, and parallels much of what other experts in the field said. “Sexiness,” believe it or not, came up in multiple conversations—molecular biology and targeted pharmaceuticals are hot (and lucrative), and iron is decidedly not. “Maybe it’s not sexy enough, too passé, too old school,” said one researcher I spoke to. Zacharski echoed this in our conversation, and pointed out that many modern trials are funded by the pharmaceutical industry, which is keen to develop the next billion-dollar drug. Government agencies like the NIH can step in to fill gaps left by the for-profit research industry, but publicly funded scientists are subject to the same sexiness bias as everyone else. As one senior university scientist told me, “NIH goes for fashion.”

Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries. He thinks that even subtly elevated iron levels can result in free radical formation, which then contribute to chronic inflammation. And chronic inflammation, we know, is strongly linked to everything from heart disease to diabetes, cancer to Alzheimer’s.

“If this doesn’t deserve randomized trials,” he told me, “then I don’t know what does.”

Until those randomized trials arrive—I’ll see you at the blood bank.

Clayton Dalton is an emergency medicine resident at Massachusetts General Hospital in Boston. He has published stories and essays with NPR, Aeon, and The Los Angeles Review.

Lead image: Liliya Kandrashevich / Shutterstock

References

1. Whittaker, P., Tufaro, P.R., & Rader, J.I. Iron and folate in fortified cereals. Journal of the American College of Nutrition 20, 247-254 (2001).

2. Halliwell, B. & Gutteridge, J.M. Oxygen toxicity, oxygen radicals, transition metals and disease. Biochemical Journal 219, 1-14 (1984).

3. Connor, J.R. & Ghio, A.J. The impact of host iron homeostasis on disease. Preface. Biochimica et Biophysica Acta 1790, 581-582 (2009).

4. Hanson, E.H., Imperatore, G., & Burke, W. HFE gene and hereditary hemochromatosis: a HuGE review. Human Genome Epidemiology. American Journal of Epidemiology 154, 193-206 (2001).

5. Beutler, E., Felitti, V.J., Koziol, J.A., Ho, N.J., & Gelbart, T. Penetrance of 845G—> A (C282Y) HFE hereditary haemochromatosis mutation in the USA. The Lancet 359, 211-218 (2002).

6. Rossi, E., et al. Effect of hemochromatosis genotype and lifestyle factors on iron and red cell indices in a community population. Clinical Chemistry 47, 202-208 (2001).

7. Roest, M., et al. Heterozygosity for a hereditary hemochromatosis gene is associated with cardiovascular death in women. Circulation 100, 1268-1273 (1999).

8. Tuomainen, T.P., et al. Increased risk of acute myocardial infarction in carriers of the hemochromatosis gene Cys282Tyr mutation: A prospective cohort study in men in eastern Finland. Circulation 100, 1274-1279 (1999).

9. Pereira, A.C., et al. Hemochromatosis gene variants in patients with cardiomyopathy. American Journal of Cardiology 88, 388-391 (2001).

10. Muñoz-bravo, C., Gutiérrez-bedmar, M., Gómez-aracena, J., García-rodríguez, A., & Navajas, J.F. Iron: protector or risk factor for cardiovascular disease? Still controversial. Nutrients 5, 2384-2404 (2013).

11. Tuomainen, T.P., et al. Body iron stores are associated with serum insulin and blood glucose concentrations. Population study in 1,013 eastern Finnish men. Diabetes Care 20, 426-428 (1997).

12. Ford, E.S. & Cogswell, M.E. Diabetes and serum ferritin concentration among U.S. adults. Diabetes Care 22, 1978-1983 (1999).

13. Jehn, M., Clark, J.M., & Guallar, E. Serum ferritin and risk of the metabolic syndrome in U.S. adults. Diabetes Care 27, 2422-2428 (2004).

14. Ellervik, C., et al. Elevated transferrin saturation and risk of diabetes: three population-based studies. Diabetes Care 34, 2256-2258 (2011).

15. Bonfils, L., et al. Fasting serum levels of ferritin are associated with impaired pancreatic beta cell function and decreased insulin sensitivity: a population-based study. Diabetologia 58, 523-533 (2015).

16. Facchini, F.S. Effect of phlebotomy on plasma glucose and insulin concentrations. Diabetes Care 21, 2190 (1998).

17. Fernández-real, J.M., López-bermejo, A., & Ricart, W. Iron stores, blood donation, and insulin sensitivity and secretion. Clinical Chemistry 51, 1201-1205 (2005).

18. Gabrielsen, J.S., et al. Adipocyte iron regulates adiponectin and insulin sensitivity. Journal of Clinical Investigation 122, 3529-3540 (2012).

19. Houschyar, K.S., et al. Effects of phlebotomy-induced reduction of body iron stores on metabolic syndrome: results from a randomized clinical trial. BMC Medicine 10:54 (2012).

20. Graf, E. & Eaton, J.W. Dietary suppression of colonic cancer. Fiber or phytate?. Cancer 56, 717-718 (1985).

21. Stevens, R.G., Beasley, R.P., & Blumberg, B.S. Iron-binding proteins and risk of cancer in Taiwan. Journal of the National Cancer Institute 76, 605-610 (1986).

22. Stevens, R.G., Jones, D.Y., Micozzi, M.S., & Taylor, P.R. Body iron stores and the risk of cancer. New England Journal of Medicine 319, 1047-1052 (1988).

23. Merk, K., et al. The incidence of cancer among blood donors. International Journal of Epidemiology 19, 505-509 (1990).

24. Knekt, P., et al. Body iron stores and risk of cancer. International Journal of Cancer 56, 379-382 (1994).

25. Nelson, R.L. Iron and colorectal cancer risk: human studies. Nutrition Review 59, 140-148 (2001).

26. Wu, T., Sempos, C.T., Freudenheim, J.L., Muti, P., & Smit, E. Serum iron, copper and zinc concentrations and risk of cancer mortality in US adults. Annals of Epidemiology 14, 195-201 (2004).

27. Edgren, G., et al. Donation frequency, iron loss, and risk of cancer among blood donors. Journal of the National Cancer Institute 100, 572-579 (2008).

28. Nelson, R.L., Davis, F.G., Persky, V., & Becker, E. Risk of neoplastic and other diseases among people with heterozygosity for hereditary hemochromatosis. Cancer 76, 875-879 (1995).

29. Weinberg, E.D. & Miklossy, J. Iron withholding: a defense against disease. Journal of Alzheimer’s Disease 13, 451-463 (2008).

30. Kato, J., et al. Long-term phlebotomy with low-iron diet therapy lowers risk of development of hepatocellular carcinoma from chronic hepatitis C. Journal of Gastroenterology 42, 830-836 (2007).

31. Zacharski, L.R., et al. Decreased cancer risk after iron reduction in patients with peripheral arterial disease: results from a randomized trial. Journal of the National Cancer Institute 100, 996-1002 (2008).

32. Lee, H.G., et al. Amyloid-beta in Alzheimer disease: the null versus the alternate hypotheses. Journal of Pharmacology and Experimental Therapeutics 321, 823-829 (2007).



Is Intermittent Fasting Really Worth It?


After all, 16 hours is a long time to go without eating. Here’s everything you need to know about the popular weight-loss regimen—including whether it actually works.

Chris Pratt! Hugh Jackman! Halle Berry! Kourtney Kardashian! What these celebrities have in common, other than a gratuitous exclamation point after their names, is a professed fondness for intermittent fasting, the diet craze turning the fitness world on its sweaty, well-toned head. For help determining whether you, too, should incorporate this into your 2019 resolution-related plans, we asked a few experts to explain what it is, why people love it, and whether it’s really worth the pain of forgoing on-demand snacks for the rest of the winter.



What is intermittent fasting, exactly?

Intermittent fasting, unlike many other diets, is famously flexible in that you choose the days and hours during which you think it’s best to fast. The two most common methods are the 16:8 strategy—where you eat whatever you want (within reason) for eight hours a day and then fast for the other 16—and the 5:2 method, where you eat normally five days a week and then keep your food intake to roughly 500-600 calories for the other two days. It’s kind of a simplified-calories math problem that’s supposed to prevent the yo-yo effect of weight loss and weight gain.

“There are different ways to do this diet, but the bottom line is that no matter which you choose, you’re taking in less energy, and because of that, you’re going to start using your own body stores for energy,” says Lisa Sasson, a clinical professor of nutrition at NYU. “If you don’t, you’re not going to lose weight.”

Why might I want to try it?

A recent study completed by the German Cancer Research Center concluded that intermittent fasting indeed “helps lose weight and promotes health,” and noted that the regimen proved especially adept at getting rid of fat in the liver. A USC study also found that the diet reduced participants’ risk of cancer, diabetes, heart disease, and other age-related diseases. While researchers involved cautioned that more testing is necessary, the results are at least encouraging.

Most people who swear by intermittent fasting will tell you it helps not only with losing weight but also with reducing “belly fat.” This is not a conclusion with scientific backing, but it is the sort of thing to which every six-pack enthusiast aspires.


Why might I not want to try it?

“There’s really no conclusive evidence that there’s any benefit,” Sasson says. The German Cancer Research Center study qualified its findings by noting that the positive results weren’t noticeably better than those experienced by subjects who adopted a conventional calorie-reduction diet. In other words, it works, but not notably better than the alternative. (Sasson also offered a helpful list of individuals who should not give intermittent fasting a try: pregnant women and anyone with diabetes, cancer, or an eating disorder.)

The best long-term diets, no matter what their rules entail, are the ones that are least difficult to maintain—and again, in this regard, intermittent fasting isn’t inherently superior to anything else. “Are you making changes in your behavior? Have you learned positive habits so that when you go back to not fasting, you’re going to be a healthier eater?” Sasson asks. “I know people who fast because they think, Okay, I’m going to be really bad and overdrink or overeat, and then two days a week I’m going to have a clean life, and that’s just not how it works.”

Also, for many people, a full 16 hours of fasting just isn’t realistic, says Cynthia Sass, a New York City– and L.A.-based performance nutritionist. She recommends 12 hours of overnight fasting at most and believes the 16-hour gap is especially tough on those who exercise early in the morning or late at night. “If fasting makes you feel miserable and results in intense cravings and rebound overeating, it’s not the right path for you,” she says.

So—should I try it?

As long as you’re aware that it isn’t nutritional magic, Sasson isn’t against intermittent fasting altogether. “I’ve worked with patients who need positive reinforcement to see that their weight went down to feel better, and they feel in control for the first time,” she says. “That self-efficacy, that feeling that they could do it—for some, that might be important.”

Of the two most popular methods, Sasson leans toward the 5:2 schedule as slightly more manageable, since you’re only reducing your intake twice a week. But again, that’s contingent on you being a responsible dieter on your days of lowered caloric intake, which requires an immense amount of discipline—especially when it comes to remembering to drink water. “You can go a long time without food, but only a few days without adequate hydration,” she warns.

If these extended periods without delicious food sound too painful to handle, rest assured: The best available evidence indicates that a regular ol’ diet is at least as safe and healthy and efficacious as intermittent fasting. Besides, sooner or later, a shiny new fad is bound to come along for the A-listers to fawn over, she says: “There’s going to be a new darling of the month before you know it.”

Are “Natural” Sugars Any Better Than Refined White Sugar?


The dangerous health effects of refined sugar are real, yet we are still consuming an unbelievable amount of it.

Understanding this, many people are trying to choose a “healthier” sugar, opting for the more natural options at the grocery store. You’ve probably seen many of these yourself. So what are these natural alternatives, and are they any better for you than refined white sugar?

Here’s what you need to know

Refined white sugar: You probably know this already, but it’s worth repeating. Refined white sugar is completely stripped of nutritional value, and more than 65 percent of the white sugar available commercially is made from GMO sugar beets.

Brown sugar: Commercial brown sugar is nothing more than refined white sugar with some molasses added back in for color and flavor. Don’t be fooled by the color or the claims. It’s just as bad.

Evaporated cane juice: Made from sugar cane (rather than sugar beets), evaporated cane juice is slightly less refined than white sugar, and consequently keeps more of the color, flavor, and nutrients of the cane. But really the only difference between commercial evaporated cane juice and white sugar is that the former goes through one less step of refinement.

Raw cane sugar: This type of sugar is less processed than refined white sugar and still contains some of the original nutrients present in cane juice, including amino acids, minerals, vitamins, and even some antioxidants. Because it’s organic, you also won’t be exposed to the pesticides used on commercially grown sugar cane. So, while certainly a better choice than refined white sugar, remember that it’s still sugar and should be eaten in limited quantities.

Coconut sugar: Coconut sugar is harvested from the sap of the coconut plant through a fairly natural process of extracting the juice and then allowing the water to evaporate. Process-wise, it is one of the most sustainable methods of sugar production, and the product also contains a small amount of fiber and other nutrients. Coconut sugar also contains a lower proportion of fructose than the other sugars listed, which arguably makes it somewhat healthier than the other choices.

Keep in mind, no matter which sugar you pick, quantity is just as important as quality, if not more so. Consuming a few teaspoons of refined white sugar is almost certainly still healthier than consuming a whole lot more of the natural varieties. So whatever kind of natural sugar you choose, make sure you limit your intake the same way you would with refined sugar!

https://naturalhealthnews.website/2018/12/27/are-natural-sugars-any-better-than-refined-white-sugar/?fbclid=IwAR05dI5zGHtSQSVHcW1WwlMyo0sHCa2Fe7jivTF5D7-tGURA4QkPWoAE-qg

Are Vegetarian and Vegan Diets Best for Preventing and Treating Diabetes?


I’m going to try to answer a question asked by both patients and care providers: Is a vegetarian or vegan diet the ideal diet for preventing and treating diabetes?

A quick Internet search would yield plenty of popular articles that advocate a vegetarian diet. According to certain websites, such a diet prevents the onset of diabetes or, in the case of confirmed diabetes, enables one to stop treatments.

Of course, such claims are completely false. What’s more challenging, however, is to determine whether a vegetarian diet is the one that should be recommended on a first-line basis in patients with diabetes in the hopes of achieving diabetic control and preventing complications.

Interpret Studies With Caution

Reviews and meta-analyses on this subject conclude rather strongly that diets that are low in or that contain no foods of animal origin are beneficial. We can cite a review that was recently published in Current Diabetes Reports.[1]

It discusses a lower incidence of diabetes in longitudinal studies and decreased glycated hemoglobin levels and diabetes treatments required in randomized studies where a vegetarian diet was compared with low-fat diets. For my part, I think these results and claims—which seem a bit exaggerated to me—need to be seen in relative terms.

In regard to cohort studies, a lot of caution is warranted. Although the incidence of diabetes in certain studies was lower by up to one half in the followers of a vegetarian diet, it must be borne in mind that this is a typical situation where it is impossible to establish a causal link between a way of eating and a disease risk, given the importance of confounding factors. Being a vegetarian is associated, on average, with a largely healthier lifestyle. This is common knowledge.

As for randomized trials, it must be said that they are of short duration. For the most part, when we look at the meta-analyses, we find greater weight loss with vegetarian diets than with the diets tested in the control groups.

The main hypothesis for explaining this difference in weight loss between the groups is that in the open-label studies, the participants in the intervention group were more or less consciously influenced to lose weight, even if, in principle, weight loss was not an objective. Therefore, weight loss would have been what was responsible for the improvement in glycemic control rather than the quality of the diet, per se.

Also, it is well known that high-protein or high-fat diets have a dramatic effect on glycemic control in patients with diabetes, in addition to bringing about rapid weight loss.

What About the Mediterranean Diet?

Last, to the best of my knowledge, no randomized trial has ever compared a vegetarian diet with a Mediterranean-type diet, which contains foods of animal origin. This is a major shortcoming.

Ideally, these diets should be compared if one really wants to conclude that vegetarianism is superior. It is a shortcoming all the more so because there is abundant literature in favor of the Mediterranean diet for preventing and treating cardiometabolic risk factors, and especially for improving glycemic control in diabetics.

Although no clinical trial has compared the vegetarian diet with the Mediterranean diet, there is a recently published network meta-analysis[2] from which one can make an indirect comparison between the vegetarian and Mediterranean diets and, more generally, other types of diets—namely, the Paleo diet, high-protein diet, low-carb diet, and a diet with low glycemic index and load.

The overall finding of this network meta-analysis is in favor of the Mediterranean diet when it comes to glycemic control. The Mediterranean diet seems to be at least as effective as, or even superior to, a vegetarian diet, though the vegetarian diet does not fare badly either and is also associated with better diabetes control.

Think Long-term

The answer to the question “Is a vegetarian diet the one to recommend on a first-line basis in patients with diabetes?” is, at least in my opinion, no. My message is that one should recommend a diet that can be followed over the long term.

If the patient chooses a vegetarian diet, one can respect this choice entirely. The same goes for an animal fat–free diet, on the condition that a dietetic follow-up is provided. If, on the other hand, a patient wants to eat a diet containing foods of animal origin, one can very well recommend another balanced and health-friendly diet, the model being the Mediterranean diet.

Source: www.medscape.com

Food and Fertility: What Should Women Consume When Trying to Conceive?


What Should I Eat to Enhance Fertility?

A young fertile couple’s chance of conceiving in the first month of trying is 25%-30%.[1] By the end of the first year, about 85% of couples will have achieved a pregnancy; the remaining 15% are diagnosed with infertility.[2]

Infertility has many known causes (eg, ovulatory defect, tubal occlusion, low sperm counts), and many factors lower the chance of pregnancy (eg, older age, lower ovarian reserve, endometriosis). There are modifiable and nonmodifiable risk factors for infertility or reduced fertility. Although some factors can’t be altered (eg, age and ovarian reserve), others, such as body weight and lifestyle habits, are modifiable.

Patients frequently ask providers to offer them guidance on the ideal diet to improve their chances of conceiving and carrying a pregnancy to term. A recent review by Chiu and colleagues[3] summarizes the available epidemiologic literature on the reproductive benefits of diets and dietary supplements.

Nutrition and Fertility: Review Findings

This article reviews the potential benefits of consumption of certain micronutrients, macronutrients, and dietary patterns. The following conclusions are drawn from this review:

Folic acid. Folic acid is important for germ cell production and pregnancy. The recommended daily dose to prevent neural tube defects is 400-800 µg. Women who take folic acid-containing multivitamins are less likely to be anovulatory, and the time to achieve a pregnancy is reduced. Those who consume more than 800 µg of folic acid daily are more likely to conceive with assisted reproductive technology (ART) than those whose daily intake is less than 400 µg.

Vitamin D. Vitamin D may affect fertility through receptors found in the ovaries and endometrium. An extremely low vitamin D level (< 20 ng/mL) is associated with a higher risk for spontaneous miscarriage. Some reports suggest that women with adequate vitamin D levels (> 30 ng/mL) are more likely to conceive after ART than those whose vitamin D levels are insufficient (20-30 ng/mL) or deficient (< 20 ng/mL). These findings, however, are inconclusive.
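
As a rough illustration, the cut-offs quoted above can be expressed as a simple classification. This is a hypothetical helper for illustration only; the function name and structure are not from the review, and it is not clinical advice.

```python
def vitamin_d_status(ng_per_ml):
    """Classify a serum vitamin D level (ng/mL) using the cut-offs
    quoted above (illustrative only, not clinical advice)."""
    if ng_per_ml < 20:
        return "deficient"      # associated with higher miscarriage risk
    if ng_per_ml <= 30:
        return "insufficient"
    return "adequate"           # > 30 ng/mL

print(vitamin_d_status(15))   # deficient
print(vitamin_d_status(25))   # insufficient
print(vitamin_d_status(35))   # adequate
```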

Carbohydrates. Dietary carbohydrates affect glucose homeostasis and insulin sensitivity, and by these mechanisms can affect reproduction. The impact is most pronounced among women with polycystic ovary syndrome (PCOS). In women with PCOS, a reduction in glycemic load improves insulin sensitivity as well as ovulatory function. Whole grains have antioxidant effects and also improve insulin sensitivity, thereby positively influencing reproduction.

Omega-3 supplements. Omega-3 polyunsaturated fatty acids lower the risk for endometriosis. Increased levels of omega-3 polyunsaturated fatty acids are associated with higher clinical pregnancy and live birth rates.

Protein and dairy. Some reports suggest that dairy protein intake lowers ovarian reserve. Other reports suggest improved ART outcomes with increased dairy intake. Meat, fish, and dairy products, however, can also serve as vehicles for environmental contamination that may adversely affect the embryo. Fish, on the other hand, has been shown to exert positive effects on fertility.

Dietary approach. In general, a Mediterranean diet is favored (high intake of fruits, vegetables, fish, chicken, and olive oil) among women diagnosed with infertility.

Viewpoint

A well-balanced diet, rich in vegetables and fruits, is preferred for infertile women and should provide the required micro- and macronutrients. It remains common for patients to consume a wide variety of vitamin, mineral, and micronutrient supplements daily.[4] Supplements should not replace food sources of vitamins and trace elements: differences in bioavailability (natural versus synthetic) and inaccurate label declarations may result in suboptimal intake of important nutrients.[5,6] Furthermore, naturally occurring vitamins and micronutrients are more efficiently absorbed.

With respect to overall diet, women are advised to follow a caloric intake that won’t contribute to being overweight or obese. Obesity is on the rise among younger people, including children. Obese women have a lower chance of conceiving and are less likely to have an uncomplicated pregnancy.[7] Proper weight can be maintained with an appropriate diet and regular exercise.

Finally, women must abstain from substances that are potentially harmful to pregnancy (eg, smoking, alcohol, recreational drugs, high caffeine intake).

Unfortunately, very few large studies are available to guide us in our recommendations to patients. Most of the available literature is based on retrospective data. Therefore, prospective, randomized studies are urgently needed to study the association between nutrition and fertility, as well as dietary influences on pregnancy outcomes.

11 Amazing Benefits of Bitter Melon or Bitter Gourd



The health benefits of bitter melon, or bitter gourd, include managing blood sugar and treating diabetes, enhancing immunity, relieving hemorrhoids, sharpening vision, relieving asthma, improving skin and treating skin conditions, and a possible role in cancer prevention and in treating HIV and herpes. Other benefits include reducing cholesterol levels, promoting bone health, and promoting weight loss and good digestion.

What is Bitter Melon or Bitter Gourd?

Momordica charantia or Bitter Melon is the edible fruit-pod of a tendril-bearing vine native to India, and is now widely cultivated in Asia, Africa, and the Caribbean islands. It is harvested before it fully ripens otherwise it becomes increasingly bitter. The plant’s most prominent characteristics are its jagged warty texture and bitter taste.

Botanically, it belongs to the Cucurbitaceae family and is a close relative to the cantaloupes, cucumber, and squash. Apart from its scientific name, it also has many other names in different languages such as Ampalaya in Filipino, Cerasee in Jamaican, and Karila.

However, most people simply recognize it as bitter melon or bitter gourd due to its awful taste. Bitter melons or bitter gourds vary in shapes and sizes. Despite its sharp and acrid taste, bitter melon is present in many Asian dishes including Chinese, Japanese, and South Indian cuisines.

Moreover, bitter melon is just as prevalent in herbal medicine as in Asian culinary applications. Its use in traditional medicine dates back 600 years. At present, pharmacological research and clinical trials have found that the fruit confers several health benefits, particularly hypoglycemic effects.


Nutrition Info of Bitter Melon or Bitter Gourd (per 100g)

Calories: 17
Carbohydrates: 3.7 g
Protein: 1 g (2% RDA)
Dietary fiber: 2.8 g (7% RDA)
Folates: 72 µg (18% RDA)
Niacin: 0.400 mg (2.5% RDA)
Pantothenic acid: 0.212 mg (4% RDA)
Pyridoxine: 0.043 mg (3% RDA)
Riboflavin: 0.040 mg (3% RDA)
Thiamin: 0.040 mg (3.5% RDA)
Vitamin A: 471 IU (16% RDA)
Vitamin C: 84 mg (140% RDA)
Potassium: 296 mg (6% RDA)
Calcium: 19 mg (2% RDA)
Copper: 0.034 mg (4% RDA)
Iron: 0.43 mg (5% RDA)
Magnesium: 17 mg (4% RDA)
Manganese: 0.089 mg (4% RDA)
Zinc: 0.80 mg (7% RDA)
β-carotene: 190 µg
Lutein + zeaxanthin: 170 µg


11 Amazing Benefits of Bitter Melon or Bitter Gourd

1. Diabetes Treatment And Blood Sugar Management

Momordica charantia is used primarily as an alternative therapy for lowering blood sugar levels in patients with type 2 diabetes mellitus. It is considered the most potent and popular fruit for managing diabetes through alternative medicine. In fact, drinking bitter melon decoctions is a common practice of diabetes management in Asian countries.

Certain elements of the bitter melon, particularly polypeptide-P, have structural compositions akin to animal insulin. The overall phytochemical composition of the bitter melon consists of charantin, steroidal saponins, and alkaloids. Charantin specifically augments glycogen synthesis within liver and muscle cells. Together, these compounds greatly contribute to the fruit’s hypoglycemic effects.

Regular consumption of bitter melon, be it as fruit, juice, or dried powder, can provide additive effects when taken with conventional hypoglycemic medication.

2. Immunity Enhancer

Adding bitter melon fruit or juice to your diet helps you recover from common illnesses much quicker and decreases your susceptibility to infections. Bitter melon is abundant in antioxidants that constitute an impregnable line of defense against viruses. The fruit is rich in Vitamin C, a powerful antioxidant. A hundred-gram serving of bitter melon provides over 80 mg of vitamin C.

Antioxidants attack free radicals within the body and eliminate other harmful compounds that may cause a number of ailments.

3. Hemorrhoid Relief

Bitter melon contains anti-inflammatory properties beneficial to individuals who have a condition called piles, or most commonly known as hemorrhoids. Simply make a salve using the plant’s roots and apply it topically to the venous swelling to alleviate pain and to stop bleeding. You can also treat sores and other skin conditions using this salve.

4. Sharper Vision

Bitter melon notably contains eye-health-improving carotenoids such as α-carotene, β-carotene, lutein, and zeaxanthin. Together, these compounds enhance eyesight and night vision as well as decelerate macular degeneration. They play a crucial role in fighting the effects of aging, eliminating oxygen-derived free radicals and reactive oxygen species that may lead to numerous complications.

5. Asthma Relief

Bitter melon can help reduce symptoms brought on by certain respiratory conditions such as asthma, bronchitis, and hay fever. Bitter melon has anti-histaminic, anti-inflammatory, and anti-viral properties, which makes it an ideal supplementary food in maintaining good respiratory health. It also helps promote sound sleep.


6. Bitter Melon Consumption Can Help To Address Skin Conditions

Ayurvedic and traditional Chinese medicine has been using bitter melon as treatment for skin conditions for centuries.

The antifungal and antibacterial compounds present in the bitter melon fight off numerous skin infections including ringworm, scabies, and even the auto-immune condition psoriasis. Bitter melon stops guanylate cyclase activity that is responsible for worsening psoriasis.

Apply extracted juice or salve to the affected areas to reduce swelling and irritation.

7. Inhibits Cancer Cell Proliferation

Free radicals seek out and destroy healthy cells, which accelerates aging and leads to numerous complications, including cancer. Bitter melon is abundant in antioxidants that combat free-radical effects and create a strong defense against common diseases. Alongside its abundance of antioxidants, it has anti-tumor and anti-carcinogenic attributes. Recent clinical trials and pharmacologic studies show a link between eating bitter melon and the reduction of tumors in individuals with breast, cervical, and prostate cancer. A significant body of research has examined its role in cancer prevention, and it shows promise as an alternative to potent chemotherapy agents.

8. Help Treat HIV and Herpes

A laboratory test, published in the Journal of Naturopathic Medicine, suggests that the phytochemical composition of bitter melon inhibits the activity of the human immunodeficiency virus, and that bitter melon may provide additive effects in combination with AIDS treatment.

Likewise, early studies suggest that bitter melon, with its antiviral properties, may help treat patients with herpes simplex virus-1 (HSV-1) and prevent the spread of herpetic plaques to others.

9. Reduce Cholesterol Levels

Bitter melon is also widely consumed to help lower bad cholesterol levels, which in turn prevents atherosclerotic plaque buildup in arterial walls. Decongested arteries reduce the risk of heart attack, heart disease, and stroke.

10. Promotes Bone Health and Fast Wound Healing

Bitter melon is also rich in vitamin K, an essential nutrient that plays a key role in regulating normal blood clotting. Vitamin K also assists in calcium distribution throughout the body, thus increasing bone density and reducing the risk of bone fracture. Individuals with osteoporosis should consider eating foods rich in vitamin K, such as bitter melon.

Conversely, a lack of this vitamin may cause bone fractures, easy bruising, defective blood clotting, and excessive menstrual bleeding, among other problems.

11. Promotes Good Digestion and Weight Loss

Bitter melon carries only 17 calories per hundred-gram serving. Though low in calories, it is rich in dietary fiber, vitamins, and minerals. Dietary fiber aids proper digestion and the smooth peristaltic movement of food and waste through the digestive system, relieving indigestion and preventing constipation.

Likewise, its significant levels of charantin help increase your glucose uptake and glycogen synthesis, which in turn help you lose excess weight by decreasing storage in fat cells.

Conclusion

The bitter melon may have an unattractive appearance and a taste that most people would despise, but its bitterness also comes with a wealth of health benefits. It has antibiotic, anti-allergenic, anti-inflammatory, anti-fungal, anti-viral, and anti-parasitic properties. The bitter melon’s most notable health benefit is its ability in managing type 2 diabetes. Bitter melon is a staple among Asian cuisines and traditional medicine.

Health Check: how much sugar is it OK to eat?


Consuming too much energy – whether from fat or carbohydrates, including sugar – will make you gain weight. If left unchecked, this excess weight increases your risk of lifestyle-related diseases such as diabetes, heart disease and some cancers.

In recognition of this, the World Health Organisation (WHO) recommends adults and children limit their intake of “free sugars” to less than 10% of their total energy intake. Below 5% is even better and carries additional health benefits.

Free sugars refer to monosaccharides (such as glucose) and disaccharides (sucrose or table sugar) added to foods and drinks by the manufacturer, cook or consumer. It also refers to sugars naturally present in honey, syrups, fruit juices and fruit juice concentrates.

Free sugars are different from sugars found in whole fresh fruits and vegetables. There is no scientific evidence that consuming these sugars leads to health problems. So the guidelines don’t apply to fresh fruit and vegetables.

If you’re an average-sized adult eating and drinking enough to maintain a healthy body weight (roughly 8,700 kilojoules per day), 10% of your total energy intake from free sugar roughly translates to no more than 54 grams, or around 12 teaspoons, per day.

But more than half of Australians (52%) usually exceed the WHO recommendations.

Most sugar we eat (around 75%) comes from processed and pre-packaged foods and drinks. The rest we add to tea, coffee and cereal, and other foods we cook.

Sugary drinks account for the largest proportion of Australians’ free sugar intake. A single can or 600ml bottle of soft drink can easily exceed the WHO recommendation, providing around 40-70g sugar. One teaspoon equates to 4.5g white sugar, so soft drinks range from 8.5 to 15.5 teaspoons.
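
As a back-of-the-envelope check on those figures, the conversions can be done in a few lines. The 4.5 g-per-teaspoon figure is from the article; the assumption that sugar supplies roughly 16 kJ per gram is mine, and small differences from the quoted teaspoon range come from rounding.

```python
DAILY_KJ = 8700        # reference adult energy intake (kJ/day), as above
KJ_PER_G_SUGAR = 16    # approximate energy density of sugar (assumption)
G_PER_TSP = 4.5        # grams of white sugar per teaspoon, as above

# the WHO 10% free-sugar limit, in grams and teaspoons
limit_g = 0.10 * DAILY_KJ / KJ_PER_G_SUGAR
print(round(limit_g), "g =", round(limit_g / G_PER_TSP), "teaspoons")  # 54 g = 12 teaspoons

# teaspoons of sugar in soft drinks at the low and high end of the range
for grams in (40, 70):
    print(grams, "g is about", round(grams / G_PER_TSP, 1), "teaspoons")
```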

More insidious sources of sugar are drinks marketed as “healthier” options, such as iced teas, coconut water, juices and smoothies. Some medium-sized smoothies have up to 14 teaspoons of sugar (63.5g) in a 475ml drink.

Flavoured milks are also high in free sugars (11 teaspoons in a 500ml carton) but can be a good source of calcium.

Other foods high in sugar are breakfast cereals. While some sugar is derived from dried fruit, many popular granola mixes add various forms of sugar. Sugar content for one cup of cereal ranges from 12.5g for creamy honey quick oats to 20.5g for granola. A cup of some types of cereal can contain 30% to 50% of your daily free sugar allowance.

A surprise for many is the added sugars in savoury foods including sauces and condiments. Tomato and barbecue sauce, salad dressing and sweet’n’sour stirfry sauces contain one to two teaspoons of sugar in each tablespoon (20ml).

Popular “health foods” and sugar-free recipes can be particularly misleading, as they can contain as much sugar as their sweet alternatives. “Sugar-free” usually means “sucrose-free” (sucrose being what we know as white sugar) and doesn’t exclude other sugar derivatives such as rice malt syrup, agave or maple syrup, which are typical of popular sugar-free recipes. These are still forms of sugar and contribute to energy intake and unhealthy weight gain when consumed in excess.

We know treats such as chocolate, pastries and ice-cream do contain sugar, but just how much might surprise you. A chocolate-coated ice-cream will contribute five teaspoons of sugar, or almost half the daily limit.

Sugar added to foods and drink can have different names depending on where it comes from. When reading labels, alternative names for sugar include:

  • sucrose
  • glucose
  • corn syrup
  • maltose
  • dextrose
  • raw sugar
  • cane sugar
  • malt extract
  • fruit juice concentrate
  • molasses.

If any of these appear among the first three ingredients, sugar is a main ingredient of the product.

Note that products with “no added sugar” nutrition claims may still contain high levels of natural sugars, which also count as free sugars. A good example is fruit juice: the sugar content of 200ml of sweetened orange juice (21g) is only 7g higher than that of unsweetened juice (14g).

So how can you cut down on your added sugars?

First, eat fewer foods with free sugars. Reduce your intake of sweets such as chocolate and lollies, cakes, biscuits, sugar-sweetened soft drinks, cordials, fruit drinks, vitamin waters and sports drinks.

Second, make some swaps. Swap your cereal for a lower-sugar variety and limit the amount of sugar you add. Drink plain tap water and swap brands for sugar-free or those with lower added sugar. Swap fruit juices for whole fruits, which also give you fibre and other health-promoting nutrients.

Finally, read the labels on packaged food and drink. If the product has more than 15g of sugar per 100g, check to see if sugar is one of the main ingredients. If it is, use the nutrient information panel to compare and choose products containing less sugar.
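
The two-step label check just described can be sketched as a small helper. The function name and example products are hypothetical; the sugar names come from the list above, and the substring match is a simplification of real label reading.

```python
SUGAR_NAMES = {
    "sugar", "sucrose", "glucose", "corn syrup", "maltose", "dextrose",
    "raw sugar", "cane sugar", "malt extract",
    "fruit juice concentrate", "molasses",
}

def worth_a_closer_look(sugar_per_100g, ingredients):
    """Flag a product with more than 15 g of sugar per 100 g whose
    first three listed ingredients include a form of sugar."""
    if sugar_per_100g <= 15:
        return False
    first_three = [item.lower() for item in ingredients[:3]]
    return any(name in item for item in first_three for name in SUGAR_NAMES)

# a granola with 20.5 g sugar per 100 g and sugar listed second
print(worth_a_closer_look(20.5, ["Oats", "Sugar", "Almonds"]))      # True
print(worth_a_closer_look(20.0, ["Oats", "Wheat flour", "Salt"]))   # False
```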

You don’t need to quit sugar to improve your health


Not long ago, fat was the evil dietary villain. Before that it was salt. Now the sugar-free diet has exploded onto the health and wellness scene – and seems to have topped many people’s list of New Year’s resolutions.

Sugar-free diets encourage people to avoid table sugar (sucrose), sweeteners such as honey and maple syrup, refined flours, condiments, soft drinks, sweets and some fruits such as bananas. Some also recommend eliminating or restricting dairy products.

The diet’s advocates rightly note that excessive sugar consumption may lead to obesity and therefore increase the risk of type 2 diabetes, heart disease and some cancers.

And it’s true that Australians are eating too much of the sweet stuff, with 35% of an adult’s total daily calories now coming from “discretionary foods”, which include lollies, chocolates and soft drinks.


 

But you don’t need to quit sugar to lift your game on healthy eating. Quitting sugar is unlikely to improve your health any more than cutting down on ultra-processed foods, eating more vegetables, cooking food from scratch and limiting how much extra sugar you eat and drink.

At best, the sugar-free diet is confusing and imposes an arbitrary set of rules that aren’t based on scientific evidence. At worst, such a restrictive diet can create food fear or an unhealthy relationship with food.

Diet mentality

The sugar-free diet is restrictive, with lists of “allowed” foods (such as whole grains, blueberries and grapefruits) and “not allowed” foods (such as white bread, bananas and raisins). This inadvertently promotes a diet mentality and causes followers to worry about accidentally eating something that’s not allowed.

People who worry about food are more likely to diet. This may be because they are worried specifically about their weight, or about the impact certain nutrients have on their health.

Research shows dieting is not effective over the long term and can lead to greater weight gain over time. The brain interprets dieting and restriction as a famine, which causes the storage of fat for future shortages.

Dieting is stressful. In response to this, our body releases stress hormones such as cortisol, which may cause the body to store fat, particularly in the abdominal area.

Restrictive diets can cause food anxiety.

Worrying about food can lead to stress, anxiety and depression, and is one of the defining features of the condition known as orthorexia.

Orthorexia is the overwhelming preoccupation with eating healthily. People with orthorexia spend a lot of time thinking and worrying about food and eliminating foods that are deemed impure or unhealthy. Some experts suggest this behaviour is a precursor to, or a form of, an eating disorder.



Estimates suggest anywhere between 7% and 58% of the population may have the condition. There are no clear diagnostic criteria, which makes it difficult to measure its prevalence.

But we know 15% of women will experience an eating disorder at some stage in their life. So we need to ensure nutrition advice, however well-intentioned, doesn’t promote or encourage disordered eating.

Cutting out the good stuff

Some sugar-free diets advise people to cut out or restrict healthy foods and food groups such as fruit and dairy, without evidence to support their exclusion. This perpetuates the food fear/dietary restriction cycle and may contribute to nutrient deficiencies.

These diets also recommend people avoid fruit for a period of time, and then re-introduce a limited list of expensive “healthy” fruits (such as berries) while avoiding the cheaper “unhealthy” fruits such as bananas.

Bananas are usually on the list of foods to avoid or limit.

Whole fruit is a wonderful source of fibre, essential vitamins and minerals, as well as antioxidants. Two serves of fruit per day can reduce the risk of developing some cancers, type 2 diabetes and heart disease. Given only half of Australians eat the recommended two serves of fruit per day, the advice to restrict fruit further could result in people missing out on these benefits.

Many sugar-free followers also avoid plain dairy products such as milk, yoghurt and cheese, due to the assumption these contain sugars.

The sugar in plain dairy products is the natural lactose (a carbohydrate), which is nothing to fear. Unnecessarily avoiding dairy may increase the risk of osteoporosis if not replaced with adequate levels of calcium from other sources.

Sugar replacements

Strangely, many of the sugar-free recipes use expensive sugar alternatives – such as rice malt syrup (due to its low fructose content), maple syrup (which is sometimes allowed and sometimes not) and dates – to replace sugar.

However, these are still sugars and contain the same number of calories per gram as any other sugar. These alternatives offer no additional nutritional benefit, with two exceptions: rice malt syrup, which is useful only for people with fructose malabsorption, and dates, which contain fibre.


People often eat more of the food containing these alternatives under the guise of it being sugar-free, which could lead to unintentional weight gain. One study found people ate about 35% more of a snack when it was perceived as healthy than when it was seen as unhealthy.

What to do instead

Eat plenty of plants, enjoy whole grains, beans and legumes. Fruit is your friend – not your enemy.

Most people could probably eat a little less sugar, a little less often, but you don’t have to quit it for good to be healthy.

Savour every mouthful of that chocolate cake or “sometimes food”. Turn off technology and eat the cake mindfully, so that your brain can register that you have eaten it. That way you can get pleasure and satisfaction from it, and you won’t be craving it again an hour later.

No matter how we choose to eat, remember that health is not simply about the number on the scale, the size of our waist, or the foods we avoid. It’s also about our psychological health and our relationship with food, which is just as important as our physical health.

The role of diet in the prevention and treatment of Inflammatory Bowel Diseases.


Abstract

Inflammatory bowel diseases (IBD) – Crohn’s disease (CD) and ulcerative colitis (UC) – are chronic conditions characterised by relapsing inflammation of the gastrointestinal tract. They represent an increasing public health concern and an aetiological enigma due to unknown causal factors. The current knowledge on the pathogenesis of IBD is that genetically susceptible individuals develop intolerance to a dysregulated gut microflora (dysbiosis) and chronic inflammation develops as a result of environmental triggers. Among the environmental factors associated with IBD, diet plays an important role in modulating the gut microbiome, and, consequently, it could have a therapeutic impact on the disease course.

An overabundance of calories and some macronutrients typical of the Western dietary pattern increases gut inflammation, whereas several micronutrients characteristic of the Mediterranean Diet have the potential to modulate gut inflammation, according to recent evidence. Immunonutrition has emerged as a new concept, putting forward the role of vitamins A, C, D and E, folic acid and beta carotene, as well as trace elements such as zinc, selenium, manganese and iron. However, when assessed in clinical trials, specific micronutrients showed a limited benefit. Further research is required to evaluate the role of individual food compounds and complex nutritional interventions with the potential to decrease inflammation as a means of prevention and management of IBD.

The current dietary recommendations for disease prevention and management are scarce and not evidence-based. This review summarises the current knowledge on the complex interaction between diet, microbiome and immune modulation in IBD, with a particular focus on the role of the Mediterranean Diet as a tool for prevention and treatment of the disease.

People who drink moderate amounts of coffee each day have a lower risk of death from disease



Many people drink coffee for an energy boost, but did you know it may also prolong your life? A study published in the journal Circulation found that moderate amounts of coffee (fewer than five cups a day) can lower your risk of death from many diseases, such as cardiovascular disease, Type 2 diabetes and nervous system disorders. It was also associated with a lower risk of death from suicide.

The study’s researchers explained that this effect could be attributed to coffee’s naturally occurring bioactive compounds, which reduce insulin resistance and systemic inflammation and might be responsible for the association between coffee and mortality. (Related: Coffee drinkers have a lower mortality rate and lower risk of various cancers.)

The researchers reached this conclusion after analyzing coffee consumption, assessed every four years using validated food questionnaires, among participants from three large studies: 74,890 women in the Nurses’ Health Study; 93,054 women in the Nurses’ Health Study 2; and 40,557 men in the Health Professionals Follow-up Study. During the follow-up period of up to 30 years, 19,524 women and 12,432 men died from various causes.

They found that people who consumed coffee frequently also tended to smoke cigarettes and drink alcohol. To separate the effects of coffee from those of smoking, they repeated their analysis among non-smokers; in this group, the protective association between coffee and mortality became even more apparent.

With these findings, the researchers suggested that regular coffee intake could be included as part of a healthy, balanced diet. However, pregnant women and children should be mindful of the potentially high caffeine intake from coffee and other drinks.


Because the study was not designed to show a direct cause and effect relationship between coffee consumption and dying from illness, the researchers noted that the findings should be interpreted with caution. Still, this study contributes to the claim that moderate consumption of coffee offers health benefits.

The many benefits of coffee

Many studies have shown that drinking a cup of coffee provides health benefits. Here are some of them:

  • Coffee helps prevent diabetes: A study conducted by University of California, Los Angeles (UCLA) researchers showed that drinking coffee helps prevent Type 2 diabetes by increasing levels of the protein sex hormone-binding globulin (SHBG), which regulates hormones that influence the development of Type 2 diabetes. Researchers from Harvard School of Public Health (HSPH) also found that increased coffee intake may lower Type 2 diabetes risk.
  • Coffee protects against Parkinson’s disease: Studies have shown that consuming more coffee and caffeine may significantly lower the risk of Parkinson’s disease. It has also been reported that the caffeine content of coffee may help control movement in people with Parkinson’s disease.
  • Coffee keeps the liver healthy: Coffee has some protective effects on the liver. Studies have shown that regular intake of coffee can protect against liver diseases, such as primary sclerosing cholangitis (PSC) and cirrhosis of the liver, especially alcoholic cirrhosis. Drinking decaffeinated coffee also decreases liver enzyme levels. Research has also shown that coffee may help ward off cancer. A study by Italian researchers revealed that coffee intake cuts the risk of liver cancer by up to 40 percent. Moreover, some of the results indicate that drinking three cups of coffee a day may reduce liver cancer risk by more than 50 percent.
  • Coffee prevents heart disease: A study conducted by researchers at Beth Israel Deaconess Medical Center (BIDMC) and HSPH showed that moderate coffee intake, or two European cups a day, may protect against heart failure. Drinking four European cups a day can lower heart failure risk by 11 percent.