Sometimes, a sound bite is taken too far. That was the case when a fellow dietitian was quoted in an article as saying that a slice of pizza would be a better choice for breakfast than most cereals (the article went viral, of course).
My hunch is that she was illustrating a point about sugary cereals—and I highly doubt she’d recommend a greasy, pepperoni-covered slice over a bowl of high-fiber shredded wheat.
As with everything, you need to read past the headlines. It’s true that some cold cereals pack a lot of sugar and are made with fiber-poor refined grains, giving you a quick, sweet lift—and leaving you hungry an hour later. On the other hand, pizza does have some protein (and fat) to keep you satisfied.
But pizza also covers some pretty wide territory. A whole-wheat crust topped with veggies will deliver more fiber and vitamins (and far less sodium) than, say, a triple-meat on white.
Same goes for cereal. There are hyper-sweetened varieties that contain very little filling fiber or protein. But there are also low- and no-sugar whole grain cereals that, when topped with milk and some berries or banana slices, make a meal that’s got up to half the fiber you need in the day, valuable vitamins and minerals like iron and calcium, and even a decent dose of protein. In fact, a serving of shredded wheat with milk has about 12 grams of protein—compared to 10 grams in a slice of thin-crust pepperoni pizza.
In other words, cereal’s bad rap isn’t necessarily deserved. Ditto for pizza’s health halo here. When a headline flies in the face of common sense (like this one touting ice cream as a brain-boosting breakfast), it probably doesn’t hold up.
If you’re worried about the sugar in cereal, use my label-reading rule of thumb: I look for roughly 6 grams or less of added sugar per serving. You can also sprinkle a low- or no-sugar cereal (like plain o’s) with a teaspoon of sugar. Still sweet, but far less sugar than most varieties. Or swap out cereal for plain oatmeal, adding your own sweetener, fruit, and nuts.
If you’re looking for something more savory, eggs have been shown to be one of the most filling breakfasts around. You can also put a savory spin on oatmeal, topping it with avocado, veggies, and cheese.
And guess what? There’s also nothing wrong with an occasional slice of cold pizza for breakfast.
“There has been a long-standing need for additional effective treatments for treatment-resistant depression, a serious and life-threatening condition,” said Dr. Tiffany Farchione, acting director of the Division of Psychiatry Products in the FDA’s Center for Drug Evaluation and Research, in a press release about the decision.
“This is potentially a game changer for millions of people,” said Dr. Dennis Charney, dean of the Icahn School of Medicine at Mount Sinai in New York. “It offers a lot of hope.”
Esketamine works through a mechanism different from those of drugs like Prozac, Charney said. And that is probably why studies show it can often help people with major depressive disorder who haven’t been helped by other drugs.
“Many of them are suicidal,” Charney said. “So it’s essentially a deadly disease when you haven’t responded to available treatments and you’ve been suffering for years if not decades.”
Charney was part of the team that first showed two decades ago that ketamine could treat depression. He also is named as co-inventor on patents filed by the Icahn School of Medicine relating to the treatment for treatment-resistant depression, suicidal ideation and other disorders.
Esketamine, developed by Johnson & Johnson, will be administered as a nasal spray and be used in conjunction with an oral antidepressant. It will be marketed under the brand name Spravato. The FDA has approved it for patients who have failed to respond adequately to at least two other drugs.
That means about 5 million of the 16 million people in the U.S. with major depression might benefit from esketamine, said Courtney Billington, president of Janssen Neuroscience, a unit of Johnson & Johnson.
But esketamine presents some challenges because of its similarities to ketamine. In high doses, both drugs can cause sedation and out-of-body experiences. And ketamine, often called Special K in its illicit form, has become a popular party drug.
So Johnson & Johnson is taking steps to make sure esketamine will be used only as intended, Billington said.
“Spravato will not be dispensed directly to a patient to take at home,” he said. “It will only be available in approved and certified treatment centers.”
Patients will inhale the drug under supervision at these centers once or twice a week. And they will receive a dose that is unlikely to produce side effects such as hallucinations.
“The amount of active ingredient that’s in this product, it’s at a very, very low dose,” Billington said.
Even so, the FDA, according to its press release, is requiring a warning label that says patients “are at risk for sedation and difficulty with attention, judgment and thinking (dissociation), abuse and misuse, and suicidal thoughts and behaviors after administration of the drug.”
Esketamine’s approval comes as more and more doctors have begun administering a generic version of ketamine for depression. Generic ketamine is approved as an anesthetic, not as an antidepressant. Even so, doctors can legally prescribe it for off-label medical uses.
And as a growing number of studies have shown ketamine’s effectiveness against depression, ketamine clinics have sprung up around the United States. These clinics often administer the drug in an intravenous infusion that can cost more than $500 per treatment.
Many doctors who have become comfortable offering ketamine for depression probably won’t switch to esketamine, said Dr. Demitri Papolos, director of research for the Juvenile Bipolar Research Foundation and a clinical associate professor at Albert Einstein College of Medicine.
For the past 10 years, Papolos has been prescribing an intranasal form of ketamine for children and adolescents who have a disorder that includes symptoms of depression.
“I’m very pleased that finally the FDA has approved a form of ketamine for treatment-resistant mood disorders,” Papolos said. He said the approval legitimizes the approach he and other doctors have been taking.
But he hopes that doctors who are currently using ketamine continue to do so. “It’ll be a lot less expensive and a lot easier for their patients [than esketamine],” he said.
Esketamine “may not be as effective as a generic that any psychiatrist or physician can prescribe without restrictions,” Papolos said.
Johnson & Johnson said the wholesale cost of each treatment with esketamine will range from $590 to $885, depending on the dose. That means twice-weekly treatments during the first month will cost centers that offer the drug at least $4,720 to $6,785. Subsequent weekly treatments will cost about half as much.
The drugmaker says those figures don’t include administration and observation costs.
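As a rough back-of-the-envelope check on those figures, the per-session prices come from the article, while the eight-session count is an assumption based on twice-weekly dosing over four weeks:

```python
# Rough first-month cost sketch for esketamine treatment.
# Per-session wholesale prices come from the article; the session
# count (twice weekly for four weeks) is an assumption.
low_per_session = 590    # dollars, lowest dose
high_per_session = 885   # dollars, highest dose
sessions_month_one = 8   # assumed: 2 sessions/week x 4 weeks

low_total = low_per_session * sessions_month_one    # 4,720 -- matches the article's floor
high_total = high_per_session * sessions_month_one  # 7,080 -- above the article's $6,785
# ceiling, which suggests not every session is billed at the maximum dose.
print(low_total, high_total)
```

The naive upper bound comes out higher than the article’s stated ceiling, consistent with dosing (and thus price) varying from session to session.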
There is much more than meets the eye with the banana. A household favorite, a loss leader at the grocery store, a metaphor for psychiatric problems, a mainstay of comic slapstick, the banana has woven itself deeply into human affairs, on both gut and mental levels. And this relationship is at least 10,000 years old, as far as conscious human cultivation of the species goes.
But many do not realize that the banana is more than just an exceptionally starch-rich fruit: it has a complex biochemistry, with unique pharmacologically active properties that scientists have characterized.
Bananas actually contain the catecholamines dopamine[i] and norepinephrine,[ii] the very same adrenal hormones released in the human body when it undergoes the typical “fight-or-flight” response. It is believed that the banana plant uses the biosynthetic pathway for catecholamines when under the stress of attack to fight off infectious pathogens such as in crown rot disease.[iii] Some varieties excrete a form of serotonin in their sap,[iv] and there is even mention in the biomedical literature of the discovery of the NSAID drug naproxen (trade name Aleve) within the banana cultivar Musa acuminata.
Sound crazy? Well, that’s to be expected from a fruit we commonly associate with a state of unbridled madness.
But the banana has a secret second life. It has been observed slyly practicing medicine without a license, and indeed, seems readily equipped with the following nutritional “super powers”…
Green Banana Is Anti-Diarrheal
Before a banana is ripened, while it is in its green state, it contains starches that resist digestion and that, in combination with pectin, have been shown to significantly reduce intestinal permeability and fluid loss in those suffering bouts of diarrhea.[v][vi] Even when used without pectin, green banana has been found to hasten recovery from acute and prolonged childhood diarrhea when managed at home in rural Bangladesh.[vii]
Banana Has Anti-Ulcer Activity
Banana powder has been shown to prevent ulcer formation induced by a variety of drugs, including aspirin, indomethacin, phenylbutazone, prednisolone, cysteamine, and histamine. Researchers have found that banana powder treatment not only strengthens mucosal resistance against ulcerogens but also promotes healing by inducing cellular proliferation.[viii] One of the anti-ulcer compounds identified within unripe banana is the flavonoid leucocyanidin, which is particularly effective against aspirin-induced erosion.[ix]
Banana Peel Suppresses Prostate Gland Growth
Banana peel has been found to suppress testosterone-induced prostate gland enlargement.[x]
Banana Stem Protects Against Kidney Stones
A water extract of banana stem has been found to suppress the formation of oxalate-associated kidney stones in the animal model, leading researchers to conclude that it “may be a useful agent in the treatment of patients with hyperoxaluric urolithiasis.”[xi]
Banana Consumption Protects the Skin Against UV-Light Damage
UV-B light induced skin damage may be prevented or reduced through the consumption of bananas, with a protective effect against loss of skin elasticity.[xii]
Banana Has Anti-Diabetic Properties
Banana flower extract has been studied in a type 1 diabetic model,[xiii] and has been found to have both antioxidant and blood sugar lowering effects. Banana root extracts have been discovered to contain blood sugar lowering properties comparable in efficacy to the drug glibenclamide (trade name Glyburide).[xiv] Also, unripe bananas contain starches that resist hydrolysis and are therefore beneficial to diabetics.[xv]
Banana Contains a Variety of Anti-Infective Compounds
Banana contains compounds with demonstrable anti-MRSA activity,[xvi] anti-HIV replicative activity,[xvii][xviii] and, following metabolic transformation by fungi, leishmanicidal activity.[xix] The leaves of the plant are used in many centers in India during the care of patients with toxic epidermal necrolysis (TEN) and other extensive blistering disorders, which can result in deadly sepsis in the absence of treatment.[xx]
Whatever you do, don’t slip up and buy non-organic bananas. Like other foods that are grown in massive monocultures, without crop rotation, they are a pesticide-intensive crop. And this concern extends beyond simply what agrochemicals you are exposing your body to. In conventional farming, the planet gets carpet-bombed as well with these nasty toxicants, and since we all live on the same Earth, eventually those pesticides make it back up the food chain to you, whether you choose to eat organic or not.
A recently published study in the journal Emerging Infectious Diseases says wild-caught Alaskan salmon may harbor a species of tapeworm previously known to infect only Asian fish. Researchers warn that based on their findings, any salmon caught along the North American Pacific coast may have the parasite. The concern is that if you eat the fish undercooked or raw, you could become a host to this gruesome organism.
CNN reports that the tapeworm newly discovered in Alaskan salmon is named Diphyllobothrium nihonkaiense, also known as the Japanese broad tapeworm. This species accounts for the most infections in humans, in contradiction to the previous belief that the dubious distinction went to the most common fish tapeworm, Diphyllobothrium latum. A team of scientists found four species of Pacific salmon known to carry the Japanese tapeworm: chum salmon, masu salmon, pink salmon and sockeye salmon. These fish are caught and then shipped worldwide, so the infection may occur in humans anywhere on the planet.
Tapeworms, including the Japanese variety, can grow to 30 feet inside a human digestive tract. Infestation often goes undetected because symptoms tend to be mild and are frequently attributed to other conditions by medical practitioners. When fish are commercially caught worldwide, they are placed on ice for the journey to port. But this does not freeze the fish; it only refrigerates them. To kill any parasitic worms that may be present, the fish need to be frozen. Salmon sushi at a restaurant or store should be considered unsafe unless you know it has been frozen or you freeze it yourself. Alternatively, the fish can be sufficiently cooked for assurance of safety against parasitic infection.
Jayde Ferguson, a scientist at the Alaska Department of Fish and Game, believes, “The tapeworm itself is probably not new — it’s just that more skilled parasitologists started looking for it. Identifying these parasites is challenging. This was simply a more detailed evaluation of the Diphyllobothrium that has occurred here for over a millennium.”
Professor of preventive medicine at Vanderbilt University School of Medicine Dr. William Schaffner stated, “Because we do things that we haven’t done before, now, we have these fresh caught fish that can be transported anywhere and eaten raw. … I am sure we will be on the lookout for this kind of tapeworm going forward.”
Parasitic worms – an under-recognized epidemic
Naturopath Marijah McCain is a widely experienced healer who apprenticed with a parasitologist and knows firsthand about these disgusting critters and how to rid the body of the menace. Though rare, various helminths (worms) such as the tapeworm can find a home in your brain with grave consequences. Quoting Marijah:
“Myself and a handful of others, like Dr. Hulda Clark, have spent years trying to bring the parasite issue to the forefront of preventative & curative medicine. The good news is the medical field is slowly training their doctors once again on the health risks of parasites… Most Americans carry parasites and this is currently a serious health issue. Parasites are not meant to kill you, they just sit inside you and steal your nutrition. But, when a person gets weakened from another ailment the parasites can take hold and become life threatening. This is why EVERYONE with any health disorder should do an anti-parasite program at least once a year. Twice a year if you live with animals. People interested in maintaining good health should also do routine parasite cleansing…”
Marijah says that symptoms caused by parasites include gas, diarrhea, chronic constipation, bloating, fatigue, skin rashes, mood swings, insomnia, nail biting, dry skin, weight gain, bad breath, brittle hair, hair loss, and muscle cramping. Because parasites can invade any tissue in the body, symptoms can occur anywhere. Dr. McCain states that parasites are a contributing factor in conditions such as Crohn’s disease, ulcerative colitis, diabetes, some heart disease, arthritis, asthma, as well as others. She points out that in the US, the medical system is in denial about the health risks of parasitic infections, and doctors make a huge blunder when they fail to recognize the role that parasites play in disease. “Parasites are the cause of hundreds of misdiagnosed ailments,” she claims, and recommends natural anti-parasite formulas in lieu of conventional toxic allopathic medications.
A balanced diet promotes health and wellness in everyone, including people with cancer. By properly balancing your plate and adopting (or dropping) certain eating habits, you’ll be ensuring that your body is getting the nutrition that it needs to function well. A properly balanced plate of food will consist of 50 percent fruits and vegetables, 25 percent lean proteins, and 25 percent whole grains.
Fruits and vegetables
Fruits and vegetables are an essential source of all the necessary vitamins, minerals, phytochemicals, and antioxidants that your body needs. The American Cancer Society recommends that you eat at least two and a half cups of fruits and vegetables every day to help maintain a healthy diet and reduce your cancer risk.
Unsure how to integrate that into your diet? It’s easy: embrace the rainbow. Fruits and vegetables with the most color—dark green, yellow, orange, red, blue/purple—typically have the most nutrients.
Lean proteins
Eating the recommended amount of protein every day will help boost your energy and immune function. However, people tend to consume too much protein per day—oftentimes from meats that are high in saturated fats such as beef, processed meats, and lamb.
It’s recommended that you stick to protein sources like skinless chicken, fish, eggs, and plant-based proteins like beans. If you eat red meat, you don’t have to swear it off forever: Just limit your intake to less than 18 oz per week, and when you do consume red meat, make sure it’s lean (less than 15 percent fat, usually labeled “loin” or “sirloin”). You can also try using healthier cooking methods like baking or grilling.
Ultimately, it’s all about portion control. For meats, that equates to three ounces; envision the size of a deck of cards. Other helpful tips on portion sizing can be found here.
Whole grains
If you haven’t jumped aboard the whole grain train yet, it’s time you did. These fiber-rich foods are so healthy that the American Heart Association recommends eating three or more servings of whole grains per day. But that doesn’t mean you need to eat endless amounts of brown rice and whole wheat bread. Here are some other options:
Oats: Accessible and affordable, oats let you integrate whole grains into your breakfast—steel-cut, old-fashioned, or even instant.
Quinoa: This South American grain is gaining popularity in the U.S. and has a crunchy texture.
Corn: Even corn is a whole grain, and there are endless forms to choose from (whole kernels, popcorn, cornmeal, corn tortillas).
The oh-so-Instagrammable food movement has been thoroughly debunked – but it shows no signs of going away. The real question is why we were so desperate to believe it.
In the spring of 2014, Jordan Younger noticed that her hair was falling out in clumps. “Not cool” was her reaction. At the time, Younger, 23, believed herself to be eating the healthiest of all possible diets. She was a “gluten-free, sugar-free, oil-free, grain-free, legume-free, plant-based raw vegan”. As The Blonde Vegan, Younger was a “wellness” blogger in New York City, one of thousands on Instagram (where she had 70,000 followers) rallying under the hashtag #eatclean. Although she had no qualifications as a nutritionist, Younger had sold more than 40,000 copies of her own $25, five-day “cleanse” programme – a formula for an all-raw, plant-based diet majoring on green juice.
But the “clean” diet that Younger was selling as the route to health was making its creator sick. Far from being super-healthy, she was suffering from a serious eating disorder: orthorexia, an obsession with consuming only foods that are pure and perfect. Younger’s raw vegan diet had caused her periods to stop and given her skin an orange tinge from all the sweet potato and carrots she consumed (the only carbohydrates she permitted herself). Eventually, she sought psychological help, and began to slowly widen the repertoire of foods she would allow herself to eat, starting with fish. She recognised that the problem was not her veganism, per se, but the particularly rigid and restrictive diet regime she had imposed on herself.
As Younger slowly recovered from her eating disorder, she faced a new dilemma. “What would people think”, she agonised, “if they knew the Blonde Vegan was eating fish?” She levelled with her followers in a blogpost entitled Why I’m Transitioning Away from Veganism. Within hours of announcing her new diet, Younger was receiving irate messages from vegans demanding money back from the cleanse programmes and T-shirts they had bought from her site (featuring slogans such as “OH KALE YES”).
She lost followers “by the thousands” and received a daily raft of angry messages, including death threats. Some responded to her confession that she was suffering from an eating disorder by accusing her of being a “fat piece of lard” who didn’t have the discipline to be truly “clean”.
For as long as people have eaten food, there have been diets and quack cures. But previously, these existed, like conspiracy theories, on the fringes of food culture. “Clean eating” was different, because it established itself as a challenge to mainstream ways of eating, and its wild popularity over the past five years has enabled it to move far beyond the fringes. Powered by social media, it has been more absolutist in its claims and more popular in its reach than any previous school of modern nutrition advice.
At its simplest, clean eating is about ingesting nothing but “whole” or “unprocessed” foods (whatever is meant by these deeply ambiguous terms). Some versions of clean eating have been vegan, while others espouse various meats (preferably wild) and something mysteriously called “bone broth” (stock, to you and me). At first, clean eating sounded modest and even homespun: rather than counting calories, you would eat as many nutritious home-cooked substances as possible.
But it quickly became clear that “clean eating” was more than a diet; it was a belief system, which propagated the idea that the way most people eat is not simply fattening, but impure. Seemingly out of nowhere, a whole universe of coconut oil, dubious promises and spiralised courgettes has emerged. Back in the distant mists of 2009, James Duigan, owner of The Bodyism gym in London and sometime personal trainer to the model Elle MacPherson, published his first Clean and Lean book. As an early adopter of #eatclean, Duigan notes that he “battled” with his publisher “to include ingredients like kale and quinoa, because no one had ever heard of them”. Now quinoa is in every supermarket and kale has become as normal as lettuce. “I long for the days when clean eating meant not getting too much down your front,” the novelist Susie Boyt joked recently.
Almost as soon as it became ubiquitous, clean eating sparked a backlash. By 2015, Nigella Lawson was speaking for many when she expressed “disgust” at clean eating as a judgmental form of body fascism. “Food is not dirty”, Lawson wrote. Clean eating has been attacked by critics such as the baker and cookbook author Ruby Tandoh (who wrote a much-shared article on the subject in Vice magazine in May 2016) for being an incitement to eating disorders.
Others have pointed out that, as a method of healthy eating, it’s founded on bad science. In June, the American Heart Association suggested that the coconut oil beloved as a panacea by clean eaters actually had “no known offsetting favourable effects”, and that consuming it could result in higher LDL cholesterol. A few weeks later, Anthony Warner – a food consultant with a background in science who blogs as The Angry Chef – published a book-length assault on the science of clean eating, calling it a world of “quinoa bowls” and “nutribollocks” fuelled by the modern information age.
When Dr Giles Yeo, a geneticist at the University of Cambridge, presented an episode of the BBC’s Horizon this year that examined the scientific evidence for different schools of clean eating, he found everything from innocuous recipes to serious malpractice.
He reported on the “alkaline diet” of Dr Robert O Young, who peddled the idea that disease is caused by eating “acidic” foods. After being diagnosed with terminal cancer in her 20s, Naima Houder-Mohammed, an officer in the British army, paid Young more than $77,000 for treatment (including meals of avocado, which Young calls “God’s butter”) at his “pH miracle” ranch in the US in 2012. She died later that year. Separately, Young was jailed in June this year after being convicted of charges including practising medicine without a licence. While he may represent an extreme case, it is clear that many wellness gurus, as Yeo’s programme concluded, tell a “troubling narrative” founded on falsehoods.
As the negative press for clean eating has intensified over the past year, many of the early goddesses of #eatclean have tried to rebrand – declaring they no longer use the word “clean” to describe the recipes that have sold them millions of books. Ella Mills – AKA Deliciously Ella, the food writer and entrepreneur whose coconut-and-oat energy balls sell for £1.79 apiece in British supermarkets – said on Yeo’s Horizon programme that she felt that the word “clean” as applied to eating originally meant nothing but natural, real, unprocessed food. “Now, it means diet, it means fad,” she complained.
But however much the concept of clean eating has been logically refuted and publicly reviled, the thing itself shows few signs of dying. Step into the cookbook section of any book shop and you will see how many recipe writers continue to promise us inner purity and outer beauty. Even if you have never knowingly tried to “eat clean”, it’s impossible to avoid the trend altogether, because it changed the foods available to all of us, and the way they are spoken of.
Avocados now outsell oranges in the UK. Susi Richards, head of product development at Sainsbury’s supermarkets, told me earlier this year that she had been taken aback by the pace at which demand for products fitting the clean-eating lifestyle has grown in the UK. Families who would once have eaten potato waffles are now experimenting with lower carb butternut “squaffles” (slices of butternut squash cut to resemble a waffle). Nutribullets – a brand of compact blenders designed for making supposedly radiance-bestowing juices and smoothies – are now mentioned in some circles as casually as wooden spoons.
Why has clean eating proved so difficult to kill off? Hadley Freeman, in this paper, identified clean eating as part of a post-truth culture, whose adherents are impervious, or even hostile, to facts and experts. But to understand how clean eating took hold with such tenacity, it’s necessary first to consider just what a terrifying thing food has become for millions of people in the modern world. The interesting question is not whether clean eating is nonsense, but why so many intelligent people decided to put their faith in it.
We are not the only generation to have looked in disgust at an unhealthy food environment and wished that we could replace it with nutrients that were perfectly safe to eat. In the 1850s, a British chemist called Arthur Hill Hassall became convinced that the whole food supply of London was riddled with toxins and fakery. What’s more, he was right. Hassall had done a series of investigations for the medical journal the Lancet, and found that much of what was for sale as food and drink was not what it seemed: “coffee” made from burnt sugar and chicory; pickles dyed green with poisonous copper colourings.
Years of exposing the toxic deceptions all around him seems to have driven Hassall to a state of paranoia. He started to see poison everywhere, and decided that the answer was to create a set of totally uncontaminated food products. In 1881, he set up his own firm, The Pure Food Company, which would only use ingredients of unimpeachable quality. Hassall took water that was “softened and purified” and combined it with the finest Smithfield beef to make the purest beef jelly and disgusting-sounding “fibrinous meat lozenges” – the energy balls of Victorian England. The Pure Food Company of 1881 sounds just like a hundred wellness food businesses today – except for the fact that it collapsed within a year due to lack of sales.
We are once again living in an environment where ordinary food, which should be something reliable and sustaining, has come to feel noxious. Unlike the Victorians, we do not fear that our coffee is fake so much as that our entire pattern of eating may be bad for us, in ways that we can’t fully identify. One of the things that makes the new wave of wellness cookbooks so appealing is that they assure the reader that they offer a new way of eating that comes without any fear or guilt.
The founding principle of these modern wellness regimes is that our current way of eating is slowly poisoning us. “Much of the food on offer to us today is nutritionally substandard,” write the Hemsley sisters, best-selling champions of “nutrient-dense” food. It’s hard to disagree with the proposition that modern diets are generally “substandard”, even if you don’t share the Hemsleys’ solution of going “grain-free”. “All of these diets have a kernel of truth that is spun out into some bigger fantasy,” Giles Yeo says – hence their huge appeal.
Clean eating – whether it is called that or not – is perhaps best seen as a dysfunctional response to a still more dysfunctional food supply: a dream of purity in a toxic world. To walk into a modern western supermarket is to be assailed by aisle upon aisle of salty, oily snacks and sugary cereals, of “bread” that has been neither proved nor fermented, of cheap, sweetened drinks and meat from animals kept in inhumane conditions.
In the postwar decades, most countries in the world underwent what the professor of nutrition Barry Popkin calls a “nutrition transition” to a westernised diet high in sugar, meat, fat, salt, refined oils and ultra-processed concoctions, and low in vegetables. Affluence and multi-national food companies replaced the hunger of earlier generations with an unwholesome banquet of sweet drinks and convenience foods that teach us from a young age to crave more of the same. Wherever this pattern of eating travelled, it brought with it dramatic rises in ill health, from allergies to cancer.
In prosperous countries, large numbers of people – whether they wanted to lose weight or not – became understandably scared of the modern food supply and what it was doing to our bodies: type 2 diabetes, obesity and cardiovascular disease, not to mention a host of other complaints that are influenced by diet, ranging from Alzheimer’s to gout. When mainstream diets start to sicken people, it is unsurprising that many of us should seek other ways of eating to keep ourselves safe from harm. Our collective anxiety around diet was exacerbated by a general impression that mainstream scientific advice on diet – inflated by newspaper headlines – could not be trusted. First these so-called experts tell us to avoid fat, then sugar, and all the while people get less and less healthy. What will these “experts” say next, and why should we believe them?
Into this atmosphere of anxiety and confusion stepped a series of gurus offering messages of wonderful simplicity and reassurance: eat this way and I will make you fresh and healthy again. It is very hard to pinpoint the exact moment when “clean eating” started, because it is not so much a single diet as a portmanteau term that has borrowed ideas from numerous pre-existing diets: a bit of Paleo here, some Atkins there, with a few remnants of 1960s macrobiotics thrown in for good measure.
But some time in the early 2000s, two distinct but interrelated versions of clean eating became popular in the US – one based on the creed of “real” food, and the other on the idea of “detox”. Once the concept of cleanliness had entered the realm of eating, it was only a matter of time before the basic idea spread contagiously across Instagram, where fans of #eatclean could share their artfully photographed green juices and rainbow salad bowls.
The first and more moderate version of “clean” food started in 2007, when Tosca Reno, a Canadian fitness model, published a book called The Eat-Clean Diet. In it, Reno described how she lost 34kg (75lb) and transformed her health by avoiding all over-refined and “processed foods”, particularly white flour and sugar. A typical Reno eat-clean meal might be stir-fried chicken and vegetables over brown rice; or almond-date biscotti with a cup of tea. In many ways The Eat-Clean Diet was like any number of diet books that had come before, advising plenty of vegetables and modestly portioned, home-cooked meals. The difference, which Anthony Warner calls a piece of “genius” on Reno’s part, was that she presented it, above all, as a holistic way of living.
Meanwhile, a second version of clean eating was spearheaded by a former cardiologist from Uruguay called Alejandro Junger, the author of Clean: The Revolutionary Program to Restore the Body’s Natural Ability to Heal Itself, which was published in 2009 after Junger’s clean detox system had been praised by Gwyneth Paltrow on her Goop website. Junger’s system was far stricter than Reno’s, requiring, for a few weeks, a radical elimination diet based on liquid meals and a total exclusion of caffeine, alcohol, dairy and eggs, sugar, all vegetables in the “nightshade family” (tomatoes, aubergines and so on), red meat (which, according to Junger, creates an acidic “inner environment”), among other foods. During this phase, Junger advised a largely liquid diet either composed of home-made juices and soups, or of his own special powdered shakes. After the detox period, Junger advised very cautiously reintroducing “toxic triggers” such as wheat (“a classic trigger of allergic responses”) and dairy (“an acid-forming food”).
To read Junger’s book is to feel that everything edible in our world is potentially toxic. Yet, as with Arthur Hassall, many of Junger’s fears may be justified. Junger writes as a doctor with first-hand knowledge of diet-related epidemics of “cancer, cardiovascular disease, diabetes and autoimmune disease”. The book is full of case studies of individuals who follow Junger’s detox and emerge lighter, leaner and happier. “Who is the candidate for using this program?” Junger asks, replying: “Everyone who lives a modern life, eats a modern diet and inhabits the modern world.”
To my surprise, I found myself compelled by the messianic tone of Junger’s Clean – though not quite compelled enough to pay $475 for his 21-day programme (which, in any case, doesn’t ship outside of North America), or to give up my daily breakfast of inflammatory coffee, gut-irritating sourdough toast and acid-forming butter, on which I feel surprisingly well. When I told Giles Yeo how seductive I found Junger’s words, almost despite myself, he said: “This is their magic! They are all charismatic human beings. I do think the clean-eating gurus believe in it themselves. They drink the Kool-Aid.”
Over the past 50 years, mainstream healthcare in the west has been inexplicably blind to the role that diet plays in preventing and alleviating ill health. When it started, #eatclean spoke to growing numbers of people who felt that their existing way of eating was causing them problems, from weight gain to headaches to stress, and that conventional medicine could not help. In the absence of nutrition guidance from doctors, it was a natural step for individuals to start experimenting with cutting out this food or that.
From 2009 to 2014, the number of Americans who actively avoided gluten, despite not suffering from coeliac disease, more than tripled. It also became fashionable to drink a whole pantheon of non-dairy milks, ranging from oat milk to almond milk. I have lactose-intolerant and vegan friends who say that #eatclean has made it far easier for them to buy ingredients that they once had to go to specialist health-food stores to find. What isn’t so easy now is to find reliable information on special diets in the sea of half-truths and bunkum.
Someone who observed how quickly and radically #eatclean changed the market for health-food books is Anne Dolamore, a publisher at the independent food publishers Grub Street, based in London. Dolamore has been publishing health-related food books since 1995, a time when “free-from” cooking was a tiny subculture. In the days before Google, Dolamore – who has long believed that “food is medicine” – felt that books on special diets by authors with “proper credentials” could serve a useful purpose. In 1995, Grub Street published The Everyday Diabetic Cookbook, which has since sold over 100,000 copies in the UK. Other successful books followed, including The Everyday Wheat-Free and Gluten-Free Cookbook by Michelle Berriedale-Johnson, published in 1998.
In 2012, the market for “wellness” cookbooks in the UK suddenly changed, starting with the surprise success of Honestly Healthy by Natasha Corrett and Vicki Edgson, which sold around 80,000 copies. Louise Haines, a publisher at 4th Estate, recalls that the previous big trend in British food publishing had been baking, but the baking boom “died overnight, virtually, and a number of sugar-free books came through”.
At Grub Street, Anne Dolamore watched aghast as bestselling cookbooks piled up from a “never-ending stream of blonde, willowy ‘authorities’, many of whom seemed to be devising diets based on little but their own limited experience”. If Junger and Reno laid the groundwork for “eat clean” to become a vast global trend, it was social media and the internet that did the rest. Almost all of the authors of the British clean eating bestsellers started off as bloggers or Instagrammers, many of them beautiful women in their early 20s who were genuinely convinced that the diets they had invented had cured them of various chronic ailments.
Every wellness guru worth her Himalayan pink salt has a story of how changing what you eat can change your life. “Food has the power to make or break you,” wrote Amelia Freer in her 2014 bestseller Eat. Nourish. Glow. (which has sold more than 200,000 copies). Freer was leading a busy life as a personal assistant to the Prince of Wales when she realised that her tummy “looked and felt as if it had a football in it” from too many snatched dinners of cheese on toast or “factory-made food”. By giving up “processed” and convenience foods (“margarine, yuck!”) along with gluten and sugar, Freer claimed to have found the secrets to “looking younger and feeling healthier”.
Perhaps the best-known diet-transformation story of all is that of Ella Mills – possessor of more than a million Instagram followers. In 2011, Mills was diagnosed with postural tachycardia syndrome, a condition characterised by dizziness and extreme fatigue. Mills began blogging about food after discovering that her symptoms radically improved when she swapped her sugar-laden diet for “plant-based, natural foods”. Mills – who used to be a model – made following a “free-from” diet seem not drab or deprived, but deeply aspirational. By the time her first book appeared in January 2015, her vast following on social media helped her to sell 32,000 copies in the first week alone.
There was something paradoxical about the way these books were marketed. What they were selling purported to be an alternative to a sordidly commercial food industry. “If it’s got a barcode or a ‘promise’, don’t buy it,” wrote Freer. Yet clean eating is itself a wildly profitable commercial enterprise, promoted using photogenic young bloggers on a multi-billion-dollar tech platform. Literary agent Zoe Ross tells me that around 2015 she began to notice that “the market was scouring Instagram for copycat acts – specifically very pretty, very young girls pushing curated food and lifestyle”.
After years on the margins, health-based cooking was finally getting a mass audience. In 2016, 18 of the 20 top sellers in Amazon UK’s food and drink book category had a focus on healthy eating and dieting. The irony, however, was that the kind of well-researched books Dolamore and others once published no longer tended to sell so well, because health publishing was now dominated by social media celebrities. Bookshops were heaving with so many of these “clean” books that even the authors themselves started to feel that there were too many of them. Alice Liveing, a 23-year-old personal trainer who writes as Clean Eating Alice, argued in her 2016 book Eat Well Every Day that she was “championing what I feel is a much-needed breath of fresh air in what I think is an incredibly saturated market”. To my untrained eye, browsing through her book, Alice’s fresh approach to diet looked very similar to countless others: date and almond energy balls, kale chips, beetroot and feta burgers.
Then again, shouldn’t we give clean eating due credit for achieving the miracle of turning beetroot and kale into objects of desire? Data from analysts Kantar Worldpanel show that UK sales of fresh beetroot rose from £42.8m in 2013 to £50.5m in 2015. Some would argue that, in developed nations where most people eat shockingly poor diets, low in greens and high in sugar, this new union of health and food has done a modicum of good. Giles Yeo – who spent some time cooking a spicy sweet-potato dish with Ella Mills for his BBC programme – agrees that many of the clean eating recipes he tried are actually “a tasty and cool way to cook vegetables”. But why, Yeo asks, do these authors not simply say “I am publishing a very good vegetarian cookbook” and stop there, instead of making larger claims about the power of vegetables to beautify or prevent disease? “The poison comes from the fact that they are wrapping the whole thing up in pseudoscience,” Yeo says. “If you base something on falsehoods, it empowers people to take extreme actions, and this is where the harm begins.”
You can’t found a new faith system with the words “I am publishing a very good vegetarian cookbook”. For this, you need something stronger. You need the assurance of make-believe, whispered sweetly. Grind this cauliflower into tiny pieces and you can make a special kind of no-carb rice! Avoid all sugar and your skin will shimmer! Among other things, clean eating confirms how vulnerable and lost millions of us feel about diet – which really means how lost we feel about our own bodies. We are so unmoored that we will put our faith in any master who promises us that we, too, can become pure and good.
I can pinpoint the exact moment that my own feelings about clean eating changed from ambivalence to outright dislike. I was on stage at the Cheltenham literary festival with dietician Renee McGregor (who works both with Olympic athletes and eating disorder sufferers) when a crowd of around 300 clean-eating fans started jeering and shouting at us. We were supposedly taking part in a clean-eating debate with “nutritionist” Madeleine Shaw, author of Get the Glow and Ready Steady Glow.
Before that week, I had never read any of Shaw’s work. As I flicked through Ready Steady Glow, I was fairly endeared by the upbeat tone (“stop depriving yourself and start living”) and bright photos of a beaming Shaw. “I often surprise myself by finding new things to spiralise,” she writes, introducing a “sweet potato noodle” salad. Cauliflower pizza, in her view, is “quite simply: the best invention ever”.
But underneath the brightness there were notes of restriction that I found both worrying and confused. “As ever, all my recipes are sugar-and-wheat free”, Shaw announces, only to give a recipe for “gluten-free” brownies that contains 200g of coconut sugar, a substance that costs a lot more than your average white granulated sugar, but is metabolised by the body in the same way. I was still more alarmed by step four in Shaw’s nine-point food “philosophy”, which says that all bread and pasta should be avoided: they are “beige foods”, which are “full of chemicals, preservatives and genetically modified wheat”, and “not whole foods”. Shaw’s book makes no distinction between a loaf of, say, bleached sliced white, and a homemade wholemeal sourdough.
When we met on stage in Cheltenham, I asked Shaw why she told people to cut out all bread, and was startled when she denied she had said any such thing (rye bread was her favourite, she added). McGregor asked Shaw what she meant when she wrote that people should try to eat only “clean proteins”; meat that was “not deep-fried” was her rather baffling reply. McGregor’s main concern about clean eating, she added, was that as a professional treating young people with eating disorders, she had seen first-hand how the rules and restrictions of clean eating often segued into debilitating anorexia or orthorexia.
“But I only see the positive”, said Shaw, now wiping away tears. It was at this point that the audience, who were already restless whenever McGregor or I spoke, descended into outright hostility, shouting and hissing for us to get off stage. In a book shop after the event, as fans came up to Shaw to thank her for giving them “the glow”, I too burst into tears when one person jabbed her fingers at me and said I should be ashamed, as an “older woman” (I am 43), to have criticised a younger one. On Twitter that night, some Shaw fans made derogatory comments about how McGregor and I looked, under the hashtag #youarewhatyoueat. The implication was that, if we were less photogenic than Shaw, we clearly had nothing of any value to say about food (never mind the fact that McGregor has degrees in biochemistry and nutrition).
Thinking about the event on the train home, I realised that the crowd were angry with us not because they disagreed with the details (it’s pretty clear that you can’t have sugar in “sugar-free” recipes), but because they disliked the fact that we were arguing at all. To insist on the facts made us come across as cruelly negative. We had punctured the happy belief-bubble of glowiness that they had come to imbibe from Shaw. It’s striking that in many of the wellness cookbooks, mainstream scientific evidence on diet is seen as more or less irrelevant, not least because the gurus see the complacency of science as part of what made our diets so bad in the first place.
Amelia Freer, in Eat. Nourish. Glow, admits that “we can’t prove that dairy is the cause” of ailments ranging from IBS to joint pain, but concludes that it’s “surely worth” cutting dairy out anyway, just as a precaution. In another context, Freer writes that “I’m told it takes 17 years for scientific knowledge to filter down” to become general knowledge, while advising that gluten should be avoided. Once we enter the territory where all authority and expertise are automatically suspect, you can start to claim almost anything – and many #eatclean authorities do.
That night in Cheltenham, I saw that clean eating – or whatever name it now goes under – had elements of a post-truth cult. As with any cult, it could be something dark and divisive if you got on the wrong side of it. After Giles Yeo’s BBC programme was aired, he told me he was startled to find himself subjected to relentless online trolling. “They said I was funded by big pharma, and therefore obviously wouldn’t see the benefits of a healthy diet over medicine. These were outright lies.” (Yeo is employed by the University of Cambridge, and funded by the Medical Research Council.)
It’s increasingly clear that clean eating, for all its good intentions, can cause real harm, both to truth and to human beings. Over the past 18 months, McGregor says, “every single client with an eating disorder who walks into my clinic doors is either following or wants to follow a ‘clean’ way of eating”.
In her new book, Orthorexia, McGregor observes that while eating disorders long predate the #eatclean trend, “food rules” (such as eating no dairy or avoiding all grains) easily become “a guise for restricting food intake”. Moreover, they are not even good rules, based as they are on “unsubstantiated, unscientific claims”. Take almond milk, which is widely touted as a superior alternative to cow’s milk. McGregor sees it as little better than “expensive water”, containing just 0.1g protein per 100ml, compared with 3.2g per 100ml in cow’s milk. But she often finds it very difficult to convince her clients that restricting themselves to these “clean” foods is in the long run worse for their health than what she calls “unrestrained eating” – balanced and varied meals, but no panic about the odd ice cream or chocolate bar.
Clearly, not everyone who bought a clean-eating book has developed an eating disorder. But a movement whose premise is that normal food is unhealthy has now muddied the waters of “healthy eating” for everyone else, by planting the idea that a good diet is one founded on absolutes.
The true calamity of clean eating is not that it is entirely false. It is that it contains “a kernel of truth”, as Giles Yeo puts it. “When you strip down all the pseudo babble, they are absolutely right to say that we should eat more vegetables, less refined sugar and less meat,” Yeo said, sipping a black coffee in his office at the Institute of Metabolic Science in Cambridge, where he spends his days researching the causes of obesity. Yeo agrees with the clean eaters that our environment of cheap, plentiful, sugary, fatty food is a recipe for widespread obesity and ill health. The problem is it’s near impossible to pick out the sensible bits of “clean eating” and ignore the rest. #Eatclean made healthy eating seem like something “expensive, exclusive and difficult to achieve”, as Anthony Warner writes. Whether the term “clean” is used or not, there is a new puritanism about food that has taken root very widely.
A few weeks ago, I overheard a fit, middle-aged man at the gym berating a friend for not eating a better diet – a conversation that would once have been unimaginable among men. The first man was telling the second that the “skinny burgers” he preferred were nothing but “shitty mince and marketing” – and arguing that he could get almost everything he needed from a diet of vegetables, cooked with no oil. “Fat is fat, at the end of the day,” he concluded, before bemoaning the “idiots” who tried to eat something wholesome like a salad, then ruined it by adding salt. “If you have one bad diet day a week, you undo all your good work.”
The real question is how to fight this kind of diet absolutism without bouncing back to a mindless celebration of the modern food environment that is demonstrably making so many people sick. In 2016, more than 600 children in the UK were registered as living with type 2 diabetes; before 2002, there were no reported cases of children suffering from the condition, whose causes are diet-related.
Our food system is in desperate need of reform. There’s a danger that, in fighting the nonsense of clean eating, we end up looking like apologists for a commercial food supply that is failing in its basic task of nourishing us. Former orthorexia sufferer Edward L Yuen has argued – in his 2014 book, Beating Orthorexia – that the old advice of “everything in moderation” no longer works in a food environment where eating in the “middle ground” may still leave you with chronic diseases. When portions are supersized and Snickers bars are sold by the metre (something I saw in my local Tesco recently), eating “normally” is not necessarily a balanced option. The answer isn’t yet another perfect diet, but a shift in our idea of what constitutes normal food.
Sales of courgettes in the UK soared 20% from 2014 to 2015, fuelled by the rise of the spiraliser. But overall consumption of vegetables, both in the UK and worldwide, is still vanishingly small (with 74% of the adult UK population not managing to eat five a day). That is much lower than it was in the 1950s, when freshly cooked daily meals were still something that most people took for granted.
Among the affluent classes who already ate a healthier-than-average diet, the Instagram goddesses created a new model of dietary perfection to aim for. For the rest of the population, however, it simply placed the ideal of healthy food ever further out of reach. Behind the shiny covers of the clean-eating books, there is a harsh form of economic exclusion that says that someone who can’t afford wheatgrass or spirulina can never be truly “well”.
As the conversation I overheard in the gym illustrates, this way of thinking is especially dangerous because it obscures the message that, in fact, small changes in diet can have a large beneficial impact. If you think you can’t be healthy unless you eat nothing but vegetables, you might miss the fact that (as a recent overview of the evidence by epidemiologists showed) there are substantial benefits from raising your fruit-and-veg intake from zero portions a day to just two.
Among its many other offences, “clean eating” was a series of claims about food that were all or nothing – which only serves to underline the fact that most people, as usual, are stuck with nothing.
Heart disease remains the number one cause of death in the United States. One of the major risk factors for heart disease is high cholesterol, particularly LDL cholesterol. But although cholesterol tends to take the blame as the cause of heart disease, it is actually just one of many factors that can put you at risk.
What is Cholesterol?
Cholesterol is a fatty, wax-like substance found in your body and bloodstream. It is either made by the liver or comes from the food you eat. There are two main types of cholesterol, LDL and HDL. LDL is considered “bad” cholesterol because it can build up in the arteries, increasing the risk of heart attack or stroke. HDL cholesterol is “good” cholesterol because it is responsible for clearing any build-up from the arteries.1
Cholesterol and Heart Disease
When there is too much cholesterol in the blood, it can start to be deposited along the artery walls. When this happens, the arteries become thick and hardened, making it difficult for blood to pass through. This narrowing of the arteries is called atherosclerosis. If one of these arteries becomes blocked, it can lead to a heart attack or stroke.
The American Heart Association and the American College of Cardiology periodically publish guidelines to help doctors manage cholesterol, in hopes of lowering the risk of heart attack and stroke. The latest guidelines, released in 2018, focus on cholesterol management. They recommend keeping total cholesterol at around 150 mg/dL and LDL at less than 100 mg/dL, unless there are other risk factors for heart disease.2
But even these guidelines encourage physicians to look at the whole picture – lifestyle, genetics, and other medical conditions – before starting a statin to lower cholesterol levels. The reason is that heart disease risk involves more than cholesterol alone, and some research indicates that cholesterol may not be the biggest risk factor. A 2016 review of 19 studies found that people over 60 years old with high LDL cholesterol lived as long as, or longer than, those with low LDL cholesterol.3
Lowering Your Risk
As the understanding of the connection between cholesterol and heart disease continues to grow, recommendations will certainly change over time. But, in the meantime, if you are concerned about your risk for heart disease, there are several things you can do.
First, see your doctor regularly to monitor your cholesterol, blood sugar, blood pressure, weight, and other risk factors. They can help you manage any conditions that can increase your risk.
Second, consider some lifestyle changes that can help improve the health of your heart. You might think you need to cut out all cholesterol from your diet to reduce your blood cholesterol, but new research has found that dietary cholesterol doesn’t impact blood cholesterol all that much. Instead, focus on eating more healthy fats and reducing your intake of saturated fats, which can increase cholesterol levels. A 2016 study found that just a few small changes, such as modifying the type of fat you eat, can significantly improve cholesterol.4
Other lifestyle factors such as smoking, exercise, and your body weight can also impact your risk for heart disease. Aim for at least 30 minutes of exercise daily and maintain a healthy weight. If you are a smoker, work with a professional to help you quit.
Cholesterol and heart disease risk is a complex issue. As researchers begin to learn more about how high cholesterol impacts the development of this disease, guidelines will improve. But, for now you should focus on living a healthy lifestyle to protect your heart and body.
For many people, summertime is simply incomplete without serving a delicious array of scrumptious green vegetables. But here’s an idea: why not take a break from the usual leafy green salads, and dig into a plateful of succulent zucchini instead?
A member of the gourd family (Cucurbitaceae), zucchini is an easy-to-grow summer squash native to Central America and Mexico. It was brought to the United States by Italian immigrants during the 1920s. Some popular zucchini varieties include golden zucchini, tatume, costata romanesco, and yellow crooknecks.
Zucchini grows best in warm, frost-free weather, and thrives in fertile, moisture-rich soil. It grows on bushy plants that are 2 ½ feet tall, with rambling vines. Aside from the actual fruit (zucchini is a fruit, botanically speaking), the large, yellow, trumpet-shaped blossoms are also edible.
Zucchini can grow to massive sizes, but bigger does not necessarily mean better when it comes to this garden favorite. Small and medium-sized zucchinis (six to eight inches long and two inches in diameter) are more flavorful. The bigger the zucchini, the harder, seedier, and less flavorful it becomes. Look for dark-skinned zucchinis, which are richer in nutrients.
You won’t run out of uses for zucchini, as it is a highly versatile food that can suit many recipes. Mix it into soups, salads, or frittatas, serve it as a side dish with your meat dishes, or make “zucchini fries,” served with an onion dip as an appetizer. Want a healthy, no-grain and no-wheat pasta? Make zucchini “noodles” using a vegetable peeler – it will be as al dente as regular spaghetti.
Health Benefits of Zucchini
You’ll surely be impressed with the nutritional bounty that zucchini offers. It’s low in calories (only 17 calories per 100 grams), provides fiber, and has no cholesterol or unhealthy fats. It’s also rich in carotenoid antioxidants such as zeaxanthin, lutein, and carotenes, which play a significant role in slowing down aging and preventing diseases with their free radical-zapping properties.
Most of the antioxidants and fiber are in its skin, though, so it’s best to keep the skin when serving this food.
Zucchini is also a wonderful source of potassium, a heart-friendly nutrient that helps moderate your blood pressure levels and counters the effects of too much sodium. In fact, a medium zucchini has more potassium than a medium banana.
Zucchini is rich in B-complex vitamins, folate, B6, B1, B2, B3, and choline, as well as minerals like zinc and magnesium, which are all valuable in ensuring healthy blood sugar regulation – a definite advantage for diabetics. It also contains essential minerals such as iron, manganese, and phosphorus.
However, remember that some zucchini varieties grown in the United States are genetically modified, so if you’d rather avoid GMOs, it’s best to purchase this vegetable organic.
Zucchini Nutrition Facts
Serving Size: 3.5 ounces (100 grams), with skin, raw
Studies on Zucchini
A study revealed the wide array of health benefits that summer squashes, including zucchini, provide. According to food expert and food industry analyst Phil Lempert, the starchy carbohydrates in these crops come from polysaccharides in the cell walls and include pectins. An increasing number of animal studies now show that these starchy components in squash may have antioxidant, anti-inflammatory, anti-diabetic, and insulin-regulating properties.1
Zucchini Soup Recipe
1. Melt the coconut oil or butter in a large soup pot. Add the celery seeds, zucchini, celery, bell pepper, and salt. Stir, cover, and cook over low heat until the vegetables are tender, about 30 minutes.
2. Puree the cashews in the vegetable stock in a blender or food processor.
3. Combine the vegetables and the cashew-stock mixture in a blender. Puree thoroughly.
4. Place a large sieve (wire mesh strainer) over the soup pot.* Strain the vegetable-cashew mixture through it, stirring, and pressing the mixture down with the back of a spoon. Scrape bottom of sieve frequently. This step allows the soup to become creamy.
5. Discard the fibrous “material” left behind in the sieve.
6. Reheat the soup to serving temperature.
This recipe makes six servings.
*If using cashew butter, mix in the cashew butter after the third step and reheat in a soup pot.
Zucchini Fun Facts
Did you know that the largest zucchini ever recorded was 69 ½ inches long and weighed 65 pounds? It was grown by Bernard Lavery of Plymouth, Devon, UK, in his own garden.
If you’re a true zucchini fan, head to Obetz, Ohio from August 22nd to 25th, when the town holds its annual zucchini festival, featuring a parade, pageant, contests, arts and crafts, and games – a unique celebration of the remarkable and versatile zucchini.
What’s not to love about zucchini? Botanically a fruit but more commonly perceived as a vegetable, this versatile summer squash is a must-have in your garden – and on your plate. It’s easy to grow and requires minimal care while providing a tasty and versatile bounty that you can incorporate into many recipes – it can even be transformed into veggie noodles!
Zucchini possesses an impressive nutritional content – it boasts high levels of potassium, B-vitamins, dietary fiber, and antioxidants, which all offer immense benefits to your health. It can even potentially help regulate blood sugar levels, which can greatly benefit diabetics.
Here’s one yummy way to enjoy zucchini: simply slice it lengthwise, brush with coconut oil and a light sprinkle of sea salt, then lightly grill it. This will bring out the natural sweetness of this healthy food. Make sure to buy organic, non-GMO zucchini.
Cheerios are the best-selling breakfast cereal in America. The multi-grain version contains 18 milligrams of iron per serving, according to the label. Like almost any refined food made with wheat flour, it is fortified with iron. As it happens, there’s not a ton of oversight in the fortification process. One study measured the actual iron content of 29 breakfast cereals and found that 21 contained 20 percent more than the label value, and 8 contained 50 percent more.1 One contained nearly 200 percent of the label value.
If your bowl of cereal actually contains 20 percent more iron than advertised, that’s about 22 mg per serving. A safe assumption is that people tend to consume at least two serving sizes at a time.1 That gets us to 44 mg. The recommended daily allowance of iron is 8 mg for men and 18 mg for pre-menopausal women. The tolerable upper intake—which is the maximum daily intake thought to be safe by the National Institutes of Health—is 45 mg for adults.
It is entirely feasible that an average citizen could get awfully close to exceeding the maximum daily iron intake regarded as safe with a single bowl of what is supposed to be a pretty healthy whole-grain breakfast option.
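The arithmetic behind that claim can be sketched in a few lines. All of the figures below come from the article itself (the label value, the study’s 20 percent overage, the assumed two servings, and the NIH upper limit); nothing here is measured data.

```python
# Back-of-the-envelope version of the iron estimate in the text.
LABEL_MG = 18          # iron per serving, per the multi-grain Cheerios label
OVERAGE = 0.20         # study finding: many cereals held 20% more than labeled
SERVINGS = 2           # assumed typical pour: two serving sizes
UPPER_LIMIT_MG = 45    # NIH tolerable upper intake for adults

per_serving = round(LABEL_MG * (1 + OVERAGE))   # 21.6 -> "about 22 mg"
bowl_total = per_serving * SERVINGS             # 44 mg for the bowl

print(f"{bowl_total} mg, versus an upper limit of {UPPER_LIMIT_MG} mg")
```

A 44 mg bowl sits just 1 mg under the 45 mg ceiling, which is the whole point of the passage: one generous bowl nearly exhausts the safe daily budget.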
And that’s just breakfast.
At the same time that our iron consumption has grown to the borders of safety, we are beginning to understand that elevated iron levels are associated with everything from cancer to heart disease. Christina Ellervik, a research scientist at Boston Children’s Hospital who studies the connection between iron and diabetes, puts it this way: “Where we are with iron now is like where we were with cholesterol 40 years ago.”
The story of energy metabolism—the basic engine of life at the cellular level—is one of electrons flowing much like water flows from mountains to the sea. Our cells can make use of this flow by regulating how these electrons travel, and by harvesting energy from them as they do so. The whole set-up is really not so unlike a hydroelectric dam.
The sea toward which these electrons flow is oxygen, and for most of life on earth, iron is the river. (Octopuses are strange outliers here—they use copper instead of iron, which makes their blood greenish-blue rather than red). Oxygen is hungry for electrons, making it an ideal destination. The proteins that facilitate the delivery contain tiny cores of iron, which manage the handling of the electrons as they are shuttled toward oxygen.
This is why iron and oxygen are both essential for life. There is a dark side to this cellular idyll, though.
Normal energy metabolism in cells produces low levels of toxic byproducts. One of these byproducts is a derivative of oxygen called superoxide. Luckily, cells contain several enzymes that clean up most of this leaked superoxide almost immediately. They do so by converting it into another intermediary called hydrogen peroxide, which you might have in your medicine cabinet for treating nicks and scrapes. The hydrogen peroxide is then detoxified into water and oxygen.
Things can go awry if either superoxide or hydrogen peroxide happen to meet some iron on the way to detoxification. What then happens is a set of chemical reactions (described by Haber-Weiss chemistry and Fenton chemistry) that produce a potent and reactive oxygen derivative known as the hydroxyl radical. This radical—also called a free radical—wreaks havoc on biological molecules everywhere. As the chemists Barry Halliwell and John Gutteridge—who wrote the book on iron biochemistry—put it, “the reactivity of the hydroxyl radicals is so great that, if they are formed in living systems, they will react immediately with whatever biological molecule is in their vicinity, producing secondary radicals of variable reactivity.”2
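For readers who want the chemistry, the reactions named above have standard textbook forms (this is a general sketch, not drawn from any of the studies cited here):

```latex
% Fenton reaction: ferrous iron splits hydrogen peroxide,
% producing the hydroxyl radical
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^{-} + {}^{\bullet}OH}

% Superoxide regenerates the ferrous iron, closing the catalytic loop
\mathrm{Fe^{3+} + O_2^{\bullet-} \longrightarrow Fe^{2+} + O_2}

% Net result: the iron-catalyzed Haber-Weiss reaction
\mathrm{O_2^{\bullet-} + H_2O_2 \longrightarrow O_2 + OH^{-} + {}^{\bullet}OH}
```

The key point is that iron is recycled rather than consumed, so even trace amounts can keep churning out hydroxyl radicals as long as superoxide and hydrogen peroxide are available.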
Such is the Faustian bargain that has been struck by life on this planet. Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells. As the neuroscientist J.R. Connor has said, “life was designed to exist at the very interface between iron sufficiency and deficiency.”3
Hemoglobin, ferritin, and transferrin
At the end of the 20th century, the metabolism of iron in the human body was still a bit of a mystery. Scientists knew of only two ways that the body could excrete iron—bleeding, and the routine sloughing of skin and gastrointestinal cells. But these processes amount to only a few milligrams per day. That meant that the body must have some way to tightly regulate iron absorption from the diet. In 2000 a major breakthrough was announced—a protein was found that functioned as the master regulator for iron. The system, as so many biological systems are, is perfectly elegant. When iron levels are sufficient, the protein, called hepcidin, is secreted into the blood by the liver. It then signals to gastrointestinal cells to decrease their absorption of iron, and for other cells around the body to sequester their iron into ferritin, a protein that stores iron. When iron levels are low, blood levels of hepcidin fall, and intestinal cells begin absorbing iron again. Hepcidin has since become recognized as the principal governor of iron homeostasis in the human body.
But if hepcidin so masterfully regulates absorption of iron from the diet to match the body’s needs, is it possible for anyone to absorb too much iron?
In 1996, a team of scientists announced that they had discovered the gene responsible for hereditary hemochromatosis, a disorder that causes the body to absorb too much iron. They called it HFE. Subsequent work revealed that the product of the HFE gene was instrumental in regulating hepcidin. People with a heritable mutation in this gene effectively carry a built-in defect in the entire regulatory apparatus that hepcidin coordinates.
This, then, leaves open the possibility that some of us could in fact take in more iron than the body is able to handle. But how common are these mutations? Common enough to matter for even a minority of people reading these words?
Surprisingly, the answer is yes. The prevalence of hereditary hemochromatosis, in which two defective copies of the HFE gene are present and there are clinical signs of iron overload, is actually pretty high—as many as 1 in 200 in the United States. And perhaps 1 in 40 may have two defective HFE genes without overt hemochromatosis.4 That’s more than 8 million Americans who could have a significant short-circuit in their ability to regulate iron absorption and metabolism.
What if you have only one defective HFE gene, and one perfectly normal gene? This is called heterozygosity. We would expect to find more people in this situation than the homozygotes, or those with two bad copies of the gene. And in fact we do. Current estimates suggest that more than 30 percent of the U.S. population could be heterozygotes with one dysfunctional HFE gene.4 That’s pretty close to 100 million people.
Does this matter? Or is one good gene enough? There isn’t much research, but so far the evidence suggests that some heterozygotes do have impaired iron metabolism. Studies have shown that HFE heterozygotes tend to have modest elevations of ferritin as well as of transferrin, the protein that chaperones iron through the blood, both of which would indicate elevated iron levels.5,6 And a study published in 2001 concluded that HFE heterozygotes may have up to a fourfold increased risk of developing iron overload.4
Perhaps more concerning is that these heterozygotes have also been shown to be at increased risk for several chronic diseases, like heart disease and stroke. One study found that heterozygotes who smoked had a 3.5 times greater risk of cardiovascular disease than controls, while another found that heterozygosity alone significantly increased the risk of heart attack and stroke.7,8 A third study found that heterozygosity increased nearly sixfold the risk of cardiomyopathy, which can lead to heart failure.9
The connection between excessive iron and cardiovascular disease may extend beyond HFE heterozygotes. A recent meta-analysis identified 55 studies of this connection that were rigorous enough to meet their inclusion criteria. Out of 55 studies, 27 supported a positive relationship between iron and cardiovascular disease (more iron equals more disease), 20 found no significant relationship, and 8 found a negative relationship (more iron equals less disease).10
A few highlights: a Scandinavian study compared men who suffered a heart attack to men who didn’t, and found that elevated ferritin levels conferred a two- to threefold increase in heart attack risk. Another found that having a high ferritin level made a heart attack five times more likely than having a normal level. A larger study of 2,000 Finnish men found that an elevated ferritin level increased the risk of heart attack twofold, and that every 1 percent increase in ferritin level conferred a further 4 percent increase in that risk. The only other risk factor found to be stronger than ferritin in this study was smoking.
Ferritin isn’t a perfect marker of iron status, though, because it can also be affected by anything that causes inflammation. To address this problem a team of Canadian researchers directly compared blood iron levels to heart attack risk, and found that higher levels conferred a twofold increased risk in men and a fivefold increased risk in women.
If cardiovascular disease is one point in iron’s web of disease, diabetes may be another. The first hint of a relationship between iron and diabetes came in the late 1980s, when researchers discovered that patients receiving regular blood transfusions (which contain quite a bit of iron) were at significantly increased risk of diabetes. In hemochromatosis, there had been no way to know if the associated disturbance in glucose metabolism was due to the accumulation of iron itself, or to the underlying genetic defect. This new link between frequent transfusions and diabetes was indirect evidence that the iron itself may be the cause.
The next step was to mine existing data for associations between markers of iron status and diabetes. The first study to do so came out of Finland in 1997: Among 1,000 randomly selected Scandinavian men, ferritin emerged as a strong predictor of dysfunctional glucose metabolism, second only to body mass index as a risk factor.11 In 1999, researchers found that an elevated ferritin level increased the odds of having diabetes fivefold in men and nearly fourfold in women—similar in magnitude to the association between obesity and diabetes.12 Five years later, another study found that elevated ferritin roughly doubled the risk for metabolic syndrome, a condition that often leads to diabetes, hypertension, liver disease, and cardiovascular disease.13
Christina Ellervik’s first contribution to the field came in 2011, with a study investigating the association between increased transferrin saturation—a measure of how much iron is loaded onto the transferrin protein, which moves iron through the blood—and diabetes risk.14 Ellervik found that within a sample of nearly 35,000 Danes, transferrin saturation greater than 50 percent conferred a two- to threefold increased risk of diabetes. She also identified an increase in mortality rates with transferrin saturation greater than 50 percent.
In 2015, she led another study that found that, among a sample of 6,000 people, those whose ferritin levels were in the highest 20 percent had 4 times greater odds of diabetes than those with ferritin levels in the lowest 20 percent.15 Blood glucose levels, blood insulin levels, and insulin sensitivity all were raised with higher ferritin levels.
There’s a problem here, though. All of these studies show associations. They show that two things tend to happen together. But they don’t tell us anything about causality. To learn something about causality, you need an intervention. In the case of iron, you’d need to lower the iron and then watch what happens. Fortunately, there’s a very easy and very safe intervention to lower iron levels that’s performed millions of times every year—phlebotomy, also known as blood donation.
One of the first studies to use phlebotomy to examine the relationship between iron and diabetes was published in 1998.16 The authors found that among both healthy and diabetic subjects, phlebotomy improved insulin sensitivity and glucose metabolism. A 2005 study found that regular blood donors exhibited lower iron stores and significantly greater insulin sensitivity than non-donors.17 In 2012, researchers phlebotomized pre-diabetic volunteers until their ferritin levels dropped significantly, and found a marked subsequent improvement in their insulin sensitivity.18 In that same year, a different group of scientists studied the effect of phlebotomy on several elements of metabolic syndrome, including glucose metabolism. They found that a single phlebotomy session was associated with improvement in blood pressure, fasting glucose, hemoglobin A1C (a marker for average glucose levels), and blood cholesterol six weeks later.19
Many caveats apply to this evidence—the line between correlation and causation remains unclear, some of the studies used relatively small sample sizes, and phlebotomy may cause other changes in addition to lowering iron. But taken together, the data lends weight to the idea that iron plays a significant role in the tortuous pathophysiology of diabetes.
As more published data began to suggest a relationship between iron, cardiovascular disease, and diabetes, researchers started casting broader nets.
Next up was cancer.
It had been known since the late 1950s that injecting large doses of iron into lab animals could cause malignant tumors, but it wasn’t until the 1980s that scientists began looking for associations between iron and cancer in humans. In 1985, Ernest Graf and John Eaton proposed that differences in colon cancer rates among countries could be accounted for by the variation in the fiber content of local diets, which can in turn affect iron absorption.20
The following year, Richard Stevens found that elevated ferritin was associated with triple the risk of death from cancer among a group of 20,000 Chinese men.21 Two years later Stevens showed that American men who developed cancer had higher transferrin saturation and serum iron than men who didn’t.22 In 1990, a large study of Swedish blood donors found that they were 20 percent less likely to get cancer than non-donor controls.23 Four years later, a group of Finnish researchers found that elevated transferrin saturation among 40,000 Scandinavians conferred a threefold increased risk for colorectal cancer, and a 1.5-fold increased risk for lung cancer.24
A host of research articles have been published since Graf and Eaton’s first paper, and most have supported an association between iron and cancer—particularly colorectal cancer. In 2001, a review of 33 publications investigating the link between iron and colorectal cancer found that more than 75 percent of them supported the relationship.25 A 2004 study found an increased risk of death from cancer with rising serum iron and transferrin saturation. People with the highest levels were twice as likely to die from cancer as those with the lowest levels.26 And in 2008, another study confirmed that Swedish blood donors had about a 30 percent decrease in cancer risk.27
There are a few other lines of evidence that support the association between iron and cancer. People with an HFE mutation have an increased risk of developing colon and blood cancers.28 Conversely, people diagnosed with breast, blood, and colorectal cancers are more than twice as likely to be HFE heterozygotes than are healthy controls.29
There are also a handful of interventional trials investigating the relationship between iron and cancer. The first was published in 2007 by a group of Japanese scientists who had previously found that iron reduction via phlebotomy essentially normalized markers of liver injury in patients with hepatitis C. Hepatocellular carcinoma (HCC) is a feared consequence of hepatitis C and cirrhosis, and they hypothesized that phlebotomy might also reduce the risk of developing this cancer. The results were remarkable—at five years only 5.7 percent of patients in the phlebotomy group had developed HCC compared to 17.5 percent of controls. At 10 years the results were even more striking, with 8.6 percent of phlebotomized patients developing HCC compared to an astonishing 39 percent of controls.30
The second study to investigate the effects of phlebotomy on cancer risk was published the following year by Leo Zacharski, a colorful emeritus professor at Dartmouth. In a multi-center, randomized study originally designed to look at the effects of phlebotomy on vascular disease, patients allocated to the iron-reduction group were about 35 percent less likely to develop cancer after 4.5 years than controls. And among all patients who did develop cancer, those in the phlebotomy group were about 60 percent less likely to have died from it at the end of the follow-up period.31
The brain is a hungry organ. Though only 2 to 3 percent of body mass, it consumes 20 percent of the body’s total oxygen supply. With a metabolism that hot, it’s inevitable that the brain will also produce more free radicals as it churns through all that oxygen. Surprisingly, the brain appears to have less antioxidant capacity than other tissues in the body, which could make it more susceptible to oxidative stress.32 The balance between normal cellular energy metabolism and damage from reactive oxygen species may be even more delicate in the brain than elsewhere in the body. This, in turn, points to a sensitivity to iron.
It’s been known since the 1920s that neurodegenerative disease—illnesses like Alzheimer’s and Parkinson’s—is associated with increased iron deposition in the brain. In 1924, a towering Parisian neurologist named Jean Lhermitte was among the first to show that certain regions of the brain become congested with abnormal amounts of iron in advanced Parkinson’s disease.33 Thirty years later, in 1953, a physician named Louis Goodman demonstrated that the brains of patients with Alzheimer’s disease had markedly abnormal levels of iron deposited in the same regions as the famed plaques and tangles that define the illness.34 Goodman’s work was largely forgotten for several decades, until a 1992 paper resurrected and confirmed his findings and kindled new interest. Two years later an exciting new technology called MRI was deployed to probe the association between iron and disease in living patients, confirming earlier autopsy findings that Alzheimer brains demonstrated significant aberrations in tissue iron.35
By the mid 1990s, there was compelling evidence that Alzheimer’s and Parkinson’s disease involved some dysregulation of iron metabolism in the brain, but no one knew whether the relationship was a cause or a consequence of the disease process. Hints began trickling in at around the same time the MRI findings were being published. A 1993 paper reported that iron promoted aggregation of amyloid-β, the major constituent of Alzheimer’s plaques.36 In 1997, researchers found that the aberrant iron associated with Alzheimer’s plaques was highly reactive and able to freely generate toxic oxygen radicals.37 By 2010, it had been shown that oxidative damage was one of the earliest detectable changes associated with Alzheimer’s, and that reactive iron was present in the earliest stages of the disease.38,39 And in 2015, a seven-year longitudinal study showed that cerebrospinal fluid ferritin levels were a strong predictor of cognitive decline and development of Alzheimer’s dementia.40
Perhaps most surprising was the discovery in 1999 that the precursor to amyloid-β was under direct control by cellular iron levels—the more iron around, the more amyloid was produced.41 This raised the tantalizing possibility that amyloid plaques might actually represent an adaptive response rather than a cause, an idea that has been indirectly supported by the spectacular failure of essentially all efforts to directly target amyloid protein as treatment for the disease.
Together, these findings suggest that abnormal iron metabolism in the brain could be a causative factor in Alzheimer’s and other neurodegenerative diseases. If that’s true, then we might expect that people who are genetically predisposed to aberrant iron metabolism would be at higher risk of dementing diseases than others. And so they are.
In the early 2000s, it was discovered that patients with familial Alzheimer’s were more likely to carry one of the HFE mutations than healthy controls.42 Another study found that these genotypes were associated with earlier onset of the disease compared to controls, and that the effect was even more powerful in people who carried an HFE mutation as well as an ApoE4 allele, the primary genetic risk factor for Alzheimer’s disease.43 A 2004 study showed that the co-occurrence of an HFE mutation with a known variant in the transferrin gene conferred a fivefold increased risk of Alzheimer’s.44 Two years later a team of Portuguese scientists found that the HFE variants were associated with increased risk of Parkinson’s as well.45
What about interventional trials? For neurodegenerative disease, there has been exactly one. In 1991, a team of Canadian scientists published the results of a two-year randomized trial of the iron chelator desferrioxamine in 48 patients with Alzheimer’s disease.46 Chelators are a class of medication that bind metal cations like iron, sequester them, and facilitate their excretion from the body. Patients were randomly allocated to receive desferrioxamine, placebo, or no treatment. The results were impressive—at two years, iron reduction had cut the rate of cognitive decline in half.
The study was published in The Lancet, one of the world’s most prestigious medical journals, but seems to have been forgotten in the 20-odd year interim. Not a single interventional study testing the role of iron in Alzheimer’s disease has been published since.
If so many studies seem to show a consistent association between iron levels and chronic disease, why isn’t more work being done to clarify the risk?
“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials,” Dartmouth’s Zacharski said to me. “If people would just take up the gauntlet and do well-designed, insightful studies of the iron hypothesis, we would have a much firmer understanding of this. Just imagine if it turns out to be verified!”
His perspective on why more trials haven’t been done is fascinating, and paralleled much of what other experts in the field said. “Sexiness,” believe it or not, came up in multiple conversations—molecular biology and targeted pharmaceuticals are hot (and lucrative), and iron is decidedly not. “Maybe it’s not sexy enough, too passé, too old school,” said one researcher I spoke to. Zacharski echoed this in our conversation, and pointed out that many modern trials are funded by the pharmaceutical industry, which is keen to develop the next billion-dollar drug. Government agencies like the NIH can step in to fill gaps left by the for-profit research industry, but publicly funded scientists are subject to the same sexiness bias as everyone else. As one senior university scientist told me, “NIH goes for fashion.”
Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries. He thinks that even subtly elevated iron levels can result in free radical formation, which then contributes to chronic inflammation. And chronic inflammation, we know, is strongly linked to everything from heart disease to diabetes, cancer to Alzheimer’s.
“If this doesn’t deserve randomized trials,” he told me, “then I don’t know what does.”
Until those randomized trials arrive—I’ll see you at the blood bank.
Clayton Dalton is an emergency medicine resident at Massachusetts General Hospital in Boston. He has published stories and essays with NPR, Aeon, and The Los Angeles Review.
Lead image: Liliya Kandrashevich / Shutterstock
1. Whittaker, P., Tufaro, P.R., & Rader, J.I. Iron and folate in fortified cereals. The Journal of the American College of Nutrition 20, 247-254 (2001).
2. Halliwell, B. & Gutteridge, J.M. Oxygen toxicity, oxygen radicals, transition metals and disease. Biochemical Journal 219, 1-14 (1984).
3. Connor, J.R. & Ghio, A.J. The impact of host iron homeostasis on disease. Preface. Biochimica et Biophysica Acta 1790, 581-582 (2009).
4. Hanson, E.H., Imperatore, G., & Burke, W. HFE gene and hereditary hemochromatosis: a HuGE review. American Journal of Epidemiology 154, 193-206 (2001).
5. Beutler, E., Felitti, V.J., Koziol, J.A., Ho, N.J., & Gelbart, T. Penetrance of 845G→A (C282Y) HFE hereditary haemochromatosis mutation in the USA. The Lancet 359, 211-218 (2002).
6. Rossi, E., et al. Effect of hemochromatosis genotype and lifestyle factors on iron and red cell indices in a community population. Clinical Chemistry 47, 202-208 (2001).
7. Roest, M., et al. Heterozygosity for a hereditary hemochromatosis gene is associated with cardiovascular death in women. Circulation 100, 1268-1273 (1999).
8. Tuomainen, T.P., et al. Increased risk of acute myocardial infarction in carriers of the hemochromatosis gene Cys282Tyr mutation: A prospective cohort study in men in eastern Finland. Circulation 100, 1274-1279 (1999).
9. Pereira, A.C., et al. Hemochromatosis gene variants in patients with cardiomyopathy. American Journal of Cardiology 88, 388-391 (2001).
10. Muñoz-Bravo, C., Gutiérrez-Bedmar, M., Gómez-Aracena, J., García-Rodríguez, A., & Navajas, J.F. Iron: protector or risk factor for cardiovascular disease? Still controversial. Nutrients 5, 2384-2404 (2013).
11. Tuomainen, T.P., et al. Body iron stores are associated with serum insulin and blood glucose concentrations. Population study in 1,013 eastern Finnish men. Diabetes Care 20, 426-428 (1997).
12. Ford, E.S. & Cogswell, M.E. Diabetes and serum ferritin concentration among U.S. adults. Diabetes Care 22, 1978-1983 (1999).
13. Jehn, M., Clark, J.M., & Guallar, E. Serum ferritin and risk of the metabolic syndrome in U.S. adults. Diabetes Care 27, 2422-2428 (2004).
14. Ellervik, C., et al. Elevated transferrin saturation and risk of diabetes: three population-based studies. Diabetes Care 34, 2256-2258 (2011).
15. Bonfils, L., et al. Fasting serum levels of ferritin are associated with impaired pancreatic beta cell function and decreased insulin sensitivity: a population-based study. Diabetologia 58, 523-533 (2015).
16. Facchini, F.S. Effect of phlebotomy on plasma glucose and insulin concentrations. Diabetes Care 21, 2190 (1998).
17. Fernández-Real, J.M., López-Bermejo, A., & Ricart, W. Iron stores, blood donation, and insulin sensitivity and secretion. Clinical Chemistry 51, 1201-1205 (2005).
18. Gabrielsen, J.S., et al. Adipocyte iron regulates adiponectin and insulin sensitivity. Journal of Clinical Investigation 122, 3529-3540 (2012).
19. Houschyar, K.S., et al. Effects of phlebotomy-induced reduction of body iron stores on metabolic syndrome: results from a randomized clinical trial. BMC Medicine 10, 54 (2012).
20. Graf, E. & Eaton, J.W. Dietary suppression of colonic cancer. Fiber or phytate? Cancer 56, 717-718 (1985).
21. Stevens, R.G., Beasley, R.P., & Blumberg, B.S. Iron-binding proteins and risk of cancer in Taiwan. Journal of the National Cancer Institute 76, 605-610 (1986).
22. Stevens, R.G., Jones, D.Y., Micozzi, M.S., & Taylor, P.R. Body iron stores and the risk of cancer. New England Journal of Medicine 319, 1047-1052 (1988).
23. Merk, K., et al. The incidence of cancer among blood donors. International Journal of Epidemiology 19, 505-509 (1990).
24. Knekt, P., et al. Body iron stores and risk of cancer. International Journal of Cancer 56, 379-382 (1994).
25. Nelson, R.L. Iron and colorectal cancer risk: human studies. Nutrition Reviews 59, 140-148 (2001).
26. Wu, T., Sempos, C.T., Freudenheim, J.L., Muti, P., & Smit, E. Serum iron, copper and zinc concentrations and risk of cancer mortality in US adults. Annals of Epidemiology 14, 195-201 (2004).
27. Edgren, G., et al. Donation frequency, iron loss, and risk of cancer among blood donors. Journal of the National Cancer Institute 100, 572-579 (2008).
28. Nelson, R.L., Davis, F.G., Persky, V., & Becker, E. Risk of neoplastic and other diseases among people with heterozygosity for hereditary hemochromatosis. Cancer 76, 875-879 (1995).
29. Weinberg, E.D. & Miklossy, J. Iron withholding: a defense against disease. Journal of Alzheimer’s Disease 13, 451-463 (2008).
30. Kato, J., et al. Long-term phlebotomy with low-iron diet therapy lowers risk of development of hepatocellular carcinoma from chronic hepatitis C. Journal of Gastroenterology 42, 830-836 (2007).
31. Zacharski, L.R., et al. Decreased cancer risk after iron reduction in patients with peripheral arterial disease: results from a randomized trial. Journal of the National Cancer Institute 100, 996-1002 (2008).
32. Lee, H.G., et al. Amyloid-beta in Alzheimer disease: the null versus the alternate hypotheses. Journal of Pharmacology and Experimental Therapeutics 321, 823-829 (2007).
33. Lhermitte, J., Kraus, W.M., & McAlpine, D. On the occurrence of abnormal deposits of iron in the brain in Parkinsonism with special reference to its localisation. Journal of Neurology and Psychopathology 5, 195-208 (1924).
34. Goodman, L. Alzheimer’s disease; a clinico-pathologic analysis of twenty-three cases with a theory on pathogenesis. The Journal of Nervous and Mental Disease 118, 97-130 (1953).
35. Bartzokis, G., et al. In vivo evaluation of brain iron in Alzheimer’s disease and normal subjects using MRI. Biological Psychiatry 35, 480-487 (1994).
36. Mantyh, P.W., et al. Aluminum, iron, and zinc ions promote aggregation of physiological concentrations of beta-amyloid peptide. Journal of Neurochemistry 61, 1171-1174 (1993).
37. Smith, M.A., Harris, P.L., Sayre, L.M., & Perry, G. Iron accumulation in Alzheimer disease is a source of redox-generated free radicals. Proceedings of the National Academy of Sciences 94, 9866-9868 (1997).
38. Nunomura, A., et al. Oxidative damage is the earliest event in Alzheimer disease. Journal of Neuropathology and Experimental Neurology 60, 759-767 (2001).
39. Smith, M.A., et al. Increased iron and free radical generation in preclinical Alzheimer disease and mild cognitive impairment. Journal of Alzheimer’s Disease 19, 363-372 (2010).
40. Ayton, S., Faux, N.G., & Bush, A.I. Ferritin levels in the cerebrospinal fluid predict Alzheimer’s disease outcomes and are regulated by APOE. Nature Communications 6, 6760 (2015).
41. Rogers, J.T., et al. Translation of the Alzheimer amyloid precursor protein mRNA is up-regulated by interleukin-1 through 5’-untranslated region sequences. Journal of Biological Chemistry 274, 6421-6431 (1999).
42. Moalem, S., et al. Are hereditary hemochromatosis mutations involved in Alzheimer disease? American Journal of Medical Genetics 93, 58-66 (2000).
43. Combarros, O., et al. Interaction of the H63D mutation in the hemochromatosis gene with the apolipoprotein E epsilon 4 allele modulates age at onset of Alzheimer’s disease. Dementia and Geriatric Cognitive Disorders 15, 151-154 (2003).
44. Robson, K.J., et al. Synergy between the C2 allele of transferrin and the C282Y allele of the haemochromatosis gene (HFE) as risk factors for developing Alzheimer’s disease. Journal of Medical Genetics 41, 261-265 (2004).
45. Pulliam, J.F., et al. Association of HFE mutations with neurodegeneration and oxidative stress in Alzheimer’s disease and correlation with APOE. American Journal of Medical Genetics Part B 119B, 48-53 (2003).
46. Crapper-McLachlan, D.R., et al. Intramuscular desferrioxamine in patients with Alzheimer’s disease. The Lancet 337, 1304-1308 (1991).
After all, 16 hours is a long time to go without eating. Here’s everything you need to know about the popular weight-loss regimen—including whether it actually works.
Chris Pratt! Hugh Jackman! Halle Berry! Kourtney Kardashian! What these celebrities have in common, other than a gratuitous exclamation point after their names, is a professed fondness for intermittent fasting, the diet craze turning the fitness world on its sweaty, well-toned head. For help determining whether you, too, should incorporate this into your 2019 resolution-related plans, we asked a few experts to explain what it is, why people love it, and whether it’s really worth the pain of forgoing on-demand snacks for the rest of the winter.
What is intermittent fasting, exactly?
Intermittent fasting, unlike many other diets, is famously flexible in that you choose the days and hours during which you think it’s best to fast. The two most common methods are the 16:8 strategy—where you eat whatever you want (within reason) for eight hours a day and then fast for the other 16—and the 5:2 method, where you eat normally five days a week and then keep your food intake to roughly 500 to 600 calories on the other two days. In essence, it’s a simplified calorie-math problem that’s supposed to prevent the yo-yo effect of weight loss and weight gain.
“There are different ways to do this diet, but the bottom line is that no matter which you choose, you’re taking in less energy, and because of that, you’re going to start using your own body stores for energy,” says Lisa Sasson, a clinical professor of nutrition at NYU. “If you don’t, you’re not going to lose weight.”
Why might I want to try it?
A recent study completed by the German Cancer Research Center concluded that intermittent fasting indeed “helps lose weight and promotes health,” and noted that the regimen was especially effective at reducing fat in the liver. A USC study also found that the diet reduced participants’ risk of cancer, diabetes, heart disease, and other age-related diseases. While the researchers involved cautioned that more testing is necessary, the results are at least encouraging.
Most people who swear by intermittent fasting will tell you it helps not only with losing weight but also with reducing “belly fat.” This is not a conclusion with scientific backing, but it is the sort of thing to which every six-pack enthusiast aspires.
Why might I not want to try it?
“There’s really no conclusive evidence that there’s any benefit,” Sasson says. The German Cancer Research Center study qualified its findings by noting that the positive results weren’t noticeably better than those experienced by subjects who adopted a conventional calorie-reduction diet. In other words, it works, but not notably better than the alternative. (Sasson also offered a helpful list of individuals who should not give intermittent fasting a try: pregnant women and anyone with diabetes, cancer, or an eating disorder.)
The best long-term diets, no matter what their rules entail, are the ones that are least difficult to maintain—and again, in this regard, intermittent fasting isn’t inherently superior to anything else. “Are you making changes in your behavior? Have you learned positive habits so that when you go back to not fasting, you’re going to be a healthier eater?” Sasson asks. “I know people who fast because they think, Okay, I’m going to be really bad and overdrink or overeat, and then two days a week I’m going to have a clean life, and that’s just not how it works.”
Also, for many people, a full 16 hours of fasting just isn’t realistic, says Cynthia Sass, a New York City– and L.A.-based performance nutritionist. She recommends 12 hours of overnight fasting at most and believes the 16-hour gap is especially tough on those who exercise early in the morning or late at night. “If fasting makes you feel miserable and results in intense cravings and rebound overeating, it’s not the right path for you,” she says.
So—should I try it?
As long as you’re aware that it isn’t nutritional magic, Sasson isn’t against intermittent fasting altogether. “I’ve worked with patients who need positive reinforcement to see that their weight went down to feel better, and they feel in control for the first time,” she says. “That self-efficacy, that feeling that they could do it—for some, that might be important.”
Of the two most popular methods, Sasson leans toward the 5:2 schedule as slightly more manageable, since you’re only reducing your intake twice a week. But again, that’s contingent on your being a responsible dieter on your days of lowered caloric intake, which requires an immense amount of discipline—especially when it comes to remembering to drink water. “You can go a long time without food, but only a few days without adequate hydration,” she warns.
If these extended periods without delicious food sound too painful to handle, rest assured: The best available evidence indicates that a regular ol’ diet is at least as safe and healthy and efficacious as intermittent fasting. Besides, sooner or later, a shiny new fad is bound to come along for the A-listers to fawn over, she says: “There’s going to be a new darling of the month before you know it.”