As America’s opioid crisis affects millions, babies nationwide are born into harrowing withdrawal inherited from their addicted mothers.
Wikimedia Commons: An intubated newborn in a neonatal unit, 2011.
The opioid epidemic has become a national crisis. While most of its victims are addicted adults, newborns born to those addicts suffer withdrawal the moment they enter the world. These babies end up in the Neonatal Intensive Care Unit (NICU) — an ICU for infants — and experience pain and suffering before they can experience anything else in their world.
The nationwide consequences of America’s opioid epidemic have become so stark and prevalent that the National Institute on Drug Abuse reported that a baby is born suffering from opioid withdrawal every fifteen minutes.
In response, hospitals across the country have received volunteer aid from regular citizens who serve as “baby cuddlers”: they rock the ailing infants to sleep, provide a necessary human connection, and allow them a small semblance of peace.
Hospitals across the country are opening individual “cuddler” programs as part-time volunteer positions to combat the crisis; they can be found everywhere from Iowa and Virginia to Massachusetts and Texas.
Wikimedia Commons: A newborn in the Neonatal Intensive Care Unit.
University Hospital in Bexar County (San Antonio), Texas, has the largest number of babies born with NAS in the entire state. A third of Texas’s NAS births occur there — and the hospital’s number of NAS births has spiked by 60 percent over the last five years.
So when University Hospital put out a call for volunteers for its NICU cuddling program, Army veteran Doug Walters was quick to sign up, Texas Public Radio reported.
“Jonathan is supposed to be going to sleep, but we’re having some challenges right now,” said Walters in reference to an infant he volunteered to care for. “He’s three and a half months. So he’s been a resident for a little while.”
Walters has been a part-time baby cuddler for over three years now and said he has come to specialize in those who enter the NICU with neonatal abstinence syndrome (NAS) — opioid withdrawal inherited from their mothers.
Wikimedia Commons: A crying newborn.
The symptoms of NAS include tight muscles and subsequent body stiffness, tremors, seizures, and hyperactive reflexes. Newborns with NAS are prone to gastrointestinal problems and thus have trouble feeding. These babies can also have trouble breathing.
Infants suffering from NAS let out a distinctive, high-pitched cry that Walters said is immediately identifiable as stemming from that particular syndrome.
“You can tell when kids cry because they’re mad, or they’re hungry, and (babies with NAS) just…it’s a very sad cry,” he said. “It’s just sad, because they don’t understand what’s happening, and they don’t understand why things hurt. They just don’t understand.”
Laurie Weaver has been a nurse in the University Hospital NICU for 27 years and has come to care for babies with NAS more than any other type of patient. For her, it’s the unfairness of it — these infants were dealt a rough hand through no fault of their own — that draws her to them.
“I just feel like they were given a rough start, and I just like holding them and comforting them,” she said.
Pixabay: A newborn girl in the Neonatal Intensive Care Unit.
“Touch is so important to babies,” said Vicki Agnitsch, a former nurse now part of the 22-person Cuddler Volunteer program at Blank Children’s Hospital in Des Moines, Iowa. “Without that, there would be failure to thrive.”
Agnitsch said that the amount of cuddling and physical touch these infants receive correlates directly with fewer medications needing to be administered. The human connection provided through these programs supports the immune systems of babies born with NAS.
“When they know someone else is touching them, it gives them that warmth and safety and security that they crave,” she explained. “They had that inside the mom, and then they come out into this cold, bright world. They don’t have that, so all of that swaddling, touch, and talk helps their development.”
Agnitsch said that the simple act of spending a few hours per week with newborns with NAS can help course-correct the very direction of their early lives. She also said that the Cuddler Volunteer program, which she’s been a part of since 2011, is “the best part of my week.”
She doesn’t seem to be the only one who finds catharsis in that, as the Blank Children’s Hospital Cuddler Volunteer program — one of many across the country — has a two-year waiting list of volunteers.
Tennessee Department of Children’s Services: A volunteer cuddler at the East Tennessee Children’s Hospital.
Halfway across the country, Warrenton, Va.’s Fauquier Hospital has established a cuddler program of its own. Director of women’s services Cheryl Poelma told WTOP that infants born with NAS receive morphine shortly after birth to help assuage their withdrawal symptoms.
Babies in withdrawal “tend to be irritable, they aren’t coordinated with their suck, they can’t eat well, they can sneeze a lot, have loose stools — it’s all part of withdrawing,” she said. Fauquier Hospital decided to implement a two-pronged cuddler program in conjunction with the administration of morphine.
“They sit, and they rock infants and hold them tight,” she said. “They tend to like to have their hands close to their chests, they like a tight blanket swaddled around them. They also like to suck on pacifiers, so it’s rocking, sucking, keeping them in a quiet environment, reducing stimuli.”
Poelma explained that volunteer cuddlers have shown results in a matter of weeks.
“You’ll see them engaging you more, their eye contact will be better, they’ll start feeding better, not being so fussy, and they’ll start to sleep better,” she said.
Pixabay: A newborn being cuddled, wrapped in a blanket, 2015.
A study published in 2014 in the Biological Psychiatry journal suggested that infants born in the NICU formed healthier sleep habits and showed increased attention if they were regularly cuddled from birth.
The New York Presbyterian Brooklyn Methodist Hospital, UCI Health in Orange County, Calif., the Blank Children’s Hospital in Des Moines, Iowa — these programs are springing up all over the United States, and those are just the ones currently at capacity.
It’s proactive empathy like this that makes all the difference in the world — especially for those least able to help themselves.
Frailty is associated with a higher risk of both Alzheimer’s disease and its crippling symptoms, a new study shows.
“By reducing an individual’s physiological reserve, frailty could trigger the clinical expression of dementia when it might remain asymptomatic in someone who is not frail,” said study leader Dr. Kenneth Rockwood, a professor at Dalhousie University in Halifax, Canada.
“This indicates that a ‘frail brain’ might be more susceptible to neurological problems like dementia as it is less able to cope with the pathological burden,” he added.
The study included 456 adults in Illinois, aged 59 and older, who did not have Alzheimer’s when first enrolled in the Rush Memory and Aging Project. They underwent annual assessments of their mental and physical health, and their brains were examined after they died.
By their last assessment, 53 percent of the participants had been diagnosed with possible or probable Alzheimer’s disease.
For the physical assessments, the researchers created a frailty index using 41 components, including fatigue, joint and heart problems, osteoporosis, mobility and meal preparation abilities.
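An index of this kind (the deficit-accumulation approach associated with Rockwood’s group) is conventionally computed as the proportion of measured deficits that are present. Here is a minimal Python sketch under that assumption; the participant’s item values are entirely hypothetical:

```python
def frailty_index(deficits):
    """Deficit-accumulation frailty index: the proportion of measured
    deficits present (0 = absent, 1 = present; partial deficits may be
    graded between 0 and 1)."""
    if not deficits:
        raise ValueError("need at least one deficit measurement")
    return sum(deficits) / len(deficits)

# Hypothetical participant scored on 41 items (fatigue, joint problems, ...)
participant = [1, 0, 1, 0.5] + [0] * 37   # 2.5 deficits out of 41
print(round(frailty_index(participant), 3))  # → 0.061
```

A higher value simply means a larger share of the measured health deficits are present, which is how “levels of frailty” in the study should be read.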
Overall, 8 percent of the participants had significant Alzheimer’s disease-related brain changes without having been diagnosed with dementia, and 11 percent had Alzheimer’s but little evidence of disease-related brain changes.
Those with higher levels of frailty were more likely to have both Alzheimer’s disease-related brain changes and symptoms of dementia, while others with substantial brain changes, but who were not frail, had fewer symptoms of the disease.
After adjusting for age, sex and education, the researchers concluded that frailty and Alzheimer’s disease-related brain changes independently contribute to dementia, though they could not prove that frailty caused Alzheimer’s and its symptoms.
The investigators also said there was a significant association between frailty and Alzheimer’s-related brain changes after they excluded activities of daily living from the frailty index and adjusted for other risk factors such as stroke, heart failure, high blood pressure and diabetes.
The study was published Jan. 17 in The Lancet Neurology journal.
“This is an enormous step in the right direction for Alzheimer’s research,” Rockwood said in a journal news release. “Our findings suggest that the expression of dementia symptoms results from several causes, and Alzheimer’s disease-related brain changes are likely to be only one factor in a whole cascade of events that lead to clinical symptoms.”
Understanding frailty could help predict and prevent dementia, Dr. Francesco Panza, from the University of Bari Aldo Moro in Italy, wrote in an accompanying editorial.
In a separate study, researchers followed 161 older adults for five years and found that those with the most severe memory declines had the greatest leakage in their brain’s blood vessels, regardless of whether the Alzheimer’s-related proteins amyloid and tau were present.
The findings could help with earlier diagnosis of Alzheimer’s and suggest a new drug target for slowing down or preventing the disease, according to the researchers from the University of Southern California.
“The fact that we’re seeing the blood vessels leaking, independent of tau and independent of amyloid, when people have cognitive [mental] impairment on a mild level, suggests it could be a totally separate process or a very early process,” said study senior author Dr. Berislav Zlokovic. He is director of the Zilkha Neurogenetic Institute at the university’s Keck School of Medicine in Los Angeles.
“That was surprising, that this blood-brain barrier breakdown is occurring independently,” Zlokovic added in a university news release.
The blood-brain barrier prevents harmful substances from reaching brain tissue. In some people, this barrier weakens with age.
“If the blood-brain barrier is not working properly, then there is the potential for damage,” explained study co-author Arthur Toga, who is director of the Stevens Neuroimaging and Informatics Institute at Keck.
“It suggests the vessels aren’t properly providing the nutrients and blood flow that the neurons need. And you have the possibility of toxic proteins getting in,” Toga said.
“The results were really kind of eye-opening,” said study first author Daniel Nation, an assistant professor of psychology. “It didn’t matter whether people had amyloid or tau pathology; they still had cognitive impairment.”
The findings were published recently in the journal Nature Medicine.
The next step in this research is to determine how soon mental decline occurs after damage to brain blood vessels.
The number of Americans with Alzheimer’s is expected to nearly triple to about 14 million by 2060, according to the U.S. Centers for Disease Control and Prevention.
An artificial intelligence (AI) system can analyze chest X-rays and spot patients who should receive immediate care, researchers report.
The system could also reduce backlogs in hospitals someday. Chest X-rays account for 40 percent of all diagnostic imaging worldwide, and there can be large backlogs, according to the researchers.
“Currently, there are no systematic and automated ways to triage chest X-rays and bring those with critical and urgent findings to the top of the reporting pile,” explained study co-author Giovanni Montana. He is formerly of King’s College London and is now at the University of Warwick in Coventry, England.
Montana and his colleagues used more than 470,300 adult chest X-rays to develop an AI system that could identify unusual results.
The system’s performance in prioritizing X-rays was assessed in a simulation using a separate set of 15,887 chest X-rays. All identifying information was removed from the X-rays to protect patient privacy.
The system was highly accurate in distinguishing abnormal from normal chest X-rays, researchers said. Simulations showed that with the AI system, critical findings received an expert radiologist opinion within an average of 2.7 days, compared with an average of 11.2 days in actual practice.
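The triage idea itself, reading the most likely abnormal films first, amounts to a priority queue keyed on the model’s abnormality score. The sketch below illustrates that queueing step only; the classifier is replaced by a toy score, and the field names are invented for illustration:

```python
import heapq

def triage(exams, score_fn):
    """Yield exams most-abnormal-first. score_fn(exam) returns an
    abnormality score in [0, 1] from some (hypothetical) classifier."""
    # Negate scores so Python's min-heap pops the highest score first;
    # the index breaks ties without comparing the exam dicts themselves.
    heap = [(-score_fn(e), i, e) for i, e in enumerate(exams)]
    heapq.heapify(heap)
    while heap:
        neg_score, _, exam = heapq.heappop(heap)
        yield exam, -neg_score

# Toy stand-in for model output on three queued exams
exams = [{"id": "A", "score": 0.12},
         {"id": "B", "score": 0.91},
         {"id": "C", "score": 0.47}]
ordered = [e["id"] for e, s in triage(exams, lambda e: e["score"])]
print(ordered)  # → ['B', 'C', 'A']
```

In the reported simulation, reordering the reading pile this way is what moved critical findings from an 11.2-day average wait to 2.7 days.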
The study results were published Jan. 22 in the journal Radiology.
“The initial results reported here are exciting as they demonstrate that an AI system can be successfully trained using a very large database of routinely acquired radiologic data,” Montana said in a journal news release.
“With further clinical validation, this technology is expected to reduce a radiologist’s workload by a significant amount by detecting all the normal exams, so more time can be spent on those requiring more attention,” he added.
The researchers said the next step is to test a much larger number of X-rays and to conduct a multi-center study to assess the AI system’s performance.
The mainstream media is largely funded by drug companies and vaccine manufacturers and demonstrates extreme conflicts of interest in reporting on vaccines. Perhaps that’s why dishonest media outlets refuse to report the following ten stunning facts about the vaccine industry that are all provably true.
FACT #1) Mercury is still used in vaccines, and the CDC openly admits it. There is NO safe level of mercury for injecting into a human child. Not even “trace” levels. There is NO evidence of safety for mercury at any dose whatsoever. Any doctor who says the level of mercury in a vaccine is “safe” to inject into a child is only demonstrating their outrageous ignorance of scientific facts.
Mercury is arguably the most neurotoxic element on the entire Table of Elements. It is used in vaccines for the convenience of the vaccine manufacturer at the expense of the safety of the child. Any doctor who injects mercury into a child — at any dose! — should be immediately stripped of their medical license.
See the list of studies on the neurotoxicity of mercury at SCIENCE.naturalnews.com, now the largest relational research resource for chemicals, health, nutrients and drugs.
Additional FACT: There is no “safe” form of mercury as is often ridiculously claimed by vaccine pushers. Both ethyl and methyl mercury are extremely toxic to the human nervous system. Neither should, under ANY circumstances, be deliberately injected into a human child at any dose whatsoever.
FACT #2) Injecting any substance into the human body makes it orders of magnitude more potentially toxic because it bypasses the protections of the digestive tract or the respiratory system. Injecting mercury into a human being — at any dose — should be globally condemned as a criminal act. That it is currently considered an acceptable act in the field of medicine only condemns the true destructive nature of modern medicine. Under the vaccine doctrine, “First do no harm” has become “Poison children for profit.”
FACT #4) Top virologists working for Merck have blown the whistle and gone public with shocking revelations that claim the company routinely fabricated lab results to claim a 95% efficacy rate of its mumps vaccine in order to continue receiving government contracts on a vaccine that didn’t work.
As these charts show, measles was almost completely eradicated before the arrival of the measles vaccine. Why the decline? Mostly due to improvements in public hygiene and sanitation. It’s no exaggeration to say that good plumbing saves more lives than vaccines ever did.
FACT #7) The vaccine industry refuses to conduct scientific tests on the health outcomes of vaccinated children vs. unvaccinated children. Why? Because these tests would no doubt show unvaccinated children to be healthier, smarter and far better off than vaccinated children in terms of behavioral disorders, allergies and even autoimmune disorders. Check the people you know: Don’t you routinely find that the most heavily-vaccinated kids are the ones who get sick all the time? Meanwhile, groups like the Amish who largely refuse to vaccinate their children have near-zero rates of autism.
FACT #8) The U.S. Supreme Court has already declared that the secret “vaccine court” is a higher power than the Supreme Court. The so-called “vaccine court” is granted extraordinary powers to operate utterly outside the Constitution, the Bill of Rights and completely outside the rules of due process and law.
The vaccine court itself — which isn’t even a court of law — is a violation of law and a violation of basic human rights. It must be abolished like Apartheid.
FACT #9) The mainstream media receives a significant portion of its revenues from the very same drug companies selling vaccines. This financial influence results in the media refusing to cover stories about vaccine-damaged children for fear of losing advertising revenues.
This is why the mainstream media frequently features guests and authors who ridiculously claim that all the vaccine damaged children across America do not exist or are “mere delusions” of their parents. These despicable vaccine apologists are intellectual bullies who, like Hitler’s minions, relish in aiding and abetting a real-life holocaust that’s harming millions of children around the globe.
All of these substances are toxic to human biology when injected. All of them are still listed on the CDC website as vaccine additives. There is no rational doctor or scientist in the world who can say they believe injecting infants and children with mercury, formaldehyde, MSG and aluminum is somehow “safe,” yet doctors inject children with these substances every single day in the form of vaccines.
Doctors who inject children with vaccines are delusional. They are practicing a medical holocaust against humanity while fraudulently calling it “immunization.” For the record, vaccination does not equal immunization. Click here to see the book of the same title.
Corporate-controlled “science” isn’t real science at all
The real truth is that science never has a monopoly on facts, and science makes enormous mistakes (such as condoning cigarette smoking) on a regular basis. Science is also for sale and easily corrupted by corporate interests.
Peer-reviewed science journals, too, are often little more than a collection of corporate-funded make-believe science tabloids. “It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines,” writes the former editor of The New England Journal of Medicine, Marcia Angell.
“I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine,” she says in Drug Companies & Doctors: A Story of Corruption.
With that in mind, take a look at the similarities between Big Tobacco science lies and vaccine industry science lies:
Let me guess, part of your New Year’s Resolution was to “exercise more”. What does that REALLY mean? How do you know if you are doing the right exercises? If you are doing it “more”, are you really getting the benefit you want from said exercise? Are you enjoying the exercising you are doing, or are you just making yourself move because you think this is the way to get that super cute bikini beach body you think you are supposed to have?
Being active DOES NOT mean that you need to work out 6 days a week for an hour at each session. That simply gets you to burn out! It’s a hefty goal if you are just beginning an exercise routine, and it creates an unhealthy relationship between you and the joy that movement can, and should, be. I’ve pushed myself to do those workouts and my body did not appreciate it. Some people enjoy this type of torture, but I’m going to venture that this is not you.
Why Exercise Even If You Don’t Feel Like It
So, you ask, how long should I be working out and what kinds of exercise should I be doing? This is a great question! Before I answer this, I’d love to take you back to your childhood.
Yes, you read that right, and just bear with me here. When you were a child, you didn’t play because you thought that you HAD to. You did this because it was fun and you got enjoyment out of whatever it was that you were doing.
Picture this time. What were you doing? Who were you with? What is the best part of what you were doing? For me, it was playing basketball and kick the can with my neighbors.
So, how does that translate into what you could or should be doing for workouts now? Working out should be fun! It should bring you joy. If it doesn’t, then it is just more WORK that we are adding to a busy, already stressful day.
Your workouts should be a release of negativity and toxins. They should be a way to clear your mind and body. Our bodies were not meant to be stagnant, but we also don’t need to torture them!
If you want to make working out a healthy and fun change in your world, I have some suggestions for you. Stop talking to yourself as if this is a chore. If you are telling yourself negative things about working out, you are creating a path of resistance to move. If you find something that you love doing, you won’t feel as if it is such a chore to go do it. You won’t hate doing a workout for 15, 20, 40, 60 minutes, or however long feels good to you. I’m also going to venture to say that some of your good friends would also enjoy doing these workouts with you. Grab your tribe, or someone from your tribe and get moving together!
Your workouts don’t have to take as much time as you think. If you’ve been paying attention, you’ve heard that this should be fun and enjoyable. It will cease to be fun if you are pushing yourself to keep going longer than feels good. So, do your running, skipping, swimming, yoga or whatever it is, for a shorter period of time to start and build up your love for it as you go.
Movement can be a gentle activity such as going for a walk. Bonus points if it is outside in the fresh air and you get some vitamin D! It could be simply having a dance party to one of your favorite songs. Did you realize you can count cleaning your home as exercise? Score! Any time you are moving your body, give yourself some accolades for a job well done!
The problem with these “New Year’s Resolutions” is that we come into them hot out of the gate and want results NOW! We push ourselves to the limit and get burned out just as quickly as we started. Lasting change does not happen when we burn the candle at both ends. It happens when we make meaningful, loving changes.
So I challenge you, my dear. Go back to a time when you enjoyed moving your body. Stop beating yourself up thinking that you need to hit the gym 5-6 times a week in order to see any change. Let yourself find and enjoy a new way of moving that makes you happy. Life is too damn short to keep stressing ourselves out over every little thing.
Love what your body can do. Love what your mind can do. Love yourself.
A few drinks can relax you – but, says scientist David Nutt, that morning-after feeling is the booze playing tricks with your brain
If you are looking forward to your first stiff drink after a dry January, be warned: it may feel bittersweet. You may feel you deserve an alcoholic beverage after toughing it out all month – but have you forgotten what it feels like to wake up haunted by worries about what you said or did the night before? These post-drinking feelings of guilt and stress have come to be known colloquially as “hangxiety”. But what causes them?
David Nutt, professor of neuropsychopharmacology at Imperial College, London, is the scientist who was fired in 2009 as the government’s chief drug adviser for saying alcohol is more dangerous than ecstasy and LSD. I tell him I have always assumed my morning-after mood was a result of my brain having shrivelled like a raisin through alcohol-induced dehydration. When Nutt explains the mechanics of how alcohol causes crippling anxiety, he paints an even more offputting picture.
Alcohol, he says, targets the Gaba (gamma-aminobutyric acid) receptor, which sends chemical messages through the brain and central nervous system to inhibit the activity of nerve cells. Put simply, it calms the brain, reducing excitement by making fewer neurons fire. “Alcohol stimulates Gaba, which is why you get relaxed and cheerful when you drink,” explains Nutt.
The first two drinks lull you into a blissful Gaba-induced state of chill. When you get to the third or fourth drink, another brain-slackening effect kicks in: you start blocking glutamate, the main excitatory transmitter in the brain. “More glutamate means more anxiety,” says Nutt. “Less glutamate means less anxiety.” This is why, he says, “when people get very drunk, they’re even less anxious than when they’re a bit drunk” – not only does alcohol reduce the chatter in your brain by stimulating Gaba, but it further reduces your anxiety by blocking glutamate. In your blissed-out state, you will probably feel that this is all good – but you will be wrong.
The body registers this new imbalance in brain chemicals and attempts to put things right. It is a little like when you eat a lot of sweets and your body goes into insulin-producing overdrive to get the blood sugar levels down to normal; as soon as the sweets have been digested, all that insulin causes your blood sugar to crash. When you are drunk, your body goes on a mission to bring Gaba levels down to normal and turn glutamate back up. When you stop drinking, therefore, you end up with unnaturally low Gaba function and a spike in glutamate – a situation that leads to anxiety, says Nutt. “It leads to seizures as well, which is why people have fits in withdrawal.”
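The compensate-then-overshoot dynamic Nutt describes can be caricatured in a few lines of code. To be clear, this is a toy model with made-up constants, not physiology; it only shows the qualitative pattern: while drinking, Gaba function sits above baseline and glutamate below it, and once drinking stops, the accumulated compensation flips both past baseline in the other direction.

```python
def rebound(drinking_steps, total_steps=6):
    """Toy homeostasis model of the Gaba/glutamate seesaw.
    Alcohol pushes Gaba function up and glutamate down; a slow
    compensatory adaptation builds in the opposite direction; when
    drinking stops, the leftover compensation produces an overshoot.
    All constants are illustrative only."""
    comp, trace = 0.0, []
    for t in range(total_steps):
        alcohol = 1.0 if t < drinking_steps else 0.0  # drinking, then abstinent
        comp += 0.3 * alcohol - 0.1 * comp            # compensation builds, then decays
        gaba = 1.0 + alcohol - comp                   # baseline function is 1.0
        glutamate = 1.0 - alcohol + comp
        trace.append((round(gaba, 2), round(glutamate, 2)))
    return trace

for gaba, glutamate in rebound(drinking_steps=3):
    print(gaba, glutamate)
```

Running it, the first three steps show high Gaba and low glutamate (the relaxed, drunk state); from step four, Gaba drops below 1.0 and glutamate rises above it and only slowly decays back — the low-Gaba, high-glutamate window the text identifies with anxiety.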
It can take the brain a day or two to return to the status quo, which is why a hair of the dog is so enticing. “If you drank an awful lot for a long time,” says Nutt, “it might take weeks for the brain to readapt. In alcoholics, we’ve found changes in Gaba for years.”
To add to the misery, the anxiety usually kicks in while you are trying to sleep off the booze. “If you measure sleep when people are drunk, they go off to sleep fast. They go into a deeper sleep than normal, which is why they sometimes wet the bed or have night terrors. Then, after about four hours, the withdrawal kicks in – that’s when you wake up all shaky and jittery.”
Imbalances in Gaba and glutamate are not the only problem. Alcohol also causes a small rise in noradrenaline – known as the fight-or-flight hormone. “Noradrenaline suppresses stress when you first take it, and increases it in withdrawal,” says Nutt. “Severe anxiety can be considered a surge of noradrenaline in the brain.”
Another key cause of hangxiety is being unable to remember the mortifying things you are sure you must have said or done while inebriated – another result of your compromised glutamate levels. “You need glutamate to lay down memories,” says Nutt, “and once you’re on the sixth or seventh drink, the glutamate system is blocked, which is why you can’t remember things.”
If this isn’t ringing any bells, it may be because hangxiety does not affect us all equally, as revealed by a study published in the journal Personality and Individual Differences. Researchers quizzed healthy young people about their levels of anxiety before, during and the morning after drinking alcohol. According to one of the authors, Celia Morgan, professor of psychopharmacology at the University of Exeter: “The people who were more shy had much higher levels of anxiety [the following day] than the people who weren’t shy.” The team also found a correlation between having bad hangxiety and the chance of having an alcohol use disorder. “Maybe it’s playing a role in keeping problematic drinking going,” says Morgan.
One theory as to why very shy people might be more at risk of hangxiety and alcoholism is the possibility that alcohol’s seesaw effect on Gaba levels is more pronounced in them. Their baseline Gaba levels may be lower to start with, says Morgan. “It could also be a psychological effect – people who are more highly anxious are more prone to rumination, going over thoughts about the night before, so that’s another potential mechanism.”
However, the study’s findings have wider implications – after all, most drinkers lean on alcohol as social lubrication to some degree.
The bad news is that there seems to be little you can do to avoid hangxiety other than to drink less, and perhaps take painkillers – they will at least ease your headache. “Theoretically, ibuprofen would be better than paracetamol,” says Nutt, “because it’s more anti-inflammatory – but we don’t know how much of the hangover is caused by inflammation. It’s something we’re working on, trying to measure that.”
Morgan suggests trying to break the cycle. “Before drinking in a social situation you feel anxious in, try fast-forwarding to the next day when you’ll have much higher anxiety levels. If you can’t ride that out without drinking, the worry is that you will get stuck in this cycle of problematic drinking where your hangxiety is building and building over time. Drinking might fix social anxiety in the short term, but in the long term it might have pretty detrimental consequences.” Exposure therapy is a common treatment for phobias, where you sit with your fear in order to help you overcome it. “By drinking alcohol, people aren’t giving themselves a chance to do that,” says Morgan.
But there might be hope for the future. Nutt is involved in a project to develop a drink that takes the good bits of alcohol and discards the damaging or detrimental effects. “Alcosynth”, as it is currently called, drowns your sorrows in the same way as alcohol, but without knocking the Gaba and glutamate out of kilter. “We’re in the second stage of fundraising to take it through to a product,” he says. “The industry knows [alcohol] is a toxic substance. If it was discovered today, it would be illegal as a foodstuff.”
Until Alcosynth reaches the market, Nutt says his “strong” message is: “Never treat hangxiety with a hair of the dog. When people start drinking in the mornings to get over their hangxiety, then they’re in the cycle of dependence. It’s a very slippery slope.”
City council chambers and local officials in the US are facing an outcry from residents frightened by next-generation 5G wireless communications, which, by all accounts, will be taking over neighborhoods soon.
At a county public meeting in Montgomery County, Maryland, one resident raised her voice to ask local officials: “Why can’t we do a real health assessment here and find out what the real health risks are — to our children?”
What are the risks? More to the point, what is 5G?
What is 5G?
The 5th generation wireless systems (5G) are new network technologies designed to make your cell phone and similar wireless devices super-duper powerful and fast.
Scheduled for deployment from 2018 and commercial availability in 2020, 5G, we are told, is expected to support at least 100 billion devices at speeds up to 100 times faster than current 4G technology. (4G is already about 10 times faster than 3G.)
The 5G tech will employ low- (0.6–3.7 GHz), mid- (3.7–24 GHz), and high-band frequencies (24 GHz and higher). The “high-band” frequencies largely consist of millimeter waves (MMWs), a type of electromagnetic radiation with wavelengths between 1 and 10 millimeters and frequencies ranging from 30 to 300 GHz.
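The “millimeter wave” label follows directly from the free-space wavelength formula λ = c / f, and the 30–300 GHz figures check out; a quick sanity check in Python:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_mm(freq_ghz):
    """Free-space wavelength in millimeters for a frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 1000  # meters → millimeters

print(round(wavelength_mm(30), 2))   # → 9.99 (mm, low end of the band)
print(round(wavelength_mm(300), 2))  # → 1.0 (mm, high end of the band)
```

So 30 GHz corresponds to a roughly 10 mm wave and 300 GHz to a 1 mm wave, which is exactly why this band is called millimeter-wave radiation.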
Health Hazards from Cell Phone Technology “Beyond Measure”
Cell phones operate essentially by sending and receiving radiofrequency radiation from their antennas to a nearby cell tower.
5G Technology Comes With Increased RF Radiation Exposure
The millimeter waves (MMWs) used by the 5G network can transmit large amounts of data in a short period of time, but only over short distances. The other big issue is that the signal travels poorly through solid materials.
This means massive transmission of MMW will be needed.
Many new antennas will be needed. We are told full-scale implementation may require at least one antenna for every 10 to 12 houses in urban areas.
Also, MIMO (multiple-input multiple-output) technology is expected to be used extensively. MIMO is a wireless technique that uses multiple transmitters and receivers, so it can send and receive more data at once. Some 4G base stations already use MIMO. Standard MIMO involves four to eight antennas; MIMO for 5G may involve approximately 100 antennas per cell tower – that’s a lot of antennas!
Increased transmission leads to increased capacity, so electromagnetic radiation levels can only increase. The concern is that, given what we know about radio frequency radiation, this mandatory environmental increase in exposure to EM radiation will lead to increased health risks.
A number of studies have demonstrated the detrimental health effects of the MMW frequencies used in 5G technology.
Damaging Effects on the Human Skin
One Israeli study, led by Dr. Yuri D Feldman, found that human sweat ducts act as an array of tiny, helix-shaped antennas when exposed to MMWs. The findings suggest that human skin not only absorbs but also amplifies the radiation from MMW networks.
A study carried out to evaluate the interactions and implications of MMWs (60GHz) with the human body discovered that “more than 90% of the transmitted (MMWs) power is absorbed in the epidermis and dermis layer.”
The effect of MMWs on the skin is arguably the greatest concern of these new wavelengths utilized by 5G technology.
We might well be looking at the possibility of increased incidences of many skin diseases and cancer in the coming years in areas where the 5G technology is deployed.
Profound Effect On Immune System
A 2002 Russian study, carried out to examine the effects of high-frequency electromagnetic radiation (42 GHz) exposure on the blood of healthy mice, found that the activity of cells involved in immunity, such as neutrophils, decreased drastically (about a 50% reduction in activity).
It was concluded that “the whole-body exposure of healthy mice to low-intensity EHF EMR has a profound effect on the indices of nonspecific immunity.”
Damaging Effects on The Heart
A 1992 study found that frequencies in the 53–78 GHz range affected heart rate variability (an indicator of stress) in rats. A Russian study of frogs whose skin was exposed to MMWs found abnormal heart rate changes (arrhythmias).
Hazardous Effects on the Eyes
In 1994, a study carried out in Poland evaluated the influence of millimeter radiation on light transmission through the lens of the eye. It found that low-level MMW radiation produced lens opacity in rats, a condition associated with the formation of cataracts.
A Japanese experiment carried out to examine the potential for 60-GHz millimeter-wave exposure to cause acute ocular injuries found that 60GHz “…millimeter-wave antennas can cause thermal injuries of varying types of levels. The thermal effects induced by millimeter waves can apparently penetrate below the surface of the eye.”
180 Scientists and Doctors Call For A Moratorium
Scientists are concerned as well. More than 180 scientists and doctors from 35 countries have recommended a temporary ban on the roll-out of 5G technology until its potential hazards to human health and the environment have been fully evaluated by scientists independent of the telecommunications industry.
What Are The Real Dangers Of 5G Technology?
The short answer is: we don’t fully know yet! But the studies we have on this are a cause for concern.
The health hazards of the most-studied 3G CDMA technology (shown to cause an array of detrimental health effects) have not been fully revealed, yet here we are, on the verge of adopting a potentially more dangerous technology.
Don’t you think we should fully evaluate the health effects of 5G before rolling out the technology?
Let’s not forget, alternatives to wireless mobile technology are available. Fiber Optic Broadband Technology is a feasible and safer alternative. I firmly believe that technological improvement can be attained without jeopardizing the health of the general public.
3. Baan R, Grosse Y, Lauby-Secretan B, El Ghissassi F, Bouvard V, Benbrahim-Tallaa L, Guha N, Islami F, Galichet L, Straif K. Carcinogenicity of radiofrequency electromagnetic fields. Lancet Oncol [Internet]. 2011; 12: 624–6. doi: 10.1016/S1470-2045(11)70147-4.
4. Naziroǧlu M, Yüksel M, Köse SA, Özkaya MO. Recent reports of Wi-Fi and mobile phone-induced radiation on oxidative stress and reproductive signaling pathways in females and males [Internet]. Journal of Membrane Biology. 2013 [cited 2017 Dec 25]. p. 869–75. doi: 10.1007/s00232-013-9597-9.
5. Hayes DL, Wang PJ, Reynolds DW, Estes M, Griffith JL, Steffens RA, Carlo GL, Findlay GK, Johnson CM. Interference with cardiac pacemakers by cellular telephones. N Engl J Med [Internet]. Massachusetts Medical Society; 1997 [cited 2018 Feb 5]; 336: 1473–9. doi: 10.1056/NEJM199705223362101.
6. Divan HA, Kheifets L, Obel C, Olsen J. Prenatal and postnatal exposure to cell phone use and behavioral problems in children. Epidemiology [Internet]. 2008 [cited 2017 Dec 27]; 19: 523–9. doi: 10.1097/EDE.0b013e318175dd47.
7. Hutter HP, Moshammer H, Wallner P, Kundi M. Subjective symptoms, sleeping problems, and cognitive performance in subjects living near mobile phone base stations. Occup Environ Med [Internet]. BMJ Publishing Group Ltd; 2006 [cited 2018 Feb 5]; 63: 307–13. doi: 10.1136/oem.2005.020784.
8. Feldman Y, Puzenko A, Ben Ishai P, Caduff A, Agranat AJ. Human Skin as Arrays of Helical Antennas in the Millimeter and Submillimeter Wave Range. Phys Rev Lett [Internet]. 2008 [cited 2018 Mar 19]; 100: 128102. doi: 10.1103/PhysRevLett.100.128102.
10. Kolomytseva MP, Gapeev AB, Sadovnikov VB, Chemeris NK. Suppression of nonspecific resistance of the body under the effect of extremely high frequency electromagnetic radiation of low intensity. Biofizika [Internet]. 2002 [cited 2018 Mar 19]; 47: 71–7. Available from http://www.ncbi.nlm.nih.gov/pubmed/11855293
11. Potekhina IL, Akoev GN, Enin LD, Oleĭner VD. The effect of low-intensity millimeter-range electromagnetic radiation on the cardiovascular system of the white rat]. Fiziol Zh SSSR Im I M Sechenova [Internet]. 1992 [cited 2018 Mar 19]; 78: 35–41. Available from http://www.ncbi.nlm.nih.gov/pubmed/1330714
12. Chernyakov GM, Korochkin VL, Babenko AP, Bigdai E. Reactions of biological systems of various complexity to the action of low-level EHF radiation. Millim Waves Med Biol. 1989; 1: 141–167.
13. Kojima M, Hanazawa M, Yamashiro Y, Sasaki H, Watanabe S, Taki M, Suzuki Y, Hirata A, Kamimura Y, Sasaki K. Acute ocular injuries caused by 60-GHz millimeter-wave exposure. Health Phys [Internet]. 2009 [cited 2018 Mar 19]; 97: 212–8. doi: 10.1097/HP.0b013e3181abaa57.
Cheerios are the best-selling breakfast cereal in America. The multi-grain version contains 18 milligrams of iron per serving, according to the label. Like almost any refined food made with wheat flour, it is fortified with iron. As it happens, there’s not a ton of oversight in the fortification process. One study measured the actual iron content of 29 breakfast cereals, and found that 21 contained 20 percent more iron than the label value, and 8 contained 50 percent more.1 One contained nearly 200 percent of the label value.
If your bowl of cereal actually contains 20 percent more iron than advertised, that’s about 22 mg. A safe assumption is that people tend to consume at least two serving sizes at a time.1 That gets us to 44 mg. The recommended daily allowance of iron is 8 mg for men and 18 mg for pre-menopausal women. The tolerable upper intake—which is the maximum daily intake thought to be safe by the National Institutes of Health—is 45 mg for adults.
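The arithmetic above can be checked in a few lines. This is just a back-of-the-envelope sketch using the figures in the text (the label value, the 20 percent overage from the cereal study, and an assumed two servings); note that the article rounds 21.6 mg per serving up to 22, giving its 44 mg total:

```python
# Back-of-the-envelope iron arithmetic from the paragraphs above.
LABEL_MG = 18        # iron per serving, per the multi-grain label
OVERAGE = 1.20       # 20% more than the label value, per the cereal study
SERVINGS = 2         # assumed typical real-world portion

per_serving = LABEL_MG * OVERAGE     # 21.6 mg ("about 22 mg" in the text)
bowl_total = per_serving * SERVINGS  # 43.2 mg (~44 mg with rounding)

UPPER_LIMIT_MG = 45  # NIH tolerable upper intake for adults
print(f"one bowl: {bowl_total:.1f} mg of a {UPPER_LIMIT_MG} mg upper limit")
```

Even without rounding, a single bowl lands within about 2 mg of the adult tolerable upper intake, which is the point the passage is making.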
It is entirely feasible that an average citizen could get awfully close to exceeding the maximum daily iron intake regarded as safe with a single bowl of what is supposed to be a pretty healthy whole-grain breakfast option.
And that’s just breakfast.
At the same time that our iron consumption has grown to the borders of safety, we are beginning to understand that elevated iron levels are associated with everything from cancer to heart disease. Christina Ellervik, a research scientist at Boston Children’s Hospital who studies the connection between iron and diabetes, puts it this way: “Where we are with iron now is like where we were with cholesterol 40 years ago.”
The story of energy metabolism—the basic engine of life at the cellular level—is one of electrons flowing much like water flows from mountains to the sea. Our cells can make use of this flow by regulating how these electrons travel, and by harvesting energy from them as they do so. The whole set-up is really not so unlike a hydroelectric dam.
The sea toward which these electrons flow is oxygen, and for most of life on earth, iron is the river. (Octopuses are strange outliers here—they use copper instead of iron, which makes their blood greenish-blue rather than red). Oxygen is hungry for electrons, making it an ideal destination. The proteins that facilitate the delivery contain tiny cores of iron, which manage the handling of the electrons as they are shuttled toward oxygen.
This is why iron and oxygen are both essential for life. There is a dark side to this cellular idyll, though.
Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells.
Normal energy metabolism in cells produces low levels of toxic byproducts. One of these byproducts is a derivative of oxygen called superoxide. Luckily, cells contain several enzymes that clean up most of this leaked superoxide almost immediately. They do so by converting it into another intermediary called hydrogen peroxide, which you might have in your medicine cabinet for treating nicks and scrapes. The hydrogen peroxide is then detoxified into water and oxygen.
Things can go awry if either superoxide or hydrogen peroxide happen to meet some iron on the way to detoxification. What then happens is a set of chemical reactions (described by Haber-Weiss chemistry and Fenton chemistry) that produce a potent and reactive oxygen derivative known as the hydroxyl radical. This radical—also called a free radical—wreaks havoc on biological molecules everywhere. As the chemists Barry Halliwell and John Gutteridge—who wrote the book on iron biochemistry—put it, “the reactivity of the hydroxyl radicals is so great that, if they are formed in living systems, they will react immediately with whatever biological molecule is in their vicinity, producing secondary radicals of variable reactivity.”2
Such is the Faustian bargain that has been struck by life on this planet. Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells. As the neuroscientist J.R. Connor has said, “life was designed to exist at the very interface between iron sufficiency and deficiency.”3
Hemoglobin, ferritin, and transferrin
At the end of the 20th century, the metabolism of iron in the human body was still a bit of a mystery. Scientists knew of only two ways that the body could excrete iron—bleeding, and the routine sloughing of skin and gastrointestinal cells. But these processes amount to only a few milligrams per day. That meant that the body must have some way to tightly regulate iron absorption from the diet. In 2000 a major breakthrough was announced—a protein was found that functioned as the master regulator for iron. The system, as so many biological systems are, is perfectly elegant. When iron levels are sufficient, the protein, called hepcidin, is secreted into the blood by the liver. It then signals to gastrointestinal cells to decrease their absorption of iron, and for other cells around the body to sequester their iron into ferritin, a protein that stores iron. When iron levels are low, blood levels of hepcidin fall, and intestinal cells begin absorbing iron again. Hepcidin has since become recognized as the principal governor of iron homeostasis in the human body.
But if hepcidin so masterfully regulates absorption of iron from the diet to match the body’s needs, is it possible for anyone to absorb too much iron?
In 1996, a team of scientists announced that they had discovered the gene responsible for hereditary hemochromatosis, a disorder causing the body to absorb too much iron. They called it HFE. Subsequent work revealed that the product of the HFE gene was instrumental in regulating hepcidin. People with a heritable mutation in this gene effectively have a gross handicap in the entire regulatory apparatus that hepcidin coordinates.
This, then, leaves open the possibility that some of us could in fact take in more iron than the body is able to handle. But how common are these mutations? Common enough to matter for even a minority of people reading these words?
Surprisingly, the answer is yes. The prevalence of hereditary hemochromatosis, in which two defective copies of the HFE gene are present and there are clinical signs of iron overload, is actually pretty high—as many as 1 in 200 in the United States. And perhaps 1 in 40 may have two defective HFE genes without overt hemochromatosis.4 That’s more than 8 million Americans who could have a significant short-circuit in their ability to regulate iron absorption and metabolism.
What if you have only one defective HFE gene, and one perfectly normal gene? This is called heterozygosity. We would expect to find more people in this situation than the homozygotes, or those with two bad copies of the gene. And in fact we do. Current estimates suggest that more than 30 percent of the U.S. population could be heterozygotes with one dysfunctional HFE gene.4 That’s pretty close to 100 million people.
Does this matter? Or is one good gene enough? There isn’t much research, but so far the evidence suggests that some heterozygotes do have impaired iron metabolism. Studies have shown that HFE heterozygotes seem to have modest elevations of ferritin as well as transferrin, a protein which chaperones iron through the blood, which would indicate elevated levels of iron.5,6 And a study published in 2001 concluded that HFE heterozygotes may have up to a fourfold increased risk of developing iron overload.4
A host of research articles have supported an association between iron and cancer.
Perhaps more concerning is that these heterozygotes have also been shown to be at increased risk for several chronic diseases, like heart disease and stroke. One study found that heterozygotes who smoked had a 3.5 times greater risk of cardiovascular disease than controls, while another found that heterozygosity alone significantly increased the risk of heart attack and stroke.7,8 A third study found that heterozygosity increased nearly sixfold the risk of cardiomyopathy, which can lead to heart failure.9
The connection between excessive iron and cardiovascular disease may extend beyond HFE heterozygotes. A recent meta-analysis identified 55 studies of this connection that were rigorous enough to meet their inclusion criteria. Out of 55 studies, 27 supported a positive relationship between iron and cardiovascular disease (more iron equals more disease), 20 found no significant relationship, and 8 found a negative relationship (more iron equals less disease).10
A few highlights: a Scandinavian study compared men who suffered a heart attack to men who didn’t, and found that elevated ferritin levels conferred a two- to threefold increase in heart attack risk. Another found that having a high ferritin level made a heart attack five times more likely than having a normal level. A larger study of 2,000 Finnish men found that an elevated ferritin level increased the risk of heart attack twofold, and that every 1 percent increase in ferritin level conferred a further 4 percent increase in that risk. The only other risk factor found to be stronger than ferritin in this study was smoking.
Ferritin isn’t a perfect marker of iron status, though, because it can also be affected by anything that causes inflammation. To address this problem a team of Canadian researchers directly compared blood iron levels to heart attack risk, and found that higher levels conferred a twofold increased risk in men and a fivefold increased risk in women.
If cardiovascular disease is one point in iron’s web of disease, diabetes may be another. The first hint of a relationship between iron and diabetes came in the late 1980s, when researchers discovered that patients receiving regular blood transfusions (which contain quite a bit of iron) were at significantly increased risk of diabetes. In hemochromatosis, there had been no way to know if the associated disturbance in glucose metabolism was due to the accumulation of iron itself, or to the underlying genetic defect. This new link between frequent transfusions and diabetes was indirect evidence that the iron itself may be the cause.
The next step was to mine existing data for associations between markers of iron status and diabetes. The first study to do so came out of Finland in 1997: Among 1,000 randomly selected Scandinavian men, ferritin emerged as a strong predictor of dysfunctional glucose metabolism, second only to body mass index as a risk factor.11 In 1999, researchers found that an elevated ferritin level increased the odds of having diabetes fivefold in men and nearly fourfold in women—similar in magnitude to the association between obesity and diabetes.12 Five years later, another study found that elevated ferritin roughly doubled the risk for metabolic syndrome, a condition that often leads to diabetes, hypertension, liver disease, and cardiovascular disease.13
Christina Ellervik’s first contribution to the field came in 2011, with a study investigating the association between increased transferrin saturation—a measure of how much iron is loaded onto the transferrin protein, which moves iron through the blood—and diabetes risk.14 Ellervik found that within a sample of nearly 35,000 Danes, transferrin saturation greater than 50 percent conferred a two- to threefold increased risk of diabetes. She also identified an increase in mortality rates with transferrin saturation greater than 50 percent.
In 2015, she led another study that found that, among a sample of 6,000 people, those whose ferritin levels were in the highest 20 percent had 4 times greater odds of diabetes than those with ferritin levels in the lowest 20 percent.15 Blood glucose levels, blood insulin levels, and insulin sensitivity all were raised with higher ferritin levels.
“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials.”
There’s a problem here, though. All of these studies show associations. They show that two things tend to happen together. But they don’t tell us anything about causality. To learn something about causality, you need an intervention. In the case of iron, you’d need to lower the iron and then watch what happens. Fortunately, there’s a very easy and very safe intervention to lower iron levels that’s performed millions of times every year—phlebotomy, also known as blood donation.
One of the first studies to use phlebotomy to examine the relationship between iron and diabetes was published in 1998.16 The authors found that among both healthy and diabetic subjects, phlebotomy improved insulin sensitivity and glucose metabolism. A 2005 study found that regular blood donors exhibited lower iron stores and significantly greater insulin sensitivity than non-donors.17 In 2012, researchers phlebotomized pre-diabetic volunteers until their ferritin levels dropped significantly, and found a marked subsequent improvement in their insulin sensitivity.18 In that same year, a different group of scientists studied the effect of phlebotomy on several elements of metabolic syndrome, including glucose metabolism. They found that a single phlebotomy session was associated with improvement in blood pressure, fasting glucose, hemoglobin A1C (a marker for average glucose levels), and blood cholesterol six weeks later.19
Many caveats apply to this evidence—the line between correlation and causation remains unclear, some of the studies used relatively small sample sizes, and phlebotomy may cause other changes in addition to lowering iron. But taken together, the data lends weight to the idea that iron plays a significant role in the tortuous pathophysiology of diabetes.
As more published data began to suggest a relationship between iron, cardiovascular disease, and diabetes, researchers started casting broader nets.
Next up was cancer.
It had been known since the late 1950s that injecting large doses of iron into lab animals could cause malignant tumors, but it wasn’t until the 1980s that scientists began looking for associations between iron and cancer in humans. In 1985, Ernest Graf and John Eaton proposed that differences in colon cancer rates among countries could be accounted for by the variation in the fiber content of local diets, which can in turn affect iron absorption.20
The following year, Richard Stevens found that elevated ferritin was associated with triple the risk of death from cancer among a group of 20,000 Chinese men.21 Two years later Stevens showed that American men who developed cancer had higher transferrin saturation and serum iron than men who didn’t.22 In 1990, a large study of Swedish blood donors found that they were 20 percent less likely to get cancer than non-donor controls.23 Four years later, a group of Finnish researchers found that elevated transferrin saturation among 40,000 Scandinavians conferred a threefold increased risk for colorectal cancer, and a 1.5-fold increased risk for lung cancer.24
A host of research articles have been published since Graf and Eaton’s first paper, and most have supported an association between iron and cancer—particularly colorectal cancer. In 2001, a review of 33 publications investigating the link between iron and colorectal cancer found that more than 75 percent of them supported the relationship.25 A 2004 study found an increased risk of death from cancer with rising serum iron and transferrin saturation. People with the highest levels were twice as likely to die from cancer than those with the lowest levels.26 And in 2008, another study confirmed that Swedish blood donors had about a 30 percent decrease in cancer risk.27
There are a few other lines of evidence that support the association between iron and cancer. People with an HFE mutation have an increased risk of developing colon and blood cancers.28 Conversely, people diagnosed with breast, blood, and colorectal cancers are more than twice as likely to be HFE heterozygotes than are healthy controls.29
There are also a handful of interventional trials investigating the relationship between iron and cancer. The first was published in 2007 by a group of Japanese scientists who had previously found that iron reduction via phlebotomy essentially normalized markers of liver injury in patients with hepatitis C. Hepatocellular carcinoma (HCC) is a feared consequence of hepatitis C and cirrhosis, and they hypothesized that phlebotomy might also reduce the risk of developing this cancer. The results were remarkable—at five years only 5.7 percent of patients in the phlebotomy group had developed HCC compared to 17.5 percent of controls. At 10 years the results were even more striking, with 8.6 percent of phlebotomized patients developing HCC compared to an astonishing 39 percent of controls.30
The second study to investigate the effects of phlebotomy on cancer risk was published the following year by Leo Zacharski, a colorful emeritus professor at Dartmouth. In a multi-center, randomized study originally designed to look at the effects of phlebotomy on vascular disease, patients allocated to the iron-reduction group were about 35 percent less likely to develop cancer after 4.5 years than controls. And among all patients who did develop cancer, those in the phlebotomy group were about 60 percent less likely to have died from it at the end of the follow-up period.31
The brain is a hungry organ. Though only 2 to 3 percent of body mass, it burns 20 percent of the body’s total oxygen requirement. With a metabolism that hot, it’s inevitable that the brain will also produce more free radicals as it churns through all that oxygen. Surprisingly, it’s been shown that the brain appears to have less antioxidant capacity than other tissues in the body, which could make it more susceptible to oxidative stress.32 The balance between normal cellular energy metabolism and damage from reactive oxygen species may be even more delicate in the brain than elsewhere in the body. This, in turn, points to a sensitivity to iron.
It’s been known since the 1920s that neurodegenerative disease—illnesses like Alzheimer’s and Parkinson’s—is associated with increased iron deposition in the brain. In 1924, a towering Parisian neurologist named Jean Lhermitte was among the first to show that certain regions of the brain become congested with abnormal amounts of iron in advanced Parkinson’s disease.33 Thirty years later, in 1953, a physician named Louis Goodman demonstrated that the brains of patients with Alzheimer’s disease had markedly abnormal levels of iron deposited in the same regions as the famed plaques and tangles that define the illness.34 Goodman’s work was largely forgotten for several decades, until a 1992 paper resurrected and confirmed his findings and kindled new interest. Two years later an exciting new technology called MRI was deployed to probe the association between iron and disease in living patients, confirming earlier autopsy findings that Alzheimer brains demonstrated significant aberrations in tissue iron.35
Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries.
By the mid 1990s, there was compelling evidence that Alzheimer’s and Parkinson’s disease involved some dysregulation of iron metabolism in the brain, but no one knew whether the relationship was cause or consequence of the disease process. Hints began trickling in at around the same time the MRI findings were being published. A 1993 paper reported that iron promoted aggregation of amyloid-b, the major constituent of Alzheimer’s plaques.36 In 1997, researchers found that the aberrant iron associated with Alzheimer’s plaques was highly reactive and able to freely generate toxic oxygen radicals.37 By 2010, it had been shown that oxidative damage was one of the earliest detectable changes associated with Alzheimer’s, and that reactive iron was present in the earliest stages of the disease.38,39 And in 2015, a seven-year longitudinal study showed that cerebrospinal fluid ferritin levels were a strong predictor of cognitive decline and development of Alzheimer’s dementia.40
Perhaps most surprising was the discovery in 1999 that the precursor to amyloid-b was under direct control of cellular iron levels—the more iron around, the more amyloid was produced.41 This raised the tantalizing possibility that amyloid plaques might actually represent an adaptive response rather than a cause, an idea that has been indirectly supported by the spectacular failure of essentially all efforts to directly target amyloid protein as treatment for the disease.
Together, these findings suggest that abnormal iron metabolism in the brain could be a causative factor in Alzheimer’s and other neurodegenerative diseases. If that’s true, then we might expect that people who are genetically predisposed to aberrant iron metabolism would be at higher risk of dementing diseases than others. And so they are.
In the early 2000s, it was discovered that patients with familial Alzheimer’s were more likely to carry one of the HFE mutations than healthy controls.42 Another study found that these genotypes were associated with earlier onset of the disease compared to controls, and that there was an even more powerful effect in people who carried an HFE as well as an ApoE4 gene, the primary genetic risk factor for Alzheimer’s disease.43 A 2004 study showed that the co-occurrence of the HFE gene with a known variant in the transferrin gene conferred a fivefold increased risk of Alzheimer’s.44 Two years later a team of Portuguese scientists found that the HFE variants were associated with increased risk of Parkinson’s as well.45
What about interventional trials? For neurodegenerative disease, there has been exactly one. In 1991, a team of Canadian scientists published the results of a two-year randomized trial of the iron chelator desferrioxamine in 48 patients with Alzheimer’s disease.46 Chelators are a class of medication that bind metal cations like iron, sequester them, and facilitate their excretion from the body. Patients were randomly allocated to receive desferrioxamine, placebo, or no treatment. The results were impressive—at two years, iron reduction had cut the rate of cognitive decline in half.
The study was published in The Lancet, one of the world’s most prestigious medical journals, but seems to have been forgotten in the 20-odd year interim. Not a single interventional study testing the role of iron in Alzheimer’s disease has been published since.
If so many studies seem to show a consistent association between iron levels and chronic disease, why isn’t more work being done to clarify the risk?
“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials,” Dartmouth’s Zacharski said to me. “If people would just take up the gauntlet and do well-designed, insightful studies of the iron hypothesis, we would have a much firmer understanding of this. Just imagine if it turns out to be verified!”
His perspective on why more trials haven’t been done is fascinating, and paralleled much of what other experts in the field said. “Sexiness,” believe it or not, came up in multiple conversations—molecular biology and targeted pharmaceuticals are hot (and lucrative), and iron is definitively not. “Maybe it’s not sexy enough, too passé, too old school,” said one researcher I spoke to. Zacharski echoed this in our conversation, and pointed out that many modern trials are funded by the pharmaceutical industry, which is keen to develop the next billion-dollar drug. Government agencies like the NIH can step in to fill gaps left by the for-profit research industry, but publicly funded scientists are subject to the same sexiness bias as everyone else. As one senior university scientist told me, “NIH goes for fashion.”
Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries. He thinks that even subtly elevated iron levels can result in free radical formation, which then contribute to chronic inflammation. And chronic inflammation, we know, is strongly linked to everything from heart disease to diabetes, cancer to Alzheimer’s.
“If this doesn’t deserve randomized trials,” he told me, “then I don’t know what does.”
Until those randomized trials arrive—I’ll see you at the blood bank.
Clayton Dalton is an emergency medicine resident at Massachusetts General Hospital in Boston. He has published stories and essays with NPR, Aeon, and The Los Angeles Review.
Lead image: Liliya Kandrashevich / Shutterstock
After all, 16 hours is a long time to go without eating. Here’s everything you need to know about the popular weight-loss regimen—including whether it actually works.
Chris Pratt! Hugh Jackman! Halle Berry! Kourtney Kardashian! What these celebrities have in common, other than a gratuitous exclamation point after their names, is a professed fondness for intermittent fasting, the diet craze turning the fitness world on its sweaty, well-toned head. For help determining whether you, too, should incorporate this into your 2019 resolution-related plans, we asked a few experts to explain what it is, why people love it, and whether it’s really worth the pain of forgoing on-demand snacks for the rest of the winter.
What is intermittent fasting, exactly?
Intermittent fasting, unlike many other diets, is famously flexible in that you choose the days and hours during which you think it’s best to fast. The two most common methods are the 16:8 strategy—where you eat whatever you want (within reason) for eight hours a day and then fast for the other 16—and the 5:2 method, where you eat normally five days a week and then keep your food intake to roughly 500-600 calories for the other two days. It’s kind of a simplified-calories math problem that’s supposed to prevent the yo-yo effect of weight loss and weight gain.
“There are different ways to do this diet, but the bottom line is that no matter which you choose, you’re taking in less energy, and because of that, you’re going to start using your own body stores for energy,” says Lisa Sasson, a clinical professor of nutrition at NYU. “If you don’t, you’re not going to lose weight.”
Why might I want to try it?
A recent study completed by the German Cancer Research Center concluded that intermittent fasting indeed “helps lose weight and promotes health,” and noted that the regimen proved especially adept at getting rid of fat in the liver. A USC study also found that the diet reduced participants’ risk of cancer, diabetes, heart disease, and other age-related diseases. While researchers involved cautioned that more testing is necessary, the results are at least encouraging.
Most people who swear by intermittent fasting will tell you it helps not only with losing weight but also with reducing “belly fat.” This is not a conclusion with scientific backing, but it is the sort of thing to which every six-pack enthusiast aspires.
Why might I not want to try it?
“There’s really no conclusive evidence that there’s any benefit,” Sasson says. The German Cancer Research Center study qualified its findings by noting that the positive results weren’t noticeably better than those experienced by subjects who adopted a conventional calorie-reduction diet. In other words, it works, but not notably better than the alternative. (Sasson also offered a helpful list of individuals who should not give intermittent fasting a try: pregnant women and anyone with diabetes, cancer, or an eating disorder.)
The best long-term diets, no matter what their rules entail, are the ones that are least difficult to maintain—and again, in this regard, intermittent fasting isn’t inherently superior to anything else. “Are you making changes in your behavior? Have you learned positive habits so that when you go back to not fasting, you’re going to be a healthier eater?” Sasson asks. “I know people who fast because they think, Okay, I’m going to be really bad and overdrink or overeat, and then two days a week I’m going to have a clean life, and that’s just not how it works.”
Also, for many people, a full 16 hours of fasting just isn’t realistic, says Cynthia Sass, a New York City– and L.A.-based performance nutritionist. She recommends 12 hours of overnight fasting at most and believes the 16-hour gap is especially tough on those who exercise early in the morning or late at night. “If fasting makes you feel miserable and results in intense cravings and rebound overeating, it’s not the right path for you,” she says.
So—should I try it?
As long as you’re aware that it isn’t nutritional magic, Sasson isn’t against intermittent fasting altogether. “I’ve worked with patients who need positive reinforcement to see that their weight went down to feel better, and they feel in control for the first time,” she says. “That self-efficacy, that feeling that they could do it—for some, that might be important.”
Of the two most popular methods, Sasson leans toward the 5:2 schedule as slightly more manageable, since you’re only reducing your intake twice a week. But again, that’s contingent on you being a responsible dieter on your days of lowered caloric intake, which requires an immense amount of discipline—especially when it comes to remembering to drink water. “You can go a long time without food, but only a few days without adequate hydration,” she warns.
If these extended periods without delicious food sound too painful to handle, rest assured: The best available evidence indicates that a regular ol’ diet is at least as safe and healthy and efficacious as intermittent fasting. Besides, sooner or later, a shiny new fad is bound to come along for the A-listers to fawn over, she says: “There’s going to be a new darling of the month before you know it.”