10 of the most evil medical experiments in history

The subjects are often society’s most vulnerable, and the doctors have rarely had to answer for their crimes

U.S. helicopter sprays Agent Orange in Vietnam
AlterNet

Evil scares us. Arguably our best horror stories, the ones that give us nightmares, are about evil people doing evil things—especially evil experiments. The Island of Dr. Moreau by H.G. Wells is a classic that comes to mind. In modern cinema, movies like The Human Centipede continue that gruesome tradition. But these are fictional. The truth is that we need only look at recent human history to find real, live, utterly repugnant evil. Worse yet, it is evil perpetrated by doctors.

Here are 10 of the most evil experiments ever performed on human beings—black and other people of color, women, prisoners, children and gay people have been the predominant victims.

1. The Tuskegee Experiments

There’s a good reason many African Americans are wary of the good intentions of government and the medical establishment. Even today, many believe the conspiracy theory that AIDS, which ravaged the African-American community, both gay and straight, was created by the government to wipe out African Americans. What happened in Tuskegee, Alabama in 1932 is one explanation for these fears.

At the time, treatments for syphilis, a sexually transmitted disease that causes pain, insanity and ultimately, death, were mostly toxic and ineffective (things like mercury, which caused kidney failure, mouth ulcers, tooth loss, insanity, and death). Government-funded doctors decided it would be interesting to see if no treatment at all was better than the treatments they were using. So began the Tuskegee experiments.

Over the course of the next 40 years, the Tuskegee Study of Untreated Syphilis in the Negro Male denied treatment to 399 syphilitic patients, most of them poor, black, illiterate sharecroppers. Even after penicillin emerged as an effective treatment in 1947, these patients, who were not told they had syphilis, but were informed they suffered from “bad blood,” were denied treatment, or given placebo treatments. By the end of the study, in 1972, only 74 of the subjects were still alive. Twenty-eight patients died directly from syphilis, 100 died from complications related to syphilis, 40 of the patients’ wives were infected with syphilis, and 19 children were born with congenital syphilis.

2. The Aversion Project


They didn’t like gay people in apartheid-era South Africa. Especially in the armed forces. How they got rid of them is shocking. Using army psychiatrists and military chaplains, who were, presumably, privy to private, “confidential” confessions, the apartheid regime flushed out homosexuals in the armed forces. But it did not evict them from the military. The homosexual “undesirables” were sent to a military hospital near Pretoria, to a place called Ward 22 (which in itself sounds terrifying).

There, between 1971 and 1989, many victims were submitted to chemical castrations and electric shock treatment, meant to cure them of their homosexual “condition.” As many as 900 homosexuals, mostly 16-24 years old who had been drafted and had not voluntarily joined the military, were subjected to forced “sexual reassignment” surgeries. Men were surgically turned into women against their will, then cast out into the world, the gender reassignment often incomplete, and without the means to pay for expensive hormones to maintain their new sexual identities.

The head of this project, Dr. Aubrey Levin, went on to become a clinical professor at the University of Calgary. That is until 2010, when his license was suspended for making sexual advances towards a male student. He was sentenced to five years in prison for other sexual assaults (against males).

3. Guatemalan STD Study

Syphilis seemed to bring out the inherent racism in government-funded doctors in the 1940s. Tuskegee’s black people weren’t the only victims of morally reprehensible studies of this disease. Turns out Guatemalans were also deemed suitable unknowing guinea pigs by the U.S. government.

Penicillin having emerged as a cure for syphilis in 1947, the government decided to see just how effective it was. The way to do this, the government decided, was to turn syphilitic prostitutes loose on Guatemalan prison inmates, mental patients and soldiers, none of whom consented to be subjects of an experiment. If actual sex didn’t infect the subject, then surreptitious inoculation did the trick. Once infected, the victim was given penicillin to see if it worked. Or not given penicillin, just to see what happened, apparently. About a third of the approximately 1,500 victims fell into the latter group. More than 80 “participants” in the experiment died.

The Guatemalan study was led by John Charles Cutler, who subsequently participated in the later stages of Tuskegee. In 2010, Secretary of State Hillary Clinton formally apologized to Guatemala for this dark chapter in American history.

4. Agent Orange Experiments

Prisoners, like people of color, have often been the unwilling objects of evil experiments. From 1965 to 1966, Dr. Albert Kligman, funded by Dow Chemical, Johnson & Johnson, and the U.S. Army, conducted what was deemed “dermatological research” on approximately 75 prisoners. What was actually being studied was the effects of Agent Orange on humans.

Prisoners were injected with dioxin (a toxic byproduct of Agent Orange)—468 times the amount the study originally called for. The results were prisoners with volcanic eruptions of chloracne (severe acne combined with blackheads, cysts, pustules, and other really bad stuff) on the face, armpits and groin. Long after the experiments ended, prisoners continued to suffer from the effects of the exposure. Dr. Kligman, apparently very enthusiastic about the study, was quoted as saying, “All I saw before me were acres of skin… It was like a farmer seeing a fertile field for the first time.” Kligman went on to become the doctor behind Retin-A, a major treatment for acne.

5. Irradiation of Black Cancer Patients

During the Cold War, the U.S. and the Soviet Union spent much of their time trying to figure out if they could survive a nuclear catastrophe. How much radiation could a human body take? This would be important information for the Pentagon to know, in order to protect its soldiers in the event they were crazy enough to start an atomic holocaust. Enter the seeming go-to government choice for secret experimentation: unknowing African Americans.

From 1960 until 1971, Dr. Eugene Saenger, a radiologist at the University of Cincinnati, led an experiment exposing 88 cancer patients, poor and mostly black, to whole body radiation, even though this sort of treatment had already been pretty well discredited for the types of cancer these patients had. They were not asked to sign consent forms, nor were they told the Pentagon funded the study. They were simply told they would be getting a treatment that might help them. Patients were exposed, in the period of one hour, to the equivalent of about 20,000 x-rays worth of radiation. Nausea, vomiting, severe stomach pain, loss of appetite, and mental confusion were the results. A report in 1972 indicated that as many as a quarter of the patients died of radiation poisoning. Dr. Saenger recently received a gold medal for “career achievements” from the Radiological Society of North America.

6. Slave Experiments

It should be no surprise that experiments were often conducted on human chattel during America’s shameful slavery history. The man considered the father of modern gynecology, J. Marion Sims, conducted numerous experiments on female slaves between 1845 and 1849. The women, afflicted with vesico-vaginal fistulas, a tear between the vagina and the bladder, suffered greatly from the condition and were incontinent, resulting in societal ostracism.

Because Sims felt the surgery was “not painful enough to justify the trouble,” as he said in an 1857 lecture, the operations were done without anesthesia. Being slaves, the women had no say as to whether they wanted the procedures or not, and some were subjected to as many as 30 operations. There are many advocates for Dr. Sims, pointing out that the women would have been anxious for any possibility of curing their condition, and that anesthetics were new and unproven at the time. Nevertheless, it is telling that black slaves, and not white women, who presumably would have been just as anxious, were the subjects of the experiments.

7. “The Chamber”

Back to the Cold War. Prisoners were again the victims, as the Soviet Secret Police conducted poison experiments in Soviet gulags. The Soviets hoped to develop a deadly poison gas that was tasteless and odorless. At the laboratory, known as “The Chamber,” unknowing and unwilling prisoners were given preparations of mustard gas, ricin, digitoxin, and other concoctions, hidden in meals, beverages or given as “medication.” Presumably, many of these prisoners were not happy with their meals, although, being the gulag, records are spotty. The Secret Police apparently did finally come up with their dream poison, called C-2. According to witnesses, it caused actual physical changes (victims became shorter), and victims subsequently weakened and died within 15 minutes.

8. World War II: Heyday of Evil Experiments

While evil experiments may have been going on in the U.S. during World War II (Tuskegee, for example), it’s hard to dispute that the Nazis and the Japanese were the kings of evil experimentation. The Germans, of course, conducted their well-known experiments on Jewish prisoners (and, to a much lesser extent, Romany people, homosexuals and Poles, among others) in their concentration/death camps. In 1942, the Luftwaffe submerged naked prisoners in ice water for up to three hours to study the effects of cold temperatures on human beings and to devise ways to rewarm them afterward.

Other prisoners were subjected to streptococcus, tetanus and gas gangrene. Blood vessels were tied off to create artificial “battlefield” wounds. Wood shavings and glass particles were rubbed deep into the wounds to aggravate them. The goal was to test the effectiveness of sulfonamide, an antibacterial agent. Women were forcibly sterilized. More gruesomely, one woman had her breasts tied off with string to see how long it took for her breastfeeding child to die. She eventually killed her own child to stop the suffering. And there is the infamous Josef Mengele, whose experimental “expertise” was on twins. He injected various chemicals into twins, and even sewed two together to create conjoined twins. Mengele escaped to South America after the war and lived until his death in Brazil, never answering for his evil experiments.

Not to be outdone, the Japanese killed as many as 200,000 people during numerous experimental atrocities in both the Sino-Japanese War and WWII. Some of the experiments put the Nazis to shame. People were cut open and kept alive, without the assistance of anesthesia. Body limbs were amputated and sewn onto other parts of the body. Limbs were frozen and then thawed, resulting in gangrene. Grenades and flame-throwers were tested on living humans. Various bacteria and diseases were purposely injected into prisoners to study the effects. Unit 731, led by Commander Shiro Ishii, conducted these experiments in the name of biological and chemical warfare research. Before Japan surrendered in 1945, the Unit 731 lab was destroyed and the prisoners were all executed. Ishii himself was never prosecuted for his evil experiments, and in fact was granted immunity by Douglas MacArthur in exchange for the information Ishii gained from the experiments.

9. The Monster Study

Add children to the list of vulnerable people subjected to evil experiments. In 1939, Wendell Johnson, a University of Iowa speech pathologist, and his grad student Mary Tudor conducted stuttering experiments on 22 non-stuttering orphan children. The children were split into two groups. One group was given positive speech therapy, praising them for their fluent speech. The unfortunate other group was given negative therapy, harshly criticizing them for any flaw in their speech abilities and labeling them stutterers.

The result of this cruel experiment was that children in the negative group, while not transforming into full-fledged stutterers, suffered negative psychological effects and several suffered from speech problems for the rest of their lives. Formerly normal children came out of the experiment, dubbed “The Monster Study,” anxious, withdrawn and silent. Several, as adults, eventually sued the University of Iowa, which settled the case in 2007.

10. Project 4.1

Project 4.1 was a medical study conducted on the natives of the Marshall Islands, who in 1954 were exposed to radioactive fallout from the Castle Bravo nuclear test at Bikini Atoll after the wind inadvertently carried the fallout over the nearby inhabited islands. Instead of informing the islands’ residents of their exposure, and treating the victims while they studied them, the U.S. elected instead just to watch quietly and see what happened.

At first the effects were inconclusive. For the first 10 years, miscarriages and stillbirths increased but then returned to normal. Some children had developmental problems or stunted growth, but no conclusive pattern was detectable. After that first decade, though, a pattern did emerge, and it was ugly: children developed thyroid cancer at rates significantly above what would be considered normal. By 1974, almost a third of exposed islanders had developed tumors. A Department of Energy report stated that, “The dual purpose of what is now a DOE medical program has led to a view by the Marshallese that they were being used as ‘guinea pigs’ in a ‘radiation experiment.’”

Meet the code-breakers of WWII

“This is Norway checker,” echoed the voice through the scrambler. “I have a good stop for you in Stavanger.”

Nobody on the outside world could have known what she meant.

But inside Bletchley Park, a World War II code-breaking enclave in the English countryside of Buckinghamshire, 18-year-old Ruth Bourne had discovered a vital piece of intelligence.

Bletchley Park was once Britain's best kept secret, with all activity undertaken there strictly hidden for three decades after the war ended.

Working alongside thousands of other women to decipher encoded German signals sent between Nazi generals, Bourne’s discovery meant passing on the information to her superiors to assess whether this was another piece of the decryption puzzle.

With every room named after a country that had been toppled by the Nazis, and each machine christened as one of its towns, Bletchley Park’s simple yet effective checking system proved crucial in the defeat of Hitler’s regime.

A culture of secrecy

Far from being a group of experienced decoders, however, the estate’s recruits mainly consisted of young teenage military personnel, a smattering of crossword whizzes who had been able to complete The Daily Telegraph’s puzzle in less than 12 minutes, and numerous 18-year-old girls plucked from their quiet home towns.

“It was the middle of the war when I received a call saying I was to go into war work to support Britain’s efforts from home,” explains 88-year-old Margaret Bullen, a machine wire operator who served from 1942 until the end of the war.

“A letter from the Foreign Office then arrived saying I had an interview — but I had no idea what it was for, and two weeks later, I was told I’d be off to Bletchley.”

“Before starting work we were told to sign the Official Secrets Act, which was a rather frightening experience for someone as young and naive as I was,” says 90-year-old Becky Webb, who joined the war effort at age 18 in 1941. “I had no idea how I’d comply with it!”

But compliance was the only option, making these three young women — Webb, Bullen and Bourne — fierce guards of the country’s anonymous decoding history for several decades.

Indeed, it wasn’t until some thirty years later that Bletchley’s long-maintained shroud of secrecy began to lift, after the publication of “The Ultra Secret” — a tell-all book from former RAF officer Frederick W. Winterbotham, who had served as an Ultra supervisor.

The 1974 exposé revealed how Ultra intelligence had been used to intercept communication behind enemy lines and disseminate vital information to Britain and its allies. Though Winterbotham was accused of embellishing and aggrandizing his role in the tale, without his account, the real story of what went on inside the UK’s code-breaking operation may never have been known.

I never knew what any of my co-workers were doing, and vice versa, and my parents never knew a thing of it.
Ruth Bourne, naval recruit.

“It sounds strange that we knew so little about what was going on, but that was how it was,” reflects Bullen.

“I was sent to live with a couple who were ordered to take me in because of the war. They never once asked me what I was doing there–nobody did–not even the local village workers who’d serve us coffee at the café on our lunch break, in spite of the fact a group of 18-year-olds had suddenly arrived in this little hamlet,” she explains.

“I only heard the name Colossus–the machine I was working on–some three decades after the war ended, and it wasn’t until I later visited Bletchley Park that I said: ‘this is where I worked, this is what I did!'”

While Winterbotham’s revelations sent shock waves through the secretive decryption community, the lifting of the lid on what really happened inside the park proceeded slowly and sporadically, with the bulk of the information being released in the early 2000s.

“I’m delighted that we can discuss our time there now that everything has come out, and I give talks on the subject whenever I’m asked,” enthuses Webb. “I’ve given 97 to date!”

Silent heroines

For many of the young women at Bletchley, though, the removal of the clandestine veil came too late, with the majority of workers’ parents having passed away before the decryption effort became public knowledge.

Bourne, an 18-year-old naval recruit who was sent to one of the park’s expansion locations in Eastcote — on the outskirts of London — was one of many who was never able to tell her loved ones about her contribution to the war.

“You led two lives there,” she recalls. “One life was in A Block, where you ate in the canteen, and talked about boyfriends, and getting trains to London, and where to find black nylon stockings.”

I was sent to live with a couple who were ordered to take me in because of the war. They never once asked me what I was doing there.
Margaret Bullen, WWII Colossus engineer.

“B Block was where we worked, surrounded by high walls, barbed wire and two naval marines guarding the place. If you could make your voice heard over the noise of 12 Turing Bombe machines, that was the only time you would speak about work — but you never would,” she explains. “I never knew what any of my coworkers were doing, and vice versa, and my parents never knew a thing of it.”

After the Nazi regime fell in 1945, many of Bletchley’s women returned home, while others stayed involved with the military’s work. Bourne was given work as a wire destroyer: desoldering the many cables that had been painstakingly connected during intelligence operations throughout the war, while Webb was sent to the Pentagon to paraphrase translated Japanese messages for transmission to officials.

“Upon leaving Bletchley, we really had no skills whatsoever,” remembers Bourne. “Apart from how to keep a secret!”

And that secret was very nearly never told, especially after the original estate was due to be knocked down some 23 years ago, with houses and a supermarket planned to be built in its place.

Preserving Bletchley

It was in May of 1991 that Bletchley’s fortunes changed, after a small local committee gathered a group of veterans at the park to say a final farewell to the historic location.

But the group became determined to turn it into a heritage site after hearing the astounding stories of so many code-breakers, engineers and members of the Women’s Royal Naval Service (WRNS) who worked at the park during the war.

The Bletchley Park Trust was formed the following year, and from then on, regular reunions and exhibitions at the estate have enabled its former workers and inhabitants to share stories that were on the precipice of being lost forever.

Winterbotham’s book might have been the first time the story of the World War II code-breakers entered the realm of popular culture, but it certainly wasn’t the last, with TV drama “The Bletchley Circle” proving popular in both the UK and United States earlier this year.

With a second series on its way, and exhibitions at the Trust attracting visitors from around the globe, the world’s fascination with the once elusive Bletchley Park shows no sign of slowing.

The culture of secrecy that once threatened to see Bletchley all but erased from the history books has well and truly ended.

Japan declares “state of emergency” after radioactive leak is found

Japan’s nuclear watchdog has now declared the leak of radioactive water from Fukushima a “state of emergency.” Each day, 300 tons of radioactive water seeps into the ocean, and it’s now clear that TEPCO has engaged in a two-and-a-half-year cover-up of immense magnitude. “I believe it’s been leaking into the ocean from the start of the crisis two-and-a-half years ago,” disclosed a 12-year TEPCO veteran named Suzuki-san (SOURCE). “There are still reactor buildings we haven’t gotten into yet,” said another worker named Fujimoto-san. “So there’s always the possibility of another explosion…”

TEPCO workers sprayed with highly radioactive water while waiting for a bus:
Just how out of control is the situation at Fukushima? It’s so out of control that TEPCO recently had to admit 10 of its workers were somehow — yeah, see if you can figure this out — sprayed with highly radioactive water while waiting for a bus. “The workers’ exposure above the neck was found to be as much as 10 becquerels per square centimeter,” reports Bloomberg.com. How exactly did highly radioactive water manage to find its way to a bus stop in the first place? TEPCO isn’t sure. It’s confusing with all those radiation alarms going off all the time. In order to concentrate, the company has found it’s easier to just disable all the alarms and pretend nothing’s wrong.

The TEPCO cover-up:
To fully grasp the extent of the TEPCO denial, realize that only recently did the company finally admit that radioactive groundwater has been leaking into the ocean. This follows years of stark denials from the company, whose executives have exhibited a remarkable ability to deny reality even when their own workers are dying in droves from cancer. It’s no exaggeration to say that TEPCO’s downplaying of the full extent of the Fukushima disaster has put tens of millions of lives at risk — people who should have been warned about radiation but were denied that information due to the TEPCO cover-up.

“At this current time in July of 2013, Fukushima is 80 to 100x more expansive and more intense — letting out about 100x more of the radiation of Chernobyl,” reports Dr. Simon Atkins of Phoenix Rising Radio in a BlogTalkRadio interview. “The problem with Fukushima is that it’s not only continuing for 865 days… I mean, let’s wrap our minds around that for a second — it has been leaking out radiation in increasing volumes for 865 days.”

Japan is a society that shuns whistleblowers:
Why has TEPCO been able to cover up the truth about Fukushima for so long? Because Japan is a society of mass conformity. The idea of keeping your head down and not “rocking the boat” is deeply embedded in Japanese culture. Japan is not a nation of “rugged individualism” but of conformist acquiescence. As a result, whistleblowers are shunned, and there is immense peer pressure to defend the status quo… even when it’s a terrible lie. This culture of conformity at all costs is precisely what allows companies like TEPCO to continue operating extremely dangerous nuclear power plants with virtually no accountability. While Japan has entire museums dedicated to the horrifying history of two Japanese cities being bombed by the United States at the end of World War II, when Japan’s own power company is involved in a radiological disaster of similar magnitude, the entire incident gets swept under the rug. Radiation? What radiation? If the government says there’s no radiation, then there’s no radiation! After all, it’s invisible!

Why the U.S. government plays along with the cover-up:
The U.S. government, of course, plays along with the charade because its own top weapons manufacturer — General Electric — designed and built the Fukushima Daiichi power plant in the first place. And the design decisions made by GE, such as storing spent fuel rods in large pools high above the ground, now look not just incompetent but downright idiotic. It turns out there was never any long-term plan to dispose of the spent fuel rods. The idea was to just let them build up over time until someone else inherited the problem. So while Japan and the USA play this game of “let’s all pretend nothing happened,” citizens of both countries continue to be exposed to a relentless wave of deadly radiation that now dwarfs the total radiation release of Chernobyl (which the U.S. media played up in a huge way because the disaster made the Russians look incompetent). The only reason TEPCO is finally getting around to admitting the truth in all this is because you can’t rig all the Geiger counters forever. Radiation follows the laws of physics and atomic decay, not the whims of lying politicians and bureaucrats. As a result, the real story eventually comes out as we’re starting to see right now.

The Fukushima disaster is likely to get far worse, if you can believe that: 
The upshot is that the Fukushima disaster is not only far worse than you’ve been told; it’s very likely going to be worse than you could ever imagine. The radiation leak isn’t plugged, in other words, and another explosion — which many experts believe might be imminent — would release thousands of times more nuclear material into the open environment. Ultimately, the entire Northern hemisphere has been placed at risk by a bunch of corporate bureaucrats who thought building a nuclear facility in the path of a sure-to-happen tidal wave was a fantastic idea. Instead of acknowledging the problem and working to fix it like a responsible person would, our world’s top politicians and ass-coverers have decided it is in their best short-term interests to play along with the TEPCO fairy tale which ridiculously pretends that radioactive leaks can be controlled by wishful thinking.

Remember: Governments can lie about the national debt, health care costs, inflation and unemployment, but they cannot lie about radiation for very long. Sooner or later the physics of it all simply cannot be denied.

Source: Natural News

Does chewing gum take seven years to digest?

As children we’re told not to swallow gum, because it will lie in your stomach for ages. Claudia Hammond chews over the evidence to find out if this is true.


Imagine if you swallowed a piece of gum in the summer of 2006. George W Bush was still in the White House. Twitter had yet to be launched. Pirates of the Caribbean 2 was at the top of the movie charts. It feels like a long time ago, but legend has it that if you had swallowed gum then, your body would be completing its digestion about now.

As children we’re told we mustn’t swallow chewing gum because it will take seven years to digest. Until then, we are led to believe, it will lie there in your stomach, oblivious to the usual bodily processes that break down and process foods. It’s a statement that is confidently asserted in school playgrounds in many countries, but does it have any basis in medical science?

Chewing gum consists of a gum base, sweetener, flavouring, preservatives and softeners. Sugars and flavouring ingredients such as mint oils break down easily and are soon excreted. Likewise, softeners such as vegetable oil or glycerine don’t present a problem for the digestive system. The ingredient that can withstand both the acid in the stomach and the digestive enzymes in the intestines is the gum base. 

Traditionally many manufacturers used chicle, the sap bled from the sapodilla tree, an evergreen native to southern Mexico, Central America and the Caribbean. But after American soldiers took their gum rations overseas during World War II, its popularity spread and sapodilla trees could no longer keep up with demand.

Today most gum uses other natural or synthetic polymers. The US Food and Drug Administration permits the use of various substances, including butyl rubber, which is used to make inner tubes. Each manufacturer has its own recipe with the aim of getting the perfect degree of elasticity.

But even though the gum base cannot be broken down, that doesn’t mean it stays in your gut for seven years. Nor does it wind itself around your heart, as some also assert. Provided it’s a small piece, it does eventually find its way down the digestive tract. Foreign bodies such as coins can usually pass out of the stomach provided they’re less than 2cm in diameter. Chewing gum has the advantage over a lot of other accidentally-ingested objects in that it’s soft.

The only way that chewing gum could stay for seven years is if there was a vast amount of it, and even then symptoms such as constipation would mean it would probably be discovered soon. A 1998 paper outlines alarming case studies of three children who did develop obstructions as a result of the habit.

One was a four-year-old boy who had been suffering from constipation for two years. He found it so hard to go to the toilet that his parents began offering chewing gum as an incentive to try. He ate between five and seven pieces a day and always swallowed them, rather than spitting them out. After four days of fibre supplements, oils and enemas had no effect, doctors sedated him and removed a “taffy-like” mass (referring to its similarity to chewy, toffee-like sweets from the US) from his rectum consisting chiefly of gum. It wasn’t seven years old, but it did cause him serious problems.

Inside the second patient, also aged four, doctors found a multi-coloured mass, which again turned out to be chewing gum. Doctors said the patient was in the habit of swallowing her gum quickly in order to get more.

The third child was just 18 months old. Doctors found four coins stuck together with a “peculiar sticky wax-like substance” in her stomach. It turned out that she regularly ate chewing gum, and, it appears, small coins. The families of two of the children were aware that they were swallowing their gum and found it “a source of levity”, according to the report’s authors.

So regularly swallowing large amounts of gum isn’t a good idea. But if you have eaten the occasional piece, there’s no evidence that you will come to any harm. And, if you were to swallow some today, it will not hang around inside you until finally making its way out in time for the 2020 Olympics.

Source: BBC


Nutritional Adjuncts to the Fat-Soluble Vitamins A, D, and K


The “K” in “vitamin K” stands for “koagulation,” the German word for blood clotting. From its discovery in the 1930s through the late 1970s, we knew of no other roles for the vitamin.

The 1990s had come and nearly gone by the time awareness of its role in bone metabolism broke out of the confines of the vitamin K research community, and only in the twenty-first century has its role in preventing calcification of the blood vessels and other soft tissues become clear.

Vitamin K2, found in animal fats and fermented foods, is present in most diets in much smaller quantities than vitamin K1, which is found in leafy greens.

Since researchers throughout the twentieth century saw the two forms of the vitamin as interchangeable, they ignored vitamin K2 as though its scarcity made it irrelevant.

The realization that vitamin K is not just for “koagulation,” however, led us to discover that vitamins K1 and K2 are not interchangeable after all: vitamin K1 more effectively supports blood clotting, while vitamin K2 more effectively ensures that calcium winds up in the bones and teeth where it supports health rather than in the soft tissues where it contributes to disease.

It was thus only in 2006 that the United States Department of Agriculture determined the vitamin K2 contents of common foods for the first time.1

Vitamins A, D, and K

While vitamin K2 languished in obscurity, vitamins A and D continually traded places with one another as the favored vitamin du jour. The pendulum initially swung in favor of vitamin D because rickets was common in the early twentieth century while eye diseases resulting from vitamin A deficiency were rare. It then swung in favor of vitamin A when that vitamin became known as the “anti-infective” vitamin.2

After World War II, the medical establishment had easy access to antibiotics and thus lost interest in battling infections with vitamin A.3 Vitamin D fared far worse, taking the blame for a British epidemic of infant hypercalcemia and eventually earning a reputation as “the most toxic of all the vitamins.”4 These days, the pendulum has swung full force in the opposite direction: we blame an epidemic of osteoporosis on vitamin A, and see vitamin D as the new panacea.5

Though a paradigm of synergy never took hold, it was not for want of opportunity. When Mellanby and Green first demonstrated in the 1920s that vitamin A prevented infections, they concluded that vitamin D could not be “safely substituted for cod-liver oil in medical treatment,” and that “if a substitute for cod-liver oil is given it ought to be at least as powerful as this oil in its content of both vitamins A and D.”

Consistent with this point of view, clinical trials in the 1930s showed that cod liver oil could reduce the incidence of colds by a third and cut hours missed from work in half.6 Cod liver oil also caused dramatic reductions in mortality from less common but more severe infections. The medical establishment, for example, had been successfully using it to treat tuberculosis since the mid-nineteenth century.7

Studies in the 1930s expanded this to the treatment of measles.8 These findings made the popularity of cod liver oil soar.

The idea that vitamin A alone was “anti-infective,” however, led to similar trials with halibut liver oil, which is rich in vitamin A but poor in vitamin D. These trials often failed to show any benefit. I.G. Spiesman of the University of Illinois College of Medicine proposed a simple solution to this paradox: vitamins A and D worked together to prevent infection, he suggested, and both vitamins are needed to prevent the common cold.

He published his own clinical trial in 1941, showing that massive doses of each vitamin alone provided no benefit and often proved toxic. Massive doses of both vitamins together, however, caused no toxicity and offered powerful protection against the common cold.10 Nevertheless, as antibiotics grew in popularity after World War II, interest in the fat-soluble vitamins waned and cod liver oil use began its steady decline.

The emergence of molecular biology in the late twentieth century provided new evidence for synergy. Vitamins A and D both make independent contributions to immune function by binding to their respective receptors and thereby directing cellular processes in favor of healthful immune responses, but studies in isolated cells suggest that vitamin D may only be able to activate its receptor with the direct cooperation of vitamin A.11, 12

We now know that vitamins A and D also cooperate to regulate the production of certain vitamin K-dependent proteins. Once vitamin K activates these proteins, they help mineralize bones and teeth, support adequate growth, protect arteries and other soft tissues from abnormal calcification, and protect against cell death.

As described below, the synergistic action of the fat-soluble trio depends on support from other nutrients such as magnesium, zinc, fat and carbohydrate, as well as on important metabolic factors such as carbon dioxide and thyroid hormone.

Magnesium and the Fat-Soluble Vitamins

Magnesium contributes to more than three hundred specific chemical reactions that occur within our bodies, including every reaction that depends on ATP, the universal energy currency of our cells.13 Magnesium also activates the enzyme that makes copies of DNA, as well as the enzyme that makes RNA, which is responsible for translating the codes contained within our genes into the production of every protein within our body. This process of translating the DNA code in order to produce proteins is called “gene expression.”

Vitamins A and D carry out most of their functions by regulating gene expression, which means they rely directly on magnesium to carry out these functions. They also rely indirectly on magnesium, because our cells can produce their receptors, and all the proteins with which those receptors interact, only with the assistance of this critical mineral.

The well-studied interaction of magnesium with vitamin D and calcium provides an illustrative example. Magnesium is required for both steps in the activation of vitamin D to calcitriol, the form of vitamin D that regulates gene expression and stimulates calcium absorption. Even fully activated vitamin D (calcitriol), however, is useless in the absence of magnesium. Humans who are deficient in magnesium have low blood levels of both calcitriol and calcium, but treating them with calcitriol does nothing to restore calcium levels to normal. The only way to normalize calcium levels in these subjects is to provide them with sufficient magnesium. Magnesium also supports the cellular pumps that keep most calcium out of our soft tissue cells and make it available for the extracellular matrix of bones and teeth.

Zinc and the Fat-Soluble Vitamins

As with magnesium, the fat-soluble trio can only support health if our diets contain adequate zinc. The interaction between vitamin A and zinc is particularly well studied.15 Vitamin A supports the intestinal absorption of zinc, possibly by increasing the production of a binding protein in the intestines. Zinc, in turn, supports the formation of vesicles involved in transporting vitamin A and the other fat-soluble vitamins across the intestinal wall.

Zinc is an essential structural component of many vitamin A-related proteins, including the primary protein that transports vitamin A through the blood, the enzyme that carries out the first step in the activation of vitamin A to retinoic acid, and the nuclear receptor that binds to retinoic acid and allows it to regulate gene expression.

Numerous studies have demonstrated the interaction between zinc and vitamin A in humans. For example, in humans with marginal zinc status, zinc supplementation supports vitamin A’s role in visual function16 and eye development (Figure 2).17

Although less well studied, zinc also interacts with vitamin D. Vitamin D and zinc most likely promote each other’s intestinal absorption.18 In rats, dietary zinc supports the production of the vitamin D receptor.19 Once the receptor is formed, zinc provides it with essential structural support. Although in the absence of this structural support the receptor still binds to vitamin D, the structural support is needed to allow this vitamin-receptor complex to bind to DNA.20 Studies with isolated cells illustrate the importance of this interaction: adding zinc to these cells increases the rate at which vitamin D activates the expression of genes.21

Fat, Carbs, Thyroid and Carbon Dioxide

In order to absorb fat-soluble vitamins from our food, we need to eat fat. Human studies show that both the amount and type of fat are important. For example, one study showed that absorption of beta-carotene from a salad with no added fat was close to zero. The addition of a low-fat dressing made from canola oil increased absorption, but a high-fat dressing was much more effective.23 Canola oil, however, is far from ideal. Studies in rats show that absorption of carotenoids is much higher with olive oil than with corn oil.24

Similarly, studies in humans show that consuming beta-carotene with beef tallow rather than sunflower oil increases the amount we absorb from 11 to 17 percent. The reason for this is unknown, but it may be that oils rich in polyunsaturated fatty acids promote the oxidative destruction of fat-soluble vitamins in the intestines before we are able to absorb them. Thus, the more fat we eat, and the lower those fats are in polyunsaturated fatty acids, the more fat-soluble vitamins we absorb.

While dietary fat is clearly important, there may be a role for dietary carbohydrate as well. Once vitamins A and D stimulate the production of vitamin K-dependent proteins, vitamin K activates those proteins by adding carbon dioxide to them. Once added to a protein, carbon dioxide carries a negative charge and allows the protein to interact with calcium, which carries a positive charge. The greater the supply of carbon dioxide, the better vitamin K can do its job.25 Carbohydrates are rich in carbon and oxygen, and when we break them down for energy we release these elements in our breath as carbon dioxide. Because carbohydrates are richer in oxygen, burning them generates about 30 percent more carbon dioxide per calorie than burning fat, and low-carbohydrate diets have been shown to lower blood levels of carbon dioxide.
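The 30 percent figure can be checked with simple combustion stoichiometry. The sketch below uses textbook approximations for glucose and palmitic acid as stand-ins for dietary carbohydrate and fat (these specific molecules and values are not from the article); the exact ratio varies with the fuels actually burned.

```python
# Rough stoichiometric check of "carbs yield ~30% more CO2 per calorie than fat."
# Assumed textbook values (not from the article):
#   glucose (C6H12O6):        180 g/mol, 6 mol CO2 per mol burned, ~3.74 kcal/g
#   palmitic acid (C16H32O2): 256 g/mol, 16 mol CO2 per mol burned, ~9.3 kcal/g

def co2_per_kcal(mol_co2, grams_per_mol, kcal_per_gram):
    """Moles of CO2 released per kilocalorie of fuel fully oxidized."""
    return mol_co2 / grams_per_mol / kcal_per_gram

carb = co2_per_kcal(6, 180.0, 3.74)    # glucose
fat = co2_per_kcal(16, 256.0, 9.3)     # palmitic acid

print(f"carbohydrate: {carb:.5f} mol CO2/kcal")
print(f"fat:          {fat:.5f} mol CO2/kcal")
print(f"ratio: {carb / fat:.2f}")  # roughly 1.3, i.e. ~30% more CO2 per calorie
```

With these assumed values the ratio comes out near 1.3, consistent with the article's claim.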

Ideally, we should study this further by determining whether dietary carbohydrate affects the amount of activated vitamin K-dependent proteins in humans.

We also produce more carbon dioxide when we burn more calories, regardless of whether we are burning carbohydrate or fat. Intense exercise more than doubles the amount of carbon dioxide we produce compared to what we produce when at rest.27 Even working at a standing desk rather than a sitting desk increases both calories burned and carbon dioxide generated by about a third.

Future studies should directly investigate whether exercise increases the activation of vitamin K-dependent proteins, but it seems reasonable to suggest that part of the reason exercise promotes cardiovascular health may be because it ensures a more abundant supply of carbon dioxide, which vitamin K uses to activate proteins that protect our heart valves and blood vessels from calcification.

Thyroid hormone is a key regulator of the metabolic rate and may thus be a major determinant of the carbon dioxide available for activating vitamin K-dependent proteins. Theoretically, thyroid hormone should increase the rate of metabolism, and a greater rate of metabolism should produce a proportionally greater supply of carbon dioxide.

Thyroid hormone directly increases the production of vitamin K-dependent proteins and protects blood vessels from calcification in rats.29 The reason for this relationship is unclear. We could speculate, however, that our bodies in their infinite wisdom use thyroid hormone to tie the production of vitamin K-dependent proteins to the production of the carbon dioxide needed to activate them.

The Big Picture

It is clearly time to move beyond viewing each vitamin in isolation. The fat-soluble vitamins not only synergize with each other, but cooperate with many other nutrients and metabolic factors such as magnesium, zinc, fat, carbohydrate, carbon dioxide and thyroid hormone.

This paradigm has two important implications. At the level of scientific research, a study about one vitamin can easily come to false conclusions unless it takes into account its interactions with all the others. We should reverently and humbly bow before the complexity of these interactions, realizing how little we know and recognizing that we are always learning. At the level of personal health, these interactions emphasize the need to consume a well-rounded, nutrient-dense diet. Supplementation with an individual vitamin runs the risk of throwing it out of balance with its synergistic partners. The fat-soluble vitamins work most safely and effectively when we obtain them from natural foods within the context of a diet rich in all their synergistic partners.

Zinc and the Dark Adaptation Test for Vitamin A Deficiency

The role of vitamin A in vision is unusual. This vitamin carries out most of its known actions by regulating the expression of specific sets of genes. Vitamin A regulates gene expression only after being activated in a two-step process from retinol to retinal, and finally to retinoic acid. Vitamin A supports vision, however, in its semi-activated form as retinal. Retinal binds to a protein known as opsin, forming a vitamin-protein complex known as rhodopsin. Each photon of light that enters our eye and collides with rhodopsin causes the retinal to change shape and release itself from the complex. This event then translates into an electrical impulse that our optic nerve transmits to our brain. The brain integrates myriad such electrical impulses at every moment and interprets them as vision.30

While the function of opsin is to help generate visual images by binding and releasing vitamin A, opsin can only maintain its proper shape and function when it is bound to zinc. In addition, zinc supports the conversion of retinol to retinal, the form of vitamin A that binds to opsin. We could predict, then, that vitamin A would only be able to support vision in the presence of adequate zinc. This can be studied by determining dark adaptation thresholds, which determine the dimmest spots of light we are able to see after having spent a period of time in the dark to maximize our visual sensitivity. When vitamin A is insufficient, we lose the ability to see the dimmer spots of light.

Robert Russell of Tufts University studied ten patients with deficient blood levels of vitamin A who also failed the dark-adaptation test. Eight of them achieved normal dark-adaptation thresholds after supplementing with 10,000 international units of vitamin A for two to four weeks. Two of them, however, had deficient blood levels of zinc. Vitamin A supplementation alone failed to normalize their visual function, but adding 220 milligrams per day of zinc to the regimen for two weeks brought it back to normal.16 These results show that vitamin A can only support healthy vision with the direct assistance of zinc.

Source: mercola.com