Inexpensive Vitamin Treats ‘So Many Diseases’ It Threatens Big Pharma

Peripheral neuropathy, which is characterized by sharp pain or numbness and tingling, affects people of all ages. But the cure isn’t some outlandish and expensive pharmaceutical medicine – it’s a vitamin found easily in fish or supplements. Patients suffering from a loss of muscular control, painful tingling, numbness and loss of sensation in their limbs don’t have to undergo invasive surgeries or take debilitating meds – they may be able to simply take high doses of Vitamin B12. This vitamin benefits many other conditions as well.


Our bodies need 13 different vitamins, and B12 can be especially lacking. Most of us are already familiar with B1 (thiamine), B2 (riboflavin), B3 (niacin), B6, and folic acid. But the B complex also includes vitamin B5 (pantothenic acid), B12, biotin, PABA, choline, and inositol. These water-soluble vitamins are indispensable for good health.

What’s more, the high sugar diet that many Americans partake of also destroys Vitamin B12 in the intestinal tract.

As Natural Society previously reported, one UK doctor is a big fan of B12 – so much so that he has treated all kinds of illnesses with B12 shots. He fully documented his successes, but UK health officials were wary because he was giving inexpensive B12 injections to patients whose blood serum B12 levels were above 150, which the UK medical establishment considers normal.

Most of us don’t get enough B12 in our diets, though you can find it primarily in animal sources like seafood, shellfish, dairy, and meat. It also isn’t given its due credit by the medical establishment, as is true of many vitamins and minerals, since they cannot be patented and sold as billion-dollar drugs. A geneticist even recently uncovered the ability of B12 to treat a rare disease that causes loss of muscle control in a toddler.

David L. Katz, MD, MPH, FACPM, FACP, director of the Yale University Prevention Research Center, says that B12 might be one of the most important vitamins you aren’t paying any attention to. One of the vitamin’s most obviously vital functions, as he points out, is its ability to promote the normal replication of DNA. Without B12, we cannot regenerate normal, healthy cells.

This means that even if you don’t have a full-blown disease like peripheral neuropathy, you may feel old before your time: excessively tired, anemic, dizzy, and irritable. You may also be at greater risk of dementia as you age.

It can be difficult to get enough B12, especially if you are vegan or vegetarian. Try the following tips:

  • If you eat animal products, include more meat, dairy, and seafood.
  • Take one B12 supplement daily providing at least 10 micrograms.
  • Eat fortified foods two or three times a day to get at least three micrograms (mcg or µg) of B12 a day.

If you’ve been feeling sluggish, check that you are getting enough B12, and if not, supplement. Your renewed health will be a testament to the miracle of this vitamin.

Does Sodium Intake Affect Mortality and CV Event Risk?

Sodium intake may not be associated with mortality or incident cardiovascular events in older adults, according to a study published Jan. 19 in JAMA Internal Medicine.

In the Health, Aging and Body Composition (Health ABC) Study, initiated in 1997, researchers assessed self-reported sodium intake from 2,642 Medicare beneficiaries, aged 71 to 80. Participants were excluded for difficulties with walking or activities of daily living, cognitive impairment, inability to communicate, and previous heart failure (HF). At the first annual follow-up visit, researchers recorded food intake as reported by participants, specifically examining sodium intake. After 10 years, 34 percent of participants had died, while 29 percent and 15 percent had developed cardiovascular disease and HF, respectively.

The results of the study showed that there was no association between participant-reported sodium intake and 10-year mortality, incident HF, or incident cardiovascular disease. Further, there was no indication that consuming less than 1,500 mg/d of sodium benefited participants any more than consuming the recommended amount (1,500-2,300 mg/d). However, the study showed a slight potential for harm when participants had a sodium intake of greater than 2,300 mg/d, especially in women and African Americans.
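The intake bands the study compares can be made concrete with a short sketch. The thresholds are the ones quoted above; the function name and labels are illustrative, not from the study:

```python
def sodium_band(mg_per_day):
    """Classify daily sodium intake against the bands discussed in the study.

    Thresholds (mg/day) follow the article: below 1,500 mg/d is the stricter
    target, 1,500-2,300 mg/d is the recommended range, and above 2,300 mg/d
    is where the study saw a slight potential for harm.
    """
    if mg_per_day < 1500:
        return "below recommended range"
    elif mg_per_day <= 2300:
        return "within recommended range (1,500-2,300 mg/d)"
    return "above recommended range (>2,300 mg/d)"

# Example: a participant reporting 2,600 mg/d falls in the band where
# the study observed a slight potential for harm.
print(sodium_band(2600))
```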

The authors note that while the food frequency questionnaire used by participants at the first annual follow-up has limitations in its accuracy, “self-reported adoption of a low-salt diet was not associated with significantly higher risk for [any] events.” They conclude that moving forward, there is a need for further research and stronger evidence in order to create better recommendations for older adults.


How The “Nocebo Effect” Can Trick Us Into Actually Dying

The “nocebo effect” is like the placebo effect, except in reverse. Whereas placebos trick people into feeling better, “nocebos” are things that make people feel worse, even though they don’t really exist. They can even kill people! Here’s how we can trick ourselves into dying, and how doctors may have found a cure.

The Nocebo Effect

Let’s begin with a story. A young man who had been in treatment for depression for some time hit a low point after his girlfriend left him. He took an entire bottle of his medication. As soon as he finished the bottle, he realized he’d made a mistake. At the hospital, grievously ill, he lingered near death. He couldn’t breathe. His blood pressure was dangerously low. After making inquiries, the doctors found out that he had been in a study for a new antidepressant. That might explain why none of the tests they ran on him was able to indicate what drug was poisoning him. They contacted the doctors coordinating the study.

The man had been given placebos. The placebos had worked, improving his mood and making him sure that he’d been given the active medication. When he overdosed on the sugar pills, the placebo effect he’d been experiencing transitioned into the “nocebo effect.” Convinced he was dying, he actually began to die. Once he was told he was in no danger, he recovered.

Not all people do. There are numerous comical instances of the nocebo effect in which people are given what they believe is an emetic and begin to throw up, only to discover that the emetic was saline. Huge swathes of the population rush to the hospital whenever a new virus gets reported on the news. Some secret societies even took advantage of the nocebo effect; they would blindfold their initiates and either run the back of a knife along their skin or cut them very slightly, then pour warm water over the cut, making them feel like they were bleeding profusely and scaring the hell out of them.

In the wrong situation, the nocebo effect can be extremely dangerous. People who expect to die going into surgery are more likely to have delayed recovery and serious complications. At least one man seems to have died of a metastasized tumor that didn’t metastasize. He was told he had a given number of months to live, and lived only that long. The autopsy revealed that he wasn’t in any danger from the small and isolated tumor they found on his liver. He’d just died.

Why Your Doctor Talks Like That

The nocebo effect and the attempts made to avoid it help explain the exaggeratedly bland hospital language that often exasperates patients. “This is going to hurt like hell,” seems charmingly honest, but it’s also something that can cause people to hurt more than they would for the comparatively disingenuous “some patients may experience some discomfort.”

A few words are effective in causing or preventing pain. Patients with back pain who took a stretch test were more likely to feel pain if the doctors administering it admitted it could hurt. If the doctors just shut up and let them stretch, they tended to report no pain.

This puts doctors in a bind. There’s no ethical way to practice medicine without allowing patients informed consent, and informed consent means letting a patient know about everything that could go wrong. By mentioning these details, all the things that could cause a patient to feel pain, to regress, or to die, a doctor could be increasing the suffering, or even hastening the death, of a person who would have been fine with less information. So doctors end up trying to give patients information while simultaneously trying to keep them from focusing on it. The practice leads to some weird quirks. One paper on the nocebo effect and surgery notes that an epidemic that “kills 1,286 people out of every 10,000” is perceived as worse than an epidemic that “kills 24.14% of the population,” even though the latter kills almost twice as many people. It advises doctors to give death and complication rates in percentages, rather than numbers per thousand.
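The framing effect in that example comes down to simple arithmetic; a quick sketch using the numbers quoted from the paper:

```python
# Two framings of an epidemic's death rate, from the example above.
deaths_per_10000 = 1286
rate_a = deaths_per_10000 / 100  # deaths per 10,000 as a percentage
rate_b = 24.14                   # stated directly as a percentage

# The per-10,000 framing "sounds" worse to readers, yet the second
# epidemic kills nearly twice as many people.
print(rate_a)           # 12.86
print(rate_b / rate_a)  # roughly 1.88, i.e. almost double
```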

The Cure for the Non-Existent

Experiments designed to study the nocebo effect are few and far between, as most doctors are loath to try to create illness in their patients. But there is some hope for those who don’t think they can best their own brain in a battle of wills. The nocebo effect seems to follow a specific pathway in the brain, and that pathway can be blocked.

Cholecystokinin (CCK) is a hormone found in the gut and in the brain. In the gut, CCK and its variants regulate the release of bile and other digestive enzymes. In the brain, it’s a little different: CCK acts on two receptor subtypes, CCK-A and CCK-B, which dial down dopamine when activated. This dials up the level of depression and anxiety that a person feels. Increased relaxation and happiness decrease pain. Increased anxiety and depression alert people to pain, and help them focus on it. When people are injected with something that temporarily blocks the CCK receptors, the nocebo effect lessens, and they have better outcomes.

Which makes me want a meta-experiment. What if people are told they are being injected with something that takes out the nocebo effect, but it’s actually a placebo? What will happen to patients in the battle of the ‘cebos?

Physicist suggests new experiments could make or break axion as dark matter theory

Leslie Rosenberg, a physicist at the University of Washington, has published a paper in Proceedings of the National Academy of Sciences describing the current state of research into the possibility that axions are what make up dark matter. He also offers some perspective on the work, suggesting that at least one project is likely to either prove or disprove that axions are dark matter.

For several years now, scientists have been hard at work trying to detect WIMPs. The thinking has been that if they can be detected, then it would go a long way toward proving that they are what makes up dark matter, the theoretical stuff now believed to make up approximately 85 percent of all mass in the universe. Unfortunately, despite their best efforts, scientists have not yet been able to detect a single one, causing some to wonder if they exist at all. That doubt has led some scientists to consider other types of particles as dark matter candidates. One of them is the neutrino, though more and more it appears to be falling from favor. Another is the axion, a particle first theorized in the early 1970s. One of its major proponents is Rosenberg, who has been developing experimental devices with the purpose of either proving or disproving that dark matter is made up of axions.

Axion theory suggests that axions can decay into photons, one axion into two photons, and vice versa; the in-between state is known as the virtual axion. Because of this property, most axion detectors are dedicated to measuring axions after they have decayed into photons, because that is something we know how to detect. Rosenberg describes research into devices meant to study the impact axions may have on the sun, or on halos around astronomical objects, and notes that supernovae should also produce them. He also notes that some experiments are looking into what has come to be known as “shining light through walls”: the idea being that if axions decay into photons just after passing through a wall or other object, it should be possible to detect them. The problem here, of course, would be proving that they have anything to do with dark matter.

Rosenberg then describes a radio frequency approach, another way to capture axions decaying into light; current experiments in this area are meant to capture axions that are part of the Milky Way’s dark matter halo. In this case, the idea is to rouse the axions into decaying into photons and then to detect those photons. The most prominent project appears to be the Axion Dark Matter eXperiment (ADMX), which Rosenberg claims is likely to either prove or disprove axions as dark matter once and for all.
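The radio frequency approach works because an axion’s rest-mass energy fixes the frequency of the photon it converts into (f = mc²/h). A back-of-the-envelope conversion, assuming only the standard value of Planck’s constant, shows why “invisible axion” masses of around a micro-electronvolt land in the radio/microwave band that cavity experiments like ADMX scan:

```python
# Convert an axion mass (expressed in eV, i.e. its rest energy) into the
# frequency of the photon it would convert to in a haloscope: f = E / h.
PLANCK_EV_S = 4.135667696e-15  # Planck's constant in eV*s

def axion_photon_freq_ghz(mass_ev):
    """Frequency in GHz of the photon produced by an axion of the given mass."""
    return mass_ev / PLANCK_EV_S / 1e9

# A 1 micro-eV axion corresponds to a photon of roughly 0.24 GHz,
# squarely in the radio/microwave range.
print(axion_photon_freq_ghz(1e-6))
```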


More information: Dark-matter QCD-axion searches, Leslie J Rosenberg, PNAS, DOI: 10.1073/pnas.1308788112

In the late 20th century, cosmology became a precision science. Now, at the beginning of the next century, the parameters describing how our universe evolved from the Big Bang are generally known to a few percent. One key parameter is the total mass density of the universe. Normal matter constitutes only a small fraction of the total mass density. Observations suggest this additional mass, the dark matter, is cold (that is, moving nonrelativistically in the early universe) and interacts feebly if at all with normal matter and radiation. There’s no known such elementary particle, so the strong presumption is the dark matter consists of particle relics of a new kind left over from the Big Bang. One of the most important questions in science is the nature of this dark matter. One attractive particle dark-matter candidate is the axion. The axion is a hypothetical elementary particle arising in a simple and elegant extension to the standard model of particle physics that nulls otherwise observable CP-violating effects (where CP is the product of charge reversal C and parity inversion P) in quantum chromodynamics (QCD). A light axion of mass 10⁻⁶ to 10⁻³ eV (the invisible axion) would couple extraordinarily weakly to normal matter and radiation and would therefore be extremely difficult to detect in the laboratory. However, such an axion is a compelling dark-matter candidate and is therefore a target of a number of searches. Compared with other particle dark-matter candidates, the plausible range of axion dark-matter couplings and masses is narrowly constrained. This focused search range allows for definitive searches, where a nonobservation would seriously impugn the dark-matter QCD-axion hypothesis. Axion searches use a wide range of technologies, and the experiment sensitivities are now reaching likely dark-matter axion couplings and masses. This article is a selective overview of the current generation of sensitive axion searches.
Not all techniques and experiments are discussed, but I hope to give a sense of the current experimental landscape of the search for dark-matter axions.




The Human Protein Atlas, a major multinational research project supported by the Knut and Alice Wallenberg Foundation, recently launched (November 6, 2014) an open source, tissue-based interactive map of the human proteome. Based on 13 million annotated images, the database maps the distribution of proteins across all major tissues and organs in the human body, showing both proteins restricted to certain tissues, such as the brain, heart, or liver, and those present in all. As an open access resource, it is expected to help drive the development of new diagnostics and drugs, and also to provide basic insights into normal human biology.

In the Science article, “Tissue-based Atlas of the Human Proteome”, the approximately 20,000 protein-coding genes in humans have been analysed and classified using a combination of genomics, transcriptomics, proteomics, and antibody-based profiling, says the article’s lead author, Mathias Uhlén, Professor of Microbiology at Stockholm’s KTH Royal Institute of Technology and the director of the Human Protein Atlas program.

The analysis shows that almost half of the protein-coding genes are expressed in a ubiquitous manner and thus found in all analysed tissues.

Approximately 15% of the genes show an enriched expression in one or several tissues or organs, including well-known tissue-specific proteins, such as insulin and troponin. The testes, or testicles, have the most tissue-enriched proteins followed by the brain and the liver.

The analysis suggests that approximately 3,000 proteins are secreted from the cells and an additional 5,500 proteins are localized to the membrane systems of the cells.
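Those round numbers already hint at why these two classes matter for drug development; a quick tally, using the approximate counts quoted above rather than exact database values:

```python
# Approximate counts quoted in the article.
TOTAL_GENES = 20_000   # protein-coding genes analysed
SECRETED = 3_000       # proteins secreted from cells
MEMBRANE = 5_500       # proteins localized to membrane systems

# Secreted and membrane-bound proteins together cover a sizeable slice
# of the proteome, the slice most accessible to drug molecules.
print(f"secreted: {SECRETED / TOTAL_GENES:.0%}")  # 15%
print(f"membrane-bound: {MEMBRANE / TOTAL_GENES:.0%}")
print(f"combined: {(SECRETED + MEMBRANE) / TOTAL_GENES:.0%}")
```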

“This is important information for the pharmaceutical industry. We show that 70% of the current targets for approved pharmaceutical drugs are either secreted or membrane-bound proteins,” Uhlén says. “Interestingly, 30% of these protein targets are found in all analysed tissues and organs. This could help explain some side effects of drugs and thus might have consequences for future drug development.”

The analysis also contains a study of the metabolic reactions occurring in different parts of the human body. The most specialised organ is the liver with a large number of chemical reactions not found in other parts of the human body.



The first large prospective cohort study to examine the relationship between menopausal symptoms and bone health in postmenopausal women has found that those who experience moderate to severe hot flashes and night sweats during menopause tend to have lower bone mineral density and higher rates of hip fracture than peers with no menopausal symptoms.

The study followed thousands of women for eight years. After adjusting for age, body mass index and demographic factors, it found that women who reported moderate to severe hot flashes at baseline enrollment showed a significant reduction in the bone density in the femoral neck region of their hips over time and were nearly twice as likely to have a hip fracture during the follow-up period.

This study employed data and study participants from the Women’s Health Initiative (WHI) initiated by the U.S. National Institutes of Health (NIH) in 1991 to address major health issues causing morbidity and mortality in postmenopausal women.

The WHI consisted of three clinical trials and an observational study undertaken at 40 clinical centers throughout the US, including the University at Buffalo Clinical Center directed by Jean Wactawski-Wende.

She says the research team examined data from 23,573 clinical trial participants, aged 50 to 79, who were not using menopausal hormone therapy at enrollment and were not assigned to use it during the trial. The team conducted baseline and follow-up bone density examinations in 4,867 of these women.

Wactawski-Wende says, “We knew that during menopause, about 60 percent of women experience vasomotor symptoms (VMS), such as hot flashes and night sweats. They are among the most bothersome symptoms of menopause and can last for many years.

“It also was known that osteoporosis, a condition in which bones become structurally weak and more likely to break, afflicts 30 percent of all postmenopausal women in the United States and Europe, and that at least 40 percent of that group will sustain one or more fragility fractures in their remaining lifetime,” she says.

“What we did not know,” says Wactawski-Wende, “was whether VMS are associated with reductions in bone mineral density or increased fracture incidence.

“Women who experience vasomotor menopausal symptoms will lose bone density at a faster rate and nearly double their risk of hip fracture,” she says, “and the serious public health risk this poses is underscored by previous research that found an initial fracture poses an 86 percent risk for a second new fracture.”

Wactawski-Wende says, “Clearly more research is needed to understand the relationship between menopausal symptoms and bone health. In the meantime, women at risk of fracture may want to engage in behaviors that protect their bones including increasing their physical activity and ensuring they have adequate intakes of calcium and vitamin D.”



Picture a toddler getting his first eye exam. He’s seated in a strange room, with strange instruments and strange bright lights. He can’t sit still or open his eyes long enough for that diagnostic puff of air – especially if he has trouble seeing anyway, as children with achromatopsia do.

But according to research from the Baylor Visual Function Testing Center, future little ones might not have to squirm in their seats during routine eye exams. The research, which was published in JAMA Ophthalmology, explores a new non-invasive technology that’s kind of like a handheld CT scanner for the eye.

The technology, known as spectral-domain optical coherence tomographic imaging (SD-OCT), helps pediatric ophthalmologists detect achromatopsia by studying retina thickness. It can scan the structure of the eye from a distance, without getting too close to the young patient.

That non-invasive approach is a step up from previous methods, when specialists diagnosed based on age, family history and the standard eye exam procedure (air puff included).

Also known as “day blindness,” achromatopsia is a rare condition that causes poor vision in daylight, color blindness and shaking eyes (nystagmus). It affects one in 40,000 U.S. children and tends to run in families. Worst of all, it’s not easy to diagnose in young children because the current diagnostic tools were made for grown-ups.

“It has been very difficult to understand the retinal structure of children with achromatopsia because young children are known to be uncooperative during eye examinations designed for the adults,” said Yuquan Wen, PhD, scientific director of the Baylor Visual Function Testing Center. He, along with researchers at the Casey Eye Institute of Oregon Health & Science University, helped develop the study.

As part of the research, investigators studied 18 patients, each of them about 4 years old. Half of the participants suffered from achromatopsia and the other half (the control group) had normal visual function. Using the SD-OCT, researchers produced high-definition 3D images of the kids’ retinas, the part at the back of the eye responsible for creating visual pictures. In many ways, the retina is like the film in a camera.

Through those images, they found that the achromatopsia patients had significantly thinner-than-normal retinas, as much as 17 percent thinner than those of the control participants. The findings point to the importance of measuring a child’s retinal thickness when looking for achromatopsia.

Researchers also noted that, in young children, those retinal changes seemed milder than in older patients with the same achromatopsia diagnosis. This could mean there is a therapeutic window to help patients while they’re still young.

“We think that retinal thickness measurement is a more reliable predictor than age alone or genotype alone,” Dr. Wen said. “With the knowledge of retinal thickness in young children with achromatopsia, smarter clinical studies could be designed and monitored based on real structural changes of the retina in conjunction with the visual function change.”

As those new studies take shape, they’ll likely include a form of gene therapy designed to make up for the non-functioning genes the patients were born with. Gene therapy has emerged in several clinical trials for blinding eye diseases and likely will continue to do so well into the future.

Before the availability of the handheld SD-OCT, pediatric ophthalmologists had only simple tools and instruments (all of them designed for adults) to detect achromatopsia in children. But based on these findings, the handheld SD-OCT could join those standard tools very soon – as well as be useful in pediatric eye exams in general, Dr. Wen said.

And the squirming kids who endure those eye exams? For them, as for the first-time toddler, things won’t be so tough.



While the link between salt and hypertension is well known, scientists until now haven’t understood how high salt intake increased blood pressure. By studying the brains of rats, a team led by Prof. Charles Bourque of McGill’s Faculty of Medicine discovered that ingesting large amounts of dietary salt causes changes in key brain circuits.

“We found that a period of high dietary salt intake in rats causes a biochemical change in the neurons that release vasopressin (VP) into the systemic circulation,” says Bourque, who is also a researcher at the Research Institute of the McGill University Health Centre (RI-MUHC). “This change, which involves a neurotrophic molecule called BDNF (brain-derived neurotrophic factor), prevents the inhibition of these particular neurons by other cells.”

The team’s findings, published today in the journal Neuron, show that high salt intake prevents the inhibition of VP neurons by the body’s arterial pressure detection circuit. The disabling of this natural safety mechanism allows blood pressure to rise when a high amount of salt is ingested over a long period of time.

While the team’s discovery advances the understanding of the link between salt intake and blood pressure, more work is needed to define new targets that could potentially be explored for therapeutic intervention. Among the questions for further research: Does the same reprogramming effect hold true for humans? If so, how might it be reversed?

In the meantime, Bourque says, the message remains: limit dietary salt.



E. coli usually brings to mind food poisoning and beach closures, but researchers recently discovered a protein in E. coli that inhibits the accumulation of potentially toxic amyloids—a hallmark of diseases such as Parkinson’s.

Amyloids are formed by proteins that misfold and group together, and when amyloids assemble at the wrong place or time, they can damage brain tissue and cause cell death, according to Margery Evans, lead author of the University of Michigan study, and Matthew Chapman, principal investigator and associate professor in U-M Molecular, Cellular, and Developmental Biology.

The findings could point to a new therapeutic approach to Parkinson’s disease and a method for targeting amyloids associated with such neurodegenerative diseases.

A key biological problem in patients with Parkinson’s is that certain proteins accumulate to form harmful amyloid fibers in brain tissue, which are toxic to cells and cause cell death.

While these amyloids are a hallmark of Parkinson’s and other diseases such as Alzheimer’s, not all amyloids are bad. Some cells, those in E. coli included, assemble helpful amyloids used for cell function.

E. coli makes an amyloid called curli on its cell surface, where it is protective rather than toxic. The curli anchor the bacteria to kitchen counters and intestinal walls, where the bacteria can cause infections and make us sick. These helpful amyloids do not form inside the cell, where they would be toxic.

“It means that something in E. coli very specifically inhibits the assembly of the amyloid inside the cell. Therefore, amyloid formation only occurs outside the cell where it does not cause toxicity,” said Evans, a doctoral student in molecular, cellular, and developmental biology.

Evans and the U-M team went on a biochemical hunt to understand how E. coli prevented amyloids from forming inside cells and uncovered a protein called CsgC that is a very specific, effective inhibitor of E. coli amyloid formation.

U-M researchers have been collaborating with scientists from Umeå University in Sweden and Imperial College in London, and in the current study found that the CsgC protein also inhibits amyloid formation of the kind associated with Parkinson’s.

Another implication of the research is that the curli could be a target for attacking biofilms, a kind of goo created by bacteria, which acts as a shield to thwart antibiotics and antiseptics. These bacteria can cause chronic infections, but treating these infections using molecules that block curli formation may degrade the biofilm and leave the bacteria more vulnerable to drug therapy.

The study, “The bacterial curli system possesses a potent and selective inhibitor of amyloid formation,” is scheduled to appear Jan. 22 in the online edition of Molecular Cell.



New research suggests pre-Homo human ancestral species, such as Australopithecus africanus, used human-like hand postures much earlier than was previously thought.

Anthropologists from the University of Kent, working with researchers from University College London, the Max Planck Institute for Evolutionary Anthropology in Leipzig (Germany) and the Vienna University of Technology (Austria), have produced the first research findings to support archaeological evidence for stone tool use among fossil australopiths 3-2 million years ago.

The distinctly human ability for forceful precision (e.g. when turning a key) and power “squeeze” gripping (e.g. when using a hammer) is linked to two key evolutionary transitions in hand use: a reduction in arboreal climbing and the manufacture and use of stone tools. However, it is unclear when these locomotory and manipulative transitions occurred.

Dr Matthew Skinner, Senior Lecturer in Biological Anthropology, and Dr Tracy Kivell, Reader in Biological Anthropology, both of Kent’s School of Anthropology and Conservation, used new techniques to reveal how fossil species were using their hands by examining the internal spongy structure of bone called trabeculae. Trabecular bone remodels quickly during life and can reflect the actual behaviour of individuals in their lifetime.

The researchers first examined the trabeculae of hand bones of humans and chimpanzees. They found clear differences between humans, who have a unique ability for forceful precision gripping between thumb and fingers, and chimpanzees, who cannot adopt human-like postures. This unique human pattern is present in known non-arboreal and stone tool-making fossil human species, such as Neanderthals.

The research, titled “Human-like hand use in Australopithecus africanus”, shows that Australopithecus africanus, a 3-2 million-year-old species from South Africa traditionally considered not to have engaged in habitual tool manufacture, has a human-like trabecular bone pattern in the bones of the thumb and palm (the metacarpals), consistent with forceful opposition of the thumb and fingers typically adopted during tool use.

These results support previously published archaeological evidence for stone tool use in australopiths and provide skeletal evidence that our early ancestors used human-like hand postures much earlier and more frequently than previously considered.