Statins myth: thousands are dying because of warnings over non-existent side effects


False claims about the risks of statins may have cost the lives of tens of thousands of Britons, researchers have said, after a Lancet study found the drugs do not cause side-effects which have deterred many.

The research on 10,000 people found that if they did not know what drugs they were given, they were no more likely than those given sugar pills to report symptoms such as muscle pain, sleep disturbance and cognitive impairment.

Yet when participants in a second part of the trial were told the drugs were statins, rates of some reported side-effects shot up – with muscle pain appearing up to 41 per cent more common.

Last night the study’s lead author accused British medicines regulators of “jumping the gun” in ever listing such side-effects on drug packaging.

Prof Peter Sever, from Imperial College London, urged the Medicines and Healthcare Products Regulatory Agency (MHRA) to now strip packets of such warnings, in order to save “tens if not hundreds of thousands of lives”.

“There are people out there who are dying because they’re not taking statins, and the numbers are large, the numbers are tens of thousands, if not hundreds of thousands” – Prof Peter Sever, Imperial College London

He said it was a “tragedy” akin to the MMR scandal that high risk patients had been deterred from taking drugs which could save their lives. Urging patients not to “gamble” with the risk of heart attacks and strokes, he said “bad science” had misled the public, deterring many from taking life-saving medication.

The study of patients at risk of heart disease found that those told their daily drug was a statin were far more likely to think they were suffering side-effects.

Researchers said it illustrated a “nocebo effect” which meant patients were more likely to think they were experiencing side-effects if they expected them.

As a result, daily aches and pains were more likely to be attributed to statins.

The phenomenon is the opposite to the well-known placebo effect, the beneficial response sometimes experienced by those given “dummy” drugs as part of trials.

NHS guidance recommends the cholesterol-busting drugs for around 40 per cent of adults.

But a number of doctors have argued against “mass medicalisation” saying too many pills are being doled out instead of efforts to improve lifestyles.

The new study suggests millions of patients could benefit from high doses of statins.

Prof Sever said many of those arguing against statins had exaggerated risks such as muscle pain, which were not backed by the new study, the largest ever research into their side-effects.

In 2009, the MHRA listed such side-effects on packaging for statins, after a series of observational studies suggested such links.

Prof Sever said the regulator should never have taken such action.

“There are people out there who are dying because they’re not taking statins, and the numbers are large, the numbers are tens of thousands, if not hundreds of thousands. And they are dying because of a nocebo effect, in my opinion,” he said.

“Many of us would say that the MHRA … did not make a profound value judgment based on the evidence,” the professor said.

“We would hope that the MHRA will withdraw that request that these side effects should be listed.”

He added: “These warnings should not be on the label … I would love to see these side effects removed.”

A spokesman for the MHRA said: “The benefits of statins are well established and are considered to outweigh the risk of side-effects in the majority of patients.”

“Any new significant information on the efficacy or safety of statins will be carefully reviewed and action will be taken if required, including updates to product labelling.”

The study’s researchers said statins were not without any side-effects. Statins carry around a 9 per cent increased risk of diabetes, they said, with links to uncommon side effects such as myopathy, resulting in muscle weakness.

Even so, the benefits of the drugs in reducing risk of heart attacks and strokes “overwhelmed” the risk of side-effects, Prof Sever said.

Speaking of the nocebo effect, he said: “Just as the placebo effect can be very strong, so too can the nocebo effect.

“This is not a case of people making up symptoms, or that the symptoms are ‘all in their heads’. Patients can experience very real pain as a result of the nocebo effect and the expectation that drugs will cause harm.”

The study was funded by drug company Pfizer, which makes statins, but the authors said all data collection, analysis and interpretation of the results was carried out independently.

London cardiologist Dr Aseem Malhotra, who has argued against mass prescribing of statins, last night insisted the drugs had only “marginal” benefits for those with established heart disease, and did not save lives for lower risk patients.

Other research had found that more than half of patients put on statins abandoned them within a year, most commonly because of side-effects, he said.

He said the misrepresentation of the risks and benefits of statins would unfold to become “one of the biggest scandals in the history of medicine”.

 

How the “Anti-Vaccine” Movement Threatens Us All


Step back and take a good look. It’s a full blown, parent on parent brawl. I’m struck with an urgency that the vaccine discussion is perilously off track and acutely needs correction. The anti-vaccine controversy isn’t really about disease, public health, science, autism, or chronic illness. It’s not even about vaccines.

It’s about the role of government in our lives

As parents face off and hurl epithets, colossal special interests are having a field day codifying a set of laws that are systematically and comprehensively taking away our fundamental rights. It’s a massive overreach.

Will you grant government bureaucrats carte blanche to define and ultimately direct the education and welfare of your children across a broad spectrum of issues, and to allow your children to be taken away if you do not comply?


Yes, that’s exactly what this is about.

So stop saying whether you vaccinate

It doesn’t matter. And acting as if it does is a big part of the problem. Whether you choose all, some, or no vaccines, it’s way past time to quit publicly disclosing your family’s personal medical information as a badge of honor. Just because other people are asking doesn’t mean that you should do it.

There are myriad reasons that factor into each family’s decision, relating to matters that are simply no one else’s business. You shouldn’t have to explain or justify any of them. You shouldn’t open yourself to the possibility of needing to explain or justify any of them. It’s entirely feasible to have an educated and thoughtful discussion on vaccination without oversharing. In fact, it’s probably more effective that way.

For a bit of context, consider: Should couples with a family history of Down’s syndrome be permitted to have children? Should people reveal blood test results that provide a very early warning of Alzheimer’s? Or how about genetic markers whose expression will make you a less desirable employee, mate, or insurance risk? And so on.

This is precisely the point. If we don’t treat this critically important decision as the intensely private affair that it is, then we co-create a culture in which it’s legitimate, then appropriate, and ultimately imperative for others — bureaucrats, doctors, schools, employers, reporters, neighbors — to ask and then tell us what we must think and do. 

Discuss the topic responsibly

I’m definitely not saying we shouldn’t talk about vaccination. It’s very clear that this topic needs to be discussed a lot.

Let’s cultivate the knowledge, discipline, and mastery to talk about vaccination responsibly. This means doing the work to be in possession of the facts. Don’t exaggerate or wing it. Present the issues as someone who can stand in another’s shoes. Speak in a manner that is as calm and unemotional and even detached as possible. Don’t proselytize. And in the end, if necessary, agree to disagree.

This requires far more than just “book” learning. For many of us, it means a commitment to work on ourselves and to step away from activism as a form of therapy. Because, let’s face it, when the conversation gets tough, it’s far easier to say what we do and walk away, and allow that to be the ultimate line in the sand.

It’s a deliberate distraction… it’s theater

Announcing whether you vaccinate sets the entire stage.

Parents judging parents is high drama. Parents feel sorry for those who aren’t doing their own research. Other parents, in turn, pity those who are looking for something to blame. None of us has the big picture. We are all actors, playing into a narrative. But it’s more than a narrative. It’s a play. It’s theater. And like most forms of popular entertainment, there’s a purpose.

It’s meant to distract the masses. That’s all of us, people.

We seemingly understand our roles and deliver them with brio. But have we really thought it through?

What are the other roles? It’s not our stage. It’s not our script. There are actors and directors we never see. Who are the producers? Do we agree with the moral of the story?

And here’s the kicker. The whole thing wouldn’t work without our participation. We aren’t just complicit. We’re indispensable. We’re on set and the cameras are rolling. We’re advancing someone else’s agenda.

It’s all enabled by the belief that we must share a private decision.

Backdrop #1: The anti-vaccine bucket

Every single person who declares that there’s something more to vaccination than meets the eye is unceremoniously dropped into the “anti-vaccine bucket.”

The name notwithstanding, it’s a rather nice bucket. It should really be called the “Green Bucket” or the “Wellness Bucket” or, yes, the “Fearless Bucket.” It’s filled with smart, passionate people that we enjoy hanging out with and learning from. We increasingly spend our time with people in the bucket. We go to doctors in the bucket. We buy products from businesses in the bucket. We work to make the bucket bigger. We fundraise for the bucket. We’re proud that we’re in the bucket.

We become attached to the inevitability that, one day, everyone will understand the wisdom of our bucket.

Backdrop #2: The conflict

We are perplexed by people who aren’t in the bucket… the many parents with no urgency to investigate before dutifully trudging to the pediatrician with their infant, baby, toddler, child, or teenager in tow and doing as they’re told by the CDC and the American Academy of Pediatrics. How can they not explore the science that links vaccines and their ingredients to chronic, autoimmune, or neurodevelopmental disorders, which already affect half of US children?

Many of these “no research” parents and their children are important to us — family, dear friends, loved ones. We venture outside the bucket to recruit and teach them. But most won’t give us the time of day here. They won’t read the books we recommend; watch the movies and docuseries we want to share; or attend the events we beg them to consider. Some threaten to take drastic measures if we don’t shut up. We’ve lost precious relationships over this issue.

It makes us sad. Maybe we get frustrated or angry. We may even feel that their unwillingness to engage in this issue is now threatening our own families’ well-being. And, hey, some of these people are, gasp, actually in the bucket but pretend they’re not. That’s not right! Silence isn’t neutrality. It’s tacit approval.

But what can we do? We can’t bend another person’s will or change her path. It’s a relief, in a way. Live and let live. No one agrees on everything. No family is an island, after all. Better to quietly take an exemption and allow the movement to grow organically. Trust the unfoldment. We go back into the bucket and do our own thing.

Backdrop #3: The masses tune out

This is a messy debate with exceptionally high stakes involving all parents and our children plus the federal government, 50 state governments, the pharmaceutical industry, the American Academy of Pediatrics, the American Medical Association, thousands of medical doctors, and virtually all daycare centers and schools in the country.

Isn’t it odd that there is absolutely no forum for thoughtful, methodical, respectful engagement designed to raise the issues, hear the concerns, and advance the discussion?

None.

And isn’t it odd that this acrimonious thing just won’t go away?

As a result, the topic is experienced by most people as random, chaotic, confusing, and above all, unsafe for general conversation. It’s a hodgepodge of medical protocol, old science, new science, history, media headlines, conventional wisdom, individual stories, angry accusations, fear, psychology, habit, wishful thinking, and a deep, abiding desire to carve out some certainty in an uncertain world:

Vaccine injury is exceedingly rare.

There’s been a three-fold increase in vaccine doses since 1989.

It’s genetic. Most people are vaccinated and nothing happens.

The mercury-based vaccine preservative, thimerosal, is neurotoxic.

We need herd immunity or we’ll be overrun with diseases.

There’s a chronic enterocolitis that may be related to neurodevelopmental impairment that appears after administration of the combination MMR vaccine in some children.

Some children can’t be vaccinated.

Injection of aluminum adjuvants can overcome genetic resistance to autoimmunity.

Children are a vector for disease.

There’s a risk of DNA insertion via human diploid cells in MMR, chickenpox, and Hep A vaccines.

Everyone must be vaccinated because the vaccines don’t always work.

The autism changepoint year occurred around the time of the neonatal (day-of-birth) hepatitis B shot.

It’s like mandating seat belts and bike helmets, for the greater good.

Did you know that there are GMOs in vaccines?  

And we’re just warming up. Is it any wonder that the vast majority of people tune it out? Have you ever wondered if this is by design?

The vaccine minefield is really about the age-old battle that our founding fathers understood all too well.

Vigilance against the expanding scope of power

We’re talking about authoritarianism and privacy and hidden agendas of powerful players whose interests are not aligned with ours.

Have you thought about Edward Snowden lately? From the Snowden movie:

CIA bigwig: Most Americans don’t want freedom. They want security. It’s a simple bargain… you pay the price of admission… Where’s the modern battlefield, soldier? [Everywhere.] What’s the first rule of battle? [Never reveal your position.] And if one unauthorized person knew? [If Congress knows, so would the enemy.] That, Mr. Snowden, is the state of the world. Secrecy is security. And security is victory.

Snowden: The people being able to question the government and hold it accountable, that’s the principle the United States of America was founded on… And when those in power try to hide by classifying everything, we will call them out on it. And when they try to scare us into sacrificing our basic human rights, we won’t be intimidated and we won’t give up. We will not be silenced.

There’s a reason that the Constitution and the Bill of Rights were written foremost as a call for vigilance against the expanding scope of government power and to protect individual rights.

Do we want government taking away our basic rights and messing in our personal and family matters? Should the state be allowed to judge our religious beliefs, constrain our exercise of conscience, and evaluate and override our parenting? Will we be so easily cowed and distracted, and give away the farm?

Stem cells – hope or hype?


Stem cell technology offers the promise of curing the incurable – but for the moment lives are being lost while the issue is mired in controversy.

After 21 years of unsuccessful heart treatments, including several heart procedures, 68-year-old Coenie de Jongh was desperate. So when his cardiologist suggested a last-resort experimental therapy, it represented a lifeline.

Coenie, from Bloubergstrand near Cape Town, had his first heart attack at the young age of 40. A bypass operation followed and his condition improved, but seven years later Coenie’s health started deteriorating again. More operations and more intense treatment followed, but in 2002 his health took a real turn for the worse.

His condition was so bad he struggled to find a cardiologist who was willing to perform another bypass operation. The procedure was eventually done, but it wasn’t as successful as they’d hoped.

At that stage, Dr Andre Saaiman from Kuils River Hospital was conducting research involving the use of stem cells*. He was inspired by the work done by Prof Philippe Menasche from France, who had figured out a way to inject stem cells derived from skeletal muscle into failing hearts.

After getting ethical approval from Stellenbosch University, Dr Saaiman decided to try out the novel therapy on Coenie, who by then was extremely ill and confined to a wheelchair. In December 2004, he called Coenie in and took cells from his upper leg, which he then cultivated in a laboratory. A month later, he injected the cultivated stem cells into 40 areas of Coenie’s failing heart.

The results were little short of miraculous.

In less than two weeks, Coenie’s condition improved dramatically. “He was a different person,” Marlene, Coenie’s wife, recalls.

“Before the operation, he had only 10 percent heart function; afterwards, his heart function shot up to almost 35 percent. It was amazing to see what he could do again. He started walking again, and could lead a relatively normal life.”

Tragically, due to medical complications unrelated to the stem-cell transplant, Coenie passed away on 10 February 2008.

Even though stem-cell transplants are still experimental and research into this field is in its infancy, for Marlene and Coenie this procedure was a miracle.


Coenie de Jongh, here with his wife Marlene and grandchildren,
had experimental stem cell therapy that repaired his ailing heart.

Medical miracle – and controversy
Stem cells are one of the most exciting advances to have happened in medicine in the last few decades. Researchers are inspired by the prospect of curing the incurable, and many positive results are already being seen.

The use of stem cells, particularly those of the embryonic type, is, however, mired in controversy, thanks largely to the position adopted by conservative political and religious groupings. Former US President George W Bush firmly opposed stem-cell research during his term, arguing that working with cells ‘harvested’ from human embryos is tantamount to taking life.

This has had two spinoffs: the first is that the presidential vetoing of a number of stem-cell research bills has led to severe limitation on funds for the creation of new embryonic stem cell lines in the US. This, in turn, has greatly hampered the international research process.

The second issue is that the row has led to a global situation in which the potential use of stem cells is shrouded in excited confusion. This is alarming: even using stem cells in the current limited way, it’s calculated that one in every 200 people who reach the age of 70 will, at some point, develop a disease that could benefit from stem-cell transplantation. In other words, the concern about the ethics of stem cell technology could result in thousands upon thousands of unnecessary illnesses and deaths.

But while the debate is heated in the northern hemisphere, things are quiet at the southernmost tip of Africa – particularly with regard to research around the use of embryonic stem cells. According to Prof Michael Pepper, Extraordinary Professor in Immunology at the University of Pretoria’s Faculty of Health Sciences, no basic research of note is currently being conducted here.

So what can be used?
Stuck in the middle of the international controversy are thousands of patients, many of whom anxiously await life-saving treatment.

While adult stem cells have been used for several decades in the treatment of disease – also in South Africa – the problem is that these cells aren’t as flexible as embryonic stem cells. They have fewer applications in the treatment of disease and they’re restricted to very specific tissues.

To compound the frustration, the use of adult stem cells is also quite limited. These cells have many important and wonderful applications (such as the way in which the technology was used to heal Coenie’s heart), but these are either in a legitimate experimental stage or are regarded as unethical, and aren’t accepted by the medical community as a routine form of therapy.

The South African government is in the process of producing regulations on stem cells, currently in draft form. “In the absence of regulations, doctors don’t have any local guidance at this stage, and have to rely on international standards and codes of practice,” Prof Pepper says.

While bone-marrow transplants are covered by the National Health Act, Section Eight – the part of the legislation that deals with human tissue – hasn’t been promulgated. In August 2009, the Financial Mail reported that, in its absence, researchers have to fall back on the Human Tissue Act of 1983. “This was published when many of the complex issues that require rules and guidelines were not yet part of the scientific landscape,” Razina Munshi writes.

“This also means that we don’t have a legal framework in which to work,” Pepper adds.

At this stage, adult stem cells are used in bone-marrow transplants only. This is applied in the treatment of several diseases, but mainly in the treatment of cancer. These cells make it possible for patients to receive very high doses of chemotherapy and/or radiation therapy.

The way forward
Most experts agree that stem-cell technology holds enormous potential. We’re experiencing the benefits already. “The current reality is that close to 100 diseases can already be treated with bone-marrow transplants. Unfortunately, limited funds mean that it’s hugely underutilised,” Prof Pepper says.

A solution seems to be coming out of the alternative ways scientists are slowly finding to obtain embryonic cells. This could mean they might be able to circumvent any ethically controversial issues in future, paving the way to more research and, hopefully, more stem-cell-related treatment options.

In 2007, Japanese researchers managed to coax human and mouse skin cells into stem cells that are identical to those found in embryos – a discovery that has been hailed as a major breakthrough. These results have also been replicated by scientists elsewhere. So, the future is looking bright.

Prof Pepper believes that the current excitement centred on curing a myriad of conditions is most certainly justifiable. Several potential uses of both adult and embryonic stem cells are currently being investigated, but are not yet a reality in a clinical sense – but he has no doubt that these applications will in the future become part of standard medical practice.

* Stem cells serve as a sort of repair system for the body – they are ‘immortal’ cells that can produce all the different cells in the body. Theoretically, they can divide and continue to divide, replenishing other damaged cells in your body for as long as you live. It’s hoped that scientists will one day succeed in replacing damaged genes or add new genes to stem cells in order to give them characteristics that can ultimately treat disease, according to the US National Institutes of Health.

Source:health24.com

Big Questions Around Facebook’s Suicide-Prevention Tools


Mental health researchers wonder if the social network’s intervention techniques will be effective.


Facebook Live will offer help if a viewer reports that a broadcaster seems to show suicidal or self-harming behavior.
It’s been almost a year since the general rollout of Facebook Live, which lets you broadcast live video to followers, and in that time several people have killed themselves while sharing video of themselves—including a 14-year-old Florida girl who hanged herself in a bathroom in a foster home in January.

Facebook wants to avoid these tragedies, and on Wednesday it rolled out a handful of tools that it thinks may help. These include allowing viewers to report friends broadcasting via Facebook Live who appear to be veering toward self-injury or a suicide attempt; the broadcaster will then see a message—while still shooting the live video—that offers resources such as the opportunity to contact a help line or talk with a friend. These are the same kinds of tools Facebook already offers to users when a friend on the site reports one of their status updates for similar concerns.

Can such an intervention be helpful, though? Joe Franklin, an assistant professor at Florida State University who runs the school’s Technology and Psychopathology Lab, says it’s a move in the right direction, but there’s no great scientific evidence that such things are particularly helpful.

“I don’t think it’s a bad thing and I think we should study it,” he says. “But I would immediately have questions—I would not assume it would be effective.”

Willa Casstevens, an associate professor at North Carolina State University whose work includes studying suicide prevention, is hopeful that such intervention might be positively received by younger people in particular, since they’re used to interacting via social media.

“In the moment, a caring hand reached out can move mountains and work miracles,” she says. “The question would then be if they would still be in a position to take advantage of it.”

Facebook also said Wednesday that it’s testing the use of pattern recognition to figure out when a post may contain suicidal thoughts. A flagged post can then be reviewed by the site’s community operations team, which can decide whether to reach out to the person who wrote it.

Franklin, whose research includes studying how machine learning can mine health records to determine a person’s risk of attempting suicide, sees this kind of method as the future of spotting suicidal behavior, particularly because it is so easy to scale (and, he thinks, can be more accurate than reports from other people). But in his work he’s found that people often use words like “suicide” or phrases like “kill myself” colloquially, and it’s hard for algorithms to do a good job of distinguishing that from situations where someone really means it.

Still, he says, “it’s a great step forward in terms of trying to identify people who are thinking about or considering suicide.”

Source:www.technologyreview.com

 

Scientists Have Recently Advised Women to Stop Wearing Bras. This is Why…


The birth of feminism in the late 1960s and early 1970s featured young women burning bras as a counterpoint to young men burning their draft cards. They were considered statements of feminine independence. Now there’s discussion regarding the medical merits of those demonstrations.

Controversial findings have been made that at least associate excessive bra wear with non-malignant breast fibrocystic disease as well as malignant breast cancer. Some assert there is a definite link.

Ironically, it was an American woman who invented the bra around the turn of the 20th Century. Until then, corsets were what gave women the desired hourglass figure, inadvertently pushing up the bust line for the fashionable clothing of the time.

Problem was, corsets messed with internal organs while shaping those hourglass figures, and their tightness resulted in women fainting easily and often.

The Birth of the Bra


In 1893, Marie Tucek made a “breast supporter” that looked like a modern brassiere. But a little later Mary Phelps Jacobs designed a better version and called it a brassiere. She patented it and sold the patent to a company named Warner Brothers Corset Company in Bridgeport, Connecticut for $1,500. It caught on.

By the 1950s, teenage girls were urged to buy and use training bras to hold their breasts firmly in a desirable way and prevent sagging. But even the brassiere industry admits that the only time bras prevent sagging is while they are worn.

It’s been observed that using artificial breast support long enough will cause the breasts’ cup-shaped suspensory Cooper’s ligaments to atrophy, allowing the breasts to sag over time anyway. Exercises that strengthen the pectoral muscles can be helpful.

A one-piece sports bra is recommended for exercising. Some women use one-piece sports bras as a healthier alternative to regular bras when not exercising.

Bras and Breast Health Consequences

The connection between wearing bras and painful, bothersome non-malignant breast fibrocystic disease, as well as malignant breast cancer, was hardly mentioned until the book Dressed to Kill by researchers Sydney Ross Singer and Soma Grismaijer came out in 1995.

They surveyed 5,000 women and found that those who wore bras for 12 hours or more had a greatly increased breast cancer risk compared with women who wore bras less.

Dr. Gregory Heigh of Florida has discovered that over 90% of women with fibrocystic breast disease find improvement when they stop wearing their brassieres. There are case testimonies (sources below) from women with fibrocystic breast disorders who realized this when they stopped or at least lessened brassiere use.

The connection between breast tumours, non-malignant or malignant, and bras has merit when considering the lymph drainage issues from wearing bras too often. The lymph system, which includes lymph nodes in the breasts, requires body movement to pump accumulated toxic waste materials out of the lymph nodes. That’s what bouncing on a rebounder is about.

Not only do bras inhibit breast movement, preventing proper lymph node drainage, but the tight enclosure of a bra also constricts the breasts and restricts the flow of lymph material.

There was a study that attempted to debunk the link of excess bra wearing to breast fibrocystic disease and breast cancer.

Sources:
Case histories of fibrocystic relief upon ditching or easing bra use:
http://all-natural.com…
http://all-natural.com…
http://www.reocities.com…
http://www.breastnotes.com…
http://www.breastnotes.com...

realfarmacy.com

Butter unlikely to harm health, but margarine could be deadly


Butter is not likely to kill you 

Saturated fat found in butter, meat or cream is unlikely to kill you, but margarine just might, new research suggests.

Although dieticians have traditionally advised people to cut down on animal fats, the biggest ever study has shown that saturated fat does not increase the risk of stroke, heart disease or diabetes.

However trans fats, found in processed foods such as margarine, raise the risk of death by 34 per cent.

“For years everyone has been advised to cut out fats,” said study lead author Dr Russell de Souza, an assistant professor in the Department of Clinical Epidemiology and Biostatistics at McMaster University in Canada.

“Trans fats have no health benefits and pose a significant risk for heart disease, but the case for saturated fat is less clear.

“That said, we aren’t advocating an increase of the allowance for saturated fats in dietary guidelines, as we don’t see evidence that higher limits would be specifically beneficial to health.”

Saturated fats come mainly from animal products, such as butter, cows’ milk, meat, salmon and egg yolks, and some plant products such as chocolate and palm oils.

In contrast, trans unsaturated fats – or trans fats – are mainly produced industrially from plant oils for use in margarine, snack foods and packaged baked goods.

Guidelines currently recommend that saturated fats be limited to less than 10 per cent of energy intake, and trans fats to less than one per cent, to reduce the risk of heart disease and stroke.
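To make those percentage limits concrete, they can be converted into grams of fat per day. The sketch below is illustrative only, not dietary advice: it assumes a 2,000 kcal reference diet and the standard conversion factor of roughly 9 kcal per gram of fat.

```python
# Convert guideline limits expressed as a share of daily energy into grams
# of fat per day. Assumptions (not from the article): a 2,000 kcal
# reference diet and ~9 kcal per gram of fat, the standard conversion.

KCAL_PER_GRAM_FAT = 9

def fat_limit_grams(daily_kcal: float, percent_of_energy: float) -> float:
    """Grams of fat corresponding to a given share of daily energy."""
    return daily_kcal * (percent_of_energy / 100) / KCAL_PER_GRAM_FAT

sat_fat_g = fat_limit_grams(2000, 10)   # saturated fat: <10% of energy
trans_fat_g = fat_limit_grams(2000, 1)  # trans fat: <1% of energy

print(round(sat_fat_g, 1))    # about 22.2 g saturated fat per day
print(round(trans_fat_g, 1))  # about 2.2 g trans fat per day
```

On those assumptions, the 10 per cent ceiling works out to roughly 22 g of saturated fat a day, and the one per cent ceiling to roughly 2 g of trans fat.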

However, the new research, which looked at 50 studies involving more than one million people, found no evidence that saturated fat was bad for health.

It backs up recent research from the University of Cambridge that found saturated fat in dairy foods might protect against diabetes.

Cheese and other saturated fats are unlikely to be harmful to health   

Last year leading heart scientist Dr James DiNicolantonio of Ithaca College, New York, called for health guidelines on saturated fats to be changed in an article in the British Medical Journal.

The “vilification” of saturated fats dates back to the 1950s when research suggested a link between high dietary saturated fat intake and deaths from heart disease.

But the study’s author drew his conclusions from data on six countries, choosing to ignore the data from a further 16 that did not fit his hypothesis; a subsequent analysis of all 22 countries’ data failed to support his conclusion.

Nevertheless the research stuck and since the 1970s most public health organisations have advised people to cut down on fat.

However the new research found no clear association between higher intake of saturated fats and death for any reason, coronary heart disease, cardiovascular disease, ischemic stroke or type 2 diabetes.

In contrast, consumption of industrial trans fats was associated with a 34 per cent increase in death, a 28 per cent increased risk of death from coronary heart disease, and a 21 per cent increase in the risk of cardiovascular disease.

Despite the research British health experts cautioned against changing to a diet which was high in saturated fat.

Prof Tom Sanders, Emeritus Professor of Nutrition and Dietetics, King’s College London, said: “It would be foolish to interpret these findings to suggest that it is OK to eat lots of fatty meat, lashings of cream and oodles of butter.

“Death rates from CVD have fallen in the UK by about 55 per cent since 1997, despite the rise in obesity, for reasons that remain uncertain, but this may in part be due to changes in the food supply, particularly fewer trans fats and more omega-3 fatty acids.”

 

Victoria Taylor, Senior Dietitian at the British Heart Foundation, added: “While saturated fats were not robustly associated with total mortality or deaths from CHD, this does not mean we should all go back to eating butter – the studies that this review is based on can’t show cause and effect.

“Rather, it highlights how difficult it is to understand the true relationship between diet and our health.

“Diets high in saturated fat are linked to raised cholesterol levels, a risk factor for CHD. But when one nutrient is reduced it will be replaced by another and, depending on what this is, it can have positive or negative health consequences.”

Source: British Medical Journal.

People with RH Negative Blood Type Are Aliens.


http://fullyawaremind.com/rh-negative-blood-type/

The ADHD Controversy.


ADHD was already a controversial diagnosis; are Jerome Kagan’s recent criticisms of it warranted?

Is attention deficit hyperactivity disorder (ADHD) a legitimate diagnosis or is it mostly a fraud? The answer has important implications for many individuals and for society. The diagnosis is accepted as legitimate by the psychiatric profession, but continues to have its vehement critics. Recently, noted psychologist Jerome Kagan has been giving tremendous weight to these criticisms by calling ADHD mostly a fraud. There are significant problems with his criticism, however.

What is ADHD?

ADHD was first described in children in 1902, and was understood as an impulse control disorder. It was not formally recognized as a diagnosis, however, until the second edition of the DSM in 1968. The first drug approved to treat ADHD was Benzedrine, in 1936. Ritalin, which is still used to treat the disorder, was approved in 1955.

Here is the official DSM diagnosis:

  • A persistent pattern of inattention and/or hyperactivity-impulsivity that interferes with functioning or development
    • Six or more of the symptoms have persisted for at least six months to a degree that is inconsistent with developmental level and that negatively impacts directly on social and academic/occupational activities. Please note: The symptoms are not solely a manifestation of oppositional behaviour, defiance, hostility, or failure to understand tasks or instructions. For older adolescents and adults (age 17 and older), five or more symptoms are required
  • Several inattentive or hyperactive-impulsive symptoms were present prior to age 12 years
  • Several inattentive or hyperactive-impulsive symptoms are present in two or more settings (e.g. at home, school, or work; with friends or relatives; in other activities)
  • There is clear evidence that the symptoms interfere with, or reduce the quality of, social, academic or occupational functioning
  • The symptoms do not occur exclusively during the course of schizophrenia or another psychotic disorder and are not better explained by another mental disorder (e.g. mood disorder, anxiety disorder, dissociative disorder, personality disorder, substance intoxication or withdrawal)

There are a few aspects of this diagnosis worth pointing out. First, this is what we call a clinical diagnosis: it is based entirely on signs and symptoms, without any objective diagnostic tests. You cannot see ADHD on an MRI scan of the brain, an EEG, or a blood test. This is not unusual in medicine, especially for brain disorders. The same is true, for example, of migraine headaches. It is entirely a clinical diagnosis.

This, by itself, should not call the diagnosis into question. Brain function relies not only on the health of the cells and the absence of identifiable anatomical or gross pathology. It also depends on the pattern of connections among brain cells, the density of their connections, and the details of their biochemistry. We are just starting to be able to image the brain at this level.

As an example, raise someone in a closet for 20 years and I guarantee you they will have a psychological disorder, but you would not be able to tell that from looking at their brain with any tool we currently have.

Because mood, thought, and behavior largely rely on brain function that cannot be imaged, psychiatrists have relied on elaborate schemes of clinical diagnoses to at least have a common language for thinking and talking about mental illness. It is imperfect, and extremely fuzzy around the edges, but it has its utility.

That fuzziness is partly based in the limits of our current technology and understanding. But it is also based in the fact that humans are neurologically heterogeneous and the fact that the brain is an extremely complex system. This means that the same end result (behavior, for example) might result from almost endless permutations of interactions among various systems in the brain and their interaction with the environment.

You can see this in the formal description of ADHD above. There is a sincere attempt to capture a real neurological phenomenon, and to filter out other factors that might contribute to or cause similar symptoms. Signs used to establish the diagnosis cannot be temporary, or isolated to only one environment, or related to other conditions or situations that might provoke them. You need to have many symptoms persistent over a long time without other identifiable causes and to a sufficient degree that they cause demonstrable harm.

There is also an attempt to separate out those who have a real disorder from the typical spectrum of human behavior. This is also a common problem in medicine. Many disorders, like high blood pressure, do not have a sharp demarcation line. The curves for normal blood pressure and hypertension overlap. Experts have to decide where to draw the line, either capturing more people with the disorder but also more people just at the upper range of normal, vs excluding those who are just at the upper range of normal but also then missing more people with the disorder.

Eventually such clinical questions evolve from “Who has the disorder?” to “Who benefits from treatment for the disorder?” That is the real question.

Neuroanatomical Correlates

Despite the fact that ADHD is a fuzzy clinical entity, we have made progress in understanding what is happening in the brain of most people with ADHD. The current consensus is that ADHD is a deficit of executive functions. The frontal lobes carry out many critical functions, some of which are considered executive functions: these include the ability to focus your attention, maintain focus, switch among tasks, filter out distractions, and control impulses. Executive function also includes the ability to weigh the probable outcomes of your behavior and then make high-level decisions about how you will behave.

As an adult neurologist I see patients with executive function disorder frequently, usually from head trauma. Car accidents in particular result in frontal lobe damage as it is common to hit your head against the windshield during many types of accidents. Patients frequently develop the symptoms of ADHD after frontal head trauma. They have poor focus, and poor impulse control. In one dramatic case a patient’s entire personality changed. She lost all ability to control or moderate her behavior (as have others). Often these patients respond favorably to the same stimulants we use to treat ADHD.

When we look at the brains of those who meet the clinical diagnosis of ADHD with our modern imaging techniques, such as fMRI and EEG, we find a similar pattern of brain dysfunction:

Convergent data from neuroimaging, neuropsychology, genetics and neurochemical studies consistently point to the involvement of the frontostriatal network as a likely contributor to the pathophysiology of ADHD. This network involves the lateral prefrontal cortex, the dorsal anterior cingulate cortex, the caudate nucleus and putamen. Moreover, a growing literature demonstrates abnormalities affecting other cortical regions and the cerebellum.

At this point there is no reasonable disagreement about the fact that ADHD is a disorder of brain function. Children who meet the strict diagnostic criteria are demonstrably different, in consistent and predictable ways, than children who do not (controlling for other possible factors). They have impaired executive functions, and we can see this in changes to the relevant parts of the brain. We still have a lot to learn (again, the brain is complex) but a consistent picture is emerging.

Jerome Kagan’s criticism

Jerome Kagan is a preeminent psychologist. This gives his opinions about a psychological topic a great deal of weight. The press loves him because he has a sensational story to tell and he has impeccable credentials. Articles about Kagan often spend an entire paragraph or two touting those credentials.

Unfortunately this is a common mistake that mainstream journalists make when discussing scientific topics. They confuse the expertise of an individual with scientific authority. No individual ever represents the consensus of scientific opinion; they can only represent their own quirky opinions (which may or may not be in line with the consensus).

This is a classic example of this error. Kagan’s opinions do not conform to the current consensus of scientific opinion, but he is presented as an unimpeachable authority. Further, all reporting that I have seen on Kagan’s opinions regarding ADHD fail to put his expertise into a reasonable context. Kagan is a psychologist. He is not a psychiatrist, nor a neuroscientist.

Often related fields covering the same question have different opinions. Geologists and paleontologists disagree about the relative contribution of a meteor impact to the extinction of the dinosaurs at the K-Pg boundary. If a reporter talked only to a geologist they would not capture the true state of the broader scientific opinion.

Many psychologists have opinions about psychiatry that do not reflect the consensus of psychiatric opinion. In essence, even though Kagan has relevant expertise, he is not a clinician, and therefore is an outsider when it comes to the practice of psychiatry. He also does not seem to be up to date on the neuroscience of ADHD.

Yet his recent interview with Spiegel is being widely reported as definitive criticism of the diagnosis and treatment of ADHD. Here are some of the highlights. He says:

Let’s go back 50 years. We have a 7-year-old child who is bored in school and disrupts classes. Back then, he was called lazy. Today, he is said to suffer from ADHD (Attention Deficit Hyperactivity Disorder). That’s why the numbers have soared.

We are familiar with a similar criticism of autism diagnoses. Yes, diagnostic practices have changed. Awareness of the diagnosis has changed. The implication here is that the 1950s diagnosis (a bored child) was better than the current diagnosis of ADHD.

But, if you recall the diagnostic criteria from above, displaying ADHD behavior in school alone is not sufficient to establish the diagnosis. So, Kagan’s example is simply wrong. The child in his example should not be diagnosed with ADHD.

Being generous, he may be implying only that doctors are overdiagnosing ADHD and not following their own diagnostic criteria. This is a real issue, but here is a far more nuanced discussion from an actual clinician:

ADHD is real—it’s not made up. But it exists on a continuum. There’s no marker or white line that says you’re in the “definite” or “highly likely” group. There’s almost unanimous agreement that five or six percent clearly have enough of these symptoms for an ADHD diagnosis. Then there’s the next group, where the diagnosis is more of a judgment call, and for these kids, behavioral therapy might work. And then there’s a third group, on the borderline. These are the ones we’re worried about being pushed into an inaccurate diagnosis.

The real issue is – are schools pushing for more kids in the gray zone to be diagnosed because of funding and regulation issues? Also, there is a real “demarcation problem” with the diagnosis, and we have to carefully consider the risks and benefits of using looser or tighter criteria. These discussions are happening within the profession, and are very evidence-based and nuanced. Kagan’s criticism, by comparison, is shooting from the hip and simplistic. (I will add the caveat that the interview may not reflect the full depth of his opinion, but he is responsible for how he communicates to the public, especially given how widely his opinions have been spread.)

He continues:

SPIEGEL: Experts speak of 5.4 million American children who display the symptoms typical of ADHD. Are you saying that this mental disorder is just an invention?

Kagan: That’s correct; it is an invention. Every child who’s not doing well in school is sent to see a pediatrician, and the pediatrician says: “It’s ADHD; here’s Ritalin.” In fact, 90 percent of these 5.4 million kids don’t have an abnormal dopamine metabolism. The problem is, if a drug is available to doctors, they’ll make the corresponding diagnosis.

That characterization, while you might dismiss it as hyperbole, is irresponsible. “Every” child? Again, this does not meet the official diagnostic criteria for ADHD which requires more than just not doing well in school. His reference to “dopamine metabolism” is just weird. It is true that some studies show some children with ADHD have impaired reward system function. This may be playing a role in some subtypes of ADHD. It is not a core feature of ADHD, however, and the evidence is still very preliminary. Invoking what is essentially a preliminary side point about the neuroanatomical correlates of ADHD as reason to doubt the diagnosis is, to be kind, highly problematic.

Kagan then broadens his criticism to encompass psychiatry in general:

We could get philosophical and ask ourselves: “What does mental illness mean?” If you do interviews with children and adolescents aged 12 to 19, then 40 percent can be categorized as anxious or depressed. But if you take a closer look and ask how many of them are seriously impaired by this, the number shrinks to 8 percent. Describing every child who is depressed or anxious as being mentally ill is ridiculous. Adolescents are anxious, that’s normal. They don’t know what college to go to. Their boyfriend or girlfriend just stood them up. Being sad or anxious is just as much a part of life as anger or sexual frustration.

This is a typical anti-mental-illness statement, and simply a straw man of what psychiatry does.

He is saying that we should not confuse the normal range of behavior with a disorder, as if this is a huge insight. This understanding has already been incorporated into clinical thinking. As I pointed out above – there are great pains taken when defining mental disorders to separate true disorders from the healthy range of human behavior.

Further, being “seriously impaired” is already part of the diagnosis, so what is he talking about?

He goes on to argue that some people are depressed in response to a life event. Right – psychiatrists call this a “reactive depression” because it is already recognized, and not confused with a chronic depression. That is why the diagnosis of clinical depression excludes depression that follows a major trigger, and must continue for greater than six months to be considered a disorder.

From reading the entire interview I am left wondering exactly what Kagan is criticizing. He is certainly not criticizing the standard of care within psychiatry. He seems to be tilting at a straw man of the worst possible malpractice that deviates from that standard. He is raising issues as if they were not already part of a vigorous evidence-based discussion within psychiatry itself.

A kernel of truth

We often take a sharply critical approach to medical science here at SBM. Self-criticism is critical to improvement. That is the essence of science itself: it is designed for error correction through self-criticism.

Our nuanced position is that science basically works, but there is a lot of room for improvement. Enemies of science, however, or those with a specific ideological axe to grind, use the same evidence to argue that the institution of science is fatally flawed and can be comfortably dismissed or ignored.

I find the same is true of much of the public criticism of psychiatry. There is a lot to criticize in the profession (as in medicine in general), and a lot of room for improvement. Some of that is just the current status of the science. We don’t know everything, and yet medicine (including psychiatry) is an applied science. We have to make important decisions with limited information.

There are also many issues of quality control. Medicine is hard, and keeping quality standards high is challenging.

So there are many legitimate criticisms of ADHD and psychiatry, but that does not mean ADHD is a fraud. The scientific evidence, both clinical and neuroscience, is robust. Kagan’s criticisms are mostly greatly exaggerated, or they are straw men because they are already incorporated into the standard of care.

Unfortunately, you will not be exposed to any of that from reading any of the popular press breathlessly reporting that ADHD is a fraud.

Source: https://sciencebasedmedicine.org

Renowned Harvard Psychologist Says ADHD Is Largely A Fraud



http://curiousmindmagazine.com/harvard-psychologist-says-adhd-largely-fraud/

 

How Pasteurized Dairy Destroys Your Bones From the Inside


Milk is the only beverage still aggressively pushed on children as a health promoting food when it is the exact opposite – a disease promoting food. Drinking pasteurized milk is not nearly as good for general health or bones as the dairy industry has made it out to be. In fact, this fairy tale of “milk doing a body good” is being exposed more frequently by many independent scientists and researchers who have had just about enough of the propaganda.


According to a large scale study of thousands of Swedish people, cow’s milk has a deteriorating effect on health when consumed in the long-term. The research was published in The British Medical Journal (BMJ).

The study, which tracked 61,433 women aged 39 to 74 over 20 years, and 45,339 men of similar age for 11 years, found that the more cow’s milk people drank, the more likely they were to die or experience a bone fracture during the study period.

The risks were especially pronounced for women, a group advised to drink milk to help avoid bone fractures that result from osteoporosis.

Women who said they drank three or more glasses of milk a day had almost double the chance of dying during the study period compared with those who reported drinking only one. A glass is defined as a 200 milliliter serving. They also had a 16 percent higher chance of getting a bone fracture anywhere in the body.

Why Does Milk Cause Osteoporosis and Bone Fractures?

The dairy industry has been hard at work for the last 50 years convincing people that pasteurized dairy products such as milk or cheese increase bioavailable calcium levels. This is totally false. The pasteurization process only creates calcium carbonate, which has absolutely no way of entering the cells without a chelating agent. So the body pulls calcium from the bones and other tissues in order to buffer the calcium carbonate in the blood. This process actually causes osteoporosis.

Pasteurized dairy contains too little magnesium to absorb the calcium at the proper ratio. Most would agree that the calcium-to-magnesium ratio should be at most 2 to 1, and preferably 1 to 1. So milk, at a Cal/Mag ratio of 10 to 1, has a problem. You may put 1,200 mg of dairy calcium in your mouth, but you will be lucky to actually absorb a third of it into your system.
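The ratio arithmetic in that paragraph can be made explicit. The sketch below simply restates the article’s own figures – the 10:1 Cal/Mag ratio it claims for milk, the 2:1 and 1:1 target ratios, and the 1,200 mg example – as a calculation; the `magnesium_needed` helper is introduced here purely for illustration.

```python
# Restate the article's calcium/magnesium ratio arithmetic. All figures
# (10:1 for milk, the 2:1 and 1:1 targets, the 1200 mg example, and the
# "about a third absorbed" estimate) come from the paragraph above.

def magnesium_needed(calcium_mg: float, cal_mag_ratio: float) -> float:
    """Magnesium (mg) required to accompany calcium at a given Cal/Mag ratio."""
    return calcium_mg / cal_mag_ratio

calcium_from_dairy = 1200  # mg, the example used in the text

print(magnesium_needed(calcium_from_dairy, 10))  # 120.0 mg at milk's claimed 10:1
print(magnesium_needed(calcium_from_dairy, 2))   # 600.0 mg for the 2:1 minimum
print(magnesium_needed(calcium_from_dairy, 1))   # 1200.0 mg for the preferred 1:1

absorbed_mg = calcium_from_dairy / 3  # the article's "a third" estimate
print(round(absorbed_mg))             # 400 mg absorbed, on that claim
```

On the article’s own numbers, milk would supply only 120 mg of magnesium alongside 1,200 mg of calcium, a fifth of the 600 mg that a 2:1 ratio would call for.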

Over 99% of the body’s calcium is in the skeleton, where it provides mechanical rigidity. Pasteurized dairy forces a calcium intake lower than normal and the skeleton is used as a reserve to meet needs. Long-term use of skeletal calcium to meet these needs leads to osteoporosis.

Dairy is pushed on Americans from birth, yet they have one of the highest rates of osteoporosis in the world. In fact, people from the USA, Canada, Norway, Sweden, Australia and New Zealand have the highest rates of osteoporosis.

The test for pasteurization is called the negative alpha-phosphatase test. When milk has been heated to 165 degrees Fahrenheit (higher for UHT milk) and pasteurization is complete, the enzyme phosphatase is 100 percent destroyed. Guess what? This is the enzyme that is critical for the absorption of minerals, including calcium! Phosphatase is the third most abundant enzyme in raw milk, and those who drink raw milk enjoy increased bone density. Several studies have documented greater bone density and longer bones in animals and humans consuming raw milk compared to pasteurized.

The message that estrogen builds fracture-resistant bones (prevents osteoporosis) has been hammered into women’s minds over the past 4 decades by the pharmaceutical industry, selling HRT formulas, such as Premarin and Prempro. Food also raises estrogen levels in a person’s body–and dairy foods account for about 60 to 70% of the estrogen that comes from food. The main source of this estrogen is the modern factory farming practice of continuously milking cows throughout pregnancy. As gestation progresses the estrogen content of milk increases from 15 pg/ml to 1000 pg/ml.

The National Dairy Council would like you to believe, “There is no evidence that protein-rich foods such as dairy foods adversely impact calcium balance or bone health.” But these same dairy people know this is untrue and they state elsewhere, “Excess dietary protein, particularly purified proteins, increases urinary calcium excretion. This calcium loss could potentially cause negative calcium balance, leading to bone loss and osteoporosis. These effects have been attributed to an increased endogenous acid load created by the metabolism of protein, which requires neutralization by alkaline salts of calcium from bone.”

The More Milk You Drink, The More Inflammatory Molecules

The most likely explanation for the negative health effects of milk is the damaging inflammation caused by galactose, a breakdown product of lactose, the main sugar in milk. In a separate group of people, the team found that the more milk people drank, the more inflammatory molecules were present in their urine.

What’s more, women who reported eating a lot of cheese and yogurt had a lower chance of fracturing a bone or dying during the study than women who ate low amounts of the dairy products. This supports the inflammation hypothesis because yogurt and cheese contain much less lactose and galactose than milk.

Cancer Fuel

Of the almost 60 hormones in milk, one is a powerful GROWTH hormone called Insulin-like Growth Factor One (IGF-1). By a freak of nature it is identical in cows and humans.

The foods you eat can influence how much IGF-I circulates in the blood. Diets higher in overall calories or in animal proteins tend to boost IGF-I, and there seems to be an especially worrisome role played by milk.

Consider this hormone to be a “fuel cell” for any cancer… (the medical world says IGF-1 is a key factor in the rapid growth and proliferation of breast, prostate and colon cancers, and we suspect that most likely it will be found to promote ALL cancers). IGF-1 is a normal part of ALL milk… the newborn is SUPPOSED to grow quickly! What makes the 50% of obese American consumers think they need MORE growth? Consumers don’t think anything about it because they do not have a clue to the problem… nor do most of our doctors.

Studies funded by the dairy industry show a 10% increase in IGF-1 levels in adolescent girls from one pint daily and the same 10% increase for postmenopausal women from 3 servings per day of nonfat milk or 1% milk.

IGF-1 promotes undesirable growth too–like cancer growth and accelerated aging. IGF-1 is one of the most powerful promoters of cancer growth ever discovered. Overstimulation of growth by IGF-1 leads to premature aging too–and reducing IGF-1 levels is “anti-aging.”

A review published by the World Cancer Research Fund and the American Institute for Cancer Research in 1997 found that cancer risk paralleled milk consumption in numerous studies.

Pasteurization Masks Low-Quality Milk and Destroys Nutrients and Enzymes

Why do humans still drink milk? Because they think it’s safe due to pasteurization. However, heat destroys a great number of bacteria in milk and thus conceals the evidence of dirt, pus and dirty dairy practices. It’s cheaper to produce dirty milk and kill the bacteria by heat than to maintain a clean dairy and keep cows healthy. To combat the increase in pathogens, milk goes through ‘clarification’, ‘filtering’, ‘bactofugation’ and two ‘deaeration’ treatments. Each of these treatments uses heat ranging from 100 to 175 degrees Fahrenheit. Dairies count on these many heat treatments to mask their inferior sanitary conditions: milk filled with pus, manure and debris. Consumer Reports found that 44% of 125 pasteurized milk samples contained as many as 2,200 organisms per cubic centimeter (fecal bacteria, coliforms).

Pasteurization also destroys vitamin C, and damages water soluble B vitamins diminishing the nutrient value of milk. Calcium and other minerals are made unavailable by pasteurization. The Maillard reaction, a chemical reaction between proteins and sugars, occurs at higher heats and causes browning, discoloring the milk.

Milk enzymes, proteins, antibodies as well as beneficial hormones are killed by pasteurization resulting in devitalized ‘lifeless’ milk. Milk enzymes help digest lactose and both enzymes and milk proteins help to absorb vitamins. Protective enzymes in milk are inactivated, making it more susceptible to spoilage.

Overall, pasteurized milk is not a beverage that can be recommended to either maintain or advance health. It has no significant nutritional value and there is a far greater risk in consuming it than not. There are also plenty of alternatives, including coconut milk, nut milks (e.g. almond, cashew) and hemp milk, which far exceed conventional cow’s milk in terms of nutrition and health-promoting properties.

Natasha Longo has a master’s degree in nutrition and is a certified fitness and nutritional counselor. She has consulted on public health policy and procurement in Canada, Australia, Spain, Ireland, England and Germany.


Sources:
bmj.com
aicr.org
pcrm.org
ncbi.nlm.nih.gov
drmcdougall.com
pgtv.ca