Haemophilia and the Contaminated Blood Scandal. 


Following disclosure on Panorama last week of his intention to take legal action against the Department of Health, Jason Evans has been joined by more than 120 people who wish to be part of the proposed group action. They also join him in his call for a full Public Inquiry into the scandal.

Speaking today, Jason Evans said “It is a scandal that, whilst it was at David Cameron’s direction that the Leveson Inquiry was put in place to examine the culture, practice and ethics of the press, now, in the face of overwhelming evidence presented to Parliament by Andy Burnham last month, Theresa May has still not seen fit to order an Inquiry into the culture, practice and ethics of the Department of Health in dealing with this human tragedy, in which many thousands of people, including hundreds of children, were infected, many fatally, with HIV and hepatitis C through factor concentrates.”

Des Collins of Collins Solicitors said “The public response has been unprecedented. It is widely thought that the campaign for a Public Inquiry, already supported by Andy Burnham, Dr David Owen, Bianca Jagger and others, will shortly receive the support of all the main opposition parties when manifestos are published over the next few days. This is a wrong which must now be put right. It is essential that lessons are learned from this tragedy so that such disastrous mistakes are not repeated in the future.”

Des Collins
Senior Partner
Collins Solicitors
T) 01923 223 324
dcollins@collinslaw.co.uk

Danielle Holliday
Partner
Collins Solicitors
T) 01923 223 324
dholliday@collinslaw.co.uk

Noncommunicable diseases


Key facts

  • Noncommunicable diseases (NCDs) kill 40 million people each year, equivalent to 70% of all deaths globally.
  • Each year, 15 million people die from an NCD between the ages of 30 and 69 years; over 80% of these “premature” deaths occur in low- and middle-income countries.
  • Cardiovascular diseases account for most NCD deaths, or 17.7 million people annually, followed by cancers (8.8 million), respiratory diseases (3.9 million), and diabetes (1.6 million).
  • These 4 groups of diseases account for over 80% of all premature NCD deaths.
  • Tobacco use, physical inactivity, the harmful use of alcohol and unhealthy diets all increase the risk of dying from an NCD.
  • Detection, screening and treatment of NCDs, as well as palliative care, are key components of the response to NCDs.

Overview

Noncommunicable diseases (NCDs), also known as chronic diseases, tend to be of long duration and are the result of a combination of genetic, physiological, environmental and behavioural factors.

The main types of NCDs are cardiovascular diseases (like heart attacks and stroke), cancers, chronic respiratory diseases (such as chronic obstructive pulmonary disease and asthma) and diabetes.

NCDs disproportionately affect people in low- and middle-income countries where more than three quarters of global NCD deaths – 31 million – occur.

Who is at risk of such diseases?

People of all age groups, regions and countries are affected by NCDs. These conditions are often associated with older age groups, but evidence shows that 15 million of all deaths attributed to NCDs occur between the ages of 30 and 69 years. Of these “premature” deaths, over 80% are estimated to occur in low- and middle-income countries. Children, adults and the elderly are all vulnerable to the risk factors contributing to NCDs, whether from unhealthy diets, physical inactivity, exposure to tobacco smoke or the harmful use of alcohol.

These diseases are driven by forces that include rapid unplanned urbanization, globalization of unhealthy lifestyles and population ageing. Unhealthy diets and a lack of physical activity may show up in people as raised blood pressure, increased blood glucose, elevated blood lipids and obesity. These are known as metabolic risk factors, and they can lead to cardiovascular disease, the leading NCD in terms of premature deaths.

Risk factors

Modifiable behavioural risk factors

Modifiable behaviours, such as tobacco use, physical inactivity, unhealthy diet and the harmful use of alcohol, all increase the risk of NCDs.

  • Tobacco accounts for 7.2 million deaths every year (including from the effects of exposure to second-hand smoke), a toll projected to rise markedly over the coming years. (1)
  • 4.1 million annual deaths have been attributed to excess salt/sodium intake. (1)
  • More than half of the 3.3 million annual deaths attributable to alcohol use are from NCDs, including cancer. (2)
  • 1.6 million deaths annually can be attributed to insufficient physical activity. (1)

Metabolic risk factors

Metabolic risk factors contribute to four key metabolic changes that increase the risk of NCDs:

  • raised blood pressure
  • overweight/obesity
  • hyperglycemia (high blood glucose levels) and
  • hyperlipidemia (high levels of fat in the blood).

In terms of attributable deaths, the leading metabolic risk factor globally is elevated blood pressure (to which 19% of global deaths are attributed), (1) followed by overweight and obesity and raised blood glucose.

What are the socioeconomic impacts of NCDs?

NCDs threaten progress towards the 2030 Agenda for Sustainable Development, which includes a target of reducing premature deaths from NCDs by one-third by 2030.

Poverty is closely linked with NCDs. The rapid rise in NCDs is predicted to impede poverty reduction initiatives in low-income countries, particularly by increasing household costs associated with health care. Vulnerable and socially disadvantaged people get sicker and die sooner than people of higher social positions, especially because they are at greater risk of being exposed to harmful products, such as tobacco, or unhealthy dietary practices, and have limited access to health services.

In low-resource settings, health-care costs for NCDs quickly drain household resources. The exorbitant costs of NCDs, including often lengthy and expensive treatment and loss of breadwinners, force millions of people into poverty annually and stifle development.

Prevention and control of NCDs

An important way to control NCDs is to focus on reducing the risk factors associated with these diseases. Low-cost solutions exist for governments and other stakeholders to reduce the common modifiable risk factors. Monitoring progress and trends of NCDs and their risk factors is important for guiding policy and priorities.

To lessen the impact of NCDs on individuals and society, a comprehensive approach is needed requiring all sectors, including health, finance, transport, education, agriculture, planning and others, to collaborate to reduce the risks associated with NCDs, and promote interventions to prevent and control them.

Investing in better management of NCDs is critical. Management of NCDs includes detecting, screening and treating these diseases, and providing access to palliative care for people in need. High impact essential NCD interventions can be delivered through a primary health care approach to strengthen early detection and timely treatment. Evidence shows such interventions are excellent economic investments because, if provided early to patients, they can reduce the need for more expensive treatment.

Countries with inadequate health insurance coverage are unlikely to provide universal access to essential NCD interventions. NCD management interventions are essential for achieving the global target of a 25% relative reduction in the risk of premature mortality from NCDs by 2025, and the SDG target of a one-third reduction in premature deaths from NCDs by 2030.

WHO response

WHO’s leadership and coordination role

The 2030 Agenda for Sustainable Development recognizes NCDs as a major challenge for sustainable development. As part of the Agenda, Heads of State and Government committed to develop ambitious national responses, by 2030, to reduce by one-third premature mortality from NCDs through prevention and treatment (SDG target 3.4). This target comes from the High-level Meetings of the UN General Assembly on NCDs in 2011 and 2014, which reaffirmed WHO’s leadership and coordination role in promoting and monitoring global action against NCDs. The UN General Assembly will convene a third High-level Meeting on NCDs in 2018 to review progress and forge consensus on the road ahead covering the period 2018-2030.

To support countries in their national efforts, WHO developed a Global action plan for the prevention and control of NCDs 2013-2020, which includes nine global targets that have the greatest impact on global NCD mortality. These targets address prevention and management of NCDs.


References

(1) GBD 2015 Risk Factors Collaborators. Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990–2015: a systematic analysis for the Global Burden of Disease Study 2015. Lancet, 2016; 388(10053):1659-1724

 

European study finds raw milk boosts immunity, prevents colds and infections. 


The U.S. Food and Drug Administration (FDA) claims that it’s basically a death sentence for you and your children. But raw milk consumption, according to a new study published in The Journal of Allergy and Clinical Immunology, can actually help prevent colds, viruses and respiratory tract infections (RTIs) from developing in kids, as opposed to commercially processed milk, which provides little or no such health benefit.

Raw milk

A cohort of researchers, doctors and other medical professionals from across Europe investigated the effects of raw milk versus boiled farm-fresh milk and commercial processed milk as part of a larger investigatory project known as “PASTURE.” A group of women, roughly half of whom lived and worked on livestock farms in rural areas of mostly central Europe, were recruited to participate in the research.

All of the women were in their third trimesters of pregnancy at the time of the study, and detailed consumption and lifestyle patterns, including milk-drinking habits, were carefully evaluated and compared. In total, 983 children were included in the final data set, which revealed that milk in its pure, raw, unadulterated form is superior in terms of immune-boosting nutrition.

According to the research, raw milk works a lot like breast milk in providing protective, anti-infective health benefits to children. Compared to highly processed commercial milk, raw milk was found to help lower C-reactive protein levels, which are directly associated with inflammation. Raw milk, in other words, works against inflammation, while processed milk may help promote it due to its altered proteins.

“The main finding of this analysis was an inverse association between consumption of unprocessed cow’s milk and rhinitis [cold or runny nose], RTI [respiratory tract infections], and otitis [ear infection],” wrote the authors. “The effect was strongest when cow’s milk was consumed raw; boiled farm milk exhibited an attenuated effect.”

Does ultra-heat-treated commercial milk promote respiratory and other health problems?

Conversely, consumption of ultra-heat-treated commercial milk, the most widely available milk product on the market, was not found to decrease levels of C-reactive protein, a marker strongly associated with inflammation and disease. With fevers specifically, commercial milk was found to actually increase their prevalence compared to raw milk.

Raw milk, on the other hand, is associated with a roughly 30 percent decrease in respiratory infections and fever, and could help babies and young children overcome these common ailments. Even minimally processed milk boiled directly on the farms was found to be beneficial, though much less so than true raw milk.

“[W]e are now not talking about asthma and allergies, but fever and infections in young children,” stated Dr. Ton Baars, a professor and senior scientist for milk quality and animal welfare at the Research Institute of Organic Agriculture in Germany, and one of the lead authors of the study. “It means there is additional new evidence that raw milk is a protective agent in infectious diseases in young children.”

Unlike in the U.S. where irrational superstition and paranoia have landed raw milk in the “dangerous” category, Europe is already widely accepting of raw milk, and increasingly so. In many countries, raw milk vending machines are prevalent on busy city streets, providing quick and easy access to fresh milk from local farms.

“In Europe, the consumption of unpasteurized milk has repeatedly correlated with protection against allergic disease,” wrote Moises Velasquez-Manoff in a piece for The New York Times late last year.

“In America, 80 percent of the Amish studied by Dr. [Mark] Holbreich consume raw milk. In a study published earlier this year, Dr. [Bianca] Schaub’s group showed that European children who consumed farm milk had more of those regulatory T-cells, irrespective of whether they lived on farms. The higher the quantity of those cells, the less likely these children were to be given diagnoses of asthma.”

Sources:

http://www.jacionline.org

http://www.en.uni

http://www.realmilk.com

http://thebovine.wordpress.com

http://www.nytimes.com

http://www.drfranklipman.com

http://science.naturalnews.com

 

Doctors look after the mental health of others. But who looks after the doctors?



A new study from Cardiff University has revealed nearly 60% of doctors have experienced mental illness and psychological problems at various stages in their careers. That is bad enough in itself, but what is much worse is that very few of the 2,000 surveyed said that they had sought help.

A number of health professionals, and professionals from other industries, have been studied in recent years and many, unsurprisingly, also show high levels of stress. Sadly, however, it seems that this failure to seek help is not a phenomenon confined purely to the medical profession.

Findings from the British Psychological Society and New Savoy, for example – reporting on their 2015 staff well-being survey – showed that nearly half of psychological professionals report being depressed, and many admit to feeling like failures.

Work again was a culprit, with 70% of those who responded saying that they were finding their jobs stressful. For both medical doctors and psychological doctors, therefore, the current climate in the NHS is not, sadly, a healthy one. Workers on the front line of care are becoming governed more and more by contracts and targets rather than by the imperative of caring for people. The threat of cuts, often presented as efficiency savings, and the imposition of contracts on junior doctors are just two of many current examples.

Risk and resolution

Across the caring professions – medical, psychological, nursing, professions allied to medicine, and caring – there is, overall, a picture of worrying levels of depression and stress leading to low morale and burnout.

Burnout is something experienced by people who have been working on the front line of human services in a context where they are caring for, and committed to providing services to, others. Its features are a combination of high levels of depersonalisation – where a person no longer sees themselves or others as valuable – and emotional exhaustion together with low levels of feelings of personal accomplishment. This is exactly what we are seeing reported here in the Cardiff study.

The Cardiff study found that the likelihood of doctors reporting mental health problems differed between different stages of their careers: young doctors and trainees were least likely to disclose any problems. Female doctors were found to be particularly at risk of burnout, as were GPs and trainee and junior doctors.

Almost certainly, the reason why is stigma. People throughout society – particularly frontline professionals – are afraid of disclosing that they are having problems because they fear the repercussions and possible effects that disclosure may have on their careers.

This was also recently demonstrated in a wider paper by Sarah Clement of King’s College London who, with colleague Graham Thornicroft, carried out a meta-analysis of 144 studies involving more than 90,000 people. Their resulting global report showed that although one in four people – both inside and outside the healthcare profession – in Europe and the USA have a mental health problem, as many as 75% of people do not receive treatment.

How can we care for our carers? 

What – if anything – can be done about this situation? Do we really want to consult with professionals who are less able to confront their own difficulties than we are? How can we help them confront their own issues to help others in society overcome the stigma?

There have been moves towards a more open mental health culture within the health professions, with some senior members of staff sharing their experiences. Retired GP Chris Manning, for example, has been greatly involved in the promotion of doctors’ psychological health and self-care after experiencing depression and burnout.

Clare Gerada, former chair of the Council of the Royal College of General Practitioners, has also been a long-time advocate for doctors’ health and is the medical director of the practitioner health programme – a free and confidential NHS service for doctors and dentists who are experiencing psychological or physical health concerns. Additionally, Dr Gerada established the Founders Group and Founders Network, a coalition working together to promote psychologically healthy environments within the NHS.

A new Charter on Psychological Staff Wellbeing and Resilience was also launched recently by the British Psychological Society and New Savoy. Building on this, a collaborative learning network of employers in health and social care has been established and will have its first meeting on June 21 in order to begin working together to establish and maintain psychologically healthy working environments.

Fundamentally, though, there has to be a change in culture. People need to be able to speak freely about their feelings of stress and psychological needs – and be supported to seek help. I have tried, personally, to model this as the president of the British Psychological Society over this past year and have talked openly about my own experiences of burnout, stress, depression and bipolar disorder while working as a clinical psychologist.

It is my belief that this culture change could begin to be enabled for doctors, both medical and psychological, for nurses, allied health professionals and all in the caring professions too, if senior clinicians and managers begin to talk openly about their own psychological health.

To do so is a sign of strength and humility.

Modern drugs give HIV patients in Europe and US extra 10 years of life expectancy


The HIV virus targets immune cells in the bloodstream

Life expectancy for young HIV-positive adults has risen by 10 years in the United States and Europe thanks to improvements in AIDS drugs known as antiretroviral therapy, researchers said on Thursday.

This means many patients can expect to live as long as those without HIV, according to the study published in The Lancet medical journal.

The scientists said the improvements were likely to be largely due to the transition to less toxic medicine combinations, with more drug options for people infected with drug-resistant HIV strains, and better adherence to treatment.

“Our research illustrates a success story of how improved HIV treatments coupled with screening, prevention and treatment of health problems associated with HIV infection can extend the lifespan,” said Adam Trickey, who led the research at Britain’s University of Bristol.

Antiretroviral therapy, or ART, first became widely used in the mid-1990s. It involves a combination of three or more drugs that block the replication of HIV. This helps prevent and repair damage to the immune system caused by the virus, and also prevents onward spread of the disease.

The World Health Organisation (WHO) now recommends ART should be given as soon as possible after diagnosis to everyone with HIV.

The researchers analysed 18 European and North American studies involving 88,504 people with HIV who started ART between 1996 and 2010.

Fewer people who started treatment between 2008 and 2010 died during their first three years of treatment than those who started treatment between 1996 and 2007.

Trickey’s team said when they looked specifically at deaths due to AIDS, the number during treatment declined over time between 1996 and 2010, probably because more modern drugs are more effective in restoring the immune system.

 As a result, the researchers said that between 1996 and 2013, the life expectancy of 20-year-olds treated for HIV increased by nine years for women and 10 years for men in the European Union and North America.

This suggests that the life expectancy of a 20-year-old who began ART from 2008 onwards and responded well to it would approach that of the general population – 78 years.

But the improvements were not seen in all people with HIV. Life expectancy of those infected through injecting drugs, for example, did not increase as much as in other groups.

Mr Trickey said this underlined the need for prevention and treatment efforts to be focused on high-risk groups.

 

The Era Of Chimeras: Scientists Fearlessly Create Bizarre Human/Animal Hybrids


Did you know that scientists are creating cow/human hybrids, pig/human hybrids and even mouse/human hybrids?  This is happening every single day in labs all over the western world, but most people have never even heard about it.  So would you drink milk from a cow/human hybrid that produces milk that is almost identical to human breast milk?  And how would you interact with a mouse that has a brain that is almost entirely human?

These are the kinds of questions that we will have to start to address as a society as scientists create increasingly bizarre human/animal hybrids.  Thanks to dramatic advances in genetic technology, we have gotten to the point where it is literally possible for college students to create new hybrid lifeforms in their basements.   Of course our laws have not kept pace with these advances, and now that Pandora’s Box has been opened, it is going to be nearly impossible to shut it.

Scientists try to justify the creation of human/animal hybrids by telling us that it will help “cure disease” and help “end world hunger”, but what if scientists discover that combining human DNA with animal DNA can give us incredible new abilities or greatly extended lifespans?  Will humanity really have the restraint to keep from going down that road?

In my previous article entitled “Transhumanists: Superhuman Powers And Life Extension Technologies Will Allow Us To Become Like God”, I explored the obsession that transhumanists have with human enhancement.  The temptation to “take control of our own evolution” will surely be too great for many scientists to resist.  And even if some nations outlaw the complete merging of humans and animals, that does not mean that everyone else in the world will.

And once animal DNA gets into our breeding pool, how will we ever put the genie back into the bottle?  As the DNA of the human race becomes corrupted, it is easy to imagine a future where there are very few “pure humans” remaining.

Sadly, most of the scientists working in this field express very little concern for these types of considerations.  In fact, one very prominent U.S. geneticist says that we should not even worry about hybridization because he believes that humans were originally pig/chimpanzee hybrids anyway…

The human species began as the hybrid offspring of a male pig and a female chimpanzee, an American geneticist has suggested.

The startling claim has been made by Eugene McCarthy, who is also one of the world’s leading authorities on hybridisation in animals.

He points out that while humans have many features in common with chimps, we also have a large number of distinguishing characteristics not found in any other primates.

So if we are just hybrid creatures ourselves, why should we be scared of making more hybrids?

From their point of view, it all makes perfect sense.

And right now, extremely weird human/animal hybrids are being grown all over the United States.

For example, just check out the following excerpt from an NBC News article about what is going on in Nevada…

On a farm about six miles outside this gambling town, Jason Chamberlain looks over a flock of about 50 smelly sheep, many of them possessing partially human livers, hearts, brains and other organs.

The University of Nevada-Reno researcher talks matter-of-factly about his plans to euthanize one of the pregnant sheep in a nearby lab. He can’t wait to examine the effects of the human cells he had injected into the fetus’ brain about two months ago.

“It’s mice on a large scale,” Chamberlain says with a shrug.

When this article came across my desk recently, I noted that it was almost ten years old.

Over the past decade, things have gotten much, much stranger.

For example, scientists have now created mice that have artificial human chromosomes “in every cell in their bodies”…

Scientists have created genetically-engineered mice with artificial human chromosomes in every cell of their bodies, as part of a series of studies showing that it may be possible to treat genetic diseases with a radically new form of gene therapy.

In one of the unpublished studies, researchers made a human artificial chromosome in the laboratory from chemical building blocks rather than chipping away at an existing human chromosome, indicating the increasingly powerful technology behind the new field of synthetic biology.

And researchers at the University of Wisconsin figured out a way to transfer cells from human embryos into the brains of mice.  When those cells from the human embryos began to grow and develop, they actually made the mice substantially smarter

Yet experiments like these are going forward just the same. In just the past few months, scientists at the University of Wisconsin and the University of Rochester have published data on their human-animal neural chimeras. For the Wisconsin study, researchers injected mice with an immunotoxin to destroy a part of their brains–the hippocampus–that’s associated with learning, memory, and spatial reasoning. Then the researchers replaced those damaged cells with cells derived from human embryos. The cells proliferated and the lab chimeras recovered their ability to navigate a water maze.

For the Rochester study, researchers implanted newborn mice with nascent human glial cells, which help support and nourish neurons in the brain. Six months later, the human parts had elbowed out the mouse equivalents, and the animals had enhanced ability to solve a simple maze and learn conditioned cues. These protocols might run afoul of the anti-hybrid laws, and perhaps they should arouse some questions. These chimeric mice may not be human, or even mostly human, but they’re certainly one step further down the path to Algernon. It may not be so long before we’re faced with some hairy bioethics: What rights should we assign to mice with human brains?

So what should we call mice that have brains that are mostly human?

And at what point would our relationship with such creatures fundamentally change?

When they learn to talk?

Scientists all over the planet are recklessly creating these chimeras without really thinking through the implications.

In China, scientists have actually inserted human genes into the DNA of dairy cow embryos.

Now there are hundreds of human/cow hybrids that produce milk that is virtually identical to human breast milk.

Would you buy such milk if it showed up in your supermarket?  The scientists that “designed” these cows say that is the goal.

But of course this is just the tip of the iceberg.  A very good Slate article detailed some more of the human/animal hybrid experiments that have been taking place all over the planet…

Not long ago, Chinese scientists embedded genes for human milk proteins into a mouse’s genome and have since created herds of humanized-milk-producing goats. Meanwhile, researchers at the University of Michigan have a method for putting a human anal sphincter into a mouse as a means of finding better treatments for fecal incontinence, and doctors are building animals with humanized immune systems to serve as subjects for new HIV vaccines.

And Discovery News has documented even more bizarre human/animal hybrids that scientists have developed…

Rabbit Eggs with Human Cells

Pigs with Human Blood

Sheep with Human Livers

Cow Eggs with Human Cells

Cat-Human Hybrid Proteins

As the technology continues to advance, the possibilities are going to be endless.

One professor at Harvard even wants to create a Neanderthal/human hybrid.  He says that he just needs an “adventurous female human” to carry the child…

Professor George Church of Harvard Medical School believes he can reconstruct Neanderthal DNA and resurrect the species which became extinct 33,000 years ago.

His scheme is reminiscent of Jurassic Park but, while in the film dinosaurs were created in a laboratory, Professor Church’s ambitious plan requires a human volunteer.

He said his analysis of Neanderthal genetic code using samples from bones is complete enough to reconstruct their DNA.

He said: ‘Now I need an adventurous female human.

‘It depends on a hell of a lot of things, but I think it can be done.’

I don’t know about you, but that sounds like a really, really bad idea to me.

And right now, the U.S. federal government is actually considering a plan which would allow scientists to create babies that come from genetic material drawn from three parents

A new technology aimed at eliminating genetic disease in newborns would combine the DNA of three people, instead of just two, to create a child, potentially redrawing ethical lines for designer babies.

The process works by replacing potentially variant DNA in the unfertilized eggs of a hopeful mother with disease-free genes from a donor. U.S. regulators today will begin weighing whether the procedure, used only in monkeys so far, is safe enough to be tested in humans.

Because the process would change only a small, specific part of genetic code, scientists say a baby would largely retain the physical characteristics of the parents. Still, DNA from all three — mother, father and donor — would remain with the child throughout a lifetime, opening questions about long-term effects for this generation, and potentially the next. Ethicists worry that allowing pre-birth gene manipulation may one day lead to build-to-order designer babies.

Many scientists believe that these kinds of technologies will “change the world”.

They might be more right about that than they ever could possibly imagine.

When we start monkeying with human DNA, we could be opening up doorways that we never even knew existed.

If we do not learn from history, we are doomed to repeat it.  Hopefully scientists around the globe will understand the dangers of these types of experiments before it is too late.

Thanks to AI, Computers Can Now See Your Health Problems. 


PATIENT NUMBER TWO was born to first-time parents, late 20s, white. The pregnancy was normal and the birth uncomplicated. But after a few months, it became clear something was wrong. The child had ear infection after ear infection and trouble breathing at night. He was small for his age, and by his fifth birthday, still hadn’t spoken. He started having seizures. Brain MRIs, molecular analyses, basic genetic testing, scores of doctors; nothing turned up answers. With no further options, in 2015 his family decided to sequence their exomes—the portion of the genome that codes for proteins—to see if he had inherited a genetic disorder from his parents. A single variant showed up: ARID1B.

The mutation suggested he had a disease called Coffin-Siris syndrome. But Patient Number Two didn’t have that disease’s typical symptoms, like sparse scalp hair and incomplete pinky fingers. So, doctors, including Karen Gripp, who met with Two’s family to discuss the exome results, hadn’t really considered it. Gripp was doubly surprised when she uploaded a photo of Two’s face to Face2Gene. The app, developed by the same programmers who taught Facebook to find your face in your friends’ photos, conducted millions of tiny calculations in rapid succession—how much slant in the eye? How narrow is that eyelid fissure? How low are the ears? Quantified, computed, and ranked to suggest the most probable syndromes associated with the facial phenotype. There’s even a heat map overlay on the photo that shows which features are the most indicative match.
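To make that pipeline concrete, here is a minimal, purely illustrative sketch of the final quantify-compare-rank step: a few hypothetical, normalised facial measurements are scored against invented syndrome profiles using cosine similarity. The feature names, numbers and profiles are assumptions for illustration only, not Face2Gene’s actual features or data.

    import math

    # Hypothetical facial measurements extracted from a photo, normalised to [0, 1].
    # The real app derives a far richer representation with deep learning.
    patient = {"eye_slant": 0.72, "eyelid_fissure_width": 0.31, "ear_height": 0.25}

    # Invented reference profiles; one entry per candidate syndrome.
    profiles = {
        "Coffin-Siris": {"eye_slant": 0.70, "eyelid_fissure_width": 0.30, "ear_height": 0.28},
        "Other syndrome": {"eye_slant": 0.20, "eyelid_fissure_width": 0.80, "ear_height": 0.60},
    }

    def similarity(a, b):
        """Cosine similarity between two feature dictionaries with the same keys."""
        keys = sorted(a)
        dot = sum(a[k] * b[k] for k in keys)
        norm_a = math.sqrt(sum(a[k] ** 2 for k in keys))
        norm_b = math.sqrt(sum(b[k] ** 2 for k in keys))
        return dot / (norm_a * norm_b)

    # Rank candidate syndromes, most similar facial phenotype first.
    ranking = sorted(profiles, key=lambda s: similarity(patient, profiles[s]), reverse=True)
    print(ranking)

Per-feature contributions to a score like this are also what a heat-map overlay of the kind described can visualise.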

“In hindsight it was all clear to me,” says Gripp, who is chief of the Division of Medical Genetics at A.I. duPont Hospital for Children in Delaware, and had been seeing the patient for years. “But it hadn’t been clear to anyone before.” What had taken Patient Number Two’s doctors 16 years to find took Face2Gene just a few minutes.

Face2Gene takes advantage of the fact that so many genetic conditions have a tell-tale “face”—a unique constellation of features that can provide clues to a potential diagnosis. It is just one of several new technologies taking advantage of how quickly modern computers can analyze, sort, and find patterns across huge reams of data. They are built in fields of artificial intelligence known as deep learning and neural nets—among the most promising to deliver AI’s 50-year-old promise to revolutionize medicine by recognizing and diagnosing disease.

Genetic syndromes aren’t the only diagnoses that could get help from machine learning. The RightEye GeoPref Autism Test can identify the early stages of autism in infants as young as 12 months—the crucial stages where early intervention can make a big difference. Unveiled January 2 at CES in Las Vegas, the technology uses infrared sensors to track children’s eye movements as they watch a split-screen video: one side fills with people and faces, the other with moving geometric shapes. Children at that age should be much more attracted to faces than abstract objects, so the amount of time they look at each screen can indicate where on the autism spectrum a child might fall.
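The scoring idea lends itself to a short sketch: tally how much viewing time falls on each half of the screen and compare the geometric-preference fraction to a cutoff. Everything below is illustrative rather than RightEye’s implementation; the 69 percent threshold is a figure reported for the GeoPref paradigm in the research literature, but treat it here as an assumption.

    # Gaze samples from a hypothetical eye tracker: (timestamp_s, side_looked_at).
    gaze_samples = [
        (0.0, "shapes"), (0.1, "shapes"), (0.2, "faces"),
        (0.3, "shapes"), (0.4, "shapes"), (0.5, "faces"),
    ]

    def fraction_on(samples, side):
        """Fraction of gaze samples that landed on the given half of the screen."""
        return sum(1 for _, s in samples if s == side) / len(samples)

    geo = fraction_on(gaze_samples, "shapes")
    # A strong preference for geometric images over faces is the reported risk marker.
    flag = "elevated autism risk" if geo >= 0.69 else "typical face preference"
    print(f"{geo:.0%} of viewing time on shapes -> {flag}")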

In validation studies done by the test’s inventor, UC San Diego researcher Karen Pierce, the test correctly predicted autism spectrum disorder 86 percent of the time in more than 400 toddlers. That said, it’s still pretty new, and hasn’t yet been approved by the FDA as a diagnostic tool. “In terms of machine learning, it’s the simplest test we have,” says RightEye’s Chief Science Officer Melissa Hunfalvay. “But before this, it was just physician or parent observations that might lead to a diagnosis. And the problem with that is it hasn’t been quantifiable.”

A similar tool could help with early detection of America’s sixth leading cause of death: Alzheimer’s disease. Often, doctors don’t recognize physical symptoms in time to try any of the disease’s few existing interventions. But machine learning hears what doctors can’t: signs of cognitive impairment in speech. This is how Toronto-based Winterlight Labs is developing a tool to pick out hints of dementia in its very early stages. Co-founder Frank Rudzicz calls these clues “jitters” and “shimmers”: high-frequency wavelets only computers, not humans, can hear.
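For a feel for what such acoustic clues are, here is a toy computation of jitter and shimmer under their standard speech-science definitions: the mean cycle-to-cycle variation in pitch period and in amplitude, relative to the mean. The input values are invented, and Winterlight’s real feature set is far richer than this.

    # Per-cycle measurements from a hypothetical segment of sustained speech.
    pitch_periods_s = [0.0100, 0.0102, 0.0099, 0.0103, 0.0101]  # glottal cycle lengths
    amplitudes = [0.80, 0.78, 0.82, 0.77, 0.81]                 # peak amplitude per cycle

    def local_perturbation(values):
        """Mean absolute cycle-to-cycle difference, divided by the mean value."""
        diffs = [abs(a - b) for a, b in zip(values, values[1:])]
        return (sum(diffs) / len(diffs)) / (sum(values) / len(values))

    print(f"jitter:  {local_perturbation(pitch_periods_s):.2%}")  # pitch instability
    print(f"shimmer: {local_perturbation(amplitudes):.2%}")       # loudness instability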

Winterlight’s tool is way more sensitive than the pencil-and-paper tests doctors currently use to assess Alzheimer’s. Besides being crude, data-wise, those tests can’t be taken more than once every six months. Rudzicz’s tool can be used multiple times a week, which lets it track good days, bad days, and measure a patient’s cognitive functions over time. The product is still in beta, but is currently being piloted by medical professionals in Canada, the US, and France.

If this all feels a little scarily sci-fi to you, it’s useful to remember that doctors have been trusting computers with your diagnoses for a long time. That’s because machines are much more sensitive at both detecting and analyzing the many subtle indications that our bodies are misbehaving. For instance, without computers, Patient Number Two would never have been able to compare his exome to thousands of others, and find the genetic mutation marking him with Coffin-Siris syndrome.

But none of this makes doctors obsolete. Even Face2Gene—which, according to its inventors, can diagnose up to half of the 8,000 known genetic syndromes using facial patterns gleaned from the hundreds of thousands of images in its database—needs a doctor (like Karen Gripp) with enough experience to verify the results. In that way, machines are an extension of what medicine has always been: A science that grows more powerful with every new data point.

Scientists Dug 12 Km Into Earth. What They Found Will Leave You Speechless


Some call it the ‘Door to Hell’. At 12,262 meters, the Kola Superdeep Borehole is the deepest artificial point on our planet. No one expected the discoveries it would yield. There are some who firmly believe the human race knows more about distant galaxies and alien planets located light years away from Earth than about what lies beneath the surface of our own planet.


Curiously, it took the Voyager 1 spacecraft nearly 26 years to exit our solar system, which is about the same amount of time scientists on Earth needed to penetrate 12 kilometers into our planet’s surface. After more than two decades of drilling, the Kola Superdeep Borehole reached a depth of more than 7.5 miles (12 kilometers).

So, what did scientists find down there? Well, after 26 years of intensive drilling efforts, experts found that there’s a LOT of water down there: hot mineralized water was discovered almost everywhere along the drilling path. But water was not all, because the Earth down there has gas too. Not that type of gas. Scientists found helium, hydrogen, nitrogen, and even carbon dioxide (from microbes) all along the borehole. One of the greatest surprises was that there is no basalt under the continent’s granite. Scientists believed that at 9,000 meters the granite would give way to basalt; to their surprise, it does not.


Furthermore, scientists discovered FOSSILS in granite located around 6,700 meters below the surface. In addition, they found that the temperature at the bottom of the hole reached a staggering 180 degrees Celsius, officially too hot to continue drilling and rightfully earning the borehole its nickname, the ‘Door to Hell’. But perhaps what’s even more impressive is the scale: scientists estimate that the distance to the center of our planet is nearly 4,000 miles (6,400 kilometers), so the 12-kilometer borehole covers barely 0.2% of the way down and scarcely scratches the surface. “By far the most riveting discovery from the project, however, was the detection of microscopic plankton fossils in rocks over 2 billion years old, found four miles beneath the surface,” reports Bryan Nelson from Mother Nature Network. “These ‘microfossils’ represented about 24 ancient species, and were encased in organic compounds which somehow survived the extreme pressures and temperatures that exist so far beneath the Earth.”

Watch the video. URL: https://youtu.be/zz6v6OfoQvs

 

Bilingual speakers experience time differently to people who only speak one language, study finds


Researchers suggested being bilingual may also bring long-term benefits for mental wellbeing

 For those who can speak only one language, people who have the ability to speak several are often a source of fascination. What language do they think in? Can they switch mid-way through? Do they dream in one language or both?

It turns out these questions are not without merit as people who can speak two languages actually experience time in a different way.

A study from Lancaster University and Stockholm University, published in the Journal of Experimental Psychology, found that people who are bilingual think about time differently depending on the language context in which they are estimating the duration of events.

 Linguists Professor Panos Athanasopoulos and Professor Emanuel Bylund explained that bilinguals often go back and forth between their languages consciously and unconsciously.

Additionally, different languages often refer to time differently. For example, Swedish and English speakers use the language of physical distance (‘taking a short break’), while Spanish speakers use the language of physical quantity and volume (‘taking a small break’).

The researchers asked native Swedish speakers who also spoke Spanish to estimate how much time had passed while watching either a line growing across a screen or a container being filled. Participants were prompted to use the word ‘duración’ (Spanish for duration) or ‘tid’ (the Swedish equivalent).

When prompted by Spanish words, bilinguals based their estimates on volume, as with the container being filled. When prompted by Swedish words, they switched their behaviour and suddenly gave time estimates in terms of distance, referring to how far the lines had travelled, rather than volume.

Professor Athanasopoulos said the results showed our language creeps into our everyday emotions and perceptions more than we realise.

“The fact that bilinguals go between these different ways of estimating time effortlessly and unconsciously fits in with a growing body of evidence demonstrating the ease with which language can creep into our most basic senses, including our emotions, visual perception, and now it turns out, sense of time,” he said.

Professor Athanasopoulos also suggested the results show that bilinguals are more “flexible thinkers” than those who just speak one language.

“There is evidence to suggest that mentally going back and forth between different languages on a daily basis confers advantages on the ability to learn and multi-task, and even long-term benefits for mental well-being,” he said.

Cheese does not increase risk of heart attack or strokes, find researchers


Review of 29 studies involving nearly a million participants finds saturated fats ‘do not increase risk of cardiovascular disease’


The belief that cheese is bad for you is wrong, researchers have said, after finding no link between eating dairy products and a heightened risk of heart attack and strokes.

Even full-fat cheese, milk and yoghurt, often avoided by the health-conscious due to their high saturated fat content, do not increase the risk of death or of conditions such as coronary heart disease, according to a review of 29 different studies involving nearly a million participants.

“There’s quite a widespread but mistaken belief among the public that dairy products in general can be bad for you, but that’s a misconception,” said researcher Ian Givens, a nutrition professor at Reading University.

“While it is a widely held belief, our research shows that that’s wrong,” he told The Guardian.

“There’s been a lot of publicity over the last five to 10 years about how saturated fats increase the risk of cardiovascular disease and a belief has grown up that they must increase the risk, but they don’t.”

NHS guidelines suggest people cut the amount of saturated fat they eat, because a diet high in saturated fat can raise the level of cholesterol in the blood, increasing the risk of cardiovascular disease.

Men are recommended to eat no more than 30g of saturated fat a day, and women no more than 20g. This sounds like bad news for cheese lovers – if two people share a whole baked 250g camembert, for instance, they will each consume around 19g of saturated fat.
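That figure is easy to verify. A minimal sketch, assuming camembert contains roughly 15g of saturated fat per 100g (a typical nutrition-label value, not a number from the article):

    # Sanity-check the shared-camembert figure quoted above.
    SAT_FAT_PER_100G = 15.0  # assumed typical value for camembert, in grams
    CHEESE_WEIGHT_G = 250    # whole baked camembert, as in the article
    PEOPLE = 2

    per_person_g = CHEESE_WEIGHT_G / 100 * SAT_FAT_PER_100G / PEOPLE
    print(f"{per_person_g:.1f} g of saturated fat each")  # 18.8 g, close to the quoted 19 g

At around 19g, half a baked camembert takes a woman almost to her entire daily 20g limit, which is the article’s point.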

But overall levels of dairy consumption did not appear to be associated with an increased risk of circulatory conditions such as stroke and heart attacks, according to the study, published in the European Journal of Epidemiology.

The research analysed results from previous studies carried out over the last 35 years, using information on the health and diet of 938,465 participants.


Scientists are divided on whether limiting saturated fats can improve overall health and lower the risk of heart disease.

A study published earlier this year in the British Medical Journal (BMJ) found that swapping even one per cent of your daily calorie intake from saturated fats like butter and meat to vegetables, wholegrain carbohydrates or polyunsaturated fats found in olive oil and fish can improve heart health.

However, previous research from the University of Bergen in Norway found fatty foods such as cheese, butter and cream could in fact help protect people from heart disease when eaten as part of a diet where overall calorie intake is restricted.

Simon Dankel, who led the study, told The Independent in December the research showed the human body “can do perfectly well with fats as its main energy source.”

“People will say: ‘you can’t lose weight, you can’t go on any diets with saturated fats, no matter what’,” said Dr Dankel.

“But in this context, we see a very positive metabolic response. You can base the energy in your diet either on carbohydrates or on fat. It doesn’t make a big difference.”

According to the British Heart Foundation (BHF)’s website, eating too much cheese “could lead to high cholesterol and high blood pressure, increasing your risk of cardiovascular disease”, and the organisation recommends people “enjoy it sensibly”.

“Saturated fat can increase the ‘bad’ (LDL) cholesterol in your blood which can cause fatty material to build up in your artery walls. The risk is particularly high if you have a high level of bad cholesterol and a low level of good cholesterol,” says the organisation.