Quantum Dot Technology Could Lead To Solar Panel Windows.


Researchers from Los Alamos National Laboratory and the University of Milano-Bicocca have designed and synthesized a new generation of quantum dots for use in solar energy systems that overcome previous inefficiencies in harvesting sunlight. The study has been published in the journal Nature Photonics.

Quantum dots, which are nanocrystals made of semiconducting materials, appeal to scientists for use in solar photovoltaics (solar panel systems) because of their versatility and low cost. In particular, they are desirable for use in luminescent solar concentrators (LSCs), which are photon-management devices that serve as alternatives to optics-based solar concentration systems.

LSCs are constructed from transparent materials containing emitters such as quantum dots. They concentrate solar radiation absorbed from a large area onto a significantly smaller solar cell, explains Victor Klimov, one of the authors of the study. One exciting application of LSCs is the potential to develop photovoltaic windows, which could turn buildings into energy-making factories.

Although quantum dots are highly efficient emitters, their small Stokes shift (the spectral overlap between their emission and absorption bands) means that some of the light they produce is re-absorbed by other dots, resulting in losses of emission and therefore in overall efficiency.

To resolve this issue, the team generated “Stokes-shift-engineered” giant quantum dots composed of a cadmium sulfide (CdS) shell, which absorbs light, and a cadmium selenide (CdSe) core, which is responsible for light emission. This separation of absorption and emission significantly reduced the re-absorption losses that previously caused inefficiencies. The dots were then incorporated into a high-quality polymethylmethacrylate (PMMA) matrix, and spectroscopic analysis revealed that re-absorption losses were minimal across distances of tens of centimeters.
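To see why suppressing that overlap matters over window-scale distances, here is a minimal Beer-Lambert sketch; the absorption coefficients are invented for illustration and are not values from the study:

```python
import math

def surviving_fraction(alpha_per_cm: float, path_cm: float) -> float:
    """Beer-Lambert estimate of emitted light that escapes re-absorption.

    alpha_per_cm is the effective absorption coefficient at the emission
    wavelength; it is set by how much the emission and absorption overlap.
    """
    return math.exp(-alpha_per_cm * path_cm)

# Hypothetical numbers: strong overlap (conventional dots) vs. a
# Stokes-shift-engineered dot with the overlap largely removed.
for label, alpha in [("conventional dots", 0.1), ("engineered dots", 0.005)]:
    print(f"{label}: {surviving_fraction(alpha, 20):.1%} of light survives 20 cm")
```

Because the loss compounds exponentially with distance, even a modest residual overlap is punishing at the tens-of-centimeters scale of a window, which is what makes the engineered dots' minimal losses significant.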

The method for incorporating quantum dots into the PMMA matrix is not specific to a particular type of quantum dot; it can be applied to nanocrystals of different sizes composed of various materials. This technology therefore represents a promising general materials platform.

Here’s Why Big Pharma is Deadly Afraid of the Power of Baking Soda.


Cancer is an acid, i.e. lactic acid, which is a waste product of fungus and mold, and it lives in an environment with a low concentration of oxygen. If we bring a high concentration of oxygen molecules to the cancer cells, they will die.

Everyone will strongly resist the idea that something as simple and cheap as sodium bicarbonate (baking soda) can surpass the most expensive pharmaceutical drugs. Yet there is compelling evidence supporting a multitude of theories which suggest that sodium bicarbonate should be the primary and universal medicament for a wide range of diseases, including diabetes and cancer, and that all therapists and medical professionals should include it in their medical treatments.


Sodium bicarbonate is a well-understood and well-studied substance that has been in wide use for decades, even by oncologists. It is routinely administered to protect the kidneys from the toxicity of chemotherapy and radiation.

Worldwide, millions of people consume bicarbonate ions in their drinking water, and bicarbonate is used in clinics, hospitals and emergency departments to prevent or treat clinical acidosis and a variety of other conditions, helping to save countless lives every day. When baking soda is combined with other strong, basic natural substances such as iodine and magnesium chloride, we have a trinity of medical superheroes.

An overly acidic pH (a relative lack of bicarbonate ions) affects a large part of human physiology. Virtually every biochemical reaction is sensitive to pH, because enzymes are very sensitive to this balance.
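For reference, the relationship between bicarbonate and blood pH that this argument leans on is standard textbook chemistry, given by the Henderson-Hasselbalch equation for the bicarbonate buffer (typical arterial values shown):

$$\mathrm{pH} = \mathrm{p}K_a + \log_{10}\frac{[\mathrm{HCO}_3^-]}{[\mathrm{CO}_2]} \approx 6.1 + \log_{10}\frac{24\ \mathrm{mmol/L}}{1.2\ \mathrm{mmol/L}} \approx 7.4$$

Blood pH is normally held very close to 7.4 by this buffer, with the lungs regulating dissolved CO2 and the kidneys regulating bicarbonate.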


Our diet plays a very important role in maintaining a favorable pH in the body, and much of the modern diet pushes the pH downward, toward acidity. This imbalance interferes with cellular activities and functions, especially as the pH continues to fall. A strongly acidic pH leads to the deterioration of cells, which ultimately leads to serious health problems such as gastritis, diabetes, osteoporosis, cardiovascular disease and cancer. The fact that biological life functions better in a non-acidic (alkaline) environment clearly supports the usefulness of baking soda.


Sodium bicarbonate aids the transport of oxygen: it dilates the blood vessels and releases oxygen into the tissues, which raises the pH. An elevated urinary pH also prevents crystallization in the urinary tract.

It has been established that uric acid causes kidney stones, diabetes, heart disease, heart attacks and gout. It also creates a toxic compound called alloxan, which is produced by fungi and is in turn implicated in diabetes and cancer cells. Oral administration of baking soda prevents arthritis, infection, gout and fever, and assists the activity of the pancreas, which is known to produce most of the body's bicarbonate and is also responsible for producing insulin.

Regardless of whether someone has a neurological or cardiac disease, cancer or a severe form of the flu, taking sodium bicarbonate together with magnesium chloride provides the safest and best treatment. Their combined action effectively removes toxins and acids from all tissues, cells and organs.

Nanoparticles Deliver Three Cancer Drugs at Once.


Nanomedicine is an exciting field of study that will allow doctors to pinpoint and treat problem areas on a cell-by-cell basis for many diseases, including cancer. While the ability to administer even one medication in a targeted fashion is incredible, researchers from MIT have developed nanoparticles that can deliver three cancer drugs at once, with the option of adding more. The research was led by Jeremiah Johnson and the results were published in the Journal of the American Chemical Society.

This targeted approach would not only kill all of the diseased cells but also spare healthy ones, a sharp contrast to traditional chemotherapy. For traditional intravenous drugs to kill all of the cancer cells, a large number of healthy cells are killed as well. This is why many cancer patients lose their hair and experience truly horrific side effects during treatment. It would be a complete game changer if this were no longer the case.

“We think it’s the first example of a nanoparticle that carries a precise ratio of three drugs and can release those drugs in response to three distinct triggering mechanisms,” Johnson said in a press release. This triple-threat nanoparticle eliminated ovarian cancer cells better than nanoparticles carrying either one or two of the medications, and the design can be modified to include additional drugs.

The development of such nanoparticles has traditionally been hindered by the difficulty of attaching drugs in the proper quantities. Some designs form an enclosure and tuck the medicine inside, which allows for only one medication. Attaching drugs to the outside can be done with two, but is extremely difficult.

To make a particle that delivers three drugs, Johnson’s lab had to take a completely different approach to the problem. Rather than using pieces to carry the drugs, the particle is assembled from pieces that are the drugs. Carefully regulated construction allows the pieces to fit together while maintaining the proper ratios of medication. Each drug molecule is attached to a linking unit and then incorporated into a polymer chain, which protects the particle from breaking down in the body before it can enter a cell. Once the drugs have been delivered, the chain easily breaks down.
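As a loose illustration of the “pieces that are the drug” idea (a hypothetical toy model, not the lab’s actual chemistry or code), each building block can be treated as a drug-bearing monomer, so that fixing the monomer feed fixes the drug ratio of the assembled chain:

```python
import random
from collections import Counter

def assemble_chain(feed_ratio, length, seed=0):
    """Toy model: draw drug-bearing monomers according to a fixed feed ratio."""
    pool = [drug for drug, count in feed_ratio.items() for _ in range(count)]
    rng = random.Random(seed)
    return [rng.choice(pool) for _ in range(length)]

# A hypothetical 2:1:1 feed of three chemotherapy agents.
chain = assemble_chain({"doxorubicin": 2, "cisplatin": 1, "camptothecin": 1}, 400)
print(Counter(chain))  # composition approximates the 2:1:1 feed ratio
```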

The team is moving forward with this research and has begun using the technique in animal models. They are also seeking to add a fourth drug to the structure. The structure also allows the ratios to be reconfigured as needed, whether to treat patients on an individual basis or to confirm that each medication is actually contributing.


The particles were designed to release doxorubicin when exposed to ultraviolet light. Here, ovarian cancer cells turn red as the doxorubicin is released over time.

The Surprising Truth About Gluten-Free Food and Weight Loss — Health Hub from Cleveland Clinic


Gluten-free diets are the latest craze for those looking to lose weight, but what’s the truth? Is gluten responsible for my love handles?  The answer is no, but let’s clear the air of any confusion.

What is gluten and Celiac disease?

Gluten is a protein found in wheat, rye, barley and countless processed foods including pasta, breads and cereals.

Some people avoid gluten because they have Celiac disease, an autoimmune disorder in which the body’s immune response to gluten damages the small intestine lining. This results in abdominal pain, bloating, nausea and diarrhea. Celiac disease prevents absorption of vitamins and minerals and promotes weight loss.

Other people avoid gluten because of gluten intolerance. Gluten intolerance mimics symptoms of Celiac disease without the immune response.

Why do people avoid gluten to lose weight?

There’s absolutely no evidence that simply getting rid of gluten will result in weight loss.

However, eating gluten-free often leads you to eat more whole, unprocessed foods such as fruits, vegetables, legumes and lean meats. These dietary changes are often healthier and lower in calories.

People eating gluten-free also tend to make healthier food choices because they are more aware of the need to read food labels.

Ditching the double cheeseburger and fries for a gluten-free meal of salad, chicken breast, and sweet potato is choosing a meal that is much lower in calories. That can mean weight loss over time.

Aren’t all gluten-free foods healthy?

Gluten-free does not necessarily mean healthy, because not all gluten-free foods are equally nutritious.

An apple and a gluten-free sugar cookie are both gluten-free, but their nutrients vary drastically.

Grocery and health food stores are full of gluten-free cakes, cookies and sweet treats. These foods often are high in sugar and fat, making them dense with calories.  Be sure to read those food labels!

I don’t have Celiac disease or gluten intolerance. Is it safe to avoid gluten?

Absolutely! Some people choose to follow a gluten-free diet merely because it provides structure to eating healthier and adopting a healthy lifestyle.

Just remember to consume a varied diet rich in fruits, vegetables, and legumes to avoid vitamin/mineral deficiencies and promote a healthy weight and lifestyle.

The bottom line

Gluten-free diets are typically followed by those who are unable to tolerate gluten on a biological level. However, some people choose to follow a gluten-free diet as a route to more healthful eating.

There is no harm in avoiding gluten, but remember to consume a balanced diet rich in fruits, vegetables and legumes. Make sure your gluten-free choices are still 100 percent whole grain, such as buckwheat, quinoa or brown rice.

You lose weight when you expend more calories or energy than you consume – not by avoiding gluten.  Diet and exercise are both important components of weight management and a healthy lifestyle.
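The underlying arithmetic is easy to sketch. Here is a minimal example using the common rule of thumb that roughly 7,700 kcal of cumulative deficit corresponds to about one kilogram of body weight; that conversion is a rough population-level approximation, not an individual prediction:

```python
KCAL_PER_KG = 7700.0  # rough rule-of-thumb energy content of 1 kg body weight

def weekly_change_kg(daily_intake_kcal, daily_expenditure_kcal):
    """Approximate weekly weight change from daily energy balance."""
    return (daily_intake_kcal - daily_expenditure_kcal) * 7 / KCAL_PER_KG

# A 500 kcal/day deficit gives roughly 0.45 kg/week of loss, gluten-free or not.
print(f"{weekly_change_kg(2000, 2500):+.2f} kg per week")
```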

World Hemophilia Day.


World Hemophilia Day is an international observance held annually on April 17 by the World Federation of Hemophilia (WFH). It is an awareness day for hemophilia and other bleeding disorders, which also serves to raise funds and attract volunteers for the WFH. It was started in 1989; April 17 was chosen in honor of the birthday of Frank Schnabel, who established the WFH in 1963.

Logo of the World Hemophilia Day

Hemophilia is a group of hereditary genetic disorders that impair the body’s ability to control blood clotting or coagulation, which is used to stop bleeding when a blood vessel is broken. Hemophilia A is the most common form of the disorder, present in about 1 in 5,000–10,000 male births. Hemophilia B occurs in around 1 in 20,000–34,000 male births.

The WFH is an international non-profit organization dedicated to improving the lives of people with hemophilia and other genetic bleeding disorders. It educates hemophiliacs and lobbies for improved medical treatment; 75 percent of people in the world with bleeding disorders do not receive adequate treatment. The WFH has member organizations in 113 countries and official recognition from the World Health Organization.

Scientists explain how memories stick together.


Scientists at the Salk Institute have created a new model of memory that explains how neurons retain select memories a few hours after an event.

This new framework provides a more complete picture of how memory works, which can inform research into disorders like Parkinson’s, Alzheimer’s, post-traumatic stress and learning disabilities.

“Previous models of memory were based on fast activity patterns,” says Terry Sejnowski, holder of Salk’s Francis Crick Chair and a Howard Hughes Medical Institute Investigator. “Our new model of memory makes it possible to integrate experiences over hours rather than moments.”

Over the past few decades, neuroscientists have revealed much about how long-term memories are stored. For significant events—for example, being bitten by a dog—a number of proteins are quickly made in activated brain cells to create the new memories. Some of these proteins linger for a few hours at specific places on specific neurons before breaking down.

This series of biochemical events allows us to remember important details about the event—such as, in the case of the dog bite, which dog it was, where the bite happened, whether it required an emergency room visit, and so on.

One problem scientists have had with modeling memory storage is explaining why only select details, and not everything in that 1-2 hour window, are strongly remembered. By incorporating data from previous literature, Sejnowski and first author Cian O’Donnell, a Salk postdoctoral researcher, developed a model that bridges findings from both molecular and systems observations of memory to explain how this 1-2 hour memory window works. The work is detailed in the latest issue of Neuron.

Using computational modeling, O’Donnell and Sejnowski show that, despite the proteins being available to a number of neurons in a given circuit, memories are retained when subsequent events activate the same neurons as the original event. The scientists found that the spatial positioning of proteins, both at specific neurons and at specific areas around those neurons, predicts which memories are recorded. This spatial patterning framework successfully predicts memory retention as a mathematical function of time and location overlap.
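The paper’s exact equations aren’t reproduced here, but the gist of “retention as a function of time and location overlap” can be caricatured in a few lines. The functional form and the time constant below are illustrative assumptions, not the authors’ model:

```python
import math

def retention(hours_since_event, overlap, window_hours=1.5):
    """Toy model: retention = decaying protein availability x spatial overlap.

    overlap: fraction (0 to 1) of the original event's neurons and dendritic
    locations that a subsequent event reactivates.
    """
    protein_availability = math.exp(-hours_since_event / window_hours)
    return protein_availability * overlap

print(retention(0.5, 0.8))  # within the window, high overlap: likely retained
print(retention(3.0, 0.8))  # same overlap hours later: far weaker retention
```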

“One thing this study does is link what’s happening in memory formation at the cellular level to the systems level,” says O’Donnell. “That the time window is important was already established; we worked out how the content could also determine whether memories were remembered or not. We prove that a set of ideas are consistent and sufficient to explain something in the real world.”

The new model also provides a potential framework for understanding how generalizations from memories are processed during dreams. 

While much is still unknown about sleep, research suggests that important memories from the day are often cycled through the brain, shuttled from temporary storage in the hippocampus to more long-term storage in the cortex. Researchers have observed most of this shuttling in non-dreaming sleep. Little is known about if and how memory packaging or consolidation is done during dreams. However, O’Donnell and Sejnowski’s model suggests that some consolidation does happen during dreams.

“During sleep there’s a reorganizing of memory—you strengthen some memories and lose ones you don’t need anymore,” says O’Donnell. “In addition, people learn abstractions as they sleep, but there was no idea how generalization processes happen at a neural level.” 

By applying their theoretical findings on overlap activity within the 1-2 hour window, they came up with a theoretical model for how this abstraction process might work during sleep.

Scientists capture ultrafast snapshots of light-driven superconductivity.


A new study pins down a major factor behind the appearance of superconductivity—the ability to conduct electricity with 100 percent efficiency—in a promising copper-oxide material.

Scientists used carefully timed pairs of laser pulses at SLAC National Accelerator Laboratory’s Linac Coherent Light Source (LCLS) to trigger superconductivity in the material and immediately take x-ray snapshots of its atomic and electronic structure as superconductivity emerged.

They discovered that so-called “charge stripes” of increased electrical charge melted away as superconductivity appeared. Further, the results help rule out the theory that shifts in the material’s atomic lattice hinder the onset of superconductivity. 

Armed with this new understanding, scientists may be able to develop new techniques to eliminate charge stripes and help pave the way for room-temperature superconductivity, often considered the holy grail of condensed matter physics. The demonstrated ability to rapidly switch between the insulating and superconducting states could also prove useful in advanced electronics and computation.

The results, from a collaboration led by scientists from the Max Planck Institute for the Structure and Dynamics of Matter in Germany and the U.S. Department of Energy’s SLAC and Brookhaven national laboratories, were published in the journal Physical Review Letters. 

“The very short timescales and the need for precise synchronization made this experiment extraordinarily challenging,” said co-author Michael Först, a scientist at the Max Planck Institute. “Now, using femtosecond x-ray pulses, we found a way to capture the quadrillionths-of-a-second dynamics of the charges and the crystal lattice. We’ve broken new ground in understanding light-induced superconductivity.”


Ripples in Quantum Sand 

The compound used in this study was a layered material consisting of lanthanum, barium, copper, and oxygen grown at Brookhaven Lab by physicist Genda Gu. Each copper oxide layer contained the crucial charge stripes. 

“Imagine these stripes as ripples frozen in the sand,” said John Hill, a Brookhaven Lab physicist and coauthor on the study. “Each layer has all the ripples going in one direction, but in the neighboring layers they run crosswise. From above, this looks like strings in a pile of tennis racquets. We believe that this pattern prevents each layer from talking to the next, thus frustrating superconductivity.” 

To excite the material and push it into the superconducting phase, the scientists used mid-infrared laser pulses to “melt” those frozen ripples. These pulses had previously been shown to induce superconductivity in a related compound at a frigid 10 Kelvin (minus 442 degrees Fahrenheit). 

“The charge stripes disappeared immediately,” Hill said. “But specific distortions in the crystal lattice, which had been thought to stabilize these stripes, lingered much longer. This shows that only the charge stripes inhibit superconductivity.” 

Stroboscopic Snapshots 

To capture these stripes in action, the collaboration turned to SLAC’s LCLS x-ray laser, which works like a camera with a shutter speed faster than 100 femtoseconds, or quadrillionths of a second, and provides atomic-scale image resolution. LCLS uses a section of SLAC’s 2-mile-long linear accelerator to generate the electrons that give off x-ray light. 

“This represents a very important result in the field of superconductivity using LCLS,” said Josh Turner, an LCLS staff scientist. “It demonstrates how we can unravel different types of complex mechanisms in superconductivity that have, up until now, been inseparable.” 

He added, “To make this measurement, we had to push the limits of our current capabilities. We had to measure a very weak, barely detectable signal with state-of-the-art detectors, and we had to tune the number of x-rays in each laser pulse to see the signal from the stripes without destroying the sample.” 

The researchers used the so-called “pump-probe” approach: an optical laser pulse strikes and excites the lattice (pump) and an ultrabright x-ray laser pulse is carefully synchronized to follow within femtoseconds and measure the lattice and stripe configurations (probe). Each round of tests results in some 20,000 x-ray snapshots of the changing lattice and charge stripes, a bit like a strobe light rapidly illuminating the process. 
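In outline, such a measurement is a loop over pump-probe delays, averaging many noisy shots at each delay. The sketch below simulates that logic with a toy stripe-melting signal; the decay constant and noise level are invented for illustration, and this is not LCLS control software:

```python
import math
import random

rng = random.Random(42)

def stripe_peak(delay_fs):
    """Toy diffraction signal: the charge-stripe peak melts after the pump.

    A ~200 fs decay constant is assumed for illustration; the experiment
    found the stripes gone in under 400 fs.
    """
    ideal = math.exp(-delay_fs / 200.0) if delay_fs > 0 else 1.0
    return ideal + rng.gauss(0.0, 0.05)  # per-shot noise

def delay_scan(delays_fs, shots_per_delay=1000):
    """Pump-probe loop: one averaged snapshot per pump-probe delay."""
    return {d: sum(stripe_peak(d) for _ in range(shots_per_delay)) / shots_per_delay
            for d in delays_fs}

for delay, peak in delay_scan([-100.0, 0.0, 200.0, 400.0, 800.0]).items():
    print(f"delay {delay:+6.0f} fs -> stripe peak {peak:.2f}")
```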

To measure the changes with high spatial resolution, the team used a technique called resonant soft x-ray diffraction. The LCLS x-rays strike and scatter off the crystal into the detector, carrying time-stamped signatures of the material’s charge and lattice structure that the physicists then used to reconstruct the rise and fall of superconducting conditions. 

“By carefully choosing a very particular x-ray energy, we are able to emphasize the scattering from the charge stripes,” said Brookhaven Lab physicist Stuart Wilkins, another co-author on the study. “This allows us to single out a very weak signal from the background.” 

Toward Superior Superconductors 

The x-ray scattering measurements revealed that the lattice distortion persists for more than 10 picoseconds (trillionths of a second)—long after the charge stripes melted and superconductivity appeared, which happened in less than 400 femtoseconds. Slight as it may sound, those extra trillionths of a second make a huge difference. 

“The findings suggest that the relatively weak and long-lasting lattice shifts do not play an essential role in the presence of superconductivity,” Hill said. “We can now narrow our focus on the stripes to further pin down the underlying mechanism and potentially engineer superior materials.” 

Andrea Cavalleri, Director of the Max Planck Institute, said, “Light-induced superconductivity was only recently discovered, and we’re already seeing fascinating implications for understanding it and going to higher temperatures. In fact, we have observed the signature of light-induced superconductivity in materials all the way up to 300 Kelvin (80 degrees Fahrenheit)—that’s really a significant breakthrough that warrants much deeper investigations.”

CDC: Hospital Antibiotic Use Promotes Resistance; Checklist Can Improve Practices.


Antibiotic prescribing in hospitals is inconsistent and often inappropriate—contributing to the emergence of antibiotic resistance, according to an analysis of hospital antibiotic prescribing by the US Centers for Disease Control and Prevention (CDC). But simple steps, such as implementing checklists, could help hospitals more wisely use these vital medications, the CDC says.

The CDC has launched an increasingly urgent campaign to combat antimicrobial resistance. A report issued by the agency last fall found that antibiotic-resistant bacteria infect 2 million US individuals each year, causing 23,000 deaths and accounting for $20 billion in health costs. The report also raised the specter of the emergence of untreatable infections.

But in a March press briefing, CDC Director Thomas Frieden, MD, MPH, said that it is possible to reduce drug resistance rates by establishing antibiotic stewardship programs at hospitals and improving coordination between facilities. “We want to develop the infrastructure in every hospital, so every physician knows how to prescribe properly in the context of [his or her] hospital,” said Arjun Srinivasan, MD, associate director for health care–associated infection prevention programs at the CDC.


The US Centers for Disease Control and Prevention advises more judicious use of antimicrobials to treat urinary tract infections, pneumonia, and infections with methicillin-resistant Staphylococcus aureus.

For its part, the CDC is providing checklists for hospitals and physicians. And with the help of an additional $30 million in funding in the Obama Administration’s proposed 2015 budget, the CDC also plans to build an improved surveillance system to rapidly detect the emergence of antibiotic resistance.

More than half of hospital patients receive an antibiotic during their stay, and nearly a third receive a broad-spectrum antibiotic, according to the CDC’s analysis of data from 323 hospitals. These statistics aren’t terribly surprising, but the wide variations among hospitals are. Frieden noted that some of the 26 hospitals reporting data to the National Healthcare Safety Network prescribe 3 times more antibiotics than others.

“This provides a warning bell that improvement is possible,” Frieden said.

The analysis found frequent mistakes in the treatment of common conditions. Using data from its Emerging Infections Program, which included information on about 11,000 patients at 183 hospitals in 2011, the CDC found that half of all antibiotics were prescribed for 3 conditions: lower respiratory infections, urinary tract infections (UTIs), and gram-positive infections that are presumed to be resistant. In a review of 296 cases at 36 hospitals in which physicians treated patients with intravenous vancomycin or treated patients with a UTI who did not have a catheter, the CDC found that more than one-third of those cases involved mistakes that could contribute to resistance. For example, samples were not taken before initiating therapy, doses were incorrect, therapy was not reevaluated after 48 hours, or antibiotics were administered for too long.

“The data on surveillance are no surprise, but it is important to have numbers to support stewardship programs,” said Helen Boucher, MD, a physician at Tufts Medical Center and a member of the Infectious Diseases Society of America’s board of directors. She noted that the society has advocated for better stewardship of antibiotics for years.

More judicious use of antimicrobials in hospitals could have a big effect. Based on its models, the CDC estimates that a 30% reduction in the use of broad-spectrum antibiotics in hospitals—representing a 5% reduction in overall hospital antibiotic use—could prevent 26% of Clostridium difficile infections related to antibiotic treatment. These reductions could also help prevent spillover transmission of C difficile into the community.

To aid all these efforts, the CDC plans to use its anticipated funding boost to build the infrastructure necessary to more quickly identify the emergence of resistant strains. Boucher explained that European public health officials are far ahead of the United States in this regard and can provide detailed information on resistance patterns by country and region.

John R. Combes, MD, senior vice president at the American Hospital Association, said that hospitals recognize the need for improvement and that the association is partnering with other organizations to build a toolkit for stewardship programs.

“We must improve our processes, not only to protect our patients, but to protect our antibiotics,” he said.

STEWARDSHIP INFRASTRUCTURE

The CDC recommends that each hospital build an antibiotic stewardship program to provide physicians with the information and tools they need to make the right decisions.

“Antibiotics are a precious resource, yet for decades we have not had a systematic approach in hospitals across the US to ensure they are used wisely,” said Sara Cosgrove, MD, MS, chair of the Society for Healthcare Epidemiology of America antimicrobial stewardship taskforce, in a statement. “Antimicrobial stewardship programs are a critical step toward stemming the tide of antibiotic resistance and ensuring patients are receiving the right antibiotic, at the right dose and for the right duration.”

The CDC recommends that stewardship programs include 7 components (a simple tracking sketch follows the list):

  • Dedicated human, financial, and technology resources

  • A physician or other leader responsible for overall outcomes

  • A pharmacist leader focused on prescribing

  • An action to improve prescribing, such as requiring reassessment of prescriptions after 48 hours for drug choice, dose, and duration

  • Monitoring of prescribing and resistance patterns

  • Regular reporting of resistance information to clinicians

  • Education about resistance and judicious prescribing
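Because the CDC frames these elements as a checklist, tracking them can be almost trivially simple. Here is a sketch with hypothetical labels; it is not an official CDC tool:

```python
# Hypothetical checklist tracker for the CDC's 7 core elements.
CORE_ELEMENTS = (
    "dedicated resources",
    "physician or other leader accountable for outcomes",
    "pharmacist leader for prescribing",
    "prescribing action (e.g., 48-hour reassessment)",
    "monitoring of prescribing and resistance",
    "regular resistance reporting to clinicians",
    "education on judicious prescribing",
)

def stewardship_gaps(elements_in_place):
    """Return the core elements a hospital program has not yet implemented."""
    return [e for e in CORE_ELEMENTS if e not in elements_in_place]

program = {"monitoring of prescribing and resistance",
           "education on judicious prescribing"}
print("Still missing:", stewardship_gaps(program))
```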

Boucher, who was hired by Tufts to lead its stewardship program, said that not only were these steps reasonable, but that taking them may also bring other benefits for hospitals. She explained that Tufts has saved millions of dollars by improving its stewardship of antibiotics.

Combes emphasized that the recommendations are not intended to limit physicians’ autonomy but to give them the information they need to provide the best care possible. In an age when “health care has become more of a team sport,” he said, the expertise of pharmacists and infectious disease specialists can help a physician choose the right drug.

“This shouldn’t be viewed as a bureaucratic obstacle to good clinical care,” he said. “This is good clinical care.”

The CDC is also recommending that hospitals work more closely with local public health agencies and neighboring health care facilities to better control the spread of microbes between facilities.

“Our hospitals are just one part of a continuous system of care,” said Combes.

With neutrons, scientists can now look for dark energy in the lab.


It does not always take a huge accelerator to do particle physics: first results from a low-energy, table-top alternative tighten tests of the validity of Newtonian gravity by five orders of magnitude, narrowing the potential properties of the forces and particles that may exist beyond it more than a hundred-thousand-fold. Gravity resonance spectroscopy, a method developed at the Vienna University of Technology, is so sensitive that it can now be used to search for Dark Matter and Dark Energy.

All the particles we know to exist make up only about five per cent of the mass and energy of the universe. The rest – “Dark Matter” and “Dark Energy” – remains mysterious. A European collaboration led by researchers from the Vienna University of Technology has now carried out extremely sensitive measurements of gravitational effects at very small distances at the Institut Laue-Langevin (ILL) in Grenoble. These experiments provide limits for possible new particles or fundamental forces, which are a hundred thousand times more restrictive than previous estimations.

Undiscovered Particles? 

Dark matter is invisible, but it acts on matter through its gravitational pull, influencing the rotation of galaxies. Dark energy, on the other hand, is responsible for the accelerated expansion of the universe. It can be described by introducing a new physical quantity – Albert Einstein’s Cosmological Constant. Alternatively, so-called quintessence theories have been put forward: “Perhaps empty space is not completely empty after all, but permeated by an unknown field, similar to the Higgs-field”, says Professor Hartmut Abele (TU Vienna), director of the Atominstitut and leader of the research group. These theories are named after Aristotle’s “quintessence” – a hypothetical fifth element, in addition to the four classical elements of ancient Greek philosophy.

If new kinds of particles or additional forces of nature exist, it should be possible to observe them here on earth. Tobias Jenke and Hartmut Abele from the Vienna University of Technology developed an extremely sensitive instrument, which they used together with their colleagues to study gravitational forces. Neutrons are perfectly suited for this kind of research: they carry no electric charge and are hardly polarizable. They are only influenced by gravity – and possibly by additional, yet unknown forces. Theoretical calculations analysing the behaviour of the neutrons were done by Larisa Chizhova, Professor Stefan Rotter and Professor Joachim Burgdörfer (TU Vienna). U. Schmidt from Heidelberg University and T. Lauer from TU Munich contributed an analytic tool.


Forces at Small Distances 

The technique they developed takes very slow neutrons from the strongest continuous ultracold neutron source in the world, at the ILL in Grenoble, and funnels them between two parallel plates. According to quantum theory, the neutrons can only occupy discrete quantum states, with energies that depend on the force gravity exerts on the particle. By mechanically oscillating the two plates, the quantum state of the neutron can be switched. That way, the difference between the energy levels can be measured.
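The discrete states referred to here are those of the textbook “quantum bouncer”: a neutron held by gravity above a mirror has energy levels set by the zeros of the Airy function. A short sketch of that standard result (not the collaboration’s analysis code) reproduces the pico-electronvolt energy scale involved:

```python
from scipy.special import ai_zeros

HBAR = 1.054571817e-34      # reduced Planck constant, J s
H = 6.62607015e-34          # Planck constant, J s
M_N = 1.67492749e-27        # neutron mass, kg
G = 9.81                    # gravitational acceleration, m/s^2
EV = 1.602176634e-19        # joules per electronvolt

# Quantum bouncer: E_n = (hbar^2 * m * g^2 / 2)^(1/3) * |a_n|,
# where a_n are the (negative) zeros of the Airy function Ai.
scale_J = (HBAR**2 * M_N * G**2 / 2.0) ** (1.0 / 3.0)
zeros = ai_zeros(3)[0]

energies_peV = [scale_J * abs(a) / EV * 1e12 for a in zeros]
print(energies_peV)  # roughly [1.41, 2.46, 3.32] pico-electronvolts

# Oscillating the plates near the |1> -> |2> transition frequency
# drives the state flip that the experiment detects.
print((energies_peV[1] - energies_peV[0]) * 1e-12 * EV / H, "Hz")  # ~254 Hz
```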

“This work is an important step towards modelling gravitational interactions at very short distances. The ultracold neutrons produced at ILL together with the measurement devices from Vienna are the best tool in the world for studying the predicted tiny deviations from pure Newtonian gravity”, says Peter Geltenbort (ILL Grenoble). 

Different parameters determine the level of precision required to find such tiny deviations – for instance the coupling strength between hypothetical new fields and the matter we know. Certain parameter ranges for the coupling strength of quintessence particles or forces have already been excluded following other high-precision measurements. But all previous experiments still left a large parameter space in which new physical non-Newtonian phenomena could be hidden. 


A Hundred Thousand Times Better than Other Methods 

The new neutron method can test theories in this parameter range: “We have not yet detected any deviations from the well-established Newtonian law of gravity”, says Hartmut Abele. “Therefore, we can exclude a broad range of parameters.” The measurements determine a new limit for the coupling strength, which is lower than the limits established by other methods by a factor of a hundred thousand. 

Even if the existence of certain hypothetical quintessence particles is disproved by these measurements, the search will continue, as new physics may still be found below this improved level of accuracy. Gravity resonance spectroscopy will therefore need to be improved further – and increasing the accuracy by another few orders of magnitude seems feasible to Abele’s team. However, if even that does not yield any evidence of deviations from known forces, Albert Einstein would win yet another victory: his cosmological constant would then appear more and more plausible.

There’s A Blood Test To Predict Just About Everything, But How Much Would You Want To Know?


Your bloodstream is a messy, tumbling universe of many inhabitants — all of them racing around, eternally hurried, never quite finishing the work they intended to do. Luckily, medical science has made something of an art of capturing this chaos, as the common blood test is perhaps the most useful tool we have to assess a person’s health.


We’re getting so good, in fact, that just about every major disease has been found, in some capacity, to show up as certain risk factors in a blood test. These include: concussions, cancers, obesity, heart attack, cataracts, developmental delays in newborns, multiple sclerosis, Alzheimer’s disease, fibromyalgia, diabetes, and even death. Countless lives have been saved as a result of these diagnoses, yet on an individual level an important question persists: How much speculation about your health is too much?

For blood tests to mean anything, scientists need to find telltale “biomarkers.” Basically, biomarkers are the red flags in your blood. They can be substances or states, meaning they can be specific tracers introduced into the body, like certain radioactive isotopes that light up under an X-ray, or they can be electrical signals coming from your brain.

Biomarkers also exist naturally in your body. Such is the case with prostate-specific antigen, or PSA. Over time, science has discovered that a greater presence of PSA correlates directly with the size of a man’s prostate, which could indicate the presence of cancer. Blood tests rely on these naturally occurring associations: the higher your number, the greater your chance of some known disease or condition.

The problem is, biomarkers aren’t perfect. PSA tests, for instance, have a notoriously high false-positive rate: they are highly sensitive but not very specific. In other words, while they rarely miss a cancer, they often flag trouble where there is none, which means doctors may order painful follow-up biopsies even though no cancer is present.


Doctors, for their part, may be falling into something of a statistical trap. One 1982 study asked physicians a hypothetical question. The skeleton of the problem has been changed multiple times over the decades, but the effect is equally perplexing: John says he’s itchy. There’s an allergy test that can determine whether he has allergies, but like other tests, it comes with a false-positive risk. In this case, it’s 10 percent. The question is: if one percent of the population has the allergy and John’s test result says “Yes,” what are the chances he actually has the allergy?

The correct answer, to many people’s surprise, is about seven percent. “Unfortunately,” the researchers wrote, “most physicians (approximately 95 out of 100…) misinterpret the statements about the accuracy of the test and estimate [the prevalence rate] to be about 75%.” Their thinking was backward, the researchers point out. They assumed that the probability John has the allergy given a positive test result was equal to the probability of a positive test result given that a patient has the condition. The difference is subtle, but hugely important. “The latter probability is the one measured in clinical research programs and is very familiar, but it is the former probability that is needed for clinical decision making.”
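The trap is a textbook Bayes’ rule calculation. A quick check of the article’s numbers (the vignette doesn’t state the test’s sensitivity; roughly 80 percent is assumed here, in line with the 1982 study’s version of the problem):

```python
def posterior(prevalence, sensitivity, false_positive_rate):
    """P(condition | positive test), by Bayes' rule."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# 1% prevalence, ~80% sensitivity (assumed), 10% false-positive rate.
print(f"{posterior(0.01, 0.80, 0.10):.0%}")  # ~7%, nowhere near 75%
```

Because the allergy is rare, the 10 percent of healthy people who test positive vastly outnumber the true positives, which is why the answer sits near seven percent rather than 75.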

Now multiply these simple errors in reasoning across the dozens of blood tests currently available to patients. Blood tests may be reliable indicators of fat levels and cholesterol in the blood, but when it comes to associating the results with larger, more urgent diseases — especially those that come with painful or costly treatment regimens — not all information can be trusted. Now more than ever, patients should stick to the sage advice: Get a second opinion.
