Study finds laundry detergent pods pose serious poisoning risk for children


Laundry detergent pods began appearing on U.S. store shelves in early 2010, and people have used them in growing numbers ever since. The small packets can be tossed into a washing machine without ever having to measure out a liquid or powder. The convenience, though, has come with risks for young children.

A new study from researchers at Nationwide Children’s Hospital found that from 2012 through 2013, U.S. poison control centers received reports of 17,230 children younger than 6 years of age swallowing, inhaling, or otherwise being exposed to chemicals in pods. That’s nearly one young child every hour. A total of 769 were hospitalized during that period, an average of one per day. One child died.
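Those per-hour and per-day figures check out with simple arithmetic. Here is a minimal sketch, assuming the study window spans the full 730 days of 2012 through 2013 (the paper gives exact dates):

```python
# Back-of-the-envelope check of the reported rates, assuming a 730-day window.
exposures = 17230
hospitalized = 769
days = 730

print(f"Exposures per hour: {exposures / (days * 24):.2f}")    # ~0.98, "nearly one every hour"
print(f"Hospitalizations per day: {hospitalized / days:.2f}")  # ~1.05, "about one per day"
```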

One- and two-year-olds accounted for nearly two-thirds of cases. Children that age often put items in their mouths as a way of exploring their environments. Children who put detergent pods in their mouths risk swallowing a large amount of concentrated chemicals. The vast majority of exposures in this study were due to ingestion.

“Laundry detergent pods are small, colorful, and may look like candy or juice to a young child,” said Marcel J. Casavant, MD, a co-author of the study, chief of toxicology at Nationwide Children’s Hospital and medical director of the Central Ohio Poison Center. “It can take just a few seconds for children to grab them, break them open, and swallow the toxic chemicals they contain, or get the chemicals in their eyes.”

Nearly half (48%) of children vomited after laundry detergent pod exposure. Other common effects were coughing or choking (13% of cases), eye pain or irritation (11%), drowsiness or lethargy (7%) and red eye or conjunctivitis (7%).

A leading manufacturer of laundry detergent pods began changing its packaging in the spring of 2013, introducing containers that were not see-through and adding latches and a warning label to the containers. However, laundry detergent pods from many makers continue to be sold in see-through packages with zip-tops or other easily opened containers.

“It is not clear that any laundry detergent pods currently available are truly child resistant; a national safety standard is needed to make sure that all pod makers adopt safer packaging and labeling,” said Gary Smith, MD, DrPH, the study’s senior author and director of the Center for Injury Research and Policy at Nationwide Children’s Hospital. “Parents of young children should use traditional detergent instead of detergent pods.”

Parents and child caregivers can help children stay safe by following these tips:

  • Parents with young children and child caregivers should use traditional laundry detergent, which is much less toxic than laundry detergent pods.
  • Store laundry detergent pods up, away, and out of sight – in a locked cabinet is best.
  • Close laundry detergent pod packages or containers and put them away immediately after use.
  • Save the national Poison Help Line number (1-800-222-1222) in your cell phone and post it near your home phones.

Data for this study came from the National Poison Database System, the most comprehensive and accurate database available for investigation of poisonings in the United States. The study was conducted by researchers at the Center for Injury Research and Policy and the Central Ohio Poison Center, both at Nationwide Children’s Hospital, and The Ohio State University College of Medicine.

‘Darwin’s Dilemma’ May Be Solved.


PHOTO: Charles Darwin at his home at Down House, Kent, 1880.

Scientists following two different lines of evidence have just published research that may help resolve “Darwin’s dilemma,” a mystery that plagued the father of evolution until his death more than a century ago.

Biologists and geologists have been puzzled for decades over why life began so early on this planet, and then took so long to get interesting.

Some estimates indicate the earth was only a few tens of millions of years old when the first simple organisms appeared. There was a little evolution over the first billion years when single-celled organisms morphed into bacteria, slimy algae and other simple kinfolk, but it was still pretty dull.

It didn’t get much better until nearly 600 million years ago when the most dramatic period in the biological history of the planet erupted in what has become known as the “Cambrian Explosion.”

Those boring organisms from early earth evolved into forms of nearly every plant and animal on the planet today in what has seemed like an incredibly short period of time.

It seemed so fast, in fact, that Charles Darwin worried that it might undermine his theory of evolution, thus giving birth to “Darwin’s dilemma.”

Darwin thought evolution was a very slow process, proceeding in tiny changes over many generations — a point that some people still misunderstand today. But if he was right, how could life have evolved so quickly during the Cambrian era, advancing from simple forms to complex plants and animals in the geological equivalent of the blink of an eye?

Of course, we know today that Darwin was right about evolution but wrong about the bleak fossil record, which was still incredibly thin in the 1800s.

PHOTO: Christopher Reinhard of Georgia Institute of Technology, left, and Noah Planavsky of Yale University collecting fossils in China. Credit: Yale University

Excavations around the world have since uncovered fossils that show the change was rapid, but not too rapid to be inconsistent with evolutionary theory. It actually took millions of years.

Still, why it happened as quickly as it did remains hotly debated. Something must have changed dramatically, and scientists continue to chip away at that annoying conundrum. Two recent studies published days apart may help clear the air.

A paper published last week in Science by Noah Planavsky of Yale University and Christopher Reinhard of Georgia Institute of Technology, based on ancient sediments from China, Australia, Canada and the United States, suggests that scientists have long overestimated the amount of oxygen in the earth’s atmosphere in the pre-Cambrian era just before the “explosion.”

Many had thought the air was about 40 percent oxygen (around twice what it is today), but oxidized chromium in those sediments — which is directly linked to oxygen in the atmosphere — indicates the percentage was only about one-tenth of one percent.
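To put the scale of that revision in perspective, here is the arithmetic; today's roughly 21 percent oxygen level is general knowledge, and the other figures are from the studies as reported:

```python
# Comparing the old and new estimates of pre-Cambrian atmospheric oxygen.
old_estimate = 40.0   # percent, the long-standing estimate
new_estimate = 0.1    # percent, implied by the oxidized-chromium data
today = 21.0          # percent, the modern atmosphere

print(f"Old estimate vs today: {old_estimate / today:.1f}x")         # ~1.9x, "around twice"
print(f"Downward revision: {old_estimate / new_estimate:.0f}-fold")  # a 400-fold drop
```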

No complex organism known today could survive in a world with that little oxygen, so if this team is correct, the stage was not yet set for rapid evolutionary processes. Something had to change before the explosion could occur.

Meanwhile, other evidence published last week in the journal Geology suggests that very dramatic changes driven by the tectonic breakup of the so-called “supercontinents” of the pre-Cambrian era could have caused an extraordinary leap in oxygen levels of both the ancient oceans and the earth’s atmosphere.

This Device Diagnoses Hundreds of Diseases Using a Single Drop of Blood.


PHOTO: rHEALTH X1.

The digital health revolution is still stuck.

Tech giants are jumping into the fray with fitness offerings like Apple Health and Google Fit, but there’s still not much in the way of, well, actual medicine. The Fitbits and Jawbones of the world measure users’ steps and heart rate, but they don’t get into the deep diagnostics of, say, biomarkers, the internal indicators that can serve as an early warning sign of a serious ailment. For now, those who want to screen for a disease or measure a medical condition with clinical accuracy still need to go to the doctor.

Dr. Eugene Chan and his colleagues at the DNA Medical Institute (DMI) aim to change that. Chan’s team has created a portable handheld device that can diagnose hundreds of diseases using a single drop of blood with what Chan claims is gold-standard accuracy. Known as rHEALTH, the technology was developed over the course of seven years with grants from NASA, the National Institutes of Health, and the Bill and Melinda Gates Foundation. On Monday, the team received yet another nod (and more funding) as the winners of this year’s Nokia Sensing XChallenge, one of several competitions run by the moonshot-seeking XPrize Foundation.

The goal of the XChallenge is to accelerate innovation in sensor technologies that address healthcare problems. Teams came up with tools intended to quickly and easily allow individuals to detect possible health problems without having to rely on analysis from large, facility-bound lab instruments. First hatched by DMI in response to a NASA challenge to create a diagnostics device that could work even in space, rHEALTH was portable from the beginning.

“There used to be no method for good, autonomous diagnosis,” Chan tells WIRED. “rHEALTH technology is highly sensitive, quantitative, and capable of meeting the FDA’s bar for sophistication, while still being geared for consumers.”

Blood to Bluetooth

Here’s how it works: One small drop of blood is dropped into a small receptacle, where nanostrips and reagents react to the blood’s contents. The whole cocktail then goes through a spiral micro-mixer and is streamed past lasers that use variations in light intensity and scattering to come up with a diagnosis, from flu to a more serious illness such as pneumonia—or even Ebola—within a few minutes. There’s also a vitals patch that users can wear to get continuous health readings—EKG, heart rate, body temperature—delivered to their smartphone or the rHEALTH device itself via a Bluetooth link. An app called CHAS (Comprehensive Health Assessment Unit) can walk the user through the process of self-diagnosis.
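As a rough mental model of that pipeline, here is a deliberately simplified sketch; every class name, function, and threshold in it is invented for illustration and is not DMI's actual software or diagnostic logic:

```python
# Hypothetical sketch of the rHEALTH data flow described above:
# blood drop -> nanostrip reagents -> spiral micro-mixer -> laser readout.
from dataclasses import dataclass

@dataclass
class OpticalReading:
    intensity: float   # laser light intensity measured in the flow channel
    scattering: float  # light-scattering signal from the mixed sample

def screen(reading: OpticalReading) -> str:
    """Map optical signals to a screening result (toy thresholds, not medicine)."""
    if reading.scattering > 0.8:
        return "flag: signature consistent with serious infection"
    if reading.intensity < 0.3:
        return "flag: abnormal biomarker levels"
    return "no flags raised"

print(screen(OpticalReading(intensity=0.25, scattering=0.4)))
```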

The real innovation of rHEALTH, according to Chan, is in getting all the diagnostics technologies packed together into one handheld device. By shrinking its components so much compared to traditional devices, Chan says, patients will need to give 1,500 times less blood than they would for regular tests. Since it was originally developed for NASA, the device has even been tested in simulated lunar and zero gravity. “It’s a symphony of innovations, but we’ve pushed all of them individually to create the device,” Chan says.


Right now, rHEALTH is reliable for cell counts, HIV detection, vitamin D levels, and various protein markers in the body. The next challenges, according to Chan, are adding more tests, scaling up production, and going through the laborious process of getting rHEALTH commercialized. The company is manufacturing three different models: the rHEALTH One, which will be used for translational research; the rHEALTH X, meant to be used as a kind of power tool for clinicians; and the rHEALTH X1, which will be available for consumers.

Since the rHEALTH One need only be vetted by Institutional Review Boards (IRBs) before being used in research—it doesn’t have to meet the stringent FDA standards it will need to reach before being marketed to physicians and consumers—Chan says DMI can ship units in a matter of weeks to interested scientists. Chan’s team will learn from how it’s utilized in research settings to make improvements.

Making Real Changes

It could be a while before consumers actually get access to rHEALTH. In the meantime, the next challenge for Chan and his team is to prepare for the bigger, $10 million challenge from the XPrize Foundation, the Tricorder XPRIZE, which the Nokia Sensing XChallenge was set up to feed. The goal is to create a universal, Star Trek-inspired medical diagnostic tool that detects up to 16 separate health conditions. Of the 11 teams included in the Sensing XChallenge, only DMI is also a Tricorder finalist.

When rHEALTH finally does become available to consumers, Chan says the hope is that people will use the technology to make meaningful lifestyle changes based on the real, robust medical data from the device—a step beyond what he sees as the typical fitness tracker.

“It’s interesting to see how people interact with wearables,” says Chan. “A lot of them think of them as toys or gadgets. That’s not what rHEALTH is. It’s really meant to help you take care of yourself when you’ve got a serious health condition.”

Genetically modified crops.


ON NOVEMBER 4th voters in Colorado rejected a ballot initiative that would have required special labels for foods made with genetically modified (GM) ingredients. As The Economist went to press, voters in Oregon seemed likely to say no to a similar proposal there, though the count was not complete. Regardless of the outcome, however, the referendums indicate the strength of feeling generated by GM crops: the Oregon vote was the costliest ballot in the state’s history. By chance, the day before the poll saw the publication in PLOS ONE of the largest review yet conducted of the crops’ effects on farming. It concludes that these have been overwhelmingly positive.

The review in question is a meta-analysis. This is a statistically rigorous study of studies, rather than a mere summary of the literature. Its authors, Matin Qaim and Wilhelm Klümper, both of Göttingen University, in Germany, went through all examinations of the agronomic and economic impacts of GM crops published in English between 1995 and March 2014. This provides a near-complete survey. Most studies of the subject have been published in English, and the widespread adoption of such crops began only in the mid-1990s.

Commercial genetic modification for crops comes in two forms. One makes them resistant to insect pests. The other confers tolerance to glyphosate, enabling farmers to spray their fields with this herbicide and kill off all the other plants (ie, the weeds) in them. As a consequence, the study found, herbicide-tolerant crops have lower production costs—though this was not true for insect-resistant crops, where the need for less pesticide was offset by higher seed prices, and overall production costs were thus about the same as for unmodified crops. With both forms of modification, however, the yield rise was so great (9% above non-GM crops for herbicide tolerance and 25% above for insect resistance) that farmers who adopted GM crops made 69% higher profits than those who did not.
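The profit figure can exceed both yield figures because yield gains act on thin margins. A toy calculation makes the leverage visible; only the 25% yield gain comes from the study, while the revenue and cost numbers are assumptions:

```python
# Why a 25% yield gain can produce a much bigger profit gain when
# production costs stay roughly flat. Revenue and cost are invented.
revenue_non_gm = 100.0   # index the non-GM crop's revenue to 100
revenue_gm = 125.0       # 25% higher yield at the same price
cost = 80.0              # similar production costs for insect-resistant crops

profit_non_gm = revenue_non_gm - cost   # 20
profit_gm = revenue_gm - cost           # 45
print(f"Profit gain: {(profit_gm / profit_non_gm - 1) * 100:.0f}%")  # 125% under these assumptions
```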

Many poor countries eschew GM crops, fearing they will not be able to export them to areas which ban them, notably the European Union. This has a big opportunity cost. Dr Qaim and Dr Klümper found that GM crops do even better in poor countries than in rich ones. Farmers in developing nations who use the technology achieve yields 14 percentage points above those of GM farmers in the rich world. Pests and weeds are a bigger problem in poor countries, so GM confers bigger benefits.

In debates about GM the methodology of studies has often generated as much controversy as the crops themselves. Drs Klümper and Qaim have done something to moderate these controversies, too. Though some studies they include were not peer-reviewed, and a few of the early ones did not report sample sizes, limiting their value, the data they used for the meta-analysis—which include conference papers, working papers and book chapters as well as work published in academic journals—may correct for perceived publication bias, the tendency of journals to publish only the most dramatic findings. This large body of evidence enabled the authors to control for possible differences in matters other than whether a crop was modified or not, such as fertiliser use. They also found that who pays for a study does not seem to influence its results.
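For readers unfamiliar with the method, the core of a fixed-effect meta-analysis is an inverse-variance weighted average of per-study effect sizes. Here is a minimal sketch with made-up numbers, not data from Klümper and Qaim's paper:

```python
# Fixed-effect meta-analysis: weight each study by 1/SE^2 and pool.
def pooled_effect(effects, std_errors):
    weights = [1 / se**2 for se in std_errors]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return estimate, pooled_se

effects = [0.22, 0.31, 0.18, 0.27]     # hypothetical per-study effects
std_errors = [0.05, 0.10, 0.04, 0.08]  # hypothetical standard errors

est, se = pooled_effect(effects, std_errors)
print(f"Pooled effect: {est:.3f} (95% CI +/- {1.96 * se:.3f})")
```

Weighting by the inverse of each study's variance means precise studies count for more, which is why pooling can settle questions that individual small studies cannot.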

Dr Klümper and Dr Qaim conclude by expressing a hope that their work “may help to gradually increase public trust in this promising technology”. To judge by the heat generated in Oregon and Colorado, that may take time.

New study suggests cats became domesticated for treats.


Scientists have finally pinpointed the genes that separate your feline friend from a wildcat, and the majority are those associated with reward and pleasure.

Your cat may be one of your best friends, but how much do you really know about its heritage? Cats and humans have shared the same households for the past 9,000 years, but this is nothing compared to the 30,000 years that dogs have been domesticated. With much less information to go on, it’s so far been very difficult for scientists to figure out the genetic roots that separate your feline friend from their wild relatives.

In an attempt to find out what makes cats want to cuddle with you on the couch instead of trying to eat you, researchers from Washington University School of Medicine in the US have analysed the genomes of domesticated cats and wildcats to discover just how different certain genes have become. The wildcat (Felis silvestris) is the ancestor of the domestic cat, and subspecies are found in Europe, Africa, and West to Central Asia.

To identify these genes, the team first sequenced the genome of a domestic female Abyssinian, and compared it to the genomes of six other domestic cat breeds and two wildcat subspecies.

The results, published in the Proceedings of the National Academy of Sciences, identified 13 important genes that have changed as cats morphed from feral to friendly. Some of these genes are involved in cognition and motivation, including the ability to learn new behaviours when offered rewards, which is an important part of the domestication process.

The researchers also uncovered 281 other genes that are responsible for the various differences between the physiology of domestic cats and wildcats. Some of these genes influence fat metabolism, which the team suggests is how domestic cats have adapted to a less carnivorous lifestyle than wildcats.

The team also compared the domestic cat genome with those of other mammals including dogs, cows and humans.

Compared to dogs, cats have several more copies of genes to detect sex hormones, which enables them to monitor their social environment. This ability is not as important for dogs, as they tend to travel in packs. Interestingly, dogs had more copies of the genes for smell receptors than cats, which contributes to their incredible sense of smell.

“The study is great, especially in defining changes in the genome that have led to domestication or, more correctly, to the adaptation of the ancestors of domestic cats that allowed them to associate with humans and thus gain both protection from their predators and an ample food supply (rodents),” Niels Pederson, a veterinary researcher at the University of California in the US who was not involved in the study, told Tia Ghose from Live Science. 

The research is the first evidence for the evolutionary history of cats and the team now hopes to analyse specific genomic regions to see how they impact the behaviour of our feline friends.

Thousands of never-before-seen human genome variations uncovered .


Thousands of never-before-seen genetic variants in the human genome have been uncovered using a new genome sequencing technology. These discoveries close many human genome mapping gaps that have long resisted sequencing. The technique, called single-molecule, real-time (SMRT) DNA sequencing, may now make it possible for researchers to identify potential genetic mutations behind many conditions whose genetic causes have long eluded scientists.

“We now have access to a whole new realm of genetic variation that was opaque to us before,” said Evan Eichler, professor of genome sciences at the University of Washington and the study’s senior author.

Eichler and his colleagues report their findings Nov. 10 in the journal Nature.

To date, scientists have been able to identify the genetic causes of only about half of inherited conditions. This puzzle has been called the “missing heritability problem.” One reason for this problem may be that standard genome sequencing technologies cannot map many parts of the genome precisely. These approaches map genomes by aligning hundreds of millions of small, overlapping snippets of DNA, typically about 100 bases long, and then analyzing their DNA sequences to construct a map of the genome.

This approach has successfully pinpointed millions of small variations in the human genome. These variations arise from the substitution of a single nucleotide base, called a single-nucleotide polymorphism, or SNP. The standard approach also made it possible to identify very large variations, typically involving segments of DNA that are 5,000 bases long or longer. But for technical reasons, scientists had previously not been able to reliably detect variations in between — those ranging from about 50 to 5,000 bases in length.
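The span constraint behind that detection gap is easy to see in a toy model: a read can only directly reveal a variant that fits inside it with enough flanking sequence to anchor the alignment on both sides. Real variant callers also use paired-end and split-read signals; this sketch, with assumed flank and read lengths, shows only the basic geometry:

```python
# Toy model of the detection gap: with ~100-base reads, a 1-base SNP fits
# easily, but mid-size variants (50-5,000 bases) do not, while long reads
# can span them.
def read_can_span(read_len: int, variant_len: int, flank: int = 30) -> bool:
    """A read spans a variant if the variant plus anchoring flanks fits inside it."""
    return variant_len + 2 * flank <= read_len

for variant_len in (1, 100, 1000):
    for read_len in (100, 10000):
        spans = read_can_span(read_len, variant_len)
        print(f"{read_len:>6}-base read, {variant_len:>5}-base variant: spans={spans}")
```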

The SMRT technology used in the new study makes it possible to sequence and read DNA segments longer than 5,000 bases, far longer than standard gene sequencing technology.

This “long-read” technique, developed by Pacific Biosciences of California, Inc. of Menlo Park, Calif., allowed the researchers to create a much higher resolution structural variation map of the genome than has previously been achieved. Mark Chaisson, a postdoctoral fellow in Eichler’s lab and lead author on the study, developed the method that made it possible to detect structural variants at base-pair resolution using these data.

To simplify their analysis, the researchers used the genome from a hydatidiform mole, an abnormal growth caused when a sperm fertilizes an egg that lacks the DNA from the mother. The fact that the mole genome contains only one copy of each gene, instead of the two copies that exist in a normal cell, simplifies the search for genetic variation.

Using the new approach on the hydatidiform mole genome, the researchers were able to identify and sequence 26,079 segments that differed from a standard human reference genome used in genome research. Most of these variants, about 22,000, have never been reported before, Eichler said.

“These findings suggest that there is a lot of variation we are missing,” he said.

The technique also allowed Eichler and his colleagues to map some of the more than 160 segments of the genome, called euchromatic gaps, that have defied previous sequencing attempts. Their efforts closed 50 of the gaps and narrowed 40 others.

The gaps include some important sequences, Eichler said, including parts of genes and regulatory elements that help control gene expression. Some of the DNA segments within the gaps show signatures that are known to be toxic to Escherichia coli, a bacterium commonly used in some genome sequencing processes.

Eichler said, “It is likely that if a sequence of this DNA were put into an E. coli, the bacteria would delete the DNA.” This may explain why it could not be sequenced using standard approaches. He added that the gaps also carry complex sequences that are not well reproduced by standard sequencing technologies.

“The sequences vary extensively between people and are likely hotspots of genetic instability,” he explained.

For now, SMRT technology will remain a research tool because of its high cost, about $100,000 per genome.

Eichler predicted, “In five years there might be a long-read sequence technology that will allow clinical laboratories to sequence a patient’s chromosomes from tip to tip and say, ‘Yes, you have about three to four million SNPs and insertions and deletions, but you also have approximately 30,000-40,000 structural variants. Of these, a few structural variants and a few SNPs are the reason why you’re susceptible to this disease.’ Knowing all the variation is going to be a game changer.”


Story Source:

The above story is based on materials provided by University of Washington Health Sciences/UW Medicine. Note: Materials may be edited for content and length.


Journal Reference:

  1. Mark J. P. Chaisson, John Huddleston, Megan Y. Dennis, Peter H. Sudmant, Maika Malig, Fereydoun Hormozdiari, Francesca Antonacci, Urvashi Surti, Richard Sandstrom, Matthew Boitano, Jane M. Landolin, John A. Stamatoyannopoulos, Michael W. Hunkapiller, Jonas Korlach, Evan E. Eichler. Resolving the complexity of the human genome using single-molecule sequencing. Nature, 2014; DOI: 10.1038/nature13907

Playing action video games can boost learning, new study reports


A new study shows for the first time that playing action video games improves not just the skills taught in the game, but learning capabilities more generally.

“Prior research by our group and others has shown that action gamers excel at many tasks. In this new study, we show they excel because they are better learners,” explained Daphne Bavelier, a research professor in brain and cognitive sciences at the University of Rochester. “And they become better learners,” she said, “by playing the fast-paced action games.”

According to Bavelier, who also holds a joint appointment at the University of Geneva, our brains keep predicting what will come next—whether we are listening to a conversation, driving, or even performing surgery. “In order to sharpen its prediction skills, our brains constantly build models, or ‘templates,’ of the world,” she explained. “The better the template, the better the performance. And now we know playing action video games actually fosters better templates.”

Action Players vs. Non-Action Players

In the current study, published in the Proceedings of the National Academy of Sciences, Bavelier and her team first used a pattern discrimination task to compare action video game players’ visual performance with that of individuals who do not play action video games.

The action gamers outperformed the non-action gamers. The key to the action gamers’ success, the researchers found, was that their brains used a better template for the task at hand.

Video Training

Then, the team conducted another experiment to determine whether habitual players of fast-paced, action-rich video games are endowed with better templates independently of their game play, or whether the action game play led them to develop better templates.

Individuals with little video game experience were recruited, and as part of the experiment, they were asked to play video games for 50 hours over the course of nine weeks. One group played action video games, e.g., Call of Duty. The second group played 50 hours of non-action video games, such as The Sims.

The trainees were tested on a pattern discrimination task before and after the video game “training.” The test showed that the action video game players improved their templates, compared to the control group who played the non-action video games. The authors then turned to neural modeling to investigate how action video games may foster better templates.
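One standard way to formalize a perceptual "template" is as a matched filter: the observer correlates noisy input with an internal template, and discrimination improves as the template approaches the true signal. The sketch below is a generic signal-detection toy illustrating that idea, not the authors' actual neural model:

```python
# Matched-filter toy: a well-tuned template separates signal from noise
# better than a rough one.
import math
import random

random.seed(0)

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

def accuracy(template, signal, noise_sd=1.0, trials=5000):
    """Two-interval task: pick which interval contains the signal."""
    correct = 0
    for _ in range(trials):
        with_signal = [s + random.gauss(0, noise_sd) for s in signal]
        noise_only = [random.gauss(0, noise_sd) for _ in signal]
        if correlate(template, with_signal) > correlate(template, noise_only):
            correct += 1
    return correct / trials

signal = [math.sin(i / 2.0) for i in range(32)]
rough_template = [0.3 * s + random.gauss(0, 0.8) for s in signal]  # poorly tuned
tuned_template = signal                                            # well tuned

print(f"rough template accuracy: {accuracy(rough_template, signal):.3f}")
print(f"tuned template accuracy: {accuracy(tuned_template, signal):.3f}")
```

In this toy, "learning" would amount to the rough template drifting toward the tuned one; the study's claim is that action gamers make that drift faster.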

Measuring Learning

When the researchers gave action gamers a task, they found that the action video game players were able to build and fine-tune templates more quickly than non-action game control participants. And they did so on the fly as they engaged in the task.

Being a better learner means developing the right templates faster, and thus performing better. And playing action video games, the research team found, boosts that process.

“When they began the perceptual learning task, action video gamers were indistinguishable from non-action gamers; they didn’t come to the task with a better template,” said Bavelier. “Instead, they developed better templates for the task much, much faster, showing an accelerated learning curve.”

The researchers also found that the action gamers’ improved performance is a lasting effect. When tested several months to a year later, the action-trained participants still outperformed the other participants, suggesting that they retained their ability to build better templates.

Bavelier’s team is currently investigating which characteristics of action video games are key to boosting players’ learning. “Games other than action video games may be able to have the same effect,” she said. “They may need to be fast paced, and require the player to divide his or her attention, and make predictions at different time scales.”

Moderate drinking is healthy only for some people, study finds.


A new study at Sahlgrenska Academy, University of Gothenburg, confirms that moderate alcohol consumption can protect against coronary heart disease, but only for the 15% of the population that has a particular genotype.

Protective effect The results, which have been published in Alcohol, confirm the findings of earlier studies: moderate consumption of alcohol helps protect people with the genotype against coronary heart disease.

“In other words, moderate drinking has a protective effect among only 15% of the general population,” says Professor Dag Thelle, Professor Emeritus at Sahlgrenska Academy, University of Gothenburg.

Sweeping advice Thus, the researchers believe that the advice frequently given about the health benefits of moderate alcohol consumption is far too sweeping.

“Moderate drinking alone does not have a strong protective effect,” says Professor Lauren Lissner, who also participated in the study. “Nor does this particular genotype. But the combination of the two appears to significantly reduce the risk of coronary heart disease.”
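This kind of effect modification shows up in case-control data as an exposure odds ratio that differs by genotype. Here is a minimal sketch with invented counts, not the INTERGENE data:

```python
# Effect modification: the same "moderate drinking" exposure is protective
# in one genotype stratum and neutral in the other. Counts are hypothetical.
def odds_ratio(cases_exp, cases_unexp, controls_exp, controls_unexp):
    return (cases_exp / cases_unexp) / (controls_exp / controls_unexp)

# Carriers of the protective CETP genotype (~15% of people)
or_carriers = odds_ratio(20, 80, 50, 50)   # 0.25: lower odds of heart disease
# All other genotypes
or_others = odds_ratio(50, 50, 50, 50)     # 1.00: no association

print(f"OR, protective genotype: {or_carriers:.2f}")
print(f"OR, other genotypes:     {or_others:.2f}")
```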

Unknown mechanisms The genotype codes for cholesteryl ester transfer protein (CETP), which affects the ‘good,’ cardio-protective HDL cholesterol that helps remove excess lipids from the blood vessels. One hypothesis is that alcohol somehow affects CETP in a way that benefits HDL cholesterol.

A second hypothesis is that alcohol contains healthy, protective antioxidants.

The researchers believe that one or both of the hypotheses may prove correct, but the mechanisms by which HDL cholesterol or antioxidants might act remain unknown.

“Our study represents a step in the right direction,” Professor Thelle says, “but a lot more research is needed. Assuming that we are able to describe these mechanisms, it may be a simple matter one day to perform genetic testing and determine whether someone belongs to the lucky 15%. That would be useful to know when offering advice on healthy alcohol consumption. But the most important thing is to identify new means of using the body’s resources to prevent coronary heart disease.”


Story Source:

The above story is based on materials provided by University of Gothenburg. Note: Materials may be edited for content and length.


Journal Reference:

  1. Kirsten Mehlig et al. CETP TaqIB genotype modifies the association between alcohol and coronary heart disease: The INTERGENE case-control study. Alcohol, 2014; DOI: 10.1016/j.alcohol.2014.08.01

The Rise of All-Purpose Antidepressants.


Antidepressant use among Americans is skyrocketing. Adults in the U.S. consumed four times more antidepressants in the late 2000s than they did in the early 1990s. Antidepressants are now the third most frequently taken medication in the U.S., and researchers estimate that 8 to 10 percent of the population is taking one. But this spike does not necessarily signify a depression epidemic. Through the early 2000s pharmaceutical companies were aggressively testing selective serotonin reuptake inhibitors (SSRIs), the dominant class of depression drug, for a variety of disorders—the timeline below shows the rapid expansion of FDA-approved uses.

As the drugs’ patents expired, companies stopped funding studies for official approval. Yet doctors have continued to prescribe them for more ailments. One motivating factor is that SSRIs are a fairly safe option for altering brain chemistry. Because we know so little about mental illness, many clinicians reason, we might as well try the pills already on the shelf.

Common Off-Label Uses
Doctors commonly use antidepressants to treat many maladies for which they are not approved. In fact, studies show that between 25 and 60 percent of prescribed antidepressants are actually used to treat nonpsychological conditions. The most common and well-supported off-label uses of SSRIs include:

  • Abuse and dependence
  • ADHD (in children and adolescents)
  • Anxiety disorders
  • Autism (in children)
  • Bipolar disorder
  • Eating disorders
  • Fibromyalgia
  • Neuropathic pain
  • Obsessive-compulsive disorder
  • Premenstrual dysphoric disorder

 

Investigational Uses
SSRIs have shown promise in clinical trials for many more disorders, and some doctors report using them successfully to treat these ailments:

  • Arthritis
  • Deficits caused by stroke
  • Diabetic neuropathy
  • Hot flashes
  • Irritable bowel syndrome
  • Migraine
  • Neurocardiogenic syncope (fainting)
  • Panic disorder
  • Post-traumatic stress disorder
  • Premature ejaculation

 

An Expanding Repertoire: The SSRIs approved in the U.S. and the dates the FDA approved each to treat various disorders.

 

Robot that moves like an inchworm could go places other robots can’t


The peculiar way that an inchworm inches along a surface may not be fast compared to using legs, wings, or wheels, but it does have advantages when it comes to maneuvering in small spaces. This is one of the reasons why researchers have designed and built a soft, worm-like robot that moves with a typical inchworm gait, pulling its body up and extending it forward to navigate its environment. The robots could one day be used in rescue and reconnaissance missions in places that are inaccessible to humans or larger robots.

The researchers, Wei Wang, et al., at Seoul National University in South Korea, have published their paper on the inchworm-inspired robot in a recent issue of Bioinspiration & Biomimetics.

In nature, the inchworm is the larval stage of the geometer moth and measures about an inch or two long. The small green worm has two or three pairs of legs near its front, and two or three foot-like structures called “prolegs” at its rear end. Although they don’t have bones, inchworms have complex muscle systems that allow them to perform a variety of movements, including standing up vertically on their back prolegs.

To mimic the inchworm, the researchers used the soft, highly flexible silicone material PDMS (polydimethylsiloxane) for the robot’s body. The researchers built an inchworm mold using a 3D printer, and then poured PDMS solution into the mold. Then they glued small pieces of polyimide film to make feet at the front and rear ends. To play the role of muscle fibers, the researchers used eight longitudinal shape memory alloy (SMA) wires that extend throughout the inchworm robot’s body.

By actuating the SMA wires with electric currents, the researchers could cause the inchworm robot’s body to move with a natural inchworm gait. Actuating the SMA wires symmetrically causes the robot’s body to contract symmetrically, resulting in linear motion. Asymmetrical actuation results in asymmetric deformation and a turning locomotion using one foot as an anchor. In the inchworm gait, the feet must continually change from being used as anchors to sliding in order to generate the push-pull motion. The researchers used alternating low-friction and high-friction foot segments to replicate these foot changes.
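A controller for such a gait reduces to choosing which wires to energize on each stride. The sketch below is a hypothetical outline of that scheme; the channel numbers, timing, and function names are all invented for illustration and are not the authors' controller code:

```python
# Hypothetical actuation scheme: symmetric SMA contraction -> straight
# stride; asymmetric contraction -> turning, with one foot as the anchor.
import time

LEFT_WIRES = [0, 1, 2, 3]    # invented channel assignments for the 8 SMA wires
RIGHT_WIRES = [4, 5, 6, 7]

def energize(channels, seconds):
    """Stand-in for driving current through the given SMA channels."""
    print(f"energize {channels} for {seconds}s")
    time.sleep(seconds)

def stride_straight():
    # Both sides contract together, so the body shortens symmetrically
    # and the low-friction foot slides forward.
    energize(LEFT_WIRES + RIGHT_WIRES, seconds=1.0)

def stride_turn(direction="left"):
    # One side contracts more; the opposite foot anchors and the body pivots.
    side = LEFT_WIRES if direction == "left" else RIGHT_WIRES
    energize(side, seconds=1.0)

for _ in range(3):
    stride_straight()
stride_turn("left")
```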

Locomotion of the inchworm-inspired robot. The back and front views show the configuration of the feet at each step throughout the stride. Credit: Wang, et al. ©2014 IOP Publishing

Tests showed that the inchworm robot achieves a stride length of 54 mm (about 2 inches), which is about one-third of its body length, at a speed of about 3.6 mm/s. Turning is slower and more complicated, requiring 21 strides to complete a 90-degree turn. Still, this performance marks an improvement, both in stride length and turning angle, compared to previous similar robots.
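A few figures implied by those numbers, from simple arithmetic on the reported performance:

```python
# Unpacking the reported performance figures.
stride_mm = 54.0
speed_mm_s = 3.6
strides_per_90_deg = 21

print(f"Implied body length: {stride_mm * 3:.0f} mm")         # stride is ~1/3 of body length
print(f"Time per stride: {stride_mm / speed_mm_s:.0f} s")     # 15 s per stride at 3.6 mm/s
print(f"Turn per stride: {90 / strides_per_90_deg:.1f} deg")  # ~4.3 degrees per stride
```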

In addition, the inchworm robot is simple, lightweight, and quiet. These features make the robot useful not only for rescue and reconnaissance missions, but also as a potential material for smart structures and wearable devices. In the future, the researchers plan to focus on improving the robot’s mobility using an independent control system.

“We want to apply the locomotion and control algorithm of the inchworm-inspired robot to other motor-based robots in order to make quiet, flexible, yet load-carrying machines,” coauthor Sung-Hoon Ahn, Professor at Seoul National University, told Phys.org. “We also want to extend our smart soft composite technology to other types of mechanisms, such as soft artificial limbs, soft electronic appliances, transforming automobiles, etc.”
