Smoking ‘increases risk of breast cancer in older women by almost a fifth’


Study published in British Journal of Cancer found women who used to smoke were still 7% more at risk of disease
Smoking increases the risk of breast cancer in older women by almost a fifth, a study has found. The discovery adds to a growing weight of evidence linking exposure to tobacco smoke with the disease.

US scientists who tracked the progress of around 186,000 women aged 50 to 71 found that those who smoked were 19% more likely to develop breast cancer than those who had never smoked. Women who once smoked but then kicked the habit were still 7% more at risk.

The results held true even after accounting for alcohol consumption, which is a breast cancer risk factor that is more common among smokers.
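
For readers who want the arithmetic, a figure like "19% more likely" is a risk ratio: the rate of disease among the exposed divided by the rate among the unexposed. Here is a minimal sketch with invented counts (the study itself reports ratios adjusted for factors such as alcohol, so the real calculation is more involved):

```python
# Illustrative only: how a "19% more likely" figure arises from
# cohort counts. All numbers below are invented, not the study's.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk among the exposed divided by risk among the unexposed."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical cohort: 500 cancers among 10,000 smokers versus
# 4,200 cancers among 100,000 never-smokers.
rr = relative_risk(500, 10_000, 4_200, 100_000)
print(f"relative risk = {rr:.2f}")  # 1.19, i.e. a 19% higher risk
```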

In 2009 the International Agency for Research on Cancer concluded there was “limited evidence” that smoking tobacco could trigger breast cancer. But in the last few years findings have led to a U-turn in expert opinion.

Last year, researchers from the American Cancer Society reported results showing a 24% higher rate of breast cancer among women who smoked.

The risk was much greater among those who started smoking young, either before they started menstruating or before having their first child.

The new study, published in the British Journal of Cancer, covered a period of around 10 years. During this time, 7,500 of the women were diagnosed with breast cancer.

Dr Sarah Nyante, from the US National Cancer Institute in Bethesda, Maryland, said: “Our study adds to a growing body of evidence that suggests an association between cigarette smoking and increased breast cancer risk.

“Previous studies have investigated this relationship, but questions remained regarding the extent to which other breast cancer risk factors, such as alcohol intake, might influence the results. More work is now needed to understand the mechanisms behind the link between smoking and breast cancer in post-menopausal women.”

While other lifestyle factors such as being overweight, drinking alcohol and lack of exercise are all known to increase breast cancer risk in post-menopausal women, the role of smoking has been more difficult to unravel.

Smoking is the biggest preventable cause of cancer worldwide. The habit is linked not only to lung cancer but cancers of the larynx, oesophagus, oral cavity, pharynx, bladder, pancreas, kidney, liver, stomach and bowel.

In the past 50 years, it is estimated that 6.5 million people in the UK have died from tobacco-related diseases.

Dr Julie Sharp, head of health information at Cancer Research UK, which owns the British Journal of Cancer, said: “Evidence remains inconsistent as to whether smoking causes breast cancer after as well as before the menopause, but this study suggests it may increase a post-menopausal woman’s risk of breast cancer if she smokes or has smoked in the past.

“Quitting is not easy but, given that smokers lose an average of 10 years of life compared to non-smokers, the benefits are huge.”

Statins ‘may help to control MS’


Statins may be useful in treating advanced multiple sclerosis (MS), say UK researchers.

Early trial results in The Lancet show the cholesterol-lowering pills slow brain shrinkage in people with MS.

The University College London (UCL) scientists say large trials can now begin.

These will check whether statins benefit MS patients by slowing progression of the disease and easing their symptoms.

MS is a major cause of disability, affecting nerves in the brain and spinal cord, which causes problems with muscle movement, balance and vision.

Currently there is no cure, although there are treatments that can help in the early stages of the disease.

Advanced disease

After around 10 years, about half of people with MS go on to develop more advanced disease – known as secondary progressive MS.

It is this later stage disease that Dr Jeremy Chataway and colleagues at UCL hope to treat with low cost statins.

To date, no licensed drugs have shown a convincing impact on this later stage of the disease.

This brain scan shows characteristic MS damage in the brain (highlighted in green)

For their phase two trial, which is published in the Lancet, Dr Chataway’s team randomly assigned 140 people with secondary progressive MS to receive either 80mg of a statin called simvastatin or a placebo for two years.

The high, daily dose of simvastatin was well tolerated and slowed brain shrinkage by 43% over two years compared with the placebo.
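
As an aside on how such a figure is calculated: a "43% slowing" compares rates of brain-volume loss between the two trial arms. The annualised rates below are invented for illustration; the trial's actual figures are in The Lancet paper.

```python
# Invented atrophy rates, for illustration only (see the trial
# paper for the real figures).
placebo_rate = 0.60  # % of brain volume lost per year on placebo
statin_rate = 0.34   # % of brain volume lost per year on simvastatin

slowing = 1 - statin_rate / placebo_rate
print(f"shrinkage slowed by {slowing:.0%}")  # ~43%
```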

Dr Chataway said: “Caution should be taken regarding over-interpretation of our brain imaging findings, because these might not necessarily translate into clinical benefit. However, our promising results warrant further investigation in larger phase three disability-driven trials.”

The researchers believe statins may have anti-inflammatory and neuroprotective properties that can guard the nerves from damage.

In an accompanying editorial, Jacqueline Palace from the John Radcliffe Hospital, Oxford, and Neil Robertson from Cardiff University in Wales, said the trial represented a promising starting point in the quest to find a treatment for secondary progressive MS.

Dr Susan Kohlhaas, head of biomedical research at the MS Society, said: “There are no treatments that can stop the condition from worsening in people with progressive MS. Scientists have worked for years to find a potential treatment that could help people, and now, finally, one has been found that might. This is very exciting news.

“Further, larger clinical trials are now absolutely crucial to confirm the safety and effectiveness of this treatment.”

Death ‘core business’ of hospitals


The study found that 3,098 patients died within 12 months of being admitted to Scottish hospitals

Almost one in three hospital patients in Scotland will die within a year, and nearly one in 10 will die during their time in hospital, a study has found.

The Glasgow University report says the findings suggest that part of the “core business” of hospitals is people who are nearing the end of their lives.

The research team studied 10,000 people who were in 25 Scottish hospitals on one day – 31 March, 2010.

In total 3,098 patients, almost 31%, died within 12 months.

The study found that 9% died during their admission.

Older patients were more likely to die, and men were more likely to die than women.

Just over half of all male patients over the age of 85 died within the year.

Lead author Professor David Clark said: “I think what this paper really shows us is that what we call ‘acute’ hospitals really have, as part of their core business, the care of people who are coming to the end of their lives.

“The key message for me is how, as organisations, hospitals start to think more widely about the implications of that.”

The likelihood of dying in hospital has been rising, despite the fact that surveys suggest most people would like to die at home.

Ward assessments

A recent international comparison of 34 countries by the University of Auckland found that 59% of all Scottish deaths occur in hospital, with a similar proportion in England and Wales.

That places Britain among the top 10 of those countries for the proportion of deaths occurring in a hospital setting.

Professor Clark, who is based at Glasgow University’s Dumfries campus, said the next step was to work out how to identify those who were most likely to be nearing the end of their lives.

“A study like this is looking retrospectively,” he said. “The key challenge is how to identify people prospectively; on admission to hospital or when they’re first being assessed in the initial ward round. I think that’s now where we should focus our energy.

“That’s very tricky, of course. Starting those difficult conversations about end of life needs and wants is challenging work for family members and for professionals but I think the study shows there is an opportunity in hospital to start that conversation.”

New doping test ‘1,000 times better’


US researchers have developed a new way to detect performance-enhancing drugs that they say is 1,000 times more sensitive than current tests.

In the laboratory, the new screen detected stimulants and steroids in minute concentrations.

The method is inexpensive and works with existing equipment, the scientists claim.

If validated, the test would significantly extend the time in which cheating athletes could be caught.

The research has been presented at a meeting of the American Chemical Society (ACS).

Most testing for doping products uses a long-established technique called mass spectrometry.

This involves zapping urine samples with a beam of electrons, which turns the molecules into charged particles (ions).

These particles then travel through the spectrometer where they are weighed by a magnetic field.
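
In the simplest textbook picture (a single magnetic-sector instrument, a simplification of the analysers anti-doping labs actually use), an ion of mass m and charge q accelerated through a voltage V follows a circular path of radius r in a magnetic field B, so the mass-to-charge ratio falls straight out of the geometry:

```latex
% Textbook magnetic-sector relation (a simplification, not the
% specific analyser geometry used in anti-doping laboratories)
\[
  qV = \tfrac{1}{2}mv^{2}, \qquad r = \frac{mv}{qB}
  \quad\Longrightarrow\quad \frac{m}{q} = \frac{B^{2}r^{2}}{2V}
\]
```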

As the scientists already know the weight of many steroids, for example, they are able to rapidly and accurately detect doping. But there are difficulties with this system.

Some byproducts of doping substances are so small, and carry such a weak negative electrical charge, that they may not produce a strong enough signal for detection.

Now chemists at the University of Texas in Arlington believe they have developed a method that builds on existing mass spectrometry techniques to detect these extremely small metabolites.

Doping has long been the scourge of some sports, including professional cycling

Called Paired Ion Electrospray Ionisation (PIESI), the system uses a chemical agent to bind to the minute pieces of steroid or amphetamine and make them more visible to the detector.

“It makes them much more detectable,” Dr Daniel Armstrong, who led the research team, told BBC News.

“We’re talking about parts per trillion, sub-parts per trillion – and the amazing thing is that it is so simple.”
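
For a sense of what "parts per trillion" means in practice, here is a rough, back-of-envelope conversion (assuming a watery sample with a density of about 1 kg per litre):

```python
# Back-of-envelope scale for "parts per trillion" (assumes a
# watery sample with a density of roughly 1 kg per litre).
ppt = 1e-12                     # mass fraction: 1 part per trillion
grams_per_litre = 1000.0        # ~1 kg of water per litre
ng_per_litre = ppt * grams_per_litre * 1e9  # grams -> nanograms
print(f"1 ppt is about {ng_per_litre:.0f} ng per litre")  # ~1 ng/L
```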

In laboratory tests, the system was able to detect steroids, stimulants, alcohol and depressants.

“We listed our sensitivity versus everything we found in the literature thus far, and that’s where we got this 10-1,000 times more sensitive than anything else recorded, depending on the drug you are talking about,” said Dr Armstrong.

As the key binding agent is commercially available, it should be relatively inexpensive to adapt existing detection methods, the researcher said. A variation of the method is already being used to detect minute amounts of industrial contaminants.

Open window

The new detection method would radically alter the detection window in which an athlete could be caught after taking these drugs.

This is a critical issue in the fight against doping. A detailed knowledge of the length of time a substance is detectable has been used by many cheating athletes and their scientific advisers to avoid being caught.

“With steroids, it’s about two orders of magnitude, about 100 times more sensitive. We may be able to detect a steroid or something that’s long-lived a couple of years after it was taken,” said Dr Armstrong.

The new method wouldn’t work for blood doping, nor would it detect human growth hormone, said the researchers.

The scientists will be submitting their work for peer review after the American Chemical Society meeting in Dallas.

While they have had many enquiries from reporters, so far they’ve had none from the World Anti-Doping Agency (WADA), the US anti-doping agency (USADA), or the International Olympic Committee (IOC).

Earth rocked by double space impact


We’ve all seen the films where an asteroid hurtles towards our planet, threatening civilisation.

What’s less well known is that menacing space rocks sometimes come in twos.

Researchers have outlined some of the best evidence yet for a double space impact, where an asteroid and its moon apparently struck Earth in tandem.

Using tiny, plankton-like fossils, they established that neighbouring craters in Sweden are the same age – 458 million years old.

Details of the work were presented at the 45th Lunar and Planetary Science Conference in The Woodlands, Texas, and the findings are to be published in the Meteoritics and Planetary Science journal.

However, other scientists cautioned that seemingly contemporary craters could have landed weeks, months or even years apart.

Proposed double impact craters

  • Clearwater East and West (Canada): 26km/36km diameter, 290 million years old
  • Kamensk and Gusev (Russia): 25km/3km diameter, 49 million years old
  • Ries and Steinheim (Germany): 25km/3.8km, 14.5 or 15 million years old

A handful of possible double impacts (or doublets) are already known on Earth, but Dr Jens Ormo says there are disputes over the precision of dates assigned to these craters.

“Double impact craters must be of the same age, otherwise they could just be two craters right next to each other,” the researcher from the Centre for Astrobiology in Madrid, Spain, told BBC News.

Dr Ormo and his colleagues studied two craters called Lockne and Malingen, which lie about 16km apart in northern Sweden. Measuring about 7.5km wide, Lockne is the bigger of the two structures; Malingen, which lies to the south-west, is about 10 times smaller.

Binary asteroids are thought to form when a so-called “rubble pile” asteroid begins to spin so fast under the influence of sunlight that loose rock is thrown out from the object’s equator to form a small moon.

Trilobites were among the most numerous inhabitants of Ordovician seas

Telescope observations suggest that about 15% of near-Earth asteroids are binaries, but the percentage of impact craters on Earth is likely to be smaller.

Only a fraction of the binaries that strike the Earth will have the necessary separation between the asteroid and its moon to produce separate craters (those that are very close together will carve out overlapping structures).

Calculations suggest around 3% of impact craters on Earth should be doublets – a figure that agrees with the number of candidates already identified by researchers.

The unusual geological characteristics of both Lockne and Malingen have been recognised since the first half of the 20th Century. But it took until the mid-1990s for Lockne to be formalised as a terrestrial impact crater.

In the last few years, Dr Ormo has drilled about 145m down into the Malingen structure, through the sediment that fills it, down to crushed rocks known as breccias and deeper, reaching the intact basement rock.

Lab analysis of the breccias revealed the presence of shocked quartz, a form of the quartz mineral that is created under intense pressures and is associated with asteroid strikes.

This area was covered by a shallow sea at the time of the Lockne impact, so marine sediments would have begun to fill in any impact craters immediately after they were created.

One-two punch

Dr Ormo’s team set out to date the Malingen structure using tiny fossilised sea creatures called chitinozoans, which are found in sedimentary rocks at the site.

Their method, known as biostratigraphy, allows geologists to assign relative ages to rocks based on the types of fossil creatures found within them.

The results revealed the Malingen structure to be the same age as Lockne – about 458 million years old. This seems to confirm that the area was rocked by a double asteroid strike during the Ordovician Period.

Dr Gareth Collins, who studies impact cratering at Imperial College London, and was not involved with the research, told BBC News: “Short of witnessing the impacts, it is impossible to prove that two closely separated craters were formed simultaneously.

“But the evidence in this case is very compelling. Their proximity in space and consistent age estimates makes a binary-impact cause likely.”

The Clearwater East and West craters are the best known candidates for a double impact

Simulations suggest the asteroid that created Lockne was some 600m in diameter, while the one that carved out Malingen was about 250m. These measurements are somewhat larger than might be suggested by their craters because of the mechanics of impacts into marine environments.

Dr Ormo added that Malingen and Lockne were just the right distance apart to have been created by a binary. As mentioned, if two space rocks are too close, their craters will overlap. But to qualify as a doublet, the craters cannot be too far apart either, or they would exceed the maximum separation at which an asteroid and its moon can remain gravitationally bound.
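
That maximum separation is set, roughly, by the asteroid's Hill sphere, the region in which its gravity dominates the Sun's. A standard approximation, where a is the asteroid's distance from the Sun, m its mass and M the Sun's mass:

```latex
% Hill-sphere approximation for the region in which an asteroid's
% gravity can hold on to a moon against the Sun's pull
\[
  r_{H} \approx a \left( \frac{m}{3M} \right)^{1/3}
\]
```

Plugging in rough numbers (a rubble pile a few hundred metres across, an assumed orbit about one astronomical unit from the Sun) gives a Hill radius of order tens of kilometres, comfortably accommodating the 16km separation of Lockne and Malingen.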

“The Lockne impactor was big enough to generate what’s known as an atmospheric blow-out, where you blow away the atmosphere above the impact site,” said Dr Ormo.

This can cause material from the asteroid strike to spread around the globe, as happened during the huge Chicxulub impact thought to have killed off the dinosaurs 66 million years ago.

The Ordovician event wasn’t powerful enough for that material to be traced, as it would have been very dilute in the atmosphere. But the impact would have had regional effects; for example, any sea creatures unlucky enough to be swimming nearby would have been instantly vaporised.

Other candidate double impact craters include Clearwater East and West in Quebec, Canada; Kamensk and Gusev in southern Russia; and Ries and Steinheim in southern Germany.

The Toxins That Threaten Our Brains


Forty-one million IQ points. That’s what Dr. David Bellinger determined Americans have collectively forfeited as a result of exposure to lead, mercury, and organophosphate pesticides. In a 2012 paper published by the National Institutes of Health, Bellinger, a professor of neurology at Harvard Medical School, compared intelligence quotients among children whose mothers had been exposed to these neurotoxins while pregnant with those whose mothers had not. Bellinger calculates a total loss of 16.9 million IQ points due to exposure to organophosphates, the most common pesticides used in agriculture.

Last month, more research brought concerns about chemical exposure and brain health to a heightened pitch. Philippe Grandjean, Bellinger’s Harvard colleague, and Philip Landrigan, dean for global health at Mount Sinai School of Medicine in Manhattan, announced to some controversy in the pages of a prestigious medical journal that a “silent pandemic” of toxins has been damaging the brains of unborn children. The experts named 12 chemicals—substances found in both the environment and everyday items like furniture and clothing—that they believed to be causing not just lower IQs but ADHD and autism spectrum disorder. Pesticides were among the toxins they identified.

“So you recommend that pregnant women eat organic produce?” I asked Grandjean, a Danish-born researcher who travels around the world studying delayed effects of chemical exposure on children.

“That’s what I advise people who ask me, yes. It’s the best way of preventing exposure to pesticides.” Grandjean estimates that there are about 45 organophosphate pesticides on the market, and “most have the potential to damage a developing nervous system.”

Landrigan had issued that same warning, unprompted, when I spoke to him the week before. “I advise pregnant women to try to eat organic because it reduces their exposure by 80 or 90 percent,” he told me. “These are the chemicals I really worry about in terms of American kids, the organophosphate pesticides like chlorpyrifos.”

For decades, chlorpyrifos, marketed by Dow Chemical beginning in 1965, was the most widely used insect killer in American homes. Then, in 1995, Dow was fined $732,000 by the EPA for concealing more than 200 reports of poisoning related to chlorpyrifos. It paid the fine and, in 2000, withdrew chlorpyrifos from household products. Today, chlorpyrifos is classified as “very highly toxic” to birds and freshwater fish, and “moderately toxic” to mammals, but it is still used widely in agriculture on food and non-food crops, in greenhouses and plant nurseries, on wood products and golf courses.

Landrigan has the credentials of some superhero vigilante Doctor America: a Harvard-educated pediatrician, a decorated retired captain of the U.S. Naval Reserve, and a leading physician-advocate for children’s health as it relates to the environment. After September 11, he made news when he testified before Congress in disagreement with the EPA’s assessment that asbestos particles stirred into clouds of debris were too small to pose any real threat. Landrigan cited research from mining townships (including Asbestos, Quebec) and argued that even the smallest airborne asbestos fibers could penetrate deeply into a child’s lungs.

Chlorpyrifos is just one of 12 toxic chemicals Landrigan and Grandjean say are having grim effects on fetal brain development. Their new study is similar to a review the two researchers published in 2006, in the same journal, identifying six developmental neurotoxins. Only now they describe twice the danger: The number of chemicals that they deemed to be developmental neurotoxins had doubled over the past seven years. Six had become 12. Their sense of urgency now approached panic. “Our very great concern,” Grandjean and Landrigan wrote, “is that children worldwide are being exposed to unrecognized toxic chemicals that are silently eroding intelligence, disrupting behaviors, truncating future achievements and damaging societies.”

The chemicals they called out as developmental neurotoxins in 2006 were methylmercury, polychlorinated biphenyls, ethanol, lead, arsenic, and toluene. The additional chemicals they’ve since found to be toxins to the developing brains of fetuses—and I hope you’ll trust me that these all are indeed words—are manganese, fluoride, chlorpyrifos, tetrachloroethylene, polybrominated diphenyl ethers, and dichlorodiphenyltrichloroethane.

Grandjean and Landrigan note in their research that rates of diagnosis of autism spectrum disorder and ADHD are increasing, and that neurobehavioral development disorders currently affect 10 to 15 percent of births. They add that “subclinical decrements in brain function”—problems with thinking that aren’t quite a diagnosis in themselves—“are even more common than these neurobehavioral development disorders.”

In perhaps their most salient paragraph, the researchers say that genetic factors account for no more than 30 to 40 percent of all cases of brain development disorders:

Thus, non-genetic, environmental exposures are involved in causation, in some cases probably by interacting with genetically inherited predispositions. Strong evidence exists that industrial chemicals widely disseminated in the environment are important contributors to what we have called the global, silent pandemic of neurodevelopmental toxicity.

When their paper went to press in the journal The Lancet Neurology, the media responded with understandable alarm:

“A ‘Silent Pandemic’ of Toxic Chemicals Is Damaging Our Children’s Brains, Experts Claim” – Minneapolis Post, 2/17/14

“Researchers Warn of Chemical Impacts on Children” – USA Today, 2/14/14

“Study Finds Toxic Chemicals Linked to Autism, ADHD” – Sydney Morning Herald, 2/16/14

When I first saw these headlines, I was skeptical. It wasn’t news that many of the chemicals on this list (arsenic, DDT, lead) are toxic. With each of these substances, the question is just how much exposure it takes to cause real damage. For instance, organophosphates aren’t something that anyone would categorically consider safe, in that they are poison. They kill insects by the same mechanism that sarin gas kills people, causing nerves to fire uncontrollably. But like asbestos, they are still legally used in U.S. commerce, with the idea that small amounts of exposure are safe. The adage “the dose makes the poison” may be the most basic premise of toxicology. And hadn’t we already taken care of lead? Didn’t we already know that alcohol is bad for fetuses? Wasn’t fluoride good for teeth?

I found that the real issue was not this particular group of 12 chemicals. Most of them are already being heavily restricted. This dozen is meant to illuminate something bigger: a broken system that allows industrial chemicals to be used without any significant testing for safety. The greater concern lies in what we’re exposed to and don’t yet know to be toxic. Federal health officials, prominent academics, and even many leaders in the chemical industry agree that the U.S. chemical safety testing system is in dire need of modernization. Yet parties on various sides cannot agree on the specifics of how to change the system, and two bills to modernize testing requirements are languishing in Congress. Landrigan and Grandjean’s real message is big, and it involves billion-dollar corporations and Capitol Hill, but it begins and ends with the human brain in its earliest, most vulnerable stages.

How Toxins Destroy Brains

About a quarter of your body’s metabolism goes toward operating and maintaining your brain. In order to process even basic information, billions of chemical signals are constantly being carried between neurons. The undertaking is so onerous that even though your brain is not moving (like, say, the powerful muscles in your legs), and accounts for only about 2 percent of your body weight, it uses around 10 times more calories per pound than the rest of you.

Most of that industrious brain and its 86 billion neurons were created in a matter of months. During the first few weeks of gestation, when your mother knew you only as morning sickness and you were a layer of cells huddled in one corner of her uterus, those cells lined up, formed a groove, and then closed to form a tube. One end of that tube eventually became your tiny spinal cord. The rest expanded to form the beginnings of your brain.

For a brain to develop properly, neurons must move to precise places in a precise sequence. They do so under the direction of hormones and chemical neurotransmitters like acetylcholine. The process is an intricate, fast-paced dance on a very tiny scale. Each nerve cell is about one hundredth of a millimeter wide, so it has to travel its own width roughly 2,500 times just to move an inch—which some neurons in the cortex must. At any point, that cell can be knocked off course. Some of the neurotoxins Grandjean and Landrigan discuss have the potential to disrupt this journey, in a slight or serious fashion.
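
To make that scale concrete, the arithmetic is simply the length of an inch divided by the width of a cell:

```latex
% One inch expressed in neuron-widths
\[
  \frac{25.4\ \text{mm (one inch)}}{0.01\ \text{mm (cell width)}} \approx 2{,}500
\]
```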

By age two, almost all of the billions of brain cells that you will ever have are in their places. Except in the hippocampus and one or two other tiny regions, the brain does not grow new brain cells throughout your life. When brain cells die, they are gone. So its initial months of formation, when the brain is most vulnerable, are critical. “During these sensitive life stages,” Grandjean and Landrigan write, exposure “can cause permanent brain injury at low levels that would have little or no adverse effect in an adult.”

Federal health officials are aware of this risk. The National Institutes of Health, as Landrigan puts it, “finally woke up in the late 1990s to the fact that children are much more sensitive and vulnerable to chemicals than adults are.” Over the past decade, the federal government has invested substantially more money in looking at just how pregnant women and children have been affected by industrial chemicals. The EPA has awarded millions of dollars in related research grants, and the NIH started funding a network of what it calls Centers for Children’s Environmental Health and Disease Prevention Research. There is one at Mount Sinai and another at Harvard (the respective homes of Landrigan and Grandjean), and there are others at Columbia, UC Berkeley, and elsewhere.

Those centers have established strong research programs called prospective birth-cohort studies. Scientists enroll pregnant female subjects and carefully record objective measures of environmental exposure, using things like blood samples, urine samples, and maybe even dust and air samples from their homes. After the babies are born, the researchers follow up with them at various points in their childhoods. These studies are expensive and take a long time, but they’re incomparably good at connecting prenatal exposures with lost IQ points, shortened attention span, or emergence of ADHD.

Functional MRI reveals the effect of prenatal methylmercury exposure in three adolescents. Subjects were asked to tap the fingers of their left hands. In the control group (row B), only the right side of the brain was activated. In the subjects who had been exposed to methylmercury (row A), an abnormal activation pattern shows that both sides are involved. (The Lancet Neurology)

“That’s the big breakthrough,” Landrigan says. “The scientific community has mastered the technique of doing these studies, and they’ve been running long enough that they’re beginning to put out some spectacularly good results.” At Columbia, for instance, the children’s center is investigating whether children exposed in the womb to BPA and polycyclic aromatic hydrocarbons (PAHs)—byproducts from burning fossil fuels—are more likely to develop learning and behavior disorders than children not exposed. They have also shown that high prenatal exposure to air pollutants like PAHs is associated with attention problems, anxiety, and depression at ages 5 to 7 years. It was this center, together with the UC Berkeley and Mount Sinai children’s centers, that first identified the detrimental impact of chlorpyrifos on IQ and brain development. The researchers even used MRI testing to show that these chemicals appear to change children’s brain structure, causing thinning of the cortex. Other children’s centers are looking at the extent to which these and other chemicals—including arsenic from well water, brominated flame retardants, and the anti-corrosion agent manganese—are to blame for a range of possible neurologic disorders.

Impressive as all this research investment is, the larger question remains: Why are we looking at these hazards now—instead of before we introduced these chemicals into the world?

The Insidious Rise of Lead

The problem with toxic substances is that their effects can be insidious. Take the example of lead—a chemical that lingered in gasoline, house paints, and children’s toys for decades before scientists realized the true extent of the damage.

Consider one reported case, of a young boy whose blood tests showed dangerously high lead levels. The doctors began treating the boy with medication to help clear the lead. They also set out to find out where the lead was coming from. An investigation of the boy’s home, which was built in the 1990s, found no lead paint. Despite treatment, though, the boy’s lead tests remained abnormally high. So the doctors did an x-ray.

Inside the boy’s stomach was a one-inch metal medallion, which appeared bright white on the x-ray image. His parents recognized it as a toy necklace they had purchased from a vending machine approximately three weeks earlier. The state environmental quality lab later found that the medallion contained 38.8 percent lead. The manufacturer later did a voluntary recall of 1.4 million of the metal toy necklaces.

A late 19th-century advertisement for lead paint.

By that time, manufacturers had been using the toxic substance for centuries, despite clearly dangerous effects. In 1786, Benjamin Franklin wrote to a friend about the first time he heard of lead poisoning. When he was a boy, he recounted, there had been “a complaint from North Carolina against New England Rum, that it poisoned their people, giving them the dry bellyache, with a loss of the use of their limbs. The distilleries being examined on the occasion, it was found that several of them used leaden still-heads and worms, and the physicians were of the opinion that the mischief was occasioned by that use of lead.” Franklin went on to describe his observations of similar symptoms in patients at a Paris hospital. When he inquired about their occupations, he discovered that these men were plumbers, glaziers, and painters.

In 1921, General Motors began adding tetraethyl lead to gasoline. Lead gave gasoline a higher octane rating, which meant it could handle more compression without combusting. In practical terms, that meant more powerful engines, faster warplanes, and better industrial transport. The Ethyl Corporation that produced leaded gasoline was a joint venture between GM, Standard Oil, and DuPont. One of its executives, Frank Howard, called leaded gasoline “an apparent gift of God,” even as the plant where tetraethyl lead was synthesized became known as “the House of Butterflies,” because it was not uncommon for workers to experience hallucinations of insects on their skin.

Americans in the 1950s and ’60s were still widely exposed to unregulated leaded gasoline and paint, as well as piping, batteries, cosmetics, ceramics, and glass. Around that time, studies began to reveal the widespread existence of “subclinical” lead poisoning—damage that was not severe enough to meet diagnostic criteria for a neurologic disease, but would prevent the child from ever achieving optimal intellectual functioning. By 1969, microbiologist and Pulitzer-Prize-winning writer René Dubos said that the problem of lead exposure was “so well-defined, so neatly packaged, with both causes and cures known, that if we don’t eliminate this social crime, our society deserves all the disasters that have been forecast for it.”

Four-year-old Tanya Brinson is tested for lead paint poisoning at Boston City Hall in June 1975. 

By the mid-1970s, the average U.S. preschool child had 15 micrograms of lead per deciliter of blood. Eighty-eight percent of children had a level exceeding 10 μg/dL—which is twice what the CDC currently considers toxic. Among poor black children, the average level was markedly higher: 23 μg/dL.

Instead of making sweeping policy changes, experts largely accused low-income parents—especially mothers—of inadequate supervision and fostering pathological behaviors that led children to eat paint. With parental ineptitude to blame, and poor, minority children bearing the brunt of the problem, a systematic approach to eliminating lead was a low national priority. Bellinger recounted this in the Journal of Clinical Investigation, writing that children were essentially sentinels, used to identify the presence of lead hazards. “As long as the ranks of the lead poisoned consisted primarily of the children of politically and economically disenfranchised parents,” he wrote, “it was hard to interest politicians in the problem. Little political capital could be accumulated by tackling the problem.”

Finally in 1975, the EPA required a gradual phasing of lead out of gasoline. Two years later, the Consumer Product Safety Commission said that residential paint could contain no more than 0.06 percent lead.

Meanwhile there is still disagreement as to what constitutes a safe level of lead exposure—and whether there even is such a thing. As more and more evidence came out over the years showing that low levels are in fact toxic to developing brains, the CDC incrementally lowered its threshold—from 60 micrograms per deciliter of blood in 1970 to 40 in 1971, 30 in 1975, 25 in 1985, 10 in 1991, and finally to just five in 2012.

By 2009, the average blood lead concentration among young American children was about 1.2 μg/dL—just 8 percent of what it was in 1980. But Bellinger notes that even this relatively low level is still “substantially elevated from an evolutionary perspective”—many times higher than before our ancestors “began to disturb the natural distribution of lead in the earth’s crust.”

“Are the blood lead levels of contemporary humans generally below the threshold of toxicity?” Bellinger wrote. “Let us hope so, but the conclusion that they are is based more on faith than on evidence.”

The Toothless Law and the New Test

It’s surprising to learn how little evidence there is for the safety of chemicals all around us, in our walls and furniture, in our water and air. Many consumers assume there is a rigorous testing process before a new chemical is allowed to be a part of a consumer product. Or at least some process.

“We still don’t have any kind of decent law on the books that requires that chemicals be tested for safety before they come to market,” Landrigan said.

The law we do have is the Toxic Substances Control Act (TSCA, pronounced “toss-ka” among those in the know). Passed in 1976 under President Gerald Ford, it is still today the primary U.S. law regulating chemicals used in everyday products. On its face intended to protect people and the environment from dangerous chemical exposure, it is widely acknowledged to have fallen short of its magnanimous goal. It requires testing for only a small percentage of chemicals, those deemed an “unreasonable risk.”

“It’s just an obsolete, toothless, broken piece of legislation,” said Landrigan. “For example, in the early 1990s, EPA was unable to ban asbestos under TSCA.” This was after the National Toxicology Program had classified asbestos as a known cancer-causing agent, and the World Health Organization had called for a global ban. The EPA did briefly succeed in banning asbestos in the U.S. in 1989, but a court of appeals overturned the ban in 1991. Asbestos is still used in consumer products in the U.S., including building materials like shingles and pipe wrap, and auto parts like brake pads.

Landrigan also calls it “a particularly egregious lapse” that when TSCA was enacted, the 62,000 chemicals already on the market were grandfathered in, such that no toxicity testing was required of them. These chemicals were, as Landrigan puts it, “simply presumed safe” and allowed to remain in commerce until a substantial health concern came to public attention.

In the nearly 40 years since the law’s passage, more than 20,000 new chemicals have entered the market. “Only five have been removed,” Landrigan says. He notes that the CDC has picked up measurable levels of hundreds of these chemicals in the blood and urine of “virtually all Americans.” Yet, unlike food and drugs, they enter commerce largely untested.

Landrigan and Grandjean’s purpose in declaring a silent pandemic was less about the 12 named substances and more about using them as cautionary tales. They named in their list a few chemicals that still appear to be imminent threats, but they also included some that have been highly restricted in their use for a long time. And at least one of them, fluoride, has proven beneficial in small doses.

“Fluoride is very much a two-edged sword,” Landrigan said. “There’s no question that, at low doses, it’s beneficial.” Fluoride has been shown to prevent dental cavities and aid skeletal growth. At higher levels, though, it causes tooth and bone lesions. The epidemiologic studies cited by Grandjean and Landrigan, which came from China, imply that high fluoride exposure has negative effects on brain growth.

“Are the exposure levels in China comparable to what we have in our drinking water and toothpaste?” I asked.

“No, they’re probably higher,” Landrigan said. “In some places in China, there are naturally high levels of fluoride in the groundwater, which picks it up because it’s water-soluble.”

“So your advice isn’t to take it out of our toothpaste?”

“Not at all,” Landrigan said. “I think it’s very good to have in toothpaste.”

He’s more concerned about flame retardants—a group of compounds known as polybrominated diphenyl ethers (PBDEs). These chemicals came into vogue after their predecessors, called PCBs (polychlorinated biphenyls), were banned in 1979. By the time it became clear that PCBs caused cancer—and a variety of other adverse health effects on the immune, reproductive, nervous, and endocrine systems—they’d been put into hundreds of industrial and commercial uses like plastics and rubber products. So manufacturers switched to PBDEs and advertised PCB-free products, assuming—or, at least, implying—that PBDEs wouldn’t cause problems of their own.

“California, at the urging of the chemical industry several years ago, put the highest standard in the world on the levels of PBDEs that needed to be included in [furniture],” Landrigan explained. “The result is that people in California have the highest levels of brominated flame retardants in their bodies.”

The state finally banned PBDEs in 2006, after studies from Columbia showed high quantities of the compound in women’s breast milk and linked it to IQ losses and shortening of attention span. Between 2008 and 2012, PBDE levels in the blood of California residents decreased by two-thirds.

Landrigan and Grandjean argue that stronger chemical safety legislation could have made all of this backpedaling damage control unnecessary. They don’t expect every chemical to go through long-term, randomized controlled studies prior to its release. Rather, they want to see industrial chemicals screened through a simple cell-based test. If that test were to come out positive—if the cells in the petri dish showed any kind of toxic reaction—then the chemical would be tested further.

“I don’t think that that should necessarily be a requirement,” Grandjean said. “But I can see if a company has developed a very useful substance, and it turns out to be toxic to nerve cells in petri dishes, then maybe that is the next step.”

Landrigan and Grandjean both mentioned something called Tox21, the Toxicology in the Twenty-First Century program, which is laying the groundwork for a new kind of accelerated, large-scale testing. “TSCA reform really falls under EPA’s jurisdiction,” Landrigan said. “At the NIH and National Institute of Environmental Health Sciences, though, that’s where the latest research on this is.”

When I heard that this Tox21 program is teaching a very large yellow robot to do large-scale rapid chemical testing, I had to learn more. Dr. Linda Birnbaum is the director of the National Institute of Environmental Health Sciences and the National Toxicology Program in North Carolina’s Research Triangle. Birnbaum oversees federal funding for research to discover how the environment influences health and disease, including Tox21.

“If you want to do the full battery of current tests that we have on a chemical, you’re looking at least five years and about $5 million,” Birnbaum told me. “We’re not going to be able to do that on large numbers of chemicals.” The robot is being trained to scan thousands of chemicals at a time and recognize threats inexpensively and quickly—before people get sick. It’s also using alternative testing models—looking at not just isolated cells, but also simple organisms like the roundworm C. elegans or zebrafish—to answer certain basic questions.

The Tox21 robot screening system at the NIH Chemical Genomics Center in Rockville, Maryland. This robot is part of a program that is refining a process to test industrial chemicals for safety quickly and efficiently. It places chemicals on plates with more than 1,500 wells that contain different types of human or rodent cells. (NIH)

The program is also looking at how a single chemical might affect a wide range of people. “We’re looking at 1,000 different human genomes from nine different ethnic groups on five continents,” Birnbaum told me.

Like Landrigan, Birnbaum raised the specter of the tens of thousands of chemicals grandfathered in 1976 that underwent no testing, as well as the commonly cited data that less than 20 percent of the 80,000 chemicals in commerce have had any testing at all. She spoke wistfully of the European Union’s chemical testing protocol, a model Grandjean had told me was “very reasonable.” It’s called REACH (Registration, Evaluation, Authorization, and Restriction of Chemicals), and it involves a tiered approach to regulation: If a compound is produced in small amounts, only some cursory information is required. If greater amounts are produced or imported, the EU requires more in-depth testing, such as animal experiments and two-generation studies.
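
The tiered logic is easy to sketch. The cut-offs and data packages below are simplified stand-ins (REACH's actual annexes spell out the requirements in far more detail), but they show the shape of the approach:

```python
# Schematic of REACH's tonnage-tiered data requirements. The
# cut-offs and labels here are simplified stand-ins, not the
# regulation's actual annex-by-annex requirements.
def reach_data_tier(tonnes_per_year: float) -> str:
    if tonnes_per_year < 1:
        return "no registration required"
    if tonnes_per_year < 10:
        return "basic physicochemical and toxicity data"
    if tonnes_per_year < 100:
        return "chemical safety report plus animal studies"
    if tonnes_per_year < 1000:
        return "sub-chronic toxicity and reproductive screening"
    return "long-term and two-generation studies"

for tonnes in (0.5, 5, 50, 500, 5000):
    print(f"{tonnes:>7} t/yr -> {reach_data_tier(tonnes)}")
```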

“We’ve learned a heck of a lot in the last 30 to 40 years about the safety of chemicals and what can cause problems,” Birnbaum said, “and it would be really nice if our regulations required us to use some of the newer science to answer the questions of safety.”

Don’t Panic?

“When you use the word pandemic, that’s a scare word,” said Laura Plunkett. “And that’s my problem. There’s a more responsible way to express it. I understand that they want to bring it to attention, but when you bring it to attention, you can still do it in what I would say is a scientifically defensible manner.”

Plunkett has a Ph.D. in pharmacology and toxicology. Reviewing articles written in the wake of the publicity around The Lancet Neurology paper, I was struck by the definitive title of her blog post on a site called Science 2.0: “There Is No Pandemic of Chemicals Causing Brain Disorders in Children.” Plunkett has been a diplomate of the American Board of Toxicology since 1984. She taught for a while and did research at NIH, but she is now an independent consultant running her own company, Integrative Biostrategies.

One of her clients is the American Chemistry Council. She also has clients in the food, pesticide, and chemical business—“industry ties,” as they say. With that in mind, I sought her out as an established scientist who has worked on the side of the chemical-producing companies. Her blog post about the Lancet article was the only response I found telling people not to panic.

“What [Landrigan and Grandjean] are doing with the data is missing the key component, which is the dose,” Plunkett explained. “Many of the chemicals they talk about are well established to be neurodevelopmental toxicants in children—but it’s all about how much they’re exposed to. Just like anything else. If you don’t give people enough, or if you don’t take enough in your water or food or the air you breathe, you’re not going to have an effect.”

Plunkett insists that, unlike lead, some of the chemicals on the Lancet Neurology list are only developmental toxicants at very high levels—the sort, she says, “that nobody would be exposed to on a daily basis.”

Plunkett says she has no problem with a call to ensure that chemical testing is as thorough as possible. “But then to say, and by the way, if you look at the data, ‘We’ve been poisoning people for the last 10 years’? That’s a whole other step that isn’t supported by the data they point to.”

I asked her how concerned American parents should be about certain individual chemicals on Grandjean and Landrigan’s list. “I mean, we knew lead was a problem 30 years ago,” she said, “and that’s why we removed it from gasoline, and that’s why we don’t let it in solder and cans, and we’ve taken lead-based paint off the market.”

“If you really look at the data on fluoride,” she continued, “trying to link an IQ deficit in a population with that chemical is almost impossible to do. Even though statistically, randomly they may have found a relationship, that doesn’t prove anything—it identifies a hazard but doesn’t prove there’s a cause and effect between the two things.”

What about the chemical that most concerned Landrigan, the pesticide chlorpyrifos?

“No, because the organophosphate pesticides are one of the most highly regulated groups of chemicals that are out there. The EPA regulates those such that if they’re used in agriculture, people are exposed to very, very low levels.”

Pesticides are indeed more regulated than other industrial chemicals. Before manufacturers can sell pesticides in the U.S., the EPA must ensure that they meet federal standards to protect human health and the environment. Only then will the EPA grant a “registration” or license that permits a pesticide’s distribution, sale, and use. The EPA also sets maximum levels for the residue that remains in or on foods once they’re sold.

An EPA spokesperson told me that the agency requires “more than 100 different scientific studies and tests from applicants” before it will register a new pesticide. The EPA also said that since 1996’s Food Quality Protection Act, it has added “an additional safety factor to account for developmental risks and incomplete data when considering a pesticide’s effect on infants and children, and any special sensitivity and exposure to pesticide chemicals that infants and children may have.” Landrigan and Grandjean don’t believe that’s always sufficient; the dose may make the poison, but not everyone believes the EPA’s limits are right for everyone.

When I asked Plunkett whether new industrial chemicals were being screened rigorously enough, even she cited the need to strengthen the Toxic Substances Control Act of 1976. “I’m a very strong proponent of fixing the holes we have,” she said, “and we do have some holes under the old system, under TSCA, and those are what the new improvements are going to take care of. They’re going to allow us to look at the chemicals out there we don’t have a lot of data on—and really those are the ones I’m more concerned about.”

The High Price of Lost IQ

Everyone I spoke to for this story agreed that TSCA needs to be fixed. But every attempt has met with bitter opposition. All parties want it to happen; they just want it to happen on their own terms. Unless it does, they don’t want it to happen at all.

Last May, a bipartisan group of 22 senators, led by Frank Lautenberg and David Vitter, introduced the Chemical Safety Improvement Act of 2013. Lautenberg, then 89 years old, was the last surviving World War II veteran in the Senate and a longtime champion of environmental safety. (Among other things, he wrote the bill that banned smoking on commercial airlines.) A month after he introduced his TSCA reform bill, Lautenberg died of pneumonia.

After Lautenberg’s death, Senator Barbara Boxer told reporters the bill “would not have a chance” of passing without major changes. “I will be honest with you,” said Boxer, who chairs the Committee on Environment and Public Works, “this is the most opposition I’ve ever seen to any bill introduced in this committee.” Some of the resistance came from environmental and health advocates who felt the bill would actually make it harder for states to regulate the chemicals that were grandfathered in by TSCA. Their fears intensified in January, after 10,000 gallons of a coal-processing substance poured into West Virginia’s Elk River, contaminating a nearby water treatment plant. (The Wall Street Journal reported, “Little is known about the chemical’s long-term health effects on people, although it isn’t believed to be highly toxic.”)

In February, with Lautenberg’s bill stalled in the Senate committee, Republican Representative John Shimkus seized the opportunity to introduce another reform option called the Chemicals in Commerce Act. The chemical industry applauded Shimkus’ bill—it won support from the American Chemistry Council, American Cleaning Institute, and the Society of Chemical Manufacturers and Affiliates. Earlier this month at the GlobalChem conference in Baltimore, Dow Chemical’s Director of Products Sustainability and Compliance Connie Deford said that TSCA reform was in the interests of the chemical sector, acknowledging that consumer confidence in the industry is at an all-time low.

Yet the Chemicals in Commerce Act has provoked strong criticism from groups like the Center for Environmental Health and the Natural Resources Defense Council. A senior scientist with the Environmental Defense Fund called the bill “even more onerous and paralyzing” than the present law, and Representative Henry Waxman, ranking member of the House Energy and Commerce Committee, said the bill “would weaken current law and endanger public health.”

I asked the EPA to comment on Landrigan and Grandjean’s claim that we are in the midst of a “silent pandemic” and inquired what, if anything, is being done about it. The agency responded by sending me a statement: “EPA has taken action on a number of the chemicals highlighted in this report which have and are resulting in reduced exposures, better understanding, and more informed decisions.” The agency included a list of the actions it has already taken to reduce exposure to the chemicals identified in the report. And it emphasized a 2012 “Work Plan,” which includes plans to assess more than 80 industrial chemicals in the coming years.

When I emailed the statement to Landrigan, he replied, “Many of the items that they list here are things that I helped to put in place.” (In 1997, he spent a sabbatical year setting up EPA’s Office of Children’s Health Protection.) He agreed that the EPA is doing a lot to protect children from environmental threats. “But the problem is that the good people within EPA are absolutely hamstrung by the lack of strong legislation,” he wrote. “They can set up research centers to study chemicals and outreach and education programs, but without strong and enforceable chemical safety legislation, they cannot require industry to test new chemicals before they come to market, and they cannot do recalls of bad chemicals that are already on the market.”

Meanwhile, researchers like David Bellinger, who calculated IQ losses, are highlighting the financial cost to society of widespread cognitive decline. Economist Elise Gould has calculated that a loss of one IQ point corresponds to a loss of $17,815 in lifetime earnings. Based on that figure, she estimates that for the population that was six years old or younger in 2006, lead exposure will result in a total income loss of between $165 and $233 billion. The combined current levels of pesticides, mercury, and lead cause IQ losses amounting to around $120 billion annually—or about three percent of the annual budget of the U.S. government.
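
The arithmetic behind figures like these is, at its core, straightforward multiplication, though the published estimates also discount future earnings and stratify by exposure level. A minimal sketch with a hypothetical cohort:

```python
# Back-of-envelope version of the income-loss arithmetic. The
# cohort size and average IQ loss below are hypothetical; Gould's
# published model is more detailed.
VALUE_PER_IQ_POINT = 17_815  # lifetime earnings lost per IQ point (USD)

def cohort_income_loss(avg_iq_points_lost, children_in_cohort):
    """Aggregate lifetime earnings lost across a birth cohort."""
    return avg_iq_points_lost * children_in_cohort * VALUE_PER_IQ_POINT

# Hypothetical: an average loss of 1 IQ point across 12 million children.
loss = cohort_income_loss(1, 12_000_000)
print(f"${loss / 1e9:.0f} billion")  # ~$214 billion, inside Gould's range
```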

Low-income families are hit the hardest. No parent can avoid these toxins—they’re in our couches and in our air. They can’t be sweated out through hot yoga classes or cleansed with a juice fast. But to whatever extent these things can be avoided without better regulations, it costs money. Low-income parents might not have access to organic produce or be able to guarantee their children a low-lead household. When it comes to brain development, this puts low-income kids at even greater disadvantages—in their education, in their earnings, in their lifelong health and well-being.

Grandjean compares the problem to climate change. “We don’t have the luxury to sit back and wait until science figures out what’s really going on, what the mechanisms are, what the doses are, and that sort of thing. We’ve seen with lead and mercury and other poisons that it takes decades. And during that time we are essentially exposing the next generation to exactly the kind of chemicals that we want to protect them from.”

Key Advance Reported in Reducing Alzheimer’s Plaque Formation


Scientists at the University of Michigan say they have learned how to fix the Golgi complex that becomes fragmented in all Alzheimer’s patients and appears to be a major cause of the disease. They add that understanding this mechanism will help decode amyloid plaque formation in the brains of Alzheimer’s patients—plaques that kill cells and contribute to memory loss and other Alzheimer’s symptoms.
The researchers also reported their discovery of the molecular process behind Golgi fragmentation and the development of two techniques to “rescue” the Golgi structure.

“We plan to use this as a strategy to delay the disease development,” said Yanzhuang Wang, Ph.D., U-M associate professor of molecular, cellular, and developmental biology. “We have a better understanding of why plaque forms fast in Alzheimer’s and found a way to slow down plaque formation.”

The paper (“Aβ-induced Golgi fragmentation in Alzheimer’s disease enhances Aβ production”) appears in an upcoming edition of the Proceedings of the National Academy of Sciences.

Dr. Wang said scientists have long recognized that the Golgi becomes fragmented in the neurons of Alzheimer’s patients, but until now they didn’t know how or why this fragmentation occurred. The Golgi structure has the important role of sending molecules to the right places in order to make functional cells. When the Golgi becomes fragmented, molecules get sent to the wrong places or not sent at all.

U-M researchers found that the accumulation of the Aβ peptide, the primary culprit in forming plaques that kill cells in Alzheimer’s brains, triggers Golgi fragmentation by activating an enzyme called cdk5 that modifies Golgi structural proteins such as GRASP65.

“Aβ accumulation triggers Golgi fragmentation by activating cyclin-dependent kinase-5 (cdk5), which phosphorylates Golgi structural proteins such as GRASP65,” wrote the investigators. “Rescue of Golgi structure by inhibiting cdk5 or by expressing nonphosphorylatable GRASP65 mutants reduced Aβ secretion. Our study provides a molecular mechanism for Golgi fragmentation and its effects on APP trafficking and processing, suggesting Golgi as a potential drug target for AD [Alzheimer’s disease] treatment.”

The next step is to see if Golgi fragmentation can be delayed or reversed in mice, according to Dr. Wang.

Scientists Fail To Find A Link Between Saturated Fat And Heart Disease


For years we’ve been told to reduce our consumption of saturated fats as a sure-fire way to prevent heart disease. But a recent analysis of 45 studies and 27 trials involving over 600,000 participants is forcing a rethink of this long-held — and apparently erroneous — assumption.

The primary takeaway of this study is not necessarily that saturated fats don’t contribute to heart disease (a link that has now certainly been cast into doubt) but that the way food affects our health is an incredibly complicated and multifaceted process. One of the study’s authors, Dariush Mozaffarian of the department of epidemiology at Harvard University in Boston, put it best: “Guidelines that focus on the nutrients, single nutrients, as targets for preventing chronic diseases don’t make a lot of sense. I think we need to move to food-based guidelines, to really talk about food, not nutrients.”

Indeed, a prime example of this problem is the unwarranted focus on cholesterol and its apparent association with cardiovascular disease — the so-called LDL-heart disease hypothesis. Recently, however, physicians have come to realize that cholesterol levels do not strongly predict our chances of developing heart disease, and there are now over a dozen studies that support this. The notion that lowering saturated fats — which are typically consumed via butter, whole milk, red meat, poultry, coconut oil, and nuts — will lower bad cholesterol is predicated on some rather shaky ideas.

And now, as the new meta-study shows, there’s insufficient data to support current guidelines restricting the consumption of saturated fats in order to prevent heart disease.

A “Careful Reappraisal” Needed

For the study, which appears in the Annals of Internal Medicine, researchers evaluated data from 72 unique studies covering more than 600,000 participants from 18 nations. Results showed that total saturated fatty acid, whether measured in the diet or in the bloodstream, was not associated with coronary disease risk. Nor was intake of total monounsaturated fatty acids, long-chain omega-3s, or omega-6 polyunsaturated fatty acids linked to cardiovascular risk.

What they did find was that circulating levels of eicosapentaenoic and docosahexaenoic acids (the two main types of long-chain omega-3 polyunsaturated fatty acids found in oily fish), along with an omega-6 fat, were associated with lower coronary risk. However, when they investigated the effects of omega-3 and omega-6 fatty acid supplementation on coronary disease, they found no significant effects.

Not surprisingly, the researchers also reaffirmed what most of us already know — trans fats are evil.

“These are interesting results that potentially stimulate new lines of scientific inquiry and encourage careful reappraisal of our current nutritional guidelines,” noted lead author Dr. Rajiv Chowdhury of the University of Cambridge. Indeed, given that more than 17 million people died from cardiovascular disease in 2008, it’s critical to have guidelines that are actually backed by scientific evidence.

Critically speaking, some of the participants in the randomized trials didn’t always follow instructions, and some of the studies measured fatty acids only once rather than repeatedly. But these are minor objections.

Mind What You Eat

Now it’s important to note that this doesn’t mean you should run out and start eating foods laden with saturated fat with reckless abandon. As already noted, it’s the overall quality of the food that matters. If you down a fast-food hamburger, you’re probably taking in more sodium and wheat than is considered healthy.

What’s more, as stated by Jeremy Pearson, Associate Medical Director at the British Heart Foundation: “This analysis of existing data suggests there isn’t enough evidence to say that a diet rich in polyunsaturated fats but low in saturated fats reduces the risk of cardiovascular disease. But large scale clinical studies are needed, as these researchers recommend, before making a conclusive judgement.”

According to Pearson, in addition to taking necessary medication, the best way to stay heart healthy is to adopt a healthy lifestyle, including smoking cessation, staying active, and ensuring an overall healthy diet. This includes not only the fats in our diet but also our intake of salt, sugar and fruit and vegetables.

A Surprising Cure For Insomnia


http://m.huffpost.com/us/entry/4907380?ncid=fcbklnkushpmg00000063


Social porn: why people are sharing their sex lives online


From PornTube to Pinsex to Pornostagram, sex websites are following the lead of social networks, allowing users to like, share, repost and comment on each other’s pornography

In his 2008 book, Click, online behaviour expert Bill Tancer declared that social media was overtaking pornography as the most popular destination on the internet. Those aged 18 to 24 in particular were replacing pornography use with more stimulating social networking pastimes. After the porn frenzy that was the first decade of the internet’s life, users seemed to be finding more “sociable” ways to occupy their time.

Five years on, social media seems to be firmly ahead of pornography in the race for internet dominance – four of the world’s 10 most visited sites are social networks. Research from Pew’s internet project suggests that 90% of 18- to 29-year-olds in the US use social networking and 71% of online adults are on Facebook.

But it’s safe to say that pornography remains popular. It’s notoriously difficult to come by reliable statistics on porn use or the porn industry, but Pornhub – one of the biggest online providers – claims to have had more than 14.7bn visits in 2013, or more than 1.68m visits an hour.

However, the line between porn and social media is beginning to blur. From Fuckbook (a porn version of Facebook) to Pornostagram (a porn version of Instagram), to PornTube (a porn version of YouTube), online pornography websites are increasingly starting to behave like social networks – encouraging users to share, like, rate, comment, curate and even create content.

Traditional social media sites have always struggled with the “pornography problem” – the peculiar fact that whenever a means for people to share things online is created, people will start sharing explicit material. It only took four days after Twitter launched Vine for a pornographic video to creep to the top of its “Editor’s Picks” list.

From his position in the digital startup industry in Barcelona, Christian Thorn noticed this tendency and spotted a business opportunity. “If people are putting that stuff up on social media, then they want a site that will allow them to do it,” he says.

Thorn went on to found Pinsex – another addition to the social porn family – just over a year ago. It behaves like the photo-sharing website Pinterest – a virtual pinboard that allows users to collect images they like and follow others who have similar tastes. Pinsex does exactly the same, but with porn. In its first year, 50,000 users signed up, and the site now attracts 300,000 visits a day.

A tamer example of the material on the site – a photograph of a topless woman on a beach at dusk posted by a user called Nick – has had 124 repins and 382 likes. Many of the comments cannot be repeated for obvious reasons. “Pure perfection man,” one user has commented. “Doesn’t get much better than that,” reads another. One user, who is apparently female, writes that she will follow Nick, if he follows her back.

“A few years ago nobody would have predicted that people would take pictures of their food and put them on Facebook,” says Thorn. “People would have said: ‘Who is interested in what I had for lunch today?'” Nowadays people are happier to share, he says, and that applies to porn too.

Pinsex users can be broadly split into two categories – those who just like to curate their own collections and enjoy other people’s, and those who are creating their own images. “There are a lot of users posting amateur porn on the site and that porn might not be as beautiful and airbrushed – like you see in magazines or whatever – but it’s very popular.”

The socialisation of internet pornography has been noted with interest by academics. “Traditionally pornography was ‘used’, ‘consumed’ or whatever verb you wish to use, by people on their own,” says Simon Lindgren, a professor of sociology and social media researcher at Umeå University in Sweden.

He is clear that today’s online porn audience is no longer made up of “isolated masturbating loners”, but of an interactive and creative group of critical audience members.

Sharif Mowlabocus, a senior lecturer in media studies at the University of Sussex, argues that the idea of pornography as an intensely antisocial activity is actually relatively new. “With the exception of certain times in history, pornography has actually always had a social dimension,” he says. He points to the blue records of the 1920s and 30s – audio recordings of people having sex – which were often listened to in groups, and to the stag movies of the 1940s, which, again, were watched alongside other people in cinemas. “It was covert, but it was also social,” he says.

The social side of pornography has perhaps been even more important in the history of oppressed sexualities: “In the 1980s there was widespread sharing of porn among particular groups, like the gay community or BDSM community. Those communities have a long history of developing social relationships around porn.”

This all changed when the videotape came along. “The videotape took porn off the street, out of the movie theatres and into the intimate and domestic spaces of the home,” says Mowlabocus.


If being sociable with your porn habits isn’t new, then other things that come with the rise of social porn might be. Just as social media was credited as a catalyst for political revolution and a platform for hitherto unheard voices, some argue that the rise of social porn may democratise the world of pornography, providing space for alternative sexual desires. Once content is driven by the users, the argument goes, the porn landscape will look much more representative.

“The rise of social media in a range of forms has had a crucial impact on the kinds of pornography that are available and how it is made available,” says Susanna Paasonen, professor of media studies at the University of Turku, Finland. “You get different production practices, different aesthetics and different politics.”

“I started off studying mainstream online porn and now I don’t think I know what that is any more – it’s become so diverse,” she says. “There are of course examples that are immediately recognisable as mainstream, but then on the same platforms you have all kinds of content that is a far cry from the mainstream.”

Thorn says he wants to attract a diverse audience to Pinsex. Eighty per cent of users are currently male and a tiny percentage are trans, he says, but the site would like to attract a wider range of people – especially women. “We want to be a service for everybody because we’re a service that’s created by the users … In the old days people would say, ‘but women don’t like porn’ and we can see now that that’s not necessarily very true.”

“I don’t think that it’s surprising that in some of these more social spaces – where porn is consumed, uploaded, distributed, commented on – we are beginning to see discussions about the alternative politics of pornography,” says Mowlabocus.

But, although there is the opportunity and potential to democratise pornography, he says, it won’t necessarily happen. “Those same types of sociality are still being used to uphold some very misogynist views,” he says. “We need more than a technological platform to make those ideological shifts.”

One thing seems clear – social porn isn’t going anywhere. A Pew survey asked whether people thought the group referred to as Generation Y would continue to be “ambient broadcasters who disclose a great deal of personal information in order to stay connected” throughout their lives; 67% of those questioned thought so.

“It’ll be interesting to see if those pornography networks seep out and connections start to be made with your personal email address,” says Mowlabocus. But he suspects that social activity around online porn will always largely be done under the cloak of anonymity. “While porn has always had a social dimension, that social dimension has always been heavily policed.”

It doesn’t seem likely that your social porn activity will be recorded on your Facebook page, for example, as it is with other sites, including music-streaming site Spotify. “The people watching the stag films didn’t tell their bosses and parents,” says Mowlabocus.

“In the end, when you think about the people you’re connected to on Facebook, there are a lot of people you wouldn’t want to share your sexual desires with.”