How you can find out everything Google knows about you

When you use Google, you are making a deal. You get to use services like Gmail, Drive, search, YouTube, and Google Maps for free.


In exchange, you agree to share information about yourself that Google can pass along to advertisers so their ads are more effective. For instance, airlines want to target people who love to travel. Children’s clothing makers want to target parents.

Google uses a lot of methods to learn about you. There’s the stuff you tell Google outright when you sign up for Gmail or set up your Android phone. This includes your name, phone number, location, and so on.

But Google also watches you as you scamper around the internet, deducing your interests from what you search for and click on, from your use of Google’s other services, and from other websites you visit.

By visiting a hard-to-find page called “Web & App Activity,” you can see what Google is watching.

Then by visiting a site called “Ads Settings,” you can see what Google thinks it knows about you, and you can change what it’s telling advertisers about you.

It’s not easy to find your “Web & App Activity” page. You must be logged in to Google to see it. Once logged in, open the “Web & App Activity” page and click on “all time.”



This brings up a long list of all the web pages you searched. You can delete them, but it isn’t easy. Google lets you delete only one day at a time. That will take forever to cover years’ worth of data, but you can try it anyway. Click on today, then click the delete button at the top.




You’ll have to deal with a warning from Google telling you that you don’t really want to delete this information. The truth is, Google doesn’t want you to delete this information. You may or may not want to, but don’t worry if you do. You won’t break the internet or your Google account if you hit the delete button.


Now, click on the little menu button on the top left of the screen.



Here’s where you’ll find links to the voice, device, location, and YouTube records Google keeps on you. You can go to those pages and delete stuff, too. But you’ll have to delete everything one day at a time and deal with Google’s warnings on why you don’t want to do that.

If you click on “location history” in the menu, it takes you to a page with a map representing a “timeline” of where and when you traveled, as recorded by Google Maps or other location services. Now click on the settings button in the lower-right corner.




From here you can delete all of your location data, if you choose. But if you really want to see all the data Google has collected on you, click on “download a copy of all your data.” You can also get to this download page from your “account settings” page. Click on “select all.” Scroll down and select “next.”


Select your file type. We recommend the default, .zip, since Windows and Mac machines can typically open those files without problems. Then select your delivery method. You might want to save the archive to Drive if you have the space. Google warns that emailed archives may take hours or days to compile, so you’ll have to be patient. Even saving to Drive, mine took two hours. Google will email you when it’s done.

Google sent me two enormous 2GB files of everything it is tracking about me. Inside were folders of material, including JSON scripts about me and my data. But most of it was photos: every photo I had uploaded since 2013, at full size. Here’s a photo of my puppy that it sent, along with an example of the JSON scripts and the list of files.
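If you’re curious what’s actually inside your archive before wading through it, a few lines of Python can tally the files by extension. This is just a sketch: the function name `tally_extensions` is ours, and `"takeout.zip"` is a placeholder for whatever Google names your download.

```python
import zipfile
from collections import Counter
from pathlib import PurePosixPath

def tally_extensions(archive_path):
    """Count the files in a zip archive by extension, skipping directories."""
    with zipfile.ZipFile(archive_path) as archive:
        suffixes = (
            PurePosixPath(info.filename).suffix.lower() or "(none)"
            for info in archive.infolist()
            if not info.is_dir()
        )
        return Counter(suffixes)

# Point this at the archive Google sends you, e.g.:
# for suffix, count in tally_extensions("takeout.zip").most_common():
#     print(f"{suffix}: {count}")
```

In an archive like the one described above, you’d expect image extensions such as .jpg to dominate, with the JSON activity records a distant second.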

While you are waiting, you can explore what advertisers are told about you. While you are logged in, go to any Google service and click on your account icon. Then click on “my account.”




This takes you to your account-settings page. On the left, the “activity controls” lets you explore all the daily information Google keeps on you. “Control your content” lets you download all of your data. But this time, click on “ads settings,” then scroll down and click on “manage ad settings.”



This is what Google thinks I’m into. Some stuff is accurate: bikes, fitness, books, food & drink, mobile phones. Some is not: East Asian Music? Banking? Cleaning Agents? Rap & Hip-Hop? I think that’s Google’s way of guessing my gender (cleaning/hair), my ethnic background (Asian) and my age (Hip-Hop) because I deleted my gender and age information two years ago, the last time I checked on what Google was monitoring.




Scroll down and click on “control signed-out ads,” and you can turn off “interest-based ads,” at least for this browser, meaning Google won’t share information about you with advertisers. Google will warn you against it. Or you can switch to the DuckDuckGo search engine, which promises not to track you at all.



Google Glass takes photos by winking

Google has introduced a new feature to its Google Glass, which allows users to take a photo with a “wink of the eye”.

Google said the feature was faster than the camera button or the voice action and works even when the display is off.

The update to Google Glass, dubbed version XE12, also adds a screen lock feature and the ability to upload and share videos on YouTube.

Technology firms have been keen to capture the wearable gadgets market, seen by many as a key growth area.

Google said the wink feature in its Glass could have various other uses in the future.

“Imagine a day where you’re riding in the back of a cab and you just wink at the meter to pay,” the firm said in a blogpost.

“You wink at a pair of shoes in a shop window and your size is shipped to your door. You wink at a cookbook recipe and the instructions appear right in front of you – hands-free, no mess, no fuss,” it added.

Privacy concerns

The launch of Google Glass was accompanied with concerns over its impact on privacy. The worries were triggered by its potential to gather images, video and other data about almost anything a user sees.

Some have argued that privacy will be “impossible” if Google Glass and similar products become widely used.

Analysts said the ability to take a photo by just winking an eye would make it very difficult for people being photographed to notice that someone had taken a picture of them.

“It is a remarkable progress of technology, and the possibilities of innovation around it are limitless,” said Manoj Menon, managing director of consulting firm Frost & Sullivan.

“However, it comes with new issues that we need to understand, not least the worries over security and privacy.


“There needs to be discussion about how, and in what environments, gadgets like these can be used openly,” he said.

Mr Menon added that as the technology behind these gadgets matures and companies push more for their mainstream adoption, regulations were likely to come into place “to govern their usage”.

Potential growth

The wearable technology market is expected to see robust growth in the coming years.

However, analysts differ over the potential size of the market.

According to Juniper Research, the sector is expected to have annual sales of $19bn (£11.9bn) by 2018, up from $1.4bn this year.

Analysts at the bank Credit Suisse have been more upbeat and have suggested a figure of $50bn by the same date.

Research firm Gartner has been more cautious about its predictions. It has said it expects $10bn sales for 2016. But one of its analysts suggested the sector would grow more quickly if businesses decided to equip their workers with such technology.

Growing competition


Google Glass is one of a number of wearable gadgets that have been launched by firms as they compete to take a major share of the growing market.

In October, Nike launched its second-generation wristband, the FuelBand, which helps users track their physical activity.

In September, Samsung unveiled a smartwatch, the Galaxy Gear, that can be used for voice calls and to run apps.

Also in September, Japanese mobile operator NTT Docomo demonstrated glasses that can translate a menu by projecting an image of translated text over unfamiliar characters.

Earlier this year, US-based Heapsylon said it was developing sensor-equipped socks that would help their owners monitor their balance while walking or running.

Meanwhile, Chinese firm Shanda has unveiled the Geak Ring – a finger-worn device that can unlock a user’s smartphone or pass data to others.

Jacob Barnett, 14-Year-Old With Asperger’s Syndrome, May Be Smarter Than Einstein.

When Jacob Barnett was 2 years old, he was diagnosed with moderate to severe autism. Doctors told his parents that the boy would likely never talk or read and would probably be forever unable to independently manage basic daily activities like tying his shoelaces.

But they were sorely, extraordinarily mistaken.


Today, Barnett — now 14 — is a Master’s student, on his way to earning a PhD in quantum physics. According to the BBC, the teen, who boasts an IQ of 170, has already been tipped to one day win the Nobel Prize.

Since enrolling at Indiana University-Purdue University Indianapolis (IUPUI) at the age of 10, Barnett has flourished — astounding his professors, peers and family with his spectacular intelligence.

The teen tutors other college students in subjects like calculus and is a published scientific researcher, with an IQ that is believed to be higher than that of Albert Einstein. In fact, according to a 2011 TIME report, Barnett, who frequently tops his college classes, has asserted that he may one day disprove Einstein’s Theory of Relativity. (Watch him explain his genius to 60 Minutes’ Morley Safer in a 2012 interview in the video above.)

Outside of his rigorous university commitments, Barnett, who has Asperger’s Syndrome, is also an entrepreneur and aspiring author.

The teen, who, with his family, runs a charity called Jacob’s Place for kids on the spectrum, has used his story to raise awareness and dispel myths about autism.

“I’m not supposed to be here at all,” he said last year during a TEDx Teen speech about “forgetting what you know” in New York City. “You know, I was told that I wouldn’t talk. There’s probably a therapist watching who is freaking out right now.”


Though he makes it all look so easy, his mother, Kristine Barnett, says that he has to work hard on a daily basis to manage his autism.

“He overcomes it every day. There are things he knows about himself that he regulates every day,” his mother told the Indianapolis Star last month.

In April, Kristine Barnett’s memoir about her family’s experience with autism, “The Spark: A Mother’s Story of Nurturing Genius,” was released. A movie deal is said to be in the works.

“I hope it really inspires children to actually be doing something,” Barnett told the Star of his mom’s book and potential film. “[I hope it] encourages them to do what they like doing. I just hope it is inspirational.”

For more on Jacob Barnett, watch this March 2013 YouTube video of him working through what is described as “a simple quantum mechanics problem”:


Is Sugar Really Toxic? Sifting through the Evidence.

Our very first experience of exceptional sweetness—a dollop of buttercream frosting on a parent’s finger; a spoonful of strawberry ice cream instead of the usual puréed carrots—is a gustatory revelation that generally slips into the lacuna of early childhood. Sometimes, however, the moment of original sweetness is preserved. A YouTube video from February 2011 begins with baby Olivia staring at the camera, her face fixed in rapture and a trickle of vanilla ice cream on her cheek. When her brother Daniel brings the ice cream cone near her once more, she flaps her arms and arches her whole body to reach it.


Considering that our cells depend on sugar for energy, it makes sense that we evolved an innate love for sweetness. How much sugar we consume, however—as well as how it enters the body and where we get it from in the first place—has changed dramatically over time. Before agriculture, our ancestors presumably did not have much control over the sugars in their diet, which must have come from whatever plants and animals were available in a given place and season. Around 6,000 BC, people in New Guinea began to grow sugarcane, chewing and sucking on the stalks to drink the sweet juice within. Sugarcane cultivation spread to India, where by 500 BC people had learned to turn bowls of the tropical grass’s juice into crude crystals. From there sugar traveled with migrants and monks to China, Persia, northern Africa and eventually to Europe in the 11th century.

For more than 400 years, sugar remained a luxury in Europe—an exotic spice—until manufacturing became efficient enough to make “white gold” much more affordable. Christopher Columbus brought sugarcane to the New World in 1493, and in the 16th and 17th centuries European powers established sugarcane plantations in the West Indies and South America. Sugar consumption in England increased by 1,500 percent between the 18th and 19th centuries. By the mid 19th century, Europeans and Americans had come to regard refined sugar as a necessity. Today, we add sugar in one form or another to the majority of processed foods we eat—everything from bread, cereals, crunchy snacks and desserts to soft drinks, juices, salad dressings and sauces—and we are not too stingy about using it to sweeten many raw and whole foods as well.

By consuming so much sugar we are not just demonstrating weak willpower and indulging our sweet tooth—we are in fact poisoning ourselves, according to a group of doctors, nutritionists and biologists, one of the most prominent members of which is Robert Lustig of the University of California, San Francisco, famous for his viral YouTube video “Sugar: The Bitter Truth.” A few journalists, such as Gary Taubes and Mark Bittman, have reached similar conclusions. Sugar, they argue, poses far greater dangers than cavities and love handles; it is a toxin that harms our organs and disrupts the body’s usual hormonal cycles. Excessive consumption of sugar, they say, is one of the primary causes of the obesity epidemic and metabolic disorders like diabetes, as well as a culprit of cardiovascular disease. More than one-third of American adults and approximately 12.5 million children and adolescents in the U.S. are obese. In 1980, 5.6 million Americans were diagnosed with diabetes; in 2011 more than 20 million Americans had the illness.


The argument that sugar is a toxin depends on some technical details about the different ways the human body gets energy from different types of sugar. Today, Americans eat most of their sugar in two main forms: table sugar and high-fructose corn syrup. A molecule of table sugar, or sucrose, is a bond between one glucose molecule and one fructose molecule—two simple sugars with the same chemical formula, but slightly different atomic structures. In the 1960s, new technology allowed the U.S. corn industry to cheaply convert corn-derived glucose into fructose and produce high-fructose corn syrup, which—despite its name—is almost equal parts free-floating fructose and glucose: 55 percent fructose, 42 percent glucose and three percent other sugars. Because fructose is about twice as sweet as glucose, an inexpensive syrup mixing the two was an appealing alternative to sucrose from sugarcane and beets.

Regardless of where the sugar we eat comes from, our cells are interested in dealing with fructose and glucose, not the bulkier sucrose. Enzymes in the intestine split sucrose into fructose and glucose within seconds, so as far as the human body is concerned sucrose and high-fructose corn syrup are equivalent. The same is not true for their constituent molecules. Glucose travels through the bloodstream to all of our tissues, because every cell readily converts glucose into energy. In contrast, liver cells are one of the few types of cells that can convert fructose to energy, which puts the onus of metabolizing fructose almost entirely on one organ. The liver accomplishes this primarily by turning fructose into glucose and lactate. Eating exceptionally large amounts of fructose taxes the liver: it spends so much energy turning fructose into other molecules that it may not have much energy left for all its other functions. A consequence of this energy depletion is production of uric acid, which research has linked to gout, kidney stones and high blood pressure.

The human body strictly regulates the amount of glucose in the blood. Glucose stimulates the pancreas to secrete the hormone insulin, which helps remove excess glucose from blood, and bolsters production of the hormone leptin, which suppresses hunger. Fructose does not trigger insulin production and appears to raise levels of the hormone ghrelin, which keeps us hungry. Some researchers have suggested that large amounts of fructose encourage people to eat more than they need. In studies with animals and people by Kimber Stanhope of the University of California Davis and other researchers, excess fructose consumption has increased fat production, especially in the liver, and raised levels of circulating triglycerides, which are a risk factor for clogged arteries and cardiovascular disease. Some research has linked a fatty liver to insulin resistance—a condition in which cells become far less responsive to insulin than usual, exhausting the pancreas until it loses the ability to properly regulate blood glucose levels. Richard Johnson of the University of Colorado Denver has proposed that uric acid produced by fructose metabolism also promotes insulin resistance. In turn insulin resistance is thought to be a major contributor to obesity and Type 2 diabetes; the three disorders often occur together.

Because fructose metabolism seems to kick off a chain reaction of potentially harmful chemical changes inside the body, Lustig, Taubes and others have singled out fructose as the rotten apple of the sugar family. When they talk about sugar as a toxin, they mean fructose specifically. In the last few years, however, prominent biochemists and nutrition experts have challenged the idea that fructose is a threat to our health and have argued that replacing fructose with glucose or other sugars would solve nothing. First, as fructose expert John White points out, fructose consumption has been declining for more than a decade, but rates of obesity continued to rise during the same period. Of course, coinciding trends alone do not definitively demonstrate anything. A more compelling criticism is that concern about fructose is based primarily on studies in which rodents and people consumed huge amounts of the molecule—up to 300 grams of fructose each day, which is nearly equivalent to the total sugar in eight cans of Coke—or a diet in which the vast majority of sugars were pure fructose. The reality is that most people consume far less fructose than used in such studies and rarely eat fructose without glucose.


On average, people in America and Europe eat between 100 and 150 grams of sugar each day, about half of which is fructose. It’s difficult to find a regional diet or individual food that contains only glucose or only fructose. Virtually all plants have glucose, fructose and sucrose—not just one or another of these sugars. Although some fruits, such as apples and pears, have three times as much fructose as glucose, most of the fruits and veggies we eat are more balanced. Pineapples, blueberries, peaches, carrots, corn and cabbage, for example, all have about a 1:1 ratio of the two sugars. In his New York Times Magazine article, Taubes claims that “fructose…is what distinguishes sugar from other carbohydrate-rich foods like bread or potatoes that break down upon digestion to glucose alone.” This is not really true. Although potatoes and white bread are full of starch—long chains of glucose molecules—they also have fructose and sucrose. Similarly, Lustig has claimed that the Japanese diet promotes weight loss because it is fructose-free, but the Japanese consume plenty of sugar—about 83 grams a day on average—including fructose in fruit, sweetened beverages and the country’s many meticulously crafted confectioneries. High-fructose corn syrup was developed and patented in part by Japanese researcher Yoshiyuki Takasaki in the 1960s and ’70s.

Not only do many worrying fructose studies use unrealistic doses of the sugar unaccompanied by glucose, it also turns out that the rodents researchers have studied metabolize fructose very differently than people do—far more differently than originally anticipated. Studies that have traced fructose’s fantastic voyage through the human body suggest that the liver converts as much as 50 percent of fructose into glucose, around 30 percent of fructose into lactate and less than one percent into fats. In contrast, mice and rats turn more than 50 percent of fructose into fats, so experiments with these animals would exaggerate the significance of fructose’s proposed detriments for humans, especially clogged arteries, fatty livers and insulin resistance.

In a series of meta-analyses examining dozens of human studies, John Sievenpiper of St. Michael’s Hospital in Toronto and his colleagues found no harmful effects of typical fructose consumption on body weight, blood pressure or uric acid production. In a 2011 study, Sam Sun—a nutrition scientist at Archer Daniels Midland, a major food processing corporation—and his colleagues analyzed data about sugar consumption collected from more than 25,000 Americans between 1999 and 2006. Their analysis confirmed that people almost never eat fructose by itself and that for more than 97 percent of people fructose contributes less daily energy than other sugars. They did not find any positive associations between fructose consumption and levels of triglycerides, cholesterol or uric acid, nor any significant link to waist circumference or body mass index (BMI). And in a recent BMC Biology Q&A, renowned sugar expert Luc Tappy of the University of Lausanne writes: “Given the substantial consumption of fructose in our diet, mainly from sweetened beverages, sweet snacks, and cereal products with added sugar, and the fact that fructose is an entirely dispensable nutrient, it appears sound to limit consumption of sugar as part of any weight loss program and in individuals at high risk of developing metabolic diseases. There is no evidence, however, that fructose is the sole, or even the main factor in the development of these diseases, nor that it is deleterious to everybody.”

To properly understand fructose metabolism, we must also consider in what form we consume the sugar, as explained in a recent paper by David Ludwig, Director of the New Balance Foundation Obesity Prevention Center of Boston Children’s Hospital and a professor at Harvard. Drinking a soda or binging on ice cream floods our intestines and liver with large amounts of loose fructose. In contrast, the fructose in an apple does not reach the liver all at once. All the fiber in the fruit—such as cellulose that only our gut bacteria can break down—considerably slows digestion. Our enzymes must first tear apart the apple’s cells to reach the sugars sequestered within. “It’s not just about the fiber in food, but also its very structure,” Ludwig says. “You could add Metamucil to Coca Cola and not get any benefit.” In a small but intriguing study, 17 adults in South Africa ate primarily fruit—about 20 servings with approximately 200 grams of total fructose each day—for 24 weeks and did not gain weight, develop high blood pressure or imbalance their insulin and lipid levels.

To strengthen his argument, Ludwig turns to the glycemic index, a measure of how quickly food raises levels of glucose in the blood. Pure glucose and starchy foods such as Taubes’s example of the potato have a high glycemic index; fructose has a very low one. If fructose is uniquely responsible for obesity and diabetes and glucose is benign, then high glycemic index diets should not be associated with metabolic disorders—yet they are. A small percentage of the world population may in fact consume so much fructose that they endanger their health because of the difficulties the body encounters in converting the molecule to energy. But the available evidence to date suggests that, for most people, typical amounts of dietary fructose are not toxic.


Even if Lustig is wrong to call fructose poisonous and saddle it with all the blame for obesity and diabetes, his most fundamental directive is sound: eat less sugar. Why? Because super sugary, energy-dense foods with little nutritional value are one of the main ways we consume more calories than we need, albeit not the only way. It might be hard to swallow, but the fact is that many of our favorite desserts, snacks, cereals and especially our beloved sweet beverages inundate the body with far more sugar than it can efficiently metabolize. Milkshakes, smoothies, sodas, energy drinks and even unsweetened fruit juices all contain large amounts of free-floating sugars instantly absorbed by our digestive system.

Avoiding sugar is not a panacea, though. A healthy diet is about so much more than refusing that second sugar cube and keeping the cookies out of reach or hidden in the cupboard. What about all the excess fat in our diet, so much of which is paired with sugar and contributes to heart disease? What about bad cholesterol and salt? “If someone is gaining weight, they should look to sugars as a place to cut back,” says Sievenpiper, “but there’s a misguided belief that if we just go after sugars we will fix obesity—obesity is more complex than that. Clinically, there are some people who come in drinking way too much soda and sweet beverages, but most people are just overconsuming in general.” Then there’s all the stuff we really should eat more of: whole grains; fruits and veggies; fish; lean protein. But wait, we can’t stop there: a balanced diet is only one component of a healthy lifestyle. We need to exercise too—to get our hearts pumping, strengthen our muscles and bones and maintain flexibility. Exercising, favoring whole foods over processed ones and eating less overall sounds too obvious, too simplistic, but it is actually a far more nuanced approach to good health than vilifying a single molecule in our diet—an approach that fits the data. Americans have continued to consume more and more total calories each year—average daily intake increased by 530 calories between 1970 and 2000—while simultaneously becoming less and less physically active. Here’s the true bitter truth: Yes, most of us should make an effort to eat less sugar—but if we are really committed to staying healthy, we’ll have to do a lot more than that.



A New Era in Noninvasive Prenatal Testing.

A new, noninvasive prenatal test is poised to change the standard of care for genetic screening. Cell-free fetal DNA (cfDNA) testing requires only a maternal blood sample, can be performed as early as 9 weeks of gestation, and outperforms standard screening tests for trisomies 21, 18, and 13 in high-risk populations. It has a sensitivity exceeding 98% and a specificity above 99.5% (see table, Characteristics of Representative Noninvasive Prenatal Tests Available in the United States).

Currently, standard screening entails testing of maternal blood samples at gestational weeks 10 to 13 or 16 to 18 (or at both points) to measure serum markers associated with common trisomies and usually an ultrasound examination, including measurement of nuchal translucency, at 11 to 13 weeks. This approach identifies more than 90% of trisomies, with a screen-positive rate of 5% in the general population. Diagnostic testing for women with positive results on screening requires either amniocentesis or chorionic villus sampling, invasive procedures that carry a risk of miscarriage. Amniocentesis, which is performed far more commonly than chorionic villus sampling, is generally delayed until after 15 weeks, with a 1-to-2-week turnaround time for results.

The use of cfDNA testing may appeal to expectant parents for many reasons: it carries no risk of miscarriage, permits earlier detection, and generally provides earlier information about a fetus’s sex. Earlier testing can reassure parents who have negative results, while offering those with abnormal results timely information to help them make difficult decisions. People who choose to continue a pregnancy after an abnormal result have additional time to prepare to deliver and care for their child.

Nevertheless, the diffusion of cfDNA testing into routine prenatal care may be occurring too quickly. Professional societies do not recommend these tests for normal-risk pregnancies because their clinical utility in the general population is not well established. Yet because the Food and Drug Administration (FDA) is not empowered to require testing companies to produce evidence of clinical utility before receiving marketing approval, companies have been free to build consumer demand for cfDNA testing by aggressively marketing the tests, emphasizing data that do not answer key questions. As a result, cfDNA testing seems to be drifting into routine practice ahead of the evidence.

Tests of cfDNA appear to be highly sensitive and specific in detecting trisomies, but two problems plague the evidence base. First, the sensitivity and specificity of the tests derive from studies done on collections of archived samples with known karyotypes that intentionally included a large proportion of specimens from women with known aneuploid fetuses. Evidence concerning the performance characteristics of the testing in the general population and for multiple gestations is limited.[1] Second, cfDNA-testing companies have not reported information about their tests’ positive predictive value (PPV), and there is reason to question the tests’ performance on this measure.[2] Arguably, PPV is more important than sensitivity and specificity to patients undergoing testing: it indicates the probability that a positive test result indicates a true fetal aneuploidy. Thus, PPV should be discussed in study reports and marketing materials but isn’t.

Studies of cfDNA testing have often been conducted on samples including a high percentage of specimens with known abnormal karyotypes. Prevalence rates for Down’s syndrome in the samples are as high as 1 in 8.[3] Although sensitivity and specificity are unaffected by the condition’s prevalence in the test population, PPV and negative predictive value (NPV) vary considerably with prevalence. At a prevalence of 1 in 8, assuming a constant specificity of 99.7% and a sensitivity of 99.9%, the PPV and NPV are impressively high — 97.94% and 99.99%, respectively. But at a prevalence of 1 in 200 — the approximate prevalence of Down’s syndrome among fetuses of 35-year-old women in the second trimester of pregnancy — the PPV drops to 62.59%.
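The arithmetic behind those figures is just Bayes’ rule applied to the sensitivity, specificity, and prevalence quoted above. A short Python sketch (the function name is ours; the input values are the ones given in the text) reproduces them:

```python
def ppv_npv(sensitivity, specificity, prevalence):
    """Positive and negative predictive value from test characteristics."""
    tp = sensitivity * prevalence              # true positives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Validation-sample prevalence of 1 in 8 (sensitivity 99.9%, specificity 99.7%)
ppv, npv = ppv_npv(0.999, 0.997, 1 / 8)
print(f"PPV {ppv:.2%}, NPV {npv:.2%}")  # PPV 97.94%, NPV 99.99%

# Prevalence of 1 in 200 (35-year-old women, second trimester)
ppv, _ = ppv_npv(0.999, 0.997, 1 / 200)
print(f"PPV {ppv:.2%}")                 # PPV 62.59%
```

The comparison makes the article’s point concrete: with sensitivity and specificity held fixed, shrinking the prevalence 25-fold drags the PPV from near-certainty down to roughly a coin flip and a half.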

It is worrisome that some laboratories that performed validation tests may have been aware that the samples included high proportions of specimens with known aneuploidies — but that this isn’t always made clear in the studies’ descriptions. Prior knowledge about the prevalence of aneuploidies in the samples may well have affected an analyst’s decisions about how to classify ambiguous test results: someone who believes 1 in 8 samples is abnormal may be more likely to classify a questionable result as abnormal than someone who believes that 1 in 200 is abnormal. Not all published studies of cfDNA testing have this problem, and one study of a sample without a high prevalence of aneuploidies suggests that the false positive rate for the Harmony test (Natera) is low.[1] Without additional evidence, however, the clinical utility of cfDNA remains uncertain.

Given this unproven utility in the general population, the leading professional organizations, including the American Congress of Obstetricians and Gynecologists, the Society for Maternal–Fetal Medicine, and the National Society of Genetic Counselors, recommend cfDNA testing only for “high-risk pregnancies,” without specifically defining “high risk.” Furthermore, they recommend that positive results be confirmed through invasive testing. That recommendation is important for patients to understand, because if patients with positive results on cfDNA testing are counseled to wait until their diagnosis is confirmed before taking action, an important potential benefit of cfDNA testing is lost.

Patients must also weigh the benefits of earlier detection against other informational costs. Tests of cfDNA do not provide information about some disorders that are identified through standard screening, including chromosomal abnormalities other than trisomies. It is thus crucial that providers carefully counsel patients about the test’s advantages and disadvantages. Decision making is further complicated by the fact that cfDNA testing is costly and not widely covered by insurance. Four versions of the test are available in the United States, priced from $795 to more than $2,000 (see table). A few major insurers cover cfDNA testing if it’s accompanied by confirmatory testing for positive results, but many have yet to decide whether to cover it.

Meanwhile, testing companies have pursued various strategies to build consumer demand, including reaching out to expectant mothers through YouTube, Facebook, and Twitter. Some companies have capped out-of-pocket costs and offered “introductory pricing” specials with costs ranging from $200 to $235. This strategy has had apparent success, with one company boasting a “spectacular” adoption rate of 60,000 tests performed in 2012.4

The companies’ marketing strategy risks building demand for tests that may not offer a substantial benefit, particularly for women with low-risk pregnancies. Expectant parents’ excitement about the opportunity to learn their child’s sex and rule out trisomies earlier may lead to discounting the tradeoffs involved, push the standard of care away from professional recommendations for confining use to high-risk populations, and contribute to higher costs. The evidentiary gaps concerning cfDNA testing, aggressive marketing, and rapid diffusion into routine practice can be traced, at least partially, to our country’s regulatory scheme for laboratory-developed tests. Under FDA regulations, commercial test kits — which are distributed to multiple laboratories and health care facilities — are subject to both premarketing assessments of analytic and clinical validity and postmarketing reporting of adverse events. No similar requirements exist for tests, like the cfDNA tests, developed for in-house use by a single laboratory.

Laboratory-developed tests are governed, instead, by the Clinical Laboratory Improvement Amendments of 1988. Laboratories must demonstrate such a test’s accuracy, precision, specificity, and sensitivity — but not its clinical validity or utility. Although companies offering noninvasive prenatal tests have chosen to perform studies in the targeted population, they aren’t obliged to do so,2 nor must they design studies so as to provide robust evidence about clinical utility.

Congress’s choice to require a less onerous regulatory approach for laboratory-developed tests arguably promotes the availability of new tests, but it leaves the real-world benefits and risks of these tests more uncertain than those of commercial tests. The rapid proliferation of direct-to-consumer genetic tests and other laboratory-developed tests has led to controversy, culminating in two unsuccessful congressional attempts to strengthen oversight.5 For now, as with many medical innovations, it will fall to physicians to hold the line against pressures promoting diffusion of cfDNA testing beyond the boundaries of available evidence.



  1. Nicolaides KH, Syngelaki A, Ashoor G, Birdir C, Touzet G. Noninvasive prenatal testing for fetal trisomies in a routinely screened first-trimester population. Am J Obstet Gynecol 2012;207(5):374.e1-374.e6.
  2. Norton ME, Rose NC, Benn P. Noninvasive prenatal testing for fetal aneuploidy: clinical assessment and a plea for restraint. Obstet Gynecol 2013;121:847-850.
  3. Palomaki GE, Kloza EM, Lambert-Messerlian GM, et al. DNA sequencing of maternal plasma to detect Down syndrome: an international clinical validation study. Genet Med 2011;13:913-920.
  4. Lindsay R, Maier P. Sequenom Management Presents at Barclays Capital 2013 Global Healthcare Conference (Transcript). March 13, 2013.
  5. Weiss RL. The long and winding regulatory road for laboratory-developed tests. Am J Clin Pathol 2012;138:20-26.

Source: NEJM


The Internet: A Superhighway of Paranormal Hoaxes and Fakelore.


It’s been a hot time for hoaxing thanks to the Internet. With Photoshop, citizen journalism sites, YouTube, and message boards for the latest photo leaks, it is way too easy to send a lie halfway around the world before the truth can pull its shoes on.

In this post, I wrote about a busy week in paranormal-themed news. In chatting with a correspondent — Jeb Card, Visiting Assistant Professor in the Anthropology Department of Miami University — over a shared interest in the state of the paranormal today or “occulture,” we got to talking about the state of hoaxing.

Make no mistake, hoaxing has always been around. Hoaxers have been trying to fool people by displaying their special skills (scams) or stupendous stories since the beginning of civilization, I think. But there is a particular history of hoaxing in occulture. Lately, it has gotten more frequent (or we sure notice it more), more absurd (to outdo the last one) and more involved (because the payout can be big while the scrutiny greater).

There are many famous hoaxes from this scene. It’s hard to say if hoaxing is more common now than in the past. Some of the hoaxes, notes Jeb, have been very influential in the creation of popular folklore. Big ones have defined UFOlogy: Roswell and the Men in Black. Not everyone would conclude these are deliberate hoaxes — there is a grain of truth to them — but they went way out of control, and now there are hoaxed videos, documents and tales based on these events that never happened the way the lore says they did. Stories like that, which have taken on a life of their own as if they were true, are called “fakelore.”

The Bigfoot field is trampled over with fake footprints, stories, casts, photos and videos. It can’t be denied that the majority of Bigfoot stories are unbelievable, without supporting evidence, or obvious hoaxes. Every new bit of Bigfoot “evidence” these days makes us roll our eyes and say “SERIOUSLY!?” This reputation is damaging to those who truly believe something is out there to be found. The credibility of Bigfoot researchers scrapes the bottom of the barrel. The history of hoaxes colors this topic deeply when we realize that the seminal story of “Bigfoot,” Ray Wallace’s trackway, was revealed to be a hoax.

Actually, the same can be said for the Loch Ness Monster. The iconic Nessie photo — the long neck arching out of the rippling water, known as the Surgeon’s photo — was hoaxed.

A longtime follower of the occulture fields, Jeb says he can’t think of a time when these communities weren’t awash with simultaneous and multiple hoax accusations. Today, I post some of the latest ridiculous news stories on Doubtful News, but some are too intelligence-insulting to even mention. I can’t waste time on them. The Internet rewards even cheap hoaxes with website hits from the curious. Many sites gain popularity doing just this, collecting the latest mystery tomfoolery and telling you to decide for yourself.

Hoaxes of old lasted a very long time. If the infamous Patterson-Gimlin Bigfoot film is a hoax (as several have espoused), then it’s one of the best, because people are STILL fighting about it 46 years later! The Surgeon’s photo mentioned above lasted almost 70 years. The Wallace wooden footprint maker wasn’t widely revealed until he died. The Majestic UFO documents are still believed by many to be genuine, as we saw when they came up in the recent Citizens Hearing on Disclosure.

Even when the real story is exposed, the fakelore lingers, with adherents still clinging to belief. A modern monster, birthed by the Internet, that continues to live despite being utterly demolished is the chupacabra, the alleged goatsucker, a monster from Latin America. Ben Radford’s book Tracking the Chupacabra was a clean takedown of this folklore- and pop-culture-derived beast. But the critter continually morphed its way into the global consciousness, evolving as needed to serve as the scapegoat for whatever fear arose in the public’s mind.

Hoaxes today range from the low-budget — a guy in a ghillie suit walking through the woods at a distance, filmed with a smartphone — to professional artists rendering impressive CGI special effects, particularly in UFO hoaxes on YouTube. Really spectacular stuff! Too spectacular to be real, or the whole city would have noticed!

We also have the problem of marketing hoaxes, for products, movies or TV shows in particular. Some universities even ask students to hoax for a class project, with crowd-sourced grading based on how far the hoax can spread.

Today, money can come out of hoaxing. There are pay-per-view outlets, special memberships sold, funding solicited for “studies,” merchandise and book sales that mean big bucks to those who can milk the public for a little while and steer clear of fraud charges. Also, within an online community of people who share a belief in a questionable phenomenon, there may be a misplaced sense of trust and hope. Those who are emotionally invested in the idea of Bigfoot, let’s say, will want to support a potentially groundbreaking new project that will prove to everyone they aren’t crazy in their quest.

Does the ease of the Internet give people incentive to hoax? That’s undeniable. People do it just to see how far they can get, how many YouTube views, what media outlets cover it. As Jeb says: “The Internet removes the gatekeepers, the filters between the potential hoaxer, and the mark. Your fake Bigfoot doesn’t need to be good enough to get on [the TV] news and then filter down. It just needs to be good enough that someone will share it.”

Jeb cites the TV show Ancient Aliens as an example of a successful brand that has captured public interest no matter HOW absurd the ideas presented. On “reality” TV shows, viewers lose perspective that they are watching an edited, at least partially scripted, entertainment device. It’s not actual scientific research.

The occulture scene gets decidedly more unhealthy as money, greed, quest for notoriety and lack of scruples allow the sensationalist speculation and outright hoaxers to keep right on fooling everyone, time and time again. There’s a sucker born every minute.





Exercise Could Hold Key to Successful Cancer and Mental Health Treatment.


Mounting evidence continues to show that exercise may be a key component in successful cancer prevention and treatment. Studies have also found that it can help keep cancer from recurring, so it’s really a triple-win.

Yet, not surprisingly, few oncologists ever tell their patients to engage in exercise beyond their normal daily activities, and many cancer patients are reluctant to exercise, or even to discuss it with their oncologist. Hopefully, you will not be one of them.

Most recently, research announced at the 2013 International Liver Congress1 found that mice that exercised on a motorized treadmill for an hour each day, five days a week for 32 weeks, experienced a lower incidence of liver cancer than sedentary mice.

Exercise may also be absolutely crucial in the treatment of depression, according to recent research.2 I’ve often stated this, and the science continues to support this advice.

Meanwhile, mounting evidence condemns the “evidence-based” drug paradigm, as reviews keep finding that large amounts of published drug research is either seriously flawed or outright fraudulent — motivated of course by the financial interests of the funding party.

Might Exercise Be a Key to Cancer Cure?

Hepatocellular carcinoma (HCC) is a cancer that originates in your liver cells, and it is one of the most common types of cancer. According to the featured article in Medical News Today,3 HCC accounts for just over five percent of all cancers worldwide and causes an estimated 695,000 deaths annually.

According to the reported research,4 the first of its kind for this type of tumor, regular exercise may be the key to significantly reducing your chances for developing liver cancer.

The study involved two groups of mice: One group was fed a high fat diet, and then divided into two sub-groups — one that exercised and one that did not. The second group was fed a controlled diet, and also divided into sub-groups of exercise and non-exercise. According to the featured article:

“After 32 weeks of regular exercise, 71 percent of mice on the controlled diet developed tumors larger than 10mm versus 100 percent in the sedentary group. The mean number and volume of HCC tumors per liver was also reduced in the exercise group compared to the sedentary group.”

In the high-fat diet group, exercise decreased the development of non-alcoholic fatty liver disease. Professor Jean-Francois Dufour told Medical News Today:

“We know that modern, unhealthy lifestyles predispose people to non-alcoholic fatty liver disease which may lead to liver cancer; however it’s been previously unknown whether regular exercise reduces the risk of developing HCC. This research is significant because it opens the door for further studies to prove that regular exercise can reduce the chance of people developing HCC.

The results could eventually lead to some very tangible benefits for people staring down the barrel of liver cancer and I look forward to seeing human studies in this important area in the future. The prognosis for liver cancer patients is often bleak as only a proportion of patients are suitable for potentially curative treatments so any kind of positive news in this arena is warmly welcomed.”

Exercise Needs to be Part of the New Standard of Care for Cancer

But the benefits of exercise are not limited to prevention alone. It can also help you recuperate faster and help prevent recurrence of cancer. A report issued by the British organization Macmillan Cancer Support5 just last year argues that exercise really should be part of standard cancer care. It recommends that all patients getting cancer treatment should be told to engage in moderate-intensity exercise for two and a half hours every week, stating that the advice to rest and take it easy after treatment is an outdated view.

The organization offers loads of helpful information about the benefits of exercise for cancer patients on its website, and it also has a number of videos on the subject, available on its YouTube channel.

According to Ciaran Devane, chief executive of Macmillan Cancer Support:7

“Cancer patients would be shocked if they knew just how much of a benefit physical activity could have on their recovery and long-term health, in some cases reducing their chances of having to go through the grueling ordeal of treatment all over again…”

Indeed, the reduction in risk of recurrence is quite impressive. For example, previous research has shown that breast and colon cancer patients who exercise regularly have half the recurrence rate of non-exercisers.8 Macmillan Cancer Support also notes that exercise can help you mitigate some of the common side effects of conventional cancer treatment, including:

- Reduce fatigue and improve your energy levels
- Manage stress, anxiety, low mood or depression
- Improve bone health
- Improve heart health (some chemotherapy drugs and radiotherapy can cause heart problems later in life)
- Build muscle strength, relieve pain and improve range of movement
- Maintain a healthy weight
- Sleep better
- Improve your appetite
- Prevent constipation

Exercise Can Also Benefit Your Mental Health — Even When Forced

Many recent studies have shown that exercise provides a level of protection against stress-related disorders and depression. But could it still work if it were prescribed and forced upon you, by doctor’s orders, for example, or as part of a mandatory program, as for high school students or military recruits who are required to participate whether they like it or not?

To find out, researchers at the University of Colorado Boulder devised an animal study to determine whether rats that were forced to exercise would experience the same stress- and anxiety-reduction as those who were free to choose if and when to exercise.

The rats exercised either voluntarily or forcibly for six weeks, after which they were exposed to a stressor. The following day, their anxiety levels were tested by measuring how long they froze when placed in an environment they’d been conditioned to fear. The longer the rats remained frozen, like “a deer in headlights,” the greater the residual anxiety from the previous day’s stressor. According to the lead author:9

“Regardless of whether the rats chose to run or were forced to run they were protected against stress and anxiety. The sedentary rats froze for longer periods of time than any of the active rats. The implications are that humans who perceive exercise as being forced — perhaps including those who feel like they have to exercise for health reasons — are maybe still going to get the benefits in terms of reducing anxiety and depression.”

Could 89 Percent of ‘Landmark’ Cancer Research Be Untruthful?

Findings such as the ones above, which demonstrate the significant benefits of lifestyle changes like exercise on your physical and mental health, become all the more important in light of mounting evidence showing that conventional drug treatment research has been sorely compromised by industry funding. As discussed in a recent GreenMedInfo article,10 the alleged “groundbreaking” results of nearly nine out of 10 cancer studies cannot be reproduced by any means!

“This means that to an extent, we have based our healthcare and clinical guidelines on fake studies that reported untruthful results in order to accommodate the interests of industrial corporations,” Eleni Roumeliotou writes.

“Cancer is a major killer in US. The American Cancer Society reports that in 2012, more than half a million Americans died from cancer, while more than 1.6 million new cases were diagnosed. Given the seriousness of these statistics and the necessity of evidence-based medicine, it would make sense to trust that honest, objective research is tirelessly trying to find the best cancer therapies out there.”

Alas, this trust in the scientific rigor of medical research appears to have been misplaced. First of all, nearly three-quarters of all retracted drug studies are due to falsification of data,11 meaning it’s not even a matter of misinterpretation of data; rather the data used to draw conclusions are pure fiction. Large numbers of patients can be affected when false findings are published, as the average lag time between publication of the study and the issuing of a retraction is 39 months. And that’s if it’s ever caught at all.

Last year, former drug company researcher Glenn Begley also showed that the vast majority of the “landmark” studies on cancer are unreliable — and a high proportion of those unreliable studies come from respectable university labs. Begley looked at 53 papers in the world’s top journals, and found that he and a team of scientists could NOT replicate 47 of the 53 published studies — all of which were considered important and valuable for the future of cancer treatments!
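The headline’s “89 percent” figure is simply the replication arithmetic from Begley’s review, rounded to the nearest percent:

```python
# Begley and colleagues could not replicate 47 of the 53 "landmark"
# cancer papers they examined; the headline figure is that ratio.
failed, total = 47, 53
print(round(100 * failed / total))  # prints 89
```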

Part of the problem, they said, is that scientists often ignore negative findings in their results that might raise a warning. Instead, they opt for cherry-picking conclusions in an effort to put their research in a favorable light. The allegations appeared in the March 28 issue of the prestigious journal Nature.12

“It was shocking,” Begley said.13 “These are the studies the pharmaceutical industry relies on to identify new targets for drug development. But if you’re going to place a $1 million or $2 million or $5 million bet on an observation, you need to be sure it’s true. As we tried to reproduce these papers we became convinced you can’t take anything at face value.”

As if that’s not disturbing enough, Roumeliotou points out that Begley was not permitted to disclose which 53 cancer studies he evaluated and found to be without scientific merit. She writes:14

“…when they contacted the original authors and asked for details of the experiments, they had to sign an agreement that they would not disclose their findings or sources. This shows that the scientists, who published the tainted research, were all along, fully aware of the discrepancies of their articles and criminally conscious of the fact that they were misleading the medical and public opinion.”

Your Lifestyle has Tremendous Influence Over Your Health and Cancer Risk…

In light of the evidence supporting the notion that lifestyle changes, such as exercise, have a profound impact on human health and diseases of both mind and body, it would be foolish in the extreme to ignore such advice. Especially when you consider that the conventional drug paradigm is riddled with unreliable and outright fraudulent research — courtesy of the financial influence of the drug industry itself, which funds the vast majority of drug research.

Studies on exercise and other lifestyle changes however are less likely to be fraudulent simply because there’s no money to be made by coming to the conclusion that exercise may be helpful — unless it was funded by a gym franchise, perhaps…

Whether you’re trying to address your mental or physical health, I would strongly recommend you read up on my Peak Fitness program, which includes high-intensity exercises that can reduce your exercise time while actually increasing your benefits.

Now, if you have cancer or any other chronic disease, you will of course need to tailor your exercise routine to your individual circumstances, taking into account your fitness level and current health. Often, you will be able to take part in a regular exercise program — one that involves a variety of exercises like strength training, core-building, stretching, aerobic and anaerobic — with very little changes necessary. However, at times you may find you need to exercise at a lower intensity, or for shorter durations.

Always listen to your body and if you feel you need a break, take time to rest. But even exercising for just a few minutes a day is better than not exercising at all, and you’ll likely find that your stamina increases and you’re able to complete more challenging workouts with each passing day. In the event you are suffering from a very weakened immune system, you may want to exercise at home instead of visiting a public gym. But remember that exercise will ultimately help to boost your immune system, so it’s very important to continue with your program, even if you suffer from chronic illness or cancer.



Moon Landing Faked!!!—Why People Believe in Conspiracy Theories.

New psychological research helps explain why some see intricate government conspiracies behind events like 9/11 or the Boston bombing

Did NASA fake the moon landing? Is the government hiding Martians in Area 51? Is global warming a hoax? And what about the Boston Marathon bombing… an “inside job,” perhaps?

In the book “The Empire of Conspiracy,” Timothy Melley explains that conspiracy theories have traditionally been regarded by many social scientists as “the implausible visions of a lunatic fringe,” often inspired by what the late historian Richard Hofstadter described as “the paranoid style of American politics.” Influenced by this view, many scholars have come to think of conspiracy theories as paranoid and delusional, and for a long time psychologists have had little to contribute other than to affirm the psychopathological nature of conspiracy thinking, given that conspiracist delusions are commonly associated with (schizotypal) paranoia.

Yet, such pathological explanations have proven to be widely insufficient because conspiracy theories are not just the implausible visions of a paranoid minority. For example, a national poll released just this month reports that 37 percent of Americans believe that global warming is a hoax, 21 percent think that the US government is covering up evidence of alien existence and 28 percent believe a secret elite power with a globalist agenda is conspiring to rule the world. Only hours after the recent Boston marathon bombing, numerous conspiracy theories were floated ranging from a possible ‘inside job’ to YouTube videos claiming that the entire event was a hoax.

So why is it that so many people come to believe in conspiracy theories? They can’t all be paranoid schizophrenics. New studies are providing some eye-opening insights and potential explanations.

For example, while it has been known for some time that people who believe in one conspiracy theory are also likely to believe in others, we would expect contradictory conspiracy theories to be negatively correlated. Yet this is not what psychologists Michael Wood, Karen Douglas and Robbie Sutton found in a recent study. Instead, the research team, based at the University of Kent in England, found that many participants believed in contradictory conspiracy theories. For example, the belief that Osama Bin Laden is still alive was positively correlated with the belief that he was already dead before the military raid took place. This makes little sense, logically: Bin Laden cannot be both dead and alive at the same time. An important conclusion that the authors draw from their analysis is that people don’t tend to believe in a conspiracy theory because of the specifics, but rather because of higher-order beliefs that support conspiracy-like thinking more generally. A popular example of such a higher-order belief is a severe “distrust of authority.” The authors go on to suggest that conspiracism is therefore not just belief in an individual theory, but rather an ideological lens through which we view the world.

A good case in point is Alex Jones’s recent commentary on the Boston bombings. Jones (one of the country’s preeminent conspiracy theorists) reminded his audience that two of the hijacked planes on 9/11 flew out of Boston (relating one conspiracy theory to another) and, moreover, that the Boston Marathon bombing could be a response to the sudden drop in the price of gold or part of a secret government plot to expand the Transportation Security Administration’s reach to sporting events. Others have pointed their fingers at a “mystery man” spotted on a nearby roof shortly after the explosions. While it remains unclear whether credence is given to only some or all of these (note: contradictory) conspiracy theories, there clearly is a larger underlying preference for conspiracy-type explanations more generally.

Interestingly, belief in conspiracy theories has recently been linked to the rejection of science. In a paper published in Psychological Science, Stephan Lewandowsky and colleagues investigated the relation between acceptance of science and conspiracist thinking patterns. While the authors’ survey was not representative of the general population, the results suggest that (controlling for other important factors) belief in multiple conspiracy theories significantly predicted the rejection of important scientific conclusions, such as climate science or the fact that smoking causes lung cancer. Yet rejection of scientific principles is not the only possible consequence of widespread belief in conspiracy theories. Another recent study indicates that receiving positive information about, or even merely being exposed to, conspiracy theories can lead people to become disengaged from important political and societal topics. For example, in their study, Daniel Jolley and Karen Douglas clearly show that participants who received information supporting the idea that global warming is a hoax were less willing to engage politically and also less willing to implement individual behavioral changes such as reducing their carbon footprint.

These findings are alarming because they show that conspiracy theories sow public mistrust and undermine democratic debate by diverting attention away from important scientific, political and societal issues. There is no question that the public should actively demand truthful and transparent information from their governments, and proposed explanations should be met with a healthy amount of scepticism; yet this is not what conspiracy theories offer. A conspiracy theory is usually defined as an attempt to explain the ultimate cause of an important societal event as part of some sinister plot conjured up by a secret alliance of powerful individuals and organizations. The great philosopher Karl Popper argued that the fallacy of conspiracy theories lies in their tendency to describe every event as ‘intentional’ and ‘planned’, thereby seriously underestimating the random nature and unintended consequences of many political and social actions. In fact, Popper was describing a cognitive bias that psychologists now commonly refer to as the “fundamental attribution error”: the tendency to overestimate the actions of others as being intentional rather than the product of (random) situational circumstances.

Since a number of studies have shown that belief in conspiracy theories is associated with feelings of powerlessness, uncertainty and a general lack of agency and control, a likely purpose of this bias is to help people “make sense of the world” by providing simple explanations for complex societal events — restoring a sense of control and predictability. A good example is that of climate change: while the most recent international scientific assessment report (receiving input from over 2500 independent scientists from more than 100 countries) concluded with 90 percent certainty that human-induced global warming is occurring, the severe consequences and implications of climate change are often too distressing and overwhelming for people to deal with, both cognitively and emotionally. Resorting to easier explanations that simply discount global warming as a hoax is then of course much more comforting and convenient psychologically. Yet, as Al Gore famously pointed out, unfortunately, the truth is not always convenient.

Source: Scientific American


YouTube’s top five most-viewed videos in the UK, as the site turns eight.

Robert Lustig: The no candy man.

A man who declares that sugar is a toxin in the same league as cocaine and alcohol, and one that must be regulated in the same manner as tobacco, is apt to draw public attention. But Robert Lustig, professor of clinical paediatrics at the University of California, San Francisco, is not camera shy. Indeed, he revels in the attention, even when it is not always flattering. Where other academics might feel uncomfortable, he exploits his fame to full effect. For example, at a recent symposium in London he argued that sugar was an addictive and dangerous substance, singularly responsible for the soaring rates of obesity and diabetes around the world. He began his speech with a quotation from Gandhi and concluded by declaring war against the sugar industry. The audience responded with rapture and enthusiasm.

Lustig, a paediatric endocrinologist specialising in neuroendocrinology, owes his fame predominantly to a lecture, posted on YouTube, entitled “Sugar: The Bitter Truth”. At the time of writing, it had had more than 3.3 million views. Not bad for a 90 minute lecture, the bulk of which is devoted to complex biochemical reactions that happen in the liver. But Lustig is an engaging and passionate speaker, prone to rhetorical flourishes and dramatic pronouncements, which keeps his audience, virtual and real, interested.

Source: BMJ