Your tap water contains fluoride, aluminum, lead, chloride, chlorine, and lithium. Use cilantro to purify it


By using cilantro, you can purify your water and make it safe to drink in a completely natural way.

Many people use plastic water filters to remove undesirable substances from tap water, but cilantro is a more natural, purer way to do this.

Tap water has been known to contain chlorine, fluoride, and varying amounts of dissolved minerals such as calcium, magnesium, sodium, chlorides, and sulfates; metals such as iron, manganese, copper, and aluminum; and nitrates, insecticides, and herbicides.

Scarier still, some tested tap water has shown traces of pharmaceutical drugs such as antibiotics and mood stabilizers, proving you never really know what could be in your water each time you turn on the tap.

Although found only in trace amounts, heavy metals accumulate in your body if not flushed out and can lead to serious health problems further down the line. Accumulated heavy metals have been linked to Parkinson’s disease and even Alzheimer’s.

Cilantro is so effective at removing these toxins because they bind to the plant on contact and are drawn out of the water. It’s like nature’s own water filter.

Douglas Schauer from Ivy Tech Community College in Lafayette, Indiana has been studying the effects of cilantro as a water purifier.

He said: “The organic toxins we can take care of pretty easily with a number of different methods, but the only way to really get rid of those heavy metals is to treat them with filtering agents like activated charcoal, but those types of materials are kind of expensive. They are a little expensive for us to use, but they are very expensive for the people living in that region.”

Use a handful of cilantro for about every 2 litres of water to ensure cleaner, better water.

Los Alamos study finds airport scanners can rip apart and alter DNA


Can we ever believe what our government tells us about airport security devices?

 

Apparently not. First they told us those X-ray scanners (that showed way too many naked body parts) were perfectly safe.

Even the manufacturer of the device, Rapiscan, openly admitted the scanners had not been adequately tested. It was later revealed that the safety tests had been rigged, as reported by Natural News.

With fabricated results, the technology was quickly rushed into every airport worldwide. No one listened to what the scientists in the field of radiation were trying to tell them – it’s not safe.

It wasn’t until TSA agents exposed to the scanners’ backscatter radiation began showing an increased incidence of cancer (along with the lawsuits that quickly followed) that the devices were finally yanked. The TSA quickly scrambled for another solution.

Now they also want us to believe that the replacement technology, millimeter wave “digital strip search” scanners, is also “perfectly safe”.


Don’t believe it for a second. The TSA failed to adequately test these devices for health and safety factors as well. Unfortunately, in today’s world, security trumps human safety.

These millimeter wave technologies are designed to bombard innocent travelers with high-frequency energy particles known as terahertz photons.

A study conducted by Boian S. Alexandrov et al. at the Center for Nonlinear Studies at Los Alamos National Laboratory in New Mexico revealed that these terahertz waves could “…unzip double-stranded DNA, creating bubbles that could significantly interfere with processes such as gene expression and DNA replication.”

 

In other words, this study is the smoking gun that raises serious concerns about the impact of terahertz radiation upon fertility, fetal development, and cancer.

Now think about the thousands of people who are subjected to these levels of untested energy particles every day in the name of “National Security”.

The military’s Active Denial weapon uses millimeter wave technology to create an intense burning sensation on the skin’s surface using a 95 GHz (3.2mm wavelength) beam.

But the TSA tells us not to worry about their millimeter waves because:

“Millimeter wave technology bounces harmless electromagnetic waves off the body to create the same generic image for all passengers.” 

This is completely inaccurate, because our bodies, being mostly water, are excellent absorbers of millimeter waves. Millimeter waves do penetrate and are absorbed into our skin.

At the microwave technology center in Malaysia, healthy subjects were exposed to microwave radiation between 20 and 38 GHz, the range in which the TSA scanners operate.

They found that millimeter waves penetrated the subjects’ skin to depths of 1.05 mm at 20 GHz and 0.78 mm at 38 GHz, enough to reach below the epidermal layer of the skin.

Millimeter waves have been reported to produce a variety of bioeffects, many of which are quite unexpected from radiation that penetrates less than 1 mm into biological tissue.


Of particular concern are studies showing an irreversible water memory effect from millimeter waves at the 36 GHz frequency, and that millimeter wave effects on blood plasma vary greatly from one person to the next.

Does this information make you extremely uncomfortable? Well, it should. And it should also make every one of us mad as hell.

Since the day they first rolled out these human violation technologies in 2007, I have always chosen to “opt out”. I would rather endure the intrusive body pat down any day than subject myself to covert DNA alteration.

So what if it takes an additional 5–15 minutes to get to your gate? It’s time to exercise your own personal body health consciousness, since the US government has clearly demonstrated it has no qualms about failing to protect you.

Alteration of DNA can be subtle and deadly down the line. Who would ever make the connection that a TSA scanning machine might have contributed to a negative health effect you eventually experience?

 

If you are a frequent air traveler, like myself, you should be concerned about your levels of exposure. If you’re a TSA agent, you should find another job.

This past weekend as I was trying to make a flight back to Los Angeles from the Columbia, South Carolina Airport, I did my usual “opt out” thing. The TSA agents from this little backwater airport tried to feed me the propaganda line about “minimal risk.”

I told them I’d read the studies and they needed to be better informed. They looked at me blankly, trying to tell me it was just like using a cell phone. Not true.

The millimeter wave scanners the TSA operates put out more than 20 billion times more oscillations per second, in much smaller terahertz waves, so the cellular effects will be different from those of cell phones.

I’m sure no one ever told the TSA agents this; they simply feed back to travelers the same lies they’ve been told, so I tried to be more forgiving.

I’m sure no one had requested an opt out for some time in this South Carolina airport, which is why I got the pat down of all pat downs.

The female agent made sure to give me karate chops straight up to my private parts twice in the back and then another two times in the front. Totally unnecessary. She kneaded my waist in a strange manner, grabbing hold of any loose skin she could find.

I have had hundreds of pat downs over the years, and no one, I mean no one, has ever been as intrusive as this TSA agent.

My first instinct was to tell her how inappropriate she was being, then I remembered how I would most likely be punished for my non-sheep-like behavior and not be allowed to make my flight.

During the procedure she also sniffled and sneezed, spreading her germ warfare all over me throughout the entire security grope session.

I think we have all had enough of this undignified treatment in the name of security. It’s already been proven that these scanner devices and intrusive pat downs have not made our world any safer from terrorists.

Airport security testers have snuck everything from guns to explosives past these checkpoints, clearly proving their ineffectiveness. Metal detectors should be sufficient.

If everyone opted out of the scanner, the whole program would eventually fall apart. The lines of opt outs would be so long it would bring the air travel industry to a standstill.

It would also send a clear message that unsafe devices will not be tolerated. Take the extra time and just do it: opt out. If you love yourself, then you owe it to yourself.

Now I’m already ahead of you on what you’re thinking — that they’ll just suspend all our civil liberties and make it mandatory to go through the scanners whether we want to or not.

Well, I would like to believe that they would be flooded with lawsuits if they did, but there’s an even easier solution. Go to a medical supply store, buy an inexpensive arm sling, and put it on before going through TSA.

If you can’t hold both arms up over your head while in the scanner, the results are rendered totally unusable. They know this and have to let you opt out for medical reasons. The sheeple are getting smarter. After all, life is all about how you handle Plan B.

Effectiveness of a triple-drug regimen for global elimination of lymphatic filariasis: a modelling study.


Background

Lymphatic filariasis is targeted for elimination as a public health problem by 2020. The principal approach used by current programmes is annual mass drug administration with two pairs of drugs with a good safety profile. However, one dose of a triple-drug regimen (ivermectin, diethylcarbamazine, and albendazole) has been shown to clear the transmissible stage of the helminth completely in treated individuals. The aim of this study was to use modelling to assess the potential value of mass drug administration with the triple-drug regimen for accelerating elimination of lymphatic filariasis in different epidemiological settings.

Methods

We used three different transmission models to compare the number of rounds of mass drug administration needed to achieve a prevalence of microfilaraemia less than 1% with the triple-drug regimen and with current two-drug regimens.

Findings

In settings with a low baseline prevalence of lymphatic filariasis (5%), the triple-drug regimen reduced the number of rounds of mass drug administration needed to reach the target prevalence by one or two rounds, compared with the two-drug regimen. For areas with higher baseline prevalence (10–40%), the triple-drug regimen strikingly reduced the number of rounds of mass drug administration needed, by about four or five, but only at moderate-to-high levels of population coverage (>65%) and if systematic non-adherence to mass drug administration was low.

Interpretation

Simulation modelling suggests that the triple-drug regimen has the potential to accelerate the elimination of lymphatic filariasis if high population coverage of mass drug administration can be achieved and systematic non-adherence is low. Future work will reassess these estimates in light of more clinical trial data and examine the effect on individual countries’ programmes.
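The round-counting comparison can be illustrated with a deliberately crude toy model. This is not one of the three transmission models used in the study; the clearance, coverage, and rebound parameters below are all assumptions chosen for illustration only:

```python
def rounds_to_target(p0, coverage, efficacy, rebound=1.1, target=0.01, max_rounds=50):
    """Count annual MDA rounds until microfilaraemia prevalence drops below target.

    p0       -- baseline microfilaraemia prevalence (fraction), assumed
    coverage -- fraction of the population treated each round, assumed
    efficacy -- fraction of microfilaraemia cleared in treated people, assumed
    rebound  -- crude between-round transmission growth factor, assumed
    """
    p = p0
    for n in range(1, max_rounds + 1):
        p *= 1 - coverage * efficacy      # treated, cleared individuals leave the pool
        if p < target:
            return n
        p = min(p * rebound, p0)          # transmission partially rebuilds prevalence
    return None                           # target not reached within max_rounds

# Illustrative comparison at 10% baseline prevalence and 65% coverage:
two_drug = rounds_to_target(0.10, coverage=0.65, efficacy=0.70)
triple_drug = rounds_to_target(0.10, coverage=0.65, efficacy=0.95)
```

With these assumed numbers, the higher-efficacy regimen crosses the 1% threshold in fewer annual rounds, which is the qualitative pattern the study’s far more detailed transmission models quantify.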

Molecular genetics: Chaperone protein gets personal.


https://www.nature.com/articles/nature22487.epdf?shared_access_token=c8s24l3qRiMqNgXLpYb4EtRgN0jAjWel9jnR3ZoTv0PR1bNM3z1AhEaxJXvjnlMRhg_3zMFcFVgFZUmYV_g3Ipj0K7-aw6aWLcuJgbTcaikgT4A99MNPnrBOHl1fVsm_NPTjfl1RKm8B23WgiT43du6jRWUVV-zoD3F4T3cY5Wk%3D

Science has outgrown the human mind and its limited capacities.


Cometh the man: Francis Bacon’s insight was that the process of discovery was inherently algorithmic. Photo courtesy NPG/Wikipedia

The duty of man who investigates the writings of scientists, if learning the truth is his goal, is to make himself an enemy of all that he reads and … attack it from every side. He should also suspect himself as he performs his critical examination of it, so that he may avoid falling into either prejudice or leniency. 

– Ibn al-Haytham (965-1040 CE)

Science is in the midst of a data crisis. Last year, there were more than 1.2 million new papers published in the biomedical sciences alone, bringing the total number of peer-reviewed biomedical papers to over 26 million. However, the average scientist reads only about 250 papers a year. Meanwhile, the quality of the scientific literature has been in decline: some recent studies found that the majority of biomedical papers were irreproducible.

The twin challenges of too much quantity and too little quality are rooted in the finite neurological capacity of the human mind. Scientists are deriving hypotheses from a smaller and smaller fraction of our collective knowledge and consequently, more and more, asking the wrong questions, or asking ones that have already been answered. Also, human creativity seems to depend increasingly on the stochasticity of previous experiences – particular life events that allow a researcher to notice something others do not. Although chance has always been a factor in scientific discovery, it is currently playing a much larger role than it should.

One promising strategy to overcome the current crisis is to integrate machines and artificial intelligence in the scientific process. Machines have greater memory and higher computational capacity than the human brain. Automation of the scientific process could greatly increase the rate of discovery. It could even begin another scientific revolution. That huge possibility hinges on an equally huge question: can scientific discovery really be automated?


I believe it can, using an approach that we have known about for centuries. The answer to this question can be found in the work of Sir Francis Bacon, the 17th-century English philosopher and a key progenitor of modern science.

The earliest iterations of the scientific method can be traced back many centuries to Muslim thinkers such as Ibn al-Haytham, who emphasised both empiricism and experimentation. However, it was Bacon who first formalised the scientific method and made it a subject of study. In his book Novum Organum (1620), he proposed a model for discovery that is still known as the Baconian method. He argued against syllogistic logic for scientific synthesis, which he considered to be unreliable. Instead, he proposed an approach in which relevant observations about a specific phenomenon are systematically collected, tabulated and objectively analysed using inductive logic to generate generalisable ideas. In his view, truth could be uncovered only when the mind is free from incomplete (and hence false) axioms.

The Baconian method attempted to remove logical bias from the process of observation and conceptualisation, by delineating the steps of scientific synthesis and optimising each one separately. Bacon’s vision was to leverage a community of observers to collect vast amounts of information about nature and tabulate it into a central record accessible to inductive analysis. In Novum Organum, he wrote: ‘Empiricists are like ants; they accumulate and use. Rationalists spin webs like spiders. The best method is that of the bee; it is somewhere in between, taking existing material and using it.’

The Baconian method is rarely used today. It proved too laborious and extravagantly expensive; its technological applications were unclear. However, at the time the formalisation of a scientific method marked a revolutionary advance. Before it, science was metaphysical, accessible only to a few learned men, mostly of noble birth. By rejecting the authority of the ancient Greeks and delineating the steps of discovery, Bacon created a blueprint that would allow anyone, regardless of background, to become a scientist.

Bacon’s insights also revealed an important hidden truth: the discovery process is inherently algorithmic. It is the outcome of a finite number of steps that are repeated until a meaningful result is uncovered. Bacon explicitly used the word ‘machine’ in describing his method. His scientific algorithm has three essential components: first, observations have to be collected and integrated into the total corpus of knowledge. Second, the new observations are used to generate new hypotheses. Third, the hypotheses are tested through carefully designed experiments.

If science is algorithmic, then it must have the potential for automation. This futuristic dream has eluded information and computer scientists for decades, in large part because the three main steps of scientific discovery occupy different planes. Observation is sensual; hypothesis-generation is mental; and experimentation is mechanical. Automating the scientific process will require the effective incorporation of machines in each step, and in all three feeding into each other without friction. Nobody has yet figured out how to do that.
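The three-step cycle described above can be sketched as a loop, in line with Bacon’s “machine” metaphor. The function names here are placeholders for the sensual, mental, and mechanical steps, not an existing system:

```python
def discovery_loop(knowledge, generate_hypotheses, run_experiment, max_iter=100):
    """Bacon's discovery cycle as a loop: integrate, hypothesise, experiment.

    knowledge           -- a growing corpus of observations (step 1)
    generate_hypotheses -- induces candidate ideas from the corpus (step 2)
    run_experiment      -- tests a hypothesis and returns an observation (step 3)
    """
    for _ in range(max_iter):
        hypotheses = generate_hypotheses(knowledge)  # inductive step
        if not hypotheses:
            break                                    # nothing left to test
        result = run_experiment(hypotheses[0])       # mechanical step
        knowledge.append(result)                     # new observation rejoins the corpus
    return knowledge
```

The point of the sketch is structural: each pass feeds its experimental result back into the corpus, so the three planes the text distinguishes must connect without friction for the loop to run at all.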

Experimentation has seen the most substantial recent progress. For example, the pharmaceutical industry commonly uses automated high-throughput platforms for drug design. Startups such as Transcriptic and Emerald Cloud Lab, both in California, are building systems to automate almost every physical task that biomedical scientists do. Scientists can submit their experiments online, where they are converted to code and fed into robotic platforms that carry out a battery of biological experiments. These solutions are most relevant to disciplines that require intensive experimentation, such as molecular biology and chemical engineering, but analogous methods can be applied in other data-intensive fields, and even extended to theoretical disciplines.

Automated hypothesis-generation is less advanced, but the work of Don Swanson in the 1980s provided an important step forward. He demonstrated the existence of hidden links between unrelated ideas in the scientific literature; using a simple deductive logical framework, he could connect papers from various fields with no citation overlap. In this way, Swanson was able to hypothesise a novel link between dietary fish oil and Raynaud’s syndrome without conducting any experiments or being an expert in either field. Other, more recent approaches, such as those of Andrey Rzhetsky at the University of Chicago and Albert-László Barabási at Northeastern University, rely on mathematical modelling and graph theory. They incorporate large datasets, in which knowledge is projected as a network, where nodes are concepts and links are relationships between them. Novel hypotheses would show up as undiscovered links between nodes.
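Swanson’s approach, in which two concepts never mentioned together are connected through shared intermediate concepts, can be sketched as common-neighbour link prediction on a toy concept graph. The edge list below is illustrative, not mined from real literature:

```python
from collections import defaultdict
from itertools import combinations

# Toy literature co-occurrence edges (illustrative only).
edges = [
    ("fish oil", "blood viscosity"),
    ("fish oil", "platelet aggregation"),
    ("Raynaud's syndrome", "blood viscosity"),
    ("Raynaud's syndrome", "platelet aggregation"),
    ("aspirin", "platelet aggregation"),
]

adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def candidate_links(adj):
    """Rank unlinked concept pairs by how many intermediate concepts they share."""
    scored = []
    for a, c in combinations(sorted(adj), 2):
        if c in adj[a]:
            continue                      # already directly linked in the literature
        shared = adj[a] & adj[c]
        if shared:
            scored.append((a, c, len(shared)))
    return sorted(scored, key=lambda t: -t[2])
```

Here the top-ranked unlinked pair recovers the fish oil and Raynaud’s syndrome connection via their shared intermediates, which is the essence of Swanson’s model: an undiscovered link between nodes that are never directly connected.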

The most challenging step in the automation process is how to collect reliable scientific observations on a large scale. There is currently no central data bank that holds humanity’s total scientific knowledge on an observational level. Natural language-processing has advanced to the point at which it can automatically extract not only relationships but also context from scientific papers. However, major scientific publishers have placed severe restrictions on text-mining. More important, the text of papers is biased towards the scientist’s interpretations (or misconceptions), and it contains synthesised complex concepts and methodologies that are difficult to extract and quantify.

Nevertheless, recent advances in computing and networked databases make the Baconian method practical for the first time in history. And even before scientific discovery can be automated, embracing Bacon’s approach could prove valuable at a time when pure reductionism is reaching the edge of its usefulness.

Human minds simply cannot reconstruct highly complex natural phenomena efficiently enough in the age of big data. A modern Baconian method that incorporates reductionist ideas through data-mining, but then analyses this information through inductive computational models, could transform our understanding of the natural world. Such an approach would enable us to generate novel hypotheses that have higher chances of turning out to be true, to test those hypotheses, and to fill gaps in our knowledge. It would also provide a much-needed reminder of what science is supposed to be: truth-seeking, anti-authoritarian, and limitlessly free.

The Campi Flegrei supervolcano is bubbling away under Italy

